Sunday, November 6, 2016

Digital Natives Are Lost

I have had this conversation a thousand thousand times with people of my own generation, people who don't actually work with students. They will be going on about their own computer illiteracy and waxing rhapsodic about the super-duper skills of the young generation, the digital natives.

"You don't understand," I'll tell them. "The vast majority of my students don't know jack about modern technology. They know how to operate one or two apps that they use regularly. Beyond that, they are as lost as their own grandparents."

And now there is research to back me up.

At EdWeek, Sam Wineburg and Sarah McGrew report on their own research at Stanford as well as research by folks at Northwestern. 

In the Northwestern study, college students turned out to believe that Google lists links in the order of accuracy and trustworthiness-- good news for all those people making a living optimizing websites for search ranking, and bad news for everyone who wishes people would stop using the internet to make themselves stupider.

Stanford's study involved several different tests; the results of all were depressing.

At every level, we were taken aback by students' lack of preparation: middle school students unable to tell the difference between an advertisement and a news story; high school students taking at face value a cooked-up chart from the Minnesota Gun Owners Political Action Committee; college students credulously accepting a .org top-level domain name as if it were a Good Housekeeping seal.

In a particularly alarming exercise, twenty-five Stanford students (as Wineburg and McGrew point out, a super-selective group from a university that rejects 95% of its applicants) could not tell the difference between the American Academy of Pediatrics and the American College of Pediatricians. The first is a legitimate professional organization and the second is a fringe group that ties homosexuality to pedophilia and made the Southern Poverty Law Center hate group list. More than half of the students determined that the hate group was "more reliable" as a source.

None of this surprises me. My students are adept at operating their favorite apps and can manage the backwater sites that are now out of favor (Facebook? Puh-lease!) They can play whatever game is big at the moment (as near as I can tell phone games have about a two-week life). But not only do they not make very good use of the internet as a ready source of information, both factual and craptaculous, it doesn't even occur to them to look things up in the first place. I find this crazy maddening-- my curiosity has been an itch that I couldn't reach for much of my life, and now modern tech means I can carry a long backscratcher with me everywhere. And yet, I am regularly responding to student questions with, "Gee, if only there were a way to quickly access all collected human knowledge." And I teach, mind you, at a one-to-one school-- every single one of my students has access to at least one computer device.

Fifteen years ago, I had students who could design a website from scratch, writing their own code and doing their own design work. No longer. This is not an abnormal progression. Early adopters of new-fangled automobiles had to be prepared for and capable of doing their own mechanical work to keep the vehicle functioning. Within a generation or two, being a gearhead had become a pastime for a select few. Fast forward to today, when some automotive systems cannot be worked on except with specialized training and tools. Making technology more accessible and usable (and therefore marketable) means freeing the user from any need to do maintenance and repair.

But instead of a communication or transportation system, we're now doing this with an information system, and we have a problem that parallels a mistake found in some education programs. Some policy makers and edubiz folks are trying to push a model of reading that treats it as a group of discrete skills, decoding tricks that are independent of what the words actually say. But reading cannot be separated from content; how well you can read is inextricably tied to what you know. And how well you can research and filter the research you find is inextricably tied to what you know.

There are skills we can teach. The EdWeek piece says that good fact-checkers do three things that help:

1) When facing an unfamiliar site, leave it and find out from other sources whether it's reliable or worthwhile-- far better than going ahead and reading the site itself.

2) Same idea-- don't depend on the site's own "About" page, because no site has a page in which it explains why it's actually full of baloney. Not on purpose.

3) Ignore the order in which the search engine lists the results.

And that's before we even get to effective search methods. The majority of my students make truly bizarre use of Dr. Google with a grab-bag of random search terms and no awareness that there are tools for narrowing the search.

But beyond those (and other) simple check techniques, you have to actually Know Something. If you want to know if the painting in your attic is worth a million dollars or a buck and a half, you have to talk to someone who actually knows the difference. Talking about 21st Century Skills as if they aren't tied directly to knowledge is bunk.

Meanwhile, folks who think "Let the students go on the net and educate themselves" is a plan must be unfamiliar with both students and the internet. Just as lots of natural-born citizens of the USA could not pass the citizenship test if their lives depended on it, many digital natives have never tried to explore, understand, or make sense of the tech landscape into which they've been born. We really need to do better-- I'd suggest we get at it before the next election rolls around.


  1. There is serious confusion about whether or not sorting out truth from falsehood is the end of education or just an intermediate skill in the process. Ultimately, the reason you study history or science is so you aren't an ignorant sucker. You can't avoid that fate by just learning a few research tips and skills.

  2. I really love your made up words and I think "craptaculous" is my favorite!

  3. I couldn't agree more. I like your analogy with early car owners. Too many "educators" have leapt to the assumption that putting computers or tablets in front of kids will somehow magically transform them into educated beings. It just ain't so. Frankly, I believe that education in its most profound sense can't be separated from genuine and ongoing human interaction. Without the human factor, "education" is just programming.

    And I just have to note that Bart Simpson originated a similar word in a Simpsons Christmas episode: "craptacular." But your variant works, too!

  4. On a related, but slightly different topic: the emphasis on STEM and, more recently, coding is creating a mythology around student technology use that leads to a false sense of mastery/proficiency (much as I dislike those words). When I first heard a few years ago that kids would be coding, I was thinking in terms of learning DOS or COBOL or Fortran (yeah, I really am that old, so my frame of reference is skewed by punch cards and mainframes) to create programs. Nope; it's primarily using drag-and-drop modules created by someone else to accomplish a certain task. They have no idea what the process looks like before it gets to them. Using iPads/tablets makes the problem worse because the students can't see the background files, folders, applications, and operating systems. Maybe - like your driving a car example - they don't need to be able to fix everything, but knowing what parts there are would be helpful.

    1. Similar to the reason everyone's nephew is a "web developer!" That said, my 7-year-old came home from 1st grade one day last year and showed me a coding lesson he got at school that used Minecraft to teach if-then statements, etc. He would not be capable of learning about complex ordering and files, but I felt that he was learning foundational concepts. I only bring this up because I have hope! :)