Thursday, April 15, 2021

Spotting Fake Science

In The Politics of Autism, I write:

Many articles and blog posts arguing for the vaccine-autism link have the trappings of genuine academic research: tables, graphs, citations, and scientific jargon. Some of the authors have credentials such as M.D. or Ph.D. degrees. None of these things is a guarantee of scientific value, as the history of science is full of crackpot theories (e.g., AIDS denialism) that are the heavily-footnoted products of people with letters after their names. But most people will not be able to spot the scientific weaknesses of such work. Outside of academia, few understand concepts such as peer review. Jordynn Jack describes one dubious article that appeared in a non-peer-reviewed publication: “Regardless of the scientific validity of the article, though, the writers perform the writing style quite effectively. It would be difficult for the layperson to distinguish this article from any other scientific research paper, especially if one did not investigate the nature of the journal … or of the scientific response to the article.”

Frederick Hess at AEI:

[Sam] Wineburg, Stanford’s Margaret Jacks Professor of Education, studies how people judge the credibility of digital content. A former history teacher with a PhD in educational psychology, he’s perhaps the nation’s leading scholar when it comes to helping people figure out what’s actually true on the Internet. I recently had the chance to talk with him about his work and the practical lessons it holds.

Wineburg approaches his work with a simple guiding principle: “If you want to know what people do on the Internet, don’t ask them what they would do. Put them in front of a computer and watch them do it.”

He recounts a 2019 experiment studying how high school students evaluate digital sources, in which 3,000 students performed a series of web tasks. One task asked students to evaluate a website about climate change. Wineburg notes, “When you Google the group behind it, you learn that they’re funded by Exxon—a clear conflict of interest. Yet, 92 percent of students never made the link. Why? Because their eyes remained glued to the original site.” In other words, looking into the source of information is essential to judging its veracity—and yet, students didn’t make that leap.

In another study, Wineburg compared how a group of PhD students and Stanford undergraduates stacked up against fact-checkers at leading news outlets in New York and Washington when it came to assessing the credibility of unfamiliar websites. He says that fact-checkers speedily “saw through common digital ruses” while trained scholars “often spun around in circles.”

Why? Wineburg concludes, “The intelligent people we’ve studied are invested in their intelligence. That investment often gets them in trouble. Because they’re smart, they think they can outsmart the Web.” The result is that when they see a professional-looking website with scholarly references, they conclude it’s legitimate. “Basically,” he says, “they’re reading the web like a piece of static print—thinking that they can determine what something is by looking at it . . . On the Internet, hubris is your Achilles heel.”

Fact-checkers employ a different approach, one that Wineburg terms “lateral reading.” This involves only briefly looking at a website, then leaving it to search for background information on the organization or group behind the original site to determine if it is worth returning to. “In other words,” he says, “they learn about a site by leaving it to consult the broader Web.”

The problem for educators, according to Wineburg, is that this goes against the grain of how teachers usually teach students to evaluate a text. Usually, students are taught to read carefully and fully, and only then render judgment. “Yet, on the Web, where attention is scarce, expending precious minutes reading a text, before you know who produced it and why, is a colossal waste of time,” Wineburg says.

In fact, the usual methods teachers use for addressing online credibility are mostly unhelpful. Wineburg laments that we often approach the subject like a game of twenty questions. We ask, “‘Is the site a .org?’ If so, ‘It’s good.’ ‘Is it a .com?’ If so, ‘It’s bad.’ ‘Does it have contact information?’ That makes it ‘good.’ But if it has banner ads? ‘It’s bad.’” The problem, he says, “is that bad actors read these lists, too, and each of these features is ludicrously easy to game.”

To help teachers wrestling with all this, Wineburg and his collaborators have created a “digital literacy curriculum” with 65 classroom-ready lessons and assessments, a complementary set of videos, and an online course on “Civic Online Reasoning” developed with MIT’s Teaching Systems Lab. Wineburg notes that all of these materials are free and can be downloaded by registering at sheg.stanford.edu.