School for Skeptics

In the internet age, literacy means distinguishing between fact and fiction


When the BBC offered a quiz titled “Can You Spot the Fake Stories?” I was confident that I would do well. With a master’s degree in journalism, I thought falling for “fake news” only happened to other people. But I was fooled four times on the seven-question quiz. I’m not the only one who has trouble with this. Even the digitally savvy generation now growing up has a difficult time distinguishing credible content from fake stories.

In 2015, Stanford University launched an 18-month study of students in middle school, high school, and college across several states to find out how well they were able to evaluate the information they consume online. Nearly 8,000 students took part in the study, and the results showed that they were easily duped. Many middle schoolers couldn’t tell the difference between a news story and an advertisement. College students weren’t able to distinguish a mainstream source from a group promoting a certain point of view. Students often decided if something was credible just by how polished the website looked.

The study highlighted a fundamental problem: Today’s students are struggling to differentiate fact from fiction online. “We’re living in the most overwhelming information landscape in human history,” says Peter Adams, a senior vice president for the News Literacy Project, a non-profit organization that aims to add information literacy to middle and high school classrooms across the United States. “It’s confusing because people are consuming information in an aggregated stream, and social media gives things uniformity. A post from a conspiracy theory blog looks the same as a post from the Washington Post.”

To help students learn how to evaluate and verify information, the News Literacy Project launched a virtual classroom called Checkology. One part of the web-based tool allows teachers to present students with news reports, tweets, and other social media posts. The students must determine whether they are credible by looking for a variety of “red flags.” Jodi Mahoney found Checkology last summer while researching ways to educate her students about fake news. Now she uses it in her classroom, where she teaches students about technology, from email etiquette to basic coding. “What’s the best way to prevent yourself from spreading misinformation?” she asks a group of sixth-graders at Carl Von Linné Elementary School in Chicago. Eleven-year-old Michael raises his hand. “I think, first you double-check the site where you got it from,” he says. “Then look for clues to see if it’s credible.”

“Good. What kind of clues?” Mahoney encourages the students to start naming them. One student calls out that you want to avoid clickbait. “OK, what’s clickbait?” she asks. The room is quiet. “If you’re not sure, look it up. Let’s Google it.” The class decides that clickbait is something “designed to get attention or arouse emotion.” The students have learned that’s a red flag because a strong emotional reaction can override your ability to critically evaluate information – a tendency often exploited by people trying to spread misinformation.

Next, Mahoney asks them to log in to practice figuring out whether information is fact or fiction. “Go to module three,” Mahoney instructs. The students put on headphones and log in. A few minutes later, 12-year-old Guadalupe struggles to determine whether a sample Facebook post sharing an article headlined “CDC Issued a Warning – Don’t Get a Flu Shot This Year” is real.

She ultimately decides it’s real because the post “gave a lot of facts about the flu” and included a source. She clicks “fact,” and Checkology corrects her: the post was fiction. “That lesson shows that just looking at it doesn’t give you what you need to know,” Adams explains. “If you don’t go upstream to another source, you can’t know if it’s true or not.” While the sixth-graders can’t always tell fact from falsehood, Mahoney says she appreciates that Checkology encourages students to be sceptical. “They are so comfortable using the internet that they don’t question it,” she says.

She sees it at home too. “My [third-grade] daughter recently told me that the platypus wasn’t a real animal because of a YouTube video she saw.” After the class completes a module, Mahoney can create a spreadsheet to see how the students did. “The first week, they all scored very low,” she says. “The data showed me that I needed to be concerned.” At that point, her students couldn’t distinguish among types of media: News, entertainment, ads – they all seemed the same to them.

After 13 weeks, she says, she’s starting to see students connect the dots, but emphasizes that they need to continue to practice. She adds, “This needs to be taught all the way through college.” Mahoney included a unit on fake news for her sixth-graders, because that’s when most of her students get a mobile phone. “They start getting bombarded with content in fifth, sixth, or seventh grade,” she says. She also wants schools to put more emphasis on teaching news literacy. “We spend a lot of time lecturing kids on what not to do on the internet and how to be safe on the internet,” she says. “Now we need to teach them how to understand the content that’s out there.” Former teacher Michael Spikes agrees. When he taught media studies and news production to high school students in Washington, D.C., he would tell them, “You can’t be SpongeBob and just absorb. You have to be an active consumer of information.” His mantra: “Where is the evidence?”

Now he’s the project director for educator training and digital resources at the Center for News Literacy, a program of New York’s Stony Brook University. Part of his job is to help teachers integrate news literacy into their curricula. “I make my workshops very teacher-centric,” Spikes says.

He often encourages educators to use the Center’s free resources within an existing curriculum – social studies, language arts, civics – given that every state has its own standards and that teachers can’t always adopt a whole course. “We take a ‘train the trainer’ model in our approach and focus on teaching educators our content,” Spikes says. “High school teachers are our largest audience right now, followed by college educators.” But, he says, the Center plans to expand to middle school as well, calling that age the “sweet spot” to learn information literacy. “We’ve gone from Gutenberg to Zuckerberg,” Spikes says. “We now have unfettered access to information.

Along with that, we’ve become not only consumers of information, but publishers as well.” Because anyone can put content online and reach a wide audience, he’s adamant that information literacy needs to be integrated into public education. “We teach much more than spotting fake news,” he explains. “It’s about developing a critical eye, so when a student comes across a Facebook or Twitter post, or a site that seems to have all the answers they need, they ask, ‘Hold on, is this info verified? Is the source independent, or is it tied to some type of organization?’”

“We thought of young people as digital natives who would know how to discern information because they know how to turn on the mobile phone and get on the internet, but less than 20 percent know how to critically evaluate the information that’s in front of them. In our media landscape, it’s up to us to figure out what’s reliable and what’s not.”

In July 2017, the Pew Research Center partnered with Elon University’s Imagining the Internet Center to solicit experts’ predictions on whether new methods would emerge over the next 10 years to block false narratives and allow accurate information to prevail online – or if the quality of information online would deteriorate.

They collected responses from 1,116 people, including experts in the technology sector, researchers, journalism professors, experts in internet policy, and media watchdogs. Just over half of respondents (51 percent) said the accuracy of information online would deteriorate. The rest thought it would improve. Of those who believed it would deteriorate, many said our natural preference for stories that confirm our biases and our craving for validation would continue to be reinforced on social media, worsening echo chambers.

They also couldn’t imagine a technological solution that someone couldn’t manipulate. But those who were more optimistic predicted that technological advances would become better at stopping the spread of misinformation. One compared it to how spam filters were created to sort out junk email.

They also predicted that improved information literacy would help people become better judges of the accuracy of content online. “Echo chambers and filter bubbles will continue to exist, as these attitudes are typical of people’s behaviour offline and online,” said survey respondent Sharon Haleva-Amir, a lecturer at Bar-Ilan University in Israel. “In order to change that, people will have to be educated from early childhood about the importance of the credibility of sources as well as the variability of opinions that create the market of ideas.”

Pew researchers summarized one of the main themes to emerge from the survey this way: “Technology alone can’t win the battle. The public must fund and support the production of objective, accurate information. It also must elevate information literacy to be a primary goal of education.” “Figuring out what’s news and what’s credible is a daunting task for most adults,” Adams says. “News literacy skills give all consumers, but especially teens, a chance to discern credible information, which is vital to civic involvement and democracy.”

This story first appeared in the July 2018 edition of The Rotarian magazine.