Fighting Fake News: Lessons From the Information Wars
Executive Summary
Fake news is a problem that threatens the very roots of modern democracy. Yet there is no easy solution to fighting fake news.(1)
There are far too many incentives to spread false information. Nations often promote fake news as part of disinformation campaigns to further political agendas. Profiteers generate fake news material and distribute it widely for advertising revenue. And the design of online social media platforms enables — and often encourages — the widespread distribution of fake news. The result is a significant amount of disinformation cutting across the world’s media landscape.

Research on the impact of fake news is sparse. Although research on propaganda goes back decades, modern fake news is typically spread through online social networks and comes more clearly under the guise of “news,” making it different from state propaganda. What’s more, new technologies frequently make the problem worse. Bots, for instance, increasingly pervade social networks, often amplifying fake news–like messages. One study estimates that 9 to 15 percent of Twitter users are bots, and at least some research suggests they play a key role in amplifying fake news.(2)

Fake news has other costs: mainstream media and fact-checking sites must spend time debunking the latest viral lie. Scholars also suspect that conflicting reports from fake and legitimate news outlets can breed skepticism toward, and disengagement from, all media, fake and legitimate alike.
The proliferation of fake news also comes at a time when trust in legitimate news sources has reached historic lows.(3) Only 32 percent of Americans say that they have “a great deal” or “a fair amount” of trust in the media to report the news “fully, fairly, and accurately,” which is the lowest ever documented.(4)
Another fear is that the combination of fake news and online social networks, at least as they currently work, produces high levels of extremism. Since social networks want to hold users’ attention, they routinely suggest material that users might like, based on what they have already seen. Demonstrations with YouTube suggest that merely following the suggested videos there quickly leads to increasingly extreme content.(5)
Given these dynamics, fake news raises clear risks to democracy.(6) Democratic governments depend upon voters being well informed, yet elections across the world have been rocked by disinformation campaigns. Many Western countries have accused Russia, for instance, of using fake news to disrupt elections, and political actors have used fake news to further their agendas in several countries, including in Myanmar, the Philippines, and Indonesia.(7)
False information and media manipulation will always be with us. But changes to technology — and media consumption — have made it harder for citizens to detect weak information and have made the influence of fake news more noxious and widespread.(8)
So what can be done? What should be done?
This report aims to answer how we can fight fake news. It relies on reviews of the literature, interviews with experts, evaluations of state education agencies’ websites, and our own independent research conducted with thousands of individuals via an online platform.
Our main findings demonstrate that:
Simple interventions — like reading an article on how to spot illegitimate sources of information — can help people identify fake news.
The Reboot Foundation has conducted experiments that demonstrate that people can become better at identifying fake news. Specifically, our team of researchers looked at the effectiveness of three short interventions that aimed to improve people’s ability to distinguish fake and legitimate news.
We found that participants who watched a short video or read an article outlining how to determine whether a news article is legitimate identified fake news at higher rates than before the intervention. Interestingly, after participants played a game in which they were asked to moderate a news feed, their ability to discern fake news did not increase.(9)
Simply making people aware of the problem of disinformation campaigns may be enough to make them more discerning, at least in the short term. This finding raises the tantalizing possibility of using simple interventions to teach people to be savvier news consumers. Future research should explore the efficacy of these approaches.
The fake news crisis is ultimately a crisis of media literacy, and more than a third of students report rarely learning key media literacy skills like judging the reliability of a source.
Although the recent rise in fake news is enabled by technology companies, ultimately individuals are the ones sharing — and being exposed to — fake news.(10)
A crucial part of the solution to combating fake news is to increase the capability of people to identify fake or weak information. More must be done to teach people how to critically evaluate news sources and the common tactics of fake news purveyors.
But our research revealed that more than a third of middle school students in the U.S. say that they “rarely” or “never” learn how to judge the reliability of sources. We analyzed survey data from the National Assessment of Educational Progress, otherwise known as the “Nation’s Report Card,” and looked at eighth-grade students’ responses to questions regarding technology literacy and instruction. We also found that over a fifth of U.S. middle school students report “rarely” or “never” learning how to properly credit the ideas of others.(11)
The good news is that interest in media literacy programs is growing.(12) Several organizations are lobbying for media literacy programs and also provide curricula, tools, and other resources to schools.
Governments can play an important role in supporting media literacy and developing a healthy public sphere.
Recent government attempts to confront fake news have often taken a frontal approach: passing libel laws, forming task forces that assess the legitimacy of news and news sources, and fining or even imprisoning offenders.
Such approaches raise concerns about censorship, political partisanship, and political opportunism. It’s also unclear whether such approaches truly improve the media climate; in some cases, they may corrode it.
Rather than trying to directly stop fake news through state power, a handful of examples suggest that supporting the growth of healthy media ecosystems is likely a better approach.
Governments can promote the development of media literacy programs and of in-depth, public interest journalism through financial incentives, school requirements, and even new forms of business organization. This approach avoids issues of censorship and is less likely to be misused for ideological or political ends.
Technology companies must do more to promote better and more informed forms of engagement on their platforms.
Social media platforms have been roundly criticized for their role in enabling and promoting misinformation.(13) In response, the platforms have made some changes. Facebook, in particular, has a team dedicated to stemming the flow of misinformation on the site, and they’ve done a lot to remove or limit the purveyors of fake news.(14)
But, ultimately, social media platforms will have to do more than just identify and remove fake news content. They need to redesign the user experience to combat (rather than prey upon) psychological biases and rethink how communities of people with diverse viewpoints can engage with each other productively.
The rest of this report outlines our findings and recommendations in greater depth. But what’s clear — from Reboot’s own research — is that interventions against fake news can be effective, and that far more needs to be done to stop the disinformation campaigns that threaten the very core of modern democracy.
Introduction to fighting fake news
The term “fake news” burst onto the scene in the run-up to the 2016 United States presidential election.(15)
While historians have not written an exhaustive history of the expression, the phrase appears to have gained traction after a Buzzfeed editor, Craig Silverman, came across outlandish headlines dominating Facebook feeds.(16)
Silverman could barely believe the headlines, which included items like “Pope Francis Shocks World, Endorses Donald Trump for President.” After some digging, Silverman noticed that the news items all came out of a small, rural town in Macedonia while gaining lots of traction on social media.
Silverman called the information simply “fake news.” As Silverman once told PBS, “it’s not like I sat down and said, ‘All right, I need the perfect term to describe this.’ It’s just sort of what came naturally.”(17)
After the election, politicians re-purposed the term “fake news” to attack legitimate news articles and organizations, even as many other outlets continued to use the term to refer to media passed off as legitimate news designed to deceive the news consumer. As a result, some leading media literacy organizations argue that the term “disinformation” is both more inclusive (covering not just news but other forms of information disruption) and less co-opted.(18)
In this report, however, we have chosen to use the term “fake news” because it remains the most widely used label for purposefully misleading headlines or articles. Such content is typically designed to prey on anxieties, fears, or existing conspiracy theories, and it spreads largely through social media. Fake news is rarely a bald-faced lie from start to finish; more often it is a mixture of false claims and real events crafted (or algorithmically created) to appeal to a specific audience.
The exact effect of the rise of fake news on public discourse is unclear. One large-scale study found that, among fact-checked claims, false information diffused more rapidly and more widely through Twitter than true information for the 2006-2017 period.(19) This raises the troubling suggestion that, at least for articles that required some fact-checking, falsehood spreads faster than truth.
There have also been several high-profile events inspired by fake news: notably, Pizzagate.(20) A man inspired by fake news articles claimed that a pizza restaurant in D.C. was the hub of an underground pedophile ring. He drove to the restaurant to “investigate” the claims and fired a rifle off inside the restaurant while searching for evidence. The restaurant owners, staff, and even owners of nearby restaurants continued to receive death threats from people who believed these (extensively debunked) claims.
That said, it’s possible that claims about a fake news crisis are overblown. After all, many Facebook users share news articles without reading them, which is both an indication that many people aren’t thinking critically about what they share, and potentially an indication that they’re not really engaging with fake news in the first place.(21)
At the same time, people’s beliefs are hard to change. Even expensive political campaigns seem to be ineffective at persuading people, so it’s plausible that the rise of fake news has less dire consequences than many fear.(22)
The limited studies we have on the prevalence of fake news over social media also emphasize two important points. First, most people do not share fake news stories. A Twitter panel study found that only 0.1 percent of users accounted for nearly 80 percent of the fake news sources shared.(23) A Facebook panel study found that over 90 percent of users never shared a fake news article.(24)
Second, older users share fake news at much higher rates than younger users, regardless of political ideology. The Facebook panel study found that users over 65 shared over twice as many fake news articles as the next oldest group (45-65 years old) and seven times as many fake news articles as the youngest group (18-29 years old). The Twitter panel study found a similar trend. These findings suggest that the fake news phenomenon is concentrated in specific subpopulations and that sharing fake news stories is simply not a prevalent behavior.
Psychological research, however, suggests that the proliferation of fake news can still cause problems. For example: the repetition of incorrect information makes the information feel true, leading people to accept statements that they might otherwise question.(25) This is known as the “illusory truth effect.” And it occurs even if we have some knowledge about the questionable information.(26) Research demonstrates that this effect applies to fake news, suggesting that mere repeated exposures of headlines can impact voter beliefs.(27)
Methodology
The Reboot Foundation took several approaches to discover these lessons on fake news.
First, to gain a better sense of the field, we conducted a literature review of publications concerned with the fake news phenomenon across several different disciplines. Most of these articles were published between 2016 and 2019.
Second, we tried to identify all of the important media literacy organizations, fact-checking organizations, and educational apps in this space. After coming across several foreign-language media literacy organizations, we decided to restrict our review to English-language organizations where possible.
We also pursued our own independent research on the efficacy of interventions intended to make people better at identifying fake news. All participants initially reviewed a set of real headlines: some legitimate, some fake. Participants in a control group played a short, unrelated game, while participants in the intervention group engaged with one of three interventions aimed at making them better news consumers — an article, a video, or a videogame — before evaluating news headlines again.
Over 2,000 people participated via Amazon’s Mechanical Turk in late April 2019. All headlines were actual stories recently published on online media sites, with half of the articles classified as fake news because of deliberate misinformation and misleading headlines.
Some examples of real headlines we used included: “Virtual Reality Helping in Fight Against Opioid Deaths,” “Conservative Activist Punched at UC Berkeley,” “A Border Agent Detained Two Americans Speaking Spanish. Now They Have Sued.”
Some examples of fake headlines we used: “Billions Take to the Streets in Celebration as President Trump Repeals Obamacare,” “Government Releases the Cure for Cancer, but with the Largest Price Tag of Any Drug in History,” “‘Republicans are Brainwashed,’ Democrats Dutifully Chant While Clad in Identical White Uniforms.”
We based our approach on Gordon Pennycook and David Rand’s research published earlier this year, which explored the relationships between reflective thinking, political views, and judgments of fake news.(28) In their study, as in ours, participants’ overall accuracy at distinguishing legitimate news hinged on their accuracy ratings for fake and legitimate news, which ranged from “not at all accurate” to “very accurate” and included a “not sure” option. Ideally, participants would judge real news as “very accurate” and fake news as “not at all accurate.” We computed the difference in participants’ average ratings of legitimate and fake news both before and after our interventions.
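To make this scoring concrete, the following is a minimal sketch, in Python, of how a discernment score of this kind can be computed; the 1-to-4 numeric coding, the data layout, and the exclusion of “not sure” responses are illustrative assumptions, not the study’s actual analysis code.

# Illustrative sketch: a "discernment" score is the gap between a participant's
# average rating of legitimate headlines and of fake headlines.
# The 1-4 coding and data layout are assumptions for illustration only.

def discernment(ratings):
    """ratings: list of (label, score) pairs; label is 'real' or 'fake',
    score runs from 1 ('not at all accurate') to 4 ('very accurate').
    'Not sure' responses are assumed to have been excluded beforehand."""
    real = [s for label, s in ratings if label == "real"]
    fake = [s for label, s in ratings if label == "fake"]
    return sum(real) / len(real) - sum(fake) / len(fake)

# One hypothetical participant, before and after an intervention.
pre  = [("real", 3), ("real", 2), ("fake", 3), ("fake", 2)]
post = [("real", 3), ("real", 3), ("fake", 1), ("fake", 2)]

print(discernment(pre))                      # 0.0 -> no separation
print(discernment(post))                     # 1.5 -> clearer separation
print(discernment(post) - discernment(pre))  # 1.5 -> change after intervention

A larger positive score indicates a sharper separation between legitimate and fake headlines; comparing pre- and post-intervention scores yields the kind of effect reported in the findings below.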
For the analysis of state policies, we visited all 50 state department of education websites during the summer of 2019 and looked for their mandatory course requirements. Our work on the legislation was informed by the work of Media Literacy Now.(29)
We also wanted to better understand students’ actual learning of media literacy and so we collected data from the 2018 National Assessment of Educational Progress (NAEP), a nationally representative assessment of what students know and can do in the U.S. We relied on survey results from the 2018 Technology and Engineering Literacy assessment, an exam administered to eighth-grade students, and we focused on survey questions related to media literacy education, like judging credibility and properly crediting sources. We collected this data in the summer of 2019.
Lessons Learned
Simple interventions — like reading an article on how to spot illegitimate sources of information — can help people identify fake news.
Short interventions can be effective ways of encouraging critical news consumption, according to our study. We tested whether three interventions could influence how well people distinguish legitimate news from fake news. The interventions included an article that surveyed the current media environment and provided a procedure for spotting fake news; a video with largely the same content; and a fake news game called NewsFeed Defenders.(30) Before and after each intervention, participants rated the accuracy of fake and legitimate news headlines on a 4-point scale from “not at all accurate” to “very accurate.”
NewsFeed Defenders, a collaboration between iCivics (an organization started by former Supreme Court Justice Sandra Day O’Connor) and the Annenberg Public Policy Center, asks students to moderate a social media newsfeed, identifying and removing fake news as necessary.(31)
Our study found that short interventions can improve people’s ability to spot fake news. Both the article and the video significantly increased participants’ ability to discern real from fake news (p-values < .01), while the game did not (p-value = .44).
After watching the video, participants rated fake news headlines as less accurate than they had before (a .21-point shift on the 4-point scale, or about half a standard deviation); their perceptions of legitimate news remained stable (a .02-point shift toward more accurate). In other words, if someone had judged a piece of fake news to be “somewhat accurate” before the intervention, afterward their rating would move about a fifth of a scale point toward “not at all accurate.”
Reading the article both decreased perceptions of the accuracy of fake news by .12 points and increased perceptions of accuracy for legitimate news by .10 points. By contrast, the game increased perceptions of accuracy slightly for both fake and legitimate news (by .04 and .08 points respectively). The control group’s perceptions of accuracy were only .01 points different for both fake and legitimate news.
Thus, both viewing the video and reading the article measurably increased participants’ discernment between fake news (less accurate) and legitimate news (more accurate). This suggests that even just making consumers aware of the problem and providing them with a framework to help them distinguish between legitimate and fake news can help them do so.
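For readers who want to see the mechanics behind numbers like these, here is a minimal sketch of how a pre/post shift in average fake news ratings and its significance can be checked with a paired t-test. The data are simulated, and the sample size, means, and variances are invented; this is a generic illustration, not necessarily the exact test used in our study.

# Illustrative sketch of a pre/post comparison: a paired t-test on
# participants' average fake-news ratings before and after an intervention.
# All data here are simulated, not the study data.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
pre_fake  = rng.normal(2.4, 0.5, size=500)                # average ratings before
post_fake = pre_fake - rng.normal(0.21, 0.4, size=500)    # roughly a 0.21-point drop

t, p = stats.ttest_rel(pre_fake, post_fake)
shift = (post_fake - pre_fake).mean()   # negative shift = fake news rated less accurate
print(f"mean shift = {shift:.2f} points, p = {p:.3g}")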
Both the article and the video provided a relatively straightforward procedure to help respondents distinguish legitimate from fake news. The article, for example, urged readers to ask three questions: Who is the creator? What is the message? Why was this created?
The game also gave advice about how to distinguish high quality from low quality news, but doled this advice out at a much slower pace, over the course of several interactions with the game. It’s possible that respondents just didn’t spend enough time with it to learn the immediately applicable procedures that the article and video provided. Further research, however, would be needed to substantiate this idea.
There are some other limitations to the study. Mechanical Turk workers are younger than the U.S. population, so our sample under-represents older populations. This is a particularly important limitation in this context, because older people share fake news at much higher rates than younger people.(32)
We also did not explore whether the effects of the intervention would “stick.” Our post-intervention measures were taken immediately after the intervention. Indeed, a delayed measure may very well show smaller effects. The news differentiation task was limited — participants were only given a headline, an image, and a source to work from — but this mimics how people see these headlines in their social media feed.
The fake news crisis is ultimately a crisis of media literacy, and more than a third of students report rarely learning key media literacy skills like judging the reliability of a source.
Concerns over manipulation by mass media date back to the middle of the 20th century and the use of film by advertisers and governments to influence audiences.
In the past, though, propaganda was limited to the broadcast or printed material in which it appeared. It could be distributed widely and experienced by huge numbers of people, but its effects were constrained by the physical character of the media. With new technology and media platforms, those limitations have largely been erased.
The Internet, of course, allows content to be endlessly manipulated, repurposed, recycled, and shared. Perhaps the most alarming result is that disinformation can embed itself in social networks and exploit network effects. False and dangerous information in these networks becomes more convincing, and it can spread far more widely and quickly than before.
But those effects depend on individual actors who, usually with no explicit ill intention, share information that is inaccurate, often plainly so. In short, individuals themselves need training in applied reasoning about news and other media.
To address this issue, academics and educators have recognized the importance of educating students and adults in how to critically engage with a continually changing media environment, recognize efforts at manipulation, and communicate actively and effectively through media. The rise of fake news has renewed calls for a greater emphasis on critical thinking and media literacy in classrooms throughout the world.
What makes the calls for media literacy education even more urgent is the lack of time spent on media literacy in the classroom. According to survey data from the 2018 National Assessment of Educational Progress, more than a third of U.S. middle school students report “rarely” or “never” learning how to judge the reliability of sources. While over 40 percent of middle school students report learning this skill “sometimes” in class, just over a fifth of students report learning this skill “often.”
Learning how to cite and credit sources is another necessary media literacy skill, but fewer than half of U.S. middle school students report spending a lot of time learning how to properly credit others for their ideas. About 44 percent of eighth-grade students report learning this skill “often” in the classroom, and over a fifth of students report “rarely” or “never” learning this skill.
One solution, then, would be to implement stand-alone courses on media literacy. These courses would focus on the basics of media engagement and critical thinking, with educators teaching students to run through questions like: Where does this information or opinion come from? What biases or explicit efforts at manipulation may be reflected in its points of view? What evidence is offered for the position? Is the evidence corroborated by other reliable sources? What counter-evidence might be posed by opponents of the position?
While such an approach has obvious benefits, media literacy education remains underfunded. There are few national funding sources for media literacy programs, and only a handful of states provide curricula or professional development funding.(33) The result is that media literacy is far from universal — and many students simply don’t get the applied critical thinking that they need (see sidebar for details).(34) National interest in standardized testing and STEM education in the 2000s and 2010s also diminished the importance of civics education in classrooms, an emphasis that is only now being restored.(35)
Several schools have begun to experiment with next-generation news literacy programs, taking new approaches and trying to reach more students. Stony Brook University and the News Literacy Project have both been recent leaders in developing curricula and digital resources for schools to use in media literacy programs. Preliminary results suggest that such programs can be effective at both the middle-school and university levels.(36)
In a recent test of Stony Brook’s university-level programs, students who took the media literacy course self-reported more reflective behavior and had more accurate knowledge of media infrastructures.(37) Students, however, self-selected into the course, limiting the strength of the conclusions the researchers could draw. A more rigorous study of a Ukraine-based program aimed at working adults and retirees suggests that such programs can have a small, measurable impact.(38) Researchers have not yet conducted randomized controlled trials of these programs.
Another recent study correlates student self-reports of media literacy learning experiences with the persuasive effect of evidence-based arguments.(39) Students exposed to more media literacy learning experiences were less persuaded by misinformation and more persuaded by evidence-based arguments than students with less exposure.
Other critical thinking and media literacy approaches have focused on educators. In many areas, educators have begun teaching students to analyze patterns in the news that questionable sources may exploit to create a veneer of reliability.(40) Games are another approach, and the game Bad News has players create fake news in an effort to teach them about the common tactics of disinformation campaigns.(41) This game design is different from Newsfeed Defenders — the game that we tested — and relies on the notion of inoculating players to fake news by getting the players to participate directly in the spreading of fake news. Preliminary research suggests that playing the game does make people more skeptical of fake news headlines, at least on an immediate posttest.(42)
It’s important to note that some forms of media literacy can backfire.(43) Fluency is a common psychological effect that describes how hearing or seeing the same claim over and over again reinforces our memory of that claim. Even when it’s framed as being completely untrue, it feels familiar, so it feels true.(44) The debunking literature is full of examples where participants hear a debunked claim, but tend to remember the false information instead of the true information.(45)
Often, students are encouraged to follow a checklist as they evaluate a news article. Checklists, however, may not be enough.(46) Some checks become obsolete (for example, checklists that tell students that “.org” sites are more legitimate than “.com” sites). And there are many ways for websites to meet checklist standards; fake news often mimics real news.
More importantly, research comparing the behavior of professional fact-checkers to students suggests that focusing on features of the article itself may have limited utility.(47) Professional fact-checkers don’t spend much time on the site they’re checking. Rather, they move quickly to other sites that can offer useful perspectives on the site they’re checking.
In the end, it will take individuals learning the practice of critical thinking to stem the spread of fake news.(48) Specifically, people need a healthy skepticism of news sources, knowledge of misinformation techniques, and the willingness to be persuaded by evidence.
By encouraging people to challenge assumptions, interrogate sources, and inhabit other points of view — and by showing how vital these habits are to both the full development of human intellect and to the healthy functioning of democracies — we can mitigate the harm of fake news, even if we can’t stop it entirely.
Do states require media literacy?
Given the growing concern over fake news, we wanted to see whether states required schools to teach important media literacy skills, like detecting bias and disinformation.
So we performed an analysis of the academic standards across the fifty states and the District of Columbia. We looked at whether states had a progression of learning standards that would help students learn how to evaluate content in the media.
While we didn’t require states’ standards to have exact language, we considered if students are expected to: identify bias and point of view; understand tactics intended to persuade a consumer; evaluate the credibility of sources; and analyze information on a topic or event from multiple sources.
As background, every state in the U.S. sets academic standards, expectations of what students should know and be able to do at every grade level. These standards guide local schools’ design of curricula, and students are expected to demonstrate these competencies in the classroom.
We found that almost every state has some language about students learning how to evaluate sources of information. The widespread adoption of the Common Core State Standards (CCSS) in English Language Arts (ELA) appears to be behind this trend. CCSS was a state-led initiative to unify learning standards in English and math, and 41 states and the District of Columbia adopted the ELA standards. These standards contain language related to assessing credibility in media sources.
Most states that did not formally adopt CCSS ELA either used language that was closely similar to CCSS ELA or developed their own standards focused on media literacy.
In addition, many states that adopted CCSS ELA also created supplemental standards to further media literacy education. Some of these standards were the state’s own expansion of the ELA Standards, and some standards were located in other content areas (e.g., Technology, Social Studies, Library Media).(49) These standards had more precise language detailing issues around the use of digital and Internet resources to conduct research.
But some states still lack robust standards. We found that Oklahoma did not have clear standards related to media literacy at every grade level.
Two states have an entire strand titled “media literacy” that is not focused on helping students evaluate the accuracy and bias of media; instead, it focuses on understanding the purposes of specific media forms (e.g., video, audio, speech, book) or techniques for creating and distributing media.
For instance, Missouri has a set of ELA standards titled “Digital and Media Literacy” for students in kindergarten through fifth grade, and these standards focus on “techniques for creating media” and “understanding how communication changes when moving from one genre of media to another.” North Dakota is similar in that its standards described as “media and technology literacy” focus on the creation of media rather than on the evaluation of its messaging and content.(50)
Our sources for this analysis were published documents from state education agencies’ websites, and we collected this information in June and July of 2019.
Of course, requiring media literacy skills and providing effective media literacy education are two different things. As we note above, only one out of every five middle school students reports “often” learning how to judge the reliability of sources in class. A quarter of middle school students report “rarely” learning this skill.
To gain such skills, students must also learn from well-trained teachers, as well as have access to a quality curriculum. Washington and Minnesota lead the states in providing support to educators who teach media literacy. New Jersey and Illinois require lessons for students on how to use social media as early as elementary school.
A number of states are also considering legislation to establish state-level leadership in media literacy.(51) According to Media Literacy Now, Virginia, New York, New Mexico, Massachusetts, Illinois, Hawaii, Colorado, and Arizona have all introduced bills to create media literacy advisory councils, task forces that would monitor the design and implementation of media literacy education statewide.
Massachusetts offers a unique model of media literacy education because the state’s standards have a special focus on 21st-century news media.(52) Massachusetts adopted CCSS and added media literacy standards in social studies that explore five topics, including the challenges of news and media literacy in contemporary society and the gathering and reporting of information using digital media.
In the Bay State, high school students are also expected to learn how technology has impacted the dissemination of information, how online journalism and social media networks have affected news consumers, and how various methods (including websites like FactCheck.org) can be used to discern the legitimacy of news reporting.
While few states have established the detection of fake news as a necessary performance standard, Massachusetts sets an important precedent for how states can ensure this essential skill is taught in the classroom.
Governments can play an important role in supporting media literacy and a healthy public sphere.
Governments have many tools at their disposal to combat fake news. Many governments have attempted to combat the fake news problem directly, often by applying state power against fake news creators. Italy has created an online portal where citizens can report misinformation to the police.(53) Pakistan’s Ministry of Information and Broadcasting started a Twitter account that identifies fake news and refers incidents to the authorities.(54)
Indonesia, which has faced many controversies due to the spread of fake news on popular social media networks there, recently unveiled plans to hold weekly briefings on fake news in an effort to combat the spread of disinformation, devoting 70 government employees to fact-checking and responding to news articles.(55) Perhaps the most extreme government reaction to misinformation has been to shut down the internet completely. After rumors shared on WhatsApp, India’s most widely used social media platform, sparked riots, the Indian government shut down the internet on numerous occasions.(56)
These direct approaches often face criticism. South Korea’s effort to fight fake news has sparked concerns about censorship among those critical of the government.(57) So have the Egyptian government’s efforts to attack the fake news problem. Two issues are central to these criticisms.(58)
First, government assessments of fake news are bound to be suspect; the government is not an objective third party. Second, applying state power directly against fake news purveyors necessarily means restricting speech. Why should citizens trust the government to get it right, and not, say, use this power as a way of attacking political opponents? Furthermore, when government programs make mistakes, they can undermine trust in government.(59)
But governments have at least three other avenues for confronting fake news: collaborating with technology companies to stem the spread of fake news, supporting public and local broadcasting, and supporting media literacy education programs.
Governments can also contribute to the fight against fake news online by strengthening and enforcing online privacy laws. The majority of fake news is produced and disseminated for advertising revenue. It takes advantage of the structure of social media to serve personalized content to users that confirms their existing views.(60) Limiting the access to and use of private data by social media and other tech companies — and, more generally, by giving users greater control of their private data — could, in principle, also limit the extent to which fake news can effectively target and manipulate users.
Governments can also play a key role in supporting media literacy programs directly. In Brazil, the government made media analysis studies compulsory for students beginning in December 2017.(61) Finland has developed media literacy programs for residents, students, journalists, and even politicians.(62)
These programs aim to teach the skills that professional fact-checkers use. Several factors seem to contribute to the programs’ success: Finland’s high literacy rate, its prior experience in combating disinformation, strong public and regional news sources, and a strong national identity linked to the rule of law.
One effective action that governments can take without imperiling freedoms of speech and the press is to help raise awareness around fake news and support improved media literacy. Belgium, for example, has adopted a less authoritarian approach to engaging the public, launching a website to inform people about misinformation and a forum where users can upvote or downvote solutions proposed by the government.(63)
Governments can support the development of media literacy curricula, bring media literacy tools and courses into schools, and make media literacy programs available to adult learners on the internet and in public spaces like libraries. While Italy applied direct pressure to fake news producers, it also developed educational programs to promote media literacy. Finland’s government has been developing media literacy programs since at least 2014.(64)
Strong, independent media organizations that work in the public interest are also vital in the fight against fake news. In part because of Facebook and Google’s cornering of the online advertising market, media outlets are struggling to find business models that sustain in-depth, public-interest journalism.(65) Governments can support the development and growth of such organizations in several ways.(66) They can directly fund them, offer tax deductions, and provide beneficial legal forms for organizations dedicated to social good.
Fact-checking to the rescue?
Over the past several years, fact-checking resources have proliferated. But the idea that the mere presence of corrective information will stem belief in fake news is an overly simplistic one. While there is little doubt that fact-checking plays a vital role in correcting the public record, whether people engage critically with the material (or even realize claims have been fact-checked) is a separate question.
What’s more, fact-checking has its own set of issues, and a number of factors limit the effectiveness of fact-checking efforts, from how efficiently facts can be checked to whether the fact-checks actually persuade people.
Traditionally, experts or journalists check facts: they interpret the claim, apply their knowledge or do some research, and verify whether the claim is true. But this can be a slow process, so journalists and computer scientists have come up with new forms of fact checking.(67)
One approach uses crowdsourcing: news consumers become de facto fact-checkers by fact-checking a specific article or source; then these results are aggregated and shared.
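As an illustration of that aggregation step, the sketch below shows one very simple way a platform might combine individual readers’ accuracy judgments into a single label for an article; the vote labels, the 25-vote minimum, and the 50 percent threshold are assumptions for the example, not a description of any real system.

# Minimal sketch of crowdsourced fact-checking aggregation: readers flag an
# article as accurate or inaccurate, and the platform reports an aggregated
# label once enough votes arrive. Labels and thresholds are illustrative.
from collections import Counter

def aggregate(votes, min_votes=25):
    """votes: list of 'accurate' / 'inaccurate' labels from readers."""
    if len(votes) < min_votes:
        return "insufficient data"
    counts = Counter(votes)
    share_inaccurate = counts["inaccurate"] / len(votes)
    return "disputed" if share_inaccurate >= 0.5 else "not flagged"

print(aggregate(["inaccurate"] * 30 + ["accurate"] * 10))  # -> "disputed"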
Another approach is to use automated fact-checking methods, drawing on data sets curated by experts.(68) There are several ways to do this, each with benefits and drawbacks. One way is to have automated fact-checkers mimic human fact-checkers: the algorithm identifies claims that might be checked, and then goes about checking them.
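To picture the claim-matching variant at its simplest, the sketch below compares an extracted claim against a small, curated set of already-checked claims and reuses the verdict of the closest match. The example dataset, the string-similarity measure, and the threshold are illustrative assumptions; real systems rely on much more sophisticated natural language processing.

# Naive illustration of automated claim matching: compare an extracted claim
# against previously fact-checked claims and return the verdict of the
# closest match. The data and threshold are invented for illustration.
from difflib import SequenceMatcher

CHECKED_CLAIMS = {
    "the government has released a cure for cancer": "false",
    "a border agent detained two americans for speaking spanish": "true",
}

def check_claim(claim, threshold=0.6):
    claim = claim.lower()
    best_verdict, best_score = None, 0.0
    for known, verdict in CHECKED_CLAIMS.items():
        score = SequenceMatcher(None, claim, known).ratio()
        if score > best_score:
            best_verdict, best_score = verdict, score
    return best_verdict if best_score >= threshold else "unverified"

print(check_claim("Government releases the cure for cancer"))  # likely "false" here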
Another factor that influences the efficacy of fact-checking is the thoroughness and transparency of the process itself. The fact-checking process requires having standard verification practices and developing a classification scheme. Often, claims are not merely honest truths or bald-faced lies; they are exaggerations, omissions, and mischaracterizations. The fact-checking website Politifact.com, for example, rates claims on a range from “pants-on-fire” lies to true statements, with several gradations in between.(69)
To encourage fact-checking organizations to meet high standards, the Poynter Institute created an international code of principles for fact-checking organizations to commit to, which ensures that the process is transparent, non-partisan, and ethical.(70)
Technology companies must do more to promote better and more critical forms of engagement on their platforms.
Online social media platforms are the main channel for the distribution of false and misleading news. Over 40 percent of the traffic to fake news comes from social media networks, while only 10 percent of the traffic to legitimate news sites comes from social media.(71) Companies have designed these platforms to maximize user engagement. As currently designed, they are not healthy forums for informing citizens or hosting public debate.
News, advertising, and personal communication remain integrated on these platforms, making it difficult to know the source of new information. News stories and political commentary are posted and shared in the same way as personal news, photos, and videos. In this way, social networks not only amplify disinformation; they also make it appear reliable, since it is implicitly endorsed by someone you know.
Psychological effects can further exacerbate the problem. People tend to interpret information in a way that conforms to their pre-existing views. They also tend to seek out information that conforms to their pre-existing views. Evidence suggests that these tendencies, combined with the platform’s own incentive to provide content that users will like, lead to “echo chambers” on social media platforms. The presence and extent of these echo chambers, however, depends on the nature of the topic under discussion.(72)
Several efforts are underway to combat these effects and lessen the impact of fake news and of disinformation more generally.
Companies can, for example, identify, marginalize, and even remove fake news content. Facebook has collaborated with third-party fact-checkers and other groups, and created an in-house “war room” to identify and reduce the influence of disinformation during elections.(73) They have also specifically targeted disinformation that leads to violence, moved questionable stories down in the News Feed rankings, and taken steps to make political advertising on the platform more transparent.(74)
Facebook’s rhetoric regarding these changes, however, does not necessarily match its actions. Many remain skeptical of Facebook’s transparency, commitment, and capacity to fight fake news, and fear the consequences of leaving an issue with such serious society-wide stakes in the hands of a private corporation.(75)
Leaders at the company seem doubtful that current efforts are a long-term solution. Facebook founder and CEO Mark Zuckerberg has characterized the company’s recent efforts to identify false information and reduce its influence as an “arms race,” suggesting that the changes Facebook makes will be met with further sophistication by those promoting false or misleading content.(76)
Another movement from inside Silicon Valley seeks more fundamental changes to the structure of our online lives. The Center for Humane Technology — founded by a former “design ethicist” at Google — has explored ways to change industry business models to “align with our own humanity.”(77)
One first step along these lines is to redefine user engagement, the measure that social media platforms try to optimize. User engagement has typically been measured as the total amount of time spent on the platform, leading to concerns about promoting extremism. Facebook itself has taken to emphasizing “time well spent” on the platform, which in practice means prioritizing relationships over engagement with media sources and businesses.(78) In principle, this model would reduce the prevalence of fake news, which is highly engaging and consumable but hardly time well spent. But critics warn that such changes are superficial at best and simply provide opportunities for companies to collect more personal — and therefore more valuable — data on their users.(79)
Third parties have also begun to develop technology to fight fake news. The NewsGuard app, for instance, categorizes websites using a color system: green indicates that the site follows basic standards of accuracy and accountability, while red signals the opposite.(80) But the practical impact of third-party software — which primarily appeals only to those already concerned with fake news and reliable sourcing — is unclear. Further research could test the effectiveness of such efforts.
What social media platforms should do to combat fake news is not always clear. A reasonable first step would be to sincerely commit to reducing the distribution of false information. Facebook, for example, even after so much criticism, remains committed to letting Holocaust denial proliferate on the platform.(81) Is reading and sharing such material “time well spent?” The idea that social media companies should either do nothing (for fear of quashing free speech) or outright ban accounts and remove material, however, is a false dichotomy.
Recent research on the spread of fake news through Twitter and Facebook suggests several plausible options. On Twitter, a small handful of accounts share a tremendous amount of fake news (along with other political and non-political links). Researchers suspect that these accounts are “cyborgs” — accounts that mix automated and human-created content. Targeting these accounts for further scrutiny is one option. Another is to adopt policies that restrict sharing: in one simulation, capping shares of political URLs at 20 per day reduced fake news content by 32 percent.(82)
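The logic of a sharing cap is easy to see in a toy simulation: because fake news sharing is concentrated among a small number of hyperactive accounts, a per-day cap disproportionately reduces their output. The sketch below illustrates the idea with invented numbers; it is not the simulation cited above and should not be read as reproducing its 32 percent figure.

# Toy simulation of a per-day cap on shared political URLs. Sharing is
# concentrated in a few very active accounts, so a cap hits them hardest.
# All numbers are invented for illustration only.
import random

random.seed(1)
CAP = 20

def simulate(n_users=10_000, n_heavy=10):
    total, capped = 0, 0
    for i in range(n_users):
        # A handful of "cyborg"-like accounts share hundreds of political
        # URLs a day; everyone else shares very few.
        shares = random.randint(100, 400) if i < n_heavy else random.randint(0, 3)
        total += shares
        capped += min(shares, CAP)
    return total, capped

total, capped = simulate()
print(f"share volume removed by the cap: {100 * (1 - capped / total):.0f}%")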
Social media networks could also focus on monitoring the relatively small and increasingly well-defined audience for fake news. Fact-checking warnings, which accompany headlines disputed by third-party fact-checkers, can slightly increase overall skepticism toward fake news, but they do not appear to be a full solution.(83) These platforms can also further critical engagement by promoting sincere dialogue and by linking to relevant resources with more information.(84)
Techniques of Savvy News Consumers
A study recently tested university students, history professors, and professional fact-checkers in their ability to distinguish fake from legitimate news.(85)
In one test, 100 percent of professional fact-checkers were able to figure out that a purportedly non-partisan website was backed by an industry-supported PR firm. Only 60 percent of professional historians and 40 percent of university students were able to do the same.
Professional fact-checkers exhibited key behaviors that the students and historians did not: they checked multiple sources, often laying references side by side to evaluate them.
Below we outline some tips for spotting fake news:(86)
Avoid relying on a single source of information. The best news consumers make a habit of corroborating information by checking it against other reliable sources. This is even true of information shared by someone you know, or that seems to come from a legitimate source.
Resist clicking on initial links. The best news consumers look at the landscape of search results before carefully deciding which paths to follow. Many websites are designed to get you to click on links automatically.
Familiarize yourself with common fake news tactics. Students in Finland’s media literacy programs explore how fake news leverages emotional reactions. The Bad News game covers five others: trolling, leveraging conspiracy theories, impersonating legitimate sources, discrediting critics, and polarizing readers. Often, fake news stories rely on doctored photos or videos; reverse image searching through Tineye or Google can help spot reused photos.
Make arguments more precise. Another way to combat fake news is to paraphrase or summarize the arguments being made. This helps reveal gaps between the purported evidence and the claims being made.
Recognize material designed to persuade. Media literacy programs also teach students to distinguish between paid advertisements and news sources trying to be as objective as possible. The online environment often blurs the distinction, making company press releases or clickbait articles seem like objective news.
Conclusion
Finding quality sources online is a monumental challenge, mostly because of the continual proliferation of information sources and news aggregators, both legitimate and illegitimate. Because fake news, in particular, seems to be a plague in need of a cure — and serves as an extreme example of the sourcing challenge — many recent organizational efforts focus specifically on this phenomenon.
While such efforts will certainly play an important role, the fight against disinformation on social media platforms will likely take time and may remain unresolved in the long run. Facebook founder and CEO Mark Zuckerberg has characterized the company’s recent efforts to identify false information and reduce its influence as an “arms race.”(87) The changes Facebook makes to its algorithms and oversight practices can be countered by new strategies for creating and promoting false or misleading content.
It is not just tech firms or governments that need to concern themselves with detecting and discarding fake news, but news consumers as well. Happily, the basic skills involved in dealing with fake news are applicable to information-gathering in general. In fact, long before the term “fake news” became de rigueur — and even before news was consumed online — forward-thinking educators were decrying the lack of “media literacy” among multiple generations in American society.
This report, then, recommends strong, independent media organizations that adhere to high journalistic standards — and social media platforms that privilege quality engagement over advertising revenue. But above all, the solution to fake news comes down to savvy news consumers. More engaged critical thinkers can stem the fake news crisis and strengthen democracies around the world.
Appendix
Media Literacy Organizations and Projects
NewsWise is a news literacy program for primary school children based in the UK. It provides news literacy education resources, experiences, and support for teachers aligned with the UK’s national curriculum. The program is a collaboration between the Guardian Foundation, the National Literacy Trust, the PSHE Association, and Google.
Young Reporter is a media literacy program aimed at middle schoolers in the UK developed by the BBC. It offers mentoring from professional journalists as well as online resources to educators and students around the country.
Entre Les Lignes (Between the Lines) is a network of journalists, photographers, and other members of the media who organize and lead workshops for students in France. The organization teaches students about journalism, social media, and internet misinformation, and it influenced the development of a governmental program to educate the youth of France. It was founded in Lyon in 2010.
The Poynter Institute is a main sponsor of media literacy research. They regularly issue reports on fake news and disinformation campaigns, as well as provide training for students and practicing journalists.
First Draft is an international collaboration of journalism, human rights, and tech organizations, focused on finding ways to streamline the verification of textual and visual information. Their “Field Guide to ‘Fake News’” offers “a range of methods and procedures which readers may use in order to explore fake news phenomena online for themselves.”
The Center for News Literacy is part of The State University of New York at Stony Brook. It teaches students of all ages how to use critical thinking skills to evaluate the news. They have developed curricula, lesson plans, and other resources for teachers and students to use in the classroom.
News Literacy Project is an organization that provides numerous resources to educators who want to lead media literacy programs. Curricula developed by the News Literacy Project have been used in Philadelphia schools. The project is supported by the John S. and James L. Knight Foundation.
The Computational Propaganda Project
is an academic collaboration that investigates how algorithms, automation, and computational propaganda are used to manipulate public opinion. The scientists involved with the project are sharing their findings via research publications and news agencies covering the practices.
Media Literacy Now is an organization that advocates for media literacy programs in all states and all schools. They also provide educational resources for students and educators.
Center for Media and Information Literacy is an organization that promotes media literacy by developing educational resources and lobbying for policy change. It’s administered through Temple University.
Common Sense Media is an organization dedicated to helping parents, teachers, and children use media wisely. They have a digital citizenship program that includes media literacy.
The Center for Media Literacy is an organization that provides professional development and educational services to promote media literacy in schools. They offer a MediaLit Kit meant to help districts and schools implement media literacy programs.
Faktabaari (FactBar in English) is a Finnish fact-checking service that has fact-checked both Finnish and European elections. It has also created educational materials and worked closely with the Finnish government to develop media literacy programs.
Fact-Checking Organizations
PolitiFact is a website that fact checks information and labels claims accordingly. It’s owned by the Poynter Institute, a non-profit dedicated to issues in media literacy.
FactCheck.org is a website that focuses on fact-checking claims made by politicians. A number of related websites focus on slightly different topics (popular claims made on Facebook, popular scientific claims, etc.). It’s run out of the Annenberg Public Policy Center at the University of Pennsylvania.
Fiskkit is a crowdsourced fact-checking and discussion platform that lets users tag sentences in articles and discuss them, aggregating the tags into statistics about the article as a whole. It’s also being developed into a classroom application.
Snopes is an extremely well-established fact-checking site that often addresses conspiracy theories and fake news.
StopFake is a project that originated in Ukraine to counter Russian propaganda and fake news, and it has since expanded into other countries. It maintains several active Twitter accounts.
Disinfo Portal is a website focused on dissecting Russian disinformation campaigns.
Boomlive is a fact-checking organization in India.
Misinfocon is a loose organization dedicated to addressing online trust, fact-checking, and misinformation.
Pagella Politica is a program in Italy to fight fake news.
Crosscheck is a project funded in part through Google (via Google Labs) and run by First Draft that fact-checked news items during the French presidential elections. Claims are user-submitted,
crowd-sourced, and checked by several organizations in France that collaborate with each other.
Educational and Fact-Checking Apps
NewsGuard is a browser plug-in that lets users assess the credibility of visited media outlets. Websites are categorized using a color system: green indicates that the site follows basic standards of accuracy and accountability, while red signals the opposite. NewsGuard also includes a more detailed explanation of why each site received its rating.
NewsFeed Defenders is an educational game created by iCivics, a non-profit founded by Justice Sandra Day O’Connor, and the Annenberg Public Policy Center. Players manage a social media newsfeed to combat disinformation.
Fakey is an app and an online game that tests news literacy by challenging users to fact-check stories.
Hoaxy is a search engine that lets users visualize how stories from low-credibility sources spread on Twitter. It was developed by researchers at Indiana University, who describe how the tool was built and how it works in a study entitled “Anatomy of an online misinformation network.”
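The diffusion networks a tool like this visualizes can be thought of as a directed graph of who amplified whom. The toy sketch below builds such a graph from invented retweet records; it does not call Hoaxy or the Twitter API.

```python
from collections import defaultdict

# Hypothetical retweet records for one story: (source_account, amplifier) pairs.
retweets = [
    ("low_cred_outlet", "user_a"),
    ("low_cred_outlet", "user_b"),
    ("user_a", "user_c"),
    ("user_a", "user_d"),
]

def build_graph(edges):
    """Adjacency list mapping each account to the accounts that amplified it."""
    graph = defaultdict(list)
    for source, amplifier in edges:
        graph[source].append(amplifier)
    return graph

for source, amplifiers in build_graph(retweets).items():
    print(f"{source} -> {', '.join(amplifiers)} ({len(amplifiers)} amplifications)")
```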
Snipfeed is an AI-based recommendation system that breaks original content into chat messages, images, videos, and quizzes. Each snip lasts just eight seconds, but the AI adds lessons and stories that let users dig deeper into topics and do some “fact checks.”
Verifact is a voting platform currently in development that rewards users for voting on the accuracy of claims, news companies, and even individual journalists. The idea is to incentivize users to vote for what is true by having them stake money on the veracity of the claim.
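The staking idea can be made concrete with a toy payout rule. The sketch below is one possible mechanism, invented here for illustration and not Verifact’s actual design: users who vote with the eventual verdict split the stakes of those who voted against it, in proportion to what they risked.

```python
# Hypothetical votes on one claim: (user, verdict, stake). The payout rule is
# an illustrative mechanism only, not Verifact's actual design.
votes = [
    ("alice", True, 10.0),
    ("bob", True, 5.0),
    ("carol", False, 8.0),
]

def settle(votes, true_verdict):
    """Winners recover their stake plus a pro-rata share of the losing pool."""
    winners = [(user, stake) for user, verdict, stake in votes if verdict == true_verdict]
    if not winners:
        return {}
    losing_pool = sum(stake for _, verdict, stake in votes if verdict != true_verdict)
    winning_pool = sum(stake for _, stake in winners)
    return {user: round(stake + losing_pool * (stake / winning_pool), 2)
            for user, stake in winners}

print(settle(votes, true_verdict=True))
# {'alice': 15.33, 'bob': 7.67}: carol's 8.0 stake is split 2:1 between the winners.
```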
Media Bias/Fact Check (MBFC News) is a news organization that has developed several apps aimed at combating fake news.
ZenMate SafeSearch is an extension for Chrome that claims to identify sites that spread fake news and malicious content. How it does so is not transparent. ZenMate’s main product is a VPN service.
NewsCracker is a Chrome extension and Twitter feed that algorithmically rates news articles, attempting to predict the likelihood that the article represents fake news.
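Automated article rating of this general kind works by scoring features of the text and source. The sketch below uses a few crude, hand-picked heuristics purely to show the shape of such a scorer; it bears no relation to NewsCracker’s actual model or features.

```python
import re

def suspicion_score(headline, body, source_has_masthead):
    """Toy heuristic score: 0 = likely fine, 1 = highly suspect (illustration only)."""
    score = 0.0
    if headline.isupper() or "!!" in headline:
        score += 0.3  # shouty, all-caps headlines
    if re.search(r"\byou won't believe\b", headline, re.IGNORECASE):
        score += 0.3  # clickbait phrasing
    if not source_has_masthead:
        score += 0.2  # no named editorial staff
    if len(body.split()) < 150:
        score += 0.2  # very thin reporting
    return min(score, 1.0)

print(suspicion_score("YOU WON'T BELIEVE THIS CURE!!", "Short viral text.", False))
```

A production system would learn weights of this sort from labeled articles rather than hand-tuning them.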
FactPopUp is a Chrome extension that draws on PolitiFact research to show relevant fact checks in real time. It’s put out by the Reporters’ Lab, a center for journalism research at Duke University.
FactStream is an iTunes app that draws on three sources (the Washington Post, PolitiFact, and FactCheck.org) for fact-checking claims in real time. It’s also put out by the Reporters’ Lab.
Fake Tweet Buster is a web tool that identifies tweets and users promoting fake news through a mixture of reverse-image searches, user analysis, and crowdsourcing. It doesn’t seem to have been released yet.
Factitious is a game in which players try to spot fake news. After players judge whether an article is fake news, the game tells them whether they were right. It’s also being developed as a classroom tool.
Bad News is a game put out by DROG, an interdisciplinary team of academics and journalists, aimed at “inoculating” people to fake news. Players take on the role of a fake news purveyor and try to drum up false controversies while engaging in common fake news tactics, such as attacking fact checkers for bias, and playing on people’s fears and anxieties.
Checkology is a virtual classroom that aims to help students tell the difference between fact and fiction. It is used as an educational tool by thousands of teachers in the United States, as well as in over 110 other countries. Checkology is produced by the News Literacy Project.
Media Literacy Strands in Common Core State Standards, English Language Arts
(1)* White, A. (2017, November 17). Fake news: How the business of the digital age threatens democracy. Ethical Journalism Network. Retrieved from https://ethicaljournalismnetwork.org
(2)* Varol, O., Ferrara, E., Davis, C. A., Menczer, F., & Flammini, A. (2017, May). Online human-bot interactions: Detection, estimation, and characterization. In Eleventh international AAAI conference on web and social media.
(3)* Allcott, H., & Gentzkow, M. (2017). Social media and fake news in the 2016 election. Journal of Economic Perspectives, 31(2), 211-36.
(4)* Swift, A. (2019, May 24). Americans’ trust in mass media sinks to new low. Gallup. Retrieved from https://news.gallup.com
(5)* Tufekci, Z. (2018, March 10). YouTube, the great radicalizer. The New York Times. Retrieved from https://www.nytimes.com; Nicas, J. (2018, February 07). How YouTube drives people to the Internet’s darkest corners. The Wall Street Journal. Retrieved from https://www.wsj.com
(6)* White, A. (2017, November 17). Fake news: How the business of the digital age threatens democracy. Ethical Journalism Network. Retrieved from https://ethicaljournalismnetwork.org
(7)* Brattberg, E., & Maurer, T. (2018). Russian election interference: Europe’s counter to fake news and cyber attacks (Vol. 23). Carnegie Endowment for International Peace.
(8)* Cook, J. (2018, June 28). Technology helped fake news. Now technology needs to stop it. Bulletin of the Atomic Scientists. Retrieved from https://thebulletin.org
(9)* Preliminary research on other games, however, has demonstrated the potential of game experiences to influence people’s judgments of fake news. Roozenbeek, J., & van der Linden, S. (2019). Fake news game confers psychological resistance against online misinformation. Palgrave Communications, 5(1). doi: 10.1057/s41599-019-0279-9
(10)* Turk, Ž. (2018). Technology as enabler of fake news and a potential tool to combat it. European Parliament.
(11)* The National Assessment of Educational Progress. (2018). The nation’s report card: 2018 Technology and Engineering Literacy student questionnaire Grade 8. Retrieved from https://www.nationsreportcard.gov
(12)* Foley, R. J. (2017, December 31). Efforts grow to help students evaluate what they see online. AP News. Retrieved from https://apnews.com
(13)* Relihan, T. (2018, December 19). Social media advertising can boost fake news — or beat it. MIT Management Sloan School. Retrieved from https://mitsloan.mit.edu
(14)* Mosseri, A. (2017, April 7). Working to stop misinformation and false news. Facebook Newsroom. Retrieved from https://newsroom.fb.com
(15)* Martens, B., Aguiar, L., Gomez-Herrera, E., & Mueller-Langer, F. (2018). The digital transformation of news media and the rise of disinformation and fake news-An economic perspective. Digital Economy Working Paper 2018-02.
(16)* Wendling, M. (2018, January 22). The (almost) complete history of ‘fake news’. BBC News. Retrieved from https://www.bbc.com
(17)* Silverman, C. (n.d.). The Facebook dilemma [Interview by J. Jacoby]. Public Broadcasting Service. Retrieved from https://www.pbs.org
(18)* de Cock Buning, M., Allen, R., Bargaoanu, A., Bechmann, A., Curran, N., Dimitrov, D., … & Goyens, M. (2018). A multi-dimensional approach to disinformation. European Commission. Retrieved from https://ec.europa.eu/digital-singlemarket/en/news/final-report-high-level-expert-group-fake-news-and-online-disinformation
(19)* Vosoughi, S., Roy, D., & Aral, S. (2018). The spread of true and false news online. Science, 359(6380), 1146-1151.
(20)* Kang, C., & Goldman, A. (2016, December 05). In Washington pizzeria attack, fake news brought real guns. The New York Times. Retrieved from https://www.nytimes.com
(21)* Gabielkov, M., Ramachandran, A., Chaintreau, A., & Legout, A. (2016). Social clicks: What and who gets read on Twitter? ACM SIGMETRICS Performance Evaluation Review, 44(1), 179-192; Fletcher, R., Cornia, A., Graves, L., & Nielsen, R. K. (2018). Measuring the reach of “fake news” and online disinformation in Europe. Reuters Institute Factsheet.
(22)* Kalla, J. L., & Broockman, D. E. (2018). The minimal persuasive effects of campaign contact in general elections: Evidence from 49 field experiments. American Political Science Review, 112(1), 148-166.
(23)* Grinberg, N., Joseph, K., Friedland, L., Swire-Thompson, B., & Lazer, D. (2019). Fake news on Twitter during the 2016 U.S. presidential election. Science, 363(6425), 374-378.
(24)* Guess, A., Nagler, J., & Tucker, J. (2019). Less than you think: Prevalence and predictors of fake news dissemination on Facebook. Science Advances, 5(1). doi: 10.1126/sciadv.aau4586
(25)* Fazio, L. (2018, March 30). Why you stink at fact-checking. The Conversation. Retrieved from http://theconversation.com
(26)* Fazio, L. K., Brashier, N. M., Payne, B. K., & Marsh, E. J. (2015). Knowledge does not protect against illusory truth. Journal of Experimental Psychology: General, 144(5), 993.
(27)* Pennycook, G., Cannon, T. D., & Rand, D. G. (2018). Prior exposure increases perceived accuracy of fake news. Journal of Experimental Psychology: General.
(28)* Pennycook, G., & Rand, D. G. (2019). Lazy, not biased: Susceptibility to partisan fake news is better explained by lack of reasoning than by motivated reasoning. Cognition, 188, 39-50.
(29)* Media Literacy Now. (2019, May 27). Retrieved from https://medialiteracynow.org
(30)* Fake News: How to Spot It. (n.d.). Retrieved from https://www.prattlibrary.org; Video: Spotting Fake News. (2016, December 08). Retrieved from https://www.factcheck.org; NewsFeed Defenders. (2019, March). Retrieved from https://www.icivics.org
(31)* Civics. (n.d.). Retrieved from https://www.icivics.org; Research and engagement that matter. (n.d.). Retrieved from https://www.annenbergpublicpolicycenter.org
(32)* Guess, A., Nagler, J., & Tucker, J. (2019). Less than you think: Prevalence and predictors of fake news dissemination on Facebook. Science Advances, 5(1). doi: 10.1126/sciadv.aau4586
(33)* Bulger, M., & Davison, P. (2018). The promises, challenges, and futures of media literacy. Data & Society; Legislative Activity Across the Country. (2019, February 22). Retrieved from https://medialiteracynow.org
(34)* Atkins, L. (2017, July 13). States should require schools to teach media literacy to combat fake news. HuffPost. Retrieved from https://www.huffpost.com
(35)* Kavanagh, J., & Rich, M.D. (2018). Truth decay: An initial exploration of the diminishing role of facts and analysis in American public life. Rand Corporation. Retrieved from https://www.rand.org/pubs/research_reports/RR2314.html
(36)* Digital Resource Center. (n.d.). Digital Resource Center: Stony Brook Center for news literacy. Retrieved from https://digitalresource.center; The News Literacy Project. (n.d.). Retrieved from https://newslit.org
(37)* Maksl, A., Craft, S., Ashley, S., & Miller, D. (2017). The usefulness of a news media literacy measure in evaluating a news literacy curriculum. Journalism & Mass Communication Educator, 72(2), 228-241.
(38)* Guernsey, L. (2018, May 09). A new program brings better media literacy to Ukraine. Could it work in the U.S.? Slate. Retrieved from https://slate.com
(39)* Kahne, J., & Bowyer, B. (2017). Educating for democracy in a partisan age: Confronting the challenges of motivated reasoning and misinformation. American Educational Research Journal, 54(1), 3-34.
(40)* Ahn, C. (2019, February 5). How analyzing patterns helps students spot deceptive media. The Conversation. Retrieved from http://theconversation.com
(41)* Roozenbeek, J., & van der Linden, S. (2019). The fake news game: actively inoculating against the risk of misinformation. Journal of Risk Research, 22(5), 570-580.
(42)* Roozenbeek, J., & van der Linden, S. (2019). Fake news game confers psychological resistance against online misinformation. Palgrave Communications, 5(1), 12.
(43)* Bulger, M., & Davison, P. (2018). The promises, challenges, and futures of media literacy. Data & Society. Retrieved from https://datasociety.net/output/the-promises-challenges-and-futures-of-media-literacy/
(44)* Pennycook, G., Cannon, T. D., & Rand, D. G. (2018). Prior exposure increases perceived accuracy of fake news. Journal of Experimental Psychology: General.
(45)* Cook, J., & Lewandowsky, S. (2011). The debunking handbook. Sevloid Art.
(46)* Breakstone, J., McGrew, S., Smith, M., Ortega, T., & Wineburg, S. (2018). Why we need a new approach to teaching digital literacy. Phi Delta Kappan, 99(6), 27-32.
(47)* Breakstone, J., McGrew, S., Smith, M., Ortega, T., & Wineburg, S. (2018). Why we need a new approach to teaching digital literacy. Phi Delta Kappan, 99(6), 27-32.
(48)* Bouygues, H. (2018, October 26). The best defense against fake news. Reboot Foundation. Retrieved from https://rebootfoundati.wpengine.com
(49)* “Library Media” is a content area where states have a formal school library program for students to learn how to access various information sources (e.g., books, websites, videos, historical artifacts) in library settings. “Social Studies” is an umbrella content area that includes History, Civics, United States History, and Social Science.
(50)* Missouri Department of Elementary and Secondary Education (2016). K-5 ELA Missouri learning standards. Retrieved from https://dese.mo.gov/sites/default/files/curr-mls-standards-ela-k-5-sboe-2016.pdf; North Dakota State Government (2012, December). North Dakota Library and Technology content standards Grade K-12. Retrieved from https://www.nd.gov/dpi/districtsschools/k-12-education-content-standards
(51)* Your State Legislation. (2019, February 22). Retrieved from https://medialiteracynow.org
(52)* Massachusetts Department of Elementary and Secondary Education (2018). History and Social Science Framework. Retrieved from http://www.doe.mass.edu/frameworks/hss/2018-12.pdf
(53)* Serhan, Y. (2018, February 26). Italy scrambles to fight misinformation ahead of its elections. The Atlantic. Retrieved from https://www.theatlantic.com
(54)* FakeNews_Buster. (2018, October). Retrieved from https://twitter.com/FakeNews_Buster
(55)* Handley, L. (2018, September 27). Indonesia’s government is to hold public fake news briefings every week. CNBC. Retrieved from https://www.cnbc.com; Informatika, S. Welcome – Stop Hoax. (n.d.). Retrieved from https://stophoax.id
(56)* Burgess, M. (2019, January 28). To fight fake news on WhatsApp, India is turning off the internet. Wired UK. Retrieved from https://www.wired.co.uk
(57)* Sang-hun, C. (2018, October 02). South Korea declares war on ‘fake news,’ worrying government critics. The New York Times. Retrieved from https://www.nytimes.com
(58)* Magdy, S. (2018, September 17). Egypt says it fights fake news, critics see new crackdown. Associated Press News. Retrieved from https://www.apnews.com
(59)* About – EU vs Disinformation campaign. (2015). Retrieved from https://euvsdisinfo.eu
(60)* How private information helps fake news hoodwink the public. (2018, October 11). Retrieved from https://knowledge.wharton.upenn.edu
(61)* Brazil fighting fake news in the classroom. (2018, July 13). The Star Online. Retrieved from https://www.thestar.com.my/tech/tech-news/2018/07/13/brazil-fighting-fake-news-in-the-classroom
(62)* Mackintosh, E. (2019, May). Finland is winning the war on fake news. Other nations want the blueprint. CNN. Retrieved from https://edition.cnn.com
(63)* Belgium – Stop Fake News. (n.d.). Retrieved from https://www.stopfakenews.be/
(64)* Mackintosh, E. (2019, May). Finland is winning the war on fake news. Other nations want the blueprint. CNN. Retrieved from https://edition.cnn.com
(65)* Lynn, B. (2018, July 26). Google and Facebook are strangling the free press to death. Democracy is the loser. The Guardian. Retrieved from https://www.theguardian.com
(66)* Dodd, A. (2018, September 19). Should governments provide funding grants to encourage public interest journalism? The Conversation. Retrieved from https://theconversation.com
(67)* Shu, K., Sliva, A., Wang, S., Tang, J., & Liu, H. (2017). Fake news detection on social media: A data mining perspective. ACM SIGKDD Explorations Newsletter, 19(1), 22-36.
(68)* Wang, W. Y. (2017). “Liar, Liar Pants on Fire”: A new benchmark dataset for fake news detection. Proceedings of the 55th Annual Meeting of the Association for Computational Linguistics (Volume 2: Short Papers). doi: 10.18653/v1/p17-2067
(69)* Fact-checking U.S. politics | PolitiFact. (n.d.). Retrieved from https://www.politifact.com/
(70)* Commit to transparency – sign up for the International Fact-Checking Network’s code of principles. (2019). Retrieved from https://www.ifcncodeofprinciples.poynter.org/
(71)* Allcott, H., & Gentzkow, M. (2017). Social media and fake news in the 2016 election. Journal of Economic Perspectives, 31(2), 211-36.
(72)* Barberá, P., Jost, J. T., Nagler, J., Tucker, J. A., & Bonneau, R. (2015). Tweeting from left to right: Is online political communication more than an echo chamber? Psychological Science, 26(10), 1531-1542.
(73)* Ananny, M. (2018). Checking in with the Facebook fact-checking partnership. Columbia Journalism Review.; Frenkel, S., & Isaac, M. (2018, September 19). Inside Facebook’s election ‘war room’. The New York Times. Retrieved from https://www.nytimes.com
(74)* Castillo, M. (2018, July 18). Facebook will begin taking down fake news intended to encourage violence. CNBC. Retrieved from https://www.cnbc.com; Isaac, M. (2016, December 15). Facebook mounts effort to limit tide of fake news. The New York Times. Retrieved from https://www.nytimes.com; Zuckerberg, M. (2018, April 7). With important elections coming up in the US, Mexico, Brazil, India, Pakistan and more countries in the next year, one of my top priorities for 2018 is making sure we support positive discourse and prevent interference in these elections…[Facebook Post]. Retrieved from https://www.facebook.com/zuck/posts/10104784125525891
(75)* Legum, J. (2018, July 20). Facebook’s pledge to eliminate false information is itself fake news. The Guardian. Retrieved from https://www.theguardian.com
(76)* Zuckerberg, M. (2018, September 04). Mark Zuckerberg: Protecting democracy is an arms race. Here’s how Facebook can help. The Washington Post. Retrieved from https://www.washingtonpost.com
(77)* Center for Humane Technology. (n.d.). Retrieved from https://humanetech.com/
(78)* Tarnoff, B., & Weigel, M. (2018, May 03). Why Silicon Valley can’t fix itself. The Guardian. Retrieved from https://www.theguardian.com
(79)* Tarnoff, B., & Weigel, M. (2018, May 03). Why Silicon Valley can’t fix itself. The Guardian. Retrieved from https://www.theguardian.com
(80)* NewsGuard (n.d.). Retrieved from https://www.newsguardtech.com/
(81)* Legum, J. (2018, July 20). Facebook’s pledge to eliminate false information is itself fake news. The Guardian. Retrieved from https://www.theguardian.com
(82)* Grinberg, N., Joseph, K., Friedland, L., Swire-Thompson, B., & Lazer, D. (2019). Fake news on Twitter during the 2016 US presidential election. Science, 363(6425), 374-378.
(83)* Pennycook, G., Cannon, T. D., & Rand, D. G. (2018). Prior exposure increases perceived accuracy of fake news. Journal of Experimental Psychology: General.
(84)* Pardes, A. (2019, April 08). ‘Change my view’ Reddit community launches its own website. WIRED. Retrieved from https://www.wired.com
(85)* Wineburg, S., & McGrew, S. (2017). Lateral reading: Reading less and learning more when evaluating digital information. Retrieved from https://papers.ssrn.com/sol3/papers.cfm?abstract_id=3048994
(86)* These tips are drawn from research on the habits of professional fact-checkers by Sam Wineburg and Sarah McGrew and from Faktabaari’s guide to fact-checking for educators and future voters. Breakstone, J., McGrew, S., Smith, M., Ortega, T., & Wineburg, S. (2018). Why we need a new approach to teaching digital literacy. Phi Delta Kappan, 99(6), 27-32; Fact-checking for educators and future voters [PDF]. (2018). Finland: FactBar EDU.
(87)* Zuckerberg, M. (2018, September 04). Mark Zuckerberg: Protecting democracy is an arms race. Here’s how Facebook can help. The Washington Post. Retrieved from https://www.washingtonpost.com