The Baltimore Gazette had its share of scoops leading up to the 2016 presidential election. But one stands out: Every presidential race since John F. Kennedy’s election was rigged.
That blockbuster story spread quickly across social media, with readers praising the Gazette for having the guts to report “the truth.”
Today, the Gazette’s website no longer offers any news. Instead, visitors find a statement that reads: “Our apologies, we appear to be experiencing technical difficulties.”
The site, like many that thrived during the contentious 2016 election, offered fake news — sensational, untrue tales like the Gazette story about an Atlanta police officer who gunned down a mother as she breastfed her baby.
A wildly partisan presidential election defined by deep ideological divides offered the perfect breeding ground for fake news sites to pander to readers craving information that affirms their views. And social media sites such as Facebook offered the extra turbocharge needed to blast these stories across countless networks of friends who all share the same sensibilities.
“We like to believe more of what is already in line with what we believe,” said Alexios Mantzarlis, director of the International Fact-Checking Network at the Poynter Institute for Media Studies. “And we tend to explain away, through motivated reasoning, stuff that doesn’t fit into that pattern.”
A Pew Research Center study conducted in December found that 64 percent of Americans say fabricated news stories cause a great deal of confusion about the basic facts of current events. Another 23 percent acknowledged sharing a fake news story, whether knowingly or not.
Mantzarlis’ group and others want to fix that. He said his organization is creating ready-made lesson plans for high school teachers to help students discern fact from fiction in the news, an approach widely viewed as the best long-term path to news literacy.
“Really now we need to teach about differentiating rather than searching and cross-referencing,” Mantzarlis said.
Facebook is also offering help: Readers can flag content as fake news. Their complaints are passed along to the Poynter network and to independent media groups that investigate the items’ truthfulness (the first such groups to take part included ABC News, The Associated Press, FactCheck.org, PolitiFact and Snopes). Stories that flunk the fact check are pushed down in users’ news feeds, and anyone who tries to share one is warned that it has been disputed.
“That’s just one of the things we’re trying to do, but there are plenty of other plans also in the works,” Facebook spokesman Tucker Bounds said.
Bounds pointed to a letter Facebook founder Mark Zuckerberg wrote to the site’s users on Feb. 16, reporting that his staff had extensively studied users’ filter bubbles and the abundance of fake news. Facebook found it could distinguish sensational stories from in-depth ones by the way people share content: Stories that users share less frequently are flagged as potentially sensational or fake, while in-depth stories thrive, Zuckerberg wrote.
“There is more we must do to support the news industry to make sure this vital social function is sustainable — from growing local news, to developing formats best suited to mobile devices, to improving the range of business models news organizations rely on,” he wrote.
The Trust Project at Santa Clara University in California is developing a system to help readers discern legitimate news sites from fake ones.
The project has partnered with news organizations around the world to develop “trust indicators” signaling solid information, said program director Sally Lehrman. The idea is to give consumers easy ways to identify credible news sites and to break through the information bubble that feeds readers only content reinforcing their beliefs.
Lehrman said trust indicators include whether the publication follows policies on best practices and corrections, and whether it leans toward a particular political party. Other indicators include who wrote the content and how many sources the report cites.
“What we hope to do is elevate the quality of journalism that you will see online,” Lehrman said.
Like Twitter and Facebook, the Trust Project is also exploring ways to identify trusted sites and separate them from questionable ones.
“With all of the fake news you see out there, people didn’t know where to turn to,” Lehrman said. “The project will help readers become better informed.”
The Trust Project is working with Google, Facebook and other sites to promote credible news. Lehrman said Facebook recently added a button users can press to report fake news.
Beyond technical fixes and accountability projects, the broader goal is re-establishing a culture that recognizes and relies on credible news. The days when schoolchildren came home to find a newspaper on the kitchen counter are long gone, said Frank LoMonte, director of the Washington, D.C.-based Student Press Law Center.
“We have to teach kids where to look for the information they can trust,” LoMonte said. “It’s no longer a trip to the library, but turn on the computer and start searching.”
The erosion of independent school newspapers also leaves students with the impression that news no longer matters, he said. School administrators, for example, risk sending the wrong message when they spike a story because it might tarnish someone’s reputation.
“That’s telling those students and their parents that it’s OK to not consider certain viewpoints, even if they’re controversial,” he said.
The news industry should work to correctly label content that is distorted or fabricated rather than calling it news, said Ken Paulson, a former editor-in-chief of USA Today who is dean of the media school at Middle Tennessee State University and president of the Newseum’s First Amendment Center.
“I think we need to stop using the term, ‘fake news,’ because it’s not that type of news at all,” Paulson said. “We have a better word, and it’s called a hoax.”