To Stop Fake News, Social Media Firms Need Our Help

Misinformation is as old as communication itself. In television’s glory days, self-styled psychic Uri Geller fooled viewers for years before being outed as a fraud. Centuries earlier, in the adolescence of print media, British con artist William Chaloner circulated pamphlets attacking the national mint.

But never has misinformation spread so widely, readily, and willfully as it has in the age of social media. And never have so many different actors been culpable in creating that reality.

Take the dreadful Parkland, Florida, school shooting earlier this year. While Twitter and Facebook afforded students and their families access to potentially life-saving information sooner than other media, their algorithms also amplified right-wing conspiracy theories claiming student survivors were “crisis actors.” Although multiple print and digital media outlets quickly debunked the theory, the damage had already been done.

Often unwittingly, everyday Americans are caught in the crossfire of politically charged misinformation. Understandably, they’ve come to rely on social media to stay in touch. How else could a 50-something dad reconnect with an elementary school friend who moved away decades earlier? But they’ve also been shepherded into echo chambers by algorithms that prioritize clicks over truth — echo chambers that the majority of Americans, according to Pew Research, expect to get worse over the coming decade.

Certainly, it would be easy to point the finger at social media companies alone. But these platforms are neither the first nor the only perpetrators. Tribalism, a vacuum of government policy, and, yes, the very business model of social media firms have all played a part in this problem.

Inside the Social Media Machine

Compared with its media ancestors, social media is the perfect vector for spreading misinformation. The first of its three problematic attributes is its decentralized architecture, punctuated by highly influential nodes. Each node, whether an individual or a company, attracts like-minded media consumers, magnifying its influence on a given topic regardless of its expertise or truthfulness.

Although decentralization delivers media that’s maximally applicable to the user and prevents a single authority from controlling the narrative, it’s also dangerous. Misinformation spreads like wildfire in such a forum, where competence and truth matter less than the emotional payload of what’s being discussed.

Furthermore, social media makes it easy to link or break ties with connections, enabling users to self-select informational inputs. Over time, users can and do shut out information they dislike or don’t believe, distorting their own reality according to what’s “true” within their information bubbles. Because they’ve insulated themselves from uncomfortable ideas, the shock value of those ideas increases and drives users to respond with vitriol rather than reason.

The final systemic flaw of social media? Just follow the money — and, more specifically, the clicks. Clicks are literal currency for social media companies. Information that provides immediate gratification is good for business, and outrage-triggering content offers it like nothing else. Until that incentive structure shifts, social media’s echo chambers are likely here to stay.

Does that mean society is doomed to a truthless future? Not necessarily. But to rectify the situation, social media users, government entities, and social media platforms themselves must all be willing to alter their behaviors.

A 3-Pronged Defense Against Misinformation

For better or worse, social media users must be the first line of defense against the spread of half-truths and outright falsehoods. In short, they must be responsible informational bartenders. If a bartender serves an intoxicated person who later kills someone with her car on the way home, the bartender is at least morally culpable for fueling the tragedy.

Each time a social media user takes an action, such as retweeting a 280-character rant, he serves that information up to someone else. If he doesn’t critically consider content before sharing it, he’s putting someone else at risk — this time with added social proof behind it, a cue that tells the next reader to trust the information.

Fortunately, critical consumption of media is something everyone is capable of. Reading content entirely before sharing it, asking whether the content is coming from a reputable source, and searching for corroborating evidence from another source are easy and powerful guardrails against misinformation.

Couldn’t government entities also act as guardrails, playing the referee of truth? They certainly could try, but appointing a singular authority to separate fact from fiction invites an opportunity to propagandize. Facts are rarely black-and-white, and government officials are often all too happy to dole out “alternative facts” that advance their own narratives.

Instead, the role of governments (if any) must be to set policies that encourage all media companies, traditional and social, to build models that reward deliberative engagement over clicks. About six in 10 American media consumers scan only the headline of a news story before moving on. Something as simple as placing share buttons within or at the end of the content itself, rather than directly on social platforms, would at least force readers to open the source before sharing it with others.

And what would social media companies think of such a policy? Obviously, they’re beholden to shareholders and market realities, just like other companies. Under their present model, they’re going to fight tooth and nail against any regulation that could cut into clicks and shares.

But there are certainly other business models that they could adopt. For example, switching to a subscription-based forum would weed out bots and give users more ownership over the media community they’re paying to be a part of. Such a system would also provide a revenue buffer to experiment with less emotionally charged, higher-quality content.

Incentivizing longer engagement with media through gamification, such as a system of points or social rewards, could be an effective compromise. Medium is exploring this path with a reader-assessed content quality metric called “claps.” Whether Medium’s approach becomes a viable long-term revenue model, however, remains to be seen.

In today’s hyperpoliticized media environment, it can be difficult to remember social media’s original purpose: to inform and bring people together. Although social media has connected friends and families in some contexts, it’s driven wedges between others, sometimes to the point of job termination, social isolation, and even suicide.

If social media is ever to achieve its stated goal, we must start by fighting misinformation. And winning the war on misinformation will require all of us — people, companies, and governments; liberals, conservatives, and independents — to choose truth over comfort, both on social media and off.

Study Finds Fringe Communities on Reddit and 4chan Have High Influence on Flow of Alternative News to Twitter

Researchers at the University of Alabama at Birmingham, Cyprus University of Technology and University College London have conducted the first large-scale measurement of how mainstream and alternative news flows through multiple social media platforms.

After analyzing millions of posts containing mainstream and alternative news shared on Twitter, Reddit and 4chan, Jeremy Blackburn, Ph.D., and collaborators found that fringe communities within 4chan, an image-based discussion forum where users are anonymous, and Reddit, a social news aggregator where users vote up or down on posts, have a surprisingly large influence on Twitter. The results of the study were published this week in a paper at the ACM Internet Measurement Conference in London.

“Based on our findings, these smaller, fringe communities on Reddit and 4chan serve as an incubation chamber for a lot of information,” said Blackburn, assistant professor of computer science in the UAB College of Arts and Sciences. “Many online hoaxes and false or misleading stories have been traced back to users on these platforms. The content and talking points are refined until they finally break free and make it to larger, more mainstream communities.”

The team gathered information from posts, threads and comments on Twitter, Reddit and 4chan that contained URLs from 45 mainstream and 54 alternative news websites. Activity on the three platforms was measured between June 20, 2016, and Feb. 28, 2017.

They analyzed more than 400,000 tweets, 1.8 million posts and comments on Reddit, and 97,000 posts and replies on 4chan. Looking at how often URLs from the 99 news sites occurred, with a focus on the top 20 mainstream and alternative domains, they found that breitbart.com made up 55 percent of the news URLs in the six selected subreddits, with nytimes.com at 14 percent; on Twitter, breitbart.com accounted for 44 percent and theguardian.com for 19 percent; and on 4chan, breitbart.com accounted for 53 percent and theguardian.com for 14 percent.

Using the unique URLs across all platforms and the time each first appeared, the team analyzed whether a URL showed up on one, two, or three platforms and the order in which those appearances occurred. Examining a URL’s path in this way reveals which domains’ URLs tend to appear first on each platform.
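
As a rough illustration of this kind of first-appearance analysis — not the authors’ actual pipeline — the Python sketch below takes a toy table of (url, platform, timestamp) sightings, with column names and data invented here, finds the platform on which each URL was first seen, and tallies the results by news domain.

```python
# Sketch of a first-appearance ordering analysis across platforms.
# The column names (url, platform, timestamp) and the toy rows are assumed
# for illustration; this is not the study's code or data.
from urllib.parse import urlparse

import pandas as pd

# Toy observations: every time a news URL shows up on a platform.
observations = pd.DataFrame(
    {
        "url": [
            "https://www.nytimes.com/a", "https://www.nytimes.com/a",
            "https://breitbart.com/b", "https://breitbart.com/b",
        ],
        "platform": ["reddit", "twitter", "reddit", "4chan"],
        "timestamp": pd.to_datetime(
            ["2016-07-01 10:00", "2016-07-01 12:00",
             "2016-07-02 09:00", "2016-07-02 11:00"]
        ),
    }
)

# For each unique URL, keep only its earliest sighting across all platforms,
# then label each row with its news domain.
first_seen = (
    observations.sort_values("timestamp")
    .drop_duplicates(subset="url", keep="first")
    .assign(domain=lambda df: df["url"].map(
        lambda u: urlparse(u).netloc.replace("www.", "")))
)

# Count, per domain, how often each platform hosted the first appearance.
ordering = first_seen.groupby(["domain", "platform"]).size().unstack(fill_value=0)
print(ordering)
```

The study applies the same idea to millions of timestamped posts rather than a handful of toy rows.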

Flow of Mainstream News

For the mainstream news domains, the group found that URLs from nytimes.com and cnn.com tend to appear first more often on Reddit than on Twitter or 4chan. On the other hand, URLs from other domains, such as bbc.com and theguardian.com, tend to appear first more often on Twitter than on Reddit. There was no instance where mainstream news URLs tended to appear first on 4chan.

Flow of Alternative News  

The group found that breitbart.com URLs appear first on Reddit more often than on Twitter, and more often still than on 4chan. However, for other popular alternative domains, such as infowars.com, rt.com and sputniknews.com, URLs appear first on Twitter more often than on Reddit or 4chan. As with the mainstream domains, there was no domain for which 4chan dominated in terms of first URL appearance.

In addition to studying how news is shared on the three platforms, the researchers were able to estimate how much influence each platform has on the information shared on the other platforms, using a mathematical model known as a Hawkes process.

The group measured the influence of six subreddits (“The_Donald,” “politics,” “worldnews,” “AskReddit,” “conspiracy,” and “news”), the “/pol/” board on 4chan, and the Twitter platform itself. They found that Twitter has a heavy influence on the posting of URLs from alternative news sites on the other social platforms and is the most influential single source for most of the other web communities.
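
For readers unfamiliar with the technique, a multivariate Hawkes process treats each community as a stream of events whose rate is temporarily boosted by recent events in its own stream and in the others; the fitted cross-excitation weights are then read as influence. The Python sketch below spells out that intensity function with an exponential kernel. The community labels, baseline rates, weights, and decay constant are illustrative placeholders, not values estimated in the paper.

```python
# Minimal sketch of a multivariate Hawkes process intensity, the kind of model
# the study uses to estimate cross-platform influence. All numbers below
# (baseline rates, excitation weights, decay) are made-up illustrative values.
import numpy as np

communities = ["twitter", "reddit_the_donald", "4chan_pol"]

# Baseline posting rates (events per hour) for each community.
mu = np.array([0.5, 0.2, 0.3])

# alpha[i, j]: how strongly an event in community j raises the future
# event rate of community i. A larger column sum means a more influential source.
alpha = np.array([
    [0.1, 0.3, 0.2],   # twitter is excited by reddit and 4chan events
    [0.4, 0.1, 0.1],   # reddit is excited mostly by twitter events
    [0.3, 0.2, 0.1],   # 4chan is excited mostly by twitter events
])
beta = 1.0  # exponential decay of influence (per hour)

# Observed event times (hours) per community -- toy data.
events = [
    np.array([0.5, 1.2, 2.0]),   # twitter
    np.array([0.8, 1.9]),        # reddit_the_donald
    np.array([1.5]),             # 4chan_pol
]

def intensity(i: int, t: float) -> float:
    """Conditional intensity of community i at time t:
    lambda_i(t) = mu_i + sum_j alpha[i, j] * sum_{t_k < t} beta * exp(-beta * (t - t_k))."""
    rate = mu[i]
    for j, times in enumerate(events):
        past = times[times < t]
        rate += alpha[i, j] * np.sum(beta * np.exp(-beta * (t - past)))
    return rate

for i, name in enumerate(communities):
    print(f"{name}: intensity at t=2.5h is {intensity(i, 2.5):.3f}")
```

In the study itself, weights of this kind are fitted to the timestamped URL appearances, which is what supports the claim that Twitter is the most influential single source for most of the other communities.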

“These platforms have become an important piece of the modern information ecosystem,” Blackburn said. “As we continue to see the creation and spread of hoaxes, rumors and false information online, this knowledge is crucial to understand the risks associated with alternative news and to aid in designing appropriate detection and mitigation strategies.”

Blackburn is a co-founder of the International Data-driven Research for Advanced Modelling and Analysis Lab, or iDRAMA Lab, an international group of scientists focusing on modern socio-technical issues with expertise ranging from low-level cryptography to video games. The study is described in the paper “The Web Centipede: Understanding How Web Communities Influence Each Other Through the Lens of Mainstream and Alternative News Sources.”