Facebook Groups Are Destroying America

by Nina Jankowicz and Cindy Otis | Jun 17, 2020

The Covid-19 “infodemic” has laid bare how vulnerable the United States is to disinformation. The country is less than five months away from the 2020 presidential election, and by the thousands, Americans are buying into conspiracy theories about vaccines containing microchips and wondering about the healing powers of hair dryers. Where does all this come from? Let’s not be too distracted by a fear of rumormonger bots on the rampage, or divisive ads purchased with Russian rubles. As two of the leading researchers in this field, we’re much more worried about Facebook Groups pumping out vast amounts of false information to like-minded members.

For the past several years, Facebook users have been seeing more content from “friends and family,” and less from brands and media outlets. As part of the platform’s “pivot to privacy” after the 2016 election, Groups have been promoted as trusted spaces that create communities around shared interests. “Many people prefer the intimacy of communicating one-on-one or with just a few friends,” explained Mark Zuckerberg in a 2019 blog post. “People are more cautious of having a permanent record of what they've shared.”

But as our research shows, those same features—privacy and community—are often exploited by bad actors, foreign and domestic, to spread false information and conspiracies. Dynamics in Groups often mirror those of peer-to-peer messaging apps: People share, spread, and receive information directly to and from their closest contacts, whom they typically see as reliable sources. To make things easier for those looking to stoke political division, Groups provide a menu of potential targets organized by issue and even location; bad actors can create fake profiles or personas tailored to the interests of the audiences they intend to infiltrate. This allows them to seed their own content in a Group, and also to repurpose its content for use on other platforms.

WIRED OPINION: ABOUT THE AUTHORS

Nina Jankowicz is the Disinformation Fellow at The Wilson Center, and the author of How to Lose the Information War: Russia, Fake News, and the Future of Conflict, out July 9. Cindy Otis is the Senior Fellow at the Atlantic Council’s Digital Forensic Research Lab, and the author of True or False: A CIA Analyst’s Guide to Spotting Fake News, out July 28.

This was already evident in 2018, when associates of Shiva Ayyadurai, an independent candidate for US Senate, used Groups as part of their astroturfing campaign to boost his online support. Today, Ayyadurai is one of the most dangerous vectors of health disinformation, racking up millions of engagements on posts that rail against vaccinations, claim Anthony Fauci is a member of the “deep state,” and instruct followers to point blow dryers down their throats to kill coronavirus.

Groups continue to be used for political disinformation. The “Obamagate” conspiracy theory has yet to be defined in clear terms, even by its own adherents, and yet our analysis of Facebook Groups shows that the false narrative that the Obama administration illegally spied on people associated with the Trump campaign is being fueled and nurtured there. Related memes and links to fringe right-wing websites have been shared millions of times on Facebook in the last few months. Users boost these narratives by coordinating their activity across networks of Groups and Pages managed by a small handful of people. At least nine coordinated Pages and two Groups—with more than three million likes and 71,000 members, respectively—are set up to drive traffic to five “news” websites that promote right-wing clickbait and conspiracy theories. In May, those five websites published more than 50 posts promoting “Obamagate,” which were then shared in the linked pro-Trump Groups and Pages. The revolving door of disinformation continues to spin.

A recent Wall Street Journal investigation revealed that Facebook was aware of Groups’ polarizing tendencies as far back as 2016. And despite the company’s recent efforts to crack down on misinformation related to Covid-19, the Groups feature continues to serve as a vector for lies. As we wrote this story, if you were to join the “Alternative Health Science News” Group, for example, Facebook would then recommend, based on your interests, that you join a Group called “Sheep No More,” which uses Pepe the Frog, a white supremacist symbol, in its header, as well as “Q-Anon Patriots,” a forum for believers in the crackpot QAnon conspiracy theory. As protests in response to the death of George Floyd spread across the country, members of these Groups claimed that Floyd and the police involved were “crisis actors” following a script. In recent days, Facebook has stopped providing suggestions on the landing pages of certain Groups, but such suggestions still populate the “Discover” tab, where Facebook recommends content to users based on their recent engagement and activity.

To mitigate these problems, Facebook should radically increase transparency around Groups’ ownership, management, and membership. Yes, privacy was the point—but users need the tools to understand the provenance of the information they consume. First, Facebook needs to vet more carefully how Groups and Pages are categorized on the site, ensuring their labels accurately reflect the content shared in each community. In the current system, a Page owner chooses its category—“Cuisine,” “Just For Fun,” and so forth—which then shows up in that community’s search results and on its front page. Most Groups, meanwhile, are categorized as “General,” which helps neither users nor Facebook’s threat investigation teams understand each one’s purpose. In both cases, owners can be misleading: a large Page that shares exclusively divisive or political content might be categorized as “Personal Blog,” so as to escape the added scrutiny that might come with a more explicitly political tag. Such descriptors should be more specific, and applied more consistently. That’s especially important for Groups or Pages with tens of thousands of members or followers. Facebook should also make it easier to spot when multiple Groups and Pages are managed by the same accounts. That way, the average user can easily identify concerted efforts to flood the platform with particular content.

As the Wall Street Journal found, Facebook’s own research showed that algorithmically suggested Groups and “Related Pages” suggestions lead users further into conspiracy-land. Those suggestions should be eliminated entirely. If users had to search out Groups for themselves, they might be a bit more thoughtful about which ones they joined. Finally, very large Groups should not be afforded the same level of privacy as family groups where Grandma shares recipes and Cousin Sally posts baby pics. If a Group exceeds a certain membership threshold—say, 5,000 people—it should be automatically set to public, such that any Facebook user can participate. That way, these Groups can be observed by the researchers and journalists on whom Facebook now relies to police its platform.

A few months ago, during the 2020 Super Bowl, Facebook ran an ad lauding the power of Groups to bring people together. The 60-second spot was called “Ready to Rock?” But unless Facebook stops bad actors from taking advantage of the community that Groups provide, perhaps we should be ready for an earthquake.
