Insights: Facebook Cuts Out Middleman, Runs Own Disinformation Campaign

Wow, what a disastrous September for Facebook! And, because a third of the planet uses Facebook, Instagram, Oculus, and/or WhatsApp, probably a pretty disastrous month for the rest of us, too.


Now the company is pivoting from a seemingly endless string of bad news about mammoth missteps, mistakes, and miscalculations to run a disinformation (er, public relations) campaign to distract its 2.3 billion users from its manifold problems. What could go wrong?


To be sure, the month’s mayhem list is long, long, long, and worthy of all the disinforming and distracting Facebook can muster on its own behalf, which is prodigious.


The problems began, most notably, with the findings of a weeklong Wall Street Journal series built around a Panama Papers-sized trove of internal documents (alliteratively dubbed the Facebook Files). But that was just the worst of a very bad lot of reports of company problems. To quickly recap:


  • Facebook’s own research found Instagram makes about a third of teen girls (IG’s most ardent users) feel worse about their bodies. Being a teenager already means open season on self-loathing, but Instagram’s algorithms apparently pour emotional lighter fluid on that particular bonfire of anti-vanities.

  • Other algorithm tweaks designed to increase “meaningful social interactions” on Facebook created incentives to post things that just made viewers angrier, amping up “misinformation, toxicity and violent content.” Worse, executives left the algorithm changes in place once problems surfaced, because the company valued longer user engagement more than reducing negative impacts on billions of users.

  • Facebook treated VIPs differently from the rest of us when they violated company policies in areas such as hate speech. Often, the secret XCheck program would flag a problematic post, and then…nothing would happen. A company review of XCheck said the double standard was “a breach of trust.”

  • In many developing countries, posts encouraging ethnic violence, human trafficking, and other potentially lethal content tend to stay on Facebook long after they’ve been flagged for removal.

  • Even in Germany, The Markup found, Facebook was disproportionately amplifying the anti-immigrant and anti-pandemic-restriction posts of an extreme right-wing party in this week’s election to replace long-time chancellor Angela Merkel.

  • And it’s not just overseas elections getting monkeyed with. Troll farms on Facebook reached 140 million Americans per month (close to half the populace) with election disinformation in 2020, MIT Technology Review reported. The trolls targeted Black and Christian communities with false information and dirty-tricks posts.

  • In the middle of a world-changing pandemic, the company left open a loophole fueling a huge expansion of anti-vaccine disinformation, despite widespread internal concerns. 

  • A planned “Instagram for kids” service was halted amid heavy criticism, especially after that WSJ report on grown-up Instagram’s disastrous impact on teen girls.

  • The Federal Trade Commission refiled its antitrust case seeking to unwind several big acquisitions and essentially break up the company. 

  • Facebook Marketplace, an increasingly lucrative ecommerce service with one billion users, is riddled with fraud and scam listings, according to ProPublica.

I’m sure I’ve missed a couple of things. But even so, it’s a lot. And worse, as the Journal put it, “Facebook knows, in acute detail, that its platforms are riddled with flaws that cause harm, often in ways only the company fully understands.”

Facebook responds, kind of


Facebook chief fixer Nick Clegg (technically, he’s VP of global affairs) had a weaksauce response to all the Journal stories. 


“The fact that not every idea that a researcher raises is acted upon doesn’t mean Facebook teams are not continually considering a range of different improvements,” he wrote in a blog post. 


So what’s a stumbling social media giant to do? Maybe try to fix the problems, reduce the social harm, and focus on slightly less profit-making and a bit more mental-health-improving. Nah, not our stumbling social media giant. 


Instead, Facebook has rolled out Project Amplify, a new feature designed to promote positive stories about the social network in your News Feed and bury all the negative publicity about the company, especially negative stories about its robo-monarch Mark Zuckerberg, who approved the effort last month.


And Project Amplify is only part of Facebook’s efforts to apply lipstick and rouge to its very piggy public persona.


You may have noticed Facebook ads calling for amendments to Section 230 of the Communications Decency Act, the legislation that has allowed Facebook to survive and flourish without legal liability for all that crummy content posted on its site.


Of course, Facebook wants just the right kind of amendments, which is understandable when the existing legislation helped make its founder one of the world’s richest men and his company one of its most highly valued.


One big, unhappy community


The Section 230 spots are only one part of the company’s ad campaigns, which continue to push a broader feel-good narrative about Facebook’s ability to bring us all together in one big, unhappy, dysfunctional community.


Other tactics include burying a negative internal report, distancing Zuckerberg from all the scandals (perhaps keeping him in a secure, undisclosed location?), and reducing researcher access to data about the company that shows it’s a sinkhole for disinformation. 


It’s all part of an aggressive effort to reshape negative perceptions of the company that have been building at least since its role in Russia’s 2016 U.S. election meddling and the Cambridge Analytica scandal came to light.


Out go the apologies, in come more aggressive responses to things like President Joe Biden’s comment that vaccine disinformation on Facebook has been “killing people.”


“Facebook is not the reason this (vaccination) goal was missed,” wrote Guy Rosen, the company’s VP for, ahem, integrity. 


Becoming the guy on the hydrofoil


Zuckerberg, meanwhile, has stopped talking about controversies and snafus in his blog posts and social media. These days, he’s posting things like an improbable image of him skimming across a lake on a hydrofoil while waving an American flag. As you do. 


It’s possible that all these efforts will humanize the inhuman face of one of our biggest companies. It’s also possible that these campaigns will only lead to more efforts to uncover what’s really going on behind the face of one of our most problematic companies, exposing its algorithms, internal research, and executive decisions for everyone to see.


The fact that Facebook is both of those companies means it’s vital that its executives spend more time on meaningful change and improvement, and less time on public relations, marketing, and advertising.


But somehow I’m guessing Zuckerberg prefers we keep skimming just above that vast lake of data, waving a flag, and acting like nothing bad is happening below the surface.