Posted on November 8, 2021

Facebook is Horrible for Information

The 2016 and 2020 elections were two of the tensest elections Americans have ever faced, thanks in part to Facebook and Twitter.

Why is nobody defending Facebook, even as they begrudgingly defend Twitter?

1) Facebook Knows it Makes You Feel Bad – and it’s Okay with That

Studies have repeatedly shown that social media makes many people a little unhappier than they were before they opened it – or, in controlled experiments, that a group kept off social media sites ended up happier than the group that used them.

If you feel like the internet got negative, and it used to be much easier to use without feeling like garbage afterward, you’re not necessarily wrong. The rise of engagement tracking changed what gets used to hold your attention. People spend more time replying to comments when they’re emotionally invested in the conversation, so even the worst, most inflammatory, keyboard-smashing arguments are ‘good’ for algorithmically driven social media sites. Sad or alarming news also generates more clicks than good news – sensationalism has always sold more papers. It may not have been any website’s original intent, but because people are drawn to this material, algorithms that rely on clicks to determine ad value weight it more heavily. More clicks mean more money, so the sites surface more of it. Facebook is particularly bad on this front – even if you go out of your way to block or unfollow things that make you unhappy, your network can still expose you to them by proxy.
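To make the dynamic concrete, here is a toy sketch in Python – emphatically not Facebook’s actual ranking system. The posts, weights, and scoring function are all invented for illustration; the point is only that when comments (arguments included) are weighted heavily, the inflammatory post wins the feed.

from dataclasses import dataclass

@dataclass
class Post:
    title: str
    likes: int
    comments: int  # heated argument threads generate lots of these
    shares: int

def engagement_score(post: Post) -> float:
    # Hypothetical weights: comments count most, a common pattern in
    # engagement ranking, and exactly why an angry argument can outrank
    # a calm, informative post.
    return post.likes * 1.0 + post.comments * 5.0 + post.shares * 3.0

feed = [
    Post("Local shelter adoption day", likes=120, comments=8, shares=15),
    Post("Outrage-bait political meme", likes=40, comments=300, shares=90),
]

# Sort descending by engagement: the argument-generating post comes first.
for post in sorted(feed, key=engagement_score, reverse=True):
    print(f"{engagement_score(post):>8.1f}  {post.title}")

Nothing in the scorer asks whether a post is true or whether it made anyone miserable – only whether people interacted with it.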

When social media tracked only account usage, not the person behind the account, people were encouraged to stick around in healthier ways. Outside of news sites and intentionally dark corners, the content was generally a little less stressful, and users weren’t exposed to it nonstop. MySpace had far less data on you, and image-sharing sites and forums were often civil by agreement even when they were anonymous. Not everything was peachy and friendly, and moderators still had their work cut out for them, but an algorithm wasn’t deliberately showing you things that upset you.

2) Facebook’s Survival Depends on its Opacity to Outsiders

Facebook cannot let anyone see under the hood. If anyone could, they’d know how Cambridge Analytica happened, how disinformation is still happening, and – I suspect – how little Facebook is doing to combat it. This opacity permeates every level of the organization, so much so that employees feared their company-issued phones were listening to them outside of work. It’s one thing for a tech company to say ‘no pictures, no phones inside the facility’ – it’s another entirely for employees to avoid discussing their jobs at home because they don’t trust that the spyware stayed at work. Reports of employees buying burner phones made the rounds a while ago for exactly this reason.

It’s also difficult to tell exactly what Facebook does with valuable personal data – the Cambridge Analytica hearings revealed some of it but left much in the dark. Zuckerberg navigated the questions so carefully and deliberately that memes about how robotic he sounded circulated for weeks. Facebook has trapped itself between a rock and a hard place: stay evasive and opaque, or reveal that it has been behaving almost maliciously in order to keep growing.

Facebook knows this is a bad look too, so Zuckerberg planned to polish its image with Project Amplify: effectively positive propaganda disguised as ordinary news, mixing favorable articles about Facebook – some written by the company itself – into the regular news feed.

The recent whistleblowing by Frances Haugen shows the scale of the problem – more in-depth coverage is already out there, including this piece: https://www.technologyreview.com/2021/10/05/1036519/facebook-whistleblower-frances-haugen-algorithms/?utm_source=pocket-newtab. The story deserves an article of its own, so I won’t double the length of this listicle on this one point – but it’s a good read.

3) Facebook still has Troll Farms – And Won’t Get Rid Of Them

Twitter did admit it had bots, and its mass-banning of them was a PR win – a relief. As of September 29, 2021, YouTube has finally begun removing deliberately incorrect information from its platform rather than just demonetizing it, which had let these alternate-reality peddlers keep making content – and potentially money, through Patreon-like sites. That leaves Facebook as the last member of the triad, and Facebook can’t seem to figure out how to do the same on its own platform.

The fastest way to share information is through images and memes. They require no copy-pasting, they’re eye-catching and usually short, and basic bots and text crawlers can’t read them. Any ‘meme’-style infographic is immediately suspect now, because Facebook has allowed troll farms to pump out misinformative memes at high speed. Even if a meme links back to a respectable source, the consumer still has to verify that the source actually says what the meme claims – otherwise, they can still be exposed to disinformation. ‘It’s your fault for not checking’ is a common rebuttal. ‘You can’t trust everything you see online’ is another. A minimal sketch of the moderation gap follows below.
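Here is a minimal Python sketch of why text-based moderation passes right over meme images. The flagged-phrase list and the posts are invented for illustration; the takeaway is that a claim baked into pixels is invisible to a text filter, and catching it would take an OCR pass over every uploaded image.

# Hypothetical flagged phrases for the example.
FLAGGED_PHRASES = ["miracle cure", "doctors don't want you to know"]

def text_filter(post: dict) -> bool:
    # Returns True only if the post's *text* contains a flagged phrase.
    # Text embedded in an attached image never reaches this check.
    body = post.get("text", "").lower()
    return any(phrase in body for phrase in FLAGGED_PHRASES)

posts = [
    {"text": "This miracle cure works!", "image": None},
    {"text": "", "image": "meme.png"},  # same claim, baked into the image
]

for post in posts:
    print(post, "->", "flagged" if text_filter(post) else "passes")

The first post gets flagged; the second sails through, even though both make the same claim.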

And yet we’ve seen time and time again that people do trust these troll farms over doctors and scientists. No one can possibly check every single picture they see. If it looks good enough, people will believe it, and the line between ‘skewed’ and ‘false’ has never been thinner. A comfortable lie with one big, bad enemy is easier to stomach than the reality of the situation, and it’s irresponsible for Facebook to keep playing the ‘you should have known it was fake’ card when the troll farms are manipulative enough, and spread out enough, to build a narrative convincing enough to defy reality. Yes, people shouldn’t trust everything they see – but they want to trust something. They want something to be stable.

By allowing troll farms – and the dozen or so accounts responsible for roughly two-thirds of the anti-vaccine misinformation on the site – to keep posting, Facebook is tacitly agreeing that it’s okay to be so wrong that people die. Even worse, Facebook itself agreed it was wrong: it planned to post good information in place of the material it removed. In the first push, before the pandemic was widely politicized, Facebook removed an enormous number of posts containing mis- and disinformation, only to slow down – for some reason – as it became clear its options were to take a stand and risk ad revenue, or stay the course and guarantee that certain viewers would stick around.

It’s one thing to try to protect free speech – every platform struggles with where personal opinion ends and deliberate, damaging disinformation begins. It’s another entirely to let troll farms keep pumping out disinformation after it’s clear what they’re doing, despite measurable harm to the people who repost their memes – especially when those posts are still earning money for the farm behind them. Even YouTube stopped letting its disinformation peddlers profit.

4) Facebook is Trying to Tie Itself To Everything Else to Make Itself Unremovable and Unavoidable

Oculus Rift. Instagram. Messaging apps. Facebook itself. And now, a metaverse. Facebook knows it would die in a far less dignified manner than MySpace did if it ever stopped growing, so it is steadily getting its foot in the door of as many other industries as it can. It knows it can squeeze a few more drops of data out of its most loyal customers if it just keeps growing, and earn a few more percentage points of market share if cool, cutting-edge tech requires a Facebook account. Amazon, Nestlé, and other ethically questionable companies do the same thing: once a company is big enough, consumers have to put in special effort to avoid its products, and many won’t know how or won’t bother.

Branching out is not, in itself, the problem: Colgate made TV dinners for a while, and Barnes and Noble sells board games and stationery alongside the books. The problem is building an interconnected network that can stomp out competition, harvest data from multiple angles, and then fund itself well enough to weather storms that the remaining competition can’t.

Sources:

https://www.technologyreview.com/2021/09/16/1035851/facebook-troll-farms-report-us-2020-election/?utm_source=pocket-newtab

https://www.bbc.com/future/article/20140728-why-is-all-the-news-bad

https://www.latimes.com/science/story/2019-09-05/why-people-respond-to-negative-news

https://www.businessinsider.com/facebook-employees-paranoid-burner-phones-2018-12

https://www.consumerreports.org/social-media/facebook-approved-ads-with-coronavirus-misinformation/

https://www.theguardian.com/world/2021/jul/17/covid-misinformation-conspiracy-theories-ccdh-report

https://www.technologyreview.com/2021/10/05/1036519/facebook-whistleblower-frances-haugen-algorithms/?utm_source=pocket-newtab