Posted on May 3, 2022 in Technology

The ‘Slap a Teacher’ Trend Wasn’t Real

Facebook paid for those ads and it’s weird.

TikTok Trends

Most of the TikTok trends that reach ‘the outer world’ are pretty harmless. Some have the potential to be dangerous – the one I remember causing the most problems was the ‘Kiki’ dance challenge set to Drake’s ‘In My Feelings’, where the participant walks alongside a moving car while doing a dance, and not everyone figured out that somebody else was supposed to be driving the car before they tried it themselves – but most are inoffensive because most people are normal humans who don’t want to be annoying. If they don’t realize that what they’re about to do will be annoying, their comment section will usually call them out on it. If their commenters don’t realize or don’t care, then the content creator may get a twenty-minute YouTube callout post from an outsider, a treatment reserved for the most annoying individuals on the platform.

This bizarre chain of watchers means that generally, people want to be nice, or at least harmlessly annoying. You don’t hit people. You don’t shove things in their face. You can startle them, but not too badly, and only if you know them from class or something. You don’t do things like grab onto someone’s cart and take it from them or walk up to someone and start moaning in front of them while making eye contact. You don’t disrespect teachers unless they disrespect you first, and by extension you generally don’t have a good excuse to disrupt class. All of these things are not only rude, they’re cringe. You’re cringy if you don’t know you’re not supposed to actually be mean. It is cringy to think that you’re the main character, that you can disrupt class for 39 other kids. Believe it or not, many teens do have a grasp on empathy, and as a large group, they tend to reject things that aren’t harmless fun. (Individually and in small groups, it’s a different story.)

Which is why this ‘Slap a Teacher’ thing was fishy to anybody actually on TikTok, outside of the boomer meme circles.

Are Kids Degenerates?

In TikTok’s early days, I watched a creator get told off because he’d brought a microwave to his college class, walked up to the front of the lecture hall full of students when the teacher asked him to stop disrupting class, and then told the teacher his new career was being transgender for some reason (allegedly – he was retelling this story via voiceover on footage from the incident, so he could have said anything that got him kicked out of class). People in the comments started ragging on him immediately, pointing out that other fully grown adults had paid to be there, and his shenanigans were interrupting their class time. This is the video I remember as the turning point from the Wild-West TikTok into the Social Media TikTok – the final transition from 4chan to Twitter. The antics that used to be funny no longer were, and you’d have to get new ones if you didn’t want people criticizing you for your mean-spirited jokes. “We want something funny, not shocking! Any idiot can be shocking!” was the general consensus.

Of course shock artists tried to stick around, but few of them were coming up with anything new to say about stereotypes – the jokes were old, they got buried, they were briefly resurfaced for TikTok, and then TikTok got bored of them too. This meant any trend that could be boiled down to “look how mad these people got when I inconvenienced them” or “haha, that’s politically incorrect but I said it anyway” was doomed to die out from the start.

Facebook Hates TikTok

Facebook haaaaates TikTok. Facebook’s user numbers declined for the first time in the company’s history, and part of that can be attributed to better entertainment sources elsewhere. People don’t trust Facebook, or they trust it too much – Facebook insists on taking as much information about a person as it can, and it has a profile on you even if you’ve never used it. Facebook’s business is ads, so it does the same thing any third-party ad company does and follows you around. Of course, TikTok is harvesting information on its users and adjacent non-users as well, but the effects of this haven’t really been seen yet. It takes time to build a database as extensive and far-seeing as Facebook’s.

Plus, the era in which people signed up for each site was different. Users felt betrayed when it came out that Facebook was harvesting their data, but all but the most tech-illiterate out there knew TikTok was going to be consuming their data when they signed up. Facebook was invited into people’s homes as a guest, and it’s been behaving like it’s at a bar. Meanwhile, TikTok was invited to a bar, and it’s behaving like it’s at a bar. People expect annoying or creepy behavior from TikTok before they decide to sign up – they didn’t for Facebook.

Facebook is Actually Pretty Soft on Terrorists

Facebook has had its fair share of issues relating to genocides in other countries – the Rohingya genocide, for example, can be partly blamed on a combination of inaction and deliberate stoking for engagement on Facebook’s part. They didn’t delete what they should have in time for it to matter, and a literal genocide took place as a result. Is it entirely their fault? No. Is enough of it their fault for a lawsuit to get off the ground? Yes. Facebook is also a notorious source of false and dishonest information. Users had become so accustomed to it that Facebook’s labeling of posts matching disinformation criteria was called censorship, even though the posts were allowed to stay, just with the label! An absurdly small number of people were allowed to circulate tons of disinformation, and nothing, not even the company itself, was in place to stop them from doing so.

While TikTok also has disinformation, it has a better auto filter and a less forgiving algorithm for controversial videos. Shares and comments reward content, but blocking the user, saying ‘show me fewer videos like this’, or otherwise expressing displeasure with the content means it’s not going to get put in front of everyone on the For You Page. Facebook’s algorithms are murkier. Users who post, say, Covid misinformation aren’t blasted out to the wider userbase the way they would be on Facebook; instead, they’re generally corralled into their own echo chamber. Is that good? No, but it’s better than nothing. During the BLM protests in 2020, TikTok also moved very slowly to remove videos of the protests, for good or bad, which earned a lot of favor from the mostly young userbase – videos of police using extreme methods to disperse crowds who weren’t legally overstepping their right to protest went far and wide in part thanks to TikTok.
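To make that difference concrete, here’s a toy sketch of the idea – this is not TikTok’s or Facebook’s actual code, just a hypothetical illustration of a feed that boosts videos on positive engagement (shares, comments) and sharply demotes them on negative signals (blocks, ‘not interested’ taps), so disliked content stops spreading instead of going viral on outrage:

    # Hypothetical scoring sketch – not real TikTok or Facebook code.
    # Positive engagement widens a video's reach; explicit negative
    # signals cut it, so disliked content stops being distributed.
    from dataclasses import dataclass

    @dataclass
    class EngagementStats:
        shares: int
        comments: int
        not_interested: int  # 'show me fewer videos like this' taps
        blocks: int          # viewers who blocked the creator

    def distribution_score(stats: EngagementStats) -> float:
        positive = 2.0 * stats.shares + 1.0 * stats.comments
        negative = 3.0 * stats.not_interested + 5.0 * stats.blocks
        return positive - negative

    def should_widen_audience(stats: EngagementStats, threshold: float = 50.0) -> bool:
        # Only push the video to a wider slice of the For You Page
        # if positive engagement clearly outweighs the complaints.
        return distribution_score(stats) > threshold

    if __name__ == '__main__':
        liked = EngagementStats(shares=40, comments=25, not_interested=3, blocks=0)
        disliked = EngagementStats(shares=5, comments=30, not_interested=60, blocks=10)
        print(should_widen_audience(liked))     # True – keeps spreading
        print(should_widen_audience(disliked))  # False – stays in its corner

The weights and threshold here are made up; the point is only that explicit negative feedback subtracts from a video’s reach rather than counting as ‘engagement’.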

As a result, the younger demographics using social media had a better perception of TikTok than they did of Facebook, and combined with that first-ever loss of users, this pushed Facebook to turn to an advertising firm to try to ‘fix’ things. Specifically, a right-leaning one, or at least one that didn’t care who its clientele were as long as they could pay the fee; left-leaning ad campaigns sought out different firms, so this firm ended up with a mostly right-leaning portfolio. Facebook may have chosen this firm for many reasons, including a slack attitude when it came to content (this ‘Slap A Teacher’ trend is entirely fake news), but we don’t know for sure that it was because the firm leaned right – its existing contacts with right-leaning news outlets may have been all Facebook wanted out of it.

Slap A Teacher

The firm Facebook hired targeted reactionary channels with a fake story to rile up the most reactionary viewers. Allegedly, slapping a teacher was the hot new trend on TikTok, and this firm was going to make sure everyone watching TV knew it. The firm focused most of its efforts on the farthest-right channels available in the U.S., and the demographics worked out such that TikTok’s largely young audience would never see (and therefore couldn’t deny) the marketed trend, while the older adults watching those right-leaning channels had never been on TikTok (or had been, but got funneled into the previously-mentioned echo chambers). Neither side would challenge the assertion, so parents ‘knew’ this was happening, but kids had never heard of it. It was the perfect setup to trick gullible or tech-illiterate reactionaries into believing TikTok was a site for violent children, or that it would turn their children violent.

Of course, this was never a thing, as I said earlier.

Since a few months after the peak of the BLM protests, anything relating to violence (even red paint in the wrong spots can trigger the automated filters) has been either removed or tagged with a banner at the bottom. Incidents of real violence can’t be shared on TikTok anymore. Any video of a singular incident wouldn’t have been shown to many people, and even if one somehow escaped the filters, the creator would likely have been challenged on it via the chain of watchers described at the very start of this article. The campaign never made sense to anyone who understood anything about the platform, and it’s unfortunate that most people’s unfamiliarity with it was used to Facebook’s advantage!

Fake news that amounts to advertising is a new phenomenon on cable TV – it seems disinformation used to push people into buying one product over another has come for us all, with Facebook’s (now Meta’s) funding.

Sources:

https://www.wavy.com/news/national/facebook-paid-gop-firm-to-run-campaign-against-tiktok-report/