AI-Generated Shows Are Not A Solution

Endless Seinfeld was an experiment run using a ChatGPT cousin tailored to the purpose of writing scripts. Sitcoms, with rare exceptions, do not allow their characters to change. They will begrudgingly hand birthdays to a child character as time turns incoherent toddler actors into walking, talking children, but that’s generally the full extent of it until a dog needs to be introduced at the end of season 6 because ratings are falling. Sitcoms are designed so that they can end whenever, but that ending can be pushed out indefinitely until the show is no longer profitable, and then it ends. Shows like How I Met Your Mother, where the ending felt bizarrely rushed, are actually pretty common as a result.

TV sitcoms represent a cozy place where everyone knows everyone else. The characters will never betray the viewer. They are perfect parasocial friends. But the writers run out of material, and the actors get better parts, and slowly, the show falls apart, as naturally as iron rusting away.

ChatGPT and other automatable art plagiarizers (sorry, content generators) are aiming to provide a solution for this: the perpetual motion machine that keeps Seinfeld in comedic situations forever. It was unfunny, and sometimes it said stuff that didn’t make any sense, but hey – give it some time and it’ll surely be as funny as the real deal.

And then the AI behind Endless Seinfeld went transphobic, and Twitch (the platform where the AI show was hosted) pulled the plug. Is there enough content on the web to scrape for network-safe comedy, or will non-human writers inevitably run out of clean content on an open web?

The Problem of Treating All of Online Like Edible Content

The reason these things turn racist, or bigoted, or political, is that they don’t have a human sense for what bigotry is, or for what’s appropriate for ‘TV’ (Twitch TV in this case). Look at what happened to Microsoft’s Tay – she was designed to sponge up human communication patterns on an open forum and then replicate them. However, tossing a sponge into a bucket of hot acid (Twitter) means the sponge soaks up the hot acid. And hot acid is unpleasant! Tay began responding with racism and threats of violence to other Twitter users who were just trying to ask her questions.

The same thing is happening here, because the underlying technology powering Endless Seinfeld relies on all of the text it was able to crawl on the open web, with very limited filtering. As for why it took so long to break down: the version Endless Seinfeld was initially using had content gates built in (and they worked fairly well), but that model experienced an outage, and the creators switched to an earlier version with significantly weaker content filtering. And boy, did that come back to haunt them.
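To make that failure mode concrete, here is a minimal sketch of what a pre-broadcast content gate could look like. Everything in it is an assumption for illustration – the blocklist, the toxicity_score() stub, and the threshold are hypothetical, not the actual filters Endless Seinfeld or OpenAI used.

# Hypothetical sketch of a pre-broadcast content gate.
# Nothing here reflects the real Endless Seinfeld pipeline; the blocklist,
# the toxicity_score() stub, and the 0.8 threshold are illustrative only.

from dataclasses import dataclass

BLOCKLIST = {"slur_one", "slur_two"}  # placeholder tokens, not real terms
TOXICITY_THRESHOLD = 0.8              # arbitrary cutoff for this sketch


def toxicity_score(line: str) -> float:
    """Stand-in for a real learned moderation model.

    A production system would call a trained classifier here; this stub
    just flags lines that look like threats.
    """
    return 1.0 if "i will hurt" in line.lower() else 0.0


@dataclass
class GateResult:
    line: str
    allowed: bool
    reason: str = ""


def gate_line(line: str) -> GateResult:
    """Decide whether a generated line of dialogue may go to air."""
    lowered = line.lower()
    if any(term in lowered for term in BLOCKLIST):
        return GateResult(line, False, "blocklisted term")
    if toxicity_score(line) >= TOXICITY_THRESHOLD:
        return GateResult(line, False, "classifier flagged")
    return GateResult(line, True)


if __name__ == "__main__":
    script = [
        "Jerry: What's the deal with airline food?",
        "Kramer: I will hurt everyone in this room.",
    ]
    for line in script:
        result = gate_line(line)
        status = "AIR" if result.allowed else f"DROP ({result.reason})"
        print(f"{status}: {line}")

The point of the sketch is the failure described above: the gate is only as good as the classifier behind it, and if that classifier goes down and the fallback is a weaker one (or none at all), whatever the model generates goes straight to air.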

Jokes that don’t “sound” racist or transphobic to an AI with no strong concept of either, but that are written with the cadence of a joke, will inevitably sneak into these productions. The AI understands what a punchline is grammatically, but not in the abstract. How many jokes, racist or not, start with “an [X] and a [Y] walk into a bar”? How can the AI tell where it’s supposed to draw the line? A human certainly can. Many of the edgier versions of that joke are left anonymously on social media platforms, safely sequestered away from the poster’s real name and life. Posters say things on Reddit they’d never say out loud, for example. The robot has no such protection and no ability to read the room – it reads those jokes out loud as if it’s seeing them for the first time. All jokes are equally funny to an AI that doesn’t have a sense of humor itself.

Worse, actually stopping this from happening in the first place is incredibly difficult because the program is so complex. ChatGPT knows what slurs are; it’s just been politely asked not to say them by its creators. Even then, something sometimes slips out if the question-asker is tricksy enough, and patching up those leaks is a long-term project.

You Can’t Have Something Forever

Shows are usually started with the belief that they will one day expire. When human writers run out of content, the show usually ends. The characters have their arcs resolved, and the writers move on to new projects. Shows like The Fairly OddParents, where every possible sitcom end-of-life trope is used to introduce new material (adding a baby, adding a dog, adding a “long lost cousin” type character who sucks away time from the flanderized main character, etc.), demonstrate what happens when the network won’t let a cash cow go: the show dies twice. The Simpsons is still going, a bizarro-world version of the original that may as well be a parody of itself now. The same goes for SpongeBob.

Some people herald AI-generated content as a solution to such problems, allowing those mainstay shows to become permanent fixtures of their channels, but the problem would still exist even if AI were writing the scripts. There is no accounting for material fatigue. There’s a joke that The Simpsons has done everything there is to do on TV – how many more wacky hijinks could someone expect Lisa to get into, for example, unless she turns into a character that is no longer Lisa, one that doesn’t learn anything from anybody? How much time can an AI buy a show without repeating other, better material, or without writing a completely different, genericized show? How long can it keep going after that, even if the owners of the property find that acceptable?

The Phantom of the Opera, a Broadway show that ran from the late eighties until 2023, employed some of its orchestra members for decades at a stretch. Phantom of the Opera is a legend. A career-maker. Culture changed around Broadway while that show was running! New techniques were developed so a chandelier could come crashing down in front of the audience every night! It’s one of very few great Broadway-to-movie musicals. The script was always the same, and yet every fresh casting of Christine or the Phantom gave new life to the role in spite of that, delivered the same lines on that stage slightly differently, carried themselves a little differently. And yet this incredible hour in history, a blink of an eye that could have gone on as a tradition perhaps forever, ended. That ending coincided with Bad Cinderella’s arrival in America, a show that fell off Broadway embarrassingly soon after it opened. It doesn’t matter who’s writing it, whether the story progresses or stays the same: there is no content that can live forever, changed or not.

No matter how good something is or was, we’re going to lose it. AI will not stop this, partly because even people can’t – the AI relies on people to fuel its modeling, so it has human limitations when it comes to imagination even if it has a robot’s writing endurance. A sequel to Phantom of the Opera exists, and it’s not very good. Many of Disney’s Golden-Age-era movies have sequels too, and they’re generally nowhere near as good as the originals. Demanding a beautiful, brilliant story continue past its obvious conclusion because viewers can’t bear to watch a wonderful movie or TV show end is just killing it a different way.

https://arstechnica.com/information-technology/2023/02/endless-seinfeld-episode-grinds-to-a-halt-after-ai-comic-violates-twitch-guidelines/
https://www.npr.org/2008/08/10/93419533/phantom-of-the-opera-20-years-in-the-pit