What is -Core? Why is it All Online?

Elizabeth Uncategorized November 5, 2021

Subcultures: The Big Ones

Alternative, Goth, Emo, and Punk may be indistinguishable to someone outside those scenes, but the people within them know exactly what they’re looking at. A crow isn’t going to confuse a grackle or a raven for another crow.

All of these counter-culture movements have things in common – they all have music genres associated with them, they were all pointedly against whatever mainstream culture was doing, and they were all made for people who didn't fit in or didn't want to. Punks were anti-war and anti-establishment. Emos were frustrated with a society that put looking perfect above genuine human connection. Goths wanted a less strained relationship with death and tragedy.

All of these big ones have also been around long enough to earn mainstream acknowledgement – punk music written post-Vietnam is as relevant today as it was then, Hot Topic is proof that emo is still around, and older goth media may look dated, but it's not uncool; the Addams Family has been goth-adjacent all the way back to black-and-white TV.

Splitting

All of these subcultures being this old and this popular means that they’re changing as the next generations build their own subcultures within them. See E-Girls and E-Boys, a more recent offshoot of Emo that uses modern fashion to craft the look. It’s also escaped some of the negative associations of Emo, which included self-harm and untreated mental illness. Meanwhile, within goth, there’s now stuff like nature goth and pastel goth alongside trad goths – the focus is still on being goth, and it has the same roots, but it’s all different colors. Cyber-goths and vampire goths live in two different worlds, and they go to two different shows with wildly different music, even though both are technically goth.

Many of these counter-cultures eventually bleed into the mainstream, at least a little – popular music from a genre is usually popular because it's good! See Evanescence, Paramore, My Chemical Romance, Rage Against the Machine, Green Day, Linkin Park, and so on. When that music gets exposed to mainstream culture, the subculture grows and splits naturally.

And then social media began connecting people with pictures and music instead of just forums, and the world got both bigger and smaller. Here we see “core” stuff begin to form.

Sub-Subculture

Labeling a thing helps people identify it.

Nightcore is a genre of music characterized by speeding up songs until the singer is very high-pitched and the beat is double or triple what it used to be, making it more danceable. According to Dictionary.com, the term "core" here was coined by the band that started the trend, who called themselves the 'core of the night'; the same source notes that "core" was already in frequent use for music genres, and since the band was Norwegian, it's possible they just superimposed their own meaning onto the word so they had something to go by when asked. Nightcore is the first time I remember seeing "core" used as a descriptor alone. It isn't the only root of the trend, but it's a notable one because it came with aesthetics attached.
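The mechanics are simple enough to sketch. Below is a minimal, hypothetical example of a nightcore-style edit – it assumes the third-party numpy and soundfile packages and a local file named song.wav – that plays a track faster by dropping samples, which raises pitch and tempo together:

```python
# A minimal sketch of a nightcore-style speed-up. The input file name
# is hypothetical; requires the numpy and soundfile packages.
import numpy as np
import soundfile as sf

SPEED = 1.28  # nightcore edits typically run roughly 20-35% fast

audio, rate = sf.read("song.wav")

# Keeping only every (1/SPEED)th sample shortens the waveform, so the
# unchanged sample rate now plays it faster AND higher-pitched - no
# separate pitch shift needed, which is part of why these edits were
# so easy to make.
indices = np.arange(0, len(audio), SPEED).astype(int)
sped_up = audio[indices]

sf.write("song_nightcore.wav", sped_up, rate)
```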

Years passed, and a movement slowly formed. Before you could just call something ‘weebcore’ or ‘nerdcore’ or anything else, you had to assemble the look the hard way. You want to build mood boards for Tumblr, and you want to find clothes that fit the look you like? If that’s not already named by popular online culture or style magazines, you have to seek out individual bits and bobs to make your style a living thing. What’s really annoying is when other people like it too, but you can’t find each other on a platform because you’re calling your stuff “Celestial” and they’re calling their stuff “Midnight Chic”. Or Blue Goth and Night Goth. Or Night Faerie and Starlight. You get the picture. If the names that blogs made up for their style didn’t catch on or didn’t get popular, well – neither did their looks, and it never turned into something bigger. Tumblr’s search function is notoriously horrible, and tags are the only real way to navigate the site, even now. Pinterest isn’t much better. Having a name for a style makes connecting online so much easier.

Cottagecore took off on Tumblr in 2018. It wasn't the first named aesthetic, but it was very popular and easy to add to, and others followed suit.

If you like fairies, magic, and all of the feelings that come with them, and you like pictures of green woods – boom, fairycore, now you can find more of it. If you really like the aesthetic of candy and brightly colored clothing, you could call it candycore, and other people in the know can put together what you mean. This naming scheme is very convenient, and it makes searching easy once you understand it – content is easier to tag and easier to navigate once people cotton on to the trend. Older subcultures that already had names didn't need to adapt, but the scheme made it much easier to start new ones.

In the beginning, that was great, but now it’s turning into a nightmare.

Sub-Sub-Subculture

You can be freed by a label in the same way you're constricted by it. People want more of something they think is really cool, and as a result you get hyper-specific ___cores, styles identifiable by one or two things alone. See glitchcore: it comes down to anime girls and glitchy, psychedelic rainbows – at least, that's all you can find of it on Google. The music often sounds similar across artists, too, because if it doesn't hit certain points, it gets filed under hyperpop or electronic instead of glitchcore. It's so narrowly defined that it's difficult to make more of.

Alternatively, look at metal bands that made it big – their fans usually have a pretty clear style, and if the band's not deliberately inviting, the hardcore fandom will do everything in its power to weed out "posers" who want new things from the band, cutting creativity off at the knees to keep the style 'pure'.

Now, you have a name for what you like, but you’ve pigeonholed yourself, and the style is so small it’s tough to connect with people who really get it.

Insensitivity

Besides, trends that piggyback off of other, older things can create issues in different ways when all of their context is online. To anyone who's not in the know on sites like TikTok or Instagram, cottagecore and retro-style clothes just read as 'old-fashioned'. People online know that traditional clothing doesn't signal "traditional" views on women and society… people offline might not, and that can lead to some unwanted attention for the wearer.

Similarly, wearing a cowboy hat and bolo tie when you’ve never ridden a horse isn’t going to be met with ‘oh, wow, cowboycore, huh?’ offline, especially in the South or Southwest.

Certain ‘aesthetics’ also ignore the roots of a look. To use cottagecore again, cute farm animals are common items in mood boards and on cottagecore-themed clothing. Notice I said items – the reality of taking care of these animals isn’t a crucial part of the board, because why would it be? Cottagecore creates a fantasyland where the people out in the sticks who actually own chickens are living like Snow White, not getting up at 5 AM with the rooster to feed them, not getting pecked for eggs, never having to clean out a smelly roost, and never having to worry about coyote or fox attacks in the night, or vet visits. There are no spiders for horse riders to accidentally run into, and no power outages or spotty internet. The weather is always somewhere between temperate and cold. This romanticized version of how people on farms live is obviously incorrect to anyone who’s actually lived on one – to the point that it’s almost insulting.

Aaaaand We’re Back On Microtrends

If you look at it – really look at it – sometimes aesthetics are just shopping lists of items because you can’t achieve the look with other items you already have. This is a problem. Look at punk jackets – each one is unique because it was made special, not bought special. Look at E-Girls and E-Boys – stripes and black clothes stick out, but the items are generally bought because they can be cut to shreds and stitched or pinned back together in interesting ways. It doesn’t have to be new to be E-Girl/E-Boy stuff. OG Cottagecore often encourages wearers to buy old dresses and tailor them to custom-fit, making them new again. All of these styles are A) accessible, and B) possible to buy for ethically, meaning buying second-hand clothes or buying from sources that don’t use sweatshops. Of course, some consumption from bad sources happens with any trend, but the point is that the option exists.

The issues with microaesthetics/__cores are much worse when the options are limited and there's no way to get the look but buying new items. You've heard of retro/'50s style, but what about retrofuturism? Specifically, retro-space-core? If only one company makes see-through plastic shoes, that company just hit the jackpot when retro-space-core becomes the next hot micro-aesthetic, even if its issues with plastic waste and worker treatment are well-documented. People will buy them anyway.

It gets worse when an aesthetic or __core club is so small that a single new item constitutes a major development in the style. You can make something that nobody wants, but everyone will buy and discard, just because it's aesthetic. See the frog chair that got big and then died out. Frogs are cute, and the frog chair was very cute, but TikTok cottagecore latched onto it as a shortcut to cottage vibes… and then gradually came to realize it was difficult to actually decorate with because it was aimed at kids. The same goes for wicker furniture – you do actually have to take care of wicker, or it deteriorates, but the people buying wicker for the first time when it was hot often didn't know that. Or you could look at any number of boots, sweaters, and jackets that flash into aesthetic 'must-have' lists like magnesium in a pond before fading back into the background noise.

Ultimately, doing research on products and the microaesthetics themselves can stop most of the issues associated with them. The rest comes down to maybe accepting that sometimes, things look unique because they are unique, and asking for more of it or trying to make a whole style out of it is going to suck the fun out of the inspiration. If the movie, or music, or whatever isn’t broad enough to style around, that’s okay. It doesn’t have to be. Not every aspect of media has to be drawn out and analyzed into an aesthetic.

Sources:

https://www.dictionary.com/e/pop-culture/nightcore/

https://www.thelist.com/418037/what-is-a-micro-trend-and-how-can-it-affect-fashion-sustainability/

https://www.architecturaldigest.com/story/what-exactly-is-cottagecore

https://pitchfork.com/thepitch/is-glitchcore-a-tiktok-aesthetic-a-new-microgenre-or-the-latest-iteration-of-glitch-art/

Apple’s Cameras are Becoming Too Delicate for Consumers

Elizabeth Uncategorized November 3, 2021

Do consumers want photography equipment, or a regular small camera with worse quality?

Apple

Apple is known for having the best in-phone cameras in the industry – they have been for quite some time, even as other companies like Samsung and Microsoft try to catch up and take that title for themselves. The phones are quite pricey, and things like the OS and the hardware inside the device are rarely fully appreciated for how much they cost in R&D and rare earth metals.

Apple, during its time under Jobs, pushed to create value for the consumer that could justify the price of a device that shattered its screen easily and then had to go to a proprietary repair shop (while still costing hundreds of dollars used). Cameras are one of the most visible and most used parts of a phone – Instagram, TikTok, and a number of other social media and content-sharing platforms rely on phone cameras for their users' content, to the point that they wouldn't exist without them. These apps, of course, reward better cameras, which starts a feedback loop of demand.

Jobs made sure the camera was at least a little bit better with every new edition of the iPhone, and the people who took over when he left didn’t buck the trend.

Cameras

On professional equipment, certain lenses give better results at certain distances. The curvature of the lens directly affects the way the subject looks in the final image! Bigger sensors that take in more data also give AI more to work with – and if Apple wants to keep adding features that rely on AI, it's going to need that data.

The camera system on the last two iPhones boasts three lenses, and each serves a different purpose – one is for long distance, one is an all-rounder, and one is for closeups. The phone switches automatically between the three as the user works the camera app. As said before, different lenses produce different results, and by allowing two of these lenses to specialize, Apple has made the iPhone even better at taking pictures. Additional features, including internal gyroscopes and vibration sensors, keep the camera focused on the right thing and reduce the effect of user movement, further improving the image. These tiny, delicate mechanical parts have to be incredibly small to fit inside the phone – which, surprisingly, is only 0.5 mm thicker than the iPhone 6, a phone notorious for bending under pressure due to its aluminum frame. It's incredible engineering!
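To illustrate the idea of automatic lens switching – this is a toy sketch, not Apple's actual (unpublished) logic, and the zoom factors are made up – an app might pick whichever fixed lens sits just below the zoom the user requests and crop the rest digitally:

```python
# A toy sketch of multi-lens selection - NOT Apple's real algorithm,
# which isn't public. The zoom factors below are illustrative only.
from dataclasses import dataclass

@dataclass
class Lens:
    name: str
    zoom: float  # native zoom factor relative to the main "wide" lens

# Three fixed lenses, each specialized for a distance range.
LENSES = [
    Lens("ultra-wide", 0.5),   # closeups / wide scenes
    Lens("wide", 1.0),         # the all-rounder
    Lens("telephoto", 2.5),    # long distance
]

def pick_lens(requested_zoom: float) -> Lens:
    """Choose the lens whose native zoom is closest to, without
    exceeding, the requested zoom; the remainder is digital crop."""
    candidates = [l for l in LENSES if l.zoom <= requested_zoom]
    return max(candidates, key=lambda l: l.zoom) if candidates else LENSES[0]

print(pick_lens(0.6).name)  # ultra-wide
print(pick_lens(1.7).name)  # wide (digitally cropped to 1.7x)
print(pick_lens(3.0).name)  # telephoto
```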

All of this adds up to a phone that is easy to take and edit pictures with, better than what Samsung and other Android phones offer most of the time. However, these oversized, hyper-specialized cameras are beginning to present issues for the consumer – issues that Apple can't simply program away.

The Issue

https://vm.tiktok.com/ZM8SJxQMA

This video demonstrates the damage motorcycle mounting does to the phone.

Motorcycle vibrations damage the internal components that keep the camera focused in the right place – those little gyroscopes and vibration sensors are extremely fine and delicate. Other phones have similar parts, but they're bigger and clunkier; their job is simpler than their iPhone counterparts'. The phone mounts available on the market don't compensate for these new tools, and allow too much vibration to travel up from the engine into the phone. Even if you don't own or use a motorcycle, weaker vibrations are also suspect: Apple recommends a vibration-dampening mount even for cars.

This is a problem for Apple's long-term plans. Internal parts become more delicate in the vicious cycle of thinner and thinner phones – as mentioned, the iPhone 6 was only 0.5 mm thinner than the new iPhone 12, even though the amount of hardware inside has increased by quite a lot. Apple is pushing its devices to the limit of what the materials they're made of can do!

Obsolescence

Watching Apple go through this process is really fascinating. It's like watching a tapir turn into a dolphin. They've hyper-specialized so hard that new phones can take over as hobbyist items! They took out the aux port; in-brand accessories are wildly expensive; it takes special mounts to use; it's resistant to viruses, but installing software from outside the walled garden means voiding the warranty; the camera is phenomenal; it's faster than ever; the battery promises 22 hours of video playback; the outside is rock hard now. All for around a thousand dollars. Really, it's a photography tool with phone capabilities, not the other way around!

Unfortunately, for casual users, this overdeveloped camera and its vestigial phone hardware mean that the iPhones 12 and 13 are actually less sturdy than their predecessors, in spite of the harder ceramic shell on the outside. The phone is designed to survive being dropped on adventures, not to be hooked up to a windshield for daily driving with its GPS. Is that what consumers want? Either way, it's what they're getting – Apple has never made the camera worse on the main line. That's always relegated to the secondary line, the iPhone minis and pros, which are also weaker devices (and sometimes the camera isn't smaller or less powerful anyway, just the device). While it's not a pressing split now, it could turn into one if the trend continues and Apple doesn't thicken its phones back out for longevity's sake.

Sources:

https://www.theverge.com/2020/10/14/21515158/iphone-12-pro-max-best-camera-biggest-phone

https://support.apple.com/en-us/HT212803

https://backlightblog.com/iphone-12-pro-camera


Stop Asking For Free Labor In Your Ads

Elizabeth Uncategorized November 1, 2021

It’s one thing when a corporation’s business account asks for interaction.

It’s another when they’re literally asking the fans to produce lyrics for them.

The Nature Of TikTok

Corporations, especially ones with mascots, want to be friends with you. They want you to love them, they want you to choose them over the store brand, and they want you to get warm fuzzies when you think of them. To do this, they want to distance themselves from their abstract representation (some corporation with an office in Illinois, some corporation with an office in Silicon Valley, some corporation…) and present you with the substitute. When older millennials think of McDonalds, they think of Ronald McDonald. When almost anyone pictures Wendy’s, they’re usually picturing the girl on the sign. The same goes for any number of products. Mascots represent the good parts of the brand, the part that customers interact with. It’s only natural that in a world with more and more visual media, the mascots would become a bigger and bigger part of campaign advertising.

Unfortunately, sometimes a company overestimates the love for said mascot, and by extension the brand – consumers are increasingly aware of the tactics brands use to get people to love them, and Mr. Peanut's brand found this out the hard way when it asked fans to duet its mascot on TikTok… asking for real, serious labor.

Interaction on TikTok

Bad campaigns happen on TikTok. Ironically, they often come from the companies with the most money to spend on advertising.

Small companies got it immediately, as did companies with a reputation for subversive and unusual advertising: TikTok is for interaction. Comments, "sounds", duets, stitches, hashtags, and more make TikTok an interaction paradise. Anything a brand produces for its account has to consider interaction. Meanwhile, corporate-giant marketing campaigns are really struggling to grasp the idea of an app that allows viewers to make fun of ads in real time, in front of other potential customers. Even now, the biggest companies are releasing ad campaigns that show a fundamental misunderstanding of the TikTok ecosystem.

The best example is McDonalds’ chicken sandwich campaign. Users were asked to repeat a phrase in time with the primary video, an attempt at encouraging interaction after a slew of plain, TV-ad style videos. This went off the rails almost immediately! The issue is that they asked for interaction, but didn’t consider what that interaction would look like – what is the other side getting out of duetting that video? Just repeating the phrase wasn’t fun. Users come to TikTok for a number of reasons, but the primary one is entertainment. When something isn’t entertaining, they make their own fun, and screw up the campaign by making videos too crass to show to children. You might say any advertising is good advertising – that is not always true.

Free Labor

When people make dances or duets to songs, they’re doing it because they like the song – they aren’t doing it for the singer or producer. If the singer asks people to participate, and they don’t do it carefully enough, it can come across as desperate, and have the exact opposite of the intended effect (it’s an open secret that some influencers can be sponsored to make dances for music, so it’s doubly desperate-looking to beg for interaction for free). One singer who comes to mind was comparing her music to Billie Eilish, and was upset that she wasn’t getting the views/streams she felt she deserved. While that by itself is just harmless kvetching, making said complaint online is likely to turn indifferent onlookers into spiteful non-fans. TikTok is surprisingly meritocratic – nobody owes you views for mediocre content, so why are you begging if your content is supposedly good?

The same goes for asking fans for free work when the account itself is part of a megacorporate conglomerate. The issue is not in asking for interaction – it's in how they ask for it, and what they ask for. If you could pay them… why are you begging creatives for free work?

Believe it or not, most not-chronically-online people do like it when big brands interact in good faith, or at least with the idea of profits tucked away behind good faith. See Gatorade’s community efforts in developing countries. Obviously, building those kids a better outdoor area entirely Gatorade-branded is good for profits in the long run, but in the short run, the company spent money so that kids could have something nice. Branded fun is supposed to at least pretend like the corporation cares about its customers beyond the money they spend.

When you take this principle online, it looks like 'having fun' coming before 'brand exposure'. For example, one TikToker "redesigned" a bunch of brand logos in MS Paint, making them deliberately bad. Brands offered themselves up in her comments section for "redesign", and some even swapped their profile pics for her updated versions. The brand names got exposure in a way that wasn't icky to the anti-big-business segment of TikTok. From a marketing perspective, that's fantastic! Interacting with a willing, popular, grassroots account in a fun way is really the best you can ask for.

Mr. Peanut

All that said, TikTok's young, artistic demographic doesn't like it when a brand asks them to make things for free. Artists are very aware that companies rely on art but don't value it – there's a running joke about "exposure bucks" because companies so often try to swap payment for something the artist was already going to get when the work is used: exposure.

Mr. Peanut’s account asked fans to produce lyrics for them over a simple piano track playing in the background. The caption reads “It’s National Nut Day, so naturally we’re gonna do something nuts, like letting you write our jingle [emojis of a whacky face and a peanut] Duet this to show us what you got [emoji of a microphone]”. https://vm.tiktok.com/ZM8u3VSS4/

Notice the phrasing: we’ll let you write a jingle for us. It echoes the sentiment that it is a privilege to create the art they demand.

Again, users who make things voluntarily are not doing it because the original asked them to remix it or write lyrics, they're doing it for themselves, because doing it is fun and the fans like it. Duet chains of opera singers and musicians adding on to a video that wouldn't otherwise be music are totally different from doing the same thing for a brand that could ask and exchange money, that could sponsor written lyrics… but wants people to work for free. Because it was a TikTok, nobody could tell what exactly the brand planned to do with the duets, or whether it planned to do anything with them at all – but given how murky the rights were, nobody wanted to make something good that deserved compensation, only to have Mr. Peanut take it.

It took some time, but once marketing professionals unearthed their contracts for jingles, the duets made the brand look even worse – one TikToker revealed that she was paid $10,000 and still retained 50% ownership of a jingle she was professionally commissioned to create. Jingles take work. Sounding carefree is work. Mr. Peanut asking people to make something useful and usable for free, when the going rate is so high, is just gross:

https://vm.tiktok.com/ZM8uTfveK/

The worst part is that plenty of companies ask for fan contributions – they just do it with compensation. This is the kind of thing sweepstakes used to be for: if we use your song, you can win money. An exchange of value takes place, and fans who don't get their lyrics used still get to keep what they made, most of the time. Mr. Peanut did not set up a reward system, leaving fans wondering what they would get out of it, much like the McDonald's campaign but worse.

Even if they planned, cynically, for fans to write crass and unusable lyrics, they likely overestimated the benefits of TikTok users turning the campaign into a joke. The comments are laughing with the creator who duets something too foul to use, and at Mr. Peanut. The general sense is that the company is not in on the joke, even when it thinks it is. Even if you believe almost any brand exposure is good exposure… this is inefficient.

Mr. Peanut brand peanuts should have known better.

Sources: https://www.pepsico.com/sustainability/philanthropy

https://vm.tiktok.com/ZM8uTfveK/

https://vm.tiktok.com/ZM8u3VSS4/

Peloton Tread+: What’s The Deal?

Elizabeth Uncategorized October 29, 2021

It seems impossible that such an over-engineered device could be missing safety features. A touch screen, fine-tuned speed controls, internet access… but no emergency brakes. And a totally exposed belt.

Safety Precautions (Skip to Treadmill Injuries if you’re not interested in Liability)

There are a couple of different ways to get to a final, safe product, and limit liability. (This is not legal advice.)

The first way, and the way that provides the most safety for the customer, is to design the product in a way that prevents the customer from injuring themselves accidentally. For example, many toasters get very hot on the outside if they’re used too many times in a row. If the toaster company wanted to prevent that from hurting the customer, they’d want to re-design their toaster casing so it doesn’t get so hot. That’s engineering around the problem.

The second way, which is often cheaper, is to include warnings. This passes the responsibility of not using the toaster too many times in a row to the customer. This has obvious problems – namely that customers don’t like to read, especially if they think they already know how toasters work – but sometimes warnings are the best the company can do. For example, you can’t really engineer a fork-proof or waterproof toaster. Many companies have tried. The best the toaster manufacturer can do is warn customers that using a fork in the toaster is dangerous.

The third way is recommending personal protective equipment. A fireworks manufacturer can suggest that their users should always use goggles when using fireworks, to prevent eye injury. Sometimes consumer products are dangerous just by their function, and the customer has to take extra steps to keep themselves safe. A toaster manufacturer would not be able to say “goggles recommended” and get away with it. If the toaster spontaneously shot a spring into the end user’s eye, the toaster maker can’t say “well, we told you to wear goggles” – that’s out of the range of normal behavior around toasters.  

Now, with all of that said, let's get to Peloton's new product.

Treadmill Injuries

Believe it or not, treadmill accidents are pretty common, but rarely fatal. You may have seen videos of teens shooting themselves off the back at high speed on purpose, but that happens accidentally all the time too. Head injuries are a pretty common result. This is where warnings come in – the manufacturer wants to include high speeds, but they can’t control what the end user uses that speed setting for. Warnings are an answer to this problem – PPE and engineering can’t be.

Manufacturers also discovered they got sued a whole lot less if they included certain safety features, like an emergency stop key/button and back guards. Both of these are essential for keeping kids and pets safe around moving equipment, as well as the user themselves – getting sweatpants or fur caught up in the tread can cause serious injury, from road rash to broken bones. It’s very important that the machine is easily stopped. You can’t warn someone out of tripping! Engineering takes over from warnings and PPE.

As a result, these safety features became industry standard. A treadmill company can genuinely say in civil court that it did its best, and that any accident was a fluke and/or the customer's fault. Peloton did keep the safety key on this latest model, even though it was missing everything else. Peloton's decision to remove the other features without replacing them speaks to poor management or poor safety testing – warnings are not suitable for every danger on the device.

Preventing It, In Writing

I mentioned those manufacturer liability ideals at the top for this reason. Many treadmill makers choose the engineering route, meaning they try their best to child-proof the device so that people and their loved ones can't be hurt by a simple mistake – something as easy as leaving the keys in the same room, or letting the cat get too close. These are things Peloton has compensated for in the past – why now, with the Tread+, did they choose to leave these factors as warnings, instead of testing for them and correcting them in the design room?

Peloton did not include the classic rear tread guard on their $4,000 machine. That alone could have saved the child who was pulled into the treadmill. The warning manual says not to use the machine around kids, not to leave the safety key in the device, and not to let pets or kids get too close to the back of the machine… but the safety engineering that could prevent all of that (the same engineering other brands use, which includes a rear guard on the tread and a shielded underside) was, for some reason, dropped in favor of simply warning about these newly created dangers in the manual. This thing is over-engineered as it is; the least they could have done is leave the safeties in place! Why would you remove something that worked?! Was it an issue of weight? Cost? Who knows!

And now regulators want it recalled! The warnings are obviously not good enough to prevent accidents – as of this writing, the Peloton Tread+ has killed a child and injured 39 people, far more than its fair share of the statistics.

Potential Solutions: From Computers

Assume that Peloton removed the rear guard for a functional reason, or left the entire belt exposed for a functional reason, whatever that reason may be. (Keep in mind that this machine has a 32-inch touch screen with internet access, and costs over $4,000.)

The belt along the bottom is exposed, meaning that once something gets sucked under, it's going to get torn apart by the friction between the belt and the ground as the belt attempts to keep rolling. Most treadmills use something like a safety key – Peloton does too. However, that's not much help at the back of the machine, where most of the injuries are happening. Anyone caught behind or under the Peloton would have to be strong enough to lift it to free themselves, or reach the stop key – and nobody but an adult is going to be tall enough to reach that key while also caught by the machine. A shielded underside would have fixed that, but let's say that's not an option, even though it definitely was. (Again: a $4,000 device.)

What would help? What would keep the end user safe even though most of the safety items are gone?

Easy – a resistance detector. Garage door openers use them. Vacuums use them. All sorts of devices use them – resistance detectors keep people from being crushed, and they also keep the motor from burning out, something the Peloton Tread+ could do to itself if it sucked up something like a rag and didn't stop. This machine's ticket price is definitely high enough to tack on some extra R&D for a resistance brake. There are issues with the idea, yes, but if Peloton wants to be cutting-edge enough to take out all of those safety features, maybe it could be on the cutting edge of new safeties that make this smaller treadmill feasible and safe. Attach a warning to the touch screen that's already there, maybe. Maybe it could take voice commands. That's still not great, but the alternative is nothing. Nothing stands between long-haired pets, or kids who can't read yet, and getting dragged under if they step just a little too close and get caught. We have the technology; other treadmill brands have already been through this!
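For the curious, here's a minimal sketch of the concept – hypothetical thresholds and placeholder sensor hooks, nothing like real treadmill firmware: watch the motor's current draw, and cut power when it spikes the way it does under a sudden obstruction.

```python
# A minimal sketch of resistance-based emergency stopping - hypothetical
# thresholds and placeholder hardware hooks, not Peloton firmware.
# The idea: a jammed or obstructed belt makes the motor draw noticeably
# more current, so a sudden sustained spike should kill the motor.
import time

CURRENT_SPIKE_RATIO = 1.5   # trip if draw jumps 50% above recent baseline
SPIKE_SAMPLES_TO_TRIP = 3   # require consecutive bad readings (debounce)

def read_motor_current() -> float:
    """Placeholder for a real current sensor on the drive motor."""
    raise NotImplementedError

def stop_motor() -> None:
    """Placeholder for the actual emergency-stop relay."""
    raise NotImplementedError

def monitor_loop(sample_hz: float = 50.0) -> None:
    baseline = read_motor_current()
    strikes = 0
    while True:
        draw = read_motor_current()
        if draw > baseline * CURRENT_SPIKE_RATIO:
            strikes += 1
            if strikes >= SPIKE_SAMPLES_TO_TRIP:
                stop_motor()  # something is resisting the belt - stop NOW
                break
        else:
            strikes = 0
            # slowly track normal load changes (user speeding up, incline)
            baseline = 0.95 * baseline + 0.05 * draw
        time.sleep(1.0 / sample_hz)
```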

In an already over-computered device, there's no reason not to add a couple more safeguards, or even just keep the old ones. Sure, the Tread+ needs to tilt, but it can do that with a shielded bottom. Sure, it wanted to be thinner, but it didn't need to be. The warnings are obviously not sufficient, but Peloton refuses to do a recall anyway – and more warnings aren't going to solve a fundamental lack of protection around the back of the device.

This is a simple matter of too many computers for the user’s enjoyment and not enough for safety.

Sources:

https://www.findlaw.com/injury/product-liability/defects-in-warnings.html

https://pubmed.ncbi.nlm.nih.gov/29601218/

A History of Gaming as Told by the Elder Scrolls Series

Elizabeth Uncategorized October 27, 2021

Elder Scrolls: Arena

The first Elder Scrolls game set the stage: magic, the continent of Tamriel, and combat systems in line with other games of the time. Believe it or not, this first game was supposed to be a combat game first and an RPG second, but programmers discovered that the game was much more fun when the player was in side-quests. Gradually, the “Arena” in the original script of the game shrank away, and the new game, a game about dungeons and sidequests and overthrowing a king, came to be, reaching completion in early 1994.

The graphics are fairly interesting! It looks a lot like Doom – three-dimensional first-person games were heavily stylized, with interesting pixel art and all of the colors a 1990s screen could produce. Doom may be red and dull orange on the cover, but inside are levels that are entirely midnight blue, acid green, etc. Elder Scrolls: Arena is no different: they had their colors, and by golly they were going to use them.

It also set up things like day and night cycles, shops that closed at night, and flavor text from NPCs, all things that weren’t unique to Arena, but certainly added to the RPG feel of the game and led to a longer-lasting playable experience.  If you got out of the first dungeon. Like many games of the time, it was… somewhat unforgiving. It was also kind of demanding, computer-wise: Doom was a gimme on nearly any computer, but Arena’s size and complexity meant low-end computers would sometimes struggle to keep up.

Elder Scrolls 2: Daggerfall

Daggerfall looked a lot like Arena, at first. Most sequels at the time aimed to provide more of the same good stuff that sold the original game with fewer of the flaws and pain points, and the story was less important than the playing of the game itself. However, the two-year gap also made Daggerfall much bigger than the original, changing the character of the game into something even better. While all of the Elder Scrolls games let you free-roam basically indefinitely, Daggerfall was noticeably freer than Arena. Now that the series knew what it wanted to be, it could direct its resources better toward its goals – boasting an explorable area equivalent to Great Britain, the game moved away from the 2.5-D system Doom and Elder Scrolls: Arena used, and upgraded to one that was truly three-dimensional. This meant it was still quite a heavy burden for computers, especially older ones.

Games were moving beyond the limited confines of arcade-style shoot-em-ups, the Pac-Mans, the Centipedes. While some, like Doom, were (and are) stand-out exceptions, games like Arena's first-planned incarnation were a dime a dozen. Daggerfall set out deliberately to create something users could play indefinitely, something that offered a totally unique experience, something completely separate from the other games available at the time. Other games had no choice but to follow suit. While shoot-em-ups remained popular, RPGs and other more complex games gained market share.

Elder Scrolls 3: Morrowind

The jump in quality between Daggerfall and Morrowind was enormous. Polygonal art was becoming mainstream, and nearly every game circa the early 2000s was using it – Morrowind was no exception. Its art looks dated now, but not necessarily ancient the way Daggerfall's can look to younger gamers. The game's open-world system made it an instant classic: just like Daggerfall, you never have to do the main quest. You have plenty of alternatives in-game, and you can actively change the world you're playing in. Other games at the time were beginning to dabble in sandboxes too; contemporaries included SimCity 4 and Grand Theft Auto: Vice City. Still, open-world games weren't everywhere. They made up a very small percentage of the games released that year, although there were more than before. Coding an entire game to always be accessible to the player is intense, and short, plot-driven games demanded less effort and time from developers. WarioWare, which has nearly no plot, and Silent Hill 3, which is almost all plot, are other notable standouts from this time.

Alongside Morrowind came a bunch of new developments. Game consoles were more common than ever, and screen technology was improving. Morrowind specifically was available on the Xbox, and fan-made engine ports have since made it playable on Mac and Linux – gamers who wouldn't have had access otherwise could get in on the series, and the designers made sure players could jump right into the fantasy setting with minimal prior knowledge of the series. Dark elves? Tamriel? Magic that uses a mana bar? Cool. Monsters, gods, and steampunk elements made Morrowind one of the most distinct of the Elder Scrolls games all by itself.

While Morrowind earned its reputation as an unforgiving and sometimes cruel game (cliff racers), it also captured the hearts and souls of an entire generation of gamers, prepping them for the next step: Oblivion.

Elder Scrolls 4: Oblivion

Elder Scrolls 4, released for the Xbox 360 among other platforms, has aged just as well as many of its predecessors. The gameplay is certainly as fun as it used to be, but the art can be a little lacking in places because of the awkward transitional period polygonal art went through to get to the smooth faces and realistic hair we have today. The NPCs are still full of life, but the scripting and voice acting of some characters is awkward enough to be memed on over a decade after release, because the voice actors were given their lines alphabetically and without context. Oblivion captures the essence of its time period – games could have artful moments, but they could also have goofy slicing and dicing. They could have serious combat and dramatic storylines mixed in with missions that were little more than 'bring me stuff'. The AI of other characters and enemies made the game, and you could be buddies with NPCs instead of killing them.

And, most importantly, the game was slightly easier to get around in than Morrowind, making it more friendly to a younger audience. You could, in most cases, outrun enemies. That wasn’t always the case in Morrowind – Cliff Racers are absurdly fast and forced you to fight as you ran.

Other games from this time include Call of Duty 3 and Gears of War, a Hitman game and Bully. Classics from this era still scatter critics’ favorite game lists – the philosophy surrounding games had changed.

Elder Scrolls 5: Skyrim

Skyrim came out in 2011, ten years ago now. The game itself is pretty good: it has all of the features of the games before it, plus a near-infinite number of dungeons. It's also surprisingly easy to mod in the modern era – everything from tropical weather to new enemies to new character skins can be found in the modding community. One mod turns trees into hands. People with enough expertise to code on top of a base no longer need to learn how to make a game from scratch to see their ideas for preexisting games realized.

Other parts of the game, such as its incredibly muted color palette, are common threads among games of this era. Call of Duty's muted palette, Dark Souls's distinct range of blacks, browns, and every shade of gray, the bleached-out sky and buildings of Grand Theft Auto 4 – the list goes on. Computers and screens had evolved to the point where games could wash everything in gray and still be legible, and Skyrim fell victim to this design choice. Only the spells and the occasional butterfly break the pattern.

Elder Scrolls 5: Skyrim again

DLCs have existed for other games too, and in an increasingly online world, gamers can buy and download them directly to their console, which makes them much less of a hassle to get than they used to be. Skyrim has a number of DLCs, most notably Hearthfire, which lets players build homes and adopt children (who still use the same lines they did before you took them in), and Dawnguard, which expanded vampires and werewolves in the game.

The third DLC, Dragonborn, expands even further and gives players a glimpse of what happened after the events of Morrowind. It's a symptom of a larger trend – it's easier to build new levels on top of a preexisting game than it is to make a new one, and when players aren't quite done with the old world even though they've finished all of the interesting missions and the mainline quests, DLC can breathe new life into the game.

Oblivion had DLC. Borderlands had DLC. DLC was the hot new thing to show off that you could download stuff online, and when it wasn’t prohibitively priced, many gamers were cool with it.

And Again

The issue seems to be when DLC and other projects for a preexisting game stop the development of new ones. The big difference between Oblivion’s DLC and Skyrim’s DLC is that Oblivion eventually stopped getting DLCs – because Skyrim came out. The same goes for Borderlands and Borderlands 2, the DLC stopped because competing with the next game in the series was going to split the player base and funding.

In 2013, Bethesda released a compilation of the DLC plus a patch that made the game run better. That's cool – it's a way to keep giving players good value for a triple-A price.

In 2014, The Elder Scrolls Online launched (its later Greymoor expansion would return to Skyrim itself, still very Skyrim-flavored). Cool – two releases so close together kind of glossed over the lore issues of so much content being Skyrim.

In 2016, Bethesda released a remaster of Skyrim. That's cool, whatever – if Oblivion got a nice makeover, it would probably sell better to new players too. But by then the game was five years old, and this was the biggest gap between games since Oblivion and Skyrim. Fans began to wonder whether they were going to get a new Elder Scrolls anytime soon.

This is where Skyrim breaks from the path of most games.

Ten Years of Skyrim, Skyrim Forever, Only Skyrim Now

The answer was no. Starfield, Bethesda's next big game, is set to release on 11/11/22. There is no chance of an Elder Scrolls game releasing before or right after that date, because Elder Scrolls games are huge and consume a lot of the company's resources. That means Skyrim is going to be all we see until the mid-2020s, if we get another Elder Scrolls game at all. Games have evolved. They're bigger now.

Skyrim continues to update only to add more things to itself. Instead of our seeing more of Tamriel – its swamps, its other provinces – Skyrim's base engine spawns what are essentially infinite dungeons. Other games follow the pattern of riffing off the best rather than making something new, too, but usually it's not all stuffed into the last great release unless it's a perpetually online game like Overwatch, Fall Guys, or Fortnite.

Every update to Skyrim is a disappointment to fans who want more lore about the rest of the world, or even improvements to flaws within the game that the engine couldn’t handle at the time. There’s supposed to be a civil war going on in one of the cities, and yet Skyrim can’t spawn enough NPCs to make it feel like one – wouldn’t it be super cool for a game to be able to really nail that? Skyrim did many things Oblivion did, but better – we may never get a game that does many things that Skyrim did, but better, because of how long Skyrim has spent on the buffet table. Why fix perfection if people still play the game?

The ten year anniversary of Skyrim came with a special anniversary edition pack you could buy, and that would be super cool if there were other games in the same universe that could have distracted long-term players from Skyrim in the meantime.

Sources: https://elderscrolls.bethesda.net/skyrim

https://elderscrolls.bethesda.net/en/arena

https://www.imperial-library.info/content/elder-scrolls-arena-storyline

https://elderscrolls.fandom.com/wiki/The_Elder_Scrolls_II:_Daggerfall

https://www.ign.com/wikis/elder-scrolls-online/Elder_Scrolls_Timeline

https://steamcommunity.com/app/22320/discussions/0/41973820864472523/

Google Stadia: A Prophecy Fulfilled

Elizabeth Uncategorized October 25, 2021

Google Stadia was Google’s attempt at a console-style Game Pass. Was it successful? Kinda.

Pre-Launch Haters

Google has had failed attempts at games before. Why should the Stadia be any different? Factoring in the terrible odds against it (like competing with two other top-of-the-line consoles, one of which was only a year old), the Stadia was nearly guaranteed to be DOA. However, Google kept at it, certain that this time things would be different. They really did try. They promised 4K, 60 FPS streaming. They promised compatibility with other platforms. They promised that their content library would expand after launch, so the line-up they had wasn't supposed to get in the way.

This was a very genuine attempt at breaking into the market, it just didn’t have enough of the right stuff.  

It even prophesied its own death by featuring other failed attempts at industry break-ins next to the newly announced Stadia in a promotional picture. Picture the Sega Saturn lining itself up next to the Genesis and then also comparing itself to the PS2, with the added bonus of needing your own high-speed internet and a game-capable computer/phone/TV to use it. That's the kind of uphill battle Google had to prepare for.

Still, many people were hopeful.

Hope

Stadia was supposed to function much like a Game Pass: play as many games as you want for a low monthly fee of $10, on your own device, with no upfront cost for a console. There were web apps to run it on laptops and regular apps for phones and compatible TVs, and it had a lot of games, for the same price as PlayStation's PS Now and Xbox's Game Pass. The BYOD nature of the Stadia app was a double-edged sword, but it meant you could bring your Stadia account to a better device at a friend's house.

Stadia's initial release library featured such classics as Orcs Must Die! and Hitman – by all accounts, the Stadia should have been an excellent example of what game streaming services had to offer. Services like GameFly might have fallen off, but surely the idea was still a good one, right?

Where did it all go wrong?

Failure

Well, firstly, the BYOD set-up. Some people have fantastic computers, others rely on consoles to provide the high-quality gaming they desire, and keep a computer for simple things that don’t take too much computing power. After all, it doesn’t take a quad-core 256 GB rig to book an appointment for a haircut. For many people who already had a console, the Stadia was just the PS Now pass or the Game Pass with more steps, and less power behind it.

Secondly, it lagged behind on game releases. The Stadia didn't have many proprietary games to its name, and it didn't get to release many before its game development studios shuttered, sixteen months after launch. Google Stadia is still around today, but it's one or two Resident Evil games behind on releases – understandably, consoles and online retailers like Steam get the first of the first.

This brings up the other major issue: Stadia was monthly. Steam is not. Consoles can use a mix of bought games and game-pass-style content – they have access to the newest and best games on the market. Steam, which doesn't use a game pass system (as of this article), has very regular sale events and caters to small indie games as well as the largest triple-A studios. Stadia fell somewhere between the two, without access to the latest and greatest, but without enough small-studio content to pad out its library either. And you wouldn't own the games. Ten dollars a month for games that lagged behind what people already had access to via Steam or their consoles just wasn't worth it to most consumers. It wasn't a niche that needed filling, even though the idea itself could have been compelling if executed right.

Another major issue reviewers note is Stadia's strange behavior when the connection isn't quite strong enough. The audio de-syncs, but the video doesn't de-sync with it – a Tom's Guide reviewer compared it to a skipping record. The ultra-high quality at launch usually meant bad things for the actual gameplay, especially on weaker internet. That alone put off people with poor connections, the same way 'Always On' DRM did.

Stadia is still up and running today, but its lack of widespread use means it gets those same high-value games even later than before. But hey, it's been a wild couple of years – maybe it could make a comeback now that 5G internet and better devices are on the horizon.

Sources:

https://stadia.google.com/

https://www.tomsguide.com/reviews/google-stadia

https://www.gamesradar.com/google-stadia-release-date-price-games-controller/

The Three-headed Jack Hydra

Elizabeth Uncategorized October 22, 2021

If you had a CRT television and other systems to hook up to it, you might remember the three-headed plugs it used to take. The backs of those giant sets looked like a Star Trek dashboard, with all sorts of ports for all sorts of cords: parallel ports, direct TV-line ports, dial-up, serial ports, all kinds of ports. Truly, information could come from anywhere, as long as there was a plug to connect two things to each other. Nowadays, HDMI has become the first-choice data-transmitting plug, but the three-headed plug hasn't completely disappeared.

What Was It?

The jack I’m remembering is also known as an RCA connector. There were two popular types: composite, and component. There were usually three heads, all part of the same cable, and the output side was color coded to match an input side on the television.

To get input from your device to the TV, you'd match up the color-coded plugs, each of which carried a different signal. This tech sends analog signals to the TV, instead of digital signals like HDMI uses.

The three plugs on a composite-type RCA consisted of 'composite' video, left-side audio, and right-side audio. Component video instead had three plugs just for the image – colored green, blue, and red, though they actually carry a brightness (luma) signal and two color-difference signals rather than literal red/green/blue – and left audio out of the jack hydra. 'So what?' you might say. Nowadays, that's just more steps to get to the same result: a clear video with stereo audio. Back then, separating the signals was the only way the tiny computer in the television could handle them. It wasn't about the cable's capacity, it was about the TV – and computers were weak when composite RCAs were introduced. If all three output cables were fused into one, the final quality of the broadcast would be noticeably worse on screen.

Composite video, in this way, was a breakthrough! Splitting video off from audio meant that the TV's processor didn't have to make some huge leap to clear the gap between black-and-white and color. The downside was the format's ceiling: the video had a maximum possible quality, and it wasn't very far from where TVs already were. Composite was the quick solution, but the video had to be compressed to traverse the cable, and the analog signal could come out fuzzy if the screen was too big, or if something producing radio interference (like a faulty microwave, or a radio) was too close by. Component video was a similar breakthrough, and suddenly TVs were high-definition – separating the picture into brightness and color signals instead of cramming them all into one plug meant that more information reached the screen, resulting in a better image. People were already used to the three-port system, so the straightforward upgrade was welcomed pretty quickly. Audio was separate, but whatever, the picture was crystal clear!
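For a rough idea of what "splitting the signal" means, here's a sketch of the standard-definition (BT.601) math that turns an RGB pixel into the luma and color-difference values a component cable carries – illustrative code, not a signal processor:

```python
# A sketch of the BT.601 RGB -> YPbPr split used by analog component
# video: one luma (brightness) signal plus two color-difference signals.
# Inputs are normalized to 0.0-1.0; real hardware does this in analog.

def rgb_to_ypbpr(r: float, g: float, b: float) -> tuple[float, float, float]:
    y = 0.299 * r + 0.587 * g + 0.114 * b  # luma: weighted brightness
    pb = 0.564 * (b - y)                   # blue color-difference
    pr = 0.713 * (r - y)                   # red color-difference
    return y, pb, pr

# Pure white carries zero color difference - all the information is in
# the luma signal, which rides the green-colored cable on its own, so
# brightness survives even if a color plug comes loose.
print(rgb_to_ypbpr(1.0, 1.0, 1.0))  # (1.0, 0.0, 0.0)
print(rgb_to_ypbpr(1.0, 0.0, 0.0))  # pure red: low luma, high Pr
```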

RCAs were also great even later because of the freedom they gave the host TV – you could have multiple rows of RCA inputs on the backs of the particularly large TVs, so houses with a VCR and a game console wouldn’t have to unplug one to put in the other. Nowadays, TVs that do still have RCA connections have maybe one or two sets for component video, and HDMI is expected to fill in the rest.  

Why Did Manufacturers Switch?

RCA connectors sound great on paper, and they give a lot of freedom. However, composite video buzzes if it's not connected right, the cables tangle easily, they look messy, and they're all-around slightly worse quality-wise than HDMI, depending on screen size – particularly the video. HDMI can deliver a much better image than composite RCA because it's digital, not analog. Not much could compete, although Europe had an equivalent called the SCART connector that behaved a lot like component RCA. When HDMI came along, it blew RCA and SCART out of the water – an RCA cable could never handle 4K TV. Component video, which focused only on the picture, hung on where composite failed because its quality was better: component can produce images up to 1080p because instead of one compressed visual signal, it carries three. Component RCA is still on the backs of TVs to this day.

Besides quality, there's tidiness. HDMI cables are all more or less uniform in size, so they look nicer behind a TV. RCA cables could be any number of thicknesses, because the shielding they needed varied by manufacturer. HDMI also carries both audio and video, and TVs are smarter now, so there's no need to divvy up the information before the computer receives it. Even if all else were equal, manufacturers might still have switched to HDMI just to lighten the load for the customer.

Still, a lot of old consoles and DVD players with component options only are still kicking around, so component RCA ports are going to hang around a little longer.

Sources:

https://www.pcmag.com/encyclopedia/term/composite-video

Hyperlink for Readability: MultiCom Inc.

Deepfakes: Should You be Concerned?

Elizabeth Uncategorized October 22, 2021

You might have seen those videos of “Tom Cruise” on TikTok, or maybe you saw someone’s face superimposed onto Superman. Deepfakes are getting better by the day!

Deepfake Software

Deepfakes are a species of visual edit that uses pictures and video, combined with AI, to create something new! The AI uses a pre-existing video and a library of photos to replace one person's likeness with another. If you have the pictures for it, you could deepfake your face onto Chris Hemsworth's body, and other such shenanigans. And deepfakes aren't just for videos – they can be used to create better still images as well. Where Photoshop relies on a human's touch to make an edit believable, deepfake tech can create a realistic still mostly by itself, given the tools.

That's the catch: not all deepfake AI has all the tools, so some deepfakes are noticeably worse than others, for a couple of reasons. The first is that the tech is still pretty new, so most programs are still 'learning' what is and isn't possible for a human face. The second is the quality of the images fed to the model – if the images don't give it enough information to accurately recreate angles, it's going to have to get creative. That's a bad thing when you're trying to make a believable video.
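For a sense of how the likeness-replacement works under the hood, here's a rough sketch of the classic face-swap setup popularized by early deepfake tools – two autoencoders sharing one encoder. The layer sizes are toy values, and real tools use convolutional networks plus face alignment, but the shape of the idea is this:

```python
# A minimal sketch of the classic deepfake face-swap architecture:
# one shared encoder, two person-specific decoders. Toy layer sizes,
# not a production model. Requires the torch package.
import torch
import torch.nn as nn

IMG = 64 * 64 * 3  # flattened 64x64 RGB face crop
LATENT = 256

encoder = nn.Sequential(nn.Linear(IMG, 1024), nn.ReLU(), nn.Linear(1024, LATENT))
decoder_a = nn.Sequential(nn.Linear(LATENT, 1024), nn.ReLU(), nn.Linear(1024, IMG))
decoder_b = nn.Sequential(nn.Linear(LATENT, 1024), nn.ReLU(), nn.Linear(1024, IMG))

loss_fn = nn.MSELoss()
opt = torch.optim.Adam(
    [*encoder.parameters(), *decoder_a.parameters(), *decoder_b.parameters()],
    lr=1e-4,
)

def train_step(faces_a: torch.Tensor, faces_b: torch.Tensor) -> float:
    """One step: each decoder learns to rebuild its own person's faces,
    while the shared encoder learns features common to both."""
    opt.zero_grad()
    loss = loss_fn(decoder_a(encoder(faces_a)), faces_a) + \
           loss_fn(decoder_b(encoder(faces_b)), faces_b)
    loss.backward()
    opt.step()
    return loss.item()

def swap_a_to_b(face_a: torch.Tensor) -> torch.Tensor:
    """The trick: person A's expression and pose, redrawn as person B."""
    with torch.no_grad():
        return decoder_b(encoder(face_a))
```

This is also why data quantity matters so much: the decoders can only redraw angles and expressions they've seen enough examples of.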

Celebrities Vs. The Average Joe

Deepfakes rely on data, so if the software doesn’t have much data to work with, the resulting deepfake looks…uncanny. Even really, really good deepfakes right now, with a ton of data, look a little uncanny. Picture the last movie you saw a dead celebrity in – you probably realized something was wrong even if you didn’t know they were dead, like General Tarkin in Rogue One. He’d had his whole head scanned at high quality before he died, and he still looked a little strange on-screen. It was little things, like his neck not moving perfectly with his mouth. Young Carrie Fisher at the very end of Rogue One had a noticeable grain due to the source images, and that same young Carrie Fisher in The Rise of Skywalker looked strangely plastic even in low, indirect light.

The average person doesn't have enough high-quality video or images from even one angle for deepfake AI to make something believable. It only takes a split second of a slightly misplaced nose or mouth for someone to get creeped out and identify the video as fake. The uncanny valley response is instinctual, and it's reliable! It takes serious work to overcome. If Hollywood can't manage it, does the average person have anything to worry about? Well… yes. Because the average person has access to the tech too, and it's always getting better.

Controlling it

How do you control it? Big stars have to deal with their image being stolen all the time – if anyone's prepared, it's the celebs, who fight magazines and movies alike to be represented the way they want to be. But what about average folks, when the tech starts to bleed downwards? Minor politicians, or competition for the cheerleading squad? Or explicit images made specifically to harm someone's reputation, made by an amateur with juuust enough knowledge to produce something that looks believable at first glance?

How do you account for that?

Let's look at the TikTok Tom Cruise account. The creator has gone out of his way to make it clear that Tom Cruise's likeness there is not real. Even so, the videos are jarringly realistic. He used a Tom Cruise impersonator as the 'base' for the deepfake, and the end result barely triggers the uncanny valley at all – he just looks a little stiff. Those videos are still up, because it's obviously not really Tom Cruise, no matter how realistic it is.

And then there's an account putting Charli D'Amelio's face on their own body in an attempt to impersonate her. TikTok is removing those videos because it's not obvious that it's not Charli, even though the quality is worse. Someone who watches more than once will recognize that it's not her, but the videos still get pulled, because the account isn't being clear enough about the fakery. They're crossing a line.

There's also a distinction between the two in intent: 'Tom Cruise' is showcasing his technical skill, while the Charli impersonator is trying to be Charli.

Legally, copyright law does have some precedent from the music and art world: if an impersonator is so close in performance to the original that an average person can't distinguish it from reality, they're violating copyright. Singers use this when covers get a little too close to the original. See Drake songs, for instance: the only covers you'll find on YouTube are by female singers or men who sound totally different, because he's very strict about his copyright. When the audience can't tell the two apart, the covers get pulled.

The problem is enforcement. The average person is not going to have the time or resources to hunt down impostors and report them all. Charli is famous on TikTok, but if she weren't, TikTok mods likely wouldn't actively hunt down impersonator accounts for her. If someone really, really hated an obscure user, they could overpower that user's reporting efforts with fake content, and the fakes only have to be believable enough for someone to scroll past and think, "wow, I can't believe they'd do that".

The average person is not equipped to scrutinize every single bit of media that comes their way; it's exhausting and unrealistic to expect that of them. That's how disinformation campaigns work. If a deepfake is believable enough, and the original isn't aware of it, that deepfake may as well be fact for everyone who sees it and doesn't realize it's fake.

Implications

If you're online a lot, you might have heard of the Mountain Dew ad featuring Bob Ross's likeness. This was… weird, to a lot of people, and for good reason. Using a person's likeness to sell something has been a matter of debate ever since money became mainstream – you'd probably have sold more spices back in BC times if you said the king bought from you. But normally, the person can call the seller out for it. Now, with deepfakes, you can make celebrities say anything post-mortem, and nobody but the estate will be able to challenge it.

And even if the estate gives permission, how specific does that permission have to be? Actors struggle with paparazzi images even today – Daniel Radcliffe famously wore the same shirts and pants for weeks while filming a movie, so the paparazzi's photos of him were worthless. Imagine having the ability to put Daniel Radcliffe in any pose or outfit you wanted for the front of a magazine. The person wouldn't make unflattering faces for your pictures before they died? Well. Now they will.

Presumably Bob Ross's estate allowed the use of his image, but in the same way we don't take organs from dead bodies without the deceased's consent, maybe we shouldn't allow dead loved ones' images to be sold for advertising without their consent beforehand – especially now, when it's so easy to deceive people with this tech!

Is There Good?  

And then there's the other side of the spectrum, where deepfakes can be used to bring people back to their glory days, or to colorize black-and-white movies. They can be used to de-age actors, as seen in Captain Marvel, Star Wars, and elsewhere: Samuel L. Jackson appeared decades younger thanks to digital de-aging, and Mark Hamill appeared as he did forty years ago in another Star Wars series.

Deepfakes, given the right tools, do a better job of recreating someone's face than hand-controlled CGI ever could. They could have been used to make Henry Cavill's Superman mustache-less in Justice League, instead of whatever was done that made his face look unsettling. He couldn't shave his 'stache because he was filming Mission: Impossible at the same time, so the only way out was either prosthetic facial hair or CGI-ing over it. They picked the CGI. People noticed. Deepfake tech might have made his mouth's movement a little less uncanny.

Deepfake tech could be used to disguise facial injuries, like the one Mark Hamill suffered during the original Star Wars trilogy, or to create alien races without the heavy prosthetics traditionally used or the sweatshop-style CGI-studio labor. It could also make dubbed movies less visually jarring by lining actors' mouths up with the words they're supposed to be saying.

Deepfake technology is a double-edged sword. All the good it could do doesn't outweigh the bad: it's dangerous technology, and in a world that increasingly uses the internet to share information, disinformation is a powerful pollutant.

Sources:

https://www.theguardian.com/technology/2020/jan/13/what-are-deepfakes-and-how-can-you-spot-them

Y2K Wasn’t All Fear-Mongering

Elizabeth Uncategorized October 20, 2021

Y2K, for folks who don't know, was the idea that the world's computers would attempt to switch over from 1999 to 2000 on New Year's Eve and fail, because most computers stored the year as two digits instead of four – the computer would see 99 turn to 00 and flip out. Date and time were critical for a lot of automation even back then, so the fear wasn't unfounded… but the panic might have been a little excessive. So why didn't it happen?

The Bug Was Real

Programmers, with limited space and time, would optimize anywhere they could. The first commercial computers read years as two digits – memory was incredibly limited, and telling a 1970s computer it had to track four digits instead of two would have shortened its already tiny attention span. So they cut the "19" clean off the front and told the computer to take "19XX" as a given. This becomes a problem when the year is A) about to turn to 20XX and B) about to have two zeroes at the end, because of how linear time works. The computer might freak out, not knowing what to do with 2000, and automations relying on the date could react unpredictably unless programmers got there first.
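
To make that concrete, here's a minimal C sketch of the two-digit shortcut – the variable names and rollover logic are illustrative, not anyone's actual production code:

```c
#include <stdio.h>

int main(void) {
    /* Hypothetical sketch: the record stores only the last two
     * digits of the year to save memory. */
    int year = 99;              /* meant to be 1999; the "19" is assumed */

    year = (year + 1) % 100;    /* New Year's Eve: 99 rolls over to 0 */

    /* Naive code then reattaches the hard-coded century... */
    printf("Welcome to the year %d!\n", 1900 + year);  /* prints 1900, not 2000 */
    return 0;
}
```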

It was a real bug. Hospital systems relying on date and time, stock systems, automated manufacturing – everyone used the date for something. Documents could become chronologically disorganized. Bank automations could wildly miscalculate mortgage payments. Video rental stores could charge a century's worth of late fees, not understanding 'negative time'. Some computers might even get into a death loop, crashing over and over as they tried to work out what year it was.

The Panic

Echoes of Y2K were seen during the first months of the Covid pandemic: people created the shortages, not the event. Y2K caused an actual water shortage in some spots because people were preparing for the apocalypse by filling up their tubs last minute, assuming their water supply might be interrupted. For what it’s worth now, that’s not a great idea unless you have something to store the water in besides the tub. Water is treated, yes, but if something like an earthquake ruptures the water line, you might end up with contaminated water in the tub, or no water at all. If it’s something that won’t rupture the line, the tub itself is likely harboring some bacteria that could breed in the water over a long period of time – the tub’s where feet go, after all.

Buy your water far ahead of a potential disaster (as in before you can even tell an emergency is going to occur) in a stable, sealed container, because exposure to the open air can make still water go bad over time. Purdue has a good article on the subject here: https://engineering.purdue.edu/SafeWater/drinkinfo/y2kwater.html

Vanishing Point

And then it was gone. The new year came and went, and suddenly people and media outlets acted like they'd never given in to the panic at all. Things returned to normal. Futurama said it best: "When you do things right, people won't be sure you've done anything at all". Engineers had fixed the problem so well that people became convinced Y2K was never going to happen in the first place.

How They Fixed It

As with most things, the truth is somewhere between tabloid and denialist. Y2K really could have screwed some stuff up, but a plane wouldn't have fallen out of the sky because its GPS said it was time-traveling. On the flip side, it didn't turn into a crisis not because it wasn't one, but because software experts fixed it. The sudden public interest wasn't what alerted software engineers to the problem, either – they generally knew, and they'd been working on it for years.

There were a couple of options for fixing the Y2K bug: windowing, which told the computer to treat two-digit years below a certain cutoff as 20XX and everything above it as 19XX, and full-on reprogramming for four digits. Most programmers went with windowing when they had the option, because it was much, much easier (and therefore faster and cheaper) than trying to reprogram a legacy system to understand four digits. It's important to note that computers are in a lot of things that don't have much memory – parking meters, some cash registers, old gaming systems, etc. – but they all need to know the date and time to issue receipts and count time properly for calculations. And when I say they don't have much memory, I mean some of these legacy systems had been in use since the 1970s, the era of the initial programming that led to Y2K. Windowing was sometimes the only viable option.
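
Here's what windowing looks like in miniature, as a hedged C sketch – the pivot value of 30 is an assumption, since every system picked whatever cutoff made sense for its own data:

```c
#include <stdio.h>

/* A minimal sketch of windowing: years stay two digits in storage,
 * and a pivot decides which century they belong to. */
int expand_year(int yy) {
    return (yy < 30) ? 2000 + yy   /* 00-29 read as 2000-2029 */
                     : 1900 + yy;  /* 30-99 read as 1930-1999 */
}

int main(void) {
    printf("%d %d %d\n", expand_year(99), expand_year(0), expand_year(25));
    /* prints: 1999 2000 2025 */
    return 0;
}
```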

This did kick the can down the road, but it bought time for memory storage tech to catch up. Businesses now had time to find an alternative to their legacy system, or decide to just keep windowing the problem until it was no longer feasible, which might be a while. Ultimately, it was fixed. Not perfect, not infallible, but fixed well enough to prevent a mass computer meltdown. Windowed systems may go out intermittently, but they’re not failing all at once and causing a choke.

For devices that needed to switch to four digits anyway for future-proofing (like the computers found in nuclear plants, banks, and utilities), reprogramming took longer. Again, this was a project in the works for months, if not years, and it came with a very hard deadline. Once the issue was laid out, most organizations jumped to fix it. Approximately $100 billion was spent by the US alone to prevent the potential collapse.

Tidbits

Smaller issues popped up around date and time at the millennium, but they weren't as potentially catastrophic as Y2K itself. Take the leap year problem: by some computers' logic, 2000 wasn't supposed to be a leap year, because every 100 years the leap year is skipped – unless the year is also divisible by 400. Some systems accounted for the first rule but not the second, leading to more programming work. Ironically, if they'd just ignored the 100-year rule, the 400-year rule would never have come into play, but developers were doing their best to avoid patching systems right after installing them so close to the new millennium. Being off by a single day in the correct year wasn't as critical (it meant paperwork dated incorrectly rather than systems crashing), but it was still annoying to fix on top of the other crises happening at the same time.
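
For the curious, the full Gregorian rule fits in a few lines of C – the bug was implementing the first exception without the second:

```c
#include <stdio.h>

/* The complete rule. Systems that implemented the 100-year exception
 * but not the 400-year exception thought 2000 wasn't a leap year. */
int is_leap(int year) {
    if (year % 400 == 0) return 1;   /* 2000: leap after all */
    if (year % 100 == 0) return 0;   /* 1900, 2100: not leap */
    return (year % 4 == 0);          /* everything else: every 4th year */
}

int main(void) {
    printf("1900:%d 2000:%d 2004:%d\n", is_leap(1900), is_leap(2000), is_leap(2004));
    /* prints: 1900:0 2000:1 2004:1 */
    return 0;
}
```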

Another, more minor problem: some systems stored the year as a running count of years since 1900 and simply glued a "19" onto the front for display, without limiting the number of characters in the slot. That count hit 100 in the year 2000, which led to some websites displaying "19100" until they were fixed – alongside the more common incorrect display of 1900.
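
That bug is easy to reproduce even now, because C's standard time structure (like the Perl scripts behind many late-90s websites) counts years since 1900. A quick sketch:

```c
#include <stdio.h>
#include <time.h>

int main(void) {
    time_t now = time(NULL);
    struct tm *t = localtime(&now);

    /* tm_year counts years since 1900, so it reached 100 in the year 2000.
     * Gluing "19" onto the front worked from 1970 through 1999... */
    printf("19%d\n", t->tm_year);       /* the bug: "19100" in 2000 */

    printf("%d\n", 1900 + t->tm_year);  /* the fix: add, don't concatenate */
    return 0;
}
```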

Previous Date Failures

Y2K was the most famous potential failure because of the scale of the consequences. But it wasn't the first! That's actually a good thing, as earlier failures gave engineers and software experts a solid idea of what can and can't be shoehorned or windowed on a large scale. A whole OS like Windows Vista? No. Cash registers? Yes.

In the 1960s, storage space was incredibly limited. This led to a miniature Y2K: some computers stored the year as a single digit, meaning 1969 was the highest they could count to from 1960. Computers were far less widespread then, so it wasn't as critical a problem as the Y2K described above – but again, it was done to save every scrap of memory possible. By the time 1970 rolled around, memory storage had improved enough to afford the second digit.

The date 9/9/99 was another mini Y2K. Strings of 0000 or 9999 are frequently used in programming as sentinel values – markers that tell the computer to take a closer look – since they sit at the lower and upper limits of what a four-digit number can be. When a computer sees one, it may react by shutting the process down. 0/0/00 wasn't usable as a marker because the calendar knows day and month can't be 0, so 9999 became the standard for clock errors and end-of-input flags. But 9/9/99 is a real date, and on some systems it encoded to exactly those four nines. Thankfully – much like Y2K – programmers of affected systems caught it before it became a problem.
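
As a hypothetical sketch of how a perfectly valid date could trip a sentinel check – the no-separator M-D-YY encoding and the function here are invented for illustration:

```c
#include <stdio.h>

#define END_MARKER 9999  /* sentinel: "no more records" */

/* Sketch: a date field encoded as month-day-year digits mashed
 * together collides with a reserved magic value. */
void process(int date_field) {
    if (date_field == END_MARKER) {   /* 9/9/99 encodes to 9999... */
        printf("End of input.\n");
        return;
    }
    printf("Record dated %d processed.\n", date_field);
}

int main(void) {
    process(9899);   /* 9/8/99: handled normally */
    process(9999);   /* 9/9/99: misread as the end-of-input marker */
    return 0;
}
```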

Sources:

https://www.nationalgeographic.org/encyclopedia/Y2K-bug/

https://engineering.purdue.edu/SafeWater/drinkinfo/y2kwater.html

https://www.crn.com.au/news/bank-of-queensland-hit-by-y201k-glitch-163864

https://time.com/5752129/y2k-bug-history/

https://www.newscientist.com/article/2229238-a-lazy-fix-20-years-ago-means-the-y2k-bug-is-taking-down-computers-now/

https://jim.rees.org/apollo-archive/date-bug

Accidentally Breaking a Ship with Divide By Zero

Elizabeth Uncategorized October 18, 2021

The Ship

It's 1997, and computers are making many things easier and more efficient. Enter the Yorktown, a 'smart' Navy ship outfitted with new tech. This ship had been in metaphorically uncharted waters before: in 1988, when the Soviets decided nobody got to use the 'right of innocent passage' in their chunk of the Black Sea, the Yorktown was there to push back a little and ensure innocent passage remained an option. A Soviet ship attempted to shove it back into international waters via hull-bumping, but because the Yorktown was extraordinarily well-outfitted and came with buddies, things didn't go any further. The Soviets begrudgingly updated the relevant handbooks to include the right of innocent passage in their waters – something that had been in the English version of the UN sea-law books but not the Russian one – and promised not to bully innocent boats out of their water again.

The Yorktown was impressive within the military, yet it didn't make much mainstream news beyond that Soviet incident. That's not to say it didn't deserve to – the Yorktown was well-decorated across the board, in everything from safety to combat readiness. It was also already partially computerized, thanks to its young age. That made it an excellent place to start: a great testing ground for the newly possible 'smart' ship technology. Things could only get better once its new, shiny computers were fired up and ready to rumble. The Navy was entering a new era – one of peace, yes, but also one of significant technological catch-up to the consumer and business sides of tech.

The Incident

However, programming has always had its flaws, even when it was many times simpler than what we have today. A small bug could cripple a business until it was sorted out, and the Navy was about to learn this the hard way: a programming flaw consumed literally all of the RAM everywhere on the boat. Someone entered a zero into a database field that was used in a division equation, and the computer locked up while trying to divide by zero. It wasn't a particularly consequential computer, but the failure didn't stop there. It ate all of the RAM. All of the computers shut down – all of them, including seemingly unrelated ones in propulsion – because instead of just locking up the one machine, the error spread through the network in what's known as a buffer overflow.

Luckily, the ship wasn't in much danger – boats float by themselves, and it was in friendly waters just off the coast of Virginia during the incident. Two days later, it was back to business and battle maneuvers. However, whether or not it had to be towed in is a point of contention: reporters who talked to important folks on the boat say that's what they were told, but those same folks later said it got into port under its own power. There's motivation to keep the exact details quiet – buffer overflows were easy to trigger, should a malicious actor get into the LAN – so the exact truth is somewhat obscured.

It recovered eventually, and safeties were put into place so that system administrators could bypass the faulty input and replace it without totally wrecking the entire network in the process.

The Flaw: Divide By Zero Plus Buffer Overflow

It's a common joke: divide by zero, destroy the universe. Convert something into nothing, break the laws of physics and conservation of mass. Dividing by zero is illegal for most computers nowadays – they're programmed not to even think about it, dismissing it right away with an error message instead. If they don't, the computer locks up trying to figure out how to do the impossible. The Yorktown's database manager program wasn't programmed to return an error. Even that might not have been a huge issue if the error hadn't overflowed – the real problem – and the overflow might not have been an issue if the LAN weren't resource-sharing where it shouldn't have been.
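
A guard against this takes only a few lines. Here's a minimal C sketch of the kind of check the ship's database software apparently lacked – all the names here are hypothetical:

```c
#include <stdio.h>

/* Check the divisor before dividing instead of letting the
 * hardware fault on a division by zero. */
double safe_divide(double num, double denom, int *ok) {
    if (denom == 0.0) {
        *ok = 0;            /* report failure instead of locking up */
        return 0.0;
    }
    *ok = 1;
    return num / denom;
}

int main(void) {
    int ok;
    double result = safe_divide(100.0, 0.0, &ok);  /* a blank field read as 0 */
    if (!ok)
        fprintf(stderr, "error: division by zero rejected\n");
    else
        printf("result: %f\n", result);
    return 0;
}
```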

The way a buffer overflow works is that data too big for its allotted space starts overwriting the memory around it, compounding the error and consuming even more RAM, so once it got going, it kept going. Other computers on the network designed to connect to this one began experiencing the overflow as well, and soon the entire network had been eaten by the divide-by-zero error as machine after machine became fodder for a faulty calculation.
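
The classic demonstration of the general hazard is a handful of C – this illustrates buffer overflows in general, not the Yorktown's actual code:

```c
#include <stdio.h>
#include <string.h>

int main(void) {
    char field[8];  /* a fixed-size record field */

    /* strcpy never checks the destination's size, so the extra bytes
     * spill past `field` into whatever memory sits next to it. This is
     * undefined behavior -- the classic buffer overflow. Modern
     * compilers and OSes add guards against exactly this. */
    strcpy(field, "far too long for an 8-byte field");

    printf("%s\n", field);
    return 0;
}
```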

How would you prevent this from happening in the future?

Buffer overflows are still a widely known exploit. In an attack, a malicious actor sends input that breaks the formatting of the destination database, either to lock up the computer or to replace nearby innocent code with malicious code and squeeze in further. Windows NT – the version of Windows the military was using on the boat – didn't have a built-in prevention feature, either, so it really was possible to cripple a 'smart' vessel with a simple typo that would have been harmless on other operating systems.

The simple answer is to just not use Windows NT. If something made it into a database, Windows NT wouldn't check that it fit in the provided array (essentially a size and formatting gate for database fields). A more complicated answer is programming around the software to ensure it returns an error, or refuses to process bad inputs altogether – something the Navy wouldn't have had to do with a different OS. Software failures were allegedly already a known issue in Windows NT, but the only other options were UNIX or custom-building something. The Navy picked Windows because it was slightly more user-friendly, but as a result, patchwork code filled the gaps between the stuff already there and the stuff being freshly installed to make the Yorktown a 'smart' ship. Some things were overlooked.
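
The array-checking fix described above looks something like this sketch – the function and field names are invented for illustration:

```c
#include <stdio.h>
#include <string.h>

/* Validate the input against the field's size and refuse it with an
 * error instead of trusting it blindly. */
int store_field(char *dest, size_t dest_size, const char *input) {
    if (strlen(input) >= dest_size)
        return -1;            /* reject: doesn't fit, don't copy */
    strcpy(dest, input);      /* safe: length already verified */
    return 0;
}

int main(void) {
    char field[8];
    if (store_field(field, sizeof field, "way too long for this field") != 0)
        fprintf(stderr, "error: input rejected\n");
    return 0;
}
```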

After this incident, the Navy pushed for more UNIX and less NT. UNIX, while imperfect, didn’t literally try to eat the entire network over a typo when it encountered a database error. Engineers within the Navy said they pushed back against Windows only for Windows to win anyway in a political victory; this is one of the incidents that forced the Navy to listen to engineers further down the line.

Was it worth it?

Short answer? Yes, actually. The Yorktown saved an estimated $2.9 million and reduced its crew by ten percent, just by computerizing. Human error has also stranded boats; at least the computer doesn't have a sense of lying for self-preservation. Besides, the old boats without computers doing everything were effectively wasting labor, which – in the unprecedented peace just before 9/11 – was unacceptable. Those sailors could be doing something more productive than hand-calculating trajectory and speed.

Those first 'smart' boats were rough, yes, but the experience gained from that buffer overflow in safe waters was invaluable.

Sources:

https://web.archive.org/web/20060208090921/http://www.gcn.com/17_17/news/33727-1.html

https://en.wikipedia.org/wiki/1988_Black_Sea_bumping_incident

https://gcn.com/articles/1998/07/13/software-glitches-leave-navy-smart-ship-dead-in-the-water.aspx

https://slate.com/technology/2018/06/why-the-military-cant-quit-windows-xp.html

https://www.wired.com/1998/07/sunk-by-windows-nt/