Can There Be Another Billy Mays?

Elizabeth Technology May 30, 2023

Or are they all doomed to Milkshake Duck, crash, and burn?

Billy Mays Here

I’m sure you’ve seen his ads at least once. Billy Mays was one of the most famous salesmen for everyday household products like OxiClean and bathtub-ring remover, an amazing pitchman famous for both his delivery of his lines and the sheer variety of stuff he promoted during his career.

He sold everything. He did it while yelling. His consistently cheery demeanor and signature intro became a trademark unto themselves, making him a trustworthy salesman in an era when companies weren’t sure they needed a face. He was a staple of phone-order TV products in the period right before everyone had a website they could pitch instead, filling a transient niche. He sadly passed away from a heart attack in 2009, and nobody has ever been able to take his place.

The Milkshake Duck

A Milkshake Duck refers to a joke tweet about a fictional duck that drinks milkshakes, a duck that everybody on the internet loves – until the second half of the tweet reveals that the duck is racist, something only discovered after the duck became famous. Milkshake Ducks are people who become famous for something cute or funny, only for the spotlight to reveal things from their past they may not have wanted the entire public to see. An unfortunately large number of SNL performers have done blackface, for example, but nobody ever finds out until they’re in front of the camera and people want to know every little detail about them.

Billy Mays appeared during an incredibly unique time in TV history, a time when Twitter was new and celebrities had to really screw up before they’d get called on it. Obviously, this has now changed – while it’s still possible for celebrities to suppress bad news about themselves, it’s much harder to do that when the reporting is crowdsourced by people at varying levels of anonymity.

The question of whether we could get another Billy Mays is complicated tremendously by this problem.

You have to be a little insane to keep up the constant pep and showmanship Billy Mays had for his commercials. You have to be willing to put your name behind things wholeheartedly, like he did – he claimed he used every product he pitched, and many of those products are genuinely good. The ones that aren’t are not poorly made – they’re just not made for everybody. You have to be a ‘Type A’ personality. All of this combines into a person who, simply put, is likely to have gotten into some trouble at some point in their life.

For close comparison, look at the people we saw get big in similar ways after his untimely passing: the Shamwow guy had complaints of domestic violence against him. The MyPillow guy is a notorious conspiracy theorist, but in the racist way, not the fun way. Commercials for products like the Scrub Daddy sponge and other assorted ‘As Seen On TV’ stuff have, instead, gone back to using actors who don’t speak to demonstrate their product with a narrator over the top.

Milkshake Ducks are more common than ever, and the kind of product still using infomercials can’t afford to pick the wrong person – being associated with one may actually tank all of their marketing.

Flex Tape

The only man who’s come even sort of close to him in recent memory is the Flex Seal guy, Phil Swift. Flex Seal is essentially spray rubber, a product that had existed for a while but wasn’t well known outside of the construction and underwater sports markets.

All that stuff from earlier still applies – he’s a little unhinged. Billy Mays was always shouting, but he always maintained a professional demeanor underneath it. The Flex Seal guy will sometimes pull out a chainsaw and look a little too eager to use it, which is to say – exactly eager enough for people to remember. Nobody could replace Billy Mays, because his delivery was unique for the time and the imitators who have cropped up in his absence haven’t captured it – but Phil Swift takes that presentation and tweaks it just enough.

However, while Phil Swift is a close match in this one regard, he doesn’t do the same cross-product work that Billy Mays did. Mays ran a marketing company that other companies would reach out to, but Phil is employed by Flex Seal specifically. He only does Flex Seal. Finding someone who hits all of the critical points has been difficult at best and impossible at worst, and even when a company does find someone, à la Phil Swift, they’re often not willing to go beyond one company like Mays was. Mays was truly rare – I don’t expect we’ll see another one as technology continues to splinter advertising, both online and on traditional TV.


Evolution of the Ringtone in Media

Elizabeth Technology May 25, 2023

Loud Buzzer Noise

You may notice in some older movies that instead of the traditional “Ring! Ring!” of a telephone bell, there’s just a low, flat buzz. This was a direct result of consumers complaining that they thought their own telephone was ringing, got up to answer it, and missed part of the show. To this day, British television shows alter the sound of ringing phones to avoid confusing viewers – although now it confuses them in a different way, because some viewers from other countries assume that this is what British and American phones actually sounded like back in the olden days of black-and-white film.

How media represents a social phenomenon is an excellent window into the forces of the phenomenon’s time – so how have ringtones changed over the past few decades?

The House Phone

Big in horror movies and coming-of-age films, the seventies and eighties had house phones that were physically attached to the wall, with the earpiece connected to the base by a familiar curly cord. Phones were usually situated in the kitchen for mom’s ease of access (society was struggling to adapt to second-wave feminism, and housewives were still plenty common in sitcoms), but with a cord long enough to walk into nearby rooms. Around this time, some shows just let the original ring play, because the phone in the viewer’s house was probably in the kitchen, too far away to be as loud as the sound playing from the television. The eavesdropping gag, where kids listen in on conversations with a second handset hooked up to the house line (a luxury), starts in this era. Doing something comparable now would require the protagonist to accidentally set up a conference call on their mobile phone. It’s easier to just say the character meant for that to happen.

Of course, the non-linear adoption of tech means that these phones hung around for quite some time! Nickelodeon’s As Told By Ginger features an episode using the house phone as a plot point, and that show started in 2000. The utility of the home phone didn’t disappear just because cellular phones were out and about. In fact, cell phones’ unreliability made it kind of impolite to call about serious things on one at first! If you really needed to talk to someone, you called them on a landline, until mobile carriers got their act together.

Something Obnoxious

The 1990s brought portable phones, and the 2000s brought truly customizable flip phones that could do a lot of things, including recording video and audio. You could set a ringtone for each person in your contacts, or use the same ringtone for everything. Cell service wasn’t as rock solid as it is now, but it was still totally possible to hold a conversation over the phone if both parties were in high-coverage areas. Unfortunately, the general public was unused to the tech, and as a result phones were not always silenced in movie theaters or on buses. Or at funerals.

Comedy shows had plenty of fodder to joke about someone forgetting their ringtone was something completely inappropriate in front of strangers, receiving a call, and then fumbling to answer or silence their phone. The days of the home phone weren’t gone, but having a cell phone felt modern and cool. You didn’t have to talk out loud to talk to your friends, you just had to press a key three times to get to the letter V.

Good Vibrations

The phone ringtone flub appeared so much in comedy because it was such a real problem. Don’t text and drive, silence your phone in the movie theater, and set it to vibrate at weddings and funerals so you can discreetly answer a call if you need to.

Now the default is setting phones to vibrate – plenty of people don’t know what their actual ringtone sounds like anymore, because setting the device to vibrate is the easiest way to avoid being annoying in a quiet public place. Worse, some apps have their own specific sounds for alerts. Constantly hearing the tweet noise that the Twitter app used was infuriating.

Ringtones on TV are mostly gone now! More modern media acknowledges that kids just don’t set their phones to make sound anymore. Something may eventually replace the vibrate setting too, but the trend at the moment is a return to the flat buzz of days long past.


Elizabeth Technology May 18, 2023

Tetris, released in the 1980s (the first version was created in 1984, with other countries receiving versions from 1986-1988), is one of the most viral games ever. It’s simple enough that children can play it, but complex enough to keep players of all ages entertained for hours. It doesn’t require the player to speak any particular language – the mechanics are simple enough not to need instructions. And, most importantly, it’s fun. Clearing lines is satisfying. It gets harder the longer you play, so you’re never bored with the difficulty.

Versions of Tetris exist everywhere now. The game itself is as endlessly versatile as eggs. Physics-based. Efficiency-based. Tetris games that want you to fill the board completely, like a puzzle. Tetris games that allow you to squeeze pieces into gaps that are too small, and Tetris games that don’t. Tetris games that troll you. Competitive Tetris, where discarded lines are sent to your enemies. Tetris games where the Tetriminos have five blocks instead of four. The game is endlessly updateable, and the original remains the most ported game in video game history. Difficult but fair is the standard these games have chased since day one.

Tetris Effect

Some players develop what’s known as the Tetris Effect – they’ve played the game so long that it begins to seep into their dreams, and they unconsciously wait for blocks to start descending whenever they aren’t occupied with another task. The Tetris Effect technically refers to any time a person devotes so much time to an activity that it starts to bleed into places it wouldn’t normally be – Rubik’s Cube speed-solvers sometimes involuntarily run through their algorithms in their heads, and chess players may find themselves trying to identify what piece a traffic bollard would be and how it could move on the board.

When you look at it that way, sea legs are part of the Tetris Effect. So is seeing the periodic table behind your eyelids after cramming for a chemistry exam! Tetris first put a name to the phenomenon because the game is so genuinely engrossing that people who weren’t accustomed to this kind of carryover were experiencing the effect for the first time.

Repetitive Games and PTSD

Simple puzzle games have benefits beyond immediate entertainment. Studies seem to suggest that repetitive games like Tetris or word games – something easy enough to be attention-absorbing – can help curb the effects of PTSD after a traumatic event, like a car crash. Specifically, games like Tetris help combat involuntary flashbacks. Treating PTSD with CBT after it develops shows promise, but intervening before it has a chance to really take root would be better. The study size in the initial research was small, but it shows promise.

DOOM (The Game) And Porting

Elizabeth Technology May 16, 2023

DOOM is an incredible game that is famous for running on everything. The game’s code only takes up 2.39 MB (it takes a little more than that to run it), and its method of recording player inputs as demos instead of video – enabling anyone to play back another player’s run in a time when recording games as video usually looked like pixelated garbage – made it extremely popular among people who love speedrunning games competitively.
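The demo idea is simple enough to sketch: because the engine updates the world deterministically, storing just the per-tic inputs is enough to reproduce an entire run. A minimal illustration in Python – the `step` function below is an invented stand-in for the real engine, not DOOM’s actual update rule:

```python
def step(state, tic_input):
    # Stand-in for a deterministic game update; DOOM's real engine
    # advances the whole world 35 times per second ("tics").
    return state * 31 + tic_input

def play_and_record(inputs):
    """Run the game live while logging each tic's input as the 'demo'."""
    state, demo = 0, []
    for tic in inputs:
        demo.append(tic)
        state = step(state, tic)
    return state, demo

def replay(demo):
    """Feed the recorded inputs back through the same engine."""
    state = 0
    for tic in demo:
        state = step(state, tic)
    return state

live_state, demo = play_and_record([1, 0, 2, 2, 1])
assert replay(demo) == live_state  # same inputs, same engine, same run
```

A full run stored this way is just thousands of tiny input records, which is why sharing demos was practical back when sharing video wasn’t.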

All that said, the original version of the game, run on an emulator, functions really well. What about the ports to other platforms?

The Times

Firstly, to ‘port’ anything in software terms means getting it ready to operate on a different system than the one it was first designed for. It’s the process of making the software portable.

Getting DOOM to play on anything is a trivial matter now. But back when DOOM was new and super cool, it wasn’t so easy to move it to handheld game devices or consoles. Picture a game made for the computer – you play it with your keyboard and mouse. To get it ready for the Xbox or the PlayStation, the developers have to change how it handles inputs. They may also have to change textures (an Xbox usually plays on a TV screen, which is larger than a computer monitor) and how the game handles loading. That takes work. And games weren’t an object of respect at that point. They were time wasters, something to keep the kids indoors if it was too hot or too rainy outside to play. A significant number of people involved in the game-making process felt that anything they helped produce just had to be playable; it didn’t have to be good. The gradual dropoff of Atari and the ocean of shovelware games lost to time eventually changed that attitude, but DOOM’s ports to other consoles were an unfortunate victim of it before that happened.

Rush Jobs

Porting to other consoles was like rebuilding the game, and if you don’t respect the game, you’re going to build a facsimile of it good enough that kids will buy it and stop there.

Take the Super NES port from the mid-’90s – the game literally cannot save. You have to beat each episode (nine levels apiece) in its entirety in a single sitting. Bizarrely, some of those episodes won’t let the player alter the game’s difficulty, so playing through on Easy the whole way is not going to happen. It might still have been better than the Sega 32X adaptation, which cut several textures as well as a full episode to make room for the rest of the game! Yeah, you could save, but at what cost? Meanwhile, Atari’s port to its own Jaguar console managed a passable copy of the game at the expense of only five levels and a lot of texture detail. But it could run multiplayer if you had a second Jaguar, which already made it leagues more attractive than other ports at the time. Not that it was good – it sounded bad and looked sort of ugly – but it was better.

Better Versions

Of course, DOOM had good ports as well! DOOM is surprisingly functional as an app on Apple’s App Store. You can’t jump in DOOM, so the controls remained simple enough that players could still see most of their screen back in 2009 when the app was released. Going back to 2001, just a few years after most of these ports, Nintendo’s Game Boy Advance got a surprisingly playable copy of the original game. The PlayStation version from 1995 did a fantastic job of capturing the spirit of the game instead of cutting things for time, even adapting some of the music and lighting so the console could handle it better. In 2006, Xbox LIVE Arcade got a version with multiplayer and 1080p support, and even the Nintendo Switch can play DOOM now.

This isn’t counting emulators that allow the player to play the game on their home computer as if it were the original – the hardware most computers have by default means the game runs as well as the emulator does.

You can see which companies understood the appeal of the game they were porting, in the sense that the companies who went out of their way to make a good version of a simple, violent video game are still mostly competitive today. With the exception of Nintendo’s first chopped-up version of the game and Atari’s functional multiplayer version, the gaming companies who pushed DOOM to the side ended up pushed aside themselves.


Elizabeth Technology May 11, 2023

Pong is one of the earliest arcade video games, originally released in 1972 by Atari – it was actually their first game. It was based on a tennis game for a household console, the Odyssey, manufactured by Atari’s competitor Magnavox. Atari’s version was much more successful, and laid the first bricks in the road for video games as we know them today.

Sue Over Anything

Atari’s new tennis game got into hot water with Magnavox because they were both tennis games. That sounds funny now, but in the era of the first video games, the courts weren’t sure how to handle it. Atari believed it could have won, but fighting Magnavox would have cost more money than the company had at the time. Instead, Atari settled, agreeing to pay Magnavox 1.5 million dollars split across eight payments and to give Magnavox full information on everything Atari was doing for the next year, public or in development. Atari, as a result, delayed some of its products.

In terms of business dealings, Pong’s creators figured Atari would be able to produce the game themselves (instead of licensing it out – this was Atari’s first game that they both made and kept for themselves), but they couldn’t get any credit or loans to actually manufacture the things, because the cabinet looked like pinball at a glance, and banks associated pinball with the Mafia at the time. Eventually Wells Fargo extended Atari credit, and the arcade cabinets went into production at a rate of ten machines a day. Many of them failed quality testing. This was still their first game! Eventually Atari got it together, and even began shipping Pong internationally thanks to their success in the States.

Home Pong, the edition of Pong that gamers could play at home, sold so many units that it became Sears’s best-selling item for the 1975 holiday season, a coveted position that led to dozens upon dozens of copycats entering the market. But it was too late – Atari had won, decisively. Pong was popular and fun among all ages, installed in bars and arcades, or even played at home.

The Age of CRTs

Many early CRT monitors didn’t have great resolution, and it’s not like the computers inside the consoles of the time were powerful enough to display much anyway. Still, in spite of this, Pong’s creator aspired to make the game more interesting than the simple version found in the Magnavox device.

The paddle is designed so that the ball bounces back at different angles depending on which part of the paddle it hits. The longer the players trade the ball back and forth, the faster it goes. The game has a surprising amount of complexity given the simplicity of the tech behind it. Pong doesn’t run on ‘code’ as we understand that word today. The home version ran on a chip, but the arcade-cabinet version that kickstarted Atari ran on a printed circuit board that used transistor-transistor logic to determine where the ball would go. Remember – this is just three or so years after Neil Armstrong set foot on the moon, and Atari was certainly not working with NASA’s budget or technology department. Part of the game, the way the paddles don’t reach the top of the screen, is due to those circuits. It’s a built-in bug, a flaw the creator let slide because it made the game harder. Today, making a Pong game is a popular beginner’s exercise in coding languages like Python, done on machines dozens of times more powerful than the original.
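That angle-by-hit-position behavior is easy to mimic in the beginner’s-exercise spirit the article mentions. A minimal sketch – the maximum angle, paddle size, and speed-up factor here are arbitrary choices for illustration, not values from the original hardware:

```python
import math

MAX_ANGLE = math.radians(60)  # steepest return, at the paddle's very edge

def bounce(hit_offset, speed, paddle_half_height=4.0):
    """Return the ball's new (vx, vy) after striking the paddle.

    hit_offset: distance from the paddle's center; farther from center
    means a steeper return angle, like the original's segmented circuitry.
    """
    angle = (hit_offset / paddle_half_height) * MAX_ANGLE
    speed *= 1.05  # each volley returns the ball a little faster
    return speed * math.cos(angle), speed * math.sin(angle)

vx, vy = bounce(hit_offset=0.0, speed=10.0)  # a center hit goes straight back
```

A center hit returns the ball flat (vertical speed of zero), while an edge hit sends it off at up to 60 degrees – and every volley is a little faster than the last.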

Truly, Pong was a pioneer.

Why Not Make Elder Scrolls 6?

Elizabeth Technology May 9, 2023

The Wii U was a console that operated on the same mechanics as the Wii, but was much more powerful. The Elder Scrolls V: Skyrim is almost unrecognizable from the base game that launched over ten years ago, in 2011.

What was once considered peak design is outdated; what sold well in the past sells well now, but begrudgingly.

What happened to designing games and consoles?

It Has to Be Impressive

The worst recent trend in electronics is that no matter what company is making the product, the product has to be impressive. In fact, newer companies have to be more impressive than ever to get a fighting chance in the market, without costing so much that a potential buyer is turned off. Plenty of smaller companies would love to make games for gaming consoles they designed themselves! (And in fact, plenty of consumers would love to buy a simple device like an iPod shuffle with only 16 GB of memory – but Apple won’t make anything that costs less than $400.) But they can’t keep up with the biggest companies on the market, and trends suggest indie games are where users look to change up their experience, not indie consoles. The Switch is technologically unique, and the PS5 and the Xbox Series X are the most powerful consoles their respective lines have ever produced – nothing but desktop computers can hope to keep up. Buying a console has become like buying a major appliance. If you could spend just $100 more on a fridge that also defrosts itself automatically, wouldn’t you? Sure, minifridges are cute, but unless you’re a college student, you probably have access to a better one. Indie developers over a certain size can design games for the big consoles as well, so the more expensive fridge still has space for artisanal cheeses, even if it wasn’t built just for Mimolette.

The second problem is that gamers sort of don’t want to invest in ‘new’ right now. Everything seems to be on fire outside, and games are a comfortable distraction. Gamers want ‘familiar’. They want the things reviewers have looked at and invested time into, even if they’re realistically a B or C grade game at best. Old, huge companies like Nintendo can make custom-tailored consoles like the Switch, but if Soulja Boy’s console company had come up with the idea first, it might have bombed. Nintendo making a console that mainly serves Nintendo games is no accident, either. If Sony were to release an updated PSP or PS Vita in an attempt to compete, there’s no guarantee it would work out as well as it consistently does for Nintendo – partly because it would be freakishly expensive to match the performance PlayStation fans have come to expect, but also because the PlayStation has an enormous gaming library that’s pretty intimidating to approach as a newcomer. The ratio of games Sony makes in-house to games it outsources is completely different. It couldn’t guarantee a market among either old fans or new ones.

Nintendo releases its own games – Nintendo itself makes Mario, not a gaming studio that Nintendo owns. Nintendo can pull from old catalogue favorites like The Legend of Zelda and remake them for the Switch without starting a copyright spat. This is not only a built-in age gate (Nintendo games made for kids will always look like they were made for kids, and Nintendo games that aren’t, don’t), but an easy flag for quality the consumer can keep track of. Nintendo rarely has ‘bad’ games. You don’t have to do much research ahead of time to know the worst Nintendo game is miles ahead of the worst game available on Steam. When games consistently cost 60 and 70 dollars new, a dud is a serious disappointment. That’s four movie tickets out here in Vegas. Of course people expect to get at least four movies’ worth of entertainment from a game that costs that much.

And Familiarity Wins The Crowd

With all that said, it’s no wonder Skyrim is eating up development time that could have been used on a new Elder Scrolls game. Everyone still likes Skyrim! It plays on every console, so it doesn’t matter which one you’ve already got. You can buy mods for it, download free mods, and the dungeons are neverending if that’s your jam. It’s easy to take a break from serious quests if you want, or to beat the game and spend time doing the multiple hours’ worth of side quests once the final boss is dealt with. Skyrim is a good game. You’re going to have a good time playing it unless you go out of your way not to. It’s comfortable and easy to access. At this point, Skyrim’s replayability may steal sales from the next game, if Bethesda ever gets around to making it. Bethesda has a golden goose, and it’s not going to get rid of it until the goose dies of old age.

Wii U

The Wii was Nintendo’s golden goose, and with the Wii U, Nintendo wanted to make the goose a little better. The Wii U’s mistake wasn’t in making a console designed to appeal to fans of the Wii – its mistake was not making it clear that the Wii U was a younger sibling to the Wii, not its twin. Not enough was done to ensure fans of the console knew the difference, and if you believe it to be a slightly different Wii and not a complete overhaul of the console, why would you ever spend the money on it? The Wii U is more powerful, but it didn’t secure enough games that were unique to the Wii U alone – why would game developers make games for a device that sold as poorly as the Wii U did when they could keep making games for the Wii and get better royalties in the process? The failure to market the console trickled into every facet of its existence and ensured it could never eclipse the Wii.

Skyrim has cursed Bethesda. If the next game is too similar to Skyrim, it’ll be a Wii U. If it’s too different and the reviews are mixed, fans might not take to it so easily after a solid ten years of good Skyrim content.

What is WiFi?

Elizabeth Technology March 30, 2023

Wi-Fi is older than it may seem, having spent quite some time at the fringe of new tech. The market was already flooded with dial-up internet, and replacing it was going to take quite a bit of doing. When Wi-Fi first launched, it had an average speed of 2 Mbps, which was actually pretty good – roughly 35 times faster than dial-up’s maximum of 56 kbps. However, systems were so heavily dependent on dial-up that it took many years for Wi-Fi to become the standard.

Wi-Fi is popularly understood to mean ‘Wireless Fidelity’, but apparently nobody in the labs that studied or made it ever designated Wi-Fi as the official shortening of that term – it just sort of happened, and licensing and officiating went from there.

Kind of Like Radio

AM and FM radio have been around for decades now, and they work fairly similarly to how Wi-Fi would if it did both at the same time. AM radio changes the amplitude of the waves to transmit information across different bands, where FM changes the frequency. However, AM and FM stick to kilohertz and megahertz frequencies, while Wi-Fi operates at significantly higher gigahertz frequencies.

Electromagnetic radiation is a spectrum: at one end there are radio waves, which are extremely low-frequency, and at the other, gamma radiation, which is extremely high-frequency. Visible light falls somewhere in between, where red is closer to the low end and violet is closer to the high end. Microwaves fall on the low side, between radio waves and infrared. A 2.4 GHz microwave has a gap between wave crests of about 12.5 centimeters – the waves aren’t nearly as close together as they are in visible light. (Note – a microwave oven operates at the same frequency, but at much higher power than Wi-Fi. Loud sounds can be the same pitch, or frequency, as quiet sounds; the same goes for microwaves.) Microwaves, just like colors, are broken up into bands, and different frequencies can do different things. For this article, we’re focusing on information transmission.
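The crest-to-crest spacing is just the wavelength, which you can check yourself by dividing the speed of light by the frequency:

```python
SPEED_OF_LIGHT = 299_792_458  # meters per second

def wavelength_cm(frequency_ghz):
    """Wavelength in centimeters for a given frequency in gigahertz."""
    return SPEED_OF_LIGHT / (frequency_ghz * 1e9) * 100

print(round(wavelength_cm(2.4), 1))  # 2.4 GHz Wi-Fi: ~12.5 cm
print(round(wavelength_cm(5.0), 1))  # 5 GHz Wi-Fi: ~6.0 cm
```

Visible red light, by comparison, has a wavelength around 700 nanometers – hundreds of thousands of times shorter than a Wi-Fi wave.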

What Can Stop Wifi?

Wi-Fi does get weaker when walls or other obstacles get in the way, although this is usually a good thing – there are only so many viable ‘bands’ for Wi-Fi to transmit over, just like radio, so crowded buildings would run out of available bands if the signals weren’t so easily stopped. A microwave oven needs metal shielding to contain its much stronger waves, but ordinary walls and other solid materials will eventually absorb Wi-Fi’s weaker signal. Distance stops Wi-Fi, too: the waves lose energy as they travel, and then the carried information is lost.

Bluetooth devices can interact poorly with Wi-Fi as well – they work on similar principles, but Bluetooth is much weaker. If your headphones are undetectable to your phone, even when your device is on, it’s possible the Bluetooth is being drowned out by local Wi-Fi. Bluetooth typically has a range of about 30 feet, compared to Wi-Fi’s much larger 240 feet in ideal conditions.

How Does Protecting WiFi work?

Wi-Fi transmits over those microwave frequencies to bring information to the computer and send it back out.

How do you protect information if it’s just being broadcast like that? A few layers. While Wi-Fi is very similar to radio, it’s not exactly like it – a station’s broadcast blankets the city, and all you have to do is tune in. A computer has to find the network first, and as previously stated, both physical objects and distance can keep Wi-Fi from reaching a compatible device. Distance is the first, solid defense. But if a hacker is in the same building, how do you protect the network then? Assuming their device is within range of the network, can it intercept information sent over that network?

The second part is encryption: it doesn’t matter if the data’s intercepted if the interceptor can’t unscramble it. Transmitting unencrypted data over unprotected Wi-Fi can get you into trouble – see all the warnings about using public Wi-Fi for banking – but encrypting it stops most issues before they start. Hence the rise of VPNs. However, encryption alone won’t stop intruders, so the third part is network security.
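The principle can be shown with a toy cipher – XOR against a shared key, purely for illustration; real Wi-Fi security like WPA2/WPA3 uses far stronger algorithms:

```python
def xor_cipher(data: bytes, key: bytes) -> bytes:
    """Toy stream cipher: XOR every byte against a repeating key.
    Applying it twice with the same key recovers the original."""
    return bytes(b ^ key[i % len(key)] for i, b in enumerate(data))

key = bytes(range(1, 17))                 # the shared secret (toy example)
message = b"account: 1234"
over_the_air = xor_cipher(message, key)   # what a snooper intercepts

assert over_the_air != message                   # scrambled in transit
assert xor_cipher(over_the_air, key) == message  # key holder recovers it
```

Without the key, the intercepted bytes are just noise – which is exactly the point of encrypting before broadcasting.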

The next logical step for a hacker is to get into the protected network and then seek out the info they want, skipping the encryption problem entirely. So the network itself has to be protected as well! Network protection can be passwords, or firewalls, or anything that keeps closed data ports from being opened. An ‘open port’ in data security just means something that will allow packets of data to go in or out. A website has open ports so you can access the information on it, for example. If a poorly configured application on a computer has an open port, it’s listening for information, and that can be used to get into the network, bypassing the encryption.
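You can see the ‘open port’ idea in action with Python’s standard socket library – a successful TCP connection simply means something is listening on that port. The host and port in the comment are placeholders; only probe machines you own:

```python
import socket

def port_is_open(host, port, timeout=1.0):
    """Attempt a TCP connection; True means something accepted it."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

# e.g. port_is_open("127.0.0.1", 631) to check whether a local
# print service is listening on your own machine
```

Attackers automate exactly this probe across thousands of ports, which is why firewalls keep unused ports closed by default.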

2.4 GHz vs 5 GHz

Some modems offer two frequencies of Wi-Fi: a faster channel and a farther-reaching channel. The 5 GHz channel is what you’ll want for video streaming – the frequency is higher, so information reaches your device faster. The 2.4 GHz channel is probably what the printer in the other room does best on. It’s better at penetrating solid objects than 5 GHz and has a larger range, but it’s slower. 2.4 GHz is also more prone to interference, because many things use that frequency – microwave ovens, for example. If you’ve had issues with your Wi-Fi while the microwave is on, get that microwave checked! The odds are good its shielding is faulty.

Modem Vs. Router

What’s the difference? A router routes traffic from your network to the internet. It’s sometimes referred to as a WLAN (or a wireless local area network) device. Most houses have a router because of the number of network-enabled devices in a modern home. Printers are rarely connected by cable to a computer anymore, for example.

A modem, on the other hand, is designed to connect a network directly to the internet over a cabled data line, like a telephone or coaxial line. With dial-up basically non-existent, standalone modems are less visible than they used to be – in most homes, the modem and router are now combined into a single box.

Routers and Wi-Fi are here to stay, at least until the next big thing comes out!


Magnetic Storage Types

Elizabeth Technology March 16, 2023

Magnetic Tape

The most well-known version of tape-based magnetic storage is the kind used for media. When tape-based recording was first introduced, it revolutionized the talk show and DJ-ing scene of the time (mostly post WWII) because it enabled shows to be recorded and played later, rather than live. Music recording tech already existed, but it required physical interaction from the DJ, so it wasn’t as hands-off as tapes were.

The second-most well-known version is the kind used for computer memory! Data is stored on the tape in the form of little magnetic ‘dots’ that the computer can read as bits. Before each pocket of data dots is a data marker that tells the computer how long that pocket should be, so it knows when one set of data ends and the next begins. The polarity of the dot determines its bit value, and the computer can then read all these dots as binary code.
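As a toy illustration of the polarity-to-bit idea – real tape formats layer markers, parity, and error correction on top of this, which are skipped here:

```python
# Toy decoder: each dot's polarity ('N' or 'S') becomes a bit,
# and groups of eight bits become ASCII characters.

def dots_to_text(dots: str) -> str:
    # one polarity maps to 1, the other to 0
    bits = "".join("1" if d == "N" else "0" for d in dots)
    # read the bit stream in 8-bit pockets and decode each as a character
    chars = [chr(int(bits[i:i + 8], 2)) for i in range(0, len(bits), 8)]
    return "".join(chars)

# 'H' = 01001000 and 'i' = 01101001 in ASCII
print(dots_to_text("SNSSNSSS" + "SNNSNSSN"))  # → Hi
```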

This method of data storage was a massive breakthrough, and other mediums continue to use the format even today! Tapes are still in use for big stuff – parts of IBM’s library rely on modern tapes, which can now store terabytes of information at a higher density than disks and flash drives alike. Other memory types relying on magnetic domains include hard disks and drums, to name a couple. All that separates them is material and know-how: the better the magnetizing material on the outside, the smaller the domains can get. The better the insulation between the domains and regular old entropy, the more stable the data is!

Carousel Memory

Carousel memory was an attempt at shrinking the space that magnetic tape took, to the extreme. Instead of one very long piece of magnetic tape on a bobbin, the carousel memory system uses several smaller reels of tape arranged in a carousel pattern around the central read mechanism. Getting to the right info is as simple as selecting the right reel! This has some issues, as you might imagine. Moving parts add complications and an increased risk of mechanical failure to any device, and a device carrying thin, delicate magnetic tape is an especially bad place to start.

However, it wasn’t all bad. Carousel memory was actually quite fast for the time because it didn’t have to rewind or fast-forward as much to get to the right area of code. It could skip feet of tape at a time! This advantage declined as tape tech improved, but it still helped companies trying to squeeze the most life from their machines. The bobbins and individual ribbons were all replaceable, so the tape wasn’t worthless if it got torn or damaged. The carousel itself was also replaceable, so the many moving parts weren’t as much of a curse as they’d be on, say, the first hard disks, which had irreplaceable heads.

Core Rope Memory

Core rope memory featured magnetic grommets, or ‘cores’, on metal ‘ropes’, which were then woven into a fabric the computer could read. In ROM (read-only memory) format, if a wire went through the core, it was a ‘one’, or a ‘yes’. If it didn’t, it was a ‘zero’, or a ‘no’. In this way, the fabric is physically coded into binary that the computer can use. Manufacturing core-rope ROM involved quite a bit of complicated weaving and un-weaving to get the cores in the right spots.
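A rough sketch of the ROM idea – the addresses, words, and core count below are made up purely for illustration:

```python
# Toy core-rope ROM: each stored word is defined by which cores its
# wire threads through. Threaded core = 1, bypassed core = 0.
# (Greatly simplified: real ropes packed many wires through each core.)

ROPE = {
    # word address -> set of core indices the word's wire passes through
    0: {0, 3},      # cores 0 and 3 threaded -> 1001
    1: {1, 2, 3},   # cores 1, 2, 3 threaded -> 0111
}

def read_word(address: int, num_cores: int = 4) -> str:
    """Read a word back out of the 'fabric' as a bit string."""
    threaded = ROPE[address]
    return "".join("1" if i in threaded else "0" for i in range(num_cores))

print(read_word(0))  # → 1001
```

Because the data is literally woven in, changing a single bit meant re-threading wire – hence all that weaving and un-weaving.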

Core rope memory was chosen over tape memory for the Apollo missions, mainly for weight purposes. Tape was great, but not nearly dense or hardy enough for the mission yet, and neither were the other similar core modules available to NASA. A read-only core-rope memory module could store as many as 192 bits per core, where erasable core memory could only manage one bit per core. The read-only module determined a bit’s state from the wires threading each core, while the erasable model (core memory) read the core’s magnetic state instead. The final module sent up to get to the moon weighed a total of 70-ish pounds and read fairly quickly. No tape, core memory, or hard disk available at the time could have matched that weight or speed.

Core-rope memory has its place. It’s very sturdy, and since it relies on the cores to act as bits, it’s possible to visually identify bugs before the memory’s even used, unlike core memory. Both are sometimes called ‘software crystallized as hardware’ because of the core system. It isn’t seen much today, since it is still incredibly bulky, but at the time of its use it was revolutionary.

Core Memory

Core memory is the older sibling of core rope memory, and it stores less. However, the people who got to work with it call it one of the most reliable forms of memory out there! Core memory works much the same as core rope memory, where the bits are stored in cores.

However, the formats are different. If core rope memory is like a binary-encoded scarf, core memory is more like a rug. Thin threads made of conductive material are woven into a grid pattern, with cores suspended where the threads cross each other. The computer understands these threads as address lines, so asking for a specific bit to be read is as simple as locating the X and Y address of the core. A third set of lines, the sense lines, runs through each core on the diagonal, and these are the threads that do the actual reading.

When asked to, the computer sends a current down the address threads for a core and watches the sense line to see if that core flips its magnetic polarity. If it doesn’t flip, it was a zero. If it does, it was a one – and the reading process has now flipped it to zero. This method is known as ‘destructive reading’ as a result; the computer compensates by writing the bit back to where it was after the read. Due to its magnetic nature, the core then keeps this info even after power to it is cut!
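The destructive-read-plus-write-back cycle can be sketched like this – a simplified model, not any particular machine's controller:

```python
# Toy core-memory grid: reading a core flips it to 0 (destructive read),
# so the controller immediately rewrites the value it just read out.

class CoreGrid:
    def __init__(self, rows: int, cols: int):
        self.cores = [[0] * cols for _ in range(rows)]

    def write(self, x: int, y: int, bit: int) -> None:
        self.cores[x][y] = bit

    def read(self, x: int, y: int) -> int:
        bit = self.cores[x][y]
        self.cores[x][y] = 0          # the read itself flips the core to 0
        if bit:
            self.cores[x][y] = bit    # write-back cycle restores the 1
        return bit

grid = CoreGrid(4, 4)
grid.write(2, 3, 1)
print(grid.read(2, 3), grid.read(2, 3))  # → 1 1 (the value survives repeated reads)
```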

This link here is an excellent, interactive diagram of the system.

Even though this improved the bit-to-space-taken ratio, core memory still aged out of the market. With the price of bits decreasing rapidly, core memory got smaller and smaller, but the nature of its assembly meant it was almost always done by hand – all competitors had to do was match the size and win out on labor. Soon, its main market was taken over by semi-conductor chips, which are still used today.

Magnetic Bubbles

Magnetic memory has had strange branches grow off the central tree of progress, and magnetic bubble memory is one of those strange shoots. It was invented by one engineer, Andrew Bobeck of AT&T’s Bell Labs, who also developed other forms of memory there. Bubble memory never took off in the same way other magnetic memory styles did, although it was revolutionary for its compact size – for a while, people thought this was the next big leap in technology. It was effectively shock proof! Unfortunately, better DRAM chips took off shortly after it hit the market and crushed bubble memory with improved efficiency.

Anyway, bubble memory worked by moving the bit to-be-read to the edge of the chip via magnets. The magnetic charge itself is what’s moving the bits, much in the same way electrons move along a wire when charge is applied, so nothing is actually, physically moving within the chip! It was cool tech, and it did reduce space; it just didn’t hold up to semi-conductor memory chips. Bubble memory saw a brief spike in use when DRAM chips were in short supply, but it was so fiddly that as soon as DRAM was available again, it went out of style.

Semi-Conductor DRAM – Honorable Mention

DRAM chips are a lot like core memory, in that the device is reading the state of a physical object to determine what the bit readout is. In semi-conductor chips, that physical object is a tiny capacitor, hooked up to a tiny transistor, on semiconductive metal-oxide material. Instead of determining magnetic state, the device is instead checking if the capacitor’s discharged or not. No charge = 0, yes charge = 1. These chips aren’t technically magnetic, but since they’ve killed so many of the other options, here they are!

DRAM stands for Dynamic Random-Access Memory, and it means that the memory can be accessed randomly instead of linearly. As long as the computer knows where the data’s stored, it’s able to pull it without pulling other files first. They’re still being sold today!
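The random-versus-linear difference can be sketched with a plain Python list standing in for the storage medium (the stored records here are invented for illustration):

```python
# Random access vs linear (tape-style) access.
# With a known address, a RAM-style lookup is one jump;
# tape has to roll past everything before the target.

memory = ["boot", "config", "photo", "song", "document"]

def random_access(address: int) -> str:
    # one jump, cost independent of where the data sits
    return memory[address]

def linear_access(address: int) -> str:
    # wind past every earlier record before reaching the target
    steps = 0
    for i, _ in enumerate(memory):
        steps += 1
        if i == address:
            break
    return f"{memory[address]} after {steps} steps"

print(random_access(4))   # → document
print(linear_access(4))   # → document after 5 steps
```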

Magnetic Disk (Hard Disk Drive)

Hard drives work more like tape than core memory. A Hard drive is a platter (or a stack of platters) with a read-write head hovering above it. When you want to save data, the hard drive head magnetizes areas in binary to represent that information. When you want to read or recover that data, the head interprets these areas as bits in binary, where the polarity of the magnetized zone is either a zero or a one.

The zones of magnetization are incredibly tiny, which makes hard drives one of the more demanding memory forms out there, both now and back then.

Early hard drives could suffer from ‘de-magnetization’, where a magnetic disk’s domains were too close and gradually drew each other out of position, slowly erasing the information on the disk. This meant that the disks had to be bigger to hold the data (like everything else at the time) until better materials for data storage came along. Even though they held more capacity at launch, they were passed over for smaller and more stable stuff like tapes and core memory. The very early drives developed by IBM were huge. Like, washing machine huge. They didn’t respond to requests for data very quickly, either, which further pushed reliance on tape and core technology.

Over time, hard disks improved dramatically. Instead of magnetic zones lying flat, end-to-end, standing them up vertically next to each other (perpendicular recording) created even denser data storage, enough to outcompete other forms of media storage entirely. Especially small hard drives also come with a second layer of non-magnetizable material between the first layer and a third layer of reverse-magnetized ‘reinforcement’ which keeps the data aligned right. This enables even more data capacity to be crammed into the disks!

Sometime in the 80s, hard drives finally became feasible to use in personal computers, and since then they’ve been the standard. SSDs, which don’t have any moving parts whatsoever, are beginning to gain ground in the market, but they can’t be truly, irrevocably erased like hard drives can due to different storage techniques. Hard drives are going to stick around a while, especially for the medical and military industries, as a result!

Sources: (all primary sources regarding carousel memory are in Swedish)

Cartrivision – Another Attempt to Curtail Home-Viewing

Elizabeth Technology February 9, 2023

Cartrivision was the very first tape system to offer home rentals. It was introduced back in 1972, and didn’t see very much mainstream success – you had to buy an entire TV to play the tapes, and some of the tapes were not rewindable.

You may have actually seen them before, in the cliff notes of a documentary: Game 5 of the 1973 NBA Finals was recorded, but somehow every recording except for one Cartrivision tape failed or was lost, so retrieving the footage stored on that tape became the obsession of the documentarian. The documentary even won an award.

What makes Cartrivision so special that just recovering one warranted a documentary?

This Was Super Expensive

As mentioned before, the Cartrivision player came built into a TV, and TVs were already expensive. The result was a device that cost the equivalent of a decent used car (about $1,300 in early-1970s money, or about $8,000 today). This, understandably, meant that the market for these devices was already kind of niche. But wait, there’s more! As an added expense, most of the fictional stories available for Cartrivision devices were rental-only, and only non-fiction could be purchased to own. This meant you couldn’t build a catalogue of fictional stories for home use after you made this huge investment for the machine. Why not just ‘keep’ them, you may ask?

Because the Cartrivision tapes came with a built-in mechanism that prevented home machines from rewinding the rental tapes! Rental tapes, much like Flexplay discs, were denoted by a red cartridge. Unlike Flexplay’s discs, though, they could only be played once. You could pause them, but never go back. The movie studios were worried that Cartrivision could disrupt the movie theater market, and as such the Cartrivision people had to be careful not to make things too convenient, to avoid spooking the people providing them their IPs. They were the very first, after all.

Perhaps You Went Too Far

The company discontinued their Cartrivision manufacturing after a little over a year, thanks to poor sales. Users generally don’t want to pay twice for something, and the red tapes were just not convenient enough to warrant buying a specific (and very expensive) TV for a lot of families. Cartrivision then liquidated their stock, but a large number of tapes were destroyed thanks to humidity in one of their warehouses, making them even harder to find today. Cartrivision TVs were suddenly cheap enough for tinkerers to buy and modify themselves, and many did – there are few or no original, mint-condition Cartrivision TVs on the market that aren’t also broken.

Additionally, Cartrivision tapes come in a strange size, even though the tape itself remains fairly standard. They were custom-made for one specific machine, so they were allowed to be weird in as many ways as they wanted, but as a result they are incredibly finicky to work with if you don’t have one of Cartrivision’s proprietary watching machines. If you didn’t get a Cartrivision during their liquidation sale, you’d have no reason to buy and preserve their proprietary tapes.

Speaking of the tapes, the company started selling the red tapes, but not the machines used to rewind them – there were fewer of those to start with, anyway. Home Cartrivision fans had to take apart the cartridge and physically rewind the tape themselves to watch their content. Magnetic tape is fragile, so this was never a permanent fix, and it came with the disadvantage of damaging the art on the box to reach the hidden screws that held the case together. Even untouched tapes degrade over time in ideal conditions, getting sticky and brittle inside the case, which makes them unplayable. There are, effectively, no working Cartrivision tapes left – not without a lot of finagling. The people who rescued the NBA game tried everything from freezing the tape to baking it and scrubbing it with special cleaners, and even after they got it to play, they had to do quite a bit of digital touchup with a computer. Anything less profitable or historic recorded to Cartrivision tapes alone may very well be lost to time.

Just like Flexplay, the red plastic left behind by Cartrivision is a warning: if it’s not better than what’s already out there, customers aren’t going to go for it.

What is Bluetooth?

Elizabeth Technology January 31, 2023


Bluetooth behaves a lot like ordinary WiFi, but over much shorter distances. Its frequencies fall between 2.402 and 2.48 GHz, which, as you might remember from the WiFi article, is fairly low-powered. The max range that most consumer devices can reach is only about 30 feet. This works to its advantage! Bluetooth was designed in a time when rechargeable batteries were either small or high-powered, but never both. As a result, it’s one of the better ways to control peripherals. How many times a year do you have to replace your wireless mouse’s battery, after all?

Bluetooth is currently at version 4, on the cusp of version 5. Version 3 saw major upgrades to speed, but no improvement in battery drain; version 4 delivers all of that speed for less consumption.

Bluetooth standards are maintained by an outside, not-for-profit body of experts, the Bluetooth Special Interest Group. If something wants to be called Bluetooth, it has to go through them first – they oversee the licensing of the term for businesses worldwide.

The History

Bluetooth and WiFi have a lot in common with each other. The lower bands of their wavelengths actually overlap some, and they’re both capable of transmitting a lot of complicated information to and from devices. The very first traces of something Bluetooth-like started in 1989, and its earliest potential use cases were unfortunately blocked by battery constraints – the 1990s were partially known for obnoxiously large mobile phones, and getting any info anywhere without a cord could get expensive resource-wise. However, the creators didn’t give up! The team that created it was pretty small, but they wanted to see it used elsewhere, and so they made it public – other people could get in on ‘short-link radio’, and its first working version ran on two competitors’ devices, a phone and a Thinkpad.

From there, it’s been everywhere. Bluetooth first saw commercial use in 1999, with a wireless, hands-free headset. The first phone with Bluetooth rolled out in 2001, and Bluetooth version 1 had a top speed of about 721 kbps. It was barely enough for the compressed data from a phone call to get to the headset and back, and it wasn’t nearly enough for music, but it was still incredible. Hands-free phone calls! Hands-free phone calls!!!

Bluetooth Version 2 doubled that speed and also made pairing much easier, and both of these made speakers more possible. Version 3 was even better, so much better that it could stream video wirelessly between devices – its data transfer speeds were up to 24 Mb/s, because it established a connection directly to the device’s WiFi protocols. Versions 4 and 5 promise even more – all of this, minus the heavy battery consumption that can come with using WiFi to stream things. As rechargeable batteries improved, so did Bluetooth. Where WiFi failed or was impractical, Bluetooth swooped in on machinery and appliances.


Bluetooth is still in use everywhere today! Where cords and cables can’t do the job or would be inconvenient, Bluetooth swoops in.

Mice. Keyboards. Radio dongles. Car infotainment systems. Headphones. Speakers. Truly, Bluetooth revolutionized the way people thought about their peripherals, and turned serious, irritating issues with file transfers into minor inconveniences. WiFi ad hoc was the prior protocol – it was very annoying to set up and maintain a connection, but the only other options were often cords. And, unlike WiFi, walls usually don’t stop Bluetooth transmissions!

Better yet, the tech has slowly improved over the years, and now it can transmit at speeds its earliest versions could have only dreamed of. Version 4 transmits at speeds up to 25 Mb/s, on par with version 3 but with much less battery consumption. Bluetooth can stream between devices with very minimal delay, making it a popular choice for soundbars and other similar peripherals.

Flaws (and Fixes)

Much like WiFi, Bluetooth can be interrupted by a microwave that isn’t properly shielded. If your headphones begin to act up whenever the microwave is on, it might be time to replace the microwave! WiFi itself can interfere too – if your devices won’t pair, it could be because the Bluetooth band and the WiFi band are overlapping. Move your device further away from the WiFi router so it can connect, and then move it back once the devices have paired. Bluetooth is also (usually!) very short-range, and most consumer devices only broadcast to about 30 feet. It’s designed to be convenient, not powerful. Still, the newer versions of Bluetooth are able to reach further and further.

Connectivity issues are also unfortunately common. Bluetooth Smart, the low-energy flavor of Bluetooth introduced with version 4.0, doesn’t get along with older versions of Bluetooth. Similarly, new versions are backwards compatible, but two older devices may not be able to communicate if they’re both from before Bluetooth became backwards compatible in the first place.

Getting two devices that are compatible to communicate can be annoying too! And since so much of Bluetooth is hidden from the user, it’s possible to get stuck in a loop of turning both devices off and back on again to try and get them to connect. Especially if it’s something like a speaker or a pair of headphones – if it doesn’t connect while the other half is actively looking, you’ll have to start over. There’s not really a way around it: Bluetooth can’t be actively looking for the connection forever, and not every device can have a screen for users to monitor the connection’s progress.

All that said, though, Bluetooth is still generally the most convenient wireless option that still delivers quality sound.