Parallel Ports: A Brief History

Elizabeth Technology August 15, 2023

Parallel and serial ports used to be everywhere, and now they’re more or less limited to ancient printers and old iPads. What happened to them?

Serial

Serial ports came first, but just barely. Serial in this context means the information is transmitted one bit at a time, in a single stream, which the receiving device then has to stack back up to read. For example, printers: a serial connection gives the printer the ASCII data one bit at a time, and it’s up to the printer to stack up the bits to make the words.
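
The “one bit at a time, stacked back up by the receiver” idea can be sketched in a few lines of Python (the function names here are illustrative, not any real serial-port API):

```python
# Hypothetical sketch of what a serial link does: the sender flattens
# each byte into individual bits, and the receiver stacks them back up.

def serialize(data: bytes) -> list[int]:
    """Flatten bytes into a single stream of bits, LSB first per byte."""
    bits = []
    for byte in data:
        for i in range(8):
            bits.append((byte >> i) & 1)
    return bits

def deserialize(bits: list[int]) -> bytes:
    """Stack the bit stream back into bytes, 8 bits at a time."""
    out = bytearray()
    for i in range(0, len(bits), 8):
        byte = 0
        for j, bit in enumerate(bits[i:i + 8]):
            byte |= bit << j
        out.append(byte)
    return bytes(out)

stream = serialize(b"Hi")
print(stream[:8])           # bits of 'H' (0x48), least significant first
print(deserialize(stream))  # b'Hi'
```

The receiver only gets the words back if it stacks the bits in the same order the sender flattened them, which is exactly the job the printer is doing in the example above.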

But if the data’s being transferred one bit at a time, why does it need so many pins?

On a computer, each pin on a serial port does something different – some regulate input and output speed, some are purely for grounding the connection, and some carry the requests for data between the computer and the peripheral it’s connected to. And each peripheral has different needs! A mouse or a CNC machine is going to need more information about the data than a printer or a barcode scanner. There might also be a parity pin, which helps verify the data sent is correct. Serial connectors came in all sorts of shapes and sizes, from circular 7-pin plugs to the trapezoidal 25-pin connectors mounted on motherboards.
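
As a rough sketch of what that parity check does (even parity here, with made-up helper names):

```python
# A sketch of even parity, the kind of check a parity pin supports:
# the sender adds one extra bit so the total count of 1s is even, and
# the receiver flags the data if the count comes out odd.

def parity_bit(bits: list[int]) -> int:
    """Return the extra bit that makes the total number of 1s even."""
    return sum(bits) % 2

def check(bits: list[int], parity: int) -> bool:
    """True if the received bits still match their parity bit."""
    return (sum(bits) + parity) % 2 == 0

data = [1, 0, 1, 1, 0, 0, 1, 0]  # four 1s
p = parity_bit(data)             # 0: the count is already even
print(check(data, p))            # True: transmission looks clean

data[3] ^= 1                     # flip one bit "in transit"
print(check(data, p))            # False: the error is detected
```

Note that parity only catches an odd number of flipped bits – two errors cancel out – which is why it’s a sanity check, not a guarantee.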

Serial ports are actually faster (now) than parallel ports precisely because the data’s transferring one bit at a time. If you can make each bit transfer faster, the entire serial connection speeds up with it – a single line’s signaling rate can keep climbing as the electronics improve, so serial ports could keep up with computers as they got better. Parallel ports have to be sure their data is received all at the same time: if one pin can’t be optimized any further, that pin holds back the speed of every other data-carrying pin. If the data doesn’t all arrive together, the computer doesn’t know how to interpret it. Imagine receiving parts for an IKEA chair out of order and being told you had to start assembling it now, even though you don’t have the legs or screws yet.

Parallel

Parallel ports actually appeared at about the same time as serial ports, and allowed for multiple streams of bits (the ‘parallel’ part) instead of just one. The technology was feasible in the 1970s, when Centronics introduced its parallel printer interface, and IBM popularized its own variant with the PC in the early 1980s. Printers were where parallel ports found most of their use. The pins sped up printing by presenting the ASCII data (ASCII being a character encoding that maps letters and symbols to binary values) to the printer all at once, instead of serially.

However, parallel ports came with a couple of problems. They couldn’t match a serial port’s speed once bit-cycle times dropped, and the major companies using them for their printers each came up with different protocols for their operating systems, so everything had to be double-checked for compatibility.

Where’d They Go?

USB has taken over much of the parallel port’s turf, and where USB is inconvenient, network printing rules supreme. There’s not much space left for these parallel pin plugs out in the wild. They’re still around – people still need access to legacy machines no matter the industry or time – but they’re not usually on regular consumer electronics anymore.

And yet, they aren’t extinct. Serial ports still exist on old or simple tech that can’t take high speeds and still function, things like scientific equipment, or stenotype machines. Because the transfer’s tightly regulated, serial ports avoid overloading the tiny computers inside these several-thousand-dollar instruments.

Universal Serial Bus (or USB) plugs use similar tech, just miniaturized and much faster. USB also transmits data serially, hence the ‘serial’ in the name. Parallel ports may have been left behind, but serial’s sticking around. If you look inside the actual connecting piece, you’ll still see pins, albeit different ones than the kind serial connectors used to use.

Serial ports represent a major breakthrough in data transfer tech, and they’ve stuck around to this day!

Sources:

https://www.howtogeek.com/171947/why-is-serial-data-transmission-faster-than-parallel-data-transmission/

https://computer.howstuffworks.com/serial-port.htm

Who is this Apple Gadget Built For?

Elizabeth Technology July 11, 2023

Apple’s newest VR headset, the Apple Vision Pro, is technologically impressive – but it’s priced out of reach for many VR enthusiasts. Who is it for, exactly?

The Device Costs As Much as a Used Car

Apple’s Vision Pro headset costs $3,499, not including tax. It’s such a gigantic price point that it’s hard not to picture what an average consumer could get elsewhere with that money. Many high schoolers drive cars that cost less than this device – before the pandemic upended both the new and used car markets, finding a used car that still drove reliably for under $2,000 was possible. Less, even, if that high schooler had family looking to get rid of a beater. As of this article’s writing, you could get four touchscreen Dell computers with Intel i5 processors, 16 GB of RAM, and Windows 11 included for the same price as one Vision Pro headset.

This is the latest in a long line of prohibitively priced Apple products. Apple devices were always costly, but they were a sort of costly that made sense – if you really needed a photography-grade camera in your phone, you could save up for an Apple device and be set for years. Lately, every device demands a significant investment, whether ‘better’ equipment exists for the task or not.

Technologically Impressive

It was always going to be expensive, though, even without Apple branding. The price reflects what might be a breakthrough in wearable tech, and the device is certainly impressive. It can do the things an iPhone can do as well as simulate virtual reality around the wearer. You may be able to walk around with these things on without being completely blind behind the visor – the presentation states that this is the first Apple product you look through, not at. The ‘screen’ isn’t literally transparent – cameras on the outside pass video of the world through to displays inside – but the effect is seeing reality with the augment layered over it, the way Google wanted to do it years ago but couldn’t make look natural.

This isn’t smaller or more discreet, but “ski goggles” that can run intensive apps without another computer attached to them is the stuff science fiction writers have been dreaming of! It’s goofy now, sure, but the iPhone was goofy – the first iPhone was four or five good products mushed into one okay-ish product before it found its footing and started doing things well.

While this is goofy, and expensive, and right now its usefulness is pretty much limited to entertainment, it is still impressive. Someday it might be a regular piece of techwear. That depends on what people find it useful for – the true question that determines any product’s life.

How Much Use Are Headsets Anyway?

Most of the advertising seems to suggest this is best for filming videos and consuming content. That’s certainly an increasing part of everyday life for many people, but for the same price, those people could buy a quality TV, soundbar, and couch and have a decent home entertainment system that shows stuff to more than one person at a time. Even if they wanted the apps, they could buy at least two of the newest iPhones for the same price as one set of ski goggles. Nobody can agree on whether headsets, augmented reality glasses, and metaverses have real value beyond entertainment.

Potential monetary gain is getting in the way of real assessments! If augmented reality or metaverses ever find their footing, the money made by the people who establish themselves first will be completely insane the same way the first NFT sales were insane. They have a motivation besides advancement of the technology to push this stuff. It’s why Facebook bought Oculus, and then never seemed to do anything with it. Zuckerberg saw the potential for purchasable avatar clothing and virtual storefronts that would have to pay ‘rent’ for the virtual space, which Facebook/Meta could sell for massively inflated prices compared to website domains.

Worse, some of the people pushing hardest for Metaverse successors don’t even think that potential money will last, they just figure a boom-bust cycle is inevitable – the sooner the boom, the sooner they can extract money from people and then bounce before it all comes crashing down.

However, while that attitude is everywhere within the companies, it’s getting in the way of making an enjoyable experience for the end user. There is no money to be made until the consumer is having enough fun to spend a couple dollars on a virtual arcade game, or uses their avatar enough to buy it a funny hat. The only reason so many of the crypto ones exist at all is because the funding comes in before they have time to set up all the little microtransactions designed to bleed consumer wallets dry. Once those are in place, the Metaworld’s player count usually drops sharply. Not even the worst arcades in the world steal quarters like these places plan to, and every single one thinks they’re the first to have the idea.

 Metaverses are often painted as a sort of cyberpunk wonderland, the future, the inevitable next step in technology, but they never seem to end up getting there because ‘visionaries’ and ‘early adopters’ make promises they can’t keep and slink away with whatever they got first. If the virtual parts of augmented and virtual reality never improve because of this cycle, then there just won’t ever be a stable set of apps and programs to use on the very expensive hardware bought to facilitate it. Apple has the potential to fix the second part of that because it has final say on every app in the app store as well as the funding necessary to make new and exciting apps for the headset should it choose to do so, but the first part is going to take some serious reimagining of the space’s potential.  

In the face of all of that, what can a peripheral do to prove itself worthy of a consumer’s time? Does Apple really believe this headset is the future, or is it banking on customers buying it to use as a status symbol-slash-fashion statement? For that matter, if money is removed from the conversation, if you could just have one, would you want it, and use it if you got it?

What would you use it for?

It’s Summer for Computers, Too

Elizabeth Technology June 22, 2023

Listen, sometimes machines get old, and they work too hard, and then you don’t want to burn yourself by watching Netflix, so you resort to other methods of cooling your computer. There are right ways, and there are wrong ways.

DON’T: Put Your Machine in the Freezer or Fridge

It sounds like a good idea, but it’s really not. Condensation can form on and inside the machine, and that moisture can permanently break things as it drips onto components inside your device. Plus, if it’s a systemic issue like a broken fan or an overworked CPU, this isn’t actually fixing anything. You’d be taking your machine in and out of the freezer forever!

Cold screws up glues over time, too, meaning internal elements can gradually wiggle their way loose.

As an unrelated hack, freezing gum can usually get it off the bottom of your shoe.

DON’T: Put Ice Packs, Popsicles, or Bags of Ice on or in the Machine

Condensation, once again, can ruin your machine if it drips into the wrong spot. However, ice bags have the added danger of leaking! Ice sometimes has sharp enough points to pierce its own bag. Popsicles, while usually sealed for safety, are not worth the risk of some sharp component in your machine piercing the bag full of sugary dyed liquid. If that doesn’t kill the machine, it will make you wish it had when the keyboard is too sticky to type on quickly.

DON’T: Run Every Program at Once

You shouldn’t be running high-render-distance Minecraft alongside max-settings Overwatch while also running your internet browser for a live YouTube stream in 4K unless you’ve got a supercomputer. If your computer even lets you get those programs open and running, but you notice it’s unusually, abysmally hot, those programs might be contributing. You can overload your CPU! If you can’t identify which program specifically is eating up all your CPU’s power, check the Task Manager. Windows devices have a Task Manager that lets you see how much of the RAM, the hard drive, and the CPU each program is using. Just hit Ctrl + Alt + Delete and you’ll reach a menu with Task Manager near the bottom. If you can’t narrow your issue down to a specific program, restarting the computer may fix whatever background program has gotten stuck in the RAM. It’s a good idea to reboot regularly anyway!

Now that we’re past the don’ts, what should you do? You obviously can’t let it stay hot, that will slowly fry the hard drive. Excessive heat is worse for electronics than cold is, especially the kinds with batteries in them. You should take steps to cool off your machine if it’s getting ridiculously hot.

DO: Use a Fan

There’s a small fan inside of your computer already. If it’s not cutting it, then the next best step is to use a real fan, and just position the intake for your device in front of it. The extra air flow is just doing what the fan inside the device was already doing, but on a bigger scale! You might find that repositioning your computer so the fan will fit by the intake can help cool it down, too – computers in front of windows might be absorbing more heat than you realize.

DO: Use a Specially Designed Cooling Pad

Some companies sell cooling pads, pads that cool the device down externally. These are specially designed to avoid generating condensation inside the device, while still wicking away heat safely. If you can’t get a fan into the area it needs to be, a cooling pad is a solid second option. Unfortunately, due to the shape and size of PC towers, this is generally only feasible for laptops.

DO: Make Sure the Vents Are Clear

If the machine’s pretty young, and the programs on it aren’t too intense for its specs, the reason may be external. Check where its vents are! Especially for PCs – if the tower is pushed right up against the wall, it might not be able to get the airflow it needs. Also, don’t put stickers or decorations over vents. That’s bad for the vents’ venting power.

Speaking of vents, make sure the vents are cleared of dust, too! Cleaning them improves efficiency.

DO: Restart Every Once in a While

Your computer is doing a lot of things in the background for you. Many programs are still doing things after you close them! Steam, a popular gaming platform, is almost always connected to the internet even when users aren’t looking. It connects at startup, and it keeps an eye on its own connection to let you know if you’ve lost internet. As such, it’s important to occasionally restart, so these programs don’t ‘get stuck’ eating processing power for their own little functions.

DO: Consider a Shop

If the computer’s hot enough to fry eggs, the odds are pretty good that something’s up with the CPU, the fan, or its own internal thermometer, depending on the age of the machine. If you’ve tried everything you can think of to cool it off, or to keep it from getting so hot in the first place, it might be time to visit a shop. At the very least, you should be keeping backups of your files. If the heat eventually kills the machine, a backup saves you a lot of money on very expensive data recovery.

Sources: https://www.crucial.com/support/system-maintenance-cooling

Why Not Make Elder Scrolls 6?

Elizabeth Technology May 9, 2023

The Wii U was a console that operated on the same mechanics as the Wii, but was much more powerful. The Elder Scrolls 5: Skyrim is almost unrecognizable from the base game that launched over ten years ago, in 2011.

What was once considered peak design is outdated; what sold well in the past sells well now, but begrudgingly.

What happened to designing games and consoles?

It Has to Be Impressive

The worst recent trend when it comes to electronics is that no matter what company is making the product, the product has to be impressive. In fact, newer companies have to be more impressive than ever to get a fighting chance in the market, without costing so much that a potential buyer is turned off. Plenty of smaller companies would love to make games for gaming consoles they designed themselves! (And in fact, plenty of consumers would love to buy a simple device like an iPod Shuffle with only 16 GB of memory – but Apple won’t make anything that costs less than $400.) But they can’t keep up with the biggest companies on the market, and trends suggest indie games are where users look to change up their experience, not indie consoles. The Switch is technologically unique, and the PS5 and the Xbox Series X are the most powerful consoles ever in their respective lines – nothing but desktop computers could even hope to keep up. Buying a console has become a market like buying a major appliance. If you could spend just $100 more on a fridge for one that also defrosts itself automatically, wouldn’t you? Sure, minifridges are cute, but unless you’re a college student, you probably have access to a better one. Indie developers over a certain size can design games for the big consoles as well, so the more expensive fridge still has space for artisanal cheeses, even if it wasn’t built just for Mimolette.

The second problem is that gamers sort of don’t want to invest in ‘new’ right now. Everything seems to be on fire outside. Games are a comfortable distraction. Gamers want ‘familiar’. They want the things that reviewers have looked at and invested time into, even if they’re realistically a B or a C grade game at best. Old, huge companies like Nintendo can make custom-tailored consoles like the Switch, but if Soulja Boy’s console company had come up with it first, it might have bombed. Nintendo making a console that mainly serves Nintendo games is no accident, either. If Sony were to release an updated PSP or PS Vita in an attempt to compete, there’s no guarantee it would work out for them the way it consistently does for Nintendo – partly because it would be freakishly expensive to match the performance PlayStation fans have come to expect, but also because the PlayStation has an enormous gaming library that’s pretty intimidating to approach as a newcomer. The ratio of games Sony makes to games it outsources is completely different. It couldn’t guarantee a market for either old fans or new ones.

Nintendo releases its own games – Nintendo makes Mario, not a gaming studio that Nintendo owns. Nintendo can pull from old catalogue favorites like Legend of Zelda and remake them for the Switch without starting a copyright spat. This is not only a built-in age gate (Nintendo Games made for kids will always look like they were made for kids, and Nintendo Games that aren’t, don’t) but an easy flag for quality the consumer can keep track of. Nintendo rarely has ‘bad’ games. You don’t have to really research that ahead of time to know the worst Nintendo game is miles ahead of the worst game available on Steam.  When games are consistently reaching 60 and 70 dollars new, a dud is a serious disappointment. That’s four movie tickets out here in Vegas. Of course people are expecting to get at least four movies’ worth of entertainment from a game that costs that much.

And Familiarity Wins The Crowd

With all that said, it’s no wonder Skyrim is eating up development time that could have been used on a new Elder Scrolls game. Everyone still likes Skyrim! It plays on every console, so it doesn’t matter which one you’ve already got. You can buy mods for it, download free mods, and the dungeons are neverending if that’s your jam. It’s easy to take a break from serious quests if you want, or to beat the game and spend time doing the multiple hours’ worth of side quests once the final boss is dealt with. Skyrim is a good game. You’re going to have a good time playing it unless you go out of your way not to. It’s comfortable and easy to access. At this point, Skyrim’s replayability may steal sales from the next game, if Bethesda ever gets around to making it. Bethesda has a golden goose, and it’s not going to get rid of it until the goose dies of old age.

Wii U

The Wii was much the same, except Nintendo wanted to make the goose a little better. The Wii U’s mistake wasn’t in making a console designed to appeal to fans of the Wii – the Wii U’s mistake was not making it clear that the Wii U was a younger sibling to the Wii, not its twin. Not enough was done to ensure that fans of the console knew the difference, and if you believe it to be a slightly different Wii and not a complete overhaul of the console, then why would you ever spend the money on it? The Wii U is more powerful, but it didn’t secure enough games that were unique to the Wii U and the Wii U alone – why would game developers make games for a device that sold as poorly as the Wii U did when they could keep making games for the Wii and get better royalties in the process? The failure to market the console trickled into every facet of its existence and ensured it could never eclipse the Wii.

Skyrim has cursed Bethesda. If the next game is too similar to Skyrim, it’ll be a Wii U. If it’s too different, and the reviews are mixed, game reviewers might not take to it so easily after a solid ten years of good Skyrim content.

Maximalist Mouse – What Else Can You Use It For?

Elizabeth Technology May 4, 2023

You can bind keys on your keyboard, but you can also bind those extra keys on a gaming mouse, if you dare.

Gaming mice are designed with games that use hotbars in mind. A hotbar usually refers to the number keys across the top of the keyboard, sometimes including the F# keys as well. Within the game, you can tie specific usable items to those number keys, and simply hit the right key in the heat of battle to use the item. However, keyboards meant for gaming usually have bigger keys than ones attached to laptops or designed for travel, and sometimes it’s difficult to use the hotbar while your character is still moving – if you need a health potion for your character, but you can’t contort your hands to hit the right hotbar key, a gaming mouse with those hotbar bindings instead can save the day!

How to Bind Keys Elsewhere

Gaming mice are designed for games. Many games expecting a gaming mouse will let you go into the settings and manually change the keys you need to press for certain actions, whether that’s to other keys on the keyboard or to the buttons on your gaming mouse. While most mice have only two or three buttons (at least, mice designed for Windows), on a gaming mouse the sky is the limit!

Be careful doing this – you don’t want to override the primary function of the left or right click buttons, just the ones that shouldn’t already have another function attached.

1) Click Start, and then click Control Panel.

2) Double-click Mouse.

3) Click the Buttons tab.

4) Under Button Assignment, click the box for a button to which you want to assign a function, and then click the function that you want to assign to that button.

5) Repeat this step for each button to which you want to assign a function.

6) Click Apply, and then click OK.

7) Close Control Panel.

Your imagination (and Microsoft’s bindable shortcuts) is the limit!
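
Under the hood, games typically keep rebindings in a simple lookup table from button to action. Here’s a hedged Python sketch of that idea – every button and action name below is hypothetical, not any real game’s API:

```python
# Illustrative sketch of how a game might store rebindable inputs:
# a plain lookup table from button name to action.

bindings = {
    "left_click":    "attack",
    "right_click":   "block",
    "side_button_1": "drink_health_potion",  # was hotbar key '3'
    "side_button_2": "cast_spell",           # was hotbar key '7'
}

def handle(button: str) -> str:
    """Look up the action bound to a button, ignoring unbound ones."""
    return bindings.get(button, "unbound")

print(handle("side_button_1"))  # drink_health_potion
print(handle("side_button_9"))  # unbound
```

Rebinding is just rewriting an entry in the table – which is why the one thing to avoid is overwriting the left- and right-click entries the rest of the game depends on.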

Magnetic Storage Types

Elizabeth Technology March 16, 2023

Magnetic Tape

The most well-known version of tape-based magnetic storage is the kind used for media. When tape-based recording was first introduced, it revolutionized the talk show and DJ-ing scene of the time (mostly post WWII) because it enabled shows to be recorded and played later, rather than live. Music recording tech already existed, but it required physical interaction from the DJ, so it wasn’t as hands-off as tapes were.

The second-most well-known version is the kind used for computer memory! Data is stored on the tape in the form of little magnetic ‘dots’ that the computer can read as bits. Before each pocket of data dots is a marker that tells the computer how long that pocket should be, so it knows when one set of data ends and the next begins. The polarity of each dot determines its bit value, and the computer can then read all these dots as binary code.
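
As a toy model of those length markers and polarity dots (a sketch only – real tape formats are far more involved than this):

```python
# Each bit becomes a polarity mark ('N' or 'S'), and an 8-bit length
# marker ahead of each record tells the reader where the record ends.

def write_record(data: bytes) -> list[str]:
    bits = []
    for byte in data:
        bits.extend((byte >> i) & 1 for i in range(7, -1, -1))
    marks = ["N" if b else "S" for b in bits]
    # length marker (in bytes) written ahead of the record
    header = ["N" if (len(data) >> i) & 1 else "S" for i in range(7, -1, -1)]
    return header + marks

def read_record(tape: list[str]) -> bytes:
    length = 0
    for mark in tape[:8]:          # read the length marker first...
        length = (length << 1) | (mark == "N")
    out = bytearray()
    for i in range(length):        # ...then exactly that many bytes
        byte = 0
        for mark in tape[8 + i * 8: 16 + i * 8]:
            byte = (byte << 1) | (mark == "N")
        out.append(byte)
    return bytes(out)

tape = write_record(b"OK")
print(read_record(tape))  # b'OK'
```

Without the marker up front, the reader would have no way to tell where one pocket of dots stops and the next begins.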

This method of data storage was a massive breakthrough, and other mediums continue to use the format even today! Tapes are still in use for big stuff – parts of IBM’s library rely on modern tapes, which can now store terabytes of information at a higher density than disks and flash drives alike. Other memory types relying on magnetic domains include hard disks and drums, to name a couple. All that separates them is material and know-how: the better the magnetizing material on the outside, the smaller the domains can get. The better the insulation between the domains and regular old entropy, the more stable the data is!

Carousel Memory

Carousel memory was an attempt at shrinking the space magnetic tape took up, taken to the extreme. Instead of one very long piece of magnetic tape on a bobbin, a carousel memory system uses several smaller reels of tape arranged in a carousel pattern around the central read mechanism. Getting to the right info is as simple as selecting the right reel! This has some issues, as you might imagine. Moving parts add complications and an increased risk of mechanical failure to any device, and a device carrying thin, delicate magnetic tape is an especially bad place to start.

However, it wasn’t all bad. Carousel memory was actually quite fast for the time because it didn’t have to rewind or fast-forward as much to get to the right area of code. It could skip feet of tape at a time! This advantage declined as tape tech improved, but it still helped companies trying to squeeze the most life from their machines. The bobbins and individual ribbons were all replaceable, so the tape wasn’t worthless if it got torn or damaged. The carousel itself was also replaceable, so the many moving parts weren’t as much of a curse as they’d be on, say, the first hard disks, which had irreplaceable heads.

Core Rope Memory

Core rope memory featured magnetic grommets, or ‘cores’, on metal ‘ropes’, and those ropes were woven into a fabric the computer could read. In ROM (read-only memory) format, if a wire went through a core, that bit was a ‘one’, or a ‘yes’. If it didn’t, it was a ‘zero’, or a ‘no’. In this way, the fabric is physically coded into binary the computer can use. Getting the cores and wires into the right spots involved quite a bit of complicated weaving and un-weaving.
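
The weaving can be modeled as a lookup. Here’s a minimal Python sketch, assuming each word of memory is one wire and each core is one bit position – the ‘woven’ data below is made up purely for illustration:

```python
# Toy model of core-rope ROM: a wire threaded through a core reads as
# a 1 at that bit position; a wire that bypasses the core reads as 0.

CORES = 8  # bit positions per word

# Which cores each word's wire is threaded through (the 'weaving').
rope = {
    0: {7, 3, 1},     # word 0 -> bits 7, 3, and 1 are set
    1: {6, 5, 4, 0},  # word 1 -> bits 6, 5, 4, and 0 are set
}

def read_word(word: int) -> int:
    """Read a word by checking which cores its wire passes through."""
    value = 0
    for core in range(CORES):
        if core in rope[word]:
            value |= 1 << core
    return value

print(format(read_word(0), "08b"))  # 10001010
print(format(read_word(1), "08b"))  # 01110001
```

Because the ‘program’ is literally the pattern of threading, changing a single bit means physically re-weaving the rope – which is why this is read-only memory.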

Core rope memory was chosen over tape memory for the Apollo missions, mainly for weight purposes. Tape was great, but not nearly dense or hardy enough for the mission yet, and neither were the other similar core modules available to NASA. A read-only core-rope memory module could store as many as 192 bits per core, where erasable core memory could only manage one bit per core. Where each core on the final module depended on reading the wires to determine the bit’s state, the erasable model (core memory) read the core’s magnetic state to determine the bit state, not the threads going through it. The final module sent up to get to the moon was a total of 70-ish pounds and read fairly quickly. Tape, core memory, or hard disks available at the time couldn’t have gotten to the same weight or speed.

Core-rope memory has its place. It’s very sturdy, and since it relies on the cores to act as bits, it’s possible to visually identify bugs before the memory’s even used, unlike core memory. Both are sometimes called ‘software crystallized as hardware’ because of the core system. It isn’t seen much today, since it is still incredibly bulky, but at the time of its use it was revolutionary.

Core Memory

Core memory is the older sibling of core rope memory, and it stores less. However, the people who got to work with it call it one of the most reliable forms of memory out there! Core memory works much the same as core rope memory, where the bits are stored in cores.

However, the formats are different. If core rope memory is like a binary-encoded scarf, core memory is more like a rug. Thin threads made of conductive material are woven into a grid pattern, with cores suspended on where the threads cross each other. The computer understands these threads as address lines, so asking for a specific bit to be read is as simple as locating the X and Y address of the core. A third set of lines, the sense lines, runs through each core on the diagonal, and this is the thread that does the actual reading.

When asked to read a bit, the computer sends a current down the address lines, which forces the core at that intersection to a known polarity, and the sense line reports whether the core flipped. If it didn’t flip, it held a zero. If it did, it held a one, and it has just been flipped to zero by the reading process. This method is known as ‘destructive reading’ as a result; the computer compensates by writing the bit back to where it was after the read. Due to its magnetic nature, the core then keeps this info even after power to it is cut!
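
The destructive-read-then-rewrite cycle can be simulated in a few lines of Python (a toy model of the behavior, not how any real controller was implemented):

```python
# Reading a core forces it to 0 and notes whether it flipped (it held
# a 1) or not (it held a 0); a 1 must then be written back afterward.

class CoreMemory:
    def __init__(self, rows: int, cols: int):
        self.cores = [[0] * cols for _ in range(rows)]

    def write(self, x: int, y: int, bit: int) -> None:
        self.cores[x][y] = bit

    def read(self, x: int, y: int) -> int:
        old = self.cores[x][y]
        self.cores[x][y] = 0    # the read itself clears the core...
        if old:
            self.write(x, y, 1)  # ...so a 1 must be written back
        return old

mem = CoreMemory(4, 4)
mem.write(2, 3, 1)
print(mem.read(2, 3))  # 1
print(mem.read(2, 3))  # 1 again: the rewrite preserved it
```

Without the write-back step, every read would silently zero out whatever it touched.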


Even though this improved the bit-to-space-taken ratio, core memory still aged out of the market. With the price of bits decreasing rapidly, core memory got smaller and smaller, but the nature of its assembly meant it was almost always done by hand – all competitors had to do was match the size and win out on labor. Soon, its main market was taken over by semiconductor chips, which are still used today.

Magnetic Bubbles

Magnetic memory has had strange branches grow off the central tree of progress, and magnetic bubble memory is one of those strange shoots. Andrew Bobeck, who also developed other forms of memory at AT&T’s Bell Labs, invented bubble memory. It never took off the way other magnetic memory styles did, although it was revolutionary for its compact size – before the next big leap in technology, people were thinking this was the big leap. It was effectively shock-proof! Unfortunately, better DRAM chips took off shortly after it hit the market and crushed bubble memory with improved efficiency.

Anyway, bubble memory worked by moving the bit-to-be-read to the edge of the chip via magnets. The magnetic field itself is what moves the bits, much in the same way electrons move along a wire when charge is applied, so nothing is actually, physically moving within the chip! It was cool tech, and it did save space; it just didn’t hold up to semi-conductor memory chips. Bubble memory saw a spike in use during a chip shortage, but it was so fiddly that as soon as DRAM chips were available again, it went out of style.
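Functionally, that makes bubble memory behave like a big circular shift register: bits rotate past a fixed detector at the chip's edge, so you have to wait for your bit to come around. A toy model (illustrative only – not how any real bubble-memory controller was programmed):

```python
# Bubble memory as a rotating loop: each "step" of the rotating magnetic
# field shifts every bubble (bit) one position, and we can only ever read
# whatever bit is currently sitting at the edge detector.

from collections import deque

class BubbleLoop:
    def __init__(self, bits):
        self.loop = deque(bits)

    def read_next(self):
        self.loop.rotate(1)     # the field advances every bit one position
        return self.loop[0]     # read the bit now at the edge detector

mem = BubbleLoop([1, 0, 1, 1])
print([mem.read_next() for _ in range(4)])  # [1, 1, 0, 1]
```

Note the bits come out in rotation order, not request order – the serial access pattern is exactly why faster random-access DRAM won out.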

Semi-Conductor DRAM – Honorable Mention

DRAM chips are a lot like core memory, in that the device is reading the state of a physical object to determine what the bit readout is. In semi-conductor chips, that physical object is a tiny capacitor, hooked up to a tiny transistor, on semiconductive metal-oxide material. Instead of determining magnetic state, the device checks whether the capacitor is charged or not. No charge = 0, yes charge = 1. These chips aren’t technically magnetic, but since they’ve killed so many of the other options, here they are!

DRAM stands for Dynamic Random-Access Memory, and it means that the memory can be accessed randomly instead of linearly. As long as the computer knows where the data’s stored, it’s able to pull it without pulling other files first. They’re still being sold today!
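The practical difference between random and linear access is easy to show in code. This is a deliberately simplified sketch: "steps" here just counts how many cells the device has to pass over, which is the cost a tape pays and RAM doesn't:

```python
# Linear (tape-style) access vs random access: with random access, the
# cost of a read doesn't depend on where the data lives.

tape = list(range(1_000_000))

def linear_read(storage, address):
    # A tape has to roll past every cell before the one you want.
    steps = 0
    for i, value in enumerate(storage):
        steps += 1
        if i == address:
            return value, steps

def random_read(storage, address):
    # RAM decodes the address directly — one step, wherever the bit is.
    return storage[address], 1

print(linear_read(tape, 750_000)[1])  # 750001 steps
print(random_read(tape, 750_000)[1])  # 1 step
```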

Magnetic Disk (Hard Disk Drive)

Hard drives work more like tape than core memory. A hard drive is a platter (or a stack of platters) with a read-write head hovering above it. When you want to save data, the head magnetizes tiny areas of the platter to represent that information in binary. When you want to read or recover that data, the head interprets those areas as bits, with the polarity of each magnetized zone standing for a zero or a one.
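That polarity-to-bit mapping is simple enough to sketch. Here '+' and '-' stand in for the two magnetic polarities (the mapping direction is arbitrary and chosen just for this example), and eight zones in a row make one byte:

```python
# A toy model of reading a track: each magnetized zone's polarity maps
# to a bit, and groups of eight bits are assembled into byte values.

zones = "+-++-+--" "+-+--++-"   # two bytes' worth of magnetized zones

bits = ["1" if z == "+" else "0" for z in zones]
bytes_ = [int("".join(bits[i:i + 8]), 2) for i in range(0, len(bits), 8)]
print(bytes_)  # [180, 166]
```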

The zones of magnetization are incredibly tiny, which makes hard drives one of the more demanding memory forms out there, both now and back then.

Early hard drives could suffer from ‘de-magnetization’, where a disk’s magnetic domains were packed too close together and gradually pulled each other out of position, slowly erasing the information on the disk. This meant the disks had to be bigger to hold the data (like everything else at the time) until better materials for data storage came along. Even though they held more capacity at launch, they were passed over for smaller and more stable options like tapes and core memory. The very early drives developed by IBM were huge. Like, washing-machine huge. They didn’t respond to requests for data very quickly, either, which further pushed reliance on tape and core technology.

Over time, hard disks improved dramatically. Instead of magnetic zones being arranged end-to-end, storing them vertically next to each other created even denser data storage, enough to outcompete other forms of media storage entirely. Especially small hard drives also come with a second layer of non-magnetizable material between the first layer and a third layer of reverse-magnetized ‘reinforcement’ which keeps the data aligned right. This enables even more data capacity to be crammed into the disks!

Sometime in the ’80s, hard drives finally became feasible for personal computers, and since then they’ve been the standard. SSDs, which have no moving parts whatsoever, are gaining ground in the market, but due to their different storage techniques they can’t be truly, irrevocably erased the way hard drives can. Hard drives are going to stick around a while as a result, especially in the medical and military industries!

Sources:

https://spectrum.ieee.org/tech-history/space-age/software-as-hardware-apollos-rope-memory

https://www.apolloartifacts.com/2008/01/rope-memory-mod.html

https://electronics.howstuffworks.com/vcr.htm

http://www.righto.com/2019/07/software-woven-into-wire-core-rope-and.html

https://www.computerhistory.org/revolution/memory-storage/8/253

https://nationalmaglab.org/education/magnet-academy/watch-play/interactive/magnetic-core-memory-tutorial

https://www.rohm.com/electronics-basics/memory/what-is-semiconductor-memory

https://cs.stanford.edu/people/nick/how-hard-drive-works/

https://psap.library.illinois.edu/collection-id-guide/audiotape

https://www.engadget.com/2014-04-30-sony-185tb-data-tape.html?guce_referrer=aHR0cHM6Ly9lbi53aWtpcGVkaWEub3JnLw&guce_referrer_sig=AQAAAC5GC2YOKsvhOs9l4Z2Dt1oHX3-YxjPyJC60qfkq6_6h8zyckkBK9V9JJC9vce3rCmcgyehT-RB6aORBfzB9b5oiBoF1Fbic_3653XVM8fsUTHHnTgxKx4piCeEl65Lp54bkbMcebEEddwlq-EDnAcM7zuv49TXYHcgq9lmnrBln

https://en.wikipedia.org/wiki/Carousel_memory (all primary sources regarding carousel memory are in Swedish)

Wii: A Masterpiece of its Time

Elizabeth Technology February 21, 2023

The Wii, a motion-controller game console, used a combination of things to make sure it read your movements. The Wii was a truly special device!

Hardy Equipment

If you could only look at consoles to compare them, the Wii is at an advantage. It stands straight up, like a book on a shelf! It’s also much smaller. Other consoles can be stood up straight, but it’s not advisable – if doing so blocks the vents, the console can overheat and then die. Sony recently advised against flipping the Playstation 5 onto its side because the cooling system could break down and leak, which is not good.

Aside from configuration, the Wii is the weakest of its generation of consoles, but that was actually still a selling point – the device was so cheap because almost all of the interior computing hardware came ‘off the shelf’, which made it weaker, but meant the consumer was paying less for a device like no other on the market.

The Wii could sense motion in a way that other consoles simply had not dared to try – no doubt Xbox or Playstation could have managed to create a machine/controller pack, but it would have cost three times as much as the Wii did.

Differing Technologies

The Kinect was a much more unique approach to the matter of motion detection – much more complex, but also more expensive. And Xbox’s mishandling of the new ‘always on’ era of gaming made it pretty contentious. Playstation had the most success by simply emulating what the Wii had going for it.

And what did the Wii have going? It used a sensor bar in conjunction with the actual device to sense where the controller was pointing. The sensor bar itself didn’t actually do anything but light up!

This meant that in a pinch, you could simulate a missing sensor bar with a couple of candles – the machine just uses the bar as a frame of reference for where the controller is pointing at any given time. Within the controller itself was an accelerometer, which let the machine tell if you were spinning, shaking, swinging, or otherwise moving the remote. Nintendo even later produced an optional control enhancer (the Wii Motion Plus) for games that required even finer tuning. The only downside was that controllers sometimes went through TVs or windows, which eventually stopped happening once users adjusted to the unfamiliar motions of bowling.
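The "frame of reference" idea can be sketched with a little geometry. The remote's IR camera sees the two lit ends of the bar (or two candles!) as bright spots; their midpoint tells you where the remote is pointed, their tilt gives roll, and their separation shrinks with distance. The numbers and function names below are made up for demonstration – this is not the Wii's actual firmware interface:

```python
# Illustrative pointer math for a two-spot sensor bar: given the pixel
# positions of the two bright spots in the remote's camera image, derive
# a normalized cursor position, a roll angle, and a distance proxy.

import math

def pointer_state(spot_a, spot_b, cam_w=1024, cam_h=768):
    (ax, ay), (bx, by) = spot_a, spot_b
    mid_x, mid_y = (ax + bx) / 2, (ay + by) / 2
    # Normalize to 0..1; the camera sees the bar mirrored, so flip X.
    cursor = (1 - mid_x / cam_w, mid_y / cam_h)
    roll = math.degrees(math.atan2(by - ay, bx - ax))   # remote tilt
    spread = math.hypot(bx - ax, by - ay)               # shrinks with distance
    return cursor, roll, spread

cursor, roll, spread = pointer_state((400, 380), (600, 380))
print(cursor)  # ≈ (0.512, 0.495) — pointing near the middle of the screen
```

Two spots at the same height give a roll of zero; tilt the remote and the line between them tilts with it.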

Good Games

One of the biggest deciders of a console’s fate back in the 2000s was what games would be available on launch day. Wonder why so many consoles come with games already downloaded onto them? It’s because that system benefits every party involved, and it may swing the purchasing party on whether or not to get the special edition of a particular console. Outside of built-ins, the console has to attract studios to make games; otherwise you end up with a catalogue full of repeats, sometimes made by the console developers themselves. The Stadia, the Ouya, and a number of other small consoles made great platforms that didn’t have any games on them – none attractive enough to swing the purchaser, anyway.

The Wii, because it was made by Nintendo, was already hand-in-hand with a number of games from a brand known for being family friendly. For families looking for a new console that a child of any age could play, this was a fantastic option. It had zombie games alongside party games and sport simulators. It really was a game-changer.

Bad Sequel

Given all of this, the most disappointing part of the Wii is the Wii U, the next console in the line. Not enough was done to ensure users knew the Wii U was a different console. It sounds ridiculous, but it was a real problem! The Wii U looked just like the Wii to someone who didn’t have either, and the game cases didn’t do a great job of telling users which console they were buying for, so once it came out, there was always the chance that a well-meaning relative would buy the wrong edition of a game.

Similarly, the Wii (just like many Nintendo products) didn’t get a big enough first production run… and then Nintendo broke pattern by drastically overproducing the Wii U, a business decision that haunts the choices made by execs to this day (it was impossible to get a Switch for a good three or so months after launch).

Still – the Wii did set standards for what AR really could be, even without a helmet or anything too fancy. In a way, it’s got tons of sequels. The Playstation started using motion controls after the Wii proved it was not only possible, it was fun! And it opened the door to gameplay mechanics that engineers and programmers could have only dreamed of.

Could AR ever be used in an office setting?

Elizabeth Technology February 16, 2023

Home Offices

A home office is often a place of respite. Quiet. Calm. Personalized organization. Companies looking to save money on renting a space, no matter their size, may go for work-from-home solutions, and even people who work in an office may still choose to make an office space in their home – whether that’s just a desk in the corner of the living room or a whole spare bedroom – because it makes paperwork and keeping important documents organized easier. In essence, the idea of a home office is incredibly customizable and flexible. If you call it your home office, and it’s not superseded by being a dining room table, it’s a home office.

So, when Zuckerberg announced plans to make ‘virtual offices’, many people were put off, but many more were intrigued. A home office is obviously not a perfect substitute for the kind a business rents out to use, for better or worse. Could Meta somehow improve it?

Fun and Games

What Zuckerberg presented combined the worst aspects of VR Chat, the worst aspects of Slack, and the worst aspects of the headset itself. The headset is designed to make you feel like you’re actually seeing a different environment when you move your head, and it does it so well that a percentage of people with VR headsets report headaches – the brain is receiving conflicting information that it can’t sort out, and it doesn’t like that.

The virtual office concept allowed you to look across a virtual desk with a virtual keyboard and see your virtual colleagues, who could perform gestures and small expressions to indicate some sort of feeling. The thing about this system is that it’s annoying – the benefits of working from home include not being in the work office, and being in your home office physically while being in the work office in spirit pretty much cancels that out. Under this system, other users could theoretically tell when you’d stepped away – the feeling of being watched was tolerable in the work office, but not in the home office, where workers expected to feel like they were in their home and not in a panopticon.

Walmart Too…?

So many of these ideas seem to assume that adding a need to traverse a 3D virtual space somehow improves a virtual experience. Walmart thought you might miss actually walking up and down the aisles – by far the worst part of going to a Walmart Supercenter – when it premiered its virtual solution to online shopping. It added physics to items so your avatar could grab them and put them in the cart instead of just clicking buttons, which makes shopping take longer and also increases the risk of the application bugging out on you. It offered to link up to your smart fridge so it could remind you, while you’re grabbing milk in the app, that you already have milk at home – and then prompt you to confirm you really did mean to grab more. The entire idea, top to bottom, seemed to hope you’d spend more money if the app made you work more.

This is not the way VR was meant to re-invent the office, or the remote shopping-experience, or any experience that’s annoying or difficult to do. When customers are shopping in person, the other people are part of the experience (especially in small towns). When they’re shopping over an app, the customer has to be able to find what they want as easily as possible, with as little friction as possible, and it doesn’t get much simpler than searching for an item in a search bar and hitting ‘add to cart’. It’s the worst of both worlds.

It’s almost as if they’re trying to retroactively come up with things for the headset to do with products they already have easy access to, versus actually researching and developing programs specifically for VR. VR shines brightest in games because of the way headsets function, but if Facebook’s CEO doesn’t believe in the future of games as a product, then there’s going to be a lot of running around trying to make other products more game-like so they’ll fit better. Walmart’s VR demonstration felt like dozens of existing games, across all genres, that simulate everything from stocking shelves to driving trucks. It’s bizarre to try and use VR for a virtual world that’s just as boring and simple as the real one – if you’re going to have a virtual Walmart or a virtual office, surely you can do something more entertaining with the surrounding environment than recreate a place the user can already visit at almost any time? That’s completely the wrong feeling, but it’s the one VR sinks into most naturally, because it’s the only real justification for the product being sold.

There’s room for AR, but not like this!

Cartrivision – Another Attempt to Curtail Home-Viewing

Elizabeth Technology February 9, 2023

Cartrivision was the very first tape system to offer home rentals. It was introduced back in 1972, and didn’t see very much mainstream success – you had to buy an entire TV to play the tapes, and some of the tapes were not rewindable.

You may have actually seen them before, in a documentary: Game 5 of the 1973 NBA Finals was recorded, but somehow every recording except one on a Cartrivision tape failed or was lost, so retrieving the footage stored on that tape became the obsession of the documentarian. The documentary even won an award.

What makes Cartrivision so special that just recovering one warranted a documentary?

This Was Super Expensive

As mentioned before, the Cartrivision player came built into a TV, and TVs were already expensive. The result was a device that cost the equivalent of a decent used car (about $1,300 in early-1970s money, or about $8,000 today). This, understandably, meant the market for these devices was already kind of niche. But wait, there’s more! As an added expense, most of the fictional content available for Cartrivision devices was rental-only; only non-fiction could be purchased to own. That meant you couldn’t build a catalogue of fictional stories for home use even after making this huge investment in the machine. Why not just ‘keep’ the rentals, you may ask?

Because the Cartrivision tapes came with a built-in mechanism that prevented home machines from rewinding the rental tapes! Rental tapes, much like Flexplay discs, were denoted by a red cartridge. Unlike Flexplay, you could only play them once – you could pause them, but never go back. The movie studios were worried that Cartrivision could disrupt the movie theater market, and as such the Cartrivision people had to be careful not to make things too convenient, to avoid spooking the people licensing them their IPs. They were the very first, after all.

Perhaps You Went Too Far

The company discontinued Cartrivision manufacturing after a little over a year, thanks to poor sales. Users generally don’t want to pay twice for something, and the red tapes were just not convenient enough to warrant buying a specific (and very expensive) TV for a lot of families. Cartrivision then liquidated its stock, but a large number of tapes were destroyed by humidity in one of the warehouses, making them even harder to find today. Cartrivision TVs were suddenly cheap enough for tinkerers to buy and modify, and many did – there are few (if any) original, unmodified Cartrivision TVs left on the market that still work.

Additionally, Cartrivision cartridges come in a strange size, even though the tape inside is fairly standard. They were custom-made for one specific machine, so they could be as weird as they wanted, but as a result they’re incredibly finicky to work with if you don’t have one of Cartrivision’s proprietary machines. If you didn’t get a Cartrivision during the liquidation sale, you’d have no reason to buy and preserve the proprietary tapes.

Speaking of the tapes, the company eventually started selling the red rental tapes – but not the machines used to rewind them. There were fewer of those to start with, anyway. Home Cartrivision fans had to take apart the cartridge and physically rewind the tape themselves to watch their content. Magnetic tape is fragile, so this was never a permanent fix, and it came with the disadvantage of damaging the art on the box to reach the hidden screws holding the case together. Even untouched tapes degrade over time in ideal conditions, getting sticky and brittle inside the case, which makes them unplayable. There are, effectively, no working Cartrivision tapes left – not without a lot of finagling. The people who rescued the NBA game tried everything from freezing the tape to baking it and scrubbing it with special cleaners, and even after they got it to play, they had to do quite a bit of digital touch-up on a computer – anything less profitable or historic recorded to Cartrivision tapes alone may very well be lost to time.

Just like Flexplay, the red plastic left behind by Cartrivision is a warning: if it’s not better than what’s already out there, customers aren’t going to go for it.