

In Terms of Memory

Elizabeth History, Innovation April 23, 2021

The first Bit of Data


A bit is a single binary digit – the word is a contraction of “binary digit”. A bit is the simplest possible piece of data a machine can read: a 1 or a 0. A yes or a no. True or false. Bits have been around for longer than computers, originating with punched cards in the 1700s that analog machines could “read”.




If you’ve recently upgraded to Windows 10, you may recall having to check whether your computer is 32 bit or 64 bit. The number describes how large a memory address the processor’s architecture can handle – is it equipped to read up to 32 consecutive bits of data as an address, or 64? A 32 bit computer has fewer possible memory addresses from its CPU registers – 2^32 of them, or not much more than 4 GB’s worth – while a 64 bit computer can address 2^64 locations, a theoretical 16 exabytes (though operating systems support far less in practice). This doesn’t mean a 32 bit computer can only store 4 GB of data; it just limits how much memory it can address at once. The files themselves can be nearly any size as long as there’s storage available for them.
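The address arithmetic above is easy to sanity-check – a quick sketch in Python (the helper name is mine, not a standard library call):

```python
# Each memory address names one byte, so n address bits
# give 2**n addressable bytes.
def addressable_bytes(address_bits: int) -> int:
    return 2 ** address_bits

GIB = 1024 ** 3  # bytes in a gibibyte

# 32-bit: the familiar ~4 GB ceiling
print(addressable_bytes(32) // GIB)   # 4

# 64-bit: vastly more than any machine actually installs
print(addressable_bytes(64) // GIB)   # 17179869184 GiB, i.e. 16 exbibytes
```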


Then, a Byte


A byte is now eight bits, per international standard – but it didn’t always have to be. Originally, a byte was as long as it needed to be to represent a character on screen, usually somewhere between two and ten bits, with exceptions down to one and up to forty-eight bits for certain characters. Eight-bit bytes became the standard because they were convenient for the new generation of microprocessors in the 70s: eight bits in binary give 256 possible arrangements of ones and zeroes. Sixteen bits would give far more possibilities than needed and could slow the computer down, while four bits would only give 16 combinations – representing a useful character set would mean combining groups of bits anyway.




Eight sounds like the perfect combination of length and possible complexity, at least with the benefit of hindsight. Before the eight-bit byte came along, the government had struggled with incompatible systems across branches due to differing byte sizes. ASCII, at seven bits per character, was the compromise – and when commercial microprocessors arrived in the 1970s, extended ASCII became a second compromise, so that commercial and government systems could communicate.

However, not all extended ASCII versions contained the same additions, so Unicode was later created to bridge the gaps between them. Unicode, a character encoding standard that includes the ASCII characters as a subset, is one of the most common encodings out there (usually in its UTF-8 form, which is built on eight-bit units). You’ll run into encoding mismatches, too – if you’ve ever opened an article and seen little boxes where characters should be, the text was written with a bigger character library than it was read with. The reader doesn’t know what goes there, so it puts a placeholder!




1,000 bytes of storage forms a kilobyte, or KB. This is the smallest unit of measure that the average computer user is likely to see written as a unit on their device – not much can be done with less than 1,000 bytes. The smallest document I can currently find on my device is an Excel file with two sheets and no equations put into it. That takes up 9 KB. A downloadable “pen” for an art program on my device takes up 2 KB.

PCs before Windows had about 640 KB of memory to work with, not including memory dedicated to essential operations.

The original Donkey Kong machines had approximately 20 kilobytes of content for the entire game.




A megabyte is 1 million bytes, or 1,000 kilobytes. Computers had made some progress past relays by this point, moving to hard disks for internal memory. IBM’s first computer to contain a megabyte (or two) of storage, the System 355, was huge. It was also one of the first models to use disk drives, which read faster than tapes. In 1970, users who didn’t want a fridge could invest in the now merely desk-sized 3 million bytes of IBM’s Model 165 computers, an improvement over GE’s 2.3 million bytes the year before – and the year before that, as covered in Byte Magazine, Univac had unveiled a new machine with separate cores tied together to give users between 14 and 58 megabytes of capacity, at the cost of floor space. IBM’s System 360 could reach up to 233 megabytes with auxiliary storage, but its size was… prohibitive, reminiscent of that first System 355.

Tapes and drums were competitive with the disk format for a while, but ultimately disk and solid state improved faster and won out (right now it’s looking more and more like SSDs, those solid state drives, will outcompete disks in the future too). During the 80s, the technology improved so much that hard disks became standard (IBM released a home computer with 10 MB of storage in 1983) while floppy disks acted as media transport.

DOOM came out in 1993 and took up 2.39 MB for its downloadable file, with smaller, DLC-like packs of fan-created mods following along the way.




A gigabyte is 1 billion bytes, or 1,000 megabytes. In 1980, IBM released another fridge – but this one stored up to a gigabyte of information! According to the Merriam-Webster dictionary, you can pronounce gigabyte as “jig-ga-bite”, which just… feels wrong. Back in 1974, IBM had released a 20-foot-long beast of a storage system that stored up to 236 GB of data on magnetic tape.

In 2000, the first USB sticks (memory sticks, jump drives, etc.) were released to the public with 8 megabyte capacities, and they were so convenient that floppy drives began disappearing from computer designs in favor of USB ports. USB sticks then improved rapidly, soon reaching capacities of one, two, and four gigabytes while floppies struggled to keep up.

Besides being smaller and harder to break, those USB sticks also stored more. Where the first USB sticks held 8 MB, the standard floppy disk of the time could only hold 1.44 MB. Given how small DOOM is, it would still take two floppy disks to hold all of DOOM, but a single USB stick could carry it with room to spare. By 2009, USB sticks with capacities of 256 GB were available on the market. That’s nearly 178,000 floppy disks.
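The floppy-versus-USB comparison is just division – a quick check in Python, using the sizes quoted above:

```python
import math

DOOM_MB = 2.39        # DOOM's download size
FLOPPY_MB = 1.44      # standard 3.5" floppy disk
USB_2000_MB = 8       # first consumer USB sticks

# How many of each would DOOM need?
print(math.ceil(DOOM_MB / FLOPPY_MB))    # 2 floppies
print(math.ceil(DOOM_MB / USB_2000_MB))  # 1 USB stick

# A 2009-era 256 GB stick, measured in floppy disks:
usb_2009_bytes = 256 * 10**9
floppy_bytes = 1.44 * 10**6
print(round(usb_2009_bytes / floppy_bytes))  # 177778
```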




A terabyte is 1 trillion bytes, or 1,000 gigabytes. The first commercial drive with a capacity of one terabyte was sold in 2007 by Hitachi, a Japanese construction and electronics company. The movie Interstellar, released in 2014, featured a depiction of a black hole known as Gargantua – and became famous again when it closely resembled the picture of an actual black hole captured by the Event Horizon Telescope. A ring of light surrounds the black hole in two directions: one from the friction-heated material Gargantua has accumulated, one from the lensing of light around it. The gravity is so intense that light itself is pulled into orbit around Gargantua’s event horizon and kept there. It took 800 terabytes to fully render the movie and make Gargantua reasonably accurate in terms of light-lensing.


A petabyte is 1 quadrillion bytes, or 1,000 terabytes. This is typically cluster storage, and while it’s available for purchase, it’s very expensive for the average consumer. For comparison, while rendering Interstellar took 800 terabytes, storing it at DVD quality takes about 1/200th of a terabyte, or 5 GB. You could fit roughly 200,000 DVD-quality copies of Interstellar in a petabyte. And it took a little less than 5 petabytes of telescope data to produce the picture of the real black hole, M87.
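The unit conversions in this post stack up quickly, so here is the petabyte arithmetic spelled out in Python (decimal units, as used throughout the post):

```python
TB = 10 ** 12   # terabyte, decimal convention
PB = 10 ** 15   # petabyte = 1,000 TB

interstellar_render = 800 * TB   # data used to render the film
interstellar_dvd = TB // 200     # ~1/200th of a TB at DVD quality = 5 GB

# DVD-quality copies that fit in one petabyte:
print(PB // interstellar_dvd)    # 200000

# The ~5 PB of M87 telescope data, measured in full renders:
print(5 * PB // interstellar_render)  # 6 renders' worth, and change
```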



Attempts at Media Storage That Didn’t Get Big

Elizabeth History April 21, 2021

CEDs: Like Vinyl for Video, but more expensive


The CED (or capacitance electronic disc) was a disc that stored pictures in the grooves of its surface, like a video/audio analogue of the vinyl record. It was expensive to produce, however, and just like vinyl it could degrade after being played too many times – the reader physically touched the disc to read it. CED tech was also extremely sensitive to dust, even more so than vinyl records; it took a specialized caddy just to keep contaminants out! Consumers may have liked the idea, but the upfront cost was too much for the average Joe, especially with more affordable media types (like VHS tapes) already on the horizon.

CEDs were still being produced even as their manufacturer announced they’d be cancelled, which understandably dented profits, and nobody wanted to pick the format back up. CEDs were a fine idea, but much like the eight-track, they were somewhat expensive to make and not very widely demanded.


Optical Cards: Like a CD-ROM, But Worse


The optical card briefly appeared as an alternative to CD-ROMs (ROM here stands for read-only memory). It’s very cool in theory – it can only be written on once, it’s flexible, and it’s sturdy! With a capacity of several megabytes, it could make a perfect ID card, storing info for immediate access. Optical cards seem like a perfect solution for a number of things – so why don’t you see many of them today?

It’s difficult to find a solid answer online, but my theory is that it did things other products already did.

By the time it came out, it was easier to scan a code linked to files on a computer than to manufacture a card with that unchanging data inside it. Take the barcode: there’s a reason barcodes win out over things like RFIDs for inexpensive(!) goods – adding in all that tech just isn’t worth the price when a computer lookup can do the trick by itself. Do you invest in 500 small cards with electronics inside and a machine to read them, or 500 plastic business cards with a barcode and a machine to read them? One is going to be much cheaper.

Besides, magnetic stripe cards were already on the market, and machines could already read them. It was a short jump to include more info on the card that everybody already had a machine for, so magnetic cards dominated over opticals.

The other part of it (which information online does verify) was that storage was getting cheap – so cheap that optical cards fell out of use for other forms of storage, too. Take cameras, where Canon released its first optical card: SD cards could hold more than even CDs, so the optical card had no chance in the race. That’s not to say optical cards aren’t used at all, but they sit in an intersection that other products can fill with minimal additional effort. Legacy machines and certain companies use them, but they’re not very popular.

Good theory, niche too small.


Bubble Memory: Like A Magnetic DRAM Chip, but worse


Bubble memory was supposed to be a more compact, sturdier replacement for other memory types. Unfortunately, it sat at the worst intersection of expensive and power-hungry – even when it outperformed DRAM chips, other semiconductor memory, or hard disks in one field or another, everything else wrong with it dragged it down to second-rate competitor status. The main producers of bubble memory drives never got manufacturing down to a science either, so it was prone to breakage and bugs even in the window when it should have been competitive in each niche, before the others came along.

It got some use because it popped up in the middle of a DRAM chip shortage, then promptly died back out once DRAM units were back on the shelf alongside other replacements. It was just too fiddly to keep!


Eight-Tracks: Like a cassette, but more niche


Going purely off its legacy, the eight-track is certainly worth mentioning. It’s on this list because other items from the same era survived where the eight-track died. Cassette players in cars are still so widely present that adapters sell in drugstores, while eight-track adapters are a specialty item sold online. The last generation of cars to hold eight-track players is largely off the road, while cars with cassette players were still being made into the early aughts. Vinyl records are still sold in physical locations; eight-track tapes are not. Compact cassettes are still sometimes featured in teen movies… eight-tracks are not. Eight-tracks still hold a lot of nostalgia, but getting one playing in this day and age is a massive pain.

It was a great idea, but it was outlived by other media.


ROM Cartridges for Not-Games: Like a floppy disc, but earlier


Once, cartridges were used across the board. Of course they were – they were convenient, and the earliest home computers already had a slot for them! Most people recognize them as video game storage, but they were capable of more than just that. Applications, extra RAM, extra storage – the cartridge, even the ROM-only cartridge, was almost as capable as a USB stick is today, except in capacity.

Nothing really had that much capacity at the time, though. The computers of the era usually held less memory than a cheap modern USB stick. Other forms of media outstripped the cartridge for basic storage, but it reigned supreme for a few more years in video game media, before floppy disks started taking over there, too.

Its distant descendant, the CD-ROM, held more data more securely, so cartridges started to become outdated once optical media became available for purchase. Even video game consoles switched from cartridges to discs.


Sinclair ZX Microdrive: Like mini-USBs, but too early


A teeny-tiny drive with about 200 inches of magnetic tape inside sounded like a piece of spy equipment when it first launched. The Microdrive was especially small for its time and capacity, but it tended to wear out quickly, so it still struggled to compete with bigger drives despite its many advantages. Other, similar drives released by competitors were in much the same position: the thinner the plastic, the faster it wore out, and smaller devices needed either less tape or thinner tape – most chose thinner. It was the best consumer electronics could do at the time.


Magnetic Drums: Like tape, but bigger


Magnetic tape came out before magnetic drum tech did, although both saw use on release. The primary difference is in the reading: magnetic tape moves in front of a single read head, while a drum spins in front of several fixed heads. In computers, drum memory was eventually replaced by core memory. In a way, drum memory was the first time the hard drive really took shape: hard drives follow many of the same principles, in that the heads stay still while the magnetic surface rotates beneath them, and the machine picks the correct head to see the data it’s looking for. With a single drum instead of a stack of discs, its capacity understandably wasn’t as great as modern drives’, even if the idea was there.

Drum memory certainly didn’t fail – the military used it for years, and IBM kept it in certain machines up until the 90s. However, its limited storage capacity made it a less popular choice than the also-widely-used tape, and later core memory, for regular consumers who wanted something smaller.