Posts Tagged

hardware

Water Proofing and Phones

Elizabeth Technology November 30, 2023

Getting your phone wet used to be a death sentence. It still is, for a lot of phones, but it used to be too.

Non-Smartphone Devices

Retail computers usually stay far away from water unless it’s for cooling purposes. There’s no reason to tack on another year of development, parts, and labor for a computer that will never get close to a pool or ocean. Of course there are computers meant for the water, but they’re not normally made to also be used at a desk on the regular. Usually. As such, certain elements inside retail computers may be waterproofed, but not the whole thing.

Flip phones and smart phones are much the same way. There was no reason to waterproof until a market started appearing for clumsy folks, and people who get pushed into pools. Even then, Samsung, Motorola and Sony didn’t make that a feature for every phone they sold.

Once the speakers, flip phones, and other computer types got on board, it was only a matter of time until smartphones joined the party.

Waterproof, or Water Resistant?

Waterproofing a phone is difficult, and not because of the screen: all the buttons, speakers, and ports have to be sealed as well, and it has to hold up over time. Once a user comes to expect dunking their phone in water, they’ll continue to do so. Taking pictures in a pool, listening to music in the shower, scrolling online in the bath, etc. All kinds of things they’re very much not used to doing right now can turn into habits.

User wear and tear had to be taken into consideration, and at the same time whatever they surrounded the buttons and openings with had to be lightweight. Around the screen, lightweight sealants already existed, but around the buttons? A whole new story. If it were easy, it would have been done with the first smartphones! Right now, waterproofing comes down to gluing the components inside of the device together really well and using rubber gaskets around buttons.

Interfaces

This construction method isn’t perfect, but it retains as much of the interface users are already familiar with as possible. Buttons are still on the sides and front, and there’s no extra resistance from the gaskets. It’s a compromise. Water will eventually get past the seal, and submerging the device too deep puts the rubber under so much pressure that it fails.

Ratings are how manufacturers tell users what their phone can hold up to. By most standards, the best phones on the market are water resistant, not waterproof. The IP rating, or Ingress Protection rating, tells the user how resistant the phone is to foreign bodies and water. The rating starts with IP and is followed by two separate numbers: the first number tells the user how resistant the device is to solids, while the second tells the user about the moisture resistance.

Most phones aim for IP67, meaning the device is rated a 6 for solids (the top of that 0-6 scale) and a 7 for liquids (on a scale that runs up to 8). In practice, that means 30 minutes under about three feet (one meter) of water before the phone begins to leak. Perfect for folks who don’t want to worry about their phone during a sudden rainstorm, or the clumsy among us who drop their phone into the sink, in a pitcher, in soup, etc. and can retrieve it right away. It’s not for beach or pool use. It just means the phone’s not dead if you happen to drop it while taking an above-water picture.
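To make those two digits concrete, here’s a minimal sketch of how you might decode a rating string like ‘IP67’. The descriptions are abbreviated paraphrases of the usual IEC 60529 tables, not the official wording:

```python
# Minimal sketch: decoding an IP rating string such as "IP67".
# Descriptions paraphrase the usual IEC 60529 tables; see the full
# standard for official wording. Requires Python 3.9+ for removeprefix().

SOLIDS = {
    "0": "no protection",
    "5": "dust protected",
    "6": "dust tight (top of the solids scale)",
}

LIQUIDS = {
    "0": "no protection",
    "4": "splashing water",
    "7": "immersion, up to 1 m for 30 minutes",
    "8": "continuous immersion beyond 1 m",
}

def decode_ip(rating: str) -> str:
    solids_digit, liquids_digit = rating.removeprefix("IP")
    return (f"solids: {SOLIDS.get(solids_digit, 'see the standard')}, "
            f"liquids: {LIQUIDS.get(liquids_digit, 'see the standard')}")

print(decode_ip("IP67"))
# solids: dust tight (top of the solids scale), liquids: immersion, up to 1 m for 30 minutes
```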

Water Resistant (Up to Thirty Minutes)

Samsung’s first cutting-edge smartphone with water resistance was the Galaxy Note 7. It was a breakthrough moment for waterproofing technology! But it wasn’t the first: Motorola had already made a phone that met IP67 standards, the Motorola Defy. It was a smaller device, built with a heavy-duty case and rubber plugs for the cord holes. It was definitely a ‘lifestyle phone’, the kind you’d get if you camp or fish on the regular. As a result, it was… a little ugly.

Samsung’s Galaxy Note 7 took the Defy’s sturdy design and lightened it up for the average consumer who’s only going to be using it just outside the pool. It had the same rating, too – for some reason thirty minutes seems to be the tap-out for water-resistant and waterproof phones, even today. Invest in a case or a waterproof camera if you want to take pictures while diving!

Stopgap: Cases

(Note: this is not a buyer’s guide. This shouldn’t be treated as an endorsement).

Most touchscreens are capacitive, meaning they rely on how your fingers conduct charge. Most phones won’t work if your hands are wet, because the water is also conducting the charge, and it confuses the screen. You might have noticed your phone trying to zoom in or spasming when drops of water are on the screen.

For gloves, the fabric usually keeps the charge from conducting at all because most fabrics are insulators, meaning they don’t carry charge very well. Gloves have to be specially made with conductive finger pads if the user wants to use their phone while gloved. The same goes for cases: if a case is going to be usable, it has to be made with conductivity in mind.

But wait. Water is conductive. How do you get a case that reacts to touch and not to water? The answer is also difficult! Most cases that keep water out and allow you to touch the screen are pouches, and even then, not all of them work underwater. Some are simply designed so that you can keep your phone in your pocket when wading, or kayaking – you’re not expected to actually be using your phone.

The difference is both in the material of the case and the phone: the phone has to be programmed to know what water ‘looks’ like on its screen, and sort it separately from your inputs, and the material has to be conductive enough for touches to get through. Thinner material gets better readings to the phone, but thicker material has fewer fake inputs from the water. It’s a challenge! The best way to get what you’re looking for out of a waterproof casing is to look at reviews, and the IP rating. If it’s 67 or above, you’ll know it can handle being submerged for a little bit.

Anything above is a bonus, and means it can go deeper in the water for longer. Look at what sports the manufacturer is advertising it for: are you meant to be using it on boats, or is it for snorkeling? And if it has anything about touchscreen use, be sure to test it before you really test it, using your sink or bath. If the screen’s not responding through the bag, you’ll know ahead of time.
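As a purely illustrative sketch (none of the real touch-controller firmware is public here, and every vendor does this differently), the ‘sorting water from inputs’ problem looks something like thresholding on what a contact looks like: water blobs tend to be wider and more weakly coupled than fingertips. Every number below is invented for illustration:

```python
# Purely illustrative: a naive "finger or water drop?" filter. Real touch
# controllers use far richer signals; every threshold here is invented.

from dataclasses import dataclass

@dataclass
class Contact:
    area_mm2: float          # size of the contact patch the digitizer reports
    signal_strength: float   # normalized capacitive coupling, 0.0 to 1.0

def is_probably_finger(c: Contact) -> bool:
    # Fingertips: compact patch, strongly coupled signal.
    # Water drops: often wide, diffuse, weakly coupled blobs.
    return c.area_mm2 < 120.0 and c.signal_strength > 0.4

contacts = [Contact(60.0, 0.8), Contact(300.0, 0.2)]
accepted = [c for c in contacts if is_probably_finger(c)]
print(len(accepted))  # 1 -- the water-like blob gets ignored
```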

Sources:

https://www.androidauthority.com/first-water-resistant-android-phone-1153031/

https://www.androidauthority.com/best-waterproof-pouch-cases-smartphones-689477/

https://www.hoista.net/post/42353282830/the-evolution-of-the-first-waterproof-phones

https://uk.rs-online.com/web/generalDisplay.html?id=ideas-and-advice/ip-ratings

https://www.travelandleisure.com/style/travel-accessories/best-waterproof-phone-case

Public Internet Access Terminals

Elizabeth Technology November 21, 2023

In the early days of the internet, the average computer was still bulky and often pretty pricey. Most electronics were! Some people still have the brick phones or old CRT monitor computers they used before the size of transistors and chips shrank, and finding those old models in movies or on eBay isn’t hard.

Bringing the internet out of designated places (colleges, libraries, the home, etc.) into other spots it might be useful was difficult. One of the wackier ideas of the time, the Public Internet Access Terminal, foresaw a world, circa 2003, where computers would be like payphones.

American Terminal Public Internet Access Portal

Sources on this company are incredibly limited. A single YouTube video (https://www.youtube.com/watch?v=DASfwrCjICg&t=5s&ab_channel=RicDatzman) pops up when the exact name of the kiosk is searched. One very old commercial proves the existence of the internet’s first tendrils encroaching into public space.

The commercial itself is a perfect snapshot of how people viewed the web in its early days. You might need it on the go. It might be like a payphone someday. The computer inside the terminal will make you money and won’t need to be replaced by the next more powerful model anytime soon because it’s good enough as it is (as a reminder, Moore’s Law stated that the number of transistors on a circuit would double about every two years, and up until recently, it held). People in the comments remember using them to check online game accounts and send last-minute emails before hopping onto planes or buses.

And yet, no other source aside from this commercial seems to exist! Despite their confidence in their product, they weren’t confident enough to build a website for prospective franchisees.

I Just Need To Send An Email

Internet usage at the time was limited – Amazon was still selling primarily books, computers were still pretty large, and while things like email were much more convenient than snail mail or phone calls for their traceable, info-dense nature, not everyone had an email address. For the lighter users of the internet, stopping by the library to check their digital mailboxes was a cheap and easy way of keeping up with the times without committing to a full-blown computer. After all, the dot-com crash had ruined the internet’s most aggressive investors. If the internet somehow didn’t pan out, the light users wouldn’t be out too much money.

The problem was that while that crash was disastrous, the internet still had plenty of use! And people who didn’t want to invest in the equipment were being pulled further and further into it either by work or for recreation.

In the midst of this, a particularly enterprising company thought to put together internet terminals that could be put in places like airports and controlled by outside franchisees, like vending machines often are. To the people trying to sell these products, the age of computers was slowing down post-crash. They may have anticipated that these computers would be fully depreciated by the time the owner paid back the investment and maintenance costs (just like any free money scheme, if this was actually as low-risk as they advertised, they would have kept it to themselves). What they likely didn’t picture was a world where the very thought of one of these things existing freely, unmonitored, in public, paid for by the minute and not the GB, would seem outdated. Like a payphone.

Using Biometrics: Is It Really Better?

Elizabeth Technology November 9, 2023

Some phones allow users to use their biometric data as 2FA, or as a password by itself – how does it measure up to PINs?

Cons

1) Your Face Looks Like Your Family’s

Every single service using face unlock handles this a different way – they all use different programs, and those different programs handle similarities differently. Apple, which uses state-of-the-art hardware and code to see faces, still sometimes mixes it up. For Apple, the program that reads your features and unscrambles this information is constantly updating itself and adding to its library of what you look like. If it didn’t, a sunburn or a new eyeliner shape would trip it up and lock you out for looking different.

The problem is that it’s allegedly doing that by looking at the person holding the device when it’s unlocked (using a passcode or otherwise), which is usually you but sometimes isn’t. People who look similar enough and who may be holding your phone enough (like family) can sometimes trick FaceID into opening for them by accident. While this is getting better, there’s no way to rule out a twin unlocking your phone without also sometimes locking you out too.

2) Law Enforcement

Most police forces have the right to collect some of your biometric data if you are ever arrested – your face and fingerprints go into their records. The legality of using that to unlock your mobile device pre-subpoena varies from state to state: some states give you total freedom to decline an un-subpoenaed unlock request no matter how your device is secured, others won’t let you decline at all, and in some it depends on the type of lock. Certain biometric data is not legally protected in the same way passcodes or PINs are. Look it up for your state!

3) Nefarious Children

A much more common unwanted-unlock scenario is a child getting hold of your phone during a nap and holding it up to your face to buy Robux. While face-unlock adapted, and many smartphones don’t let you attempt an unlock with closed eyes anymore, fingerprints stay the same even if you’re asleep. Still pictures of the target tend to trick older Face ID as well, although that is improving with each new generation of phones.

Pros

1) When Done Right, It’s Really Tough to Beat

Barring the similarity issues above, when biometric data is used correctly, it’s pretty darn good at keeping unwanted people out. Collecting fingerprints to unlock a device or account is often more difficult than it’s worth, and deters bad actors from trying. Strangers will generally not have photos of the phone’s owner on hand good enough to unlock it – more recent phones use infrared too, so pictures don’t even work on new phones anymore. Cracking biometric locks takes a lot of coincidences or a lot of effort, not just a computer stuffing passwords.

You also can’t write down your face and lose it somewhere like you might for a password, and (at least for phones) you can’t have it breached in the same way as a written password.

2) When Done Right, It’s Faster

You’d need to wait for a sent 2FA code, but you don’t need to wait for a fingerprint or a face unlock.

3) As Long As Policies Stay the Same, The Data Doesn’t Leave The Phone

As of the writing of this article, Pixel and Apple devices state that the mathematical representation of your face which the phone uses to unlock will not leave the device it’s being used on. Apple even goes a step further and separates the computer that handles facial recognition from the computer that does everything else inside the phone!

The caveat of course is if those policies stay the same – companies make promises and then go back on them all the time. American privacy laws are fairly lax compared to other countries, so any privacy policy not written into law needs an eye kept on it for changes.

Why Won’t My Bluetooth Devices Link Up?

Elizabeth Technology October 19, 2023

1) How far away is your other item?

Headphones, speakers, and keyboards generally aren’t going to have a range over 30 feet – some can’t even hit that. Generally speaking, devices don’t go over 30 feet because they don’t need to, and making them powerful enough to do so makes the battery drain faster. If your device is far away, it may not connect, or keep disconnecting and re-connecting.

2) Are you trying to connect to the right thing?

Many devices have bizarre serial number names that you can only find in your user’s manual, especially if the device is not a name brand like Microsoft or Apple.  When you start trying to connect your phone to your new Bluetooth headphones or speaker, make sure you know what its real name is!
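If you’d rather not dig through the manual, you can also just list what your computer sees. As one hedged example, the third-party Python library bleak can scan for nearby Bluetooth Low Energy devices and print whatever names they advertise (note that older, classic-Bluetooth-only devices won’t appear in a BLE scan):

```python
# Sketch: list nearby Bluetooth Low Energy devices and their advertised names
# using the third-party "bleak" library (pip install bleak). Classic-only
# Bluetooth devices won't show up in a BLE scan.

import asyncio
from bleak import BleakScanner

async def main():
    devices = await BleakScanner.discover(timeout=5.0)
    for device in devices:
        # The name is whatever the device advertises -- often the
        # serial-number-style string you'd otherwise hunt down in the manual.
        print(device.address, device.name or "(no name advertised)")

asyncio.run(main())
```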

3) Are you following the pairing instructions?

Many devices have light indicators on them somewhere to signal whether or not they are connected. Blinking lights usually mean a device is looking for another Bluetooth device to pair with, but not always. Sometimes things blink just to indicate that they aren’t connected. Follow your user’s manual!

Along with that, are both devices seeking a connection? Bluetooth can be on and open, but if the device isn’t in search mode, it might not connect where it’s supposed to until one or both devices are told via their Bluetooth menu that they’re supposed to work with each other.

4) Is the battery charged?

Bluetooth takes a fair amount of power to broadcast, and the signal may get weaker before the device is fully dead. If you notice your speaker suddenly wants you to be closer to make it work, it might be time to charge it, or change the batteries.

5) Has it been on for a long time, with no breaks?

You can also try turning both devices off and back on again. Any device with RAM can have things clog it up, and turning a device off usually fixes it and gives it a fresh start.

6) How old is your other item?

Bluetooth is backwards compatible, so it’s rare to find two Bluetooth compatible devices that won’t work together. It’s rare, but not impossible! Some older and simpler devices have a hard time overcoming the barriers between each successive Bluetooth upgrade. Which device has the newer version of Bluetooth seems to matter as well – my phone will connect to an older Bluetooth car radio transmitter, but my MP3 player will not. A new car radio transmitter will connect to the MP3 player and the phone just fine.

Footnote – Security

It’s not a good idea to leave Bluetooth on when it’s not in use! When you’re done using it, turn it off. Bluetooth can be tricked into connecting to strange devices a number of ways, and be used to take data off of your device. The good news is that most devices only ever expect to connect to one other thing at a time, so as long as your phone is tied to your speaker, another device won’t be able to connect via Bluetooth.

Computer Power States: How is Rebooting Different From Sleep Mode?

Elizabeth Technology October 3, 2023

Closing Your Laptop/Desktop: Sleep Mode

You may want to put your device into sleep mode when you need to save your battery life, when you need to move it, or when you need to step away from your desk for a period of time. Sleep mode doesn’t turn your computer or laptop off, it just conserves power. Open programs are paused, and the screen is turned off.   

Some devices will turn off if left in sleep mode for an extended period of time, so you shouldn’t leave any work on your computer unsaved while the screen is closed or off. Your computer may go into sleep mode if it is left idle.

Rebooting

Rebooting shuts down your computer, and then immediately restarts it. This will often solve issues with programs that have gotten stuck, are crashing, or are otherwise struggling – when the computer is turned off, they are forced to start fresh. This goes for any other programs open on the computer as well, so be sure to save your work before you restart!

Shutting Down, and Then Restarting

This is the same as a reboot, but you decide how long to leave it off. Sometimes, programs will prevent the computer from shutting off all the way during a reboot (or they will keep doing the last thing they were doing before a reboot, if the computer’s RAM is not completely wiped), so by waiting 30 seconds to make sure it really is turned off and all the components have powered down, you force those programs to restart too.

Why are Game Downloads So Ridiculous?

Elizabeth Technology September 28, 2023

The first Doom game is famous for how little space it takes up. Because of its absolutely tiny size, almost any device can play it. The latest Doom was approximately 50 GB, a far cry from the games of the past.

The Beginnings

Doom is famously small. When games came on floppy disks, fewer disks meant less overhead expense and a more seamless player experience. It also reduced the risk that something could go wrong. Programming was simple, elegant – textures and sounds were limited, and yet Doom used its few pixels to great effect.

Sonic, another small game (meant for a console this time), famously took up a level’s worth of space for the SEGA opening soundbite. The scale of the levels themselves was so incredibly small that a second-long clip of someone saying ‘Sega!’ consumed as much space as a level. That is insane. Audio, even the crispest, clearest MP3s around, can no longer say that.

While ambitious games like Doom managed a pseudo-3-D look, many more were much simpler – Sonic, Metroid, and other 90s games were all 2-D, and yet they still came out difficult and engaging. A number of other trash games came out alongside them, but the shining stars of 90s nostalgia still hold up to this day. The concepts themselves were fairly new to the world: personal computers and consoles alike were still fairly new consumer products, still heavily associated with businesses in the case of PCs and children in the case of consoles. Customers, therefore, were making room for something new, not accommodating it unconditionally.

The Graphics, and The Next Step: 3D

The next generation of consoles and computers were significantly more powerful, and as such the games could afford to take up a little more space. A little. Still, that little meant that things looked very different. The distinct polygonal art seen in Final Fantasy, Banjo Kazooie, and other favorites was the best, least-intensive art they could make at the time. You’ll notice shadows are limited and that textures often repeat.

The Nintendo 64 had about 4 MB of RAM – games to fit it could be a maximum of 64 MB, although many were much smaller. Articles say that all of the games for the 64 could fit on the Switch, which is still underpowered compared to other consoles! And yet, so many iconic games come from this era. Ocarina of Time, Mario 64, Banjo Kazooie and Banjo Tooie – the world was an oyster. Other games on other consoles came out, but Nintendo – having watched Atari shoot itself in the foot – didn’t make a habit of producing bad games, or letting third parties make bad games for them.

Game length, too, was incredible for the limited storage space: polls say that Super Mario 64 and Banjo Kazooie both take 11 to 12 hours to beat, and longer to 100%. If they didn’t provide hours of entertainment, that money might go elsewhere to keep the kid distracted. A small game had a big hill to climb, and it still didn’t have a lot of space to do it in, both on the shelf and on the actual console. Games would much rather be longer than look great.

Middling

I recently watched a video covering Silent Hill 2, a game for the PS2, which came only four years after the Nintendo 64. I was very impressed by how good it looked; while the main character was definitely blocky, and the fog and fire effects looked like they were sponged on, the frame rate never dropped, and the pre-rendered cutscenes could have blended in with games much younger. Game storage had moved from cartridges and the occasional floppy disc to the new and much better CD-ROM. Silent Hill 2 was 1.8 GB. The device that loaded it, the PS2, had 32 MB of RAM, and that was plenty to run it. Games like those pushed the boundaries of what could be packed into a disc!

The great thing about these games is that even though they took up more space to look nicer, you could still generally tell how long the game was by the file size. People were no longer ‘making room’ for a ‘toy’ – they were creating a space for legitimate art and leisure. Computers were more widespread now, although not every family had one. Games were turning into art, into something most people wanted or already had experience with. Consoles, while almost always hot Christmas items ever since their conception, started turning games into ‘must-buys’. As such, sloppier games and games that took up a lot of space now had permission to exist. Games didn’t have to be so brutally efficient in their coding.

The 2010s

The Xbox 360 had entered the market, and LAN was becoming outdated. The console and the increasing internet speeds of the time meant that players didn’t have to get together to play together for everything anymore. Gaming consoles as well as computers were now capable of downloading large games directly from the internet, where before a disc or a cartridge would have been more efficient. This is around the time games start bloating. Characters looked really good – Grand Theft Auto 4 looked downright realistic compared to San Andreas, and adults who had played both could tell. Games, even for the PC, could be really effective sandboxes. The player had room for a whole world now, after all.

 Big games were usually still pretty long, but they were also becoming unpredictable. The Darkness (a game I really liked) took up 6.8 GB and several entire afternoons to beat – Sonic ’06, meanwhile, takes up 5.6 GB, and if the game weren’t slidey and difficult to navigate, could easily be done in a day or two assuming you found the will to finish it. Games could get boring much faster – unlimited potential and limited handholding meant that games like Minecraft could be really fun for hours, or get boring in thirty minutes. Storage space for PC games is no longer a promise of quality or length.

Even Better Graphics – How Big Is Too Big?

Doom: Eternal takes up 50 GB to download. While it is decently long – PCGamer says it took them 20 hours to complete the main quest, and HowLongToBeat says 12 hours – it’s also significantly bigger than the first Doom, which provided somewhere between 5 and 6 hours for just the main quests, at a mere 2.39 MB. You don’t have to be a math expert to tell that the ratios are way different. The textures that go into making Doom Guy’s gun now take up more room than the original game ever did. And is it worth it? How many copies of Doom could you actually download on your computer? Borderlands 2, another game on both PC and console, takes up 20 GB but provides around 30 hours of entertainment with just the main quest. The twist is that Borderlands 2 is the older game. In the time between GTA 4 and GTA 5, between Borderlands 2 and Doom, between Gears of War 3 and Gears of War 4, game studios have ballooned all the trappings that come alongside the game. However, the concepts of games themselves, at their core, don’t take up any more space than they used to.
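For fun, here’s the back-of-the-envelope math on those numbers, using rough midpoints of the quoted playtimes:

```python
# Back-of-the-envelope math using the sizes and playtimes quoted above.

doom_1993_mb = 2.39
doom_eternal_mb = 50 * 1024          # 50 GB expressed in MB

copies = doom_eternal_mb / doom_1993_mb
print(f"Copies of 1993 Doom per Doom Eternal install: {copies:,.0f}")  # ~21,000

# Install size per hour of main-quest playtime, using rough midpoints:
print(f"1993 Doom: {doom_1993_mb / 5.5:.2f} MB/hour")       # ~0.43 MB/hour
print(f"Doom Eternal: {doom_eternal_mb / 16:,.0f} MB/hour") # ~3,200 MB/hour
```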

Games stopped getting bigger for levels. They started getting bigger for detail. New Halo games are not longer than the old ones, on average, but they are still bigger. The detail of the levels is consuming valuable space that gamers with mid-tier rigs might like to save for other things. Like other games. Games that don’t hold themselves to hyper-realism in every new generation are finding their job much easier, but the Triple A studios are struggling to justify the expense and space consumption of a game that gets a ‘B’ on Steam. Triple A studios have come full circle, and are beginning to shut themselves out of parts of their market that they’d otherwise be guaranteed.

An unintended side effect is that indie studios are providing much more accessible games. A triple A studio is forced to let go of otherwise guaranteed customers because their game sizes and specs are keeping up with the top-of-the-line computers, not the mid- and low-tier ones the indie studios are aiming for. Indie games take fewer resources and provide a different experience – but they’re much closer to that original era of gaming, where their spot on a computer was very far from guaranteed. Small games can be just as fun and charming as big ones – especially when their size comes down to texture and engine lighting over more substantial things like story and gameplay, or AI.

Sources: https://www.nintendolife.com/news/2020/05/random_every_nintendo_64_game_ever_released_would_fit_onto_a_single_switch_cartridge

https://howlongtobeat.com/game?id=834

https://howlongtobeat.com/game.php?id=9364

https://howlongtobeat.com/game?id=1274

https://gamerant.com/halo-infinite-franchise-how-long-to-beat/

https://store.steampowered.com/agecheck/app/12210/

The Old Twitter Is Dead – Long Live Twitter

Elizabeth Technology September 12, 2023

What Made You Think You Could Just Do That?

When a company is big enough to become an official channel of communication for the White House, it’s not shocking that jerking it around in an effort to break things off of it is going to break a lot more than leadership bargained for. Twitter, now X, is experiencing quite a bit of seismic activity in response to their rebrand.

Purely From A ‘Visibility of Leadership’ Viewpoint

I’ve said it before and I’ll say it again, the current CEO’s reputation as a funny rich guy is going down the pipes. He launched a car into space! Haha, what a Bond villain! He sold a flamethrower! Haha, what a Bond villain! He smoked a joint on Joe Rogan’s show! Haha, what an Everyman. He illegally started taking Twitter’s old sign down and replaced it with an incredibly bright one that strobes every twenty seconds! Haha…. what… but that’s not….that’s like… Lex Luthor, not Tony Stark. A man who made a program that Twitter bought had to tweet at Twitter leadership to ask whether or not he was still going to receive the money he was owed for his work, and the CEO, barely researching the issue at all, tried to embarrass him into dropping it with private medical info that Twitter The Company did not have the right to share. That’s not even Lex Luthor – that’s abominable.

 It’s important to know these things for context. Musk, the current CEO, is not and has never been playing 4-D chess with this purchase. Turning Twitter into X instead of simply making X is a result of impulsive tweets and communications that (falsely) boosted the hopes for Tesla’s stock price. Remember how hard he was trying to get out of it? Especially the stuff about the bot count? He didn’t want it. But not following through would have landed him in hot water with the US government for stock manipulation, perhaps even insider trading.

He could have made his own social media platform instead of buying the most expensive one on the market, and he probably would have been better off for it if he hadn’t been memeing, but he had something to prove. He had to prove he was funny and cool and so rich he could just buy Twitter on impulse.

Being Important

Musk’s incredible wealth has insulated him from consequences. Worse, Twitter’s status as an important communications tool is delaying further consequences for the company itself. The companies hosting the servers are reluctant to shut it down or throttle it. Advertisers pulled away from Twitter, but losing their money didn’t turn off the lights like so many predicted it would. Meeting the definition of ‘doing business with’ terrorist organizations may have already triggered investigations by the US government, but they’re moving so slowly it’s impossible to tell what’s happening. The company itself is running on a skeleton crew, but the people remaining are effectively held hostage by visa requirements or somehow believe they can fix what’s been broken. View rate limits keep people from scrolling perpetually like they always have. Despite everything, despite waves and waves of bad choices, bad updates, firings, missed rent payments, bad sources of income, all the things that used to take down giants like Sears, MySpace, Kmart, et cetera, Twitter lumbers on, a giant among giants. The landlord can’t even get them out of the building despite Twitter tampering with the signs outside of it without a permit. Part of the reason people are so eager for the ship to go down is that by all rights, Twitter should have died already. And yet users keep going back! Twitter keeps limping forward! Few websites have ever been able to keep crawling forward like this after getting kneecapped, but by golly Twitter is hanging in there.

One More Sign, One More Change, One More Anything

The sign’s hanging in there too! The sign(s), actually. The whole reason I’m writing this article is because of the signs and what they represent within Twitter, now X. We know rebranding to X is wiping out a ton of brand recognition. How does Musk intend to make up for all the lost years of bird? By changing the sign on the San Francisco Twitter building into a giant, blinding white Unicode character X. This is after Twitter tried to take down the old sign without getting the building’s permission first, and the cops came and stopped them from taking letters off before they were even half done. Both signs, both bad, were on the building at the same time, and neither had the necessary permits from the city to be in the state it was in, either partially torn down or powered up.

Musk’s blinding X sign was an unusual sight on the San Francisco streets because signs like that are a genuine danger to drivers at night, so nobody else puts them up. It’s likely that they’re not allowed to. Twitter happened to be right across from a residential building (which is occupied even on the weekends), but even if it weren’t, several thousand lumens of strobing, flashing light is irritating unless you’re actively seeking it out at shows and such. It didn’t stay. As with the old sign’s removal, Musk did not get permits to put this new one up – it wouldn’t have strobed like that if he had. It wouldn’t have been as bright, or as annoying, if he’d just gone through the process and let someone tell him no.

History of the Emulator

Elizabeth Technology August 24, 2023

An emulator is a program that emulates a game console, usually for the purpose of playing a game that is – either by price, age, or device – inaccessible. Streamers commonly use emulators to play Pokemon games made for the Game Boy, so they can screen-record their gameplay directly from their computer instead of having to somehow hook the Game Boy up to it. Zelda fans might want to play Ocarina of Time, only to find that the console to play it on is awfully expensive for one game, while an emulator is pretty cheap! In certain cases, games are geolocked – countries restrict access to certain forms of art as a means of censorship. Emulators can make those games accessible to people who want to play them in that country.

In the 1990s, consoles were on top when it came to games. Computers were rapidly gaining in power, however, and some folks realized that the console could be recreated using a home computer. The first emulators were born via reverse-engineering console coding. They evaded legal action by only copying devices that were outdated, but that changed too with a major emulator made for the Nintendo 64 while it was still in production. Nintendo pursued legal action to stop the primary creators, but other folks who had already gotten their hands on the source code kept the project going.

Ever since then, emulators have lived in a delicate balance of making games available and making them so available that the parent company decides to step in and try to wipe it out, which is nearly impossible once it’s out on the open web. Gamers simply won’t allow a good emulator to die!

Copyright

Copyrights are crucial to the gaming ecosystem, and it’s a delicate balance of allowing fan art, but disallowing unauthorized gameplay. Allowing game mods, but disallowing tampering that could lead to free copies being distributed against the company’s wishes. Copyright laws are always evolving – new tech comes with new ways to copy, create, and distribute intellectual property. Generally, though, copyright falls back to permission: did the original company intend for their IP to be used in this way?

Emulators and copyright don’t get along very well at all! Emulators are, by their very definition, creating access to the game in a way the original company didn’t intend. As such, it’s unofficial, and if money is exchanged, it’s not normally between the copyright holder company and the customer, it’s the customer and some third unauthorized party.

Games aren’t selling you just the physical disc. You’re buying a license to play the game. If you take it as far as Xbox intended to back when the Xbox One was coming out, friends are only allowed to come over and play with you on your license because the company can’t enforce it. It’s a limitation of the system that they can’t keep you from sharing discs.

Not every company thinks like this (see the PlayStation 5), but that’s the most extreme possible interpretation. You bought a disc so you could play a copy of their game that they have licensed out to you. You own the right to play that copy of the game; you don’t own the game itself.

Consider: Death of a Console

When a console dies, it takes all of its content with it. There is no more money to be made off of it, and the games are going to slowly disappear into collections and trash bins.

Does art need to exist forever, or is it okay if some art is temporary? Not every Rembrandt sketch is still in trade – some were just sketches, and he obviously discarded some of his own immature art. Immature art is interesting to see, but it’s not what the artist wanted their audience to see. Otherwise it would have been better kept. Think about the ill-fated E.T. game that Atari made. They weren’t proud of it, they didn’t want it seen, and they saw fit to bury it. So they buried it. It was directly against their wishes for people to find this game and then play it. Emulating it is obviously not what the programmers who made it wanted for it.

But then consider all the little games included on a cartridge that’s just forgotten to the sands of time, made by a programmer who didn’t want it to fade away. Acrobat, also for the Atari, isn’t very well-remembered, but it still made it onto Atari’s anniversary console sold in stores. 97 games on that bad boy, and Acrobat was included. It’s not a deep game – it’s nearly a single-player Pong. But the programmers who made it didn’t ask for it to be excluded from the collection, so some amount of pride must exist over it, right? Does the game have to be good to be emulated? Is only good art allowed to continue existing officially?

Is all art meant to be accessible to everyone?

If some art is made with the intent to last forever, is it disregarding the creator’s wishes to not emulate it, against their production company’s wishes?

If art’s made to last forever but the artist (and society) accepts that that’s simply unrealistic, is it weird to emulate it, in the same way it’s weird to make chat-bots out of dead people? Every tomb we find, we open – even against the wishes of the grave owner, in the case of the Egyptians, or against the wishes of the living relatives, in the case of Native Americans. Emulated video games are kind of like tombs for games that have lived their life and then died. But they’re also kind of like art.

When you get past the copyright, it’s a strange, strange world to be in.

Ethical Dilemma

Stealing goes against the ethics of most societies, modern or not. The case against emulators is that it’s stealing. It often is! An emulator/ROM (ROMs act as the ‘disc’ or ‘cartridge’ for the emulator) for Breath of the Wild was ready just a few weeks after the game launched, which could have seriously dampened sales if Nintendo didn’t step in to try and stop that. That first emulator, the one for the Nintendo 64, also drew a lot of negative attention for the same reasons, potentially siphoning away vital sales.

However, there’s a case to be made for games and consoles that aren’t in production anymore.

Is this a victimless crime, if the original game company really can’t make any more money off of it? It’s one thing to condemn piracy when the company is still relying on that income to make more games and pay their workers, it’s another entirely when the game studio isn’t interested in continuing support, and the console had a fatal fault in it that caused many of them to die after 10 years. That game is as good as gone forever without emulators. With no money to be made, why not emulate it?

In less extreme circumstances, the console’s still functioning, but the cartridges that went to it are incredibly rare. The company could potentially make money off of the game if they someday decided to remaster it, but that’s unknowable. Licenses could be available for purchase… but they aren’t right now.

Or, even better, the cartridges are still available for purchase in the secondary market. You just don’t happen to have the console, which has now spiked to a cost of 400 dollars due to reduced supply over time. You buy the cartridge – you’re still buying the license, you just don’t have the car, right?

According to copyright, you need a specific car for a specific license, but ethically, you’ve done the best you can as a consumer.

Assuming you have tried to buy a license for the car. The biggest issue with emulators is that they allow unlicensed drivers access to cars, making piracy much easier than it should be.

Brand Name

Much like Disney did with Club Penguin’s many spinoffs, emulators are kind-of sort-of overlooked up until they start eating into sales. Most companies just don’t want to spend money to enforce an issue like emulators – their game is still being played, their brand is still out there, and the users are going to be very upset if this big company decides to step in and ruin fun when they don’t need to. It may do more harm than good to try and wipe the emulator out when most people want to do the right thing.

Obviously, they’ll need to put a stop to emulating new games – the goal is to spend just enough money to do that effectively without also overstepping and destroying emulators for consoles no longer in production. It takes money to make games, games should earn money as a result. Removing emulators for games and consoles no longer in production isn’t helping them earn money – as such, many are allowed to stay. For now.

Sources:

https://www.pcgamer.com/the-ethics-of-emulation-how-creators-the-community-and-the-law-view-console-emulators/

https://scholarlycommons.law.northwestern.edu/njtip/vol2/iss2/3/

What’s Up With VHS Tapes?

Elizabeth Technology August 17, 2023

Yeah, CDs are impressive or whatever, but have you ever seen the inside of a VCR?

What makes a VHS different from other options?

Many things! Its entertainment predecessor, film, was dropped for a couple of reasons. Film is composed of individual images on thin, photosensitive tape, where magnetic tape stores the image’s information translated into magnetic signals. A reel film projector shines a light behind the reel to show the image. If you shined a light behind a VHS’s tape, you’d see nothing but brown! Plus, you can pause VHS tapes. Pause a reel film on the projector without moving the tape, and you risk burning it.

Betamax, VHS’s primary competitor, was arguably superior in every way. Betamax had better resolution, better sound quality, etc., and it came out at around the same time as the original VHS tapes. What separated the two was cost: a Betamax tape was more expensive than a VHS, and since VHS was only marginally worse, companies picked it up. VHS tapes could also record more, but since most movies were under three hours anyway, that didn’t do as much for it as pricing did. CEDs (which were actually discs, not tapes), Hi8 tapes, better, smaller reel tapes, and the rest were also vying for the ‘primary choice’ crown – and the VHS beat them all with durability.

From the beginning, VHSs were kind of an underdog. Radically new tech was always coming and going, and VHSs could have been another flash in the pan, disappearing the next day like CEDs did. The first company to launch VHS tapes set up standards to prevent VHSs from dying out due to quality issues, but widespread adoption would be up to marketing and luck. Plenty of ideas that were good on paper died once they were actually put into manufacturing.

How does the tape itself work?

Where CDs and DVDs have no moving parts, VHS tapes are full of them. The tape itself moves on spools, and that forces VCRs to read the data linearly (instead of randomly). All that means is that the VCR has to read past the rest of the tape before it can get to the part you’re looking for, where something like a hard drive can pick a file without reading other files first.
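As a loose software analogy (nothing VHS-specific, just the access pattern), the difference looks like this, with ‘movie.bin’ standing in for the recorded data:

```python
# Loose analogy for random vs. linear access. "movie.bin" and FRAME_SIZE
# are stand-ins, not anything VHS-specific.

FRAME_SIZE = 4096    # hypothetical bytes per frame
target = 1_000       # index of the frame you actually want

# Hard-drive-style random access: jump straight to the frame.
with open("movie.bin", "rb") as f:
    f.seek(target * FRAME_SIZE)
    frame = f.read(FRAME_SIZE)

# Tape-style linear access: wind through every frame before it.
with open("movie.bin", "rb") as f:
    for _ in range(target + 1):
        frame = f.read(FRAME_SIZE)
```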

Additionally, the tape is not a loop, it’s a strip. Many media types got caught up in trying to make a self-rewinding form of media, but the tech simply wasn’t there yet to do it cheaply. If you got to go to a Blockbuster before they switched away from tapes, they would kindly remind you to rewind the tape after you were done watching it so the next person to rent the movie wouldn’t have to rewind it first. Since the reader works on the tape in both directions, skipping that step can spoil the movie.

The information is encoded onto the tape in a couple of areas: there’s a control track, an audio track, and a visual track. The reader can’t read the tape without the audio or control track – trimming either off will cause the tape to fail. A reader head is actively looking for the control track to synchronize with the other tracks, which will ‘pulse’ in sync with each other to ensure correct alignment. If it can’t find it, it doesn’t have a backup plan!

Visual information is encoded onto the tape using two separate writing heads held at a slight angle. The data is magnetized into the tape in an almost herringbone-like pattern, which the VHS can read fast enough to generate smooth images on-screen. This has the added benefit of ‘self-correcting’ – each reader head only reads the data slots that are at its angle, so there’s no weird flashing or jumping between frames. Provided the end user isn’t doing something strange to the tape, VHSs run pretty smoothly as a result.

How does the reader work?

The reader is composed of a motor, some internal mechanisms to control the speed of the tape, and a couple of reading and writing heads. To play content, the VCR pulls the tape in front of its readers, which then decode the information written on the magnetic tape into video. The tape itself is divided into separate areas for audio and video, as well as a timing track. Different heads along the inside of the VCR read the tape as it’s pulled by, and rollers keep it taut between them to prevent tangling.

If one wants to write to a VHS tape, their VCR should be capable – all but the cheapest usually are. VCRs completely revolutionized the entertainment industry by enabling the consumer to record particular episodes or events cheaply. Suddenly, a TV show didn’t need to re-run an episode five or six times to be sure their fans saw it. Their other revolutionary trait was being able to do this when the user wasn’t home – again, all but the cheapest of VCRs were able to record at a set time, with minimal user intervention.

VCRs are specially adapted to reuse VHS tapes. It’s possible to tape over other tapes because the VCR, while in writing mode, erases the tape as it goes by so that the writing head has a clean surface to write to. “Taping Over” something persists to this day, even though very few consumer devices use tape anymore!

Durability

 VHS tapes are pretty durable – but they aren’t invincible. No form of media is! VHS tapes are vulnerable to many of the same things hard drives are: excessive heat may cause warping and a loss of quality, cold and radiation exposure can ruin the information on the tape. Unlike reel film, however, VHSs don’t become worthless when exposed to light. The tape shouldn’t be out of the container, but it’s not ruined if it somehow gets stuck outside the casing for a little while.

It takes a little bit of hunting to find working VCRs, but luckily they’re so simple that even broken ones can be used again. Replacement parts are still sold in specialty stores and online!

Assuming digital content really is the future forever and physical media declines, there are things you can do to convert tapes if you’re worried your home movies aren’t storing well in the attic. VHS-to-digital converters are available for purchase, for example, and places that do photo-printing also frequently offer mail-out services for conversion.

Sources:

http://repairfaq.cis.upenn.edu/sam/icets/vcr.htm

https://www.zimmermantv.com/tv/how-a-vcr-works/

http://aperture.stanford.edu/lab/video/Tutorials/vhsvhsdub.html

https://southtree.com/blogs/artifact/what-came-before-vhs

Parallel ports: A Brief History

Elizabeth Technology August 15, 2023

Parallel and serial ports used to be everywhere, and now they’re more or less limited to ancient printers and old iPads. What happened to them?

Serial

Serial ports came first, but just barely. Serial in this context means that the information is processed one stream at a time, which the receiving device will then have to stack up to read. For example, printers: a serial connection on a printer would give the printer the ASCII data one bit at a time, and it’s up to the printer to stack up the bits to make the words.

But if the data’s being transferred one bit at a time, why does it need so many pins?

On a computer, each pin on a serial port does something different – some regulate the out- and in-put speed, some are purely for grounding the connection, and some are responsible for transferring the requests for data between the computer and the peripheral it’s connected to. And each peripheral has different needs! A mouse or a CNC is going to need more information about the data than a printer or a bar code scanner. There might also be a parity pin, which ensures the data sent is correct. They came in all sorts of shapes and sizes, from circular 7-pins to trapezoidal 25-pins for motherboards.

Serial ports are actually faster (now) than parallel ports, even though the data’s transferring one bit at a time. If you can make each bit transfer faster, then the entire serial port speeds up with it, because nothing else has to stay in lockstep with that one line. Serial ports could keep up with computers as they improved. Parallel ports have to be sure that their data’s being received all at the same time: if one pin can’t be optimized any more than it already is, then that pin holds back the speed of the data for the other data-transferring pins. If the data doesn’t make it at the same time, then the computer doesn’t know how to interpret it. Imagine receiving parts for an IKEA chair out of order and being told you had to start assembling it now even though you don’t have the legs or screws yet.
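Those per-connection settings (data bits, parity, stop bits, speed) still exist in software today. Here’s a hedged sketch using the third-party pyserial library; the port name is a placeholder for whatever your adapter actually shows up as:

```python
# Sketch: opening a classic serial connection with the third-party pyserial
# library (pip install pyserial). The port name is a placeholder -- on
# Windows it might be "COM3" instead.

import serial

ser = serial.Serial(
    port="/dev/ttyUSB0",            # placeholder device path
    baudrate=9600,                  # bits per second: the knob that kept scaling
    bytesize=serial.EIGHTBITS,      # data bits per frame
    parity=serial.PARITY_NONE,      # the optional parity check mentioned above
    stopbits=serial.STOPBITS_ONE,   # framing bit that marks the end of a byte
    timeout=1,                      # seconds to wait on reads
)
ser.write(b"hello, printer\r\n")    # bits leave one at a time
reply = ser.read(64)                # read up to 64 bytes back
ser.close()
```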

Parallel

Parallel ports actually appeared at about the same time as serial ports, and allowed for multiple streams of bits (the ‘parallel’ part) instead of just one. The port was feasible in the 1970s, but the first commercial parallel port appeared on IBM printers in the early 1980s. Printers were where they found most of their use. The pins sped up printing by presenting the ASCII characters (ASCII is a character code that uses sets of binary digits to represent letters) to the printer all at once, instead of serially.
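To make ‘all at once, instead of serially’ concrete: a parallel printer port puts all eight bits of an ASCII character on eight data pins in the same instant, where a serial link clocks the same bits out one per tick on a single wire. A toy illustration:

```python
# Toy illustration: the same ASCII byte presented in parallel vs. serially.

char = "A"
bits = format(ord(char), "08b")    # ASCII 65 -> "01000001"

# Parallel: all eight bits land on eight data pins in the same clock cycle.
print(f"parallel pins D7..D0: {' '.join(bits)}")

# Serial: the same bits leave one per clock tick on a single wire
# (least significant bit first, as classic UARTs do).
for tick, bit in enumerate(reversed(bits)):
    print(f"tick {tick}: {bit}")
```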

However, parallel ports came with a couple of problems. They couldn’t match a serial port’s speed once per-bit transfer times plummeted, and the three major companies attempting to use them for their printers came up with different protocols for each operating system, so everything had to be double-checked for compatibility.

Where’d They Go?

USB has since taken over much of the parallel port’s turf, and where USB is inconvenient, network printing rules supreme. There’s not much space left for these parallel pin plugs out in the wild. They’re still around – people still need access to legacy machines no matter the industry or time – but they’re not usually on regular, consumer electronics anymore.

And yet, they aren’t extinct. Serial ports still exist on old or simple tech that can’t take high speeds and still function, things like scientific equipment, or stenotype machines. Because the transfer’s tightly regulated, serial ports avoid overloading the tiny computers inside these several-thousand-dollar instruments.

Universal Serial Bus (or USB) plugs use similar tech, just highly compressed and much faster. USBs are also transmitting data serially, hence the ‘serial’ in the name. Parallel ports may have been left behind, but serial’s sticking around. If you look at the inside of the actual connecting piece, you’ll still see pins, albeit different ones than the kind serial connectors used to use.

 Serial ports represent a major breakthrough in data transfer tech, and they’ve stuck around to this day!

Sources:

https://www.howtogeek.com/171947/why-is-serial-data-transmission-faster-than-parallel-data-transmission/

https://computer.howstuffworks.com/serial-port.htm