Posts Tagged: hardware

Magnetic Storage Types

Elizabeth Technology March 16, 2023

Magnetic Tape

The most well-known version of tape-based magnetic storage is the kind used for media. When tape-based recording was first introduced, it revolutionized the talk show and DJ-ing scene of the time (mostly post-WWII) because it enabled shows to be recorded and played back later rather than broadcast live. Music recording tech already existed, but it required physical interaction from the DJ, so it wasn’t as hands-off as tape was.

The second-most well-known version is the kind used for computer memory! Data is stored on the tape in the form of little magnetic ‘dots’ that the computer can read as bits. Before each pocket of data dots is a data marker that tells the computer how long that pocket should be, so it knows when one set of data ends and the next begins. The polarity of each dot determines its bit value, and the computer can then read all these dots as binary code.
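
If you like thinking in code, here’s a toy sketch of that marker-then-data idea in Python. It’s purely illustrative and not any real tape format: each ‘pocket’ of bits gets a length marker in front of it so a reader can tell where one pocket ends and the next begins.

```python
# Toy illustration only -- not a real tape format. It mimics the idea of a
# length marker followed by a block of magnetic "dots", where polarity maps
# to a bit value.

def write_block(tape, bits):
    """Append a block: an 8-bit length marker, then the data bits."""
    length_marker = [int(b) for b in format(len(bits), "08b")]
    tape.extend(length_marker + list(bits))

def read_blocks(tape):
    """Walk the tape, using each length marker to find where a block ends."""
    pos, blocks = 0, []
    while pos < len(tape):
        length = int("".join(str(b) for b in tape[pos:pos + 8]), 2)
        pos += 8
        blocks.append(tape[pos:pos + length])
        pos += length
    return blocks

tape = []
write_block(tape, [1, 0, 1, 1])        # first "pocket" of data
write_block(tape, [0, 1, 1, 0, 0, 1])  # second pocket
print(read_blocks(tape))               # [[1, 0, 1, 1], [0, 1, 1, 0, 0, 1]]
```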

This method of data storage was a massive breakthrough, and other mediums continue to use the format even today! Tapes are still in use for big stuff – parts of IBM’s library rely on modern tapes, which can now store terabytes of information at a higher density than disks and flash drives alike. Other memory types relying on magnetic domains include hard disks and drums, to name a couple. All that separates them is material and know-how: the better the magnetic coating, the smaller the domains can get. The better the insulation between the domains and regular old entropy, the more stable the data is!

Carousel Memory

Carousel memory was an attempt at shrinking the space that magnetic tape took up, taken to the extreme. Instead of one very long piece of magnetic tape on a bobbin, the carousel memory system used several smaller reels of tape arranged in a carousel around a central read mechanism. Getting to the right info was as simple as selecting the right reel! This had some issues, as you might imagine. Moving parts add complications and an increased risk of mechanical failure to any device, and a device carrying thin, delicate magnetic tape is an especially bad place to start.

However, it wasn’t all bad. Carousel memory was actually quite fast for the time because it didn’t have to rewind or fast-forward as much to get to the right area of code. It could skip feet of tape at a time! This advantage declined as tape tech improved, but it still helped companies trying to squeeze the most life from their machines. The bobbins and individual ribbons were all replaceable, so the tape wasn’t worthless if it got torn or damaged. The carousel itself was also replaceable, so the many moving parts weren’t as much of a curse as they’d be on, say, the first hard disks, which had irreplaceable heads.

Core Rope Memory

Core rope memory featured magnetic grommets, or ‘cores’, on metal ‘ropes’, and those ropes were then woven into a fabric the computer could read. In ROM (read-only memory) format, if a wire went through a core, it read as a ‘one’, or a ‘yes’. If it bypassed the core, it read as a ‘zero’, or a ‘no’. In this way, the fabric was physically coded into binary the computer could use. ROM’d core rope memory involved quite a bit of complicated weaving and un-weaving to get the wires through the right cores.
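
Here’s a tiny, purely illustrative Python model of that read-only idea. The wiring layout below is made up, but it shows how ‘threaded through this core or not’ becomes a pattern of ones and zeroes.

```python
# Toy model of read-only core rope, not the Apollo wiring. Each core is
# "pulsed" and every sense wire either passes through it (reads as 1) or
# bypasses it (reads as 0), so one core can hold one bit per sense wire.

# Which cores each sense wire is threaded through (hypothetical layout).
threading = {
    "wire_0": {"core_A", "core_B"},
    "wire_1": {"core_A"},
    "wire_2": {"core_B"},
    "wire_3": set(),  # threaded through neither core
}

def read_word(core):
    """Pulse one core and collect a bit from every sense wire."""
    return [1 if core in cores else 0 for cores in threading.values()]

print(read_word("core_A"))  # [1, 1, 0, 0]
print(read_word("core_B"))  # [1, 0, 1, 0]
```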

Core rope memory was chosen over tape memory for the Apollo missions, mainly for weight purposes. Tape was great, but not nearly dense or hardy enough for the mission yet, and neither were the other core modules available to NASA. A read-only core rope module could store as many as 192 bits per core, where erasable core memory could only manage one bit per core. Each core in the read-only module depended on the wires threaded through it to determine the bits’ states, while the erasable model (core memory) read the core’s own magnetic state, not the threads going through it. The final module sent up to the moon weighed around 70 pounds and read fairly quickly. No tape, core memory, or hard disk available at the time could have matched that weight or speed.

Core rope memory has its place. It’s very sturdy, and since the bits are physically woven in, it’s possible to visually identify bugs before the memory’s even used, unlike with core memory. It’s sometimes called ‘software crystallized as hardware’ because of the way the program is built into the cores. It isn’t seen much today, since it’s still incredibly bulky, but at the time of its use it was revolutionary.

Core Memory

Core memory is the older sibling of core rope memory, and it stores less. However, the people who got to work with it call it one of the most reliable forms of memory out there! Core memory works much the same as core rope memory, where the bits are stored in cores.

However, the formats are different. If core rope memory is like a binary-encoded scarf, core memory is more like a rug. Thin threads made of conductive material are woven into a grid pattern, with cores suspended where the threads cross each other. The computer treats these threads as address lines, so asking for a specific bit is as simple as locating the X and Y address of its core. A third set of lines, the sense lines, runs through each core on the diagonal, and these are the threads that do the actual reading.

When asked to read a bit, the computer sends current down the selected X and Y address lines, and the sense line reports whether that core flipped its magnetic polarity. If it didn’t flip, it was a zero. If it did, it was a one, and the reading process has just flipped it to zero. This method is known as ‘destructive reading’ as a result; however, the computer compensates by flipping the bit back to where it was after the read. Due to its magnetic nature, the core then keeps this info even after power to it is cut!
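
A minimal sketch of that destructive-read-then-rewrite cycle, with the coincident-current electronics left out entirely:

```python
# Minimal sketch of destructive read with write-back; real core memory does
# this with coincident currents on X and Y lines, which isn't modeled here.

grid = [[1, 0], [0, 1]]  # magnetic polarity of each core: 1 or 0

def read_bit(x, y):
    """Read a core by forcing it to 0, noting whether it flipped,
    then restoring the original value (the 'write back' step)."""
    flipped = (grid[y][x] == 1)   # a '1' core flips when driven to 0
    grid[y][x] = 0                # the read itself destroys the bit...
    if flipped:
        grid[y][x] = 1            # ...so the controller rewrites it
    return 1 if flipped else 0

print(read_bit(0, 0))  # 1
print(read_bit(1, 0))  # 0
print(grid)            # unchanged: [[1, 0], [0, 1]]
```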

The magnetic core memory tutorial linked in the sources below is an excellent, interactive diagram of the system.

Even though this improved the bit-to-space-taken ratio, core memory still aged out of the market. With the price per bit decreasing rapidly, core memory got smaller and smaller, but the nature of its assembly meant it was almost always done by hand – all competitors had to do was match the size and win out on labor. Soon, its main market was taken over by semiconductor chips, which are still used today.

Magnetic Bubbles

Magnetic memory has had strange branches grow off the central tree of progress, and magnetic bubble memory is one of those strange shoots. One Bell Labs engineer (who developed several other forms of memory for AT&T as well) came up with bubble memory. It never took off in the same way other magnetic memory styles did, although it was revolutionary for its compact size – before the next big leap in technology, people were thinking this was the big leap. It was effectively shock-proof! Unfortunately, better DRAM chips took off shortly after it hit the market and crushed bubble memory with improved efficiency.

Anyway, bubble memory worked by moving the bit to be read to the edge of the chip via magnetic fields. The magnetic field is what moves the bits, much in the same way electrons move along a wire when a voltage is applied, so nothing is actually, physically moving within the chip! It was cool tech, and it did reduce space; it just didn’t hold up to semiconductor memory chips. Bubble memory saw a spike in use during a chip shortage, but it was so fiddly that as soon as DRAM chips were available again, it went out of style.

Semiconductor DRAM – Honorable Mention

DRAM chips are a lot like core memory, in that the device is reading the state of a physical object to determine what the bit readout is. In semiconductor chips, that physical object is a tiny capacitor, hooked up to a tiny transistor, built on semiconductive metal-oxide material. Instead of determining magnetic state, the device checks whether the capacitor is charged or not. No charge = 0, charge = 1. These chips aren’t technically magnetic, but since they’ve killed so many of the other options, here they are!

DRAM stands for Dynamic Random-Access Memory. ‘Random access’ means the memory can be read in any order instead of linearly – as long as the computer knows where the data’s stored, it can pull it without pulling other files first – and ‘dynamic’ refers to the fact that the capacitors slowly leak their charge and have to be refreshed constantly. They’re still being sold today!
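
As a rough illustration (not real DRAM circuitry or timing), here’s the charged-capacitor idea in a few lines of Python, including the refresh pass that the ‘dynamic’ part refers to:

```python
# Rough sketch of the idea only: each cell is a leaky 'capacitor' holding a
# charge, read as 1 if charged and 0 if not, with a periodic refresh pass.
# Real DRAM sense amplifiers, timing, and leakage physics are not modeled.

cells = {addr: 0.0 for addr in range(8)}   # charge level per address

def write(addr, bit):
    cells[addr] = 1.0 if bit else 0.0

def leak():
    """Capacitors slowly lose charge between refreshes."""
    for addr in cells:
        cells[addr] *= 0.8

def refresh():
    """Check every cell and top its charge back up."""
    for addr in cells:
        cells[addr] = 1.0 if cells[addr] > 0.5 else 0.0

def read(addr):
    return 1 if cells[addr] > 0.5 else 0

write(3, 1)               # random access: go straight to address 3
leak(); refresh()         # without refreshing, the stored 1 would fade away
print(read(3), read(4))   # 1 0
```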

Magnetic Disk (Hard Disk Drive)

Hard drives work more like tape than core memory. A hard drive is a platter (or a stack of platters) with a read-write head hovering above it. When you want to save data, the head magnetizes tiny areas of the platter to represent that information in binary. When you want to read or recover that data, the head interprets those areas as bits, where the polarity of each magnetized zone is either a zero or a one.
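
Here’s a toy write-then-read sketch of that idea in Python. Real drives use servo tracking, error correction, and far denser encodings, and the track/sector layout here is just for illustration, but the polarity-as-bits principle is the same.

```python
# Minimal sketch of the write-then-read idea; real drives add servo tracking,
# error-correcting codes, and much denser encodings than one zone per bit.

platter = {}  # maps (track, sector, offset) -> polarity: +1 or -1

def write_byte(track, sector, value):
    """The 'head' magnetizes eight zones, one polarity per bit."""
    for offset, ch in enumerate(format(value, "08b")):
        platter[(track, sector, offset)] = +1 if ch == "1" else -1

def read_byte(track, sector):
    """Read the eight polarities back and reassemble the byte."""
    bits = "".join(
        "1" if platter[(track, sector, offset)] == +1 else "0"
        for offset in range(8)
    )
    return int(bits, 2)

write_byte(0, 3, 0x41)
print(hex(read_byte(0, 3)))  # 0x41
```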

The zones of magnetization are incredibly tiny, which makes hard drives one of the more demanding memory forms out there, both now and back then.

Early hard drives could suffer from ‘de-magnetization’, where a magnetic disk’s domains were too close and gradually drew each other out of position, slowly erasing the information on the disk. This meant that the disks had to be bigger to hold the data (like everything else at the time) until better materials for data storage came along. Even though they held more capacity than the alternatives at launch, they were often passed over for smaller and more stable stuff like tapes and core memory. The very early drives developed by IBM were huge. Like, washing machine huge. They didn’t respond to requests for data very quickly, either, which further pushed reliance on tape and core technology.

Over time, hard disks improved dramatically. Instead of magnetic zones being arranged end-to-end, storing them vertically next to each other created even denser data storage, enough to outcompete other forms of media storage entirely. Especially small hard drives also come with a second layer of non-magnetizable material between the first layer and a third layer of reverse-magnetized ‘reinforcement’ which keeps the data aligned right. This enables even more data capacity to be crammed into the disks!

Sometime in the ’80s, hard drives finally became feasible for personal computers, and since then they’ve been the standard. SSDs, which don’t have any moving parts whatsoever, are gaining ground in the market, but due to different storage techniques they can’t be truly, irrevocably erased the way hard drives can. Hard drives are going to stick around a while as a result, especially in the medical and military industries!

Sources:

https://spectrum.ieee.org/tech-history/space-age/software-as-hardware-apollos-rope-memory

https://www.apolloartifacts.com/2008/01/rope-memory-mod.html

https://electronics.howstuffworks.com/vcr.htm


http://www.righto.com/2019/07/software-woven-into-wire-core-rope-and.html

https://www.computerhistory.org/revolution/memory-storage/8/253

https://nationalmaglab.org/education/magnet-academy/watch-play/interactive/magnetic-core-memory-tutorial

https://www.rohm.com/electronics-basics/memory/what-is-semiconductor-memory

https://cs.stanford.edu/people/nick/how-hard-drive-works/

https://psap.library.illinois.edu/collection-id-guide/audiotape

https://www.engadget.com/2014-04-30-sony-185tb-data-tape.html

https://en.wikipedia.org/wiki/Carousel_memory (all primary sources regarding carousel memory are in Swedish)

Wii: A Masterpiece of its Time

Elizabeth Technology February 21, 2023

The Wii, a motion-controlled game console, used a combination of technologies to read your movements. The Wii was a truly special device!

Hardy Equipment

If you could only look at consoles to compare them, the Wii would have an advantage. It stands straight up, like a book on a shelf! It’s also much smaller. Other consoles can be stood up straight, but it’s not always advisable – if doing so blocks a vent, the console can overheat and then die. There were even recent reports advising against standing the PlayStation 5 vertically, because its liquid-metal cooling could shift and leak over time, which is not good.

Aside from configuration, the Wii was the weakest of its generation of consoles, but that was actually still a selling point – the device was so cheap because almost all of the interior computing hardware came ‘off the shelf’, which made it weaker, but meant the consumer was paying less for a device like no other on the market.

The Wii could sense motion in a way that other consoles simply had not dared to try – no doubt an Xbox or PlayStation attempt at the same idea would have cost three times as much as the Wii did.

Differing Technologies

The Kinect, a more novel approach to motion detection, was much more complex, but also more expensive. And Xbox’s mishandling of the new ‘always on’ era of gaming made it pretty contentious. PlayStation had the most success by simply trying to emulate what the Wii had going for it.

And what did the Wii have going for it? It used a sensor bar in conjunction with the console itself to sense where the controller was pointing. The sensor bar itself didn’t actually do anything but light up!

This meant that in a pinch, you could simulate a missing sensor bar with a couple of candles – the controller’s infrared camera uses the bar’s two lights as a frame of reference for where it’s pointing at any given time. Within the controller itself was an accelerometer, which allowed the machine to tell if you were spinning, shaking, swinging, or otherwise moving the remote. Nintendo even later produced an optional add-on (the Wii MotionPlus, which added a gyroscope) for games that required even finer motion tracking. The only downside was that controllers sometimes went through TVs or windows, which eventually stopped happening once users adjusted to the unfamiliar motions of bowling.
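
For the curious, here’s a very rough sketch of the pointing idea – not Nintendo’s actual math, and the camera resolution and mapping below are just assumptions for illustration. The remote’s camera reports two bright spots (the ends of the sensor bar), and their midpoint in the camera image maps to a cursor position on screen.

```python
# Very rough illustration of the pointing idea, not Nintendo's actual math:
# the remote's camera reports two bright spots (the ends of the sensor bar),
# and their midpoint in the camera image maps to a cursor position on screen.

CAM_W, CAM_H = 1024, 768        # assumed camera resolution for this sketch
SCREEN_W, SCREEN_H = 1920, 1080

def cursor_from_spots(spot_left, spot_right):
    """Map the midpoint of the two tracked spots to screen coordinates."""
    mid_x = (spot_left[0] + spot_right[0]) / 2
    mid_y = (spot_left[1] + spot_right[1]) / 2
    # Moving the remote right shifts the spots left in the image, so invert x.
    screen_x = (1 - mid_x / CAM_W) * SCREEN_W
    screen_y = (mid_y / CAM_H) * SCREEN_H
    return round(screen_x), round(screen_y)

print(cursor_from_spots((400, 384), (600, 384)))  # roughly the screen center
```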

Good Games

One of the biggest deciders of a console’s fate back in the 2000s was what games would be available on launch day. Wonder why so many consoles come with games already installed on them? It’s because that system benefits every party involved, and it may swing a buyer on whether or not to get the special edition of a particular console. Outside of built-ins, the console has to attract studios to make games, otherwise you end up with a catalogue full of repeats, sometimes even made by the console developers themselves. The Stadia, the Ouya, and a number of other small consoles made a great platform that didn’t have any games on it – none attractive enough to swing the purchaser, anyway.

The Wii, because it was made by Nintendo, was already hand-in-hand with a number of games from a brand known for being family friendly. For families looking for a new console that a child of any age could play, this was a fantastic option. It had zombie games alongside party games and sport simulators. It really was a game-changer.

Bad Sequel

Given all of this, the most disappointing part of the Wii is the Wii U, the next console in the line. Not enough was done to ensure users knew the Wii U was a different console. It sounds ridiculous, but it was a real problem! The Wii U looked just like the Wii to someone who didn’t own either, and the game cases didn’t do a great job of telling users which console they were buying for, so once it came out, there was always the chance that a well-meaning relative would buy the wrong edition of a game.

Similarly, Nintendo (as with many of its products) didn’t make enough Wiis for the first run… and then broke pattern by drastically overproducing the Wii U, a business decision that haunts the choices made by execs to this day (it was impossible to get a Switch for a good three or so months after launch).

Still – the Wii did set standards for what AR really could be, even without a helmet or anything too fancy. In a way, it’s got tons of sequels. PlayStation started using motion controls after the Wii proved it was not only possible, it was fun! And it opened the door to gameplay mechanics that engineers and programmers could have only dreamed of.

Could AR ever be used in an office setting?

Elizabeth Technology February 16, 2023

Home Offices

A home office is often a place of respite. Quiet. Calm. Personalized organization. Companies looking to save money on renting a space may go for work-from-home solutions, no matter their size, and even people who work in an office may still choose to make an office space in their home, whether that’s just a desk in the corner of the living room or a whole spare bedroom, because it makes paperwork and keeping important documents organized easier. In essence, the idea of a home office is incredibly customizable and flexible. If you call it your home office, and it’s not superseded by being a dining room table, it’s a home office.

So, when Zuckerberg announced plans to make ‘virtual offices’, many people were put off, but many more were intrigued. A home office is obviously not a perfect substitute for the kind a business rents out, for better or worse. Could Meta somehow improve it?

Fun and Games

What Zuckerberg presented combined the worst aspects of VR Chat, the worst aspects of Slack, and the worst aspects of the headset itself. The headset is designed to make you feel like you’re actually seeing a different environment when you move your head, and it does it so well that a percentage of people with VR headsets report headaches – the brain is receiving conflicting information that it can’t sort out, and it doesn’t like that.

The virtual office concept allowed you to look across a virtual desk with a virtual keyboard to see your virtual colleagues, who could perform gestures and small expressions to indicate some sort of feeling. The thing about this system is that it’s annoying – one of the benefits of working from home is not being in the work office, and being there in spirit while sitting in your home office pretty much cancels that out. Under this system, other users could theoretically tell when you’d stepped away – the feeling of being watched might be tolerable in the work office, but not in the home office, where workers expect to feel like they’re in their home and not in a panopticon.

Walmart Too…?

So many of these ideas seem to think that adding a need to traverse a 3D virtual space somehow improves a virtual experience. When Walmart premiered its virtual solution to online shopping, it assumed you might miss actually walking up and down the aisles – by far the worst part of going to a Walmart Supercenter. It added physics to items so your avatar could grab them and put them in the cart instead of just clicking buttons, which makes shopping take longer and also increases the risk of the application bugging out on you. It offered to link up to your smart fridge, so it could remind you that you already have milk while you’re grabbing more in the app and prompt you to confirm that you really did mean to buy it. The entire idea, from top to bottom, seemed to hope that you’d spend more money if the app made you work more.

This is not the way VR was meant to re-invent the office, the remote shopping experience, or any other experience that’s annoying or difficult. When customers are shopping in person, the other people are part of the experience (especially in small towns). When they’re shopping through an app, the customer has to be able to find what they want as easily as possible, with as little friction as possible, and it doesn’t get much simpler than searching for an item in a search bar and hitting ‘add to cart’. The virtual store is the worst of both worlds.

It’s almost as if these companies are retroactively coming up with things for the headset to do – things users already have easy access to – instead of actually researching and developing programs specifically for VR. VR shines brightest in games because of the way it functions, but if Facebook’s CEO doesn’t believe in the future of games as a product, then there’s going to be a lot of running around trying to make other products more game-like so they’ll fit better. Walmart’s VR demonstration felt like the dozens of existing games, across all genres, that already simulate everything from stocking shelves to driving trucks. It’s bizarre to build a virtual world that’s just as boring and simple as the real one – if you’re going to have a virtual Walmart or a virtual office, surely you can do something more entertaining with the surrounding environment than a copy of a place the user can already go visit at almost any time? That’s completely the wrong feeling, but it’s the one VR sinks into most naturally, because it’s the only real justification for the product being sold.

There’s room for AR, but not like this!

Cartrivision – Another Attempt to Curtail Home-Viewing

Elizabeth Technology February 9, 2023

Cartrivision was the very first tape system to offer home rentals. It was introduced back in 1972, and didn’t see very much mainstream success – you had to buy an entire TV to play the tapes, and some of the tapes were not rewindable.

You may have actually seen them before, in passing, in a documentary: Game 5 of the 1973 NBA Finals was recorded, but every other recording of it failed or was lost, so retrieving the footage stored on the one surviving Cartrivision tape became the documentarian’s obsession. The documentary even won an award.

What makes Cartrivision so special that just recovering one warranted a documentary?

This Was Super Expensive

As mentioned before, the Cartrivision player came built into a TV, and TVs were already expensive. The result was a device that cost the equivalent of a decent used car (about $1,300 in early-1970s money, or about $8,000 today). This, understandably, meant that the market for these devices was already kind of niche. But wait, there’s more! As an added expense, most of the fiction titles available for Cartrivision devices were rental-only, and only non-fiction could be purchased to own. This meant you couldn’t build a catalogue of movies for home use even after you made this huge investment in the machine. Why not just ‘keep’ them, you may ask?

Because the Cartrivision tapes came with a built-in mechanism that prevented home machines from rewinding the rental tapes! Rental tapes, much like Flexplay discs, were denoted by a red cartridge. Unlike Flexplay, you could only play them once. You could pause them, but never go back. The movie studios were worried that Cartrivision could disrupt the movie theater market, and as such the Cartrivision people had to be careful not to make things too convenient to avoid spooking the people providing them their IPs. They were the very first, after all.

Perhaps You Went Too Far

The company discontinued their Cartrivision manufacturing after a little over a year, thanks to poor sales. Users generally don’t want to pay twice for something, and the red tapes were just not convenient enough to warrant buying a specific (and very expensive) TV for a lot of families. Cartrivision then liquidated their stock, but a large number of tapes were destroyed thanks to humidity in one of their warehouses, making them even harder to find today. Cartrivision TVs were suddenly cheap enough for tinkerers to buy and modify themselves, and many did – there are few or no original, mint-condition Cartrivision TVs on the market that aren’t also broken.

Additionally, Cartrivision cartridges come in a strange size, even though the tape inside is fairly standard. They were custom-made for one specific machine, so they could afford to be weird in as many ways as they wanted, but as a result they are incredibly finicky to work with if you don’t have one of Cartrivision’s proprietary players. If you didn’t get a Cartrivision during the liquidation sale, you’d have no reason to buy and preserve their proprietary tapes.

Speaking of the tapes, the company started selling the red tapes during the liquidation, but not the machines used to rewind them – there were fewer of those to start with, anyway. Home Cartrivision fans had to take apart the cartridge and physically rewind the tape themselves to rewatch their content. Magnetic tape is fragile, so this would never be a permanent fix, and it came with the disadvantage of damaging the art on the box to reach the hidden screws that held the case together. Even untouched tapes degrade over time in ideal conditions, getting sticky and brittle inside the case, which makes them unplayable. There are, effectively, no working Cartrivision tapes left – not without a lot of finagling. The people who rescued the NBA game tried everything from freezing the tape to baking it and scrubbing it with special cleaners, and they still had to do quite a bit of digital touch-up even after they got it to play – anything less profitable or historic recorded to Cartrivision tapes alone may very well be lost to time.

Just like Flexplay, the red plastic left behind by Cartrivision is a warning: if it’s not better than what’s already out there, customers aren’t going to go for it.

What is Bluetooth?

Elizabeth Technology January 31, 2023

Specs

Bluetooth behaves a lot like ordinary WiFi, but over much shorter distances. Its frequencies fall between 2.402 and 2.48 GHz – the same general band you might remember from the WiFi article – and its transmissions are fairly low-powered. The max range that most consumer devices can reach is only about 30 feet. This works to its advantage! Bluetooth was designed in a time when rechargeable batteries were either small or high-powered, but never both. As a result, it’s one of the better ways to control peripherals. How many times a year do you have to replace your wireless mouse’s battery, after all?

Bluetooth is currently up to version 5. Version 3 saw major upgrades to speed but no reduction in power draw; version 4 delivered all of that speed for less consumption.

Bluetooth standards are maintained by an outside, not-for-profit body of experts, the Bluetooth Special Interest Group. If something wants to be called Bluetooth, it has to go through them first – they oversee the licensing of the term for businesses worldwide.

The History

Bluetooth and WiFi have a lot in common with each other. The lower bands of their frequencies actually overlap some, and they’re both capable of transmitting a lot of complicated information to and from devices. The very first traces of something Bluetooth-like appeared in 1989, and its first potential use cases were unfortunately places where battery drain held it back – the 1990s were partially known for obnoxiously large mobile phones, and getting any info anywhere without a cord could get expensive, resource-wise. However, the creators didn’t give up! The team that created it was pretty small, but they wanted to see it used widely, and so they made it public – other people could get in on ‘short-link radio’, and its first working version ran between two competitors’ devices, a phone and a ThinkPad.

From there, it’s been everywhere. Bluetooth first saw commercial use in 1999, with a wireless, hands-free headset. The first phone with Bluetooth rolled out in 2001, and Bluetooth version 1 had a top speed of about 721 kbps. That was barely enough for the compressed audio of a phone call to get to the headset and back, and it wasn’t nearly enough for music, but it was still incredible. Hands-free phone calls! Hands-free phone calls!!!

Bluetooth version 2 doubled that speed and also made pairing much easier, and both of these changes made wireless speakers more practical. Version 3 was better still, so much better that it could stream video wirelessly between devices – its data transfer speeds reached up to 24 Mb/s, because it could hand the heavy lifting off to the device’s WiFi hardware. Versions 4 and 5 promise even more – all of this, minus the heavy battery consumption that can come with using WiFi to stream things. As rechargeable batteries improved, so did Bluetooth. Where WiFi failed or was impractical, Bluetooth swooped in on machinery and appliances.
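
As a back-of-the-envelope comparison using the figures above (real-world throughput is lower once codecs and protocol overhead get involved), here’s why version 1 could handle a call but not music:

```python
# Back-of-the-envelope comparison using commonly cited figures; actual
# throughput depends on codecs, protocol overhead, and link conditions.

link_kbps = {
    "Bluetooth 1": 721,        # figure quoted above
    "Bluetooth 2": 721 * 2,    # "doubled that speed", per the text above
    "Bluetooth 3": 24000,      # ~24 Mb/s via the WiFi-assisted high-speed mode
}

payload_kbps = {
    "telephone-quality voice": 64,     # standard 64 kbps telephony audio
    "uncompressed CD stereo": 1411,    # 44.1 kHz x 16 bits x 2 channels
}

for payload, need in payload_kbps.items():
    for version, cap in link_kbps.items():
        verdict = "fits" if need <= cap else "does not fit"
        print(f"{payload} ({need} kbps) {verdict} in {version} ({cap} kbps)")
```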

Today

Bluetooth is still in use everywhere today! Where cords and cables can’t do the job or would be inconvenient, Bluetooth swoops in.

Mice. Keyboards. Radio dongles. Car infotainment systems. Headphones. Speakers. Truly, Bluetooth revolutionized the way people thought about their peripherals, and turned serious, irritating issues with file transfers into minor inconveniences. Ad hoc WiFi was the prior option – setting up and maintaining a connection was very annoying, but the only other options were often cords. And at these short ranges, walls usually don’t stop Bluetooth transmissions!

Better yet, the tech has slowly improved over the years, and now it can transmit at speeds its earliest versions could have only dreamed of. Version 4 transmits at speeds up to 25 Mb/s, on par with version 3 but with much less battery consumption. Bluetooth can stream between devices with very minimal delay, making it a popular choice for soundbars and other similar peripherals.

Flaws (and Fixes)

Much like WiFi, Bluetooth can be interrupted by a microwave oven that isn’t properly shielded. If your headphones begin to act up whenever the microwave is on, it might be time to replace the microwave! WiFi can interfere too – if your devices won’t pair, it could be because the Bluetooth band and the WiFi band are overlapping. Move your device further away from the WiFi router so it can connect, and then move it back once the devices have paired. Bluetooth is also (usually!) very short-range, and most consumer devices only broadcast to about 30 feet. It’s designed to be convenient, not powerful. Still, the newer versions of Bluetooth are able to reach further and further.

Connectivity issues are also unfortunately common. Bluetooth Smart, the low-energy flavor of Bluetooth introduced alongside version 4.0, doesn’t get along with older versions of Bluetooth. Similarly, new versions are backwards compatible, but old and older versions may not be able to communicate if they’re both from before Bluetooth became backwards compatible in the first place.

Getting two compatible devices to communicate can be annoying too! And since so much of Bluetooth is hidden from the user, it’s possible to get stuck in a loop of turning both devices off and back on again to try and get them to connect. That’s especially true for something like a speaker or a pair of headphones – if it doesn’t connect while the other half is actively looking, you’ll have to start over. There’s not really a way around it: Bluetooth can’t be actively looking for a connection forever, and not every device can have a screen for users to monitor the connection’s progress.

All that said, though, Bluetooth is still generally the most convenient wireless option that still delivers quality sound.

Sources: https://www.thoughtco.com/who-invented-bluetooth-4038864

https://www.androidauthority.com/history-bluetooth-explained-846345/

https://www.pcworld.com/article/208778/Wi_Fi_Direct_vs_Bluetooth_4_0_A_Battle_for_Supremacy.html

https://www.techlicious.com/how-to/how-to-fix-bluetooth-pairing-problems/

Is It True Macs Don’t Get Viruses? Short Answer: No!

Elizabeth Technology January 24, 2023

Absolutely not. Here’s why!

Apple devices are slightly harder to weasel into from outside, but that doesn’t mean that it’s impossible. A virus has to be crafted differently to even function on an Apple computer. For the same reason that Apple needs its own version of browsers and games, it needs its own version of viruses, and with Microsoft being the default for most ‘sensitive’ systems, like pharmacies, school networks, and hospitals, hackers and other malicious individuals just don’t seem to care that much about Mac devices.

But not caring that much is not the same as not caring at all.

Apple’s known virus count is slowly creeping up, although viruses that use weaknesses in the system to get in are quickly made obsolete by updates. Apple viruses are a special kind of pain to deal with because the person who made them surely made them out of spite – as said previously, Mac’s system is not compatible with Microsoft’s, so viruses are custom tailored.

Apple’s own recommendation is to completely avoid third-party apps – for good reason. The primary way that malware ends up on the computer is via scam downloads. Those can look like a couple of different things. Everybody (or almost everybody) knows not to click those flashing banners at the top of blog sites that advertise “FREE iPAD! CLICK NOW!”, because those used to be the most common way to steal information from non-tech-savvy people.

“Free Flash Player!” “Free Game! Connect With Friends! Download Now!” are its equally outdated cousins. Anything that tells a Mac user they need to download it has the potential to be a virus, and if the user is unlucky enough to get a virus prepared for a Mac, they’re in for a headache. But it’s tough to trick people with those flashing banners anymore, right? So…

The next easiest way is to fake an email from an app publisher, or even from Apple itself! This still won’t get a lot of people, but the people who fell for the flashing banners the first go-round might fall for an email that looks juuuuust official enough to make them doubt themselves.

One version of this scam involves sending an email with a downloadable attachment to ‘fix’ a ‘virus’ that ‘Apple’ has detected on the device. That’s not Apple, and there’s no virus until the recipient downloads the attachment. That was the goal! And now the virus is on the computer. Oh no!

Alternatively, if you’ve downloaded some game or another that you trusted, even though it was third party, and then received an email about a big patch that needs to be downloaded, you might fall for it! Depending on the game, they could have your email to send patches to, right? Official platforms like Steam certainly have their user’s email.

And that’s not even counting the game download itself! Downloading a game off of third-party websites can lead to some nasty results, which is why Apple goes out of its way to warn you at every step of the download, and also warns you off of third-party downloads in every help forum. Dodging the minor inconvenience of waiting for that game to come out on an Apple-licensed platform just isn’t worth the risk that what you downloaded is malware.

Long story short: it’s very possible, albeit difficult, to get viruses on a Mac computer. Don’t download attachments from strangers!

Source: Apple.com resources

What is RFID?

Elizabeth Technology December 27, 2022

Definitions

RFID stands for Radio Frequency Identification, and it’s usually used in the context of a chip! There are active and passive types: an active RFID chip has a tiny battery with it, while a passive one is powered by the energy of the reader’s signals alone. Active chips can be read from much greater distances, but the battery makes them heavier and more expensive. Meanwhile passive chips have to be blasted with the RFID signal to be read.

How do they work?

RFID chips are great because they’re small, and they don’t take line-of-sight to read like many other cataloguing techs do.

There are three major parts to an RFID chip: the microchip, an antenna for receiving and broadcasting signals, and a substrate to hold it all together. RFIDs work with radio waves, a form of electromagnetic radiation. They actually got their start around the end of WWII, when a Soviet engineer created a passive listening device that was activated and powered entirely by radio waves beamed at it. It wasn’t really the same as what we use in security tags and inventory systems today, but it was a tiny passive device run entirely off the reader’s signal, and that’s close enough! 1973 saw a real attempt at the kind we have today, and ever since, they’ve been shrinking in size.

RFID chips can also come with read-only or read/write memory, depending on the style of that chip. Essentially, it has a very small amount of memory on it, just enough to store things like batch number, serial number, or address, in the case of pet tags. They’re not very complex: in the case of an active tag, the reader simply dings the RFID chip, which then responds on a compatible wavelength with the relevant information via that antenna.

Some chips broadcast constantly, others broadcast on a regular interval, and some wait for the RFID reader to ding them before they send their data. A passive chip has to be dinged hard enough that it absorbs sufficient EM radiation to respond – energy hits the antenna and powers the chip just long enough for it to send its own signal back out through the same antenna to the reader. Neat!
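
Here’s a toy model of that exchange in Python – the classes, field-strength numbers, and threshold are invented for illustration, and real RFID involves modulation schemes, anti-collision protocols, and tuned antennas.

```python
# Toy model of the reader/tag exchange described above; real RFID involves
# modulation schemes, anti-collision protocols, and tuned antennas.

class PassiveTag:
    def __init__(self, serial):
        self.serial = serial

    def interrogate(self, field_strength):
        """The tag only answers if the reader's field delivers enough energy."""
        if field_strength < 0.5:          # arbitrary threshold for the sketch
            return None                   # too far away / too weak to power up
        return self.serial                # harvested energy powers the reply

class ActiveTag(PassiveTag):
    def interrogate(self, field_strength):
        return self.serial                # battery-powered: always answers

reader_field = 0.3
tags = [PassiveTag("BATCH-0042"), ActiveTag("PALLET-7")]
print([tag.interrogate(reader_field) for tag in tags])  # [None, 'PALLET-7']
```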

Utility

An RFID chip’s low profile and small size make it great for inventory management. Since the chip doesn’t need line-of-sight like barcode scanners do, production doesn’t have to worry about keeping items in a certain orientation towards cameras – they can just pass them over an RFID scanner and they’re good to go. Radio waves can pass through solid objects!

The RFID chips are also good at tracking inventory while it’s in the store: you’ll notice many big-box stores have an exit with detectors alongside the doors, which catch unscanned or still-active chips on their way out. The detectors also sometimes trigger on nametags and on items the cashier scanned but left in the cart, but most of the time the system works as intended.

RFID chips are great for livestock and pet chipping – they’re small, and not only are they less painful than a tattoo, the data is also unlikely to migrate or blur like ink could in a pet’s ear. The initial wound is also smaller, which makes infection less likely. That doesn’t mean they’re perfect, but they carry a lot more information for less relative risk to the animal.

On the human side, RFID chips are frequently used in employee identification badges – the theory is that they’re harder to copy and easier to read than a barcode for restricted areas. Some people go so far as to get them implanted, but the ethics of that are… iffy, to say the least, even if they want the implant. The long-term effects in humans just aren’t that well-known, and while pets are a good indicator that nothing should go wrong, pets also don’t have to worry about getting their phone hacked because their pet tag carried a virus along.

RFID chips are now popular in credit cards! The chip in the card is (in theory) safer than the regular magnetic stripe, and it’s supposed to be much harder to copy. Of course, early versions still had their issues, but now they’re difficult to signal from a distance.

Flaws

RFID chips aren’t free from flaws.

Security can be a problem, especially for active chips, which can be read from hundreds of meters away. Most vendors have some sort of protocol in place, but for a hot minute, RFIDs in cards were a potential security nightmare. Remember all those anti-RFID chip wallets? That’s because readers were able to access the chip as though they were being used for a purchase. It just wasn’t very safe before protocols were established.

Second, a bunch of folks went out of their way to prove that the more complex RFIDs could become transmission sites for computer viruses – one researcher had a chip implanted in his hand, and if a virus could infect that chip, it could hitch a ride to anything he connected to wirelessly. The perfect crime! Air-gapped networks were no longer safe if RFIDs were on the table.

Incompatible readers can make inventory transfers more painful than they need to be, as well – the ISO sets standards for which channels get used for which purposes, but companies have to actually comply with them first. They also have to have the right kind of reader – is it scanning for active or passive chips? The two have very different needs. An active reader might not be able to find a passive chip!

There’s also the sticky issue of privacy and destruction. How do you get rid of the tag on a product once it’s no longer needed for inventory? RFIDs can be destroyed by microwaves, but that doesn’t help if they’re attached to an electronic device, which can also be destroyed by microwaves. Wrapping them in a couple of layers of foil stops them from transmitting any real distance, but on some objects that makes the item unusable. It takes special equipment and some professional skill to actually scan a building for RFIDs, but it’s not totally impossible.

It just takes work, the kind of work a stalker or government agent might be willing to put in if they needed info on a person so badly that they’d want to see what items they had in their house. This is also more difficult than it sounds because most chips go by something vaguely specific, like a batch or serial number with no product name attached, but it’s not impossible. It would just take quite a lot of effort when stalking via binoculars is much easier.

It’s also still possible to clone RFIDs – passports with RFIDs in them could be an especially large problem for both the original holder and the government of that country. The obvious targets, credit cards, are still cloneable too, although with modern banking it’s often not worth the investment for the scammers.

However, with tech improving every day, it may become possible to limit which chips respond to which scanners, which would make it much harder to invade someone’s privacy this way. Chips get smaller and smaller every day, so it’s entirely possible a password- or signal-protected RFID may someday come into common use.

Sources:

https://www.researchgate.net/publication/224328848_Impacts_of_RF_radiation_on_the_human_body_in_a_passive_RFID_environment

https://www.atlasrfidstore.com/rfid-insider/active-rfid-vs-passive-rfid

https://electronics.howstuffworks.com/gadgets/high-tech-gadgets/rfid.htm

https://www.reuters.com/article/factcheck-coronavirus-vaccine/fact-check-magnet-test-does-not-prove-covid-19-jabs-contain-metal-or-a-microchip-idUSL2N2N41KA

https://www.reuters.com/article/uk-factcheck-vaccine-microchip-gates-ma/fact-check-rfid-microchips-will-not-be-injected-with-the-covid-19-vaccine-altered-video-features-bill-and-melinda-gates-and-jack-ma-idUSKBN28E286

How Much Thinner Can A Device Really Get?

Elizabeth Technology December 8, 2022

What’s the final goal of these wafer-thin devices?

You’ve likely noticed it in your own electronics – they get thinner. New phone upgrades are thinner, new business or casual use laptops are thinner. Smaller. And it frequently comes at the cost of features.

There was a time when the phone’s camera was flush to the surface of the phone. Now, the phone’s thinner and the camera’s better, but the lens protrudes from the surface. There’s nowhere else for it to go. The plug-in parts of chargers slim down too, so they don’t hold the weight of the phone if it’s lying flat on the table and charging, but it makes the cord weaker.

Thinner has not yet become a problem at this stage.

Feature Loss

The MacBook Pro was fast, cool-looking, cutting-edge, and priced to match.

However… it lost its optical drive, the place you put DVDs and CDs into on most computers. Optical drives are now sold separately, because an optical drive has a minimum thickness, and removing it is the only way to get past that barrier. Folks still need them, though, for things like software that only ships on discs and old movies only available on discs.

HDMI ports are also disappearing off the sides of devices (from all brands). USB ports are being cut, too. Much to the disappointment of photography fans, the SD card is no exception: Apple removed the SD card reader from the MacBook Pro, leaving users stuck with wireless transfers. Supposedly this was done to encourage camera manufacturers to build a more robust wireless system, but it looks an awful lot like ditching useful features in pursuit of a thinner device and selling them back to the customer as adaptors.

Apple’s not the only one falling into this trap, either! Dell laptops with no disc reader leave people with no ability to install older software stored on discs – if you upgrade your computer, you might very well be dragged into upgrading your printer, too, or downloading an emulator to play a game that you used to be able to play on your old computer. Optical readers can be expensive, especially if you spring for Apple’s name brand, on top of the ever-increasing cost of the actual computer.

What’s the goal, here? What is the final, perfect device? Shaving millimeters off at a time surely isn’t worth the cost of a separate optical drive, right? Eating away at features, boosting the price, all while selling the device as thinner?

Material Strength

Aluminum. It’s great. It’s lightweight, it’s an excellent conductor of heat, and it bends instead of cracking when under strain. It’s a popular choice for encasing machinery – planes, cars, and electronic devices all use aluminum or aluminum alloys to get that perfect lightweight strength. However, lately, electronics casings have been getting weaker. The cases bend. They twist. They flex, and put strain on the insides. Why? It’s still the same aluminum – but it’s thinner. The iPhone bending under weight wasn’t purely the aluminum’s fault; the whole thing was thinner, with a bigger screen!

There’s a balance to strike! Nobody wants the brick phone back, but making the laptop into a pancake and removing features just to get thinner is becoming unsustainable. If you’re big into gaming, you might have noticed a recent trend towards mechanical keyboards for desktops – the buttons are big and chunky, and make a satisfying click when pressed. Bigger desktop machines with lights flashing inside to show off the mechanical components are also rising in popularity. Meanwhile, increasingly thin laptops are sacrificing a lot of user experience all for the sake of being thinner than a single-subject notebook.

Not to forget feedback!

People generally like physical feedback – I can tell when I’ve missed a button just by sound on a regular keyboard, but on the slimmest of Macs, where the button barely depresses? Can’t tell. The keys just don’t have enough travel to make a clicking noise. I respect that! It’s incredible engineering, but it’s the kind of thing that drives users insane over time unless they never wanted tactile feedback in the first place. I expect keyboards to be clicky, so when something subverts that expectation, it doesn’t feel right.

At some point, a laptop can’t be as thin as a tablet. There just aren’t enough places left to shave from, and yet computer manufacturers are still trying to get it to a quarter inch thick. It’s making the machines physically weaker, and worse at withstanding the user opening them. Can that paper-thin screen handle me opening it from the corner? It makes me deeply uncomfortable to try. Besides bending, blunt-force trauma is also a bigger threat to thinner, lighter machines – there will be no ‘kinetic therapy’ for these machines. Less aluminum and thinner internal components mean they suffer more when dropped, too.

Please. It already fits in the bag. It doesn’t need to fit in a folder, too.

Sources:

https://www.theverge.com/2013/10/22/4866504/apple-macbook-pro-with-optical-drives-discontinued-but-13-inch-model-remains

https://www.apple.com/shop/product/MD564LL/A/apple-usb-superdrive

https://www.howtogeek.com/204867/how-to-use-cds-dvds-and-blu-rays-on-a-mac-without-an-optical-drive/


How To Break A Hard Drive

Elizabeth Technology December 1, 2022

1. Those Fancy Neodymium (or Rare-Earth) magnets.

Strong magnets can erase credit cards and fuzz (or even destroy) VHS tapes, and degaussing a hard drive with a powerful magnet is actually a commonly advertised way to completely wipe the information from it. A hard drive contains thin glass disks coated in magnetic film that can be modified and read by the head attachment, and it therefore shouldn’t be exposed to other magnets.

As for decorating your PC stand, fridge magnets are probably not strong enough to wipe the machine from the outer wall of the PC stand, but it’s not exactly recommended.

Go for a couple of stickers instead (not over the air intake ports, of course). (As a side note, when I was looking up information for this, I found a lot of freak MRI accidents. Industrial magnets are terrifying!)

2. Dropping the machine.

Don’t do that. Blunt force trauma can break the computer by dislodging components inside it. Most good-quality computers do their best to prevent breakage by just making the machine’s insides a little tougher, using bigger pins or more solder, but at some point fine machinery is fine machinery and dropping it might break it.

The same goes for “percussive maintenance” – a computer wouldn’t last very long in the real world if the user could never get away with tapping it or setting it down a little too fast, but hitting anything inside the PC tower directly, even if you recognize the part, is a major NO. The hard drive especially. Remember, its insides are made of glass!

3. Freezing it.

Freezing the machine is sometimes recommended for hard drive failure, but it’s… not ideal. You know how putting cling-wrap over something that’s still kind of warm will lead to water collecting on the inside of the wrap? The same thing happens when a computer is put in the freezer. Water can collect inside the machine and cause issues.

Besides the risk of condensation, freezing can permanently damage LCD screens, whose liquid crystals don’t tolerate sub-freezing temperatures well. If you’re trying to save an older laptop with this hack, the screen damage alone can match the cost of the hard drive IF the trick even works, which is far from guaranteed. The chance of saving the hard drive just isn’t worth the risk of harming the rest of the computer. Anecdotally, Gillware Data Recovery’s article on the subject says they’ve never seen this trick work in the first place!

4. Cooking it.

On the other end of the temperature range, don’t remove the cooling fan just because it’s too loud. Overheating a computer can lead to hard drive failure (heat makes magnets less magnet-y); a fan that’s too loud is much better than a hard drive that’s gone dead silent, and there are other fixes for a too-loud fan. Dell, a large computer manufacturer, has a troubleshooting guide on the issue.

5. Drowning it.

Getting the machine wet can cause a short circuit, which can then lead to hard drive failure. Don’t balance your drinks on top of your PC stand! It’s not so much the liquid itself as the things dissolved inside it. Chemically pure, laboratory-grade water is actually a pretty poor conductor, but tap water and even regular grocery-store distilled water have some amount of dissolved minerals in them, which are conductors. Not to mention things like soda or juice. Just keep drinks away!

6. Choking it.

Don’t go opening the hard drive seal in a non-dust free environment. It does require lab conditions. Don’t try to DIY hard drive repair in the same garage that’s regularly opened to the outside; getting dust in that part of the computer can cripple or ruin it. Even if you know what the problem is, the machinery in the hard drive is so incredibly fine that dust invisible to the human eye can damage it.

7. Confusing it.

Try not to delete critical software and/or firmware. If you’re going to take advice from strangers online to avoid a computer repair shop, maybe do a little more research after they’ve suggested a solution – sometimes, people give wrong instructions on purpose just because they can. It is also important to note that it’s not impossible to damage hardware with software, or a lack of software.

If you do accidentally delete critical software, don’t keep restarting the machine after it fails to boot. The odds of getting your data back go down every time the computer has to struggle through a failed re-boot after a major failure, according to Data Recovery Labs.

Did Always On Displays Get Good?

Elizabeth Technology November 24, 2022

‘Always On’ for phone displays is a relatively new development. Of course some apps and programs have tried it before, but the cons are numerous: it consumes battery. It can make it hard to tell if the phone screen is off-on or on-on and ready to be unlocked. It used to cause pixel burn-in, where the design on-screen becomes ‘stuck’ there. (In fact, depending on how long you’ve had your phone, you may be able to spot burn-in from your battery and WiFi indicators when you go full-screen on a video on your current device.)

It seems things have changed, and both the iPhone and Pixel are coming out with phones that have optional Always On Display built in. How do they tackle the issues this style of display used to cause?

1) Battery Life

Outside of the brick and suitcase phones, handheld consumer cell phones are as big as they’ve ever been. The screens are huge, many topping 6 inches, and while the devices haven’t gotten much thicker (sometimes even slimming down) the actual components needed for computing keep getting smaller and smaller, freeing up space for the battery.

Ultimately, ‘Always On’ didn’t get much cheaper to run energy-wise, but phone batteries have gotten truly massive in the time between the first notions of it and now. Many new-generation phones boast multiple days of battery life, provided you’re not running Minecraft, location services, and YouTube on said phone at the same time.

Battery life is no longer the limiting factor in Always On settings!

Additionally, Apple has announced that the iPhone’s Always On display will not activate when the phone is face down, when it senses it’s in a pocket, or when a bedtime routine is set. Users can also turn it off in settings if they so choose. This limits how much power the feature actually takes.

2) Is This Thing On?

The Pixel turns off the pixels surrounding the phone’s clock display and keeps the clock itself at a very low brightness. Essentially, only the clock zone and any notifications are actually ‘on’ onscreen, and in most lighting conditions it’s not hard to tell whether your device is off-on (which won’t accept touch to wake it) or on-on (which will). The new iPhone still lights the entire screen when Always On is active, but it shouldn’t be hard to tell the difference as long as your normal in-use brightness isn’t set to the lowest setting.

That said, this is a new feature, and it’s going to take consumers a bit of time to adjust. Notifications on screen are generally interactable, but while the screen is “off-on” (in a state where it would be off under the old display rules), they won’t be.

3) Burn-In

While many screens on all sorts of devices, consumer or not, have improved, it’s tough to say whether they’ll hold up to the demands of Always On without getting some burn-in. The thing causing burn-in is not the screen staying on and displaying pictures – it’s that those pictures don’t change. It’s why the TV displays showing movies in Best Buy and restaurants don’t get burn-in despite being on for far longer than any ordinary consumer would leave them on. Meanwhile, restaurant menu boards aren’t doing that annoying thing where the menu disappears so an animation can play because they think you want that; they’re doing it so that if the menu ever changes, there’s not a ghostly image of “Double Cheeseburger – $6.49” over top of whatever they’ve swapped in. If they just left the menu as it was, it’d burn in. The brighter the screen, the worse the effect usually is.

It will likely come down to how the consumer stores their phone. Some people do leave their phone facing up on a coffee table, and those people are going to be subjecting their phone to Always On for much longer than the people who store their phone in their pocket or face down. Again, it’s too early to tell how susceptible the phones will ultimately be to burn-in, but the odds are better than they’ve been in the past.