
Selling Scams: It’s Online Now, Too!

Elizabeth Uncategorized September 20, 2021

BetterHelp

BetterHelp is notable because it’s one of the few online scams where the Youtubers in question genuinely might not have known better. Therapy is something trained professionals do – how is a layman, especially one who hasn’t been to therapy or only went infrequently, supposed to separate experts from people who just showed up and said ‘I can do that’?

The terms of service state that BetterHelp doesn’t promise to put users in touch with licensed professionals – or professionals at all. The service was aimed at “high-functioning” folks – people with diagnosed illnesses weren’t supposed to turn to BetterHelp for help. Even for the people who could sign up, BetterHelp didn’t promise to match them to the right category of counselor. The whole thing was a mess of “We can help anyone! But we won’t take responsibility if that help isn’t adequate, even though we’ve spent hundreds of thousands of dollars advertising the opposite.” The app was so bad that it asked users to vet the counselors themselves. In what world is that the way things work?

And still, Youtubers promoted it until other Youtubers investigated and discovered some of its many flaws. An unqualified counselor may still help someone whose troubles aren’t too much to bear – that’s basically what friends are for. Anything serious? This was a danger. It was rightfully lambasted as a scam, and many of the people who promoted it had no qualms about denouncing it later.

LootBox Website “MysteryBrands”

I’ve mentioned this before, but the LootBox website was such a perfect example of money ruining everything that I keep harping on it. RiceGum, Jake Paul, and assorted other ‘bro’ Youtubers with large child audiences promoted a website that promised to function like lootboxes. You could even win a house, they claimed. I don’t have any proof that the boxes were rigged, but we do know that when people won something expensive, they often got a cheap knock-off in the mail several weeks later – if they got anything at all. For a website that sold itself on the promise of winning name-brand items, that’s atrocious.

And yet, when asked to apologize or retract their videos on the product, they just… didn’t! A whole complicated mess of contracts and rules goes into making a sponsorship deal, so that explains part of it – they definitely would have been required to return the money if there was an ‘undo’ clause, and that’s a big ‘if’. But taking the deal in the first place (even without knowing that the stuff that showed up wasn’t the real deal) was a bad idea and a bad fit for underage audiences, given what adults should know about gambling.

Jake Paul’s Team 10

The website, now defunct, doesn’t look like much. The original Team Ten, too, is gone. Everyone who was part of it has gone off to do their own thing, for better or worse – many actually did come out ahead thanks to all the free advertising, but many didn’t feel it was worth it. Rather than a scam aimed at followers, this was a rare scam aimed at fellow Youtubers! Team Ten was clearly a marketing stunt or an ego trip for Jake, and while it worked as a ‘social media collective’ for a little while, it spent much longer being completely non-functional.

Team Ten consisted of vloggers Jake brought in to form a team. While he could get them in, he couldn’t get them to stay, and many members faced targeted bullying from other members while living in the Team Ten house. Alissa Violet was kicked out with no warning. The Martinez twins, who didn’t speak English fluently during their time there, were mocked and called racial slurs. Cole Carrigan, a beauty Youtuber, was promised help securing deals in exchange for joining, but ended up with a room and an editor and nothing else out of Jake – a significant downgrade from his previous setup, since at least when he was living in his own house, he could film when he wanted. He was also told to ‘get the f*** out’ via spraypaint on a mural in his room.

Jake Paul implemented fines and weird, strict rules for living in the house, while providing basically no benefit other than marketing and exposure to his fanbase – exposure that always went both ways and could have been accomplished without forcing people he didn’t like to live in his house with him. The house itself was a constant source of chaos between parties, stunt videos, and the constant filming of other members. Every original member of Team Ten except Jake has left. As of 2020, he was planning a relaunch – given that it was 2020, he didn’t get the running start he may have needed.

Assorted Youtuber Success “Lessons”

Drew Gooden, a popular commentary Youtuber, attempted to join Jake Paul’s “Edfluence”, a portmanteau of ‘education’ and ‘influence’. It consisted of a set of videos only available through Jake’s website, and it cost $7 to access. The first one was mostly fluff, and that’s alright – the introduction to many courses often is. However, after the first lesson, Drew discovered that only the first video cost $7; unlocking the rest of the course brought the total to $57, and you could only find that out after paying for and watching the first video. Talk about a bait and switch!

Prince Ea, another prominent Youtuber and social media star, attempted to launch a ‘school’ that was really just a text-message mailing list. The texts were supposed to be life advice, but the kind of thing that fits in a text message is often limited and generic, because the format itself is. It was much more modest in its pricing, at $9 a month, but he claimed he was launching this ‘school’ (again: text messages) to get back in touch with his fans because his upload schedule had been lacking. Twitter could have done that – many people pointed this out. If you like getting generic advice from fortune cookies, but don’t eat takeout often enough to get that advice every day, Prince Ea had the solution for you!

“Lessons for Success” is one of the easiest genres for big Youtubers to produce. It requires no creativity, limited skill, and there’s no punishment for getting it wrong – the other party simply ‘didn’t try hard enough’. You can’t say that about most other kinds of teaching content. Teach a language wrong, and it’s obvious. Teach someone how to fix a toilet, and if it still sounds wrong when they flush, they know you screwed up in your video. Teach people how to make it big, and they don’t… well, maybe they just didn’t have what it took, whatever it was you had that made you big. See?

A lot of getting big on Youtube comes down to the algorithm, and it’s unfair. It just is. Content that took effort is no match for content that can be mass-produced, and things made by channels that do try often blow up for no apparent reason – many third-generation Youtube stars can point to a specific video that hit the front ‘recommended’ page and earned them several times their normal view count. From there, capitalizing on momentum was easy – but getting there is incredibly hard.

Kenza Brushes

Gabbie Hanna is not universally well-liked. Neither is Tana Mongeau. Both are love-it-or-hate-it kinds of people, so their sponsorships tend to look a little strange. Where others get ads from HelloFresh or Blue Apron, they get sketchier stuff.

One deal came from a brand called Kenza Brushes. The Youtubers should have noticed the complaints about the site, which came from many, many people across multiple platforms. They especially should have noticed the complaints about drop-shipping. Drop-shipping is one of those things that is morally shady but not quite illegal yet: the host website offers a product; a drop-shipper lists it on their own website at a mark-up (without actually buying it yet); and if a customer buys the product, the drop-shipper orders it from the host website as though they were the customer.

This adds both time and expense for the customer, yay!

Kenza Brushes was a particularly bad example because the brushes came from a supplier on AliExpress, which hosts many suppliers shipping direct from China – so it can take months for a product to actually arrive at its destination. AliExpress can be great for cheap things, but it’s not what the customer thought they were ordering. People knew this. People had complained online. Gabbie and Tana did not research it at all.

Customers who did get their product noticed immediately how poor the quality was – $70 brushes generally don’t shed fibers. If someone says their product is free except for shipping and handling, then ‘shipping and handling’ covers what the product costs, what the shipping costs, and – finally – profit for the company. An item that costs ten dollars to ship may cost only two or three to actually buy. Gabbie, when questioned about the brushes, essentially said the customers should have known not to trust her.

Louis Vuitton, Allegedly

This one comes from all the way back in 2019. Social media star Alissa Violet teamed up with Cheek, a company that… planned to sell sunglasses? And announced a giveaway for Louis Vuitton bags. Alissa seems to have been the only person they contacted for sponsorship, which was strange right out of the gate. Even worse, the company shut down at some point during the giveaway. Much like Gabbie and Tana above, she herself might not have intended to scam followers, but accepting the sponsorship put her in that position.

Three people won the giveaway, and none of them got their prizes. Two were eventually ruled disqualified (for either their age or location, neither of which was mentioned in the giveaway launch) – but only after heavy, persistent questioning of both Violet and Cheek. The third winner, who was both in the US and of age, was told the bag had shipped, but it never arrived, and Cheek never answered for it.

Sources:

https://www.buzzfeednews.com/article/tanyachen/youtuber-influencer-alissa-violet-accused-of-scam-for-louis

https://www.buzzfeednews.com/article/scaachikoul/gabbie-hanna-youtube-controversies-alanis-morisette

Casey Aonso: “exploring the world of youtuber scams” https://www.youtube.com/watch?v=cAnIbBhOmoI (This is a video – ads are a little excessive, you have been warned)

https://www.team10official.com/

https://www.newsweek.com/jake-pauls-raided-team-10-house-has-been-losing-members-years-heres-why-1523327

https://www.polygon.com/2018/10/4/17932862/betterhelp-app-youtube-sponsorship-controversy-explained

(This is a Tumblr Post, but it does a good job of summing up the contents of other, longer articles and videos)

Bad UI – Button Edition

Elizabeth Uncategorized September 17, 2021

Do you want to make a button that suuuuuucks?

1) Make Sure There’s No Context.

Some buttons are self-explanatory. “Check out now” could only mean the button goes to the checkout page. “View Location” sometimes leads to a map and sometimes to text, but either way, the user knows they’ll gain information about the business’s location when they hit that button. It would be bad practice to make a button lead to something other than its label, and that’s pretty obvious, so it doesn’t happen too often.

Similarly, it’s bad practice to label the button and then have no additional information surrounding it. That happens more often!

“Bowhair Frogs” on a menu with no dropdown options means nothing to viewers who stumbled upon the site. “Download Now” on a page with no information about the actual download seems scammy. The same goes for “Sign up now.” Sign up for what? The end user could guess it’s for a newsletter or something, but it could also be nearly anything else. Sign up for the website, for an account, maybe? Context is crucial. If you want to do a bad job, leave the buttons completely stranded in an enormous blank field, with no relevant instructions or text, and let the user guess what each button does or what it’s for.

2) Make the Options Vague.

I’ve covered this in previous articles, but it’s really hard to overstate how much viewers depend on clear, concise wording to make their choices. Just like context, content matters! A dialogue box that appears after clicking a button is useless if it doesn’t sufficiently elaborate on what the button does. Make your options as clear as you can – ‘more words and clearer’ beats ‘fewer words and not clear enough’.

3) Make the Button Visually Unresponsive to Clicks.

You might have noticed that many buttons simulate being pushed down when you click or tap them. That’s good design! It gives the viewer vital information, by letting them know that the button has indeed registered their input.

If you want to confuse users or accidentally generate multiple inputs, make it unclear whether the button has registered their input. No color changes, no animations, no nothing. The user will know their order has gone through when they get the email, right? That’s what bad designers say. When checking out from smaller websites, you might notice a screen warning you not to click the button again, or not to navigate away from the page until checkout is complete – that’s because this is still a very real issue with payment portals. Impatient users, or users with slow internet, will hit the button multiple times. They’re not necessarily at fault – that’s exactly what you’d do to a real-world button you thought hadn’t responded!

4) Make the Button Very Small.

Experts say the smallest you should make a button is about 10 mm by 10 mm – other common guidelines put the floor at 26 px by 26 px. That’s roughly the area of the tip of an average person’s index finger, and it’s an absolute minimum! Customers viewing your site on mobile literally will not be able to push the button if it’s any smaller – their phone may pick up input around the button and misunderstand what they’re asking it to do. It’s also harder to hit something when you can’t see exactly where it is, and a micro-button is well hidden underneath the user’s finger.

4.5) Put Small Buttons Close Together

One of the things that makes shopping at Amazon fast and smooth is that the buttons are all gigantic and pretty far apart. People new to designing webpages for mobile might assume that putting their smaller yes/no buttons close together makes the mobile experience more pleasant, when in reality, it’s the opposite. People like some amount of spacing between buttons, especially small ones. You can have small buttons, and you can have buttons close together, but too small and too close creates an unpleasant user experience. Pick one!
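That “pick one” rule is easy to turn into a lint-style check. Below is a rough sketch – the gap threshold is an assumption for illustration, and `Rect`, `tooSmall`, `gapBetween`, and `crowded` are made-up names, not a real library – that flags pairs of tap targets which are both undersized and nearly touching:

```typescript
// Rough sketch of a tap-target audit. Thresholds and names are illustrative.
interface Rect { x: number; y: number; w: number; h: number; } // px

const MIN_SIDE = 26; // px – the minimum button size discussed above
const MIN_GAP = 8;   // px – assumed comfortable spacing between small targets

function tooSmall(r: Rect): boolean {
  return r.w < MIN_SIDE || r.h < MIN_SIDE;
}

// Approximate edge-to-edge gap between two axis-aligned rectangles
// (0 if they touch or overlap).
function gapBetween(a: Rect, b: Rect): number {
  const dx = Math.max(0, a.x - (b.x + b.w), b.x - (a.x + a.w));
  const dy = Math.max(0, a.y - (b.y + b.h), b.y - (a.y + a.h));
  return Math.max(dx, dy);
}

// The "pick one" rule: small buttons are fine, close buttons are fine,
// but small AND close together is the combination to avoid.
function crowded(a: Rect, b: Rect): boolean {
  return tooSmall(a) && tooSmall(b) && gapBetween(a, b) < MIN_GAP;
}
```

A pair of 20 px buttons with a 4 px gap gets flagged; make either button bigger, or spread them out, and the pair passes.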

5) Make it Unclear What Options the End User Has

Write the ‘exit’ option in white text, or render the ‘x’ in the upper right corner in very light gray. Don’t give it a background, don’t draw attention to it – wait for the user to sweep over it as they jiggle their mouse in a fit of rage. Similar rules apply to ‘go back’ and ‘cancel’ options. Ads do it all the time! Do you like ads?

You could also make the website’s default page a ‘sign-up’ form, with the log-in option small and at the bottom of the page, so users aren’t sure whether they’re in the right spot to log into their account. Remember, most users don’t like to read, especially when they’re trying to get somewhere fast – sometimes, highlighting something is the only way the end user ever actually sees the option. If you want to irritate or confuse your users, make it unclear where they’re supposed to look!

Twitter does this right: it has the log-in and sign-up buttons on the same page, one right under the other, and they’re the same size. Even though the ‘sign up’ option is highlighted, the button for logging in is directly underneath it, so the user’s eyes are still drawn to approximately the right area. Meanwhile, on platforms with Google’s and Facebook’s newer ‘log in with (PROGRAM)’ buttons, the ordinary log-in button sometimes gets shuffled to the bottom of the pile. You’ll see Sign Up, the spaces for email and password, Log In With Google, Log In With Facebook, and then finally, at the bottom, the regular log-in button – which then takes users to the next page so they can, y’know, log in. This isn’t ideal design, especially if the website is older than the Google option – users may accidentally create two accounts before they realize what they’re doing.

6) Make it Unclear What’s Clickable or Unclickable

Here’s an otherwise great example of a sign-up box. It only appeared halfway through the article, it has both an ‘X’ and an “I’m not interested” button, and it doesn’t take up the entire screen. The only possible issue is that the larger text doesn’t necessarily look like a button – but given the context, it doesn’t really have to look exactly like a button, and it does highlight when I hover over it. This is a good example. I know it’s a button when I interact with it.

How do you do it wrong? Hide text, don’t allow the text to highlight when the user hovers over it, or otherwise obscure its true nature.

You could also write unclickable, uninteractive text in underlined blue for emphasis instead of a plain underline. Is your website’s primary color blue? This is still a bad idea – users understand underlined blue to mean hyperlink, so if you really want to highlight something, do it in your secondary color. You could also give unclickable words a hover-over tooltip, which sounds good for things like word definitions but can get really frustrating on mobile: users see a word, think it leads to another page, and tap it – and if the hover-over is poorly configured, nothing happens. Well-configured hover-overs can still be frustrating on small screens, so nailing the other aspects of your web page is critical if you plan to use them. Make the experience as mobile-friendly as possible, and the desktop experience will follow.

Sources:

https://www.smashingmagazine.com/2012/02/finger-friendly-design-ideal-mobile-touchscreen-target-sizes/

https://medium.com/nextux/design-better-buttons-6b64eb7f13bc

REvil is Over Party

Elizabeth Uncategorized September 17, 2021

REvil, a notoriously tough ransomware, recently had a master decryptor released. This means victims of the ransomware can now decrypt their files without being forced to pay! There is a caveat: the key only works for victims whose files were encrypted before July 13th, which is when the REvil team disappeared – likely due to investigations by law enforcement – leaving many people locked out with no way to pay even if they wanted to. REvil took a very long break after July and only began attacking people again this month, and the new software they’re using post-break isn’t compatible with the key. Still, most victims were hit before July 13th, so this is a massive breakthrough!

Cybersecurity company Bitdefender worked with an unspecified branch of law enforcement to obtain the key, which – when combined with Bitdefender’s software – painlessly decrypts victims’ devices.

The best part is that it’s free!

Here’s the article covering the release, with a link to the free decryptor:

https://www.bleepingcomputer.com/news/security/free-revil-ransomware-master-decrypter-released-for-past-victims/

Safe online travels!

Game Data is Getting Ridiculous

Elizabeth Uncategorized September 15, 2021

Why are game downloads so ridiculously huge?

The first Doom game is famous for how little space it takes up. Because of its absolutely tiny size, almost any device can play it. The latest Doom, by contrast, is approximately 50 GB – a far cry from the games of the past.

The Beginnings

Doom is famously small. When games came on floppy disks, fewer disks meant less overhead expense and a more seamless player experience. It also reduced the risk that something could go wrong. The programming was simple and elegant – textures and sounds were limited, and yet Doom used its few pixels to great effect.

Sonic, another small game (meant for a console this time), famously spent a level’s worth of space on the ‘SEGA!’ opening soundbite. The levels themselves were so incredibly small that a second-long clip of someone saying ‘Sega!’ consumed as much space as one of them. That is insane. Audio today – even the crispest, clearest MP3s around – can no longer say that.

While some ambitious games like Doom were technically 3D, many more were much simpler – Sonic, Metroid, and other 90s games were all 2D, and yet they were difficult and engaging. Plenty of trash games came out alongside them, but the shining stars of 90s nostalgia still hold up to this day. The concepts themselves were fairly new to the world: personal computers and consoles alike were still new consumer products, heavily associated with businesses in the case of PCs and with children in the case of consoles. Customers, therefore, were making room for something new, not accommodating it unconditionally.

The Graphics, and The Next Step: 3D

The next generation of consoles and computers was significantly more powerful, so games could afford to take up a little more space. A little. Still, that little meant things looked very different. The distinct polygonal art seen in Final Fantasy, Banjo Kazooie, and other favorites was the best, least-intensive art studios could make at the time. You’ll notice that shadows are limited and that textures often repeat.

The Nintendo 64 had about 4 MB of RAM, and its cartridges topped out at 64 MB, though many games were much smaller. Articles say that every game released for the 64 could fit together on a single Switch cartridge – and the Switch is underpowered compared to other modern consoles! And yet, so many iconic games come from this era: Ocarina of Time, Mario 64, Banjo Kazooie and Banjo Tooie – the world was an oyster. Other games came out on other consoles, but Nintendo – having watched Atari shoot itself in the foot – didn’t make a habit of producing bad games, or of letting third parties make bad games for them.

Game length, too, was incredible for the limited storage space: polls say Super Mario 64 and Banjo Kazooie both take 11 to 12 hours to beat, and longer to 100%. If a game didn’t provide hours of entertainment, that money might go elsewhere to keep the kid distracted. A small game had a big hill to climb, and it didn’t have a lot of space to do it in, either on the shelf or on the actual console. Games would much rather be long than look great.

Middling

I recently watched a video covering Silent Hill 2, a game for the PS2, which came out only four years after the Nintendo 64. I was very impressed by how good it looked; while the main character was definitely blocky, and the fog and fire effects looked like they were sponged on, the frame rate never dropped, and the pre-rendered cutscenes could have blended in with much younger games. Game storage had moved from cartridges and the occasional floppy disk to the new and much better CD-ROM. Silent Hill 2 was 1.8 GB, and the PS2’s 32 MB of RAM was plenty to run it. Games like those pushed the boundaries of what could be packed onto a disc!

Levels could have more complexity. Silent Hill 2 features constant fog and enemies with sometimes unpredictable behavior. Grand Theft Auto: San Andreas still stands as a worthy open-world entry. The content, too, became more adult as console and PC games proved themselves much deeper than what a quarter got you in the arcade. Halo, for the Xbox (which came out only a year after the PS2), is one of the most revered Xbox games of all time – its lasting cultural impact is the stuff game designers dream about.

The great thing about these games is that even though they took up more space to look nicer, you could still generally tell how long a game was from its file size. People were no longer ‘making room’ for a ‘toy’ – they were creating space for legitimate art and leisure. Computers were more widespread now, although not every family had one. Games were turning into art, into something most people wanted or already had experience with. Consoles, hot Christmas items ever since their conception, started turning games into ‘must-buys’. As such, sloppier games and games that took up a lot of space now had permission to exist. Games no longer had to be so brutally efficient in their coding.

The 2010s

The Xbox 360 entered the market, and LAN was becoming outdated. The console, and the increasing internet speeds of the time, meant players no longer had to get together in person to play together. Gaming consoles and computers could now download large games directly from the internet, where before a disc or cartridge would have been more efficient. This is around the time games started to bloat. Characters looked really good – Grand Theft Auto 4 looked downright realistic compared to San Andreas, and adults who had played both could tell. Games, even on PC, could be really effective sandboxes. The player had room for a whole world now, after all.

Big games were usually still pretty long, but they were also becoming unpredictable. The Darkness, a game I really liked, took up 6.8 GB – Sonic ’06, meanwhile, takes up 5.6 GB. Games could also become boring much faster – unlimited potential and limited handholding meant that a game like Minecraft could be really fun for hours, or get boring in thirty minutes. Storage space was no longer a promise of quality or length.

Even Better Graphics – How Big Is Too Big?

Doom (2016) takes up 50 GB to download. While it is decently long – PCGamer says it took them 20 hours to complete the main quest, and HowLongToBeat says 12 hours – it’s also vastly bigger than the first Doom, which provided somewhere between 5 and 6 hours for just the main quests at a mere 2.39 MB. You don’t have to be a math expert to tell that the ratios are way different. The textures that go into making Doom Guy’s gun now take up more room than the original game ever did. And is it worth it? How many copies of Doom could you actually fit on your computer?

Borderlands 2, another game on both PC and console, takes up 20 GB but provides around 30 hours of entertainment in just the main quest – and Borderlands 2 came out four years before the new Doom. In the time between GTA 4 and GTA 5, between Borderlands 2 and Doom, between Gears of War 3 and Gears of War 4, game studios ballooned all the trappings that come alongside the game, even though the concepts at the core of these games don’t take up any more space than they used to.

That’s not universal, though: puzzle games and games with stylized art don’t take up nearly as much space as the Triple-A open-worlders do. Baba Is You, a puzzle game with timeless graphics, takes up only 200 MB. Hades, a cartoony roguelike by an indie studio, takes up 15 GB – and that one’s from 2020 – yet it can provide many days’ worth of replays complete with story advancement.
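To put numbers on that ratio, here’s the arithmetic using the figures quoted above (a rough sketch – the playtimes are estimates and vary by source):

```typescript
// Back-of-the-envelope: storage per hour of main-quest content.
// Sizes and hours are the figures quoted in the post, not exact specs.
const doom1993 = { sizeMB: 2.39, mainQuestHours: 5.5 };     // midpoint of the 5-6 hour range
const doom2016 = { sizeMB: 50 * 1024, mainQuestHours: 12 }; // 50 GB, HowLongToBeat's figure

const mbPerHour1993 = doom1993.sizeMB / doom1993.mainQuestHours; // ~0.43 MB per hour
const mbPerHour2016 = doom2016.sizeMB / doom2016.mainQuestHours; // ~4,267 MB per hour

const ratio = mbPerHour2016 / mbPerHour1993; // roughly 10,000x more storage per hour
```

Even granting the new Doom twice the playtime, it spends on the order of ten thousand times more disk space per hour of main-quest content than the original did.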

Games stopped getting bigger for levels and started getting bigger for detail. New Halo games are not longer than the old ones, on average, but they are still bigger. The detail of the levels consumes valuable space that gamers with mid-tier rigs might like to save for other things – like other games. Games that don’t hold themselves to hyper-realism with every new generation are finding their job much easier, while the Triple-A studios struggle to justify the expense and space consumption of a game that gets middling reviews on Steam. Triple-A studios have come full circle, beginning to shut themselves out of parts of their market that they’d otherwise be guaranteed.

An unintended side effect is that indie studios are providing much more accessible games. A Triple-A studio is forced to let go of otherwise-guaranteed customers because its game sizes and specs keep pace with top-of-the-line computers, not the mid- and low-tier machines the indie studios are aiming for. Indie games take fewer resources and provide a different experience – and they’re much closer to that original era of gaming, when a spot on a computer was far from guaranteed. Small games can be just as fun and charming as big ones – especially when their smaller size comes down to texture and engine lighting rather than more substantial things like story, gameplay, or AI.

Sources:

https://www.nintendolife.com/news/2020/05/random_every_nintendo_64_game_ever_released_would_fit_onto_a_single_switch_cartridge

https://howlongtobeat.com/game?id=834

https://howlongtobeat.com/game.php?id=9364

https://howlongtobeat.com/game?id=1274

https://gamerant.com/halo-infinite-franchise-how-long-to-beat/

https://store.steampowered.com/agecheck/app/12210/

https://venturebeat.com/2012/09/17/borderlands-2-nearly-perfects-the-blend-of-shooter-and-role-playing-game-review/

Why Can’t You Tape a Cord Back Together?

Elizabeth Uncategorized September 13, 2021

Have you ever wondered why just taping a cut cord back together doesn’t fix it? Well, there are a number of reasons!

“My Phone Only Charges at Certain Angles”

This is one of the most annoying symptoms of a broken or frayed cable, so I’ll put it first. How do you fix it? Unfortunately, the answer is usually replacing the cable, or stopping the damage before it happens. This is a very general article, so I won’t link tutorials – many cords have their own tricks anyway. If you’re looking for a specific tutorial, now’s the time to tune out.

Now, onto the ‘why’!

The area where the cord plugs into the phone receives the most stress. Power outlets are usually down low, and desks or nightstands are up higher. As a result, it’s almost never the plug itself that fails – it’s usually either where the cable meets the charger’s box or where the stiff plastic collar on the plug end meets regular cable.

Engineers have been trying to fix this for years; what we have now is the best available at a low price, at least until wireless charging really gets off the ground. The design is sturdy: without the little plastic collar reinforcing the end, the cord would simply bend the phone’s port. But they can’t reinforce the entire cable, so the next part under the most stress – where the reinforced bit meets regular cable – is the next most likely place to fail. That’s the spot that bends the most when the phone sits right at the edge of the table. The inside of the cable suffers metal fatigue from flexing into the same position day after day, year after year, and eventually some of the copper strands at the cable’s core snap. Pulling on the cable instead of the plastic collar does damage over time as well. When you move your phone in certain ways, the two frayed ends touch again and the connection comes back – sometimes without any visible damage on the outside!

You could try DIY-reinforcing the cable by slapping some cello-tape around the bit that breaks the connection when it moves, but that’s a temporary fix at best (assuming it works at all – it might not!). If the cable is fairly new, the problem might be the port instead: shine a light into your phone’s charging port, and if it looks a little dirty, try some compressed air. Particularly bad cases should go to an expert, though, as the pins inside are easily bent but not easily fixed.

Pins

The pins at the end of the charging cable each have a specific purpose – power, ground, and data each get their own. They can’t all do their jobs over the same wire, so the functions are split across several individual conductors inside the cable.

This is why you can’t just slap tape onto a frayed cable – and it’s also why frayed cables can sometimes still charge but can’t transfer files anymore, or vice versa: only some of the conductors inside have snapped. The best thing to do is prevent fraying in the first place; it mostly comes from material fatigue, i.e. forcing the cable into odd positions over and over. However, Apple chargers sometimes just… do this under regular stresses, unfortunately. In that case, you could purchase some low-temperature heat-shrink wrap and double-reinforce the problem areas! Tutorials are scattered all over the web, so I won’t link a particular one; the ones I’ve used as sources are below, but I’m not endorsing them specifically. I will say to aim for low-temp heat shrink, the kind a hair dryer can set. Anything hotter might damage the charger’s plastic.

As a side note, it's really disappointing that Apple held such a tight monopoly on its Lightning cable, only to let manufacturing standards slip and leave users constantly replacing cables or DIY-ing their own repairs. The outer plastic isn't particularly good either, and the charging head can snap off entirely if it isn't treated delicately. Third-party manufacturers aren't doing much better.

However, phone chargers are not the only cables out there, and many others are in the same position. HDMI, Ethernet, printer, power, and headphone cables can't just be spliced back together casually; all of them are as good as dead once the cord is broken. They follow similar principles for data transfer, with individual wires inside each doing their own job.

Is it possible to fix?

Well, yes, depending on a number of factors. A frayed cable isn't always dead, and sometimes heat-shrink or electrical tape is enough to keep it going for another couple of months. Other cords you really, really shouldn't DIY, especially high-powered ones or ones that lead to delicate machinery. As I said before, the best thing you can do is prevent cables from fraying or snapping in the first place by reinforcing their protective sheaths, but failing that, lower-powered cables do have tips and tricks to get them working again (although you might get better, safer results from a pro).

On bigger cords, or cords to household appliances? Don't touch them! It is technically possible to patch cables together yourself, but with bigger appliances that greatly increases the risk of serious personal injury, fires, and shorts, both in the house's circuit and in your device's. That's assuming you don't stumble at the starting line by accidentally splicing two separate wires together where it counts, the way you might on a phone cable. If you've never done it before, if you doubt your ability to do it, or if you're missing the materials to do it safely, go to an electrician or a tech repair shop. What's cheaper: $70 for a cord repair, or $700 to replace the PC it powers? Plus, the ~danger~ factor!

House power is very dangerous, and electricians are paid well for good reason. Plenty of kids have stabbed a fork into an electrical socket and survived, but the fork isn't carrying the full potential of the shock (and a number of people die doing that anyway). A damaged cable, on the other hand, would be. Never plug in a damaged or broken extension cable: 120 V of house power can channel straight through you if you touch the exposed section while power is flowing.

Flubs

Some people with more skill than wisdom assemble cords for things they think they need. You will never see a mass-manufactured male-to-male three-prong plug, for example: current is meant to flow out of those plugs, not in, and you could burn your house down. You'll also never see a male-to-male USB-A cable, because USB-A ports almost always carry power, and transferring data between computers is far less risky with a simple USB drive or Bluetooth. A male-to-male cable could fry your computer. If the manufacturers wanted you to have access to the forbidden plug, they would have made an adapter for it. Do not make one yourself, even if you have the technical skill to splice two cords together. It will end in a house fire. I'm not joking. You do not need a male-to-male plug.

Sources:

https://www.androidauthority.com/what-is-usb-type-c-594575/

https://acworks.com/blogs/ac-works-connector/male-to-male-extension-cords-adapter-dangers

https://www.techinsights.com/blog/systems-analysis-apple-lightning-usb-cable

Car Screens – Is It a Good Idea, Really?

Elizabeth Uncategorized September 10, 2021

We all know how addictive screens are. And yet, after endless campaigns to get teenagers to stop staring at their screens while driving, we’re introducing cars that practically require it. Why?

The Good

Screens are basically everywhere. They're often a good substitute for analog buttons, as with phone keyboards, and they hold up better in front of the public, as with self-serve checkout screens in grocery stores: easier to clean and more difficult to break.

However, screens and analog buttons don't have to be enemies. Some modern cars come with air conditioning that can be set to an exact target temperature, and the range of possible settings means a digital readout is required. So it gets both a screen and a set of buttons.

Other features necessitate screens if the customer wants them. You don't need a screen for the radio, since buttons could handle all of its functions just fine, but if the customer wants to know the temperature outside, that takes a screen. Besides, a screenless radio would be annoying to select presets on, so radios almost universally come with some sort of screen or channel indicator. Bigger, more complex car radios with screens can show more information about the broadcast, too!

Some features that help with safety and ease-of-driving come with screens too – backup cameras need a screen to function. There’s no way for that feature to exist without a screen somewhere, so it may as well be in the dashboard of the car.

The Bad

That being said…

Some things are better suited to physical inputs. A driver can adjust the volume by simply grazing a hand along the dash until they reach the right knob. Analog buttons with a digital readout are also workable: the driver can both feel the buttons and hear the station change, so they can land on the right one as long as their fingers are on the buttons. Doing the same with no physical feedback requires taking eyes off the road, because otherwise the user can't even tell whether their fingers are on the on-screen buttons.

Extra features that are useful can still be distracting on-screen, so it's not entirely the screen's fault. GPS hooked into the car's screen makes sense; it's safer than looking down at the cupholder or blocking off a slice of vision with a suction-cup phone mount on the windshield. However, typing on one is usually a nightmare because the positioning is awkward, right in the middle of the dash, even when the screen is top-of-the-line responsive. Syncing a phone to use its GPS and sending it to the screen over Bluetooth fixes that problem but creates new ones in its wake. It's even worse when these features live in separate menus, which means spending time navigating just to reach the GPS, the Bluetooth hookup, or anything else. All of that should happen before driving, but isn't it annoying to have to set it all up before even leaving? Flipping through the radio was effortless before screens made it harder than it needed to be.

Deeply Unnecessary and Largely Unwanted

Bizarrely, automakers also offer internet connectivity for purposes well beyond GPS or music. As The Turn Signal points out, the layers upon layers of menus and features offer plenty of distraction in the car, with no hierarchy of features. Radio should take fewer steps to reach than GPS, for example: you won't use GPS on every trip, but the radio is almost always on. Another obvious downside is that if anything goes wrong with the screen itself, you're stuck with whatever settings you had when it broke, and that's really annoying.
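The "hierarchy of features" argument can be sketched in a few lines: features a driver touches most often should sit the fewest taps away. The usage numbers and feature names below are invented for illustration, not taken from any real infotainment system.

```python
# Hypothetical sketch of a feature hierarchy: rank features by how often a
# driver actually uses them, then bury the rare ones deeper in the menus.
# All numbers here are invented for illustration.
uses_per_week = {"radio": 20, "climate": 15, "gps": 4, "massage_seat": 1}

def assign_menu_depth(usage):
    """Most-used features get depth 1 (home screen), then 2, 3, and so on."""
    ranked = sorted(usage, key=usage.get, reverse=True)
    return {feature: depth for depth, feature in enumerate(ranked, start=1)}

depths = assign_menu_depth(uses_per_week)
print(depths)  # the radio lands on the home screen; the massager is buried
assert depths["radio"] < depths["gps"]
```

The complaint in the text is that real systems often fail this simple ordering, putting the always-on radio as many taps deep as features used once a month.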

Part of this isn’t even due to the screens – it’s because the automaker is desperate to stuff as many features as possible into the car. The sheer number of things a car can do now means even if everything were analog, the user would still be glancing down pretty often just to find the right button for the task. Seat warmers, directional AC, GPS, motorized seats, built in chair massagers??, the heater, turbo heating or cooling, the radio, Bluetooth, etc. etc. would all need their own buttons – multiple buttons for each. If automakers were to make these all real, physical buttons, your dashboard would look like something from Star Wars. It’s too late to go back unless the automaker wants to ditch features that other cars (and their previous cars) still have.

Even Worse

Ford announced plans to beam billboard information directly onto the screen via a complicated system of computers and AI. While it's not literally beaming every sign it sees into the car, and it's theoretically possible to shut off, it's still an awfully ugly statement: the dashboard has become advertising space for billboards that used to be ignorable. A big question is how it will interact with other apps on-screen. Does it get priority over the radio, or the GPS? Even assuming that's all sorted out and the customer willingly has the ads open, glancing down at a flash on-screen is a little bit dangerous, is it not? Ford's reasoning is that the consumer may have missed information they could be interested in. If the information is interesting, that's worse! That makes the distraction problem worse! The screens are already horribly distracting as they are, with all their menus and buttons to dig through. Beaming an ad, a thing inherently designed to snatch your attention away from whatever you were doing, directly into the car while the driver is driving is effectively putting revenue above safety. I thought Ford had learned from the Pinto. Apparently not.

Many people jumped on Ford for even suggesting the option, and rightly so. Billboards themselves have gotten into trouble for being too distracting; how beaming them directly into the car was supposed to avoid those same issues is anybody's guess.

And then there are things like games and social media apps built into the system. It's weird anyway, since most people have phones, but whatever. Assuming it has the most basic safety feature built in and won't activate while the car is in drive, what's to stop the driver from shifting into park at every red light to check up on their accounts?

Phones can at least be stuffed into pockets – this screen would have to be disabled.

Sources: https://www.motortrend.com/news/ford-billboard-ad-patent-system/

https://www.theturnsignalblog.com/blog/touch-screens/

https://www.motorbiscuit.com/why-are-automakers-replacing-buttons-with-touchscreens/

https://gizmodo.com/get-ready-for-in-car-ads-1846888390

https://newsroom.aaa.com/2017/10/new-vehicle-infotainment-systems-create-increased-distractions-behind-wheel/

Stop Hyping Autopilot

Elizabeth Uncategorized September 8, 2021

It’s not done yet!!

Tesla's Autopilot is really impressive. It's just not done yet. Between failing to detect real objects and detecting ghost objects, Autopilot has accumulated a lot of genuinely terrifying anecdotes.

A Word of Disclaimer

Tesla does tell users not to climb into the back seat or otherwise take their eyes off the road while Autopilot is driving. The company constantly updates its software to cover edge cases discovered on the road, and that's hard to do if the car never gets to use the feature that's causing bugs. Still, surely some of these user-reported issues could have been caught in a testing environment. Elon Musk's apparent belief that people will die for science is not comforting here.

That said, many of the issues in this article are rare, fringe-case scenarios. They don't represent the cars as a whole; this is more of a warning that you really can't trust Autopilot 100% yet, because users report multiple distinct issues stemming from the software. Nothing most Tesla owners don't already know. Drive without Autopilot, or drive while paying careful attention to it, and a Tesla is as good as any other car.

The irony of testing with cars out in the wild is that a regular car's cruise control is actually less stressful: the driver doesn't have to actively supervise the system itself, because old-style cruise control can't suddenly brake or swerve the car into another lane.

The Brakes, the Reads

Speaking of which, the brakes! A car capable of braking itself can brake its way into an accident in a split second on a busy road if it sees something it thinks is dangerous.

Automatic braking is a cool feature, but it's not done yet. Reddit's Tesla subreddit has numerous accounts of the brakes engaging for little to no reason: phantom animals, suddenly 'seeing' a stop sign on the highway, misinterpreting special vehicles' rear lights, and more. The biggest one is phantom overpasses, where the car reads the overpass's shadow as a reason to stop. (Users say this was an older version of the software, and that newer versions rarely do it unless there are compounding factors, like tow trucks or construction lights. Still not ideal.)

Nature published an article detailing how someone could hypothetically trick the car into seeing a speed limit sign instead of a stop sign and accelerating into an intersection. Specially painting trucks and cars so the AI misinterprets what it's seeing might turn into a great way to cause accidents. The AI is trying its best to spot hazards, but as Nature describes it, AI is often 'brittle': the computer isn't totally sure what it's looking at, so it makes its best guess, and unfortunately its best guess is often pretty bad. A computer's best guess about a food truck with a hot dog on top might be that the truck is actually an overpass, or maybe a deer, while even a small child can tell it's some sort of vehicle. Fringe cases like the hot-dog truck have to be manually added to the computer's repertoire so it doesn't freak out next time, and it has to do this for every instance of a 'hot-dog truck' it doesn't recognize. Dale Gribble's famous ant-van would confuse it too, for example, and it's not hot-dog-like enough for the AI to snap to that memory. It would be starting from scratch every time.
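The 'brittle best guess' problem can be shown with a toy classifier. The labels and scores below are invented: the point is that a softmax-style classifier always emits its top class, even when the scores barely distinguish the options, and a downstream system usually acts on that label without checking how shaky it was.

```python
import math

# Toy illustration of a "brittle" classifier: it always returns its single
# best guess, even when the underlying scores are nearly a three-way tie.
# Labels and scores are invented for illustration.
def softmax(scores):
    """Convert raw scores into probabilities that sum to 1."""
    exps = [math.exp(s) for s in scores]
    total = sum(exps)
    return [e / total for e in exps]

labels = ["vehicle", "overpass", "deer"]
scores = [1.05, 1.00, 0.95]  # almost no separation between the options

probs = softmax(scores)
best = labels[probs.index(max(probs))]
print(best, f"{max(probs):.0%}")  # 'vehicle', at only ~35% confidence

# The hot-dog-truck problem: the system acts on 'vehicle' exactly as
# confidently here as it would on a 99% prediction.
```

Real perception stacks are far more sophisticated, but the underlying failure mode the article describes is this one: a forced choice among known categories, made with no graceful way to say "I have no idea what that is."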

It also occasionally fails to brake or swerve when something really is there. Commenters theorize that the computer is deliberately programmed to ignore objects along its sides so it doesn't panic over the railings and concrete barriers that line highways.

The Lights and Cameras

Tesla's Autopilot is easily confused by wet road surfaces. One user reported that their Tesla couldn't make sense of reflections from signs or wet ground. It would see its own high beams in the reflected light and lower them automatically, then realize it was dark again once past the sign and flip them back on, repeating until the ambient brightness matched what it expected from a dry road with few signs. Unfortunately, that means the car has to reach an area with streetlights or other cars before it figures out the low beams should be on, not the high beams. Or the user can flip them manually, which on some models means turning off Autopilot. Speaking of light, it can't always tell that lights are lights and not more white lines.

It also struggles with overpasses. It doesn't understand bridges, and there are so many bridges, overpasses, and assorted vertical shadow-casters that distinguishing them from a regular stoplight pole is a Herculean challenge. As a result, it often erred on the side of caution until reprogramming fixed some of its confusion.

The built-in monitor can also display what the camera thinks it's seeing, which gives the user some valuable insight into how it works. Once the system pings something as an object, that object exists for good. See this gif of someone driving behind a truck with stoplights on it:

This is a hilarious edge case, and I don't blame the car for not understanding what's happening, but the lights stick to the spot in the road where the Tesla first identified them. Once it's there, it's there: a box or bag in the road that's incorrectly identified might never be re-identified correctly. Of course not! If the Tesla were told to constantly re-ping objects, it might misidentify things it got right the first time, and the more opportunities the programmers give it to do that, the more likely it becomes. Right now, what Tesla has works well in ideal conditions; the struggle is getting all of it to work in the real world.
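The trade-off behind "once it's there, it's there" is simple probability. The 99% per-frame accuracy below is an invented round number, but it shows how re-classifying a correctly identified object on every frame multiplies the chances of eventually flipping a right answer into a wrong one:

```python
# Toy probability behind sticky identifications: re-classifying an object on
# every frame multiplies the chances of flipping a correct label.
# The 99% per-frame accuracy is an invented round number.
per_frame_accuracy = 0.99

def p_ever_wrong(num_rechecks):
    """Chance that at least one re-classification gets it wrong."""
    return 1 - per_frame_accuracy ** num_rechecks

print(f"{p_ever_wrong(1):.0%}")    # classify once: ~1% chance of error
print(f"{p_ever_wrong(100):.0%}")  # re-check 100 frames: ~63% chance
```

Locking in the first answer avoids that compounding error, at the cost of never correcting the misidentifications, which is exactly the behavior in the gif.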

The Hardware

The cameras are great; the issues with Autopilot are purely software-driven. The flash memory used in older models was prone to failure and had to be treated as a warranty item to avoid a total recall, which was rough on users, but otherwise the hardware directly tied to software functions works more or less as advertised. It's the other parts of being a car where Tesla falls down.

It's unfortunate, but the Model S's front axles are prone to deforming. It doesn't happen often enough to warrant a recall, but it happens enough for some disgruntled users to post about it online. Something as simple as driving onto a curb bends the front axle, and the user then starts hearing strange noises around the wheels when turning. Many Tesla superfans attribute these complaints to one guy in Australia harping on the issue, but scattered posts (from various devices, locations, and dates) across the Tesla subreddit and Tesla's own forums suggest the problem is bigger than those superfans (and Tesla) want to believe. Tesla revolutionized electric cars, but it also redid a lot of design work from scratch. Is it really so unbelievable that cars across nearly a decade could suffer from a premature parts failure? It happens to non-electrics all the time!

Design

Also, from a design standpoint, I just... don't think the Cybertruck looks that good. The previous four-door Teslas look great! They're very slick, but they resemble some of the hottest cars on the market: a family car, or a commuter car. They blend in with the pack and only stand out in traffic in good ways, like their lack of noise. The Cybertruck looks nothing like the trucks it's meant to compete with. The sides of the bed are raised so they meet the rest of the body in a nice, straight line. That sure looks cool, but it means the driver can't toss anything of real weight in over the side. That's one of those minor-but-annoying things that wears on owners over time.

The glass is also armored, which is cool, but... what for? Who is driving this? Who's afraid of being hailed on or shot at, yet doesn't want a less conspicuous vehicle? Or the inverse: bougie celebrities with a lot of money and a lot of enemies might want a very conspicuous car with stronger glass. Does the Cybertruck serve them? Kind of... but so do many sports cars.

It's a cool idea, but it's just that: an idea. The truck of the future, not the truck of right now. An electric truck is a great idea! But it looks nothing like other companies' versions of the same concept, so people may be reluctant to jump to Tesla instead of Ford. Differentiation in cars can give you either the VW Beetle or the Pontiac Aztek. Only time will tell how the Cybertruck fares.

Sources:

https://www.tesla.com/cybertruck

https://www.nature.com/articles/d41586-019-03013-5

https://forums.tesla.com/discussion/60330/model-s-axle-problems

https://www.forbes.com/sites/bradtempleton/2020/10/23/teslas-full-self-driving-is-999-there-just-1000-times-further-to-go/?sh=7c7734c32ba6

Internet Of Things: Network Vulnerability

Elizabeth Uncategorized August 30, 2021

 

Internet of Things items are convenient; otherwise they wouldn't sell, at least not next to regular, non-WiFi-enabled versions. But most of them don't actually have to be connected to the internet to work, and they should stay that way!

An Internet of Things item, or IoT item, is a device with a WiFi- or network-enabled computer inside, meant to make the consumer's life easier. This includes WiFi-enabled or networked washing machines, dryers, ovens, fridges, mini-fridges, coffee makers, lamps, embedded lights, and more. Anything can be an IoT item if it has WiFi capability.

 

Network Entry Point

 

Internet of Things items, when connected to WiFi, are the weak link in the chain. They're poorly protected, they're designed to favor user-friendliness over everything else, and they're usually always on. You likely don't unplug your fridge or washing machine when you go to bed; the computer inside may sleep, but it's not off. You probably don't shut off your internet at night, either. Some devices take advantage of this and schedule updates for late at night so you don't notice service interruptions. Unfortunately, their strengths are their weaknesses, and an always-open port is a hacker's dream.

 

Outdated Password Policies

 

Internet of Things items are rarely password-protected, and when they are, many users never change the password from the factory default. That makes them an excellent place to start probing a network for weaknesses!
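An audit for this is trivial to sketch, which is exactly the problem: real botnet code has historically shipped with lists of common factory credentials and simply tried them in bulk. The device names and passwords below are invented, and the default list is a tiny illustrative sample, not any real botnet's dictionary.

```python
# Sketch of a default-credential audit: the same trick credential-stuffing
# botnets use, just pointed inward at your own inventory.
# Device names, passwords, and the defaults list are invented examples.
FACTORY_DEFAULTS = {"admin", "password", "12345", "default", ""}

devices = {
    "coffee-machine": "admin",                          # never changed
    "smart-bulb-1":   "correct horse battery staple",   # actually set
    "thermostat":     "",                               # no password at all
}

def flag_weak(inventory):
    """Return the devices still running a factory-default (or no) password."""
    return sorted(name for name, pw in inventory.items()
                  if pw in FACTORY_DEFAULTS)

print(flag_weak(devices))  # ['coffee-machine', 'thermostat']
```

If a five-line script can find these devices, so can an attacker's first automated pass over the network.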

Suppose someone's hacking into a target to hit it with ransomware. There are plenty of worthy targets: corporate offices, nuclear facilities, hospitals, and so on, all staffed by people, and people like their coffee. A well-meaning coworker bringing in an internet-enabled coffee machine suddenly becomes the source of a critical vulnerability, an open port in an otherwise well-defended network!

If the coffee machine, the vending machine, or the lights are IoT items, they need to be air-gapped from the networks carrying critical data (or cut off from the network completely), the same way outside computers are. These devices simply can't protect themselves the way a PC or phone can; there's no suitable antivirus to download for them. If something gets past the firewall and the password is still default or nonexistent, there is effectively no second layer of protection.

 

Malware

 

Hacking into a fridge is nowhere near as hard as hacking into even an old PC. Even great antivirus software struggles with traffic coming from inside the network, and IoT devices are routinely missed in security checkups. After all, when McAfee or Norton or Kaspersky recommends you scan your computer, are they offering to scan your lightbulbs as well?

Once they're in, the entire network is vulnerable. Ransomware events with no obvious cause, malware that suddenly deletes every file on a server, stolen data, stolen WiFi: all of it is possible through IoT devices. There's more to gain than just bots for a botnet, which is why hackers keep going after these items.

IoT devices are also much easier to overwhelm, even behind firewalls and effective load balancing. DoSing an IoT item can be as simple as scanning it. No, really: a team in the UK found they could shut down turbines in a wind farm just by scanning them, because the computers inside weren't equipped to handle a network scan and their other duties at the same time. Many consumer devices are in the same spot or worse!

 

Security

 

Besides turbines, items like cameras and door locks probably shouldn't be connected to the internet just yet. A terrifying string of hacks let strangers view doorbell and baby-monitor cameras, because the cameras themselves were hard to defend even though the network sat behind a router. This is terrible for obvious reasons, and class-action suits were filed soon after. It even happened accidentally: Nest users would occasionally end up viewing other people's cameras, a bug that was only fixed after complaints. A consistent pattern is forming here: security patches are only issued after consumers discover the vulnerabilities. No other kind of programming would get away with that without public outcry; you shouldn't have to become a victim of a security flaw to get it fixed.

And then there are the things that physically guard the house, like electronic locks. There's nothing wrong in theory with a password lock, but electronics are not inherently more secure than physical locks, and adding WiFi only gives lockpickers another way in. Hacking the lock could mean being locked out of your own home, or worse. Besides, a regular lock will never unlock itself because its battery died, or because you sat on the fob while getting on your bike or into your car. If you do want a password lock, it's better to get one that isn't network-enabled.

We aren’t quite at the point where hacked self-driving cars are a legitimate issue, although the danger is growing on the horizon. Cars are also poorly protected, computer wise.

 

BotNets

 

A fridge doesn't need a quad-core processor and 8 GB of RAM to tell you it's at the wrong temperature, or that the door's been left open and you should check the milk. Voice-controlled lightbulbs only need enough power to cycle through colors. IoT items are weak. That doesn't stop them from being drafted into botnets, though, even if your main PC wards off botnet malware.

Botnets are networks of illegitimately linked computers used for things a single machine can't do alone: DDoSing, brute-forcing passwords, and all other kinds of shenanigans. By combining the computing power of literally thousands of devices, a hacker can turn a fridge into part of a supercomputer. No single ant can sustain an attack on another colony, but an entire swarm can!
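The ant-colony arithmetic is worth making concrete. The guess rate and keyspace below are invented round numbers, not benchmarks of any real device or attack, but they show how a job that is hopeless for one feeble chip becomes survivable for fifty thousand of them:

```python
# Toy arithmetic behind the swarm analogy: one weak IoT chip is useless for
# brute force, but thousands of them split the keyspace into small chunks.
# All rates here are invented round numbers for illustration.
GUESSES_PER_SEC_PER_DEVICE = 1_000   # a feeble embedded processor
KEYSPACE = 26 ** 8                   # passwords of 8 lowercase letters

def hours_to_exhaust(num_devices):
    """Hours to try the whole keyspace with the work split evenly."""
    seconds = KEYSPACE / (num_devices * GUESSES_PER_SEC_PER_DEVICE)
    return seconds / 3600

print(f"1 device:       {hours_to_exhaust(1):,.0f} hours")
print(f"50,000 devices: {hours_to_exhaust(50_000):,.1f} hours")
```

Years of work for one device collapses to an afternoon for a swarm, which is why even the weakest networked gadget is worth stealing.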

This is another reason tech experts worry about IoT items becoming widely used. Their basic vulnerabilities give skilled hackers the ability to hammer well-protected sites and fish for passwords even if the target network has no IoT items on it at all. It's a fleet of weaponizable computers just waiting to be exploited. Remember: password-protect your devices!

Source:

https://eandt.theiet.org/content/articles/2019/06/how-to-hack-an-iot-device/

https://danielelizalde.com/iot-security-hacks-worst-case-scenario/

https://cisomag.eccouncil.org/10-iot-security-incidents-that-make-you-feel-less-secure/

https://www.courtlistener.com/docket/16630199/1/orange-v-ring-llc/

 

Nintendo, and Always Underproducing

Elizabeth Uncategorized August 23, 2021

 

Is it intentional? It's difficult to say. Artificial scarcity, the practice of deliberately limiting the supply of something that would otherwise be easy to make or find, pushes prices up (or keeps them up) with demand. It's pretty common: diamonds, for example, are only expensive because one company owns most of the mines and supplies most of the stores.

Other commodities are limited too, and jaded consumers tend to assume malice over incompetence when it comes to supply chains. But sometimes scarcity isn't artificial. Real scarcity happens all the time, even when it seems possible to just 'make more'. Look at GPUs: many companies make them, and yet they're expensive and hard to find right now, because the manufacturing process is long and complicated. They really are that scarce. So is Nintendo in camp one, or camp two?

 

The NES Classic (2016)

 

The NES Classic was a hit. It was hugely popular and sold out quickly; fans hoping to get their hands on one waited weeks or months after the release date just to place an order. It was a re-release of an older console, one that didn't sell a ton of units in the US but was still quite popular among gamers in the '90s. Nostalgia had fans tripping over themselves to hand that experience to their children, or relive it themselves. Either way, a high-quality remake of an item that was high-quality in the first place generally does alright, and it sold very well in its first weeks!

And then Nintendo stopped making them, effectively doubling the price of the ones being resold. Production didn't even stop all at once; the US stopped getting units before Europe did, for some reason. Naturally, people wanted to know why, and Nintendo never really provided a why, just a promise to announce it if the console ever came back.

 

The SNES Classic (2017)

 

This one was easier to find in person... at first. Learning from the NES Classic, Nintendo ramped up production enough to meet initial demand, only for supply to taper off once again, leaving fans to buy from resellers if they missed the first wave. Vendors didn't seem to know when they would be getting more SNESs, so pre-ordering turned into a nightmare.

Walmart canceled pre-orders after realizing they went live earlier than intended, Target's website crashed, and bots swooped in on pre-orders before humans could reach the listings, meaning the first opportunities to get an SNES Classic were almost entirely consumed by scalpers. People were more or less limited to buying in person, and anyone without a vendor nearby was out of luck. And then production stopped before everyone got their fill! As of this article, the SNES Classic is $270 on Amazon; it launched at about $80. Just like the NES Classic, it disappeared with no explanation and a steep drop-off into non-production.

Combined, the NES and SNES Classic just barely touched 10 million units, a tenth of what the Wii sold. It would have been more if more had been available! The high prices on reseller sites are a hint that more would have sold, yet Nintendo never produced nearly enough to keep up with demand at any point, even with a perfect 'test run' in the unit before. Previous consoles should have acted as a warning. Why didn't they? Perhaps the Wii U really burnt Nintendo on ever making even slightly too many of something.

 

The Wii U (2012)

 

Maybe it was always destined for failure. The Wii U was a more powerful console than the Wii, but the marketing never made clear just how different it was. It might be funny to see the designs other console lines come up with, but they all look so distinct for a reason: the casual end-consumer needs to be able to tell the boxes apart by looks alone. Customers don't like to read, and if grandma heard that her grandson wanted an Xbox 360 while the game shop still stocked the original Xbox, Microsoft had better plaster '360' all over the box so she doesn't grab the wrong one. Different consoles need to look different.

Labeling and styling each console distinctively only helps sales and keeps customers satisfied. The Wii U, meanwhile, looked only very slightly different from the Wii. Customers saw the ads but couldn't tell what was actually different about it. Was it a special edition of the Wii? A software upgrade?

The name might have had something to do with it as well, but I don't blame Nintendo for its naming conventions; Xbox jumped from Xbox to Xbox 360 instead of Xbox 2, so there was precedent. Everything else, though, from the design to the advertising, needed to be a hundred times clearer.

It's ironic that this is the unit that hung around, the only one Nintendo has overproduced in years. I don't think it would have sold out at light speed like the others even if everything had gone right. The CEO took a pay cut (rather than cutting employees) after it missed expectations, which was nice, at least.

 

The Wii (2006)

 

The hot commodity of 2006, the Wii indirectly caused at least one death by overhydration. It sold out across countries and even outsold the PS3! It was difficult to find, sure, but not impossible, just very annoying. And yet, as many critics say, the production-to-sale timeline makes the under-supply seem unintentional.

A few months of limited supply to keep the hype up makes sense; the PS4 did it, the PS5 did it, the Xbox did it. But a timeline of a few years? Consistently undersupplying for years? A little over-demand is good, because it nudges the indecisive buyer to 'get it while they can'. Too much for too long turns the console into a second choice behind another company's version. The Wii was in the fortunate position of being one of a kind, so the hype never really died out; substitutes just didn't exist.

This was an incredibly unique time in console history. The market for casual gamers was narrowing down to two or three consoles, each with its own merits – but one was obviously very distinct. Those casual gamers were willing to wait a minute to get their hands on it. If it had been an Xbox clone, that wait would have killed its sales.

They sold over 100 million units, and the Wii continues to sell on the resale market even today.

The Switch (2017)

The Switch was different from the start. It was meant to compete with other, larger consoles, like the Wiis and unlike the Classics. It was advertised as a family console, like almost all of Nintendo’s consoles are, but it also innovated as a hybrid handheld. The PSP is the first non-Nintendo handheld that comes to mind, and that’s obviously ancient. It sold well, and there were rough patches where supply ran lower than demand, but it wasn’t months of waiting like it was for the Wii.

And then Covid-19 strikes, and the world shuts down. The Switch has already amassed several very good games – Breath of the Wild and Animal Crossing: New Horizons would have secured many purchases all by themselves, but new Mario games and assorted others only sealed the deal. It’s a really good console with a really good selection. Of course it sold out, people who wouldn’t have looked into it otherwise were now starved for entertainment, and the Switch provided.

Given the gap between the first peak of demand at launch and the second peak at the first lockdowns, it’s not surprising they sold out – who could have anticipated demand climbing as high as it did that year?

What’s happening?

Again, it’s difficult to say. The NES Classic and the Wii spent months out of stock, only for the Wii U to spend months at a discount because Nintendo overproduced it after launch. Each individual console has its own unique reasons for being out of stock or difficult to get.

Many of Nintendo’s successful consoles arrive during a time of industry upset. The Switch might not have gone missing for months if Covid hadn’t struck, and the Wii might not have sold so fast if it hadn’t been the first commercially successful motion-sensing console.

Ultimately, aside from the cases where Nintendo literally stops all production, Nintendo consoles are undersupplied, yes – but artificially scarce? I don’t actually think so. I think they genuinely struggle with forecasting demand and organizing higher output. The consoles get to the shelves eventually; a swarm of re-sellers snapping up the stock hurts the consumer the most and Nintendo second-most. Customer complaints are surely getting back to Nintendo, and when a console takes a particularly bad ‘dip’, the one right after it generally has a better, more accessible supply: the Wii vs. the Wii U, the NES Classic vs. the SNES Classic. I think Nintendo focuses so hard on top-to-bottom optimization that they no longer allow for Wii U scenarios – they’ve simply accepted negative reviews of the company based on quantity. Coming up short is, after all, much less costly than having too many units.

Sources:

https://www.forbes.com/sites/insertcoin/2017/05/01/a-switch-story-that-should-kill-the-myth-of-nintendos-artificial-scarcity/?sh=49e892233fbf

https://arstechnica.com/gaming/2017/06/nintendo-switch-shortages-are-definitely-not-intentional/

https://www.forbes.com/sites/insertcoin/2017/04/18/nintendo-confirms-nes-classic-discontinuation-in-europe-but-leaves-the-door-open/?sh=187a4eff2b3a

https://www.nintendoworldreport.com/editorial/44492/the-nes-classic-and-the-company-that-misunderstood-basic-economics

https://www.theverge.com/2018/12/14/18140909/nintendo-nes-snes-classic-console

https://www.ign.com/articles/2018/10/31/combined-nes-and-snes-classic-sales-surpass-10-million-units-amiibo-hits-50-million-figures

Radiation’s Effects on Media

Elizabeth Uncategorized August 17, 2021

What Kind?

Radiation is not limited to radioactive materials. Electromagnetic radiation from the sun touches nearly all life on Earth. Microwave ovens generate microwave radiation, and heating pads generate infrared radiation. Not all radiation is bad, but some can be dangerous. Non-ionizing radiation includes the visible light spectrum and the bands below it, while ionizing radiation includes part of the UV band and everything above it. The difference between the two is that ionizing EM radiation has enough energy to ionize something – when it strikes an atom or molecule, it can knock an electron loose. Ionizing radiation has many applications, and not all of them are scary – the same UV radiation that can cause skin cancer can also sterilize equipment without chemicals.
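The line between the two comes down to photon energy, which climbs as wavelength shrinks. Here’s a rough back-of-the-envelope sketch – the ~10 eV cutoff is an illustrative round number, not a hard physical constant, and the function name is mine, not any standard API:

```python
# Photon energy E = h*c/wavelength decides whether EM radiation can ionize.
# The ~10 eV threshold below is a rough illustrative cutoff.
PLANCK_H = 6.626e-34   # Planck's constant, J*s
LIGHT_C = 2.998e8      # speed of light, m/s
EV_PER_J = 6.242e18    # electron-volts per joule

def photon_energy_ev(wavelength_nm: float) -> float:
    """Energy of a single photon, in electron-volts."""
    return PLANCK_H * LIGHT_C / (wavelength_nm * 1e-9) * EV_PER_J

for name, nm in [("red light", 700.0), ("violet light", 400.0),
                 ("UV-C", 100.0), ("X-ray", 0.1)]:
    energy = photon_energy_ev(nm)
    kind = "ionizing" if energy > 10 else "non-ionizing"
    print(f"{name} ({nm} nm): {energy:.1f} eV, {kind}")
```

Red and violet light land around 1.8 and 3.1 eV – nowhere near enough to strip an electron – while short-wavelength UV and X-rays clear the threshold easily.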

Ionizing radiation from radioactive decay includes alpha particles, beta particles, and gamma rays. The first two can be stopped with relatively little difficulty – alpha particles can be stopped by distance, or by something as thin as a sheet of paper. Beta particles are lighter and more penetrating, so they take a thicker shield; a car door might protect you from beta particles. Gamma rays, which are the highest-energy kind of EM radiation, are where lead casings and thick concrete become necessary.
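Gamma shielding is less of a wall and more of a dimmer: each layer of material absorbs a fixed fraction of whatever reaches it, following the exponential law I = I₀·e^(−μx). A minimal sketch – the attenuation coefficients here are rough, illustrative values for photons around 1 MeV, not reference data:

```python
import math

# Exponential attenuation: fraction of gamma rays that pass through a shield.
# Coefficients are rough illustrative values for ~1 MeV photons.
MU_PER_CM = {"lead": 0.77, "concrete": 0.15}  # linear attenuation, 1/cm

def fraction_transmitted(material: str, thickness_cm: float) -> float:
    """I/I0 after passing through `thickness_cm` of the material."""
    return math.exp(-MU_PER_CM[material] * thickness_cm)

for material in ("lead", "concrete"):
    for thickness in (1.0, 5.0, 10.0):
        frac = fraction_transmitted(material, thickness)
        print(f"{thickness:>4} cm {material}: {frac:.2%} transmitted")
```

A few centimeters of lead cuts the intensity dramatically but never to literally zero, which is why "attenuated, not stopped" is the right mental model for gamma shielding.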

How long does media last if exposed to ionizing radiation?

Film

Photosensitive film exposed to radiation becomes speckled, and picks up a ‘fog’ after long-term exposure. Film cameras work by exposing photosensitive chemicals on a backing to light – if something is ‘overexposed’, the photographer let in too much light, or let it in for too long, causing lightly-colored areas to look white and dark areas to appear much brighter than they did in person; vice-versa for underexposure. Getting the right amount of radiation to the film is what gives you that perfect picture! But that’s all non-ionizing radiation – what about ionizing?

Ionizing radiation penetrates the plastics surrounding the film (camera, film canisters, etc.) and excites the film’s chemical coating in the same way light does, leading to strange white speckles across the film alongside a general ‘fog’ that worsens the image’s quality, according to NASA’s tests. The good news is that the effect stops getting worse after about 4 months of low-grade exposure (NASA exposed the film to the radiation you’d get up in space, unprotected by the atmosphere)!

Interestingly enough, this effect is how Kodak found out the US was testing nuclear devices in the 1950s. Fallout contaminated the rain and rivers used by the mills that produced the strawboard Kodak used to ship its film, so by the time that film got where it was going, it was already partially exposed and therefore ruined.

VHS Tapes

VHS is a tape-based medium – but unlike photographic film, the information is stored magnetically, as an analog signal on magnetic tape. Magnetic tape greatly increased storage capacity, and some of the biggest data archives still run on magnetic tape today. However, that doesn’t mean it’s better-equipped to survive radiation!

Radiation and magnets have a complicated relationship, but long-term (or short-term, high-dose) exposure to ionizing radiation can corrupt magnetic media, and the effect is only magnified the smaller the magnetic domains are. Even powerful magnets can slowly demagnetize when exposed to radiation over the long term! VHS is notoriously delicate as a storage medium anyway, and shares many of the same issues regular film does – the thin plastic used in both hates a lot of things. Heat, cold, too long in the sun (which is mostly non-ionizing radiation!), too long without being played, being played too many times – you name it, VHSs can rot or degrade from it.

CDs/DVDs

CDs and DVDs work differently than magnetic media – the data is stamped or burned into the disc, and a laser reads it back off. Magnets are totally uninvolved, and chemicals only minimally. Testing done by the USPS showed that sterilizing radiation doesn’t ruin the information on the disc! However, the plastic of the disc as well as its box started smelling burnt and turned a funny color during these tests. The information could still be accessed, though, making it a major upgrade over the VHS’s loss of quality upon exposure.

Electronics (and Modern Storage)

Radiation, specifically gamma radiation, makes complicated electronics stop working. HBO’s “Chernobyl” series recreates the events of the famous power plant’s real-life meltdown and its resulting consequences. One scene shows a helicopter passing directly above the glowing column of radiation created by the exposed reactor materials, and then crashing because it lost power. Further into the series, we see three young men go back into the plant only for their regular flashlights to fail.

Why? Well, they were basically hit with an EMP!

A beta particle is essentially a rogue electron or positron – it ionizes wherever it hits, although its easy-to-stop nature means it isn’t much of a threat unless you’re already too close. Gamma rays, on the other hand, are a much bigger problem, and travel much further. A large burst of them like the kind released by a nuclear detonation, and the slow, steady kind released by an event like Chernobyl, are both really bad news for electronics.

Location and dose are critical factors in how the device fares. Would it start working again once removed from the source? Possibly! But because gamma radiation changes the physical attributes of the device wherever it strikes, recovery becomes less and less likely the longer the device spends next to the radiation source. Nuclear Engineering International states that long-term exposure or big, one-time doses can damage transistors into non-functionality even after they’re removed from the source. As mentioned before, radiation can also cause bit-flips, which corrupt stored data – if the radiation can get through the casing, the data is at risk. Especially dense memory is especially vulnerable if unshielded, and that includes flash memory. Solid State Drives and Flash Drives are resistant to physical impact, but they aren’t especially resistant to radiation – they store data as trapped electrical charge in tiny floating-gate cells, and ionizing radiation can disturb that charge.
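A single-event upset is easy to model in miniature: one stray particle deposits enough charge to toggle one bit, and the stored value silently changes. A toy sketch (the function is mine, purely for illustration):

```python
# Toy model of a radiation-induced single-event upset: one bit in a
# byte buffer gets toggled, silently corrupting the stored data.
def flip_bit(data: bytes, byte_index: int, bit_index: int) -> bytes:
    corrupted = bytearray(data)
    corrupted[byte_index] ^= 1 << bit_index  # XOR toggles exactly one bit
    return bytes(corrupted)

original = b"HELLO"
damaged = flip_bit(original, 0, 1)  # bit 1 of 'H' (0x48) -> 'J' (0x4A)
print(original, "->", damaged)      # b'HELLO' -> b'JELLO'
```

This is why radiation-tolerant systems lean on error-correcting codes (ECC memory) and redundancy – a single flipped bit can then be detected and repaired before it snowballs.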

However, that doesn’t mean it’s impossible to protect electronics from radiation. You can harden devices against radiation just by picking a different casing, since both alpha and beta particles are fairly easily stopped with appropriate shielding. Gamma radiation is tougher to stop entirely, but it can be attenuated. If the device was powered down at the time of exposure, its odds of surviving go up even further.

Sources:

https://www.neimagazine.com/features/featureprotecting-electronics-from-the-effects-of-radiation-5890599/

https://nepp.nasa.gov/docuploads/392333B0-7A48-4A04-A3A72B0B1DD73343/Rad_Effects_101_WebEx.pdf

https://www.popularmechanics.com/science/energy/a21382/how-kodak-accidentally-discovered-radioactive-fallout/

https://www.nytimes.com/2010/09/14/science/14atom.html

https://www.clir.org/pubs/reports/pub121/sec5/

https://hps.org/publicinformation/ate/q11162.html

https://www.slac.stanford.edu/econf/C010630/papers/T207.PDF