Stop Hyping Autopilot

It’s not done yet!!

Tesla’s Autopilot is really impressive. It’s just not done yet. Between failing to detect real objects and detecting ghost objects, the new Autopilot has racked up a lot of genuinely terrifying anecdotal cases.

A Word of Disclaimer

Tesla does tell users not to climb into the back seat or otherwise take their eyes off the road while Autopilot is driving. They’re constantly updating their software to cover edge cases discovered on the road, and it’s really hard to do that if the car never gets to use the feature that’s causing bugs. Still, I suspect at least some of these user-reported issues could have been caught in a testing environment instead. Elon Musk’s consistent belief that people will die for science is not comforting in this situation.

However, many of the issues in this article are rare, fringe-case scenarios. They don’t represent the cars as a whole; this is more of a warning – you really can’t trust Autopilot 100% yet, because users report multiple different issues stemming from the programming. Nothing most Tesla owners don’t already know. Drive without Autopilot, or drive while paying careful attention to it, and a Tesla is as good as any other car.

The irony of using customer cars out in the wild as a ‘test’ fleet is that a regular car’s cruise control is actually less stressful: the driver only has to watch the road, not the system, because old-style cruise control can’t make the car suddenly brake or swerve into another lane.

The Brakes, the Reads

Speaking of which, the brakes! A car that can brake itself can also brake itself into an accident in a split second on a busy road, if it sees something it thinks is dangerous.

This is a cool feature, but it’s not done yet. Reddit’s Tesla subreddit has numerous accounts of the brakes engaging for little to no reason: phantom animals, suddenly ‘seeing’ a stop sign on the highway, misinterpreting special vehicles’ rear lights, and more. The biggest one is phantom overpasses, where the car reads the overpass’s shadow as a reason to stop. (Users say this was an older version of the software, and that newer versions don’t do it as much unless there are other, compounding factors, like tow trucks or construction lights. Still not ideal.)

Nature released an article detailing how someone could hypothetically trick the car into seeing a speed limit sign instead of a stop sign, and get it to accelerate into an intersection. Specially painting trucks and cars so that the AI misinterprets what it’s seeing might turn into a great way to cause accidents. The AI doing the seeing is trying its best to look for issues, but as Nature describes it, AI is often ‘brittle’. The computer is never totally sure what it’s looking at, so it makes its best guess. Unfortunately, its best guess is often pretty bad. A computer’s best guess as to what a food truck with a hot dog on top is might be that the truck’s actually an overpass, or maybe a deer, while even a baby human can tell it’s some sort of vehicle. Fringe cases like the hot-dog truck have to be manually added to the computer’s repertoire so it doesn’t freak out the next time it sees one. And it has to do this for each instance of a ‘hot dog truck’ it doesn’t recognize. Dale Gribble’s famous ant-van would confuse it too, for example, and it’s not hot dog-like enough for the AI to snap to that memory. It would be starting from scratch, every time.
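To make the ‘best guess’ problem concrete, here’s a toy sketch in Python – emphatically not Tesla’s real code, just a five-label classifier I made up – of why argmax-style classification is brittle:

```python
import math

LABELS = ["car", "truck", "overpass", "deer", "pedestrian"]

def softmax(logits):
    # Turn raw scores into probabilities that sum to 1.
    exps = [math.exp(x) for x in logits]
    total = sum(exps)
    return [e / total for e in exps]

def classify(logits):
    # The "best guess": argmax over the probabilities, no matter how weak.
    probs = softmax(logits)
    best = max(range(len(probs)), key=probs.__getitem__)
    return LABELS[best], round(probs[best], 2)

# A clear image of a truck: the model is confident, and all is well.
print(classify([1.0, 6.0, 0.5, 0.2, 0.1]))   # ('truck', 0.98)

# A hot-dog truck it has never seen: every label scores badly, but
# argmax still returns one, and 'overpass' wins by a hair.
print(classify([1.1, 1.0, 1.3, 1.2, 0.4]))   # ('overpass', 0.26)
```

In the second case the model is barely a quarter sure of its answer, but any downstream code that only sees the winning label has no way to know that.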

It also occasionally fails to brake or react when there really is something there. Commenters theorize that the computer is deliberately programmed to ignore things along its sides, so it doesn’t freak out about the railings and concrete barriers that run alongside highways.
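If the commenters are right, the logic might look something like this sketch (pure speculation on my part – the lane width and detection format are invented for illustration), and the failure mode falls right out of it:

```python
# Hypothetical filter based on the commenters' theory: drop detections
# that sit far enough to the side that they're probably guardrails or
# concrete barriers.
LANE_HALF_WIDTH_M = 1.8

def relevant_detections(detections):
    # Each detection is (label, lateral offset in metres from lane centre;
    # negative = left, positive = right). Anything outside the lane is
    # assumed to be roadside furniture and ignored.
    return [d for d in detections if abs(d[1]) <= LANE_HALF_WIDTH_M]

seen = [("car", 0.2), ("barrier", 3.5), ("guardrail", -4.0), ("deer", 2.1)]
print(relevant_detections(seen))   # [('car', 0.2)] -- the deer gets ignored too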

The Lights and Cameras

Tesla’s Autopilot is easily confused by wet road surfaces. One user reported that their Tesla couldn’t cope with reflections from signs or wet ground. It would see its own high-beams in the reflected light and lower them automatically. Then it realizes it’s dark again once it’s past the sign, so it flips them back on. It keeps doing this until it hits a continuous level of darkness or brightness in line with what it expects from a dry road with few signs. Unfortunately, that means the car has to make it to an area with streetlights or other cars before it figures out the low beams should be on, not the high beams. Or the user can flip them manually, which on some models means turning off Autopilot. Speaking of light, it can’t always tell that lights are lights and not more white lines.
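Here’s a guess at why that happens, as a toy Python sketch (assumed logic, not anything Tesla has published): a single brightness threshold toggles the beams the instant a reflection crosses it. The textbook fix is hysteresis – separate on/off thresholds plus a hold time – which is presumably what later software tuning amounts to.

```python
def naive_auto_beams(brightness_samples, threshold=0.5):
    # One threshold: bright scene -> low beams, dark scene -> high beams.
    return ["low" if b > threshold else "high" for b in brightness_samples]

# Driving past a reflective sign at night: two dark frames, the sign's
# glare (our own high beams bouncing back at us), then darkness again.
scene = [0.1, 0.1, 0.8, 0.9, 0.1, 0.1]
print(naive_auto_beams(scene))
# ['high', 'high', 'low', 'low', 'high', 'high'] -- the beams flap
# every time a sign goes by, exactly as the user described.
```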

It also struggles with overpasses – it doesn’t understand bridges, and there are so many bridges, overpasses, and assorted vertical shadow-casters that distinguishing them from regular stoplight poles is a Herculean challenge. As such, it often erred on the side of caution until reprogramming fixed the worst of its confusion.

The built-in monitor can also display what the camera thinks it’s seeing, which gives the user some valuable insight into how it works. Once it pings something as an object, that object is there to stay. See this gif of someone driving behind a truck with stoplights mounted on it:

This is a hilarious edge case, and I don’t blame the car for not understanding what’s happening, but the lights stick to the spot in the road where the Tesla first identified them. Once it’s there, it’s there – a box or bag in the road that’s incorrectly identified might never get re-identified correctly. Of course not! If the Tesla were told to constantly re-ping everything, it might misidentify things it got right the first time, and the more opportunities the programmers give it to do that, the more likely it is to happen. Right now, what Tesla has going on is good for ideal conditions. The struggle is getting all of it to work in the real world.
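Here’s that trade-off as a toy sketch (hypothetical – I have no idea how Tesla’s tracker is actually structured): once a track exists, keeping its first label is the cheapest way to stop good labels from flickering into bad ones, at the cost of never fixing a bad first guess.

```python
class Track:
    # Hypothetical tracked object: the first classification wins, and
    # later frames only update position, never the label.
    def __init__(self, label, position):
        self.label = label
        self.position = position

    def update(self, new_label, new_position):
        self.position = new_position   # position is refreshed...
        # ...but new_label is deliberately ignored, so a confident
        # early mistake never gets corrected.

truck_lights = Track("traffic light", position=(120, 0))
truck_lights.update("truck", new_position=(90, 0))
print(truck_lights.label)   # still 'traffic light'
```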

The Hardware

The cameras are great. The issues with Autopilot are purely AI-driven. The flash memory used in older models was prone to failure and had to be treated as a warranty item to avoid a total recall, which sucked for users, but otherwise, the hardware directly tied to software functions is more or less working as advertised. It’s the other parts of being a car where Tesla falls down.

It’s unfortunate, but Tesla’s Model S front axles are prone to deforming. It doesn’t happen quite often enough to warrant a recall, but it happens enough for some disgruntled users to post about it online. Something as simple as driving up onto a curb bends the front axle, and the user then starts to hear strange noises from around the wheel area when they turn. Many Tesla superfans attribute these complaints to one guy in Australia harping on it, but scattered posts (from various devices, locations, and dates) across the Tesla subreddit as well as Tesla’s own forums suggest this is a bigger issue than those superfans (and Tesla) want to believe. Tesla revolutionized electric cars, but it also re-did a lot of design work itself, from scratch. Is it really that unbelievable that cars spanning nearly a decade could be suffering from a premature parts failure? It happens to non-electrics all the time!

Design

Also, from a design standpoint, I just… don’t think the Cybertruck looks that good. The previous four-door Teslas look great! They’re very slick, but they look a lot like some of the hottest cars on the market – a family car, or a commuter car. A Tesla blends in with the pack and only stands out in traffic in good ways, like its lack of noise. The Cybertruck looks nothing like the trucks it’s meant to compete with. The sides of the bed are raised so they meet the rest of the body in a nice, straight line. That sure looks cool, but it means the driver can’t toss anything of actual weight in over the side. That’s one of those minor-but-annoying things that grates on owners over time.

The glass is also armored, which is cool, but… what for? Who is driving this? Who’s afraid of getting hailed on or shot at, yet doesn’t want a less conspicuous vehicle? Or, the inverse – bougie celebrities with a lot of money and a lot of enemies might want a really conspicuous car, but with stronger glass. Does the Cybertruck deliver that? Kinda… but so do many sports cars.

It’s a cool idea, but it’s just that – an idea. The truck of the future, not the truck of right now. An electric truck is a great idea! But it doesn’t look anything like other companies’ versions of the same concept, so people may be reluctant to jump to Tesla instead of Ford. Differentiation in cars can give you either the VW Beetle or the Pontiac Aztek. Only time will tell how the Cybertruck fares.

Sources:

https://www.tesla.com/cybertruck

https://www.nature.com/articles/d41586-019-03013-5

https://forums.tesla.com/discussion/60330/model-s-axle-problems

https://www.forbes.com/sites/bradtempleton/2020/10/23/teslas-full-self-driving-is-999-there-just-1000-times-further-to-go/?sh=7c7734c32ba6