These days, modern cars are just as much high-tech computers as they are mechanical devices. A new car has dozens of sensors, cameras, and processors to control everything from the cabin temperature to emergency braking. Going one step further, cars with autonomous driving features, such as Teslas, have even more technology packed inside them.

The thing about smart computers, though, is that they can be surprisingly dumb.

Researchers at McAfee released a video this week showing off what they call “model hacking.” They describe it as a study of how someone could intentionally “target and evade artificial intelligence.” In a fairly simple test, they vandalized a normal “35 MPH” speed limit sign in an effort to trick Tesla’s systems. And it worked. A little too easily, maybe.

How To Turn 35 Into 85

The McAfee group focused on older-model Teslas that used the MobilEye camera system, which Tesla hasn’t used since late 2016. Regardless, a small piece of black tape across the middle section of the number ‘3’ was enough to throw off the Tesla’s decision making.

Source: McAfee

The altered sign caused Tesla’s Speed Assist and Automatic Cruise Control to assume the speed limit had drastically risen to 85 MPH. As you can see in the video, the car accelerates quickly. In a real-world situation, having your car decide to speed up dramatically in a 35 MPH zone could be disastrous.

“Even to a trained eye, this hardly looks suspicious or malicious, and many who saw it didn’t realize the sign had been altered at all,” the researchers wrote. “This tiny piece of sticker was all it took to make the MobilEye camera’s top prediction for the sign to be 85 mph.”
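McAfee hasn’t published MobilEye’s internals, so the exact model is unknown, but the failure mode can be sketched with a toy classifier. In the hypothetical example below (not MobilEye’s actual system), a nearest-template digit reader based on seven-segment shapes flips its top prediction from “3” to “8” after a single simulated strip of tape fills in the left-hand segments:

```python
# Toy illustration of "model hacking": a nearest-template digit reader
# (NOT MobilEye's real model) flips its top prediction after a tiny
# alteration, much like the tape that turned a "35" sign into "85".

# Seven-segment encodings: which segments (a-g) are lit for each digit.
TEMPLATES = {
    "3": {"a", "b", "c", "d", "g"},
    "5": {"a", "c", "d", "f", "g"},
    "8": {"a", "b", "c", "d", "e", "f", "g"},
}

def classify(lit_segments):
    """Return the digit whose template differs in the fewest segments."""
    return min(TEMPLATES, key=lambda d: len(TEMPLATES[d] ^ lit_segments))

clean = set(TEMPLATES["3"])       # an unaltered "3"
tampered = clean | {"e", "f"}     # "tape" fills in the left-hand segments

print(classify(clean))     # -> 3
print(classify(tampered))  # -> 8
```

The point of the sketch is that a classifier with no notion of “this sign looks tampered with” simply reports whichever known pattern is closest, so a small, deliberate nudge toward a neighboring pattern is enough to change the answer.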

It should be noted that current Tesla models do not use MobilEye technology. In fact, they don’t attempt to read speed limit signs at all. That information is built into the GPS mapping program used for navigation.

Self-Driving is Not Really Self-Driving

Tesla’s self-driving features require full human attention at all times, in case intervention is needed. In theory, the human operator would notice the car accelerating above safe operating speeds and apply the brakes before any damage was done. However, there have been multiple cases of Tesla drivers taking the term “Autopilot” a little too literally, and several crashes have resulted from drivers failing to take over when autonomous features failed.

Automakers would like to sell you on self-driving cars being right around the corner. However, the reality is that we are still a long way from true self-driving vehicles.


Devon is a writer, editor, and veteran of the online publishing world. He has a particular love for classic muscle cars.