An autonomous systems defense company contends it has successfully spoofed the GPS system of a Tesla Model 3 running the automaker’s latest Autopilot technology, sending the vehicle off its intended route.

Regulus Cyber said it used commercially available hardware and software to wirelessly divert the electric car using Navigate on Autopilot, a Tesla feature that, with driver supervision, guides a car along the highway, from on-ramp to off-ramp, executing lane changes and navigating interchanges along the way.

According to Haifa, Israel-based Regulus, when its test began the car was three miles from a planned exit, traveling at a steady speed in the middle of the lane with the Navigate feature activated. The car reacted as if the exit were 500 feet away, according to Regulus, slowing “abruptly,” flicking on the turn signal and turning off the road.

Now, to get this to work, the company said it had to install a 4-inch-long antenna on the roof of the target car. And Tesla, responding to questions about the software maker’s test, dismissed it as a sales ploy.

“These marketing claims are simply a for-profit company’s attempt to use Tesla’s name to mislead the public into thinking there is a problem that would require the purchase of this company’s product,” a Tesla representative said. “That is simply not the case. Safety is our top priority, and we do not have any safety concerns related to these claims.”

But the issue of GPS spoofing has hovered over autonomous driving from its inception. Relying on a wonky signal to get to your destination in a normal car may simply mean missing your exit. Relying on it to keep your car on the right path at 60 mph is something else entirely. Now that the general public has awakened to the fact that autonomous driving is getting closer to reality, addressing consumer safety concerns will be critical to facilitating mass adoption.

In a 2018 paper winkingly titled “All Your GPS Are Belong to Us: Towards Stealthy Manipulation of Road Navigation Systems,” researchers demonstrated the possibility that spoofing (substituting pirate signals for those of a GPS satellite) could stealthily send you to the wrong destination.

While they note that the threat of GPS spoofing has been discussed as far back as 2001, and that spoofing has been shown to work in other contexts, their experiment was the first to test it against road navigation systems. The researchers used real drivers behind the wheel of a car that was being told to go to the wrong place.

Some 38 out of 40 participants followed the illicit signals, the researchers said.

And while cars with autonomous features have additional tech to protect against spoofing, they cautioned that other studies raised the specter of attacks on other systems, such as ultrasonic sensors, millimeter-wave radar, lidar (light detection and ranging) and wheel speed sensors.

“These new semi-autonomous features offered on new cars place drivers at risk, and provide us with a dangerous glimpse of our future as passengers in driverless cars,” said Roi Mit, chief marketing officer of Regulus Cyber.

Curtis Kexiong Zeng, one of the authors of the 2018 study, said that successfully spoofing a Tesla Autopilot system depends on what kinds of maneuvers the car can make based on GPS location, without driver participation or permission.

“Generally speaking,” he said, “the threat of GPS spoofing increases as the level of automation goes up.”

So what about this test by Regulus Cyber? “This is entrepreneurial hacking,” said Colin Bird-Martinez, senior analyst in connected car software at IHS Markit. The Regulus attack is both time and labor intensive, he said, and relied on someone placing an antenna on the car itself, something any reasonably alert motorist would likely notice.
