
In a massive traffic jam in San Diego, I watched a colleague who learned to drive in New Jersey cut gracefully through six lanes of super-dense moving traffic.

He called his performance “Jerseying.” I define it as forcing your way in front of another car, using gentle-to-aggressive “auto-body language.”

“Jerseying” is becoming more aggressive now that drivers are recognizing the presence of cars equipped with automatic emergency braking and advanced driver-assist systems. Those cars yield and stop for vehicles that cut into their paths.

“I was out recently in Silicon Valley with a friend of mine driving around, and he made a point of showing me how much he likes to screw with Tesla cars, while he’s driving around the 101,” says Missy Cummings, the director of the Humans and Autonomy Laboratory and Duke Robotics at Duke University.

“He says, ‘It’s pretty hilarious because there are so many of them, I just love to play games with them.’ This is meant in a benign way, a pranksterous way. Do [self-driving cars] know they’re being screwed with? No.”

Such behavior is likely to increase, along with overall risk-taking: A Transport Research Laboratory project in the United Kingdom measured human driver behavior in the presence of autonomous vehicles. It reported in February that human drivers would take more risks if they knew they were driving among autonomous cars that would stop and yield for human-driven cars.

“I was actively looking for self-driving vehicles, as I felt I could pull out in a smaller gap than normal in front of them,” one participant was quoted as saying in the TRL report.

Another participant said, “I feel I would be more inclined to take risks.”

Cummings, who is an unmanned vehicle expert and was one of the U.S. Navy’s first female fighter pilots, said drivers may be giving these semi-autonomous cars more credit than they deserve: “[Self-driving cars] are not as smart as we think they are.”

She points to a crash in March involving a semi-autonomous Volvo that was hit by an at-fault car. If both cars had been fully under the control of the people behind the wheel, she said, the crash probably wouldn’t have happened.

Automakers are adding features that may improve a self-driving car’s ability to know when it is being gamed. Tesla in January announced its “Shadow” learning system that is being loaded into some of its cars with Autopilot self-driving capabilities. This means the car’s computers will pay attention to moves the driver makes — perhaps “Jerseying” — and eventually learn the difference between mistakes and deliberate gaming.

“It’s in [an autonomous car’s] realm to see that drivers are behaving in a risky manner, or in a degree of aggressiveness,” added Cummings.

Autonomous cars are also expert tattletales to law enforcement and insurance companies.

“The interesting thing is in an autonomous car, because of the internal and external cameras, you can classify the persons in the car, not necessarily just its driver, and every other license plate the car can see. Now everything is being recorded,” Cummings warns. “And Tesla and Google and everyone else can absolutely be selling camera imagery of all the cars around.”

In other words, every autonomous car has the ability to monitor for profit.

I plan to keep that in mind should I get the urge to Jersey a Tesla.
