Got An iPod? Want To Steal Some Cars?
…We also found that the entire attack can be implemented in a completely blind fashion—without any capacity to listen to the car’s responses. Demonstrating this, we encoded an audio file with the modulated post-authentication exploit payload and loaded that file onto an iPod. By manually dialing our car on an office phone and then playing this “song” into the phone’s microphone, we are able to achieve the same results and compromise the car.
This tidbit, found on page 11 of “Comprehensive Experimental Analyses of Automotive Attack Surfaces” by researchers from the University of California, San Diego and the University of Washington, says exactly what you think it says: it’s becoming easy for intelligent, dedicated criminals to steal your car — or, worse yet, to control certain functions of the car remotely while you’re driving it.
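For the curious, the “song” trick is nothing more exotic than data modulated into audio tones, softmodem style. Here’s a minimal sketch of that general idea in Python using simple FSK; the frequencies, baud rate, and framing below are arbitrary assumptions for illustration, not the scheme the researchers actually used:

```python
import math
import struct
import wave

# Arbitrary assumptions for illustration; not the researchers' scheme.
RATE = 8000          # samples per second
BAUD = 300           # bits per second
F0, F1 = 1200, 2200  # tone frequencies for 0 and 1 bits

def fsk_modulate(payload):
    """Encode bytes as a list of samples, one sine burst per bit (MSB first)."""
    samples_per_bit = RATE // BAUD
    out = []
    for byte in payload:
        for i in range(7, -1, -1):
            freq = F1 if (byte >> i) & 1 else F0
            for n in range(samples_per_bit):
                out.append(math.sin(2 * math.pi * freq * n / RATE))
    return out

def write_wav(path, samples):
    """Write mono 16-bit PCM so any media player (or iPod) can play it."""
    with wave.open(path, "wb") as w:
        w.setnchannels(1)
        w.setsampwidth(2)
        w.setframerate(RATE)
        w.writeframes(b"".join(struct.pack("<h", int(s * 32767)) for s in samples))

if __name__ == "__main__":
    write_wav("payload.wav", fsk_modulate(b"EXPLOIT"))
```

A demodulator on the receiving end reverses the process; the point is that any audio path, a phone call included, becomes a data channel.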
The complete article details the team’s attempts to find vulnerabilities in an unnamed, “100,000 to 200,000 units per year” sedan. Here’s another super-fun discovery by the team; the horrifying part comes at the end, for our readers who don’t like long quotes.
For the former, we experimentally verified this by compromising two cars (located over 1,000 miles apart), having them both join the IRC channel, and then both simultaneously respond to a single command (for safety, the command we sent simply made the audio systems on both cars chime). Finally, the high-bandwidth nature (up to 1 Mbps at times) of this channel makes it easy to exfiltrate data. (No special software is needed since ftp is provided on the host platform.) To make this concrete we modified our attack code for two demonstrations: one that periodically “tweets” the GPS location of our vehicle and another that records cabin audio conversations and sends the recorded data to our servers over the Internet.
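The IRC trick is ordinary botnet plumbing: every compromised car connects as a client, joins one shared channel, and reacts to whatever command is broadcast there. A minimal sketch of that command-and-control pattern; the server, channel, nickname, and “!chime” command here are hypothetical stand-ins, since the paper’s actual details aren’t quoted:

```python
# Hypothetical stand-ins; the real server, channel, and commands aren't quoted.
HOST, CHANNEL, NICK = "irc.example.net", "#fleet", "car-001"

def handshake():
    """Lines a client sends on connect to register and join the shared channel."""
    return ["NICK " + NICK, "USER %s 0 * :%s" % (NICK, NICK), "JOIN " + CHANNEL]

def handle_line(line):
    """Map one raw server line to a reply, or None to ignore it.
    Every bot in the channel sees the same PRIVMSG, so one command
    broadcast by the operator fans out to the whole fleet at once."""
    if line.startswith("PING"):
        return "PONG" + line[4:]  # keep-alive so the server doesn't drop us
    if ("PRIVMSG %s :!chime" % CHANNEL) in line:
        # Real attack code would actuate hardware here; we just acknowledge.
        return "PRIVMSG %s :%s chimed" % (CHANNEL, NICK)
    return None
```

That fan-out is exactly why one typed command could make two cars a thousand miles apart chime at the same moment.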
The entire article is worth reading, even if talk of “stack overflows” won’t exactly rivet those of you who didn’t grow up writing “sploits”. It details one exploit in which the team remotely unlocked a car and started it up so an “unskilled accomplice” could drive it away. Another scenario: an attacker could compromise a group of cars in the Google parking lot, decode the VINs to determine which ones were expensive, correlate each car’s location at 7 p.m. with known property records, and sell Google executive conversations to third parties. Gosh, I can’t think of anyone who would pay money to hear what the Google CEO is talking about in private.
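The VIN-sorting step works because a VIN’s layout is public: the first three characters (the World Manufacturer Identifier) name the maker, and the tenth character encodes the model year. A toy decoder along those lines; the WMI table below is a tiny illustrative subset of the real registry, and the year mapping assumes a post-2000 vehicle (the year code repeats every 30 years):

```python
# Tiny illustrative WMI subset; the real registry covers thousands of makers.
WMI_TABLE = {
    "1G1": "Chevrolet (USA)",
    "WBA": "BMW",
    "JHM": "Honda",
}

# Model-year codes at position 10: '1'-'9' cover 2001-2009, then the
# alphabet restarts at 'A' = 2010 (I, O, Q, U, Z are never used).
YEAR_CODES = {str(d): 2000 + d for d in range(1, 10)}
YEAR_CODES.update({c: 2010 + i for i, c in enumerate("ABCDEFGHJKLMNPRSTVWXY")})

def decode_vin(vin):
    """Return (manufacturer, model_year) for a 17-character VIN.
    Assumes post-2000 vehicles; the year code repeats every 30 years."""
    if len(vin) != 17:
        raise ValueError("VIN must be 17 characters")
    maker = WMI_TABLE.get(vin[:3], "unknown WMI " + vin[:3])
    return maker, YEAR_CODES.get(vin[9])
```

Sorting a parking lot’s worth of VINs by maker and year is a one-liner from there.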
The team goes on to describe how the exploits they discovered could be shut down in the future: adding encryption, removing unnecessary “easter eggs” from embedded vehicle code, and debugging more thoroughly. What they do not explicitly state is that anyone familiar with how the car business works will be rolling on the proverbial floor laughing at the idea of automakers taking due care with their on-board electronics.
Not frightened by the idea of losing your car to hackers in Romania? Unconcerned that someone might be able to remotely throw random inputs into the adaptive steering in your wife’s BMW while simultaneously cranking the stereo to 110 dB, permanently locking the doors, and turning off the headlights? Just think of what will happen when self-driving cars become the norm.