A projector forced a Tesla to swerve into the oncoming lane

Engineers from Israel have shown that a portable projector can be used to deceive advanced driver assistance systems (ADAS); in one experiment, a Tesla was forced into the oncoming lane. During the experiments, they projected extra lane markings, pedestrians, and cars onto the road, and projected road signs with a higher speed limit onto walls. A paper describing the attacks and possible defenses against them is published on the website of the International Association for Cryptologic Research.

To date, no production car offers SAE Level 4 or 5 autonomy, but many manufacturers equip their vehicles with ADAS, advanced driver assistance functions. As a rule, these systems can automatically keep the car in its lane, brake, and track road signs. Every manufacturer stresses that ADAS is an assistance system, not a replacement for the driver, so its errors are not supposed to be critical: the human is expected to correct them. In practice, however, partly because of how manufacturers position these systems and partly because drivers grow accustomed to ADAS and relax, some people effectively shift responsibility to the autopilot and stop watching the road.

Information security researchers are increasingly looking for ways to deceive ADAS, both to find defenses and to show how important it is to keep watching the road while behind the wheel. Last year, for example, Chinese experts showed that a Tesla under Autopilot control can mistake stickers on the road for lane markings and turn into the oncoming lane. However, that study, like many similar ones, was conducted under conditions that are difficult to reproduce in real life.

Researchers led by Ben Nassi from Ben-Gurion University of the Negev have shown that ADAS can be deceived in a more realistic way that does not require long-lasting or permanent changes to the road environment: with images projected onto the road or a wall, or embedded in advertisements on digital video screens.

As test devices they used a Tesla Model X with the Autopilot system and a Renault Captur equipped with a Mobileye 630 Pro aftermarket module that performs ADAS functions. The researchers showed that both systems can be fooled by road signs projected onto trees or walls, and in one experiment the Mobileye device was tricked by a projection from a drone, meaning that an attacker could, if needed, target a specific vehicle. In addition, the authors showed that the system can be fooled by a modified advertisement in which a small road sign appears in some frames for a fraction of a second; a rough sketch of such frame injection follows below.
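The paper's attack tooling is not reproduced in the article, but the split-second injection idea can be illustrated with a minimal sketch. The following Python/OpenCV snippet (file names and frame positions are hypothetical, chosen only for illustration) overlays a small sign image onto three consecutive frames of an advertisement video, which at 24 fps keeps the phantom sign on screen for roughly 125 ms.

```python
import cv2

# Hypothetical file names, for illustration only.
AD_VIDEO = "advertisement.mp4"
SIGN_IMG = "speed_limit_90.png"   # small road-sign image
OUT_VIDEO = "phantom_ad.mp4"

cap = cv2.VideoCapture(AD_VIDEO)
fps = cap.get(cv2.CAP_PROP_FPS)
w = int(cap.get(cv2.CAP_PROP_FRAME_WIDTH))
h = int(cap.get(cv2.CAP_PROP_FRAME_HEIGHT))
out = cv2.VideoWriter(OUT_VIDEO, cv2.VideoWriter_fourcc(*"mp4v"), fps, (w, h))

sign = cv2.imread(SIGN_IMG)
sign = cv2.resize(sign, (w // 10, w // 10))   # keep the injected sign small

# Inject the sign into a short run of frames: at 24 fps,
# 3 frames stay on screen for only ~125 ms.
inject_start, inject_len = 100, 3

idx = 0
while True:
    ok, frame = cap.read()
    if not ok:
        break
    if inject_start <= idx < inject_start + inject_len:
        sh, sw = sign.shape[:2]
        # place the sign in the top-right corner of the frame
        frame[10:10 + sh, w - sw - 10:w - 10] = sign
    out.write(frame)
    idx += 1

cap.release()
out.release()
```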

The researchers found the most dangerous vulnerability in Tesla's Autopilot, because unlike Mobileye it actually steers the car rather than merely displaying warnings. Using a projector placed at the side of the road, they projected a pedestrian and modified lane markings onto the asphalt, and in both cases the algorithms behaved incorrectly: the car ran over the projected pedestrian, recognizing it only at the last moment, and the fake markings caused it to swerve into the oncoming lane.

As a defense against such attacks, the researchers suggest that developers of driver assistance systems train a neural network to filter detected objects and identify fakes among them based on the context of their location, their relative size, lighting, and texture. The researchers sent their results to Tesla and Mobileye in advance; Mobileye replied that it does not consider this a vulnerability, while Tesla declined to comment, citing modifications the authors had made to the vehicle. The authors note that Tesla's representatives probably meant the experimental traffic sign recognition mode they had enabled.
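The article only summarizes this defense; as a minimal sketch of the idea, assuming the filter receives a crop of the detected object plus a wider context crop (the paper's exact architecture is not reproduced here), a small PyTorch model could fuse these cues into a single real-vs-phantom score that the ADAS checks before acting on a detection.

```python
import torch
import torch.nn as nn

class CueEncoder(nn.Module):
    """Small CNN that embeds one cue (e.g. the object crop or its context)."""
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(3, 16, 3, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(16, 32, 3, stride=2, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1), nn.Flatten(),
            nn.Linear(32, 8), nn.ReLU(),
        )

    def forward(self, x):
        return self.net(x)

class PhantomFilter(nn.Module):
    """Fuses cues about a detection into a probability that it is real.

    Inputs (assumed for this sketch, not taken from the paper):
      - obj:     crop of the detected object (texture, lighting)
      - context: wider crop around it (placement, relative size)
    """
    def __init__(self):
        super().__init__()
        self.obj_enc = CueEncoder()
        self.ctx_enc = CueEncoder()
        self.head = nn.Sequential(nn.Linear(16, 1), nn.Sigmoid())

    def forward(self, obj, context):
        feats = torch.cat([self.obj_enc(obj), self.ctx_enc(context)], dim=1)
        return self.head(feats)   # probability that the detection is real

# Usage: score a batch of detections; the ADAS would ignore detections
# that the filter flags as likely phantoms.
model = PhantomFilter()
obj_crop = torch.rand(4, 3, 64, 64)      # crops of detected signs/pedestrians
ctx_crop = torch.rand(4, 3, 128, 128)    # wider context around each detection
p_real = model(obj_crop, ctx_crop)
print(p_real.shape)   # torch.Size([4, 1])
```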

Tesla's Autopilot has had other critical vulnerabilities. For example, in 2016 a Tesla Model S crashed into a truck that it failed to notice against a bright sky, and the driver died. Three years later an almost identical fatal accident involving a Model 3 occurred. After that, one Tesla owner obtained the raw data from the Autopilot system and found that it processes the data incorrectly, mistaking semi trailers for high overhead objects that the car can drive under.
