Scientists have found a way to trick self-driving cars
November 7, 2019
Engineers around the world are working to make self-driving cars as safe as possible. Now these vehicles have revealed an unexpected vulnerability that could endanger both the passengers of autonomous cars and pedestrians.
Scientists at the Max Planck Institute for Intelligent Systems have discovered that an autonomous car is easy to confuse with a specially crafted image. The car stops identifying objects on the road when its cameras capture a particular pattern: a bright patch with a specific color scheme. The researchers needed only a few hours to find a “working” combination of shades. They then demonstrated what happens to the “smart” autopilot system when its cameras capture such an image.
The vehicle reacts unpredictably to such a colored patch: it may, for example, swerve sharply or brake to a complete stop.
A small image is enough to confuse an unmanned car. It could be printed on the T-shirt of a passerby, on another vehicle, or on a piece of road infrastructure. This vulnerability seriously worries the scientists, who are now actively working on a fix: beyond the safety threat, hackers could exploit such a flaw to disable autonomous vehicles.
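The attack described above belongs to the family of adversarial-pattern attacks: an image is optimized so that a perception model misbehaves when the pattern enters its field of view. The article does not disclose the institute's actual method or model, so the sketch below is only a toy illustration of the principle, using a stand-in linear "detector" and plain NumPy gradient steps on a small patch region; all names and numbers here are assumptions, not details from the study.

```python
import numpy as np

# Toy "detector": a fixed linear scorer that flags an object when the
# score crosses zero. This stands in for a real perception network,
# which keeps the adversarial-patch idea visible without deep learning.
rng = np.random.default_rng(0)
w = rng.normal(size=(8, 8))                  # detector weights over an 8x8 "image"
image = rng.uniform(0.4, 0.6, size=(8, 8))   # benign input, pixels in [0, 1]

def score(img):
    """Detection score of the toy model (sign = decision)."""
    return float(np.sum(w * img))

baseline = score(image)

# Adversarial patch: modify only a 3x3 corner of the image, stepping
# each patch pixel against the gradient of the score so the detector's
# output is pushed away from its original decision.
patched = image.copy()
for _ in range(50):
    grad = w[:3, :3]                          # d(score)/d(patch pixels)
    patched[:3, :3] -= 0.05 * np.sign(grad) * np.sign(baseline)
    patched = np.clip(patched, 0.0, 1.0)      # keep pixels valid/printable

print("baseline score:", baseline)
print("patched score: ", score(patched))
```

Even though the patch covers only 9 of 64 pixels, the score moves steadily toward the opposite decision, which is the essence of why a small printed pattern on a T-shirt or road sign can matter to a camera-based system.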