Scientists have found a way to trick self-driving cars
November 7, 2019
Engineers around the world puzzle over how to make self-driving cars as safe as possible. Now these vehicles have revealed an unexpected vulnerability that could endanger the passengers of unmanned vehicles as well as pedestrians.
Scientists at the Max Planck Institute for Intelligent Systems have discovered that an autonomous car is easy to confuse with a special image. The car stops identifying objects on the road when its cameras detect a particular pattern: a bright spot with a specific color scheme. The researchers needed only a few hours to find a "working" combination of shades. They then demonstrated what happens to the "smart" autopilot system when such an image is captured by its cameras.
The vehicle reacts unpredictably to such a color spot: it may, for example, swerve sharply or brake to a complete stop.
A small image is enough to confuse an unmanned car. It could be printed on the T-shirt of a passerby, on another car, or on a piece of road infrastructure. This vulnerability seriously worries the scientists, and they are now actively working to solve the problem: beyond the safety threat itself, hackers could exploit such a bug to disable autonomous vehicles.
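The attack described above belongs to a known family of "adversarial patch" techniques: an attacker searches for a pattern that, when seen by the perception system, drives the detector's confidence down. The article does not give any technical details, so the sketch below is purely illustrative, assuming a toy stand-in detector and a simple random search over flat-color patches; real attacks optimize against an actual neural network.

```python
import numpy as np

rng = np.random.default_rng(0)

def detector_confidence(image):
    # Toy stand-in for an object detector: confidence drops as
    # saturated, high-contrast pixels take over the frame.
    # (Illustrative only; real detectors are neural networks.)
    saturation = image.max(axis=-1) - image.min(axis=-1)
    return float(np.clip(1.0 - 2.0 * saturation.mean(), 0.0, 1.0))

def apply_patch(image, patch, x, y):
    # Paste a small patch into the scene at position (x, y).
    out = image.copy()
    h, w = patch.shape[:2]
    out[y:y + h, x:x + w] = patch
    return out

def search_patch(image, size=8, iters=200):
    # Random search: try flat-color patches and keep the color
    # that lowers the detector's confidence the most.
    best_patch, best_conf = None, 1.0
    for _ in range(iters):
        color = rng.random(3)
        patch = np.broadcast_to(color, (size, size, 3))
        conf = detector_confidence(apply_patch(image, patch, 0, 0))
        if conf < best_conf:
            best_patch, best_conf = patch, conf
    return best_patch, best_conf

scene = np.full((32, 32, 3), 0.5)      # uniform grey "road scene"
baseline = detector_confidence(scene)  # confidence with no patch
patch, fooled = search_patch(scene)
print(baseline, fooled)
```

Even though the patch covers only a small corner of the frame, the searched color lowers the toy detector's confidence below its baseline, which is the essence of the vulnerability the researchers demonstrated.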