Israeli company Mobileye, owned by Intel, has released a video of a 40-minute autonomous drive through a city using cameras alone, without lidars or radars. During the entire trip the safety operator in the driver's seat took control only once: when the aerial drone filming the ride ran out of power and its batteries had to be replaced, Mobileye CEO Amnon Shashua reported in the company blog.
The currently accepted approach to autonomous driving has the vehicle perceive its environment using three types of sensors: cameras, lidars, and radars. Each has its own strengths for reliably tracking surrounding objects. Cameras capture high-resolution color images that computer vision algorithms can use to analyze objects on the road; lidars produce point clouds from which reliable volumetric data can be reconstructed. Radar, in turn, can be thought of as a simpler analogue of lidar, with lower resolution but less sensitivity to weather.
Using three different types of sensors means the resulting data come in different formats. The standard approach is for every sensor to feed its data into a single computing unit, where it is fused; the decision-making algorithms then work with one unified model of the environment. In this scheme, redundancy comes from the diversity of sensors within one system. Mobileye takes a slightly different approach for its fully autonomous car: two independent subsystems, one working only with cameras and the other with lidars and radars.
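The contrast between the two architectures can be sketched in code. This is a toy illustration only, assuming a trivial detection format and driving policy; none of the class or function names come from Mobileye's actual software.

```python
from dataclasses import dataclass

@dataclass
class Detection:
    """A tracked object: position (x forward, y left) in meters, ego-relative."""
    obj_id: str
    position: tuple
    confidence: float

def fuse(camera_dets, lidar_radar_dets):
    """Standard approach: merge all sensor data into one world model
    before any driving decision is made (real fusion would associate tracks)."""
    return camera_dets + lidar_radar_dets

def drive_decision(detections):
    """Toy policy: brake if any object is closer than 10 m ahead."""
    return "brake" if any(d.position[0] < 10 for d in detections) else "cruise"

camera = [Detection("pedestrian", (8.0, 1.0), 0.9)]
lidar_radar = [Detection("truck", (25.0, 0.0), 0.95)]

# Standard fused pipeline: one decision from one merged model.
fused_action = drive_decision(fuse(camera, lidar_radar))

# Mobileye-style redundancy: two independent subsystems, each of which
# must be able to make driving decisions on its own.
camera_action = drive_decision(camera)
lidar_radar_action = drive_decision(lidar_radar)

print(fused_action)        # "brake": the pedestrian is 8 m ahead
print(camera_action)       # "brake": the camera subsystem sees the pedestrian
print(lidar_radar_action)  # "cruise": this subsystem only sees the far truck
```

The point of the second scheme is that either subsystem alone should be able to drive the car, so a failure in one sensing modality does not blind the whole pipeline.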
In late May the company showed a 40-minute drive of an autonomous car equipped only with cameras: eight long-range cameras and four wide-angle cameras mounted on different sides of the vehicle. The camera data are processed by two pre-production Mobileye EyeQ5 chips; each chip delivers 24 trillion operations per second while drawing 10 watts. The trip took place in Jerusalem in fairly dense and aggressive traffic. The driver took control only once during the ride, to stop and swap the batteries of the aerial drone filming the trip.
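A quick back-of-the-envelope calculation from the figures in the text (two EyeQ5 chips, 24 trillion operations per second and 10 W each) gives the total compute budget of the camera-only setup:

```python
# Figures taken from the article; everything else is simple arithmetic.
tops_per_chip = 24    # trillion operations per second per EyeQ5
watts_per_chip = 10   # electrical power per chip
num_chips = 2

total_tops = num_chips * tops_per_chip       # combined throughput
total_watts = num_chips * watts_per_chip     # combined power draw
efficiency = tops_per_chip / watts_per_chip  # TOPS per watt

print(total_tops)   # 48
print(total_watts)  # 20
print(efficiency)   # 2.4
```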
The video shows several views: aerial footage from the drone, the system's model of the road and surrounding objects, the view from the cabin, and footage from individual cameras during difficult situations. The car drove in good weather on well-marked roads, but encountered fairly aggressive maneuvers by drivers and pedestrians, some of whom violated traffic rules.
The video shows that the car detects almost all objects well in advance and generally behaves like autonomous cars equipped with lidars and radars. For example, at 12:18 the car drives around a pedestrian pulling a hydraulic pallet cart along the road without braking. At 13:55 the car is stuck behind a parked truck and tries to merge into the left lane: rather than simply waiting for all the cars to pass, it gradually edges the front of the vehicle into the adjacent lane, as a human driver would, to signal its intent to merge. Another notable moment comes at 22:12: the car recognizes a moped approaching from behind in advance, but determines that it has enough room between the cars and pulls away without slowing down.
In some situations the car still made mistakes, or took actions a human driver could have performed more efficiently without compromising safety. For example, at 2:45 the car does not merge into the left lane to pass a parked car, but instead straddles the two lanes and slowly drives around it. At 8:30 the car cuts a turn too close to a bus, forcing it to shift slightly left while the car moves right. At 18:30 the car enters a roundabout in a formally correct way, assuming another car is about to exit, but has to brake when that car continues around the circle. Despite these shortcomings, the car demonstrated a very high quality of driving for a camera-only system, especially in difficult road conditions.
Besides Mobileye, one other large company is building a fully autonomous vehicle without lidar: electric-vehicle maker Tesla. CEO Elon Musk and AI director Andrej Karpathy believe that Tesla's current sensor set, consisting of cameras and radar, will eventually be sufficient for full autonomy, largely thanks to advances in computer vision algorithms.
In 2019 Tesla introduced the third generation of its self-designed computing module. It uses full duplication: the board effectively consists of two identical computers running separate operating systems and powered by two different sources. Total performance is 144 trillion operations per second. That same year the company demonstrated a fully autonomous drive through city streets and highways, including automated stopping and starting at intersections with traffic lights. Also in 2019, Elon Musk stated that in 2020 the company would create a network of fully autonomous taxis based on current-generation Tesla cars; he recently confirmed those plans, noting that their implementation depends on regulatory approval.
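The full-duplication idea can be illustrated with a short sketch: two identical computers evaluate the same inputs independently, and the vehicle acts only on a plan both agree on. This is a minimal toy example with hypothetical names, not Tesla's actual software.

```python
def plan_trajectory(speed_mps, obstacle_dist_m):
    """Toy planner: slow down if an obstacle is within a 3-second gap."""
    return "slow" if obstacle_dist_m < 3 * speed_mps else "keep_speed"

def redundant_plan(speed_mps, obstacle_dist_m):
    """Run the planner on both 'computers' and cross-check the results."""
    plan_a = plan_trajectory(speed_mps, obstacle_dist_m)  # computer A
    plan_b = plan_trajectory(speed_mps, obstacle_dist_m)  # computer B
    if plan_a != plan_b:
        # A disagreement between identical computers indicates a hardware
        # fault, so the system falls back to a safe state.
        return "safe_stop"
    return plan_a

print(redundant_plan(15.0, 30.0))  # "slow": 30 m is inside the 45 m gap
print(redundant_plan(15.0, 60.0))  # "keep_speed"
```

In real duplicated systems the cross-check happens continuously in hardware and software; the sketch only shows the principle that redundancy buys fault detection, not extra usable throughput.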