Sensor fusion lights the right path for autonomous driving.
On October 21st, Tesla CEO Elon Musk tweeted that the beta version of Tesla's Full Self-Driving software (hereinafter FSD) had been pushed to a small number of customers. Earlier, on October 11th, Baidu announced that its driverless taxi service, Baidu Apollo, had officially opened for operation in Beijing.
Since October, these two back-to-back deployments of autonomous driving technology have pushed public enthusiasm to a new high. This year is also a milestone: counting from the launch of Google's autonomous driving project in 2010, it marks roughly the tenth year of the commercialization of autonomous driving technology.
Over the past decade, as the application of artificial intelligence closest to everyday life, autonomous driving has been highly anticipated. Today, with breakthroughs in big data, AI, 5G and other technologies, companies such as Baidu, Uber, Didi and WeRide (Wen Yuan Zhixing) have moved more and more autonomous driving systems from the laboratory onto the road.
There may never be a single most effective way to realize sensing for ADAS and AVs. The magic number may be six: every automobile manufacturer weighs six basic considerations in its own way, which leads each to create its own unique method of integrating sensors into future vehicles.
Some auto parts companies with strong overall capabilities, at home and abroad, have laid out multiple sensor product lines for autonomous vehicles, allowing them to offer comprehensive autonomous driving solutions to downstream customers and build strong competitiveness. They include foreign companies such as Bosch, Continental, Valeo, Hella, Delphi, Fujitsu and Autoliv, and domestic companies such as Desay SV (Desai Siwei), Huayu Automotive and Baolong Technology.
More and more sensors are being deployed across the vehicle to actively address safety. How many sensors does a car carry today, and how many more are needed to improve autonomy further? Counting the ADAS sensors (ultrasonic, radar, cameras for sensing, cameras for viewing and LiDAR), a vehicle today is estimated to carry on the order of 10 to 20 sensors, depending on the model.
Sensors will be key to reaching higher levels of automation, and both their number and their variety are expected to grow.
The environmental monitoring sensors for autonomous driving mainly comprise cameras and radars. Cameras use image recognition to realize distance measurement and target recognition; radars use the time difference and phase difference between the transmitted and reflected waves to obtain the position and velocity of the target. By the type of wave used, radar divides into three categories: millimeter-wave radar, laser radar (lidar) and ultrasonic radar.
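To make the radar principle concrete, here is a minimal sketch of both measurements, assuming ideal propagation and purely illustrative numbers: range from the round-trip delay of the reflected wave, and radial speed from the Doppler shift between the transmitted and reflected waves.

```python
# Minimal sketch of echo-based radar measurements (illustrative values only).
C = 299_792_458.0  # speed of light, m/s

def range_from_delay(delay_s: float) -> float:
    """The wave travels to the target and back, so halve the round-trip path."""
    return C * delay_s / 2.0

def speed_from_doppler(doppler_hz: float, wavelength_m: float) -> float:
    """Radial speed from the frequency shift between transmitted and reflected waves."""
    return doppler_hz * wavelength_m / 2.0

print(range_from_delay(1e-6))              # echo after 1 microsecond: ~150 m
print(speed_from_doppler(5000.0, 0.0039))  # 5 kHz shift at ~77 GHz: ~9.75 m/s
```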
Camera: the eyes of autonomous driving
By field-of-view coverage, vehicle cameras divide into forward-view cameras, surround-view cameras (side plus rear view) and in-cabin cameras. The forward-view camera is the most critical: it enables lane departure warning (LDW), forward collision warning (FCW), pedestrian collision warning (PCW) and other functions. Forward-view cameras come in monocular, binocular and even multi-camera configurations. Although binocular and multi-camera setups offer higher ranging accuracy and wider viewing angles, their high cost and demanding requirements on precision and computing chips have kept them from mass production, so Mobileye's monocular solution remains the market mainstream.
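Why does a binocular camera range more accurately, and why does it demand such precision? Depth falls out of triangulation between the two lenses, while a monocular camera must infer distance from recognized object sizes. The sketch below uses made-up focal length, baseline and disparity values to show the relationship and how a small disparity error shifts the depth estimate.

```python
# Stereo triangulation sketch: depth Z = f * B / d (illustrative values).
def stereo_depth(focal_px: float, baseline_m: float, disparity_px: float) -> float:
    return focal_px * baseline_m / disparity_px

print(stereo_depth(focal_px=1000.0, baseline_m=0.3, disparity_px=15.0))  # 20.0 m

# A single pixel of disparity error already moves the estimate noticeably,
# which is why binocular rigs need precise calibration and alignment.
print(stereo_depth(1000.0, 0.3, 14.0))  # ~21.4 m
```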
The industry chain of vehicle cameras involves three main links: upstream materials, midstream components and downstream products. Upstream, optical lenses, filters and protective films are used to manufacture lens assemblies, and wafers are used to make CMOS sensor chips and DSP signal processors. Midstream, the lens assembly, CMOS chip and bonding materials are assembled into a module and packaged together with the DSP into a camera product. At this point in the chain, upstream suppliers can already deliver complete camera products to downstream vehicle makers or tier-one supplier customers.
Within this chain, the camera and the software algorithm together form the vehicle camera solution that is applied in self-driving cars. The chain is long, with many upstream and downstream links, each involving numerous manufacturers and companies at home and abroad.
Compared with cameras in consumer electronics, automotive-grade cameras face higher requirements for shock resistance, stability, continuous focusing, thermal compensation, and immunity to stray and strong light, so module assembly is complex and the technical barrier is high. In the global supply market, foreign companies such as Panasonic, Valeo, Fujitsu and Magna currently hold a large share; the top five manufacturers together account for about 59% of the market, a relatively high concentration.
Radar: the brain of autonomous driving
Radar falls into three main categories. 1. Millimeter-wave radar: between microwave and infrared, with a frequency range of roughly 10 GHz to 200 GHz and a wavelength on the millimeter scale. 2. Lidar: between infrared and visible light, with a frequency on the order of 100,000 GHz and a wavelength on the nanometer scale. 3. Ultrasonic radar: sound waves with a frequency above 20,000 Hz. By the formula speed = wavelength × frequency, the higher the frequency, the shorter the wavelength; the shorter the wavelength, the higher the resolution; and higher resolution means higher measurement accuracy in distance, speed and angle.
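These bands can be checked directly with speed = wavelength × frequency, with one caveat: millimeter-wave radar and lidar are electromagnetic, so the speed of light applies, while ultrasonic radar emits sound, so the speed of sound applies instead. The 77 GHz, 905 nm and 40 kHz operating points below are typical values assumed for illustration.

```python
# Wavelength check for the three radar families (assumed typical frequencies).
C_LIGHT = 3.0e8   # m/s, electromagnetic waves
C_SOUND = 340.0   # m/s, sound in air (approximate)

def wavelength(speed: float, freq_hz: float) -> float:
    return speed / freq_hz

print(wavelength(C_LIGHT, 77e9))    # ~3.9e-3 m: millimeter band
print(wavelength(C_LIGHT, 3.3e14))  # ~9.1e-7 m: ~905 nm, a common lidar line
print(wavelength(C_SOUND, 40e3))    # ~8.5e-3 m at a typical 40 kHz parking sensor
```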
The reversing radar we usually use is ultrasonic radar: it emits sound waves, which travel only at the speed of sound. Ultrasonic radar is small and cheap, but its detection accuracy is poor, its range is short, and high-speed motion degrades it badly, so it is not widely used for automated driving itself.
Millimeter-wave radar is widely used; it emits electromagnetic waves that propagate at the speed of light. The main automotive bands are 24 GHz and 77 GHz. The 24 GHz band has a lower frequency and narrower bandwidth, hence relatively low accuracy, and is mainly used for blind-spot monitoring and automatic parking. The 77 GHz band offers much higher precision, measures distance more accurately, and is little affected by weather; fused with the camera, it can perceive the environment well.
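One concrete reason for the precision gap is sweep bandwidth: an FMCW radar's range resolution follows delta_R = c / (2B), and the 77 GHz band allows far wider sweeps. The bandwidth figures below are typical regulatory allocations, used here as assumptions.

```python
# Range resolution of an FMCW radar as a function of sweep bandwidth.
C = 3.0e8  # m/s

def range_resolution(bandwidth_hz: float) -> float:
    return C / (2.0 * bandwidth_hz)

print(range_resolution(250e6))  # 24 GHz ISM band (~250 MHz): ~0.60 m
print(range_resolution(4e9))    # 77-81 GHz band (up to ~4 GHz): ~0.0375 m
```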
However, while millimeter-wave radar senses distance well, it has no accurate way to sense the specific shape of an object or to separate two pedestrians ahead, and its detections are noisy. Even on an empty road, undulations or debris on the surface can produce reflections that confuse the radar's judgment.
Lidar solves these problems well, with accuracy down to the centimeter level. Each laser emitter on a lidar corresponds to one scan line, and common mechanical rotating lidars come in 16-, 32-, 64- and 128-line versions. Lidar is essentially a radar working in the optical band (a special band), and its advantages are obvious.
First, extremely high resolution: lidar works in the optical band, at frequencies two to three orders of magnitude higher than microwave, so compared with microwave radar it achieves far higher range, angular and velocity resolution.
Second, strong anti-interference: the laser wavelength is short, the beam can be emitted with a very small divergence angle (on the order of microradians), and multipath effects are small (unlike microwave or millimeter wave, the beam does not scatter into multiple propagation paths), so lidar can even detect low-altitude and ultra-low-altitude targets.
Third, rich information: lidar directly obtains a target's distance, angle, reflection intensity, speed and more, generating a multi-dimensional image of the target; the sketch below gives a feel for the resulting data rates. Fourth, all-day operation: laser detection is active, depending neither on external lighting nor on the target's own radiation; the sensor only needs to emit its own beam and read the echo of that beam to obtain the target information.
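A rough sense of how much data a spinning lidar produces, assuming illustrative values for line count, horizontal resolution and rotation rate:

```python
# Point throughput of a mechanical rotating lidar (illustrative parameters).
def points_per_second(lines: int, horiz_res_deg: float, rotation_hz: float) -> float:
    returns_per_revolution = 360.0 / horiz_res_deg  # samples per line per turn
    return lines * returns_per_revolution * rotation_hz

# A 64-line unit at 0.2 degree resolution spinning at 10 Hz:
print(points_per_second(64, 0.2, 10))  # ~1.15 million points per second
```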
However, limited by price and size, lidar is still rarely fitted to production vehicles. Musk has repeatedly criticized lidar as "bulky", "ugly" and "completely unnecessary", and this points to a real disadvantage: at this stage it is hard to shrink, and its perch on the roof is conspicuous, which directly hampers mass production. That is why we have yet to see lidar systems on volume production cars.
The last type, ultrasonic radar, has already become a common automotive component, supporting driver assistance functions such as automatic parking, and it will contribute to fully automated driving in the future. It measures obstacles within a range of 0.2 to 5 m at an accuracy of 1 to 3 cm, acting as the car's short-range "eyes". Ultrasonic radars divide into analog, four-wire digital, two-wire digital and three-wire active digital types; their interference immunity improves in that order, and technical difficulty and price generally rise with it.
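Ultrasonic ranging is the same time-of-flight idea as radar, with sound in place of light, and the quoted 1 to 3 cm accuracy requires only tens of microseconds of timing precision, one reason the sensors can be so cheap. A minimal sketch, assuming sound at roughly 340 m/s:

```python
# Ultrasonic time-of-flight sketch (approximate speed of sound in air).
V_SOUND = 340.0  # m/s

def echo_distance(delay_s: float) -> float:
    return V_SOUND * delay_s / 2.0

def timing_precision_needed(accuracy_m: float) -> float:
    return 2.0 * accuracy_m / V_SOUND

print(echo_distance(0.03))            # ~5.1 m, near the top of the 0.2-5 m range
print(timing_precision_needed(0.01))  # ~5.9e-5 s: tens of microseconds suffice
```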
Since the launch of Tesla's Autopilot, its dependence on ultrasonic radar has been heavy, and it has consistently kept a 4+4+4 ultrasonic sensor layout. In earlier versions, Tesla used 8 of the sensors, front and rear, for parking assistance and all 12 in assisted driving. Tesla notes that, unlike the camera watching lane markings, ultrasonic radar monitors the surrounding area and covers blind spots where vehicles or other objects may hide.
Tesla's "preference" for ultrasonic radar has its reasons. As noted above, lidar performs well but costs too much to fit to vehicles at scale for now, which also limits the spread of higher-level autonomous driving.
Ultrasonic radar, by contrast, is cheap. A single ultrasonic sensor currently costs on the order of tens of yuan, putting the radar hardware of a reversing radar system below roughly 200 yuan and that of an automatic parking system at around 500 yuan. Millimeter-wave radar still sits at the thousand-yuan level, and lidar runs to hundreds of thousands of yuan. This low price binds carmakers tightly to ultrasonic radar and has fueled a prosperous vehicle ultrasonic radar market.
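A back-of-the-envelope check of those system costs, assuming a hypothetical unit price in the middle of the "tens of yuan" range:

```python
# Rough cost check for ultrasonic sensor suites (assumed unit price).
UNIT_PRICE_YUAN = 45  # hypothetical mid-range price per sensor

print(4 * UNIT_PRICE_YUAN)   # reversing radar, 4 sensors: 180 yuan (under ~200)
print(12 * UNIT_PRICE_YUAN)  # automatic parking, 12 sensors: 540 yuan (around ~500)
```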
According to P&S Intelligence, the global vehicle ultrasonic radar market was worth $3.46 billion (about 24.39 billion yuan) in 2019; the agency predicts that from 2020 to 2030 the market will grow at a compound annual rate of 5.1%, reaching $6.1 billion (about 42.98 billion yuan) in 2030.
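Compounding the 2019 base at the forecast rate lands close to the quoted 2030 figure, a quick consistency check:

```python
# Sanity check: 3.46 billion USD compounded at 5.1% per year from 2019 to 2030.
base_2019 = 3.46  # billion USD
cagr = 0.051
print(base_2019 * (1 + cagr) ** (2030 - 2019))  # ~5.98 billion USD, close to 6.1
```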
However, ultrasonic radar is not the breakthrough technology for autonomous driving; it is limited by its physics. The detection range of vehicle ultrasonic radar is confined to a few meters, and it cannot precisely describe an obstacle's position. Moreover, multiple radars in the same frequency band usually take turns transmitting (time-division multiplexing) to keep their echoes from colliding, which slows information acquisition; detection accuracy is also easily disturbed by vehicle speed, vibration, temperature and humidity, making interference rejection and calibration challenging. In short, ultrasonic radar is an "auxiliary material" rather than a "staple food": only paired with millimeter-wave radar, cameras and even lidar can it support higher levels of driving assistance.
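The time-division constraint is easy to quantify: if same-band sensors must fire one at a time so their echoes stay apart, each sensor's refresh rate falls with the sensor count. The 30 ms slot below (enough for a roughly 5 m round trip plus settling) is an assumed figure.

```python
# Round-robin (time-division) scheduling of same-band ultrasonic sensors.
SLOT_S = 0.030  # assumed listening window per sensor, seconds

def per_sensor_refresh_hz(num_sensors: int) -> float:
    return 1.0 / (SLOT_S * num_sensors)

print(per_sensor_refresh_hz(1))   # ~33 Hz with the band to itself
print(per_sensor_refresh_hz(12))  # ~2.8 Hz once all 12 sensors share it
```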
Integration is the future of sensors
Clearly, sensors are key to higher automation, and their number and variety will keep growing. Yet more sensors are only the tip of the iceberg: they generate enormous volumes of data, while the system is severely constrained by its processing power.
So is more always better? Some may think so, but for reasons of cost and integration the number of sensors in a car will not grow without limit. Sensor counts are expected to plateau at some point, and the real differentiation will lie in software and in a company's ability to handle massive data effectively. Some OEMs, Tesla among them, still forgo LiDAR, betting instead on a combination of sensors and AI computing to reach high levels of automation.
Like human senses, sensors must be positioned strategically to feed back continuous information about the car's surroundings. But placement faces technical limits. Condensation in a headlight, for example, can stop a lidar from working; in snowy or cold weather, frost can disable sensors; and infrared sensors cannot see through glass, so they cannot be placed behind the windshield.
At present there are three mainstream approaches to autonomous driving. The first is vision-led, using GPS maps and AI. Tesla exemplifies it: the company collects environmental data through the cameras on every Tesla on the road and combines image processing with machine learning so that its cars can drive without relying on pre-recorded maps. Each car learns while driving, shares what it learns across the whole fleet, views the terrain much as human eyes do, and lets its AI analyze the scene and guide the vehicle's decisions.
The second is lidar-led with vision assistance, using high-precision maps and AI. This is the route taken by mainstream traditional OEMs such as GM, Mercedes-Benz and Ford, and by many autonomous driving companies, including Google's Waymo. These vehicles depend on a pre-recorded 3D high-resolution map of the surroundings, captured and drawn in advance by vehicles equipped with lidar. On the road, the vehicle uses that map and its own lidar to localize itself, determine whether the environment has changed, and control itself while cruising within the mapped area.
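The localization step can be pictured as sliding the live scan over the stored map until it fits best. The toy sketch below does this with a brute-force grid search over a tiny hand-made point set; production systems use far more sophisticated matching (ICP, NDT and the like), so treat every value here as illustrative.

```python
# Toy map-matching localization: find the (dx, dy) offset that makes the
# current scan overlap the pre-recorded map best. Purely illustrative.
MAP_CELLS = {(5, 5), (5, 6), (5, 7), (8, 2), (8, 3)}  # mapped occupied cells
SCAN = [(4, 5), (4, 6), (4, 7), (7, 2), (7, 3)]       # live scan, offset by (1, 0)

def best_offset(scan, map_cells, search=2):
    best, best_hits = (0, 0), -1
    for dx in range(-search, search + 1):
        for dy in range(-search, search + 1):
            hits = sum((x + dx, y + dy) in map_cells for x, y in scan)
            if hits > best_hits:
                best, best_hits = (dx, dy), hits
    return best

print(best_offset(SCAN, MAP_CELLS))  # (1, 0): the pose correction to apply
```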
The third is AI-driven autonomy built on the Internet of Vehicles and multi-sensor fusion. Networking vehicles requires huge infrastructure investment and demands that every autonomous vehicle on the road share a common platform. Compared with the first two strategies this is a broader ecosystem: investing in smarter roads can reduce the complexity and uncertainty borne by the vehicle itself. It requires automakers, V2X suppliers and municipal authorities to work together to create the infrastructure and standards that let vehicles navigate smoothly with a lower error threshold.
Clearly, the first two are the realistic options under today's roads, vehicles and regulations. Although Tesla alone pursues the vision-led scheme, Tesla's volume in the electric vehicle market is substantial, and it is hard to say the vision-based approach is definitively worse than the lidar-based one.
One thing is certain, though: the third scheme, built on the Internet of Vehicles, is the inevitable direction for autonomous driving. Under it, large numbers of sensors will be needed, cooperating with one another and with the vehicle itself to form a complete autonomous driving system. The outlook for sensors, then, is all but a smooth road ahead.
In 2019, global output of self-driving cars was only a few thousand; annual production is expected to grow to roughly 4 million vehicles by 2032, with cumulative output reaching into the tens of millions. Total revenue tied to self-driving car production is projected to reach about $60 billion, of which 40% will come from the vehicle itself, 28% from sensing hardware, 28% from computing hardware, and the remaining 4% from integration. In other words, over the next 15 years a complete industrial ecosystem will be built around autonomous vehicle technology.
Accordingly, analysts at Yole Développement expect 2024 sensor revenues of roughly $400 million for lidar, $600 million for radar and $1.6 billion for cameras, with IMUs also contributing.