
After nearly 10 recent "out of control" incidents, is Tesla's Autopilot still credible?

Not long ago in Shenzhen, a blue Tesla Model S suddenly accelerated, hit a taxi, and then kept accelerating and rear-ended a truck.

The car's behavior was quite unusual: at first it crawled along at very low speed, then accelerated rapidly, and it still did not slow down after the first collision.

Such behavior is far from normal, and the car's active safety systems never intervened. The owner claimed the vehicle malfunctioned, while Tesla said the accident was caused by the owner's improper driving, and the two sides could not reach an agreement.

In fact, this is far from Tesla's first "out of control" incident in China. Locally built Teslas officially went on sale at the end of last year and have been selling briskly since the epidemic eased.

In the past six months there have been nine suspected "out of control" incidents involving Teslas. In each case the driver claimed the vehicle suddenly went out of control, and Tesla said the owner had operated the car improperly. So who is lying?

(1) Frequent accidents, inconclusive results: a Rashomon.

Since the first "out of control" incident, which happened to Ms. Zhao in an underground garage in Hangzhou on May 21, Tesla has had an "out of control" incident every so often, almost always involving sudden acceleration. The most serious was in Nanchong, Sichuan, where a vehicle suddenly lost control at the Ximen Market on Fumin Street, causing 2 deaths and 6 injuries.

So far, nine incidents have drawn wide attention. Apart from the Beijing case, where the owner's details have not been released, the other eight owners happen to be four men and four women, so this clearly cannot be dismissed as "female drivers" mistaking the accelerator for the brake, or the brake for the accelerator.

Tesla's replies have all been similar. In the Shenzhen incident, for example, Tesla responded that according to its background data, the driver kept pressing the accelerator pedal rather than the brake during the second collision, which is why the vehicle continued to accelerate.

The same goes for Mr. Chen, the Nanchang owner whose car crashed into a mound at very high speed and caught fire: Tesla's reply was that the background data showed the driver had kept pressing the accelerator and never touched the brake.

All of Tesla's replies rest on background data, and so far no background data has shown a vehicle accelerating on its own while the brake was pressed. The owners refuse to accept this, but there is little they can do.

This reminds Kungfu Auto of score rechecking after the college entrance exam. Every year there are students whose grades are usually quite good but whose exam results turn out disastrous. Some of them really did underperform; others are convinced the marking must be wrong. So these students all do the same thing: apply to have their scores rechecked.

How did rechecking work back then? Candidates and parents could not see the test paper. If you asked for a recheck, the per-question scores on your paper were pulled up and the computer added them together again. Since the computer had added them in the first place, the chance of error was close to zero, so most candidates and parents came away disappointed. Even so, disputes blow up every year, and there have indeed been cases where the computer "added wrong" for one reason or another.

Tesla's so-called background data is much the same. Like the computer adding up scores, the probability of an outright error is extremely low: if the backend shows acceleration, then of course the vehicle was accelerating. But whether the driver actually stepped on the brake is itself a "Rashomon": the driver cannot prove it, and there is no independent way to verify it.

(2) Problems keep occurring, and the control scheme is the key.

In an ordinary fuel vehicle, the running engine drives a vacuum pump to generate vacuum, and pressing the brake pedal applies the brakes with vacuum assistance. Electric vehicles mostly use electronically boosted brakes, and Tesla is no exception: based on how far the brake pedal is pressed, a motor drives the brake caliper to clamp the brake disc and slow the car.

In what they accomplish, the two are no different; the difference lies in how the control is implemented, and Tesla's biggest peculiarity is that its brake control differs even from that of other electric vehicles.

Tesla's braking system is entirely software-controlled and electronically actuated: its working signal comes from the position sensor on the brake pedal. The software and hardware of Tesla's various subsystems are highly integrated, and control is ultimately carried out over bus communication. In Tesla's architecture there is almost no hard-wired connection between sensors and actuators; everything goes over the bus, which is a little unnerving when you think about it.
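To make that signal chain concrete, here is a minimal Python sketch of a brake-by-wire path: pedal sensor reading, to a bus frame, to a controller, to a caliper motor command. The names, message format, and numbers are illustrative assumptions for this article, not Tesla's actual implementation.

```python
from dataclasses import dataclass

# Hypothetical, simplified brake-by-wire signal chain.
# All names and values are illustrative assumptions.

MAX_CLAMP_FORCE_N = 30000.0  # assumed peak caliper clamping force


@dataclass
class BusFrame:
    """A minimal stand-in for a CAN-style bus frame."""
    source: str
    signal: str
    value: float


def read_pedal_sensor(raw_adc: int, adc_max: int = 4095) -> BusFrame:
    """Pedal node: convert a raw sensor reading into a pedal-stroke frame (0.0-1.0)."""
    stroke = max(0.0, min(1.0, raw_adc / adc_max))
    return BusFrame(source="pedal_node", signal="brake_pedal_stroke", value=stroke)


def brake_controller(frame: BusFrame) -> BusFrame:
    """Central controller: map pedal stroke to a target clamping force.

    Everything passes through software here; this sketch has no mechanical
    or hydraulic fallback path.
    """
    force = frame.value * MAX_CLAMP_FORCE_N
    return BusFrame(source="brake_controller", signal="caliper_force_cmd", value=force)


def caliper_actuator(cmd: BusFrame) -> None:
    """Actuator node: the motor clamps the disc with the commanded force."""
    print(f"clamping at {cmd.value:.0f} N")


if __name__ == "__main__":
    # The driver presses the pedal about halfway; the command travels over the bus.
    pedal_frame = read_pedal_sensor(raw_adc=2048)
    caliper_actuator(brake_controller(pedal_frame))
```

The point of the sketch is simply that every link in the chain is a message in software: if any one of them is missing or misread, there is nothing mechanical left to fall back on.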

The advantage is a high degree of integration, which greatly reduces layout difficulty and cost. As long as the main controller is powerful enough, control precision should in theory be higher and response faster.

The disadvantage is the lack of a backup mechanism: everything hinges on the stability of the overall control system and the reliability of the sensors. With an ordinary fuel car, if the brakes start to feel weak you can have them serviced, or strip them down to find the faulty part, so they rarely fail all at once. In most cases braking degrades first, then fails, and once failed it does not recover on its own.

A Tesla brake failure, by contrast, may last only an instant, or may come down to a signal being misread, and the acceleration signal works through the same mechanism. The probability is extremely small, but the consequences can be irreparable.
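For contrast, here is a hedged sketch of the kind of redundancy check a more conservative brake-by-wire design might include: two pedal sensors are cross-checked, and a disagreement degrades toward braking rather than toward free-rolling. This is purely illustrative and does not describe what Tesla actually does.

```python
def plausibility_check(sensor_a: float, sensor_b: float,
                       tolerance: float = 0.05) -> float:
    """Illustrative fallback logic (an assumption, not Tesla's design).

    Two redundant pedal-stroke readings (0.0-1.0) are cross-checked.  If they
    agree within a tolerance, their average is used; if they disagree, the
    controller falls back to the larger value, so a sensor fault errs toward
    braking rather than toward no braking.
    """
    if abs(sensor_a - sensor_b) <= tolerance:
        return (sensor_a + sensor_b) / 2.0
    # Signals disagree: assume the stronger braking request is the real one.
    return max(sensor_a, sensor_b)


# Normal case: sensors agree, use the averaged stroke.
assert abs(plausibility_check(0.50, 0.52) - 0.51) < 1e-9
# Fault case: one sensor drops to zero, fall back to the healthy (higher) reading.
assert plausibility_check(0.50, 0.00) == 0.50
```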

In fact, once Tesla's problems began erupting in quick succession, a large number of owners came forward with similar experiences. For example, with the car clearly in R (reverse), the vehicle lurched forward instead; but this tends to happen at very low speed, has not caused serious consequences, and the system may correct itself in an instant. Or, at high speed, the brake pedal got no response at first and only came back after the system was restarted. Fortunately nothing dangerous came of it.

It is hard to pin down what the problem is in such cases. It differs from a traditional mechanical failure and often cannot even be reproduced, so pursuing a claim is a distant prospect. All one can really say is: everyone, drive carefully.

(3) What is the logic behind automatic driving becoming automatic assisted driving?

Besides the frequent "out of control" incidents, Tesla's other sore point is probably the accidents caused by owners leaning on the Autopilot function. When Tesla first entered China, "Autopilot" was translated as "automatic driving". Later, after repeated problems, the translation was changed to "automatic assisted driving".

Because Tesla knows that its driver-assistance technology is still at Level 2, far from genuine autonomous driving.

For example, not long ago a Tesla crashed into a truck at high speed. The owner's explanation was that with nobody around and the road conditions good, he switched on Autopilot and let his attention wander. He did not expect a large truck to suddenly appear ahead; he assumed the active braking function would kick in, but it did not, and by the time he slammed on the brakes it was too late.

Strictly speaking, the car does not bear much of the responsibility here, but why Autopilot failed to brake on its own is worth examining. It mainly came down to the environment at the time: the weather was clear, the sun was glaring, and the truck itself was white, which may have kept the system from recognizing it in time and led to a misjudgment.

Nor is there any shortage of news about Teslas on Autopilot hitting fire trucks, street sweepers, sanitation vehicles, or police cars parked at the roadside. Individually these cases may not seem worth dwelling on, but together they point to one thing: Tesla's driver-assistance technology has limited ability to recognize stationary or very slow vehicles, and that limit is determined by its control logic.
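That "control logic" point can be illustrated with a simplified sketch of how radar-based cruise systems are often described in general: returns with near-zero absolute speed are discarded to avoid phantom braking on overpasses and signs, and a genuinely stopped vehicle gets discarded along with them. The threshold, names, and structure below are assumptions for illustration only, not Tesla's code.

```python
from dataclasses import dataclass
from typing import List


@dataclass
class RadarTarget:
    distance_m: float          # distance ahead of the ego car
    relative_speed_mps: float  # target speed relative to the ego car (negative = closing)


def select_braking_targets(targets: List[RadarTarget],
                           ego_speed_mps: float,
                           min_abs_speed_mps: float = 1.0) -> List[RadarTarget]:
    """Illustrative stationary-target filter (an assumption, not Tesla's logic).

    Targets whose absolute speed is near zero are dropped, because stationary
    radar returns are dominated by bridges, signs, and roadside clutter.  The
    side effect is that a truly stopped vehicle in the lane is dropped too.
    """
    kept = []
    for t in targets:
        absolute_speed = ego_speed_mps + t.relative_speed_mps
        if abs(absolute_speed) >= min_abs_speed_mps:
            kept.append(t)
    return kept


if __name__ == "__main__":
    ego = 30.0  # roughly 108 km/h
    targets = [
        RadarTarget(distance_m=80.0, relative_speed_mps=-30.0),  # stopped truck ahead
        RadarTarget(distance_m=60.0, relative_speed_mps=-5.0),   # slower moving car
    ]
    for t in select_braking_targets(targets, ego):
        print(f"tracking target at {t.distance_m} m")
    # Only the moving car is tracked; the stopped truck was filtered out.
```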

In fact, today's semi-automated driver-assistance technology genuinely can relieve the driver to some extent, but it is also very prone to inducing accidents.

That is why Tesla officially keeps stressing that when using Autopilot you must still watch the road and keep your hands on the steering wheel: it can only assist driving, not drive by itself. Yet salespeople and fans keep boasting that Tesla's "autonomous driving" lets you ignore the road completely.

(4) Kungfu's verdict

Any technology takes time to mature. Take the early dual-clutch gearboxes: in principle they should shift faster and transmit power more efficiently, but in practice they were nerve-racking to use and broke easily.

After enough user data had accumulated, the new generation of dual-clutch gearboxes became genuinely good to use; there are still occasional stumbles, but most of the time they are smooth and responsive. The first owners, though, had unwittingly become the guinea pigs.

Tesla's design choices are indeed somewhat radical. Whether they are right or wrong is hard to judge at this point; perhaps, after enough user data has accumulated, the cars will become more and more polished. But early users must be all the more careful.

After all, a small slip by the system may be something a user's family can never undo.

This article comes from a Chejiahao (Autohome creator platform) author and does not represent Autohome's position.