Editor: Zhang Wenni
Autonomous vehicles operate on a road in Boao, Hainan province, in June. [Yuan Chen/For China Daily]
Following a series of crashes, better education needed to define each autonomy level
After a string of vehicular accidents involving autonomous driving technology in both China and abroad, experts have called for greater efforts to educate consumers about the technology's current limitations without dampening their enthusiasm for driverless cars' long-term potential.
The comments came after accidents in China, including a fatal crash that allegedly involved the use of a partially automated driving system, and after the US National Highway Traffic Safety Administration opened an investigation into accidents involving Tesla's Autopilot.
Li Xiang, CEO of Chinese electric vehicle startup Li Auto, said in a post on WeChat that companies need to adjust how they talk about autonomous driving technology.
"The terms 'autonomous' and 'assistance' conflict with each other when describing one single status," Li said. He also emphasized that drivers are ultimately responsible for their own safety even when using assisted-driving functions.
"I call on the media and industry bodies to unify the Chinese terminology standard for autonomous driving," Li said, "because users can't understand what Level 2 and Level 3 autonomous driving technology is, which is industry jargon."
Currently, the global automobile industry defines six levels of autonomous driving, from Level 0 to Level 5, according to the degree to which the automated system takes over driving tasks. The standards were developed by the Society of Automotive Engineers, a US-based professional association and standards development organization for engineering professionals.
Level 2, or L2, includes features such as lane-keeping assistance, adaptive cruise control and emergency braking. Taken together, these features mean a car can automate many simple driving tasks, but drivers must remain fully engaged even when so-called Level 2 systems are active.
Level 3, or L3, refers to conditional driving automation, meaning drivers can afford not to pay attention in certain situations. Level 4 means that vehicles can perform all driving tasks under specific circumstances, though human override is still an option. Level 5 is full automation, with no need for any human intervention.
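The taxonomy described above can be sketched as a simple lookup. This is a minimal illustration based on the article's own paraphrase of the levels, not on the official SAE J3016 text; the Level 0 and Level 1 entries are common shorthand the article does not spell out.

```python
# Sketch of the six SAE driving-automation levels as described in the article.
# Descriptions for Levels 0-1 are common shorthand (assumption), not from the article.
SAE_LEVELS = {
    0: ("No automation", "driver performs all driving tasks"),
    1: ("Driver assistance", "a single feature assists the driver"),
    2: ("Partial automation", "car automates simple tasks; driver must stay fully engaged"),
    3: ("Conditional automation", "driver may disengage in certain situations"),
    4: ("High automation", "all driving tasks under specific circumstances; human override possible"),
    5: ("Full automation", "no human intervention needed"),
}

def requires_full_attention(level: int) -> bool:
    """Per the article, systems at Level 2 or below still demand a fully alert driver."""
    return level <= 2
```

The key distinction the article draws falls at the L2/L3 boundary: `requires_full_attention(2)` is `True`, while `requires_full_attention(3)` is `False`.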
Currently, almost all autonomous driving systems available for purchase on cars are Level 2 or below, experts said.
Statistics from the Ministry of Industry and Information Technology show that 30 percent of new vehicles sold in the first half of the year featured Level 2 driver assistance technologies. That means around 3.61 million vehicles that hit the road from January to June have functions that include lane-keeping and cruise control.
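A quick sanity check on these figures, assuming the 3.61 million Level 2-equipped vehicles correspond exactly to the reported 30 percent share:

```python
# Figures taken from the ministry statistics cited above.
l2_vehicles = 3.61e6   # vehicles sold Jan-Jun with Level 2 features
l2_share = 0.30        # reported share of all new vehicles sold

# Implied total first-half new-vehicle sales.
total_sales = l2_vehicles / l2_share
print(f"Implied total first-half sales: {total_sales / 1e6:.1f} million")  # prints 12.0 million
```

This implies roughly 12 million new vehicles sold in the period, consistent with both reported numbers.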
"Though the L0-L5 system is well known in the automobile industry, consumers have not necessarily heard of these levels, and may simply regard autonomous driving as full self-driving technology that allows them to relax while driving," said Jia Xinguang, an independent auto analyst.
In some cases, driver-assistance technology malfunctioned and caused the crashes, but in others, drivers were to blame because they did not take the actions needed to avoid the collisions, Jia added.
He also said that car brands need to reiterate that the systems are not full self-driving technologies and require drivers to be fully alert.
A cautiously optimistic approach is needed when promoting the development of autonomous driving technology, Jia said.
In August, a 31-year-old man was killed when his NIO-branded car, with its Navigate on Pilot function engaged, rear-ended another vehicle on a highway in China.
US electric vehicle company Tesla is also under investigation in the United States following a series of Autopilot crashes. The New York Times reported that the US National Highway Traffic Safety Administration said in June that it was upgrading its preliminary evaluation of Tesla's Autopilot to an engineering analysis, a more intensive level of scrutiny.
The NHTSA said it is aware of 35 crashes that occurred while Autopilot was activated, including nine that resulted in the deaths of 14 people. But it said in June that it had not determined whether Autopilot has defects that can cause cars to crash while engaged. The wider investigation covers 830,000 vehicles sold in the United States.