'Apple will not likely want responsibility and neither will carmakers'
By Kim Bo-eun
A recent string of crashes involving Tesla vehicles in the U.S. is raising questions about the capabilities and safety of today's driverless technologies.
Eleven accidents have reportedly occurred since 2018 while Tesla's Autopilot mode was engaged, including at least one that was fatal. The recurring crashes in vehicles equipped with Autopilot suggest that the technology is far from perfect.
Major car manufacturers, including Tesla, BMW, GM and Hyundai Motor, currently offer level 2 self-driving functions. There are six levels of autonomous driving technology, ranging from 0 to 5, with level 5 being the most complete form of automated driving.
Currently, most automated driving features available on the market require some level of human involvement, which puts them at level 1 or 2. Level 2 requires the driver to keep their hands on the steering wheel, but the vehicle steers, accelerates and brakes on its own during highway driving. Testing is underway for levels 3 and 4 in Korea, but these features are awaiting commercialization.
Accidents involving vehicles below level 3 are dealt with like those of regular vehicles: because these levels do not involve high-level self-driving capabilities, such accidents are largely attributed to human error and the driver is held accountable.
But for levels 4 and 5, artificial intelligence (AI) plays a central role. This means that the fault may lie with the vehicle instead of the driver, which complicates the issue of determining which entity is responsible when an accident occurs.
Car manufacturers as well as tech firms are working to develop the ultimate driverless vehicle, but this difficulty in assigning responsibility is making it hard to commercialize a completely autonomous vehicle for widespread use, on top of high prices and low demand.
Fully autonomous vehicles are forecast to be deployed first in public bus transportation systems, airport shuttle services and logistics, where the risks posed by potential accidents are lower.
Apple is on track to develop its first-generation self-driving electric vehicle (EV), tentatively named the "iCar." Attention has focused on the U.S. tech giant's choice of partners, given the fact that it needs a company to manufacture the vehicle.
Reports say that global carmakers such as Nissan were unwilling to partner with Apple because the iPhone designer refused to include the car manufacturer's name in the vehicle's branding. Apple is also known to have discussed the matter with Hyundai Motor, but talks collapsed. Speculation arose that the discussions fell through for the same reason, but industry insiders say that the responsibility issue was likely the thornier matter.
"Apple will not likely want the burden of responsibility, and neither will carmakers. This issue of responsibility is what would have made discussions difficult and time-consuming," an industry source said.
While companies involved in the software and manufacturing of autonomous cars need to work out these issues, the legal grounds defining responsibility will also have to be laid out. Policymakers and regulators, including those in Korea, are working with carmakers to prepare for the safe deployment of driverless technology on public roads.
Korea's Ministry of Land, Infrastructure and Transport has set up an accident investigation committee to look into cases involving self-driving vehicles.
"For the commercialization of higher levels of autonomous vehicles, setting up a legal basis will be crucial along with the connected infrastructure," said Kang Kyong-pyo, head of connected and automated driving research at the Korea Transport Institute.