
Mercedes-Benz addresses Level 3 legalities; lawyers say uncertainty lingers

In the meantime, Mercedes-Benz issued a response in late June that addresses one of the lingering questions: how the company views its liability for crashes or incidents that may occur when Drive Pilot is active.

In California, Nevada and Germany, the first three locations where Mercedes-Benz intends to launch Drive Pilot, “there are well-established legal systems for determining responsibility and liability on roads and highways,” the company told Automotive News in a written statement.

“While they might differ between jurisdictions, they still provide the legal foundation that is the basis of the respective tasks and duties,” the company said.

That’s both nebulous and inadequate, Widen said.

Because the technology is new, the status quo does not necessarily delineate responsibility between computer and human. He cautioned that motorists should not assume they are legally absolved when Level 3 systems are active, nor should they feel reassured by statements made by manufacturers.

Without legal clarity, “then the whole line about relaxing and taking your time back is nothing but air,” he said.

Few precedents exist for how courts might treat cases that arise from Drive Pilot crashes, and the ones that do exist are imprecise comparisons:

  • A Tesla owner awaits trial on a vehicular manslaughter charge in California related to a fatal crash during which his Autopilot feature was engaged. But Autopilot is considered a Level 2 driver-assist system. With those systems, humans always remain responsible for vehicle operations even when the system is engaged.
  • General Motors settled a lawsuit that alleged a vehicle from its Cruise autonomous vehicle subsidiary knocked a motorcyclist to the street in San Francisco, causing injuries. But that involved a Level 4 self-driving test vehicle. With Level 4, human motorists have no role in the driving process.
  • The most direct precedent may arise from Brouse v. United States, a case stemming from a 1947 midair collision between a U.S. Army fighter plane and a small plane over Ohio. Although the fighter was under the control of an autopilot system, the U.S. district court ruled that the human pilot still had an obligation to keep “a proper and constant lookout.”

Motorists face similar exposure when using Level 3 systems unless new laws are written, said Widen, who co-authored a paper with Carnegie Mellon University professor Phil Koopman that proposes rules for attributing liability when computers and humans share control.

“They need a shield law for owners who engage Level 3 automated driving systems,” he said. “You at least want an interim period where the company is on the hook, because you have no evidence that warrants a belief that these systems are safer than a human driver.”
