
'Self-driving' cars are changing Utah roads—sometimes with fatal consequences, lawsuit alleges


Francisco Kjolseth // The Salt Lake Tribune


The site of a fatal crash that happened along Interstate 15, in the area of 14800 South in Draper.

Around midnight on Pioneer Day two years ago, a pair of 21-year-olds—a man and a woman—decided to make a trip from Davis County to Utah County to retrieve the man’s “church clothes,” presumably to wear hours later that Sunday morning, The Salt Lake Tribune reports.

The man, according to a recently filed lawsuit, was “tired and not in a condition to drive as an ordinarily prudent driver.” The woman also couldn’t drive because she was tired and suffering from a headache, according to the lawsuit.

But, the lawsuit contends, the “utility” of the man’s church clothes outweighed any “risk of harm” to the pair or other motorists. And they had the woman’s father’s Tesla—equipped with the company’s trademark “Autopilot” technology—at their disposal.

Driving south at 75 to 80 mph in the HOV lane of Interstate 15, the pair reached the area of 14800 South in Draper around the same time that Landon Embry, on a Harley-Davidson motorcycle, decided to cross the double-white line and merge into the HOV lane. The lawsuit claims Embry, 34, signaled for “several seconds” before changing lanes; he was slowing down as he merged, the Utah Highway Patrol reported.

Once Embry was in the HOV lane, the Tesla caught up to him. It could not stop in time and crashed into the back of the motorcycle, throwing Embry from the bike. He died that night on the highway.

Embry’s parents sued the company and the vehicle’s driver, alleging that the electric car manufacturer “lulled consumers … into a false sense of safety” that its self-driving features would prevent collisions.

The lawsuit, filed in August in 3rd District Court, alleges that neither the driver of the 2020 Tesla Model 3, nor its passenger, were in a condition to drive on July 24, 2022, but got behind the wheel anyway, thinking that the car’s Autopilot feature would “drive the vehicle, keeping them, and those around them safe from the Tesla colliding with other vehicles.”

“[A] reasonably prudent driver, or adequate auto braking system, would have, and could have slowed or stopped without colliding with the motorcycle,” the lawsuit alleged.

The lawsuit accuses the man, the woman, and her father of negligence, and claims that Tesla was also negligent and sold a defective and dangerous product.

In addition to this crash, the lawsuit outlines 54 other crashes around the world involving Teslas with Autopilot engaged.

The lawsuit states that Embry’s parents seek punitive damages “sufficient to grasp the attention of the Tesla Defendants … such that they take action to remediate defective Teslas they have placed into the marketplace.”

Indeed, Tesla vehicles account for the vast majority of reported crashes involving cars with level 2 autonomous technology, according to data collected by the National Highway Traffic Safety Administration. With level 2 technology, a car assists with acceleration, braking, and steering, while the driver “remains fully engaged and attentive.”

The agency has collected more than 1,700 such reports since summer 2021 (including some reports for vehicles involved in crashes prior to 2021). Teslas account for more than 1,400 of those crashes. Tesla also sells more cars equipped with such technology than other automakers, The Verge reported in 2022.

Around the time it started collecting crash reports, the agency also launched an investigation into Tesla’s driver-assist technology, which ultimately led to Tesla issuing a recall in December 2023 for more than 2 million of its cars with its basic Autopilot package—including all Tesla Model 3s made between 2017 and 2023.

To remedy the issue, Tesla provided a software update with new controls that monitor drivers, meant to ensure they remain engaged while driving. The administration has since opened an investigation into the efficacy of the recall.

The Utah lawsuit and the tragedy underpinning it are examples of how the rollout of autonomous car technology in the U.S. has shifted the driving paradigm, both in how roads are designed and in how courts determine who is liable for crashes. Meanwhile, experts say, not all drivers fully understand how much their cars can drive themselves and how involved a driver should remain.

The technology is meant to make driving safer, and experts believe it ultimately will, but there are still kinks to work out, said Blaine Leonard, transportation technology engineer with the Utah Department of Transportation.

Representatives from Tesla did not respond to The Salt Lake Tribune’s requests for comment and have not yet filed court papers in response to the lawsuit. Court records indicate the man, woman, and her father named as defendants have not yet retained counsel, and The Tribune was unable to reach them for comment. The Embrys’ attorney also did not provide comment.

Technology Still Requires ‘Undivided Attention of Drivers’

The National Highway Traffic Safety Administration acknowledges that automated driving systems aim to make roadways safer. It envisions a future where automated driving systems “may be able to handle the whole task of driving when we don’t want to or can’t do it ourselves.”

But that isn’t yet a reality.

“At this time,” the administration said on its website, “even the highest level of driving automation available to consumers requires the full engagement and undivided attention of drivers.”

The website breaks down automation from level 0 to level 5. In the United States, the highest level available to buy is level 2, called “additional assistance.” (Some states permit higher-level self-driving vehicles on roads for testing, research, and pilot programs.)

The administration found in its investigation of Tesla, according to an update released in April, that the name of the company’s driver-assistance technology was misleading and “elicits the idea of the drivers not being in control.”

“This terminology may lead drivers to believe that the automation has greater capabilities than it does and invite drivers to overly trust the automation,” according to the administration. Other companies tend to use more conservative terminology, the report found, such as “assist,” “sense,” or “team,” to hint at a driver’s necessary involvement.

Leonard said that, regardless of how the systems are named, he’s noticed drivers aren’t always educated on how they work and how they should be used.

“So I get in a rental car, and I don’t know exactly what the car is capable of doing, or what [the system is] called, or how I turn it on or how I turn it off,” he said.

Yet Leonard also urged restraint, saying crashes involving cars with autonomous technology “kind of get blown out of proportion in the media.”

Last year, according to National Highway Traffic Safety Administration estimates, 40,990 people died in vehicle crashes. In the agency’s dataset of level 2 technology crashes, which spans the years since it began collecting data in 2021, 42 fatalities have been reported.

“I think we’ll figure it out over time,” Leonard said. “It’ll either get better, so you can take your eyes off the road for a while … or we’ll just come to understand these are driver-assistance systems, largely, not driver-replacement systems.”

Leonard said that UDOT began preparing for automated driving technologies more than a decade ago—when Google began testing its self-driving car.

“As agencies, we started talking about, ‘What does this mean for us? How’s this going to work? How fast is it going to come?'” he said.

Leonard said he thought the technology would be fully rolled out by 2020, but it’s been more difficult than he and others in the industry imagined to teach the cars how to properly respond to “niche” situations, such as when they find themselves near emergency responders. He said an autonomous car in San Francisco made the news in June 2022 when it drove over a fire engine’s water hose.

“It doesn’t know what’s happening. It’s not much different than a small limb,” he said. “It doesn’t know there’s water going through that.”

As for how UDOT has had to adjust to this technology, he said the agency mostly focuses on lane striping and has run studies to determine how systems engage with different types of striping, like when newly repaved roads haven’t yet been repainted, or when they have, but remnants of the old lines remain.

The agency is currently experimenting with adding dotted lines across on- and off-ramps to show a car that it’s entering a ramp.

Who Is Liable, if Anyone?

The Utah Highway Patrol investigated the crash that killed Embry and forwarded the case to the Salt Lake County district attorney’s office.

District Attorney Sim Gill said prosecutors decided this summer not to file any charges after investigators learned that the Tesla did register Embry’s motorcycle and tried to stop—nearly a second before the driver pressed the brakes himself—but it couldn’t stop fast enough to avoid Embry, who was apparently slowing as he changed lanes.

“If anything,” Gill said, “the system did what it was supposed to do. It reacted to the hazard sooner than a human being could have reacted.”

Because prosecutors declined to file charges against the Tesla’s driver, The Salt Lake Tribune is not naming the defendants identified in the lawsuit at this point in the litigation. The Tribune typically does not name people accused of crimes unless they have been charged with felony offenses.

It’s not clear why Embry slowed down as he merged, but Gill’s investigators theorized he may have been trying to get to the shoulder to stop and help a stalled car.

Gill called the crash a “tragic accident,” but said his prosecutors couldn’t meet the threshold to prove the driver acted criminally. Gill noted that a civil lawsuit would have a lower burden of proof.

“It is not unrealistic for us to say we can have certain kinds of conduct for which we may not have a criminal remedy, but we may have a civil remedy, because accidents do happen. Machines do malfunction,” Gill said. “But is that the same as the intentional conduct which causes harm, or reckless conduct that causes harm? You have to have a little bit more for the purposes of criminal culpability.”

But what about cases where the respective roles of the car’s technology and the driver’s intervention are less clear?

Gill said those questions haven’t yet been answered.

“I think that’s going to be an interesting space for us to continue to think about what is that responsibility,” he said. “As A.I. continues to encroach into our day-to-day living, when does that criminal accountability to the actor go, [when] can you surrender that?”

According to the National Highway Traffic Safety Administration, “drivers will continue to share driving responsibilities for the foreseeable future and must remain engaged and attentive to the driving task and the road ahead with the consumer-available technologies today.”

But it acknowledged that questions about liability and insurance will continue to arise, and that issues related to fully autonomous vehicles haven’t yet been resolved.

“Policymakers,” the organization said, “are working to address [these issues] before automated driving systems reach their maturity and are available to the public.”

This story was produced by The Salt Lake Tribune and reviewed and distributed by Stacker.

