Lyft self-driving Ford Fusion
Image Credits: Lyft

Lyft is using data from its rideshare drivers to develop self-driving cars

Lyft is using data collected from drivers on its ride-hailing app to accelerate the development of self-driving cars.

Lyft’s Level 5 self-driving car program is using the data to build 3D maps, understand human driving patterns and improve simulation tests — all tools needed to push its autonomous vehicle technology forward, according to a blog post on Tuesday.

The program is taking data from select vehicles in its Express Drive program, which provides rental cars and SUVs to drivers on its platform as an alternative to options like long-term leasing.

These Express Drive vehicles are equipped with forward-facing camera sensors to collect the data. Level 5 also collects data from its autonomous vehicles in Palo Alto, as well as from the safety cars that follow its AVs when they’re testing on public roads. Every driver in the program receives a one-page disclosure detailing information about the camera and the data being collected, a Lyft spokesperson told TechCrunch. The camera isn’t linked to the driver in any way; it’s forward-facing and doesn’t collect audio, the spokesperson added.

Technology developed by Blue Vision Labs, which Lyft acquired in 2018, is used to convert that rideshare driver data into city-scale 3D geometric maps.

“While mapping operations teams can build 3D geometric maps for AVs, keeping them up-to-date and scaling their scope is a challenge,” Lyft’s head of AV research Peter Ondruska, engineering manager Luca Del Pero and product manager Hugo Grimmett wrote in a blog post Tuesday. “We’ve mapped thousands of miles thanks to the wide geographic coverage of the cars on our network. We’re able to continuously update our maps based on a constant stream of data that is immediately logged when a ride is completed.”

Lyft maps
Image Credits: Lyft

The cameras also help collect data on scenarios that drivers face daily. Those scenarios are then used to make Lyft’s simulations more sophisticated and realistic.


The Level 5 team also uses visual localization technology to track the trajectories that Lyft drivers follow on the road. That trajectory information is then used to help Lyft’s AVs maintain the best position in their lane, which isn’t always dead center.
