All the data self-driving cars take in from cameras looks like this

Robo-cars are picking up everything their sensors see, hear, and detect.
By Sasha Lekach
What to do with all that data from LiDAR sensors on autonomous vehicles? Credit: DAVID MCNEW/AFP/Getty Images

Self-driving cars are almost too observant, taking in information from light-emitting LiDAR sensors, radar equipment, microphones, and cameras. But all the information a car gleans from the outside world still has to be wrangled to be useful.

Cruise's fleet of self-driving cars testing in San Francisco takes in petabytes of data each month from its sensor suite, both on the road and in simulation, a setup similar to the configurations other self-driving car companies run on their autonomous vehicles. A petabyte is a million gigabytes, by the way.

So to corral all this information, Cruise -- through a hackathon event -- created an open-source data visualization platform called Webviz. Other autonomous vehicle companies have opened up different pieces of the self-driving process, like Baidu's Apollo open-source autonomous driving platform. Now Cruise is opening up its application to anyone who works with robotics.


With Webviz, engineers can make sense of autonomous vehicle data, analyze what the cars are doing out on the streets, and help decide how the cars should drive or approach different situations. Even though some aspects are specific to robo-cars, Cruise says anyone in the robotics community can use the program.

So someone who works with a delivery bot, or a humanoid robot mimicking human movement, can plug in data from their cameras and sensors, lay it out, and visualize it for further analysis and interpretation, just like autonomous vehicle teams do.
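As a rough illustration of what "plugging in" sensor data looks like, here is a minimal Python sketch that bundles readings from different sensors into a timestamped message stream. The `{topic, stamp, payload}` record shape and the topic names are hypothetical stand-ins for the ROS-style messages that visualization tools in this space typically consume; it is not Webviz's actual input format.

```python
import json
import time

def make_message(topic, payload, stamp=None):
    """Bundle one sensor reading into a timestamped record.

    This record shape is a hypothetical stand-in for the ROS-style
    messages that robotics visualization tools typically replay.
    """
    return {
        "topic": topic,
        "stamp": stamp if stamp is not None else time.time(),
        "payload": payload,
    }

# Collect a few readings from different sensors, as a robot might log them.
stream = [
    make_message("/lidar/points", {"num_points": 120_000}, stamp=1.05),
    make_message("/camera/front", {"width": 1920, "height": 1080}, stamp=1.00),
    make_message("/radar/tracks", {"tracks": 3}, stamp=1.10),
]

# Sort by timestamp and serialize one message per line, so a
# visualization front end could replay the stream in order.
log = "\n".join(json.dumps(m) for m in sorted(stream, key=lambda m: m["stamp"]))
print(log)
```

The point is just that heterogeneous sensors (camera, LiDAR, radar) reduce to a single time-ordered stream of topic-tagged messages, which is what makes a general-purpose tool like Webviz usable beyond self-driving cars.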

Cruise says it uses the platform to watch simulations live or to examine past rides from an older data set. Here's a live demo to see how the data is displayed.


Cruise previously open-sourced its 2D and 3D scene rendering library, Worldview, and Uber made its own tool, Autonomous Visualization System, publicly available around the same time back in February, also for turning self-driving data into 3D scenes.

Anyone who wants to start looking through their robotics data can now use Webviz.
