For the past year and a half, two hacked white Tesla Model 3 sedans, each equipped with five additional cameras and a palm-sized supercomputer, have been quietly cruising around San Francisco. In a city and an era rife with questions about the capabilities and limits of artificial intelligence, the startup behind the modified Teslas is trying to answer a plain question: How quickly can a company create software for autonomous vehicles today?
The startup, which is coming out of stealth today, is called HyprLabs. Its 17-person team (only eight of them full-time) is split between Paris and San Francisco, and the company is led by Tim Kentley-Klay, co-founder of veteran autonomous vehicle company Zoox, now owned by Amazon, which he abruptly left in 2018. Hypr has raised relatively little money, $5.5 million since 2022, but its ambitions are wide-ranging. Ultimately, it plans to build and operate its own robots. “Think about the love child of R2-D2 and Sonic the Hedgehog,” says Kentley-Klay. “It will define a new category that does not currently exist.”
For now, though, the startup is announcing its software product, called Hyprdrive, which it is touting as a step forward in the way engineers train vehicles to pilot themselves. These kinds of leaps are happening across the robotics space thanks to advances in machine learning that promise to reduce the cost of training autonomous vehicle software and the amount of human labor involved. This evolution in training has brought new momentum to a space that suffered for years through a “trough of disappointment,” as technology developers failed to meet their own deadlines for operating robots in public spaces. Now robotaxis pick up paying passengers in a growing number of cities, and carmakers are making ambitious new promises about bringing autonomous driving to customers’ passenger cars.
But using a tiny, agile, low-cost team to go from “driving pretty well” to “driving much more safely than a human” is itself a steep hurdle. “Hand on heart, I can’t say it’s going to work,” Kentley-Klay says. “But what we’ve built is a really solid signal. It just needs to be scaled up.”
Old technology, new tricks
HyprLabs’ software training technique is a departure from other robotics startups’ approaches to teaching their systems to drive themselves.
First, a little background: For years, it seemed as if the big battle in autonomous vehicles was between those who train their software using only cameras – Tesla! – and those who also rely on other sensors, including radar and once-expensive lidar – Waymo, Cruise! But larger philosophical differences swirled beneath the surface.
Camera-only advocates like Tesla wanted to save money by planning to launch a giant fleet of robotaxis; for a decade, CEO Elon Musk’s plan has been to switch all of his customers’ cars to autonomous driving overnight via software updates. The advantage was that these companies had a lot of data at their disposal, as their not-yet-autonomous cars collected images wherever they went. This information was fed into a so-called “end-to-end” machine learning model trained through reinforcement. The system takes in images – a bicycle! – and spits out driving commands – turn the steering wheel to the left and ease off the accelerator to avoid a collision. “It’s like training a dog,” says Philip Koopman, an autonomous vehicle software and safety researcher at Carnegie Mellon University. “At the end you say, ‘Bad dog’ or ‘Good dog.’”
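Koopman’s dog-training analogy can be sketched in a few lines of code: an end-to-end policy maps raw pixels straight to a steering command, and a scalar reward (“good dog” or “bad dog”) nudges the weights. What follows is a deliberately toy illustration with made-up data – a four-pixel “camera,” a linear policy, and a random-perturbation update – not HyprLabs’ or Tesla’s actual system.

```python
# Toy sketch of "end-to-end" training with a reward signal: pixels in,
# steering command out, weights nudged by a scalar "good dog / bad dog"
# score. All names and numbers are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(0)

def drive(weights, image):
    """Map a flattened image directly to a steering command in (-1, 1)."""
    return np.tanh(image @ weights)

def reward(steering, obstacle_side):
    """+1 ("good dog") for steering away from the obstacle, -1 otherwise."""
    return 1.0 if steering * obstacle_side < 0 else -1.0

def make_image(obstacle_side):
    """Fake 4-pixel camera: the obstacle brightens one side of the frame."""
    img = rng.normal(0.0, 0.1, 4)
    if obstacle_side > 0:
        img[2:] += 1.0   # obstacle on the right -> right pixels bright
    else:
        img[:2] += 1.0   # obstacle on the left -> left pixels bright
    return img

weights = np.zeros(4)
for _ in range(500):
    side = rng.choice([-1.0, 1.0])       # where the "bicycle" appears
    img = make_image(side)
    noise = rng.normal(0.0, 0.5, 4)      # explore: randomly perturb the policy
    r = reward(drive(weights + noise, img), side)
    weights += 0.05 * r * noise          # reinforce perturbations that scored well

# After training, the policy should steer away from the bright side:
# negative steering for a right-side obstacle, positive for a left-side one.
print(drive(weights, np.array([0.0, 0.0, 1.0, 1.0])))
print(drive(weights, np.array([1.0, 1.0, 0.0, 0.0])))
```

The point of the toy is the shape of the loop, not the numbers: no human writes rules about bicycles or steering angles; behavior emerges solely from images, commands, and the after-the-fact “good dog / bad dog” score.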
