Saturday, March 7, 2026

“Physical AI” will appear in your car


Physical AI sounds like a contradiction in terms. A computer, but with a body?

But for marketers, it is the latest term of art, a buzzword meant to point us, the public, toward a dazzling and promising technological future.

Here on earth, the term is perhaps most useful because it helps us understand how automotive companies think of themselves today: as technology pioneers. It’s also a useful shorthand for understanding how attractive the automotive industry is for chip companies – a market that could reach $123 billion by 2032, an increase of about 85 percent from 2023. There have always been silly robot demonstrations at CES, the giant consumer technology showcase that just took place in Las Vegas, but this year’s show demonstrated how the worlds of robots, cars and chips are converging.

First, define the (marketing) terms: “Physical AI” is how technology developers ultimately hope autonomous systems will interact with the real world, using data from cameras and sensors to genuinely understand and reason about what’s happening around them and then perform complex tasks in response. Physical AI is the humanoid robots performing everyday work at Hyundai’s factory, which Google DeepMind, Boston Dynamics and the Korean automaker have announced will happen in the coming months. It’s a car that can handle complex driving situations on its own, or perform an arguably harder task: seamlessly transferring control between a human driver and the software. Physical AI, in short, lets autonomous systems such as cameras, robots, and self-driving cars perceive, understand, reason, and carry out or coordinate complex actions in the real world.

It is no coincidence that the companies talking loudest about physical AI are chip manufacturers, including Nvidia and Arm. The former announced a new line of open-source AI models intended for autonomous systems; the latter made its Physical AI debut at CES. Both have a stake in pushing the trend along.

Witness, for example, the parade of autonomy-related announcements at CES, all of which will require massive onboard computing power.

Ford says that by 2028 it will sell a system that allows drivers to take their eyes off the road ahead. Afeela, Sony and Honda’s battery-powered collaboration, will drive itself in most situations at some point (date TBD). Nvidia will deliver the chips for Chinese automaker Geely’s new “intelligent driving system,” which will eventually transition to what the company calls “high-level autonomous driving.” Nvidia is also involved in Mercedes-Benz’s new hands-free driving system, which will debut in the US later this year; the company says the system should eventually be able to travel between home and work without assistance. “This is already a huge business for us,” Nvidia CEO Jensen Huang said of autonomous cars during his CES presentation.

“The central brain of the vehicle will now be much larger – hundreds of times larger – and that’s what [chipmakers] are selling,” says Mark Wakefield, global automotive market leader at the consulting firm AlixPartners. “They see a great future in these vehicles.”

No wonder their marketers found a sexy new way to describe it.
