
January 6, 2026 - Nvidia Announces Open Source Automated Driving Model

This is the Telemetry Transportation Daily for January 6, 2026, and I'm Sam Abuelsamid, Vice President of Market Research for Telemetry.  


During a keynote presentation at CES 2026, Nvidia founder and CEO Jensen Huang again focused on his company's efforts around artificial intelligence, including the announcement of a next-generation Rubin GPU chip. But a significant portion of the event was dedicated to automated driving. This included a discussion of how Nvidia has built tools to extract information from driving logs to build driving models and to create synthetic data for training and testing scenarios. Given the nearly infinite variety of conditions that vehicles must operate in and the scenarios they can encounter, this is essential for verifying that the systems work reliably and safely under all conditions. 


Nvidia first got involved in automated driving as a supplier of the GPUs powering most of the systems under development, by everyone from automakers to startups. Starting in 2018, Nvidia also began building its own automated driving software stack and offering those components to the companies using its chips. The combination of what is now a complete automated driving software and hardware solution, from the operating system up, along with a reference set of sensors and compute, became DRIVE Hyperion. While various companies are using elements of DRIVE Hyperion, Mercedes-Benz is the first automaker to launch a vehicle using the entire Nvidia solution for its ADAS and automated driving with the 2026 CLA. 


Huang announced details of the latest version of the Hyperion software, which consists of four main software layers, starting with the Halos safety OS. At the top is the new Alpamayo end-to-end AI world driving model, which handles all of the perception, prediction, and motion planning needed for a vehicle to drive. The Alpamayo world driving model is now open source and available for anyone to use. However, the safety-critical nature of driving means that a single system is not adequate. Alpamayo is underpinned by a safety and policy evaluator that checks that Alpamayo's output won't cause a crash or break traffic rules, similar to the way Mobileye uses its Responsibility-Sensitive Safety (RSS) model. If the safety and policy evaluator determines that Alpamayo's decisions may be a problem, a full classical automated driving stack that is always running in parallel takes over to provide a robust solution. 
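The arbitration pattern described above can be sketched in a few lines. To be clear, this is a conceptual illustration of the general checker-plus-fallback design, not Nvidia's actual API; every name, type, and threshold here is hypothetical.

```python
# Conceptual sketch (not Nvidia's actual API): an end-to-end model's
# proposed trajectory is vetted by a safety/policy evaluator, with a
# classical planner always running in parallel as the fallback.
# All names, fields, and thresholds are hypothetical.

from dataclasses import dataclass

@dataclass
class Trajectory:
    waypoints: list       # hypothetical (x, y) path points
    min_gap_m: float      # closest predicted distance to any obstacle
    obeys_rules: bool     # e.g. respects speed limits and lane markings

def safety_policy_evaluator(t: Trajectory, min_safe_gap_m: float = 2.0) -> bool:
    """Reject any plan that gets too close to an obstacle or breaks rules."""
    return t.min_gap_m >= min_safe_gap_m and t.obeys_rules

def arbitrate(e2e_plan: Trajectory, classical_plan: Trajectory) -> Trajectory:
    """Use the end-to-end model's plan only if the evaluator approves it;
    otherwise hand control to the always-running classical planner."""
    return e2e_plan if safety_policy_evaluator(e2e_plan) else classical_plan

# Example: an end-to-end plan that cuts too close to an obstacle is
# overridden by the classical fallback plan.
risky = Trajectory(waypoints=[(0, 0), (10, 0)], min_gap_m=0.5, obeys_rules=True)
safe = Trajectory(waypoints=[(0, 0), (8, 1)], min_gap_m=3.0, obeys_rules=True)
print(arbitrate(risky, safe) is safe)  # True: fallback chosen
```

The key design point, as described in the keynote, is that the fallback planner is not started on demand; it runs continuously in parallel, so a handover never waits on a cold start.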


This fully redundant and diverse approach is necessary for safe automated driving. Starting this quarter, the CLA will feature a hands-off, eyes-on driver assist that works on almost all roads, which Nvidia and Mercedes call Level 2++. Over time, they plan to expand the feature set with the goal of eventually reaching Level 4 capability. 


Thanks for listening.
