Rivian Is Going All-in on Automated Driving
- Sam Abuelsamid
- Dec 11
- 7 min read
Rivian held its first-ever AI and Autonomy day today at its software engineering facility in Palo Alto, California, and it has set its sights directly on Elon Musk and Tesla. Like Tesla, the upstart EV maker wants to bring automated driving to the masses, and it is largely relying on in-house capabilities to do it. Unlike Tesla, Rivian hasn’t deluded itself into thinking it can do it on the cheap with cameras only.
Custom Silicon
There’s an interesting trend happening in certain parts of the auto industry. Like many recent trends, it started with Tesla and spread to China. High-performance assisted and automated driving software needs a lot of computing power. Most automakers in the last decade have taken one of two paths to get there: Mobileye or Nvidia.
Mobileye pioneered much of what we now call advanced driver assistance systems (ADAS), and it has a distinct approach: it designs its own silicon, optimized to run its own software very efficiently. It works with many of the world’s automakers, and over 200 million vehicles have been produced with Mobileye ADAS. What Mobileye doesn’t do is sell either its silicon or its software independently of the other.

Nvidia and other vendors, such as Qualcomm, have taken a very different approach. They have developed a series of system-on-chip (SoC) designs based on their graphics processing unit (GPU) architectures. Nvidia started with GPUs for gaming PCs in the 1990s, but its chips now run in all manner of devices, from robots to cars to data centers, and AI companies all over the world are scrambling to buy its top-of-the-line GPUs to run their models. In automotive, Nvidia sells its SoCs to suppliers and automakers, but it has also developed an array of software, including DRIVE Hyperion, a full automated driving system (ADS) stack. Automakers can take all or parts of that stack, choosing which pieces they want from Nvidia, which they want to develop in-house, and which they want from other suppliers. It’s a completely à la carte offering.
Tesla started using Mobileye’s silicon and software when it debuted Autopilot in 2015. But after a fatal crash in May 2016, Mobileye cut its ties with Tesla for misusing its system. Tesla then adopted Nvidia chips for the next couple of iterations of Autopilot, starting in October 2016. Then in 2019, Tesla launched its own custom-developed AI accelerator chips and migrated its Autopilot and, later, “full self-driving” software to the new compute platform.
More recently, a similar pattern has played out in China, where many of the EV startups, including Nio and Xpeng, were early adopters of the Nvidia Orin SoC. But this year, both of those automakers and others have started to move away from Nvidia, either designing custom chips or adopting other Chinese vendors like Horizon Robotics or Huawei. Interestingly, even Honda, which, like other Japanese automakers, has traditionally been a bit more conservative, announced last January at CES that it would be using a custom chip developed with Renesas in its upcoming 0 Series EVs from 2026.
Now it’s Rivian’s turn. When Rivian launched the second generation of the R1T and R1S in mid-2024 with a new electrical and electronic architecture, it switched its ADAS from chips supplied by Ambarella to a configuration with dual Orin SoCs. However, by the end of 2026, it plans to use an in-house developed SoC it has dubbed the Rivian Autonomy Processor (RAP1).

As part of the Autonomy Compute Module 3 (ACM3), Rivian is installing two RAP1 SoCs and claims the module can process 1,600 trillion operations per second (TOPS), or 800 TOPS per chip. For comparison, the Nvidia Orin delivers 254 TOPS, and Nvidia’s latest-generation SoC, the Thor, delivers 1,000 TOPS. Honda’s SoC with Renesas is claimed to deliver 2,000 TOPS, while Tesla’s latest Hardware 4 SoC is believed to deliver 300-500 TOPS.
Downside of Custom Silicon
While developing custom silicon can, as it does for Mobileye, allow a chip to be optimized for maximum performance and efficiency when running custom code, it has some potential downsides. When a company like Qualcomm, MediaTek, or Apple develops chips for mobile devices and computers, it is selling hundreds of millions to upwards of one billion units a year.
The entire global automotive market is fewer than 100 million units a year. Since it started delivering vehicles in 2021, Rivian has built fewer than 200,000 units. Success in the chip market generally requires a lot of scale, and for automakers like Rivian, Xpeng, and Nio, that scale just isn’t there. That means the cost of each chip can potentially be much higher than buying something off the shelf.
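The scale economics are easy to make concrete: a chip's one-time design cost (non-recurring engineering, or NRE) gets amortized over every unit shipped. The figures below are entirely hypothetical, chosen only to show the effect of volume, and are not Rivian's or anyone else's actual costs.

```python
# Amortizing a hypothetical one-time chip design cost (NRE) over volume.
# All numbers are made up for illustration.

def per_chip_cost(nre_usd: float, unit_cost_usd: float, volume: int) -> float:
    """Effective cost per chip: amortized NRE plus per-unit manufacturing cost."""
    return nre_usd / volume + unit_cost_usd

# The same hypothetical $200M design effort spread over 200k vehicles
# versus 100M smartphones:
print(per_chip_cost(200e6, 50, 200_000))      # 1050.0
print(per_chip_cost(200e6, 50, 100_000_000))  # 52.0
```

At automaker volumes, the amortized design cost dwarfs the manufacturing cost; at smartphone volumes, it nearly vanishes.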
On the other hand, Nvidia’s SoCs are designed for a wide range of applications and often contain many elements that aren’t needed for the automotive application. As a result, they may not be as energy efficient as something more optimized. That’s a real issue for electric vehicles in particular, especially if you plan to run a full ADS with a lot of sensors. An inefficient chip can eat into range, so a more costly chip built in lower volumes may be worthwhile.
Expanding the Automated Domain
So what is all of this computing power going to be used for? To deliver greater levels of automation. Rivian is developing an entirely new ADAS/ADS software stack that is based around a large driving model (LDM). The concept is similar to large language models, where you feed the model huge amounts of real-world driving data to train it. Motional has adopted a similar approach for the latest version of its ADS, and while Tesla hasn’t used the term LDM, what they have described seems conceptually similar.
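Rivian hasn't disclosed the LDM's architecture, but the core idea of learning a driving policy from logged human data can be pictured as simple behavior cloning. Everything below, the two-feature state, the linear policy, the made-up log, is a toy illustration, not Rivian's stack.

```python
import numpy as np

# Toy behavior cloning: learn a policy mapping sensed state to a driving
# action from logged "human" data. Purely illustrative -- Rivian has not
# published its model's design.

rng = np.random.default_rng(0)

# Fake driving log: state = [speed_error, lane_offset], one control output.
states = rng.normal(size=(1000, 2))
true_policy = np.array([[0.5], [-1.2]])  # the behavior we want to imitate
actions = states @ true_policy + rng.normal(scale=0.01, size=(1000, 1))

# Fit a linear policy by least squares -- the simplest possible "training".
learned, *_ = np.linalg.lstsq(states, actions, rcond=None)

print(np.round(learned.ravel(), 2))  # close to [0.5, -1.2]
```

A real large driving model replaces the linear map with a deep network and the two-feature state with raw sensor data, but the training principle, imitate the logged behavior, is the same.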
The key to using such an approach is to not just dump in all of the data you can get, but also teach the model what not to do. While Rivian hasn’t yet provided full details at the time of writing this article, it is also using Group-Relative Policy Optimization (GRPO), which it describes as distilling superior driving strategies from the massive datasets. Hopefully, this means it will have policies like stopping for school buses with flashing lights and stop signs.
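Rivian hasn't published its GRPO formulation, but as described in the research literature (the technique originated in LLM training), GRPO scores each sampled rollout relative to the average reward of its group, reinforcing the better-than-average ones. A minimal sketch with made-up reward numbers:

```python
import numpy as np

# GRPO's core step: advantage = (reward - group mean) / group std.
# Rollouts that score better than their peers get a positive advantage
# and are reinforced; worse-than-average rollouts are suppressed.
# Illustrative numbers only -- not Rivian's reward model.

def grpo_advantages(rewards, eps=1e-8):
    r = np.asarray(rewards, dtype=float)
    return (r - r.mean()) / (r.std() + eps)

# Four candidate driving rollouts for the same scene, scored by a
# hypothetical reward model (e.g. penalizing a missed school-bus stop).
rewards = [1.0, 0.2, -1.0, 0.6]
adv = grpo_advantages(rewards)
print(np.round(adv, 2))
```

The group-relative normalization is what lets the method rank behaviors without an absolute notion of "good driving": only better or worse than the other samples in the group.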
Multi-Modal Sensing
The second-generation R1 models already adopted much higher-resolution cameras and imaging radar sensors. But contrary to the claims of Tesla, which relies on cameras alone, even the R1 sensor suite isn’t enough to safely go to eyes-off capability. Currently, Rivian is offering hands-off, eyes-on ADAS on divided highways.

The midsize Rivian R2 SUV is targeted to go on sale by the middle of 2026, but buyers might want to consider waiting until closer to the end of the year. That’s when Rivian plans to add the ACM3 computer and a lidar sensor. The lidar will be mounted at the front edge of the roof above the windshield. However, instead of sitting in a lump on top of the roof, the unit is mounted more flush, with a notch at the top of the windshield to accommodate the new sensor.
Having multiple sensing modes is essential to provide a robust and safe ADS solution. Each of the different sensor types has strengths and weaknesses, and they are all complementary to each other. Cameras are excellent for object classification, but they perform poorly in low light or when the sun shines directly into them. Unless they are configured for stereoscopic vision, they are also poor at determining the distance to an object.
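The stereoscopic point is worth making concrete: with two cameras, distance falls out of triangulation as Z = f·B/d, where f is the focal length in pixels, B the baseline between the cameras, and d the pixel disparity. The numbers below are illustrative, not Rivian's camera specs.

```python
# Depth from stereo disparity: Z = f * B / d. A single camera has no
# disparity to work with, which is why monocular distance estimates are
# weak. Illustrative values only, not any production camera's parameters.

def stereo_depth_m(focal_px: float, baseline_m: float, disparity_px: float) -> float:
    """Distance to an object from the pixel disparity between two cameras."""
    if disparity_px <= 0:
        raise ValueError("zero disparity means the object is at infinity")
    return focal_px * baseline_m / disparity_px

# A pedestrian whose image shifts 20 px between cameras 30 cm apart,
# with a 1000 px focal length:
print(round(stereo_depth_m(1000, 0.30, 20), 2))  # 15.0 (meters)
```

Note how the estimate degrades with range: at long distances the disparity shrinks toward a fraction of a pixel, so small measurement errors translate into large depth errors, which is part of why radar and lidar remain valuable.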
Radar can detect through atmospheric obstructions like rain, snow, and fog, and lighting conditions are irrelevant to it. Like cameras, radar is relatively inexpensive. But even the best imaging radar has much lower resolution than cameras or lidar.
Lidar provides resolution between that of cameras and radar, works in all lighting, and, with modern software, can even cope with rain and snow. It has traditionally been considerably more expensive, but in the last few years the cost of solid-state lidar has come down substantially, and some of the latest units from companies like Hesai are under $200.
Rivian provided a video demonstrating the detection capabilities of cameras only, camera+radar, and camera+radar+lidar in various conditions, and the differences are striking. With all three sensor types, visibility of pedestrians is dramatically improved, especially in low light, and objects in nearly the same plane can be more readily distinguished.
What Is Coming?
The new Rivian Autonomy Platform software is designed to be scalable so that elements of it can be applied to both existing and future vehicles. Starting in early 2026, Rivian will offer an Autonomy+ upgrade for $2,500 as a one-time purchase or for a $49.95 monthly subscription.
On the existing second-generation R1 models, this will bring what Rivian is calling Universal Hands-Free (UHF). This will expand hands-off, eyes-on capabilities to more than 3.5 million miles of roads in the U.S. and Canada. That should allow hands-off operation on almost all roads with clearly painted lane markings. This still isn’t full point-to-point hands-off operation, but that is on the roadmap, and if delivered, it would effectively be what GM promised several years ago for its since-cancelled Ultra Cruise system.

For the R2 with lidar, the goal is to eventually deliver hands-off, eyes-off or Level 3 capabilities and ultimately personal Level 4 or hands-off, eyes-off, brain-off. Initially, that L4 will probably be geofenced, likely to highways, and Rivian isn’t making any promises about turning customer vehicles into robotaxis. This is more like allowing drivers to take a nap on a road trip. Robotaxi-type operations really require surround lidar in addition to the cameras and radar.
It’s unclear at this point if or when the R1 will get an update with lidar and ACM3, although it does seem likely eventually.
AI Assistants
Almost every automaker is promising to bring some degree of AI assistant into its vehicles in the near future if they haven’t already. Most vehicles with Google Automotive Services (GAS) built in will have Google Assistant replaced by the Google Gemini model sometime in 2026.
Rivian’s infotainment system also runs on Android but doesn’t use GAS; instead, it will run the Rivian Unified Intelligence (RUI) system. Rivian will be releasing new models that run locally on the vehicle, using multiple LLMs to provide voice assistance and control, an agentic framework to connect apps like Google Calendar, and predictive maintenance. The AI system is also intended to help service technicians with diagnostics. It will be particularly interesting to watch how often it leads techs to non-existent problems.
Rivian’s vertically integrated approach to software development certainly gives it the ability to move faster than most legacy automakers, which is why VW invested and formed a joint venture to leverage that software for future vehicles. But not every decision Rivian has made has actually been beneficial to the user experience. The company’s insistence on putting almost all controls into the touchscreen interface, including adjustment of vents and climate control, as well as its use of electronic door latches, are both examples of taking tech too far.
Hopefully, Rivian takes the time to properly test the new technologies it’s planning to roll out over the next couple of years, since much of it is safety-critical. The fact that it is going with things like multi-modal sensing is a good sign, but it’s only a first step.