Hard to believe the Ride AI summit was almost two weeks ago. If you missed any part of the event, don’t worry. Most of the talks are already available on our YouTube channel—and the rest will be uploaded soon. (Plus, here is a great summary from MotorTrend).
🌱 The full-spectrum challenge of deploying Waymo robotaxis
🔬 How AI breakthroughs are pushing the frontiers of automation in personal vehicles
🚛 The technology and tactics of autonomous trucking
🤳 What AVs can learn from the last decade of gig mobility
💰 Lessons learned and new opportunities for investors in the AI era
🌏 Inside China’s revolution in electric and autonomous vehicles
Zoox has announced the start of testing in Los Angeles. The company will first perform human-driven mapping missions with its fleet of retrofitted Toyota Highlander vehicles before moving on to automated testing with safety drivers, and eventually driver-out testing with custom-built vehicles.
While the California DMV has not yet updated its website with Zoox’s Los Angeles permits, Zoox says it will deploy a “small number” of test vehicles in a “small area” of the city. The Amazon-backed company also says it will welcome its first public riders in Las Vegas and San Francisco “later this year.”
Last year, I visited Zoox’s depot in San Francisco, where the company was conducting driver-out tests of its custom vehicle. I even got to examine one up close during an open house event. While I was impressed by how thoughtfully designed the vehicle is, I had some questions about the material choices and what the ride experience would actually feel like—especially with half the seats facing backward. As always, a ride report will follow… whenever they let the public try one in motion.
Hesai has secured a 200K-unit LiDAR order from Leapmotor, a Chinese EV manufacturer, for its upcoming models. The order represents almost half of Hesai’s 2024 LiDAR shipment volume, which the company estimates will more than double to 1.2M units in 2025.
Leapmotor is not the first car company to integrate Hesai’s LiDAR sensors. Li Auto, a popular luxury car brand in China, also uses them; and in the ride-hailing field, many of the major service providers, including Baidu’s Apollo Go, Pony.ai, WeRide, and Zoox, do too.
Waymo held a media event in Tokyo this week to commemorate the start of its Tokyo mapping efforts in collaboration with GO, Japan’s leading taxi platform, and Nihon Kotsu, Tokyo’s largest taxi company. Some notable points from this press event:
I, for one, am really looking forward to the Japanese in-car Waymo collaborations. With Japan’s tendency to have notable actors provide voice acting for the most random products, I can see a world in which we start getting anime characters welcoming riders into the car…
South Korean autonomous delivery startup Neubility and Samsung C&T have started offering robot delivery services outdoors… and indoors. This is a departure from more traditional robot delivery services, which have remained either entirely outdoors or entirely indoors. Neubility’s robots are also a departure from the norm: they rely on camera-based VSLAM instead of LiDAR and GPS to map their surroundings.
I think Neubility’s concept of delivery robots being able to seamlessly transition from outdoors to indoors is extremely interesting, and I’m very curious how it will pan out. It’s a difficult proposition after all. Not only does the robot need to understand where it is outdoors and adapt to a variety of weather, traffic, and terrain, but it also needs to be small and nimble enough to fit in the often tight spaces of Korean high-rises.
One benefit of Neubility’s system is its integration with building elevators, although its promo video from earlier this year doesn’t explain which elevators it works with. It also remains to be seen how well the bot can navigate outdoors without GPS. One thing is for sure… that robot is so cute! Just look at its bug eyes!
As the race toward autonomous driving intensifies, automakers are increasingly torn between building in-house and partnering with proven suppliers. Nimrod Nehushtan, EVP of Strategy and Business Development at Mobileye, is at the heart of this industry inflection point. We caught up with him to explore how Mobileye is evolving its value proposition for OEMs, why scalable autonomy demands a different technical approach than end-to-end learning, and how geopolitical shifts in China and India are shaping go-to-market strategy. Check out the interview below, or go deeper with the full podcast episode.
As more automakers seek to develop or acquire their own ADAS and autonomous tech, how are you adapting your value proposition to convince OEMs that an out-of-house solution will keep them competitive and differentiated in the long run?
Automakers regularly weigh the tradeoffs between building in-house and working with a supplier, especially as new technologies emerge. As next-gen ADAS and autonomy move closer to wide deployment, the core priorities remain the same: high performance, cost efficiency, and the ability to scale across vehicle segments. Striking the right balance is where Mobileye’s expertise delivers a clear advantage.
Mobileye helps automakers meet these priorities through a flexible, scalable platform built for commercialization. Our EyeQ platform supports a broad range of capabilities from basic safety functions to hands-free driving and full autonomy—all running on a unified architecture. Shared technology enablers, including extremely efficient AI, underpin a modular product portfolio that allows automakers to tailor systems to different models while narrowing the gap between development and deployment.
Execution experience also matters. Mobileye technology is in over 200 million vehicles today. We have a strong track record of delivering safety-critical systems on time and on cost. In China, hundreds of thousands of vehicles equipped with Mobileye SuperVision have received feature upgrades via OTA, including Mobileye-enabled navigate-on-pilot.
At the same time, we understand the importance of differentiation. That’s why we introduced Mobileye DXP, which lets automakers fine-tune driving characteristics—within safety parameters—while relying on Mobileye’s core perception, compute and safety validation. This gives automakers greater control over the driving experience without needing to rebuild core systems from scratch.
As pressure grows to deliver competitive safety and comfort features at scale, Mobileye offers a more predictable, lower-risk path to deploying autonomous driving solutions.
You’ve worked on partnerships in trucking and robotaxis, too. How much of Mobileye’s future growth do you think will come from these commercial vehicle sectors versus passenger cars?
This question relates to the difference between autonomy for privately owned vehicles and for fleet-operated mobility-as-a-service models. We believe both markets can be served by using the same core technology building blocks, with variations based on use case. For example, MaaS typically involves no driver in the vehicle and is concentrated in urban environments. Therefore, our Mobileye Drive solution uses additional sensors, additional compute, and tele-operations—elements not required for Chauffeur, our eyes-off system for consumer-owned vehicles where drivers gain time back but are still available to take control if outside the defined ODD.
Despite these differences, much of the underlying technology is shared. This common architecture drives down costs, improves generalization across geographies, and offers a clear scalability advantage.
The MaaS segment has tremendous potential. Today, the percentage of miles driven autonomously remains close to zero; even reaching 5% or 10% of total miles driven would be transformational. As technology matures, so will the business models. We are building an ecosystem with industry leaders across the value chain: Mobileye provides the self-driving system; OEMs like VW ADMT and Holon provide purpose-built AVs; and mobility providers like Lyft and MOIA contribute demand.
This collaborative approach positions us – and our customers – to scale when the market is ready. In North America, for example, Lyft recognizes demand for convenient, on-demand AV services and is working to make this a reality. We are actively testing a new AV platform in Austin and Detroit to support that vision. Because a variety of form factors is crucial to scale, we work with Volkswagen, Holon, Verne and others on vehicles that can seat from 2 up to 15 passengers, depending on the specific use case.
Importantly, the AI stack developed for fully autonomous MaaS applications is already being applied to our broader product portfolio – including Surround ADAS, SuperVision and Chauffeur. For example, the same driving policy that governs Mobileye Drive in robotaxis also supports hands-free highway driving. This shared technology backbone means better-performing products and faster development cycles.
In an era when AI-driven autonomous systems feed on data, how will Mobileye leverage its REM data to compete with data-rich rivals like Tesla’s fleet learning? In other words, what’s Mobileye’s plan to demonstrate that its approach can deliver safer and more reliable self-driving faster than end-to-end machine learning approaches used by competitors?
The fundamental challenge we are solving for is self-driving that is significantly safer than a human driver, scalable across geographies, and cost-effective enough for mass adoption.
A purely data-driven approach struggles to meet all three goals. Whether it’s distinguishing rare-but-correct behaviors, adapting across markets without extensive retraining, or ensuring explainable safety performance, it’s a difficult case to prove. The massive cost required to train and maintain these systems is another significant challenge.
Mobileye takes a different approach. Our Compound AI System integrates AI breakthroughs like end-to-end learning within a robust, engineered architecture. Rather than one model trying to do everything, we deploy multiple specialized subsystems working together. These independent subsystems cross-validate, creating a safer, more predictable system. The bottom line is we engineer our system so that two out of three subsystems analyzing a specific task must fail for the system to fail.
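For intuition on why that 2-of-3 cross-validation matters, here is a rough sketch of my own, not Mobileye’s math: if each subsystem were to mishandle a given task independently with probability p, the combined system would fail only when at least two of the three do, which drives the failure rate down to roughly 3p² for small p. The independence assumption and the example failure rates below are purely illustrative.

```python
from math import comb

def two_of_three_failure(p: float) -> float:
    """Probability that at least 2 of 3 subsystems fail on a task,
    assuming each fails independently with probability p (illustrative only)."""
    return sum(comb(3, k) * p**k * (1 - p) ** (3 - k) for k in (2, 3))

# Example: if a single subsystem mishandles 1 in 1,000 tasks,
# a 2-of-3 architecture mishandles roughly 3 in 1,000,000.
for p in (1e-2, 1e-3, 1e-4):
    print(f"single-subsystem failure rate {p:.0e} -> system failure rate {two_of_three_failure(p):.2e}")
```

Real subsystems are unlikely to fail fully independently, so the gain in practice would be smaller, but the sketch captures why cross-validating subsystems can outperform any single model.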
We believe reaching sufficient safety will require more than massive datasets and the resources available to train them. To solve the holy grail goal of safe, scalable, cost-efficient autonomy, self-driving systems should be both dynamic in intelligence and engineered for reliability.
Chinese automakers are rapidly advancing their own ADAS capabilities with domestic suppliers. How are you adapting your strategy for a world of geopolitical tech rivalry where there's increasing pressure to use local technology providers?
The entire industry has had to adapt to a new landscape in China. Already this year, many automakers have rolled out broad new menus of ADAS and hands-free services across their product lines, because they see them as a key competitive feature in the market. That’s an important point not just for China but for the rest of the world.
China served as a proving ground for Mobileye SuperVision—we deployed one of the first L2+ systems capable of navigating on pilot across various road types. Our position remains strong with Chinese automakers exporting globally, where high safety ratings and an autonomous system tuned for global markets are essential to success.
India is emerging as one of the most important ADAS growth markets. It is the third-largest automotive market and is seeing rapid ADAS adoption, with safety technology becoming a key purchasing factor. Consumer demand, regulatory momentum, and industry investment are driving growth faster than anticipated. Since ADAS was first introduced only a few years ago, the market is already maturing from passive to active safety and L2+.
We expect these trends to continue and have expanded our local presence to support domestic automakers.
As software-defined vehicles become the norm, how does Mobileye plan to generate long-term recurring revenue? Is the business shifting towards a SaaS or licensing model, or will hardware continue to be the primary revenue driver?
Mobileye’s business is built on delivering high-performance, cost-effective driver-assist and autonomous systems at scale. While our revenue today is primarily driven by EyeQ-based ADAS, software has always been, and will continue to be, a key driver of both differentiation and growth.
Efficient, purpose-built hardware remains critical. But increasingly, value is coming from the software and functionality. The expansion of the content per vehicle we deliver today is evidence of this.
A strong example is our Surround ADAS product, with VW as our first customer. This is a mid-trim solution that goes beyond a front-facing camera by incorporating surround vision, REM mapping and our driving policy software to enable hands-free highway driving. Surround ADAS represents roughly 3-4x higher per-vehicle revenue than base ADAS due to its highly advanced software-enabled capabilities. The demand we’re seeing underscores a broader trend: OEMs are investing in higher-value systems, even for mass market segments.
At the higher end, products like SuperVision, Chauffeur, and Drive represent even greater growth opportunities, supporting hands-free everywhere and eyes-off driving.
The move toward SDVs isn’t about replacing hardware—it’s about increasing intelligence in the vehicle. Mobileye is well-positioned to support this shift.