Technology
Here Are the 2026 Mobility Trends from Las Vegas!
In the second week of 2026, all eyes are on mobility trends at CES in Las Vegas.
The Consumer Electronics Show (CES) 2026 in Las Vegas marks a turning point for the automotive and tech industries. This year’s spotlight is on industrial AI integration, next-generation autonomous driving systems, and robotaxis. Here’s an overview of the key highlights:
Siemens Accelerates the Industrial AI Revolution
Siemens CEO Roland Busch used the opening keynote at CES in Las Vegas to highlight how Siemens is accelerating the industrial AI revolution and how customers and partners are leveraging artificial intelligence to transform their business models.
Notably, Siemens announced it is deepening its long-standing partnership with NVIDIA to develop an industrial AI operating system. Together, they aim to build an AI-accelerated industrial value chain that spans from design and engineering to adaptive manufacturing, simulation-based optimization, intelligent supply chains, and faster innovation cycles.
To achieve this, NVIDIA will contribute AI infrastructure, simulation libraries, models, frameworks, and blueprints, while Siemens will deploy hundreds of industrial AI experts alongside leading hardware and software. The companies have identified key focus areas to realize this vision: AI-native Electronic Design Automation (EDA), AI-native simulation, AI-driven adaptive manufacturing and supply chains, and AI factories. One major goal is the establishment of fully AI-driven, adaptive manufacturing sites—with the electronics plant in Erlangen serving as the initial blueprint. Siemens also announced plans to integrate NVIDIA NIM microservices and the open NVIDIA Nemotron models into its EDA software offerings to advance generative and agent-based workflows for semiconductor and PCB design.
A central element of the presentation was the new Digital Twin Composer software, expected to be available via the Siemens Xcelerator Marketplace by mid-2026. This tool combines comprehensive digital twins with real-time physical data and AI-powered simulations. It allows companies to virtually model, test, and optimize facilities, products, and processes before transferring them to the real world. Siemens cited its collaboration with PepsiCo as a concrete example, where digital twins are already simulating production and logistics facilities to identify potential issues early on.
Siemens also emphasized its close partnership with Microsoft. A standout result of this collaboration is the joint development of the award-winning Industrial Copilot. In this context, Siemens introduced nine industrial Copilots: AI-powered software assistants designed to optimize various phases of product development, manufacturing, and operations.

NVIDIA Unveils Strategy for Autonomous Mobility and AI-Defined Vehicles
In Las Vegas, NVIDIA showcased a range of new AI models, open data and simulation tools, comprehensive DRIVE software stacks for autonomous functions, and a growing ecosystem of partners and suppliers to accelerate Level 4 autonomy.
With the Alpamayo model family, NVIDIA presented a new generation of open AI models, simulation frameworks, and physical datasets specifically developed for autonomous vehicles. The goal is to make vehicles safer, more robust, and more explainable, enabling human-like "reasoning" in difficult or rare traffic situations. This openness is intended to facilitate not only faster research but also broad collaborative development.
NVIDIA also announced a major expansion of the DRIVE Hyperion ecosystem. Beyond classic OEM partners, Tier 1 suppliers, system integrators, and sensor manufacturers are now joining to create a modular, open platform for autonomous vehicles. This involves tightly interlocking hardware, AI software, and sensor technology so that OEMs and robotaxi services can realize market-ready systems faster. Hyperion serves as the technological foundation for vehicles up to Level 4 autonomy—meaning driverless operation within a defined operational domain. Alongside the autonomous models and Hyperion, NVIDIA demonstrated its Vera Rubin platform, designed to deliver more than triple the performance of previous architectures and serve as a base for extensive AI workloads.
Parallel to development tools, NVIDIA announced plans to get a robotaxi service on the road with partners as early as 2027, with initial tests and demo drives already conducted in collaboration with manufacturers like Mercedes-Benz. These vehicles utilize the DRIVE stack software and combine cameras, radar, and eventually LiDAR sensors to autonomously master complex traffic conditions. NVIDIA aims to bring this technology to private vehicles between 2028 and 2030.

Mercedes-Benz Presents New CLA with Next-Gen MB.DRIVE
Mercedes-Benz unveiled the new CLA in Las Vegas, featuring the next generation of MB.DRIVE technology. Developed in partnership with NVIDIA—Mercedes-Benz’s global partner for next-gen advanced driver-assistance systems—MB.DRIVE leverages NVIDIA’s AI, the full-stack DRIVE AV software, and the accelerated NVIDIA DRIVE AGX computing platform.
With MB.DRIVE ASSIST PRO, Mercedes-Benz aims to merge driver assistance and navigation into a new, safe driving experience. At the push of a button, the vehicle is designed to navigate through the city with advanced SAE Level 2 support—from parking spot to destination. Thanks to Mercedes-Benz's cooperative steering approach, the driver can make steering adjustments at any time without deactivating the system.
MB.DRIVE ASSIST PRO uses approximately 30 sensors, including ten cameras, five radar sensors, and twelve ultrasonic sensors. These deliver raw data to a powerful supercomputer capable of processing up to 508 TOPS (tera operations per second). MB.DRIVE ASSIST PRO has been available in China since late 2025 and will be introduced to the US market later this year.

BMW Integrates Alexa+ and Turns the Intelligent Personal Assistant into a Conversational Companion
The BMW Group announced that the BMW Intelligent Personal Assistant will be expanded with Amazon Alexa+ technology. BMW claims to be the first automaker to implement a deep, context-aware integration of Alexa functions directly into vehicle infotainment. This update allows occupants to make natural, freely phrased requests without adhering to rigid command structures—a significant step toward conversational AI in the car.
Linking the BMW Intelligent Personal Assistant with an Amazon account is designed to make music search and streaming, checking daily news, and accessing other content effortlessly simple. The Alexa+ integration debuts in the new BMW iX3, utilizing the BMW Operating System X and the new Panoramic iDrive to tightly link voice and AI functions with navigation, climate control, media, and other vehicle services. The market launch is planned for the second half of 2026 in Germany and the US, with other markets to follow.
Hyundai Motor Group Presents Human-Centered AI Robotics Strategy
Under the theme "Partnering Human Progress," Hyundai Motor Group unveiled a comprehensive strategy for AI-supported robotics that goes far beyond classic automation. The goal is to develop physical AI systems that support humans, collaborate with them, and adapt to real-world environments—ranging from factories and logistics to everyday technology. Three fundamental partnerships are central to this vision:
Human-Robot Collaboration: Robots will initially take over repetitive, dangerous, or physically demanding tasks in production environments.
Boston Dynamics: Deepened collaboration to develop humanoid robots like Atlas capable of performing complex industrial tasks. Atlas is equipped with around 56 degrees of freedom (DoF) and tactile sensors for precise movement and autonomous learning. It is set to be integrated into existing processes and take over tasks like parts sorting in series production environments starting in 2028.
Global AI Partnerships: Strategic collaborations with leading AI providers, including NVIDIA and Google DeepMind, to combine AI hardware, software, and training systems to enable humanoid robots for broad use in industry and daily life.
Within the group-wide "Group Value Network," subsidiaries like Hyundai Motor, Kia, Hyundai Mobis, and Hyundai Glovis will jointly build an end-to-end value chain seamlessly connecting mass production, component development, and logistics. Another pillar of the strategy is the establishment of a Physical AI Application Center for the continuous advancement of robot-based solutions. Through this strategy, Hyundai aims to take a leading role in the physical AI sector and accelerate digital transformation across the mobility, manufacturing, and robotics industries.
Sony Honda Mobility Shows Afeela 1 Pre-Production Model and New SUV Prototype
Sony Honda Mobility, the joint venture between Sony and Honda, presented the pre-production model of the Afeela 1 and a new SUV prototype (Afeela Prototype 2026) under the motto "Mobility as a Creative Entertainment Space."
The Afeela 1 is a battery-electric liftback scheduled for delivery in California by late 2026, expanding to Arizona in 2027, with initial deliveries in Japan also slated for 2027. A European launch has not yet been confirmed. Technically, the Afeela 1 relies on a rich sensor base with around 40 sensors (including cameras, LiDAR, radar, and ultrasound) and a high-performance Qualcomm computer to run a Level 2+ driver assistance system, with plans to evolve it to Level 4.
Las Vegas also hosted the first public unveiling of the Afeela Prototype 2026, a battery-electric SUV coupe. A production model based on this prototype is planned for the US market starting in 2028.

Lucid, Uber, and Nuro Reveal Global Robotaxi Based on Lucid Gravity
EV manufacturer Lucid, ride-hailing platform Uber, and tech company Nuro showcased the production vehicle for their planned robotaxi service. The foundation is the all-electric Lucid Gravity SUV, retrofitted for autonomous services. Alongside the vehicle, the partners outlined a roadmap for "autonomous on-road testing" starting as early as 2026.
The robotaxi will utilize an advanced sensor suite featuring high-resolution cameras, solid-state LiDAR, and radar to ensure 360-degree environmental perception. These sensors are integrated into the Lucid Gravity’s body, particularly in the specially designed roof-mounted "Halo"—a flat module designed to ensure maximum visibility. Data will be processed in real-time by a high-performance computer based on NVIDIA’s DRIVE AGX Thor.
The vehicles will be integrated directly into Uber’s ride-hailing network. A central element of the partnership is the plan to deploy up to 20,000 autonomous units in various urban regions via Uber’s platform in the coming years. According to the partners, initial road tests began in San Francisco in December—initially with safety drivers and no passengers. The first passengers are expected to ride in the Lucid, Nuro, and Uber robotaxis in San Francisco later this year.

AUMOVIO Showcases Holistic Mobility and SDV Technologies
AUMOVIO demonstrated a wide range of new solutions aimed primarily at Software-Defined Vehicles (SDV) and the digital transformation of the automotive industry. A key highlight is the Vehicle Control High-Performance Computer (VC HPC), which supports both safety-critical and non-safety-critical functions across multiple domains—a foundation for standardized and scalable vehicle architectures. This allows developers to integrate, test, and validate complete vehicle applications in fully virtual or hybrid real-time environments before physical hardware is available.
AUMOVIO also displayed its "Automotive Remote Control Network" architecture concept, which connects high-performance computers, zone controllers, sensors, and actuators via standardized communication protocols.
Additionally, the company showcased its Xelve portfolio for assisted and automated driving (Level 2 to 4), including solutions like Xelve Park, Xelve Drive, and Xelve Pilot. New to the lineup was Xelve Trailer, a collision warning feature for maneuvering with a trailer, based on surround-view camera data.
The Branded Personalized Cockpit demonstrated AUMOVIO’s expertise in highly individualized display solutions. It featured a colorful multi-display landscape highlighting the latest in display tech, including innovative, color-rich ePaper displays, switchable privacy functions, and invisible camera integration behind an OLED display. Safety and comfort technologies also played a role, with AUMOVIO introducing an AI-supported night vision extension that uses existing camera sensors to improve visibility of pedestrians and obstacles in poor light or weather.

ZF Leads Chassis Systems into the Future with AI Software
Automotive supplier ZF Friedrichshafen AG used Las Vegas to present its strategy for a "Chassis 2.0"—transforming traditional chassis components into software- and AI-controlled systems. Two software-based innovations designed to significantly improve comfort and safety took center stage.
First, "Active Noise Reduction" (ANR): An acoustic software function designed to reduce annoying tire noise—so-called cavity noise—directly in the chassis before it reaches the cabin. Unlike classic solutions using insulation or speakers, ANR uses software algorithms, sensor data, and semi-active dampers (CDC) to generate phase-inverted counter-signals. The current version already achieves reductions of more than three decibels, with up to ten decibels possible in the future. Series production is planned for 2028.
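The counter-signal principle behind ANR can be sketched in a few lines: a sine tone stands in for the tonal cavity noise, and adding a copy shifted by 180 degrees cancels it through destructive interference. This is a minimal illustration of the physics only, not ZF's actual algorithm, which works on live sensor data and semi-active dampers; the frequency and sample rate below are illustrative assumptions.

```python
import math

SAMPLE_RATE = 8000   # samples per second (illustrative)
FREQ_HZ = 200        # tire-cavity resonances typically sit around 200 Hz

def tone(n, freq=FREQ_HZ, rate=SAMPLE_RATE, phase=0.0):
    """Generate n samples of a sine tone with a given phase offset."""
    return [math.sin(2 * math.pi * freq * i / rate + phase) for i in range(n)]

noise = tone(1000)                      # the unwanted cavity noise
anti = tone(1000, phase=math.pi)        # counter-signal, 180 degrees out of phase
residual = [a + b for a, b in zip(noise, anti)]

# The superposition is (numerically) silent: peak amplitude near zero.
peak = max(abs(s) for s in residual)
print(f"peak residual amplitude: {peak:.2e}")
```

In the real system, the hard part is not the superposition but estimating the noise in real time and actuating the counter-signal through the chassis hardware fast enough to stay phase-accurate.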
Second, ZF presented "AI Road Sense", an AI-supported software that detects road conditions and adapts the suspension in real-time to different surfaces—from snow-covered roads to off-road passages. Depending on the configuration, the system uses vehicle network data, cameras, or even LiDAR to detect surface information up to 25 meters ahead and control damping accordingly.
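The preview idea behind AI Road Sense can be illustrated with a toy mapping from a detected surface class to a damping setpoint, plus the time budget the 25-meter preview buys at a given speed. The surface classes, damping values, and function names here are assumptions for illustration, not ZF's actual model or calibration.

```python
PREVIEW_M = 25  # the system detects surface information up to 25 meters ahead

# Illustrative mapping from surface class to a normalized damping level.
DAMPING = {
    "smooth_asphalt": 0.7,   # firmer for body control
    "snow": 0.4,             # softer for grip and comfort
    "gravel": 0.35,
    "off_road": 0.25,
}

def damper_command(surface: str, speed_kmh: float) -> dict:
    """Return a damping setpoint and the time until the previewed surface is reached."""
    level = DAMPING.get(surface, 0.5)   # fall back to a neutral setting
    speed_ms = speed_kmh / 3.6
    lead_time_s = PREVIEW_M / speed_ms if speed_ms > 0 else float("inf")
    return {"damping": level, "lead_time_s": round(lead_time_s, 2)}

print(damper_command("snow", 90))   # at 90 km/h, 25 m ahead is reached in 1 s
```

The lead-time arithmetic shows why preview distance matters: at highway speeds the controller has well under a second to retune the dampers before the wheels reach the detected surface.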

Bosch Presents New AI-Based Cockpit and World Premieres Radar Gen 7 Premium
Bosch presented a new AI-based cockpit in Las Vegas—an all-in-one system designed to make the in-car environment highly personalized. The cockpit features an AI voice model capable of natural conversation and a visual language model that can interpret events both inside and outside the vehicle.
Simultaneously, Bosch is establishing itself as a leading provider of By-Wire systems, a key technology for automated and software-defined driving. These systems replace mechanical connections in braking and steering with electrical signal lines. Bosch’s "Vehicle Motion Management" software controls vehicle movement across all six degrees of freedom by centrally managing braking, steering, powertrain, and suspension. This allows actuators to be better coordinated, used more efficiently, and eventually tailored to driver preferences.
With the new "Radar Gen 7 Premium", which celebrated its world premiere, Bosch showcased a groundbreaking combination of sensor technology and AI. The radar sensor is designed to improve driver assistance functions like the highway pilot; it can detect very small objects such as pallets or tires at distances of over 200 meters. In the eBike sector, Bosch introduced the "eBike Flow App," allowing users to flag their e-bike or battery as stolen. Bosch also displayed its latest AI-MEMS sensor platform, BMI5.
Furthermore, Bosch announced the continuation of its collaboration with Microsoft. Together, they aim to expand the "Manufacturing Co-Intelligence" offering and explore using agentic AI to revolutionize production. Bosch also signed an agreement with Kodiak AI, a pioneer in autonomous trucking. They plan to work on vehicle-agnostic, redundant platforms for driverless trucks—comprehensive systems of specialized hardware and software integrated into standard trucks to give them autonomous capabilities.

LG Electronics Shows "Affectionate Intelligence" and AI Integration
LG Electronics (LG) unveiled its "AI in Action" strategy. The focus was on three pillars: evolving "Affectionate Intelligence" into action-oriented AI, industry-leading products based on tech excellence, and a seamlessly orchestrated ecosystem extending from the home to vehicles and commercial spaces. In the mobility sector, LG presented its vision for software-defined vehicles and intelligent in-cabin experiences based on three core systems:
Mobility Display Solution: Turns the windshield into a display surface for real-time driving data during automated driving.
Automotive Vision Solution: Uses eye-tracking and driver/interior monitoring to detect attention lapses or fatigue, enabling adaptive safety functions.
In-Vehicle Entertainment Solution: Allows seamless content streaming between home and vehicle and supports communication via side windows.
Combined with LG’s On-Device Multimodal Generative AI Platform, these modules aim to create highly individual and immersive experiences while maintaining strict safety standards.

Qualcomm Drives the AI-Defined Vehicle
Qualcomm is betting on the "AI-defined Vehicle"—moving away from isolated domain controllers toward central computing power, agentic AI, and reusable software stacks. As a guiding concept, Qualcomm introduced "Snapdragon Chassis Agents"—AI agents that execute in-vehicle tasks in a goal-oriented, context-aware manner to unify assistance, cockpit experiences, and personalized services.
This shift is becoming concrete in hardware architecture: In collaboration with Leapmotor, Qualcomm announced the "world’s first automotive central computer"—uniting Snapdragon Cockpit Elite and Snapdragon Ride Elite. Leapmotor’s upcoming flagship, the D19, is set to be the first production vehicle to combine both Snapdragon Elite platforms.
On the supply side, Qualcomm is boosting ADAS scaling through partnerships. With ZF, they agreed to collaborate on a scalable ADAS solution based on the new ZF ProAI supercomputer and the Snapdragon Ride platform. The goal is "turnkey" modules for various vehicle classes and automation levels up to Level 3. A second strategic cooperation was formed with Hyundai Mobis to collaborate on SDV architectures for ADAS, utilizing Qualcomm’s Snapdragon Ride Flex SoC.
To scale these platforms quickly, Qualcomm is also addressing the software base: The longstanding collaboration with Google is being expanded to simplify SDV development and accelerate the introduction of in-vehicle agentic AI—partly by tighter integration of Google’s automotive software blocks with Qualcomm’s Snapdragon Digital Chassis. A sign of this "Digital Chassis Momentum" is the announcement regarding the Toyota RAV4, which will feature the Snapdragon Digital Chassis to support personalization and immersive infotainment.

HERE Technologies Introduces AI-Powered Portfolio for SDVs
HERE Technologies introduced a new portfolio specifically for Software-Defined Vehicles (SDVs). It combines HERE’s AI-powered Live Map with advanced software to unify navigation, driver assistance, and autonomy across the entire vehicle lifecycle. This enables automakers to provide richer navigation experiences and advanced functions for ADAS, "Navigation on Autopilot" (NOA), and automated driving.
HERE also presented its enhanced Navigation SDK, capable of rendering lanes and providing lane-specific directions to support both navigation and graphical user interfaces (GUI) for ADAS.
Additionally, HERE introduced "Behavioral Maneuvers," a feature that improves automated driving by allowing vehicles to execute smooth, natural maneuvers. Developed with leading automakers and based on HERE's AI tech, it uses anonymized data from millions of vehicles to ensure precise localization and safety, aiming to increase driver confidence and reduce the likelihood of automated functions being deactivated.

CARIAD Bets on TomTom Orbis Maps to Improve Automated Systems
Volkswagen’s software subsidiary CARIAD is taking a major step toward more robust automated driving functions by adopting TomTom’s Orbis Maps. These maps provide an additional context layer beyond sensor-based perception, allowing autonomous systems to interpret traffic situations with greater nuance.
Orbis Maps deliver high-precision, minute-by-minute location and road information for over 235 countries, making them ideal for autonomous applications. By integrating this data layer, vehicles gain a dynamic, contextualized view of their environment—such as temporary construction zones, traffic restrictions, or complex intersections—that would be difficult to capture via sensors alone.

Verge Motorcycles Adopts Solid-State Batteries from Donut Lab
Electric motorcycle manufacturer Verge Motorcycles announced a tech upgrade for its TS Pro platform: the adoption of a solid-state battery from Donut Lab. The battery offers significantly increased energy density, high safety, and fast-charging capability, and is set to be used in all 2026 Verge models starting in the first quarter of the year.
The solid-state battery reportedly achieves an energy density of 400 Wh/kg and is billed as the world's first solid-state battery ready for OEM vehicle manufacturing. Donut Lab claims a full charge takes just five minutes. Furthermore, the lifespan is estimated at 100,000 charge cycles with minimal capacity loss.
The TS Pro, introduced in November, is expected to achieve a range of 350 kilometers (approx. 217 miles), with the ability to recharge 300 kilometers of range in ten minutes. A battery pack enabling up to 595 kilometers (approx. 370 miles) will also be available.

Valeo and Seeing Machines Present Integrated Driver Monitoring
Valeo and Seeing Machines showcased integrated solutions for driver and interior monitoring, focusing on gaze-based warnings, cabin analysis, and helmet detection for two-wheelers.
The core of the solution combines Valeo’s system integration expertise with Seeing Machines' perception software. The latter's ICMS software extends classic driver monitoring with capabilities like gaze tracking—determining if a driver has actually perceived a potential danger. They also demonstrated helmet detection, ensuring a helmet is worn before a motorcycle can be ridden.
Valeo also displayed a Panovision Head-Up Display utilizing adaptive warnings based on gaze tracking, and a "Safe InSight" demo vehicle illustrating a multi-layered approach to monitoring. Another focus was a SmartCluster solution for motorcycles, integrating helmet detection directly into the instrument cluster.

Donut Lab & WEVC Present Flexible E-Platform with Hub Motors
Donut Lab and the Watt Electric Vehicle Company (WEVC) unveiled an innovative electrification platform enabling a flexible, modular e-drive architecture. The heart of the solution is the "Passenger And Commercial EV Skateboard" (PACES), where battery, controls, and drive can be flexibly integrated for both passenger and commercial vehicles.
The novelty lies in the use of hub motors (in-wheel motors), which sit directly in the wheels and deliver torque to each wheel independently. This allows finer control of torque application and vehicle dynamics, as every wheel can be regulated individually—vital for all-wheel drive or scenario-adaptive systems.
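The per-wheel control idea can be sketched as a simple torque split: a total drive torque is divided across four hub motors, and a yaw request shifts torque between the left and right sides while the total is preserved. The function, wheel names, and gain below are illustrative assumptions, not the partners' actual control software.

```python
def distribute_torque(total_nm: float, yaw_request: float, yaw_gain: float = 0.5) -> dict:
    """Split a total drive torque across four independently driven hub motors.

    yaw_request in [-1, 1]: positive shifts torque toward the left wheels,
    pushing the vehicle to rotate right; negative does the opposite.
    """
    per_wheel = total_nm / 4.0
    delta = per_wheel * yaw_gain * yaw_request
    return {
        "front_left":  per_wheel + delta,
        "front_right": per_wheel - delta,
        "rear_left":   per_wheel + delta,
        "rear_right":  per_wheel - delta,
    }

cmds = distribute_torque(400.0, yaw_request=0.5)
print(cmds)
print("total:", sum(cmds.values()))   # left/right split changes, total is preserved
```

With a single central motor and differentials, this kind of asymmetric split would require extra hardware such as brake intervention or a torque-vectoring differential; with hub motors it is purely a software command.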
This skateboard architecture is designed to help OEMs develop various vehicle types—from leisure buggies and vans to SUV-like concepts—on a common base, reducing development costs and time. The modular battery and control technology also allows systems to be configured for different range and performance requirements without altering the fundamental structure.


