State of the Art in Autonomous Vehicles Technologies: Cameras and Vision systems
The core aim of this report is to clarify the current status of the technologies that make up autonomous vehicles. We have separated the chapters into groups covering Sensing, where we take a closer look at the latest advances in cameras, LiDAR, RADAR, ultrasonic sensors, and emerging imaging radar technologies. The Thinking and Learning and Edge Computing chapters examine the dynamic landscape that encompasses advanced AI algorithms, natural language processing (NLP), machine learning techniques, and the transformative impact of edge computing. Finally, we explore the technologies that ensure reliable communication, from rapid 5G Connectivity and dynamic Over-the-Air (OTA) Updates, to the use of Blockchain, as well as Intrusion Detection and Prevention Systems (IDPS) and AI/ML-Driven Cybersecurity.
The report's final chapter looks at four leading autonomous vehicle companies: Waymo, Tesla, Cruise, and Volvo. We compare and contrast their tech stacks, presenting a clear overview of the direction of the industry.
Each section highlights recent innovations, outlines why certain technologies have become dominant, and gives examples of which companies are prominent in the area.
We invite you to download the full report for free below.

Sensing Technologies
At the cutting edge of autonomous vehicle (AV) technology, the confluence of advanced sensing modalities forms the cornerstone of vehicular autonomy. At the forefront of this confluence lies the integration of high-definition cameras with a suite of diverse sensors, including ultrasonic, LiDAR, and RADAR. This amalgamation, known as 'sensor fusion,' represents the zenith of current efforts to endow vehicles with perception capabilities necessary for full autonomous driving.
High-definition cameras, prized for their acute visual acuity and color discernment, play an indispensable role in this sensorial symphony. They excel in interpreting complex visual stimuli — from the nuanced hues of traffic lights to the intricate patterns of road signs. Yet, the prowess of cameras is not without its Achilles' heel; their performance can wane under the cloak of night or in the face of inclement weather. It is within these gaps that the orchestration of sensor fusion becomes critical.
At the cutting edge of sensor integration, the marriage of ultrasonic sensors with LiDAR and RADAR is addressing the erstwhile shortcomings of standalone systems. This integration is particularly pivotal in surmounting the challenges of close-range detection — a realm where traditional LiDAR sensors often falter. Such precision in proximal perception is vital for executing complex parking maneuvers and navigating through constricted spaces with unerring accuracy.
The collaborative dynamics between ultrasonic and LiDAR sensors forge a more robust interpretative framework. While LiDAR imparts a detailed topographical map of the vehicle's surroundings, it is occasionally prone to misinterpretations, especially in the presence of reflective surfaces or atypical object contours. Here, ultrasonic sensors contribute a deeper dimension of spatial awareness, validating and refining LiDAR's data, thus mitigating the risks of erroneous object recognition.
Extending this synergy further, the integration of ultrasonic sensors with RADAR technology heralds a new era in perception systems capable of straddling the spectrum of short- and long-range detection. RADAR, with its broader wave patterns, often struggles with pinpoint accuracy in proximate scenarios. Ultrasonic technology deftly fills this void, granting AVs enhanced situational awareness — an attribute of paramount importance in scenarios that demand a harmonious blend of both near and distant perception, such as highway navigation interspersed with intricate parking sequences.
In this avant-garde realm, vehicle manufacturers are not merely choosing between sensor technologies; rather, they are strategically orchestrating an ensemble of LiDAR variants, each contributing its unique strengths to the collective sensory intelligence of AVs. The selection of specific LiDAR models is no longer a mere technical choice but a strategic decision, influenced by a myriad of factors including application-specific requisites, cost-benefit analyses, and the relentless march of technological innovation.
This chapter aims to delve into the intricate and sophisticated world of sensing and vision technologies in autonomous vehicles. This section covers cameras and AI-enhanced vision. Read part two covering RADAR and LiDAR here.
Cameras and Vision systems
Cameras have a foundational and technically intricate position within autonomous vehicles, functioning as primary sensors to provide vital visual data for perception and navigation systems. Their role extends beyond mere image capture, encompassing intricate computer vision processes to interpret the surroundings with pixel-level precision. Cameras are instrumental in critical tasks, including real-time lane detection, object recognition, and complex depth perception, making them indispensable for AV safety and operational efficiency.
In the last three years, there have been significant advancements in high-resolution cameras, which have shown a remarkable increase in their ability to capture fine detail. This, in turn, has enabled autonomous vehicles to identify objects in their surroundings more accurately, making them more reliable and safe. In this section, we explore the developments in vision technology that have most impacted AV development over this period.
3D Stereo Vision
3D stereo vision technology utilizes two cameras to determine the depth and precise positioning of objects in the environment, much as humans use binocular vision for depth perception. Stereo systems are integral to the future of autonomous vehicles, enabling them to navigate roads more safely than is possible with a single camera.
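As a rough illustration of the geometry involved (not any particular vendor's implementation), the sketch below recovers depth from the disparity between matched pixels in the left and right images; the focal length, baseline, and disparity values are hypothetical.

```python
# Illustrative stereo depth calculation: Z = f * B / d.
# Hypothetical camera parameters; real systems add calibration,
# rectification, and sub-pixel disparity estimation.

def depth_from_disparity(disparity_px: float,
                         focal_length_px: float,
                         baseline_m: float) -> float:
    """Return depth in meters for a matched pixel pair."""
    if disparity_px <= 0:
        raise ValueError("disparity must be positive for a finite depth")
    return focal_length_px * baseline_m / disparity_px

# Example: a camera with a ~1000 px focal length and a 30 cm baseline.
# A 12 px disparity then corresponds to an object roughly 25 m away.
print(depth_from_disparity(disparity_px=12.0,
                           focal_length_px=1000.0,
                           baseline_m=0.30))  # -> 25.0
```

The same relation also shows why the sub-degree alignment discussed below matters: a small angular error shifts the measured disparity and therefore the estimated depth, with the error growing for distant objects.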
The technology has seen rapid growth over the past decade, with significant strides being made by companies that are enabling automakers to quickly and inexpensively add 3D Stereo Vision to existing Advanced driver assistance systems (ADAS) with software solutions.
The positioning of cameras in vehicles is an ongoing topic within the industry. Wider-placed cameras have the potential to fall out of alignment when impacted by temperature shifts in the chassis or road vibrations - an issue when the cameras need to maintain an alignment within one-hundredth of a degree.
Major players, such as Subaru’s EyeSight and the Drive Pilot system in Mercedes’ EQS, use stereo vision systems deployed in tighter formations to negate that issue - those systems work in tandem with RADAR. Stereo vision is an ever-growing technology, with researchers and developers exploring new ways to improve its accuracy, efficiency, and field of view.
The biggest impacts will likely come from deep learning and neural networks being used to handle occlusion and calibration issues. Other interesting areas of research include active stereo vision, which is being employed to project patterns or signals onto the scene, creating artificial texture and contrast.
Some of this cutting-edge research is being tested by university teams on the race track. For example, the Formula Student racing team of the University of Bayreuth is using Nerian’s SceneScan Pro and the Karmin3 stereo camera to create a 3D stereo vision system for their autonomous racing car.1,2
The 3D stereo vision deployed on the autonomous race cars. Image credit: Nerian.
Thermal cameras
In the early 2000s, several notable car manufacturers, including General Motors, BMW, and Honda, blazed the trail by introducing passive thermal cameras to enhance safety during nighttime driving. These innovative thermal cameras were designed to address the dangers posed by animal collisions and the risk of pedestrian accidents in poorly lit or foggy areas. Their primary purpose was to provide invaluable assistance to human drivers.
However, the landscape of autonomous driving began to evolve significantly with the advent of the DARPA Grand Challenge. This competition sparked a surge of interest and substantial investment in various sensing technologies. Among them, LiDAR (Light Detection and Ranging) emerged as the frontrunner, capturing the lion's share of attention and financial support. Together with radar and visible cameras, this sensor suite gained widespread recognition as the optimal perception stack for achieving higher levels of autonomy.
In an effort to bolster their sensor capabilities, certain companies are incorporating thermal cameras into their sensor suites, recognizing the unique advantages they offer in complementing LiDAR, radar, and visible cameras. This additional sensor modality proves invaluable in addressing specific challenges, such as identifying animals and humans in environments characterized by low light or heavy obscurants like fog, smoke, or steam.
Pedestrians are most at risk of an accident with a road vehicle after dark. More pedestrian fatalities occurred in the dark (75%) than in daylight (21%), dusk (2%), and dawn (2%).3
A pedestrian crossing a dark suburban street. Visible light camera vs. FLIR® thermal camera captured by Foresight’s test vehicle. Image credit: Foresight.
Notably, pioneers like Waymo Via and Plus.ai have harnessed the power of thermal cameras to advance autonomy in the realm of trucking, particularly on highways. By doing so, they are enhancing safety and efficiency in long-haul transportation.
Companies like Nuro, Cruise, and Zoox have adopted thermal cameras as part of their sensor repertoire for purpose-built vehicles designed to navigate the intricate landscapes of densely populated urban areas. These vehicles are not only revolutionizing last-mile food and grocery delivery but also providing innovative solutions in the realm of ride-hailing services. Through the strategic deployment of thermal cameras, these companies are significantly elevating the safety and effectiveness of their operations within urban environments.
Harnessing AI-Enhanced Vision
Traditional cameras capture raw visual data, which requires subsequent processing and interpretation to derive meaningful information about the surroundings. AI algorithms, especially deep learning models, have revolutionized this process by enabling cameras to interpret visual information from their surroundings, enhancing their ability to comprehend images.
The integration of AI-enhanced vision represents a groundbreaking development that significantly improves the capabilities of camera systems in AVs. For example, HADAR, an AI-powered thermal imaging system created by Purdue and Michigan State University researchers, provides clear thermal images by interpreting heat signatures. It significantly improves perception for AVs and robots by resolving the blurring 'ghosting' effect seen in traditional thermal imaging.
Moreover, Omniq has recently launched a face detection feature for AVs, improving safety by recognizing faces to prevent crimes. Their AI uses neural network algorithms for smart decision-making and has already seen over 20,000 global installations. In a collaborative effort, SemiDrive and Kankan Tech are improving in-car imaging systems, where SemiDrive's X9 chip powers the systems, and Kankan Tech provides comprehensive development services.
Kankan Tech has expertise in high-resolution cabin cameras and has developed a camera-based alternative to traditional rearview mirrors. They've also introduced palm vein biometric recognition for AV access. The system, unaffected by lighting changes due to IR cameras, uses YOLO v7 algorithms for real-time face detection, analyzing facial expressions and head orientation for safety, with plans for commercial market integration after thorough testing.
Cameras, empowered by convolutional neural networks (CNNs) and appropriate classification ML techniques, enable AVs' vision systems to accurately identify and categorize objects, pedestrians, road signs, and lane markings. This level of understanding improves the vehicle's ability to make informed decisions in complex and dynamic traffic scenarios.
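As a minimal sketch of what such classification looks like in code (the architecture, input size, and class list below are illustrative assumptions, not any production AV network), a small PyTorch CNN can assign one of a few classes to a camera crop:

```python
# Toy CNN classifier for camera crops (illustrative only).
import torch
import torch.nn as nn

CLASSES = ["car", "pedestrian", "road_sign", "lane_marking"]  # hypothetical

class TinyClassifier(nn.Module):
    def __init__(self, num_classes: int = len(CLASSES)):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(3, 16, kernel_size=3, padding=1), nn.ReLU(),
            nn.MaxPool2d(2),                              # 64x64 -> 32x32
            nn.Conv2d(16, 32, kernel_size=3, padding=1), nn.ReLU(),
            nn.MaxPool2d(2),                              # 32x32 -> 16x16
        )
        self.head = nn.Linear(32 * 16 * 16, num_classes)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.head(self.features(x).flatten(1))

model = TinyClassifier().eval()
crop = torch.rand(1, 3, 64, 64)      # one detected-object crop (dummy data)
with torch.no_grad():
    probs = model(crop).softmax(dim=1)
print(CLASSES[int(probs.argmax())], float(probs.max()))
```

Production networks are far larger and trained on millions of labeled frames, but the basic flow of convolutional feature extraction followed by classification is the same.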
AI-enhanced vision is crucial in autonomous vehicles, encompassing tasks like object identification, motion tracking, and classification. This technology significantly augments AVs' understanding of their surroundings, resulting in more informed and secure decision-making processes.4
An illustrative example of the potential of AI-enhanced vision comes from the research conducted at RIKEN in 2023. Their innovative approach, inspired by human brain memory formation techniques, involves degrading the quality of high-resolution images for training algorithms in self-supervised learning. This method enhances the algorithms' ability to identify objects in low-resolution images, addressing a notable challenge in the field of computer vision.5
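A minimal sketch of that general idea is shown below: frames are deliberately downsampled and then upsampled again so that training also sees low-detail views. The scale factors and plain bilinear resampling are assumptions for illustration, not RIKEN's published pipeline.

```python
# Sketch of resolution-degradation augmentation: remove fine detail from a
# frame so a model also learns from blurred, low-resolution views.
import torch
import torch.nn.functional as F

def degrade(frames: torch.Tensor, scale: float) -> torch.Tensor:
    """frames: (N, C, H, W) image batch; scale < 1 discards fine detail."""
    n, c, h, w = frames.shape
    low = F.interpolate(frames, scale_factor=scale, mode="bilinear",
                        align_corners=False)
    return F.interpolate(low, size=(h, w), mode="bilinear",
                         align_corners=False)

frames = torch.rand(4, 3, 224, 224)                      # dummy camera frames
views = [degrade(frames, s) for s in (1.0, 0.5, 0.25)]   # hypothetical scales
```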
Furthermore, researchers at Purdue University and Michigan State University have introduced a groundbreaking AI-enhanced camera imaging system known as HADAR (heat-assisted detection and ranging). HADAR utilizes AI to interpret heat signatures, effectively resolving issues such as 'ghosting' that are commonly associated with thermal imaging. Its applications span a wide spectrum, from enhancing the perception of AVs and robots to enabling touchless security screenings at public events.6
Comparison between ghosting thermal vision and HADAR TeX vision. Image credit: NVIDIA
Another example comes from NVIDIA, which has developed a pixel-level segmentation approach using a single deep neural network (DNN) to achieve comprehensive scene understanding. This technology can divide a scene into various object categories and identify distinct instances of these categories, as reflected in the lower panel's colors and numbers.
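The toy NumPy sketch below (not NVIDIA's DNN) shows how a per-pixel class map and per-pixel instance IDs can be merged into a single panoptic label map of the kind shown in the figure further down; the encoding constant and class list are hypothetical.

```python
# Sketch: merge per-pixel semantic classes and instance IDs into one
# panoptic label (class_id * OFFSET + instance_id). Illustrative only;
# a real system takes both maps from the DNN heads described above.
import numpy as np

OFFSET = 1000                                  # hypothetical encoding constant
CLASSES = {0: "drivable_space", 1: "car", 2: "pedestrian"}

semantic = np.array([[0, 0, 1, 1],
                     [0, 2, 1, 1]])            # predicted class per pixel
instance = np.array([[0, 0, 1, 1],
                     [0, 1, 2, 2]])            # predicted instance per pixel

panoptic = semantic * OFFSET + instance        # unique ID per (class, instance)

for pan_id in np.unique(panoptic):
    cls, inst = divmod(int(pan_id), OFFSET)
    print(f"{CLASSES[cls]} #{inst}: {int((panoptic == pan_id).sum())} px")
```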
The benefits of this technology are far-reaching, including reductions in training data, improved perception, and support for the safe operation of autonomous vehicles. Collectively, these innovations underscore the transformative potential of AI-enhanced vision in shaping the future of autonomous vehicles and related technologies.7
“We have algorithms that are reading for lanes, but there's also an object detection, but then there's also a DNN we call free space. Which is looking for the absence of objects.” - Danny Shapiro - VP of Automotive at NVIDIA.
Panoptic segmentation DNN output from in-car inference on embedded AGX platform. Top: predicted objects and object classes (blue = cars; green = drivable space; red = pedestrians). Bottom: predicted object-class instances along with computed bounding boxes (shown in different colors and instance IDs). Image credit: NVIDIA.
Prominent Companies Developing AV Vision Systems
This section highlights some of the cutting-edge vision systems currently enabling the development of AVs.
Mobileye
Mobileye uses a variety of cameras within its vision-based driver assistance systems, including fisheye cameras, wide-angle cameras, and thermal cameras.8 In 2023, Mobileye launched the first camera-based Intelligent Speed Assist system that complies with the new EU standards. The technology, which relies on cameras alone, has received official approval throughout Europe, making it the first of its kind. Mobileye’s technology can recognize various traffic signs, aiding Intelligent Speed Assist systems without additional sensors. It relies on Mobileye’s 400-petabyte database of global driving footage to swiftly meet increasing automotive safety standards.
Continental
Continental develops various cameras, including fisheye, wide-angle, and thermal cameras. These cameras are designed to meet the specific requirements of different AV applications. More specifically, the surround-view camera features fisheye optics for a short-range view and supports Ethernet or LVDS communication.
In November 2022, Continental and Ambarella entered a collaboration to co-develop hardware and software solutions based on AI for assisted and automated driving. The partnership aims to produce products for global series production by 2026, addressing the increasing demand for assisted and automated driving technologies. The collaboration focuses on camera-based perception solutions for advanced driver assistance systems and scalable full-stack systems for vehicles with Level 2+ and higher autonomy.
Continental’s AV advanced camera solutions. Image credit: Continental.
TIER IV
TIER IV is an open-source autonomous driving technology company that is expanding production in response to strong interest in its Automotive HDR Camera C1, which launched in 2022.
The camera is designed for autonomous mobility applications and has gained widespread adoption in various fields, including autonomous driving, driver assistance, autonomous mobile robots, security, and surveillance. These applications are possible thanks to its impressive 120dB high dynamic range and high-quality automotive-grade hardware.
Over 100 companies worldwide have implemented the C1 Camera. Building on the success of the C1, in June 2023 TIER IV introduced the C2 Camera, a superior model with double the resolution at 5.4 megapixels, improving its ability to recognize distant objects and signals. Finally, TIER IV is developing the C3 Camera, featuring an 8-megapixel image sensor to meet the demands of high-speed applications such as highway driving. The goal is to complete its development within the year and start providing it in early 2024.
LiDAR
LiDAR (Light Detection And Ranging) sensors help autonomous vehicles to sense and understand their surroundings. They emit laser pulses and measure the time it takes for the reflected light to return, compiling this data into 3D maps of the vehicle's environment. This information is then combined with other data to ensure safe navigation.
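A minimal sketch of that time-of-flight principle, under simplified assumptions, is shown below: a single return is converted to a range from the round-trip time of light and then to a 3D point using the beam's azimuth and elevation. The example values are made up.

```python
# Sketch: one LiDAR return -> 3D point. Range comes from the round-trip time
# of light; the point comes from a spherical-to-Cartesian conversion using
# the beam's pointing angles. Timing and angles are example values.
import math

C = 299_792_458.0  # speed of light, m/s

def lidar_point(round_trip_s: float, azimuth_deg: float, elevation_deg: float):
    rng = C * round_trip_s / 2.0                       # one-way distance, m
    az, el = math.radians(azimuth_deg), math.radians(elevation_deg)
    return (rng * math.cos(el) * math.cos(az),         # x
            rng * math.cos(el) * math.sin(az),         # y
            rng * math.sin(el))                        # z

# A return arriving ~333 ns after emission lies roughly 50 m away.
print(lidar_point(round_trip_s=333e-9, azimuth_deg=10.0, elevation_deg=-1.5))
```

Repeating this calculation for every emitted pulse is what builds up the 3D map described above.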
A core area of current LiDAR research is developing systems that combine the strengths of different LiDAR technologies to improve overall perception performance. Pairing pulsed LiDAR with FMCW LiDAR, for instance, provides comprehensive object detection, accurate distance measurement, and real-time velocity estimation.
A hybrid LiDAR setup could integrate a solid-state laser for short-distance assessments alongside an FMCW laser optimized for capturing distant measurements. Integrating LiDAR with other sensors like cameras and RADAR creates a sensor fusion ecosystem that can address sensor redundancies and data gaps, ultimately improving the robustness and reliability of autonomous driving systems.9
LiDAR light pulses covering object on the road. Image credit: Delphi
LiDAR Product Overview
Solid-state LiDAR
Solid-state LiDAR systems use non-moving optical components to steer laser beams, making them well-suited for the stringent requirements of AVs.9,10 Launched in 2018, solid-state LiDAR can extend sensing range beyond 200 meters while reducing costs more than tenfold. They offer a promising advantage over conventional LiDAR that steers an optical beam using moving parts. The assembly and alignment of these moving parts are expensive and raise significant concerns about their long-term dependability.
The demand for solid-state LiDAR is expected to grow at a CAGR of 30.66% over the forecast period of 2021-26. This potential growth is reflected in the high volume of research in this area, including the emerging area of nanophotonics-based LiDAR sensors.
LiDAR specialists like Velodyne (now Velodyne + Ouster) and tech companies like Luminar and Xenomatix are advancing solid-state LiDAR research, with OEMs like Mercedes-Benz entering deeper partnerships in the solid-state LiDAR space.
Frequency-Modulated Continuous Wave (FMCW) LiDAR
Frequency-Modulated Continuous Wave (FMCW) LiDAR works by emitting a continuous laser signal with a modulated frequency, which enables simultaneous distance and velocity measurements.9 This real-time capability is crucial for AVs to accurately assess dynamic environments. FMCW LiDAR's continuous waveform provides higher resolution, enabling fine-grained object detection and tracking.
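In the simplified triangular-chirp textbook model (not any particular vendor's processing), range and radial velocity follow directly from the beat frequencies measured on the rising and falling parts of the chirp, where f_up and f_dn are the up- and down-chirp beat frequencies, B the chirp bandwidth, T_c the chirp duration, c the speed of light, and λ the laser wavelength:

```latex
R = \frac{c\,T_c}{4B}\,\bigl(f_{\mathrm{up}} + f_{\mathrm{dn}}\bigr),
\qquad
v = \frac{\lambda}{4}\,\bigl(f_{\mathrm{dn}} - f_{\mathrm{up}}\bigr)
```

Because the sum of the two beat frequencies isolates range while their difference isolates the Doppler shift, a single chirp pair yields both distance and velocity, which is the simultaneous measurement described above.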
Although signal processing complexities exist, research in this field is advancing rapidly, promising improved perception for AVs. FMCW has been recognized as a transformative advancement in LiDAR technology. Pioneering companies like Aeva, Mobileye, and Blickfeld have spent years developing Photonic Integrated Circuits (PICs) and FMCW sensors, poised to revolutionize the landscape of autonomous driving.11,12
Companies Developing LiDAR Technologies for AVs
In this section, we go deeper into the companies at the forefront of advancing LiDAR technology for AVs.
Velodyne
Velodyne is a prominent provider of LiDAR sensors developed for AVs and was the first LiDAR company to go public. The company asserts itself in the automotive industry by working closely with customers to test its LiDAR sensors against common sets of real-world scenarios and relevant corner cases. In February 2023, Velodyne merged with Ouster. Major players in the AV industry, such as Waymo, Uber, and Cruise, utilize Velodyne's LiDAR sensors.13
Luminar Technologies
Luminar Technologies develops vision-based LiDAR and machine perception technologies, primarily for autonomous vehicles. In February 2023, Luminar launched Iris Plus, a LiDAR sensor designed to blend into the roofline of a production vehicle. It uses longer laser wavelengths than usual, at 1550 nanometers instead of the common 905. This feature improves the device’s ability to detect small and low-reflective objects, including dark-colored cars, animals, or a child suddenly running into the street. It operates at distances exceeding 250 meters and up to 500 meters for larger, more reflective objects.
Mercedes plans to be among the first car manufacturers to incorporate Luminar’s Iris Plus LiDAR into its production vehicles. Mercedes and Luminar announced their partnership in January 2022, initially aiming to integrate Luminar’s LiDAR into a single high-end vehicle model. Since then, plans have expanded significantly, with Mercedes aiming to increase its LiDAR supply by ten times over the coming years. Big-name companies like Volvo, Toyota, and BMW also employ Luminar's sensors.14,15
Aeva Technologies
Aeva Technologies pioneers LiDAR sensors with capabilities in both visible and infrared spectrums. Uber and Continental are among the companies adopting Aeva's technology. In 2022, Aeva released its revolutionary 4D LiDAR technology Aeries II, which employs FMCW4D technology and the LiDAR-on-chip silicon photonics design. Aeries II is compact, configurable, and automotive-grade, designed for reliability across various conditions.
With ultra-long-range object detection and tracking capabilities of up to 500 meters, it stands out in detecting oncoming vehicles, pedestrians, and animals. Additionally, Aeva's FMCW technology remains unaffected by interference from sunlight or other LiDAR sensors, and its LiDAR-on-chip design enables scalable production for a wide range of autonomous applications.16
Quanergy Systems
Since 2022, Quanergy has been transforming physical security, which plays a crucial role in enhancing situational awareness and safety in driving, with its real-time 3D LiDAR solutions. The company is a pioneer in providing 3D LiDAR security solutions that bring intelligent and proactive awareness to dynamic environments. Quanergy aims to empower users to transcend current sensing limitations, offering an experience of 3D security tailored for a 3D world. Toyota and Geely are among the companies incorporating Quanergy's sensors into their AV products.17,18
Intel and Mobileye
Since 2020, Intel and Mobileye have focused on enhancing the performance of LiDAR and RADAR sensors for AVs by leveraging technologies such as PICs and FMCW LiDARs. They are focusing on hybrid LiDAR-RADAR solutions, aiming to capitalize on the strengths of both technologies. The proposed architecture involves the integration of cameras, LiDARs, and RADARs to cover the full field of view, aiming to overcome challenges like side lobes and limited range in traditional sensors.19,20 The collaboration between the two companies aims to make RADARs and LiDARs both better and cheaper in order to reach L5 autonomy more quickly. Their new product range is expected to launch in 2025.
Continental
Continental's High-Resolution 3D Flash LiDAR technology marks a significant advancement in vehicle vision. Released in 2021, this LiDAR system boasts a solid-state design, ensuring continuous data flow without gaps. Its high-resolution capabilities span both vertical and horizontal dimensions, offering detailed insights. The system also includes features like blockage detection, an integrated heater, an optional washing system, auto-alignment, and continuous sampling mode.21,22
Blickfeld
Blickfeld introduced the Qb2 smart LiDAR sensor in 2022, a novel device designed for easy deployment thanks to its onboard processing and Wi-Fi connectivity. This marks the first smart LiDAR sensor featuring built-in software. The Qb2 LiDAR sensor merges high-performance detection and ranging capabilities with onboard software, enhancing performance and setup efficiency without the need to develop complex custom software.
Additionally, the sensor includes built-in Wi-Fi support. The Qb2 employs a custom micro-electro-mechanical systems (MEMS) mirror for beam steering, optimizing the balance between resolution, range, and field of view to create multi-dimensional maps. It achieves a maximum of 400 scan lines per frame, ensuring high-quality point cloud data. The Qb2 sensor is designed to accommodate three returns and boasts a laser beam divergence of 0.25° x 0.25°, facilitating meticulous scanning for precise and dependable information.23,12,24
Hesai Technology
Hesai Technology offers a variety of LiDAR sensors designed to meet the requirements for Level 4 and higher autonomous driving, which ensures reliable and safe operation. On August 1, 2023, Hesai Technology announced its partnership with NVIDIA. This collaboration aims to integrate Hesai's advanced LiDAR sensors into the NVIDIA DRIVE and NVIDIA Omniverse platforms, setting the stage for groundbreaking developments in autonomous driving. By bringing together Hesai’s specialized LiDAR technology and NVIDIA's expertise in AI, simulations, and software development, this partnership promises to drive innovation in the AV sector.
RoboSense
RoboSense offers various Smart LiDAR perception system solutions based on three fundamental technologies: chips, LiDAR hardware, and perception software. In 2016, RoboSense began working on mechanical LiDAR, known as the R platform. By 2017, they had introduced perception software and the M platform. In 2021, RoboSense achieved the start of production for the M1, becoming the first LiDAR company globally to mass-produce automotive-grade LiDAR with internally developed chips. In 2022, to improve the M platform product range in the automotive LiDAR field, RoboSense introduced the E platform, a blind spot solid-state LiDAR. OEMs that implement RoboSense solutions are BYD, GAC MOTOR, SAIC Motor, Geely, FAW, Toyota, Baic Group, and many others.
RADAR
In advanced driver-assistance systems, a combination of radar types is utilized for optimal performance. Long-range radar (LRR) excels in detecting objects up to 250 meters away. Medium-range radar (MRR) functions effectively within a 1-60 meter radius, while short-range radar (SRR) operates best from 1-30 meters, aiding in tasks like blind-spot detection and parking assistance. Radar sensors are typically positioned on each side of a vehicle, encompassing the front, back, and sides. RADAR in autonomous vehicles operates at frequencies of 24, 76, 77, and 79 GHz.
Structure and physics of a RADAR. Image credit: BabakShah/Wevolver
Two primary radar types are prevalent in these systems: impulse RADAR and frequency-modulated continuous wave (FMCW) RADAR. In impulse RADAR, discrete pulses are emitted from the device and the frequency of the signal remains constant throughout the operation. In FMCW RADAR, a continuous signal is emitted whose frequency is modulated over time.
Research and development in the last three years has addressed many of the challenges in how autonomous vehicles navigate, interact, and adapt to ever-changing environments. Highlights of this research are outlined below.
Solid-State RADAR
Solid-State RADAR sensors employ electronically controlled components to eliminate the need for moving parts. This advancement contributes to higher reliability, durability, and longevity of RADAR sensors, making them suitable for the demanding operational conditions of AVs. Solid-State RADARs are also more compact, enabling easier integration into AV designs. Furthermore, their lower power consumption and reduced heat generation are crucial for maintaining energy efficiency in AVs.25 This technology is being actively researched and implemented by companies such as Continental, Bosch, and Veoneer for applications in AVs. The shift to Solid-State RADAR signifies a move towards more robust and affordable sensing solutions in the evolving landscape of autonomous driving.
4D RADAR
4D RADAR sensors build upon FMCW technology, incorporating time as the fourth dimension. This temporal information enhances the AV's ability to predict the trajectory of moving objects, providing a more comprehensive understanding of the surrounding environment.26 AV companies like Waymo, Aurora, and Argo AI are exploring 4D RADAR sensors to enhance perception in autonomous vehicles. It is worth highlighting that the importance of these sensors can vary based on the overall sensor fusion strategy employed by developers.
Synthetic Aperture RADAR (SAR)
Synthetic Aperture RADAR (SAR) represents an advanced RADAR technique that offers high-resolution imaging capabilities for RADAR sensors. It enables AVs to better perceive and analyze objects, obstacles, and terrain, even in challenging weather conditions or low visibility scenarios.
SAR generates detailed images by synthesizing multiple RADAR measurements taken from different positions as the vehicle moves. This approach creates a large virtual antenna, resulting in finer resolution and improved object recognition. SAR is particularly valuable for identifying small objects, distinguishing between pedestrians and stationary obstacles, and enhancing AVs' perception in complex scenarios. Using sensor movement, it achieves precise angular resolution by creating a substantial antenna aperture. Given the sensor locations, consecutive RADAR measurements may be processed as if a single large antenna array acquired them. The figure below illustrates this principle.27
Recent research by Cambridge, Volkswagen, and the German Institute of Microwaves and Photonics has confirmed that SAR imaging can be successfully and routinely used for high-resolution mapping of urban environments in the near future.28,29,30
Illustration of a synthetic aperture created from consecutive measurements of a moving RADAR.
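In simplified textbook form (not tied to any specific automotive implementation), the resolution gain from such a synthetic aperture of length L_sa, at wavelength λ and range R, can be approximated as:

```latex
\delta_{\theta} \approx \frac{\lambda}{2 L_{\mathrm{sa}}},
\qquad
\delta_{\mathrm{az}} \approx R\,\delta_{\theta} = \frac{\lambda R}{2 L_{\mathrm{sa}}}
```

For example, a 77 GHz radar (wavelength of roughly 3.9 mm) that travels 0.5 m while observing a target 30 m away achieves a cross-range resolution on the order of 12 cm, far finer than its physical antenna alone could deliver.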
Imaging radars
Imaging radar represents a specific RADAR variant capable of constructing 2D or 3D depictions of the neighboring surroundings. Between 2020 and 2023, significant advancements have been made in imaging radar technology, resulting in increased efficiency, improved capabilities, and expanded applications.
First, there has been a substantial enhancement in resolution and imaging precision in modern imaging radars. This development enables the detection of smaller objects and finer environmental details, significantly bolstering safety by improving the identification of pedestrians, cyclists, and obstacles.
Additionally, imaging radars have expanded their capabilities by incorporating multi-mode functionality, including weather-penetrating RADAR modes. These modes enable the RADAR to operate effectively even in challenging weather conditions such as heavy rain, snow, or fog.
Furthermore, imaging radars are increasingly integrated with complementary sensors like LiDAR, cameras, and ultrasonic sensors to enhance perception accuracy. This sensor fusion approach facilitates a comprehensive understanding of the surrounding environment and offers redundancy during sensor failures.
Finally, imaging radars have benefited from advancements in signal processing algorithms, which now enable them to filter out noise, distinguish between various object types, and predict the behavior of detected entities. These advancements contribute significantly to improved decision-making by the autonomous vehicle's control system, enhancing overall safety and performance.
Imaging radar can differentiate between cars, pedestrians, and other objects. Image credit: NXP
4D Imaging RADAR
While traditional imaging radar systems construct 2D or 3D depictions of the surroundings, 4D imaging radars use echo-based time-of-flight measurements to create a 3D representation of the surroundings, with time as the fourth dimension. This technique also provides information about the speed of approaching or retreating vehicles. These RADARs have successfully addressed the primary resolution challenge that conventional RADARs face – their resolution is significantly lower than that of cameras and LiDARs.
4D imaging radars excel at detecting objects both vertically and horizontally, enabling high-resolution object classification. This advancement enhances the RADAR system's ability to determine the vehicle's location independently. 4D imaging radars are not yet standard across all OEMs, but the trend is promising. The adoption of radar technologies varies among automotive manufacturers, which we touch on later in the Tech Stack chapter.
Comparison between current front imaging radars (coverage range from 18º to 80º) and 4D imaging radars (100º coverage range). Image credit: Future Bridge
Millimeter Wave RADARs
Research from groups in both the US and Japan indicates that millimeter wave RADAR has significant potential for AVs beyond its current use in parking assist. Millimeter-wave radar offers a cost-effective alternative to LiDAR, cameras, and optical sensors, primarily because its composition is limited to an integrated circuit (IC) and printed antennas, reducing its overall expense. Additionally, this type of radar demonstrates superior performance in challenging weather conditions like fog and rain, where traditional camera systems might falter. It also excels in detecting non-line-of-sight targets, such as those on curved road sections, making it a more reliable option in complex driving scenarios.31,32 Continental, ZF, Bosch, Hella, Aptiv, Denso, Nidec Elesys, Valeo, Veoneer, and Hitachi are all developing millimeter wave RADARs for use in highly autonomous vehicles.
Companies Developing RADAR Technologies for AVs
Below, we outline companies leading the charge in the development of cutting-edge RADAR technologies tailored specifically for autonomous vehicles.
NVIDIA NVRadarNet
NVIDIA NVRadarNet enhances traditional RADAR processing methods for object detection by incorporating a DNN approach. While classical RADAR processing can identify moving vehicles effectively, it struggles with stationary objects, often misclassifying them. The solution involved training a DNN using data from RADAR sensors to detect both moving and stationary objects and differentiate between various stationary obstacles.
To address sparse RADAR data, ground truth labels were transferred from corresponding LiDAR datasets, allowing the DNN to learn not only object detection but also their 3D shapes, dimensions, and orientations. The integration of the RADAR DNN with classical RADAR processing improved obstacle perception, aiding AVs in making better driving decisions, even in complex scenarios, and offering redundancy to camera-based obstacle detection.33
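A minimal sketch of that label-propagation step is shown below, assuming a known rigid transform between the LiDAR and RADAR coordinate frames; the extrinsic values and box format are hypothetical and not NVIDIA's actual tooling.

```python
# Sketch: propagate a LiDAR-frame bounding-box center into the RADAR frame
# using a rigid extrinsic transform (rotation + translation). The transform
# and box values are hypothetical.
import numpy as np

# Hypothetical LiDAR -> RADAR extrinsics: identity rotation, with the RADAR
# mounted 1.5 m ahead of and 0.4 m below the LiDAR.
T_LIDAR_TO_RADAR = np.array([
    [1.0, 0.0, 0.0, -1.5],
    [0.0, 1.0, 0.0,  0.0],
    [0.0, 0.0, 1.0,  0.4],
    [0.0, 0.0, 0.0,  1.0],
])

def to_radar_frame(points_lidar: np.ndarray) -> np.ndarray:
    """points_lidar: (N, 3) XYZ points labeled in the LiDAR frame."""
    homo = np.hstack([points_lidar, np.ones((len(points_lidar), 1))])
    return (T_LIDAR_TO_RADAR @ homo.T).T[:, :3]

# Center of a car bounding box labeled in the LiDAR point cloud.
box_center_lidar = np.array([[25.0, -3.2, 0.8]])
print(to_radar_frame(box_center_lidar))   # same box center, RADAR coordinates
```

The same transform is applied to every box corner so that the sparse RADAR detections inside it can inherit the LiDAR label.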
“The DNNs, the deep neural networks, are becoming more and more complex. We have the ability to not just detect a pedestrian, but to detect a distracted pedestrian.” - Danny Shapiro - VP of Automotive at NVIDIA
Example of propagating bounding box labels for cars from the LiDAR data domain into the RADAR data domain. Image credit: NVIDIA
Navtech
Navtech RADAR offers a robust sensor solution for AVs, ensuring performance in adverse conditions where other sensors might falter. The high-resolution, 360°, long-range RADAR excels in adverse weather and environmental challenges, providing an extensive and accurate view of its surroundings.
In 2021, this technology was chosen by Örebro University as a key sensor for groundbreaking AV research, with a special focus on operating faultlessly in the harshest conditions: dust, dirt, and low environmental visibility. This RADAR's application extends to test routes and behavior analysis of both autonomous and regular vehicles, further solidifying its role in advancing autonomous technology.34
NXP
In January 2023, NXP released a new industry-first 28nm RFCMOS radar one-chip IC family for next-generation autonomous driving systems, enabling the long-range detection of objects and separation of small objects next to larger ones. This technology offers faster signal processing and allows for the implementation of 4D imaging radar capabilities in vehicles, particularly for levels of automation like L2+ and higher. These developments provide a cost-effective solution for original equipment manufacturers to integrate advanced RADAR systems into their vehicles. In addition to the RADAR processor and transceivers, NXP also offers essential peripherals, including safe power management and in-vehicle network components, to create a complete RADAR node system.35
Vayyar
In 2021, Vayyar developed a production-ready RADAR-on-Chip (RoC) platform. The platform offers a single multifunctional chip capable of replacing multiple traditional one-function sensors, reducing complexity for in-cabin and AV applications. The RoC features up to 48 transceivers, an internal DSP, and an MCU for real-time signal processing, providing all-weather effectiveness and the ability to see through objects.
This single-chip solution can replace over a dozen sensors, eliminating the need for expensive LiDAR and cameras. Vayyar's RoC offers a wide range of applications, from intruder alerts to enhanced seat belt reminders, catering to the increasing sensor density in modern vehicles while delivering uncompromising safety.36,37
Ultrasonic Sensors
An ultrasonic sensor is an electronic device that measures the distance to a target object by emitting ultrasonic sound waves and converting the reflected sound into an electrical signal. Within autonomous vehicles, they are most commonly employed in Intelligent Parking Assist Systems (IPAS), which aid vehicles in parking maneuvers by providing real-time distance and object detection information to the vehicle's control system. From an innovation perspective, ultrasound technology is not known for frequent breakthroughs. Nevertheless, two recent technical solutions in the field of AVs deserve special attention.
Structure of an ultrasonic sensor.
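Before turning to those two solutions, the sketch below shows the basic ranging calculation every such sensor performs; the temperature correction is a standard approximation and the echo time is a made-up example.

```python
# Sketch: ultrasonic ranging. Distance = speed of sound * echo time / 2,
# with the speed of sound corrected for air temperature (standard
# approximation).
def speed_of_sound_mps(temp_c: float) -> float:
    """Approximate speed of sound in air (m/s) at a given temperature."""
    return 331.3 + 0.606 * temp_c

def ultrasonic_distance_m(echo_time_s: float, temp_c: float = 20.0) -> float:
    return speed_of_sound_mps(temp_c) * echo_time_s / 2.0

# An echo arriving 5.8 ms after the chirp corresponds to roughly 1 m at 20 °C.
print(ultrasonic_distance_m(echo_time_s=5.8e-3))   # ~1.0 m
```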
In 2023, a MEMS ultrasonic sensor solution was introduced for intelligent cabin Child Presence Detection, a capability crucial for child safety in vehicles. It utilizes various sensors to detect children inside a car and alerts the driver. The MEMS ultrasonic sensor module has compact dimensions, measuring 30 x 20 x 5 mm, significantly smaller than both open ultrasonic and millimeter-wave RADAR modules.
This MEMS ultrasonic Child Presence Detection solution boasts a detection distance of over 1m and a field of view reaching 180° (±90°), ensuring comprehensive coverage and precise monitoring for all cabin positions. Notably, the latest Euro NCAP standards suggest that MEMS ultrasonic sensing could dominate Child Presence Detection systems due to its efficient vital sign detection, extensive sensing range, compact size, and discreet installation. NCAP has now included Child Presence Detection in its testing criteria.
Also, in 2023, Murata unveiled a new water-resistant ultrasonic sensor designed for self-driving cars, known as the MA48CF15-7N. This sensor is highly sensitive, responds quickly, and is enclosed in a sealed case to protect it from liquids. As cars become more autonomous, the demand for precise short to medium-range sensors to detect objects is growing. The MA48CF15-7N operates by emitting ultrasonic waves and measuring the time it takes for them to bounce back, determining the presence and distance of nearby objects. This sensor can detect objects as close as 15cm and as far away as 550cm, covering a wide area with a 120° by 60° angle. Notably, the sensor's capacitance is 1100pF±10% at 1kHz, ensuring consistent performance without the need for frequent adjustments. Operating at a resonant frequency of 48.2±1.0kHz and with a quality factor (Q value) of 35±10, it delivers reliable performance across various temperatures. These specifications are notably more precise than previous models from Murata, with a 50% reduction in variability, ensuring consistent performance across different units.
Location of Continental ultrasonic parking sensor. Image credit: Continental
Leadership Interviews
Interview with Mouser
Interview with Murata
Interview with MacroFab
Interview with Nexperia
Interview with SAE International
Interview with Autoware Foundation
Interview with NVIDIA
References
1. SceneScan Pro User Story: 3D Stereo Vision Cameras for Autonomous Racing Cars. https://nerian.com/news/scenescan-pro-user-story-3d-stereo-vision-cameras-for-autonomous-racing-cars/ (2022).
2. Foresight. 3D Stereo Vision for the Autonomous Vehicle Industry. Foresight https://www.foresightauto.com/3d-stereo-vision-for-the-autonomous-vehicle-industry/ (2023).
3. Bhadoriya, A. S., Vegamoor, V. & Rathinam, S. Vehicle Detection and Tracking Using Thermal Cameras in Adverse Visibility Conditions. Sensors 22, (2022).
4. Computer vision challenges in autonomous vehicles: The future of AI. https://www.superannotate.com/blog/computer-vision-in-autonomous-vehicles.
5. O’Neill, M. A Visionary Leap: Enhancing Computer Vision for Autonomous Vehicles and Cyborgs. SciTechDaily https://scitechdaily.com/a-visionary-leap-enhancing-computer-vision-for-autonomous-vehicles-and-cyborgs/ (2023).
6. AI-Enhanced night vision: A breakthrough for autonomous cars. IO https://innovationorigins.com/en/ai-enhanced-night-vision-a-breakthrough-for-autonomous-cars/ (2023).
7. Cvijetic, N. Panoptic Segmentation Helps Autonomous Vehicles See Outside the Box. NVIDIA Blog https://blogs.nvidia.com/blog/2019/10/23/drive-labs-panoptic-segmentation/ (2019).
8. Bridging to the Autonomous Future with Mobileye SuperVisionTM. Mobileye https://www.mobileye.com/blog/mobileye-supervision-bridge-to-consumer-autonomous-vehicles/.
9. LIDAR Sensor Integration in Autonomous Vehicles Platforms. https://www.mrlcg.com/latest-media/lidar-sensor-integration-in-autonomous-vehicles-platforms-300849/ (2023).
10. Innovations of Autonomous Vehicles: Solid-State LiDAR Sensors. EqualOcean https://equalocean.com/analysis/2022031917148.
11. Home. Aeva https://www.aeva.com/ (2023).
12. Rahn, S. Technology. Blickfeld https://www.blickfeld.com/technology/ (2021).
13. Velodyne Lidar launches a design center in Bangalore. https://www.businesswire.com/news/home/20210624006043/es/ (2021).
14. Innoviz Technologies. Innoviz Technologies to Present at Upcoming Investor Conferences. The Victoria Advocate https://www.victoriaadvocate.com/innoviz-technologies-to-present-at-upcoming-investor-conferences/article_69c829b2-d474-52b0-825b-7f3591907377.html (2023).
15. Luminar. Luminar (Nasdaq: LAZR). Luminar https://www.luminartech.com/.
16. Home. Aeva https://www.aeva.com/ (2023).
17. Quanergy filing for bankruptcy, going up for sale. Security Info Watch https://www.securityinfowatch.com/perimeter-security/physical-hardening/perimeter-security-sensors/press-release/21290282/quanergy-quanergy-filing-for-bankruptcy-going-up-for-sale (2022).
18. Quanergy Announces Industry-first 3D LiDAR Movement-Based False Alarm Reduction Solution. Quanergy Solutions, Inc. | LiDAR Sensors and Smart Perception Solutions https://quanergy.com/pressrel/quanergy-announces-industry-first-3d-lidar-movement-based-false-alarm-reduction-solution/.
19. [No title]. https://static.mobileye.com/website/corporate/media/radar-lidar-fact-sheet.pdf.
20. Nellis, S. Mobileye looks to build its own lidar to drive down self-driving costs. Reuters (2020).
21. High Resolution 3D Flash LiDAR™. Continental Automotive http://www.continental-automotive.com/en-gl/Passenger-Cars/Autonomous-Mobility/Enablers/Lidars/3D-Flash-Lidar.
22. Continental Aftermarket. http://www.continental-aftermarket.com/us-en/press/press-releases/2021/2021-11-08-continental-releases-hfl110-3d-flash-lidar-to-series-production.
23. Carman, A. Blickfeld Unveils First Smart LiDAR Sensor with Built-in Software Stack. All About Circuits https://www.allaboutcircuits.com/news/blickfeld-unveils-first-smart-lidar-sensor-with-built-in-software-stack/ (2022).
24. Blickfeld. Blickfeld launches Qb2; the world’s first Smart LiDAR sensor, enabling capturing and processing of 3D. Blickfeld https://www.blickfeld.com/blickfeld-launches-qb2/ (2022).
25. Difference between solid state radar and magnetron. https://lidarradar.com/info/difference-between-solid-state-radar-and-magnetron.
26. Hsu, W.-T. & Lin, S.-L. Using FMCW in Autonomous Cars to Accurately Estimate the Distance of the Preceding Vehicle. Int. J. Automot. Technol. Manage. 23, 1755–1762 (2023).
27. Global, B. Bosch Research Blog. Bosch Global https://www.bosch.com/stories/synthetic-aperture-radar/ (2022).
28. Manzoni, M. et al. Automotive SAR imaging: potentials, challenges, and performances. International Journal of Microwave and Wireless Technologies 1–10 (2023).
29. Website. https://www.researchgate.net/publication/334633392_Synthetic_Aperture_Radar_Towards_Automotive_Applications.
30. [No title]. https://arxiv.org/pdf/2306.09784.pdf.
31. Website. https://omdia.tech.informa.com/blogs/2023/sensors-for-safe-autonomous-driving.
32. IEEE Xplore Full-Text PDF: https://ieeexplore.ieee.org/stamp/stamp.jsp?arnumber=9528970.
33. Cvijetic, N. How AI Improves Radar Perception for Autonomous Vehicles. NVIDIA Blog https://blogs.nvidia.com/blog/2021/04/28/drive-labs-ai-radar-perception-autonomous-vehicles/ (2021).
34. Autonomous Vehicles. Navtech Radar https://navtechradar.com/explore/autonomous-vehicles/ (2021).
35. The Importance of Imaging Radar. http://www.nxp.com/company/blog/the-importance-of-imaging-radar:BL-THE-IMPORTANCE-OF-IMAGING-RADAR.
36. eVehicle Technology. New 4D Imaging Radar Sensor to Revolutionise Automotive Safety. eVehicle Technology https://www.evehicletechnology.com/news/4d-imaging-radar-sensor-to-revolutionise-automotive-safety/ (2020).
37. Vayyar. Vayyar Becomes First And Only Company In World To Offer Full-cabin Monitoring With One Radar-on-chip. PR Newswire https://www.prnewswire.com/news-releases/vayyar-becomes-first-and-only-company-in-world-to-offer-full-cabin-monitoring-with-one-radar-on-chip-301251217.html (2021).
38. Maanpää, J. et al. Multimodal End-to-End Learning for Autonomous Steering in Adverse Road and Weather Conditions. https://ieeexplore.ieee.org/document/9413109.
39. Multi-modality 3D object detection in autonomous driving: A review. Neurocomputing 553, 126587 (2023).
40. [No title]. https://arxiv.org/pdf/2002.00444.pdf.
41. Ravi Kiran, B. et al. Deep Reinforcement Learning for Autonomous Driving: A Survey. https://ieeexplore.ieee.org/document/9351818.
42. Wiggers, K. Uber claims its AI enables driverless cars to predict traffic movement with high accuracy. VentureBeat https://venturebeat.com/ai/uber-claims-its-ai-enables-driverless-cars-to-predict-traffic-movement-with-high-accuracy/ (2020).
43. Edge AI Computing Advancements Driving Autonomous Vehicle Potential. Global Semiconductor Alliance https://www.gsaglobal.org/forums/edge-ai-computing-advancements-driving-autonomous-vehicle-potential/ (2021).
44. Snapdragon Ride SDK: a premium platform for developing customizable ADAS applications. https://www.qualcomm.com/news/onq/2022/01/snapdragon-ride-sdk-premium-solution-developing-customizable-adas-and-autonomous.
45. Edge AI Computing for Autonomous Driving System - Lanner Electronics. https://www.lannerinc.com/applications/transportation/edge-ai-computing-for-autonomous-driving-system.
46. AI Accelerator Chips Overview and Comparison. HardwareBee https://hardwarebee.com/ai-accelerator-chips-overview-and-comparison/ (2023).
47. Lardinois, F. Google Cloud announces the 5th generation of its custom TPUs. TechCrunch https://techcrunch.com/2023/08/29/google-cloud-announces-the-5th-generation-of-its-custom-tpus/ (2023).
48. Francisco, A. Intel Brings New Chips, Autonomous Driving, and Artificial Intelligence to CES 2020. Showmetech https://www.showmetech.com.br/en/intel-brings-chips-and-news-to-ces-2020/ (2020).
49. Hajela, R. Looking at 11th Generation Intel® Processor Performance on Intel® DevCloud for Edge Workloads. OpenVINO-toolkit https://medium.com/openvino-toolkit/looking-at-11th-generation-intel-processor-performance-on-intel-devcloud-for-the-edge-213a413fb5e1 (2022).
50. Riley, D. Intel makes a splash at CES with AI, autonomous driving tech and Tiger Lake chips. SiliconANGLE https://siliconangle.com/2020/01/06/intel-makes-splash-ces-autonomous-driving-tech-ai-tiger-lake/ (2020).
51. Self-Driving Cars Technology & Solutions from NVIDIA Automotive. NVIDIA https://www.nvidia.com/en-us/self-driving-cars/.
52. Volvo Cars, Zoox, SAIC and More Join Growing Range of Autonomous Vehicle Makers Using New NVIDIA DRIVE Solutions. NVIDIA Newsroom http://nvidianews.nvidia.com/news/volvo-cars-zoox-saic-and-more-join-growing-range-of-autonomous-vehicle-makers-using-new-nvidia-drive-solutions.
53. Korosec, K. Qualcomm unveils its Snapdragon Ride platform for all levels of automated driving. TechCrunch https://techcrunch.com/2020/01/06/qualcomm-unveils-its-snapdragon-ride-platform-for-all-levels-of-automated-driving/ (2020).
54. Website. https://europe.autonews.com/automakers/vw-use-qualcomms-snapdragon-ride-platform-autonomous-driving-push.
55. Automotive RTOS. https://blackberry.qnx.com/en/ultimate-guides/software-defined-vehicle/automotive-rtos.
56. Saxena, A. Understanding AUTOSAR and its Applications in the Automotive Industry. https://www.einfochips.com/blog/autosar-in-automotive-industry/ (2020).
57. Apex.AI. Apex.AI https://www.apex.ai/blog.
58. Cognata. Cognata Autonomous and ADAS Simulation https://www.cognata.com/ (2022).
59. Ford China deploys C-V2X service in Changchun, Jilin Province. https://autonews.gasgoo.com/m/70024138.html.
60. Qualcomm Introduces Car-to-Cloud Service for Over-the-Air Vehicle Updates and On-Demand Services & Features. https://www.qualcomm.com/news/releases/2020/01/qualcomm-introduces-car-cloud-service-over-air-vehicle-updates-and-demand.
61. Vehicle Telematics, Geofencing & Fleet Tracking Systems for the Connected Car. (2006).
62. Cisco 5G Network Architecture. (2020).
63. Infoshare Systems, Inc. How cyber security threats are impacting the automotive industry. https://www.linkedin.com/pulse/how-cyber-security-threats-impacting-automotive-industry-1f (2023).
64. Website. https://www.sciencedirect.com/science/article/pii/S209580991930503X.
65. [No title]. https://iopscience.iop.org/article/10.1088/1742-6596/1694/1/012024/pdf.
66. ESCRYPT Intrusion detection and prevention solution. https://www.etas.com/en/products/intrusion-detection-and-prevention-solution.php.
67. TRANSFORMING SECURITY FROM A LIMITATION TO A BUSINESS VALUE MULTIPLIER. C2A Security - the Only Mobility Centric DevSecOps Platform https://c2a-sec.com/ (2022).
68. C2A. C2A Security Partners with Valeo to Enhance Cybersecurity for the Software Defined Vehicle. PR Newswire https://www.prnewswire.com/news-releases/c2a-security-partners-with-valeo-to-enhance-cybersecurity-for-the-software-defined-vehicle-301713633.html (2023).
69. Dave, P. Dashcam Footage Shows Driverless Cars Clogging San Francisco. WIRED https://www.wired.com/story/dashcam-footage-shows-driverless-cars-cruise-waymo-clogging-san-francisco/ (2023).
70. Waymo Driver. Waymo https://waymo.com/waymo-driver/.
71. How Autonomous Vehicles Work. https://ltad.com/about/how-autonomous-vehicles-work.html.
72. Merchant, B. Column: Self-driving Waymo cars might worsen L.A. traffic. Los Angeles Times (2023).
73. Waymo Driver. Waymo https://waymo.com/waymo-driver/.
74. Rangaiah, M. How Waymo is using AI for autonomous driving? https://www.analyticssteps.com/blogs/how-waymo-using-ai-autonomous-driving.
75. Website. https://www.uber.com/blog/autonomous-vehicle-safety-guidelines/.
76. Reuters. Uber partners with Alphabet’s Waymo to offer driverless rides. Reuters (2023).
77. Jennewine, T. 6 Ways Tesla’s Autonomous Driving Technology Is Evolving. The Motley Fool https://www.fool.com/investing/2022/01/27/6-ways-teslas-autonomous-driving-technology-is-evo/ (2022).
78. Tesla Vision Update: Replacing Ultrasonic Sensors with Tesla Vision. Tesla https://www.tesla.com/support/transitioning-tesla-vision.
79. Moon, S. How Tesla made autonomous possible without LIDAR. Datahunt - Quality Data with Quality AI https://www.thedatahunt.com/en-insight/how-tesla-autonomous-driving-possible-without-lidar (2023).
80. Autopilot. https://www.tesla.com/autopilot.
81. Fox, E. Tesla 2020.4.11 Software OTA Update To Boost Ranges Of Model S/X. TESMANIAN https://www.tesmanian.com/blogs/tesmanian-blog/tesla-2020-4-11-software-ota-update-to-boost-ranges-of-model-s-x (2020).
82. Cardello, S. What Are Over-the-Air Updates For Cars? MUO https://www.makeuseof.com/what-are-over-the-air-updates-for-cars/ (2022).
83. Hawkins, A. J. Here’s our first full look at the Cadillac Celestiq ultra-luxury electric sedan. The Verge https://www.theverge.com/23273937/gm-cadillac-celestiq-ev-sedan-photos-specs-reveal (2022).
84. [No title]. https://www.volvocars.com/intl/news/technology/Your-Volvo-car-will-understand-you/.
85. Pollock, D. General Motors Applies For Decentralized Blockchain Map Patent. Forbes https://www.forbes.com/sites/darrynpollock/2020/04/03/general-motors-applies-for-decentralized-blockchain-map-patent/ (2020).
86. Schoppa, C. Top Autonomous Vehicles Companies to Watch in 2023. AI Time Journal - Artificial Intelligence, Automation, Work and Business https://aitimejournal.com/autonomous-vehicles-companies-to-watch/ (2022).
87. Ammann, D. The Cruise Origin Story - Cruise - Medium. Cruise https://medium.com/cruise/the-cruise-origin-story-b6e9ad4b47e5 (2020).
88. Vara, V. Leading vehicle manufacturing companies in the autonomous vehicles theme. Just Auto http://www.just-auto.com/data-insights/top-ranked-vehicle-manufacturing-companies-in-autonomous-vehicles/ (2022).
89. Autonomous Vehicle Technology. https://getcruise.com/technology/.
90. Pavlik, T. When Volvo will have its first autonomous vehicle might surprise you. MotorBiscuit https://www.motorbiscuit.com/volvo-first-autonomous-vehicle-might-surprise/ (2022).
91. Dyer, N. Volvo’s Ride Pilot autonomous driving tech allows unsupervised driving. CARHP https://www.carhp.com/news/volvo-s-ride-pilot-autonomous-driving-tech-allows-unsupervised-driving (2023).
92. Rodríguez, J., Jr. Volvo EX90 comes with self-driving hardware, accessible by a subscription. Jalopnik https://jalopnik.com/volvo-ex90-subscription-based-self-driving-tech-1850197959 (2023).
93. Kirby, C. & Flear, K. Volvo’s First OTA Update: More Range for Some Models. https://getjerry.com/insights/volvos-first-ota-update-range-models (2022).