Autonomous Vehicle Tech Stack Review
In this chapter, we delve into the current status of prominent autonomous vehicle manufacturers, shedding light on their advancements, achievements, and strategic directions. As these industry leaders push the boundaries of AV technology, they play a pivotal role in shaping the future of transportation.

Waymo, Tesla, Cruise, and Volvo
This overview provides insights into the latest developments and showcases how Waymo, Tesla, Cruise, and Volvo are navigating the complex journey toward fully autonomous vehicles. We have chosen to focus on these four because they represent a diverse range of approaches and technologies in the autonomous vehicle space, and because all of them make enough technical information public to enable a meaningful comparison.
Waymo
Waymo, a subsidiary of Alphabet (Google's parent company), started research on autonomous vehicles in 2009. In October 2020, it became the first robotaxi operator to offer rides to the public without safety drivers in the vehicle.
Waymo’s 5th-generation driver is a combination of hardware, software, and compute designed to navigate complex driving environments. It relies on a comprehensive sensor suite, including high-resolution 360-degree LiDAR with a 300-meter range, cameras with overlapping fields of view for detailed imaging, and a newly designed imaging radar system that provides high resolution even in adverse weather conditions. The technology was developed from over 20 million self-driven miles and 10 billion simulated miles. In the last three years, Waymo has focused on scalable production, reducing costs while increasing sensor capabilities.
Since 2018, Waymo has been working with Jaguar Land Rover to create the world’s first premium electric fully self-driving vehicle. Its latest iteration is currently being tested on public roads in the US.
5th-generation Waymo Driver. Image credit: Waymo
Camera Array and Coverage
Waymo's enhanced vision system currently integrates high-dynamic range cameras with exceptional thermal stability to deliver crisp, detailed images across extreme automotive temperature conditions. The long-range and 360-degree cameras extend vision capabilities beyond 500 meters, sharpening the detection of critical elements like pedestrians and road signs. Moreover, custom-designed lenses and meticulous optomechanical construction elevate these cameras beyond current standards. In synergy with perimeter LiDAR sensors, the perimeter vision system grants additional contextual data, improving object identification.
The peripheral vision system mitigates blind spots, ensuring safer maneuvering around large vehicles. This network of cameras empowers the Waymo Driver with unprecedented decision-making clarity and speed.
Camera view of Waymo's Jaguar I-PACE vehicle. Image credit: Waymo.
LiDAR
The 5th-generation Waymo Driver employs a sophisticated overlapping LiDAR system. Its core LiDAR creates a 3D picture of the vehicle's surroundings that can discern the size and distance of objects around it. This system is effective over 300 meters, allowing it to identify objects in various lighting conditions, from bright sunlight to moonless nights.
The 360 LiDAR system offers a comprehensive view that can distinguish minute details, such as a car door opening a block away, aiding navigation in complex city environments. It also enables Waymo's trucks to detect road debris from a considerable distance, allowing timely and safe maneuvering on highways.69
Waymo's perimeter LiDARs, placed at strategic points around the vehicle, afford a wide field of view for detecting proximity objects. This feature is critical for navigating tight spaces in heavy traffic and monitoring potential blind spots caused by the terrain. Altogether, these LiDAR systems represent a significant upgrade from previous iterations, improving the Waymo Driver's ability to handle more challenging driving scenarios.
RADAR
Waymo's sensor fusion is defined by the integration of LiDAR, camera, and RADAR technologies. LiDAR constructs a 3D outline of objects, while cameras contextualize the vehicle's surroundings. The radar, with its swift velocity measurement, excels in challenging weather, offering a consistent panoramic view. The 5th-generation radar architecture contains an imaging radar system that enhances resolution and range. It is engineered to cover vast distances, such as detecting a distant motorcyclist, providing the Waymo Driver with improved reaction time, and ensuring a smoother journey for passengers.70,71
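The complementary strengths described above are often combined with weighted fusion: estimates from a precise sensor count for more than estimates from a noisy one. The sketch below illustrates the general idea with inverse-variance weighting; the class names and the specific variance values are illustrative assumptions, not Waymo's actual implementation.

```python
from dataclasses import dataclass

@dataclass
class RangeEstimate:
    """A single sensor's distance estimate (meters) with its variance."""
    distance: float
    variance: float

def fuse_estimates(estimates: list[RangeEstimate]) -> RangeEstimate:
    """Inverse-variance weighted fusion: precise sensors dominate the result,
    and the fused variance is smaller than any individual sensor's variance."""
    weights = [1.0 / e.variance for e in estimates]
    total = sum(weights)
    fused_distance = sum(w * e.distance for w, e in zip(weights, estimates)) / total
    return RangeEstimate(distance=fused_distance, variance=1.0 / total)

# LiDAR is precise at this range; RADAR is noisier but works in any weather.
lidar = RangeEstimate(distance=120.4, variance=0.04)
radar = RangeEstimate(distance=121.0, variance=1.0)
fused = fuse_estimates([lidar, radar])
```

The fused estimate lands much closer to the LiDAR reading, while still incorporating the RADAR measurement — exactly the behavior a fusion stack wants when one modality degrades in bad weather and its variance grows.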
Waymo smart LiDAR solutions. Image credit: Waymo.
Artificial Intelligence
Within its AVs, Waymo integrates AI for diverse functions, including object detection, lane identification, and obstacle evasion. The company harnesses AI to create an environment mapping and route planning system for its autonomous fleet.72 In addition, Waymo quantifies uncertainty in sensor data using probabilistic methods, enabling it to estimate event probabilities, such as the likelihood of a pedestrian crossing. Moreover, data augmentation is harnessed to expand training data artificially, diminishing the impact of noise. The company also enhances accuracy by using ensemble learning and training distinct autonomous perception models.
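A minimal sketch of the probabilistic reasoning described above: each new perception cue updates the estimated probability of an event via Bayes' rule. The prior and the cue likelihoods here are invented for illustration; they are not Waymo's numbers.

```python
def bayes_update(prior: float,
                 likelihood_given_event: float,
                 likelihood_given_no_event: float) -> float:
    """Posterior probability of an event after observing one cue (Bayes' rule)."""
    num = likelihood_given_event * prior
    den = num + likelihood_given_no_event * (1.0 - prior)
    return num / den

# Prior: assume 10% of pedestrians at this curb step into the road.
p_cross = 0.10
# Cue 1: pedestrian is facing the roadway — seen (say) in 80% of crossings
# but only 20% of non-crossings.
p_cross = bayes_update(p_cross, 0.8, 0.2)
# Cue 2: pedestrian starts moving toward the curb edge.
p_cross = bayes_update(p_cross, 0.9, 0.1)
```

Two consistent cues lift a 10% prior to an 80% posterior, which is the kind of quantified uncertainty a planner can act on (slow down, yield) rather than a hard yes/no label.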
Waymo employs a hybrid strategy, blending deep learning with handcrafted features to enhance its feature extraction process. Its DL models are trained on an extensive dataset collected from its self-driving vehicles, encompassing images, LiDAR, and RADAR data. These models learn to identify vital driving-related attributes, such as object shapes, distances, and velocities.
Furthermore, the company incorporates handcrafted features within its ML models. In this case, humans design these attributes based on their knowledge of the environment and driving dynamics. As an illustration, they may incorporate features like sun position, road color, and the presence of traffic signage.
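A common way to realize this hybrid is to concatenate a learned embedding with the human-designed attributes before the final classifier. The sketch below shows that pattern using the features mentioned above (sun position, road color, traffic signage); the function names and example values are assumptions for illustration, not Waymo's code.

```python
import math

def handcrafted_features(sun_elevation_deg: float,
                         road_brightness: float,
                         has_traffic_sign: bool) -> list[float]:
    """Human-designed attributes of the kind described in the text (illustrative)."""
    return [
        math.sin(math.radians(sun_elevation_deg)),  # encodes sun position
        road_brightness,                            # proxy for road color
        1.0 if has_traffic_sign else 0.0,           # signage presence flag
    ]

def fuse_feature_vector(learned_embedding: list[float],
                        crafted: list[float]) -> list[float]:
    """Hybrid representation: learned embedding concatenated with handcrafted features."""
    return list(learned_embedding) + list(crafted)

embedding = [0.12, -0.40, 0.88]  # stand-in for a perception network's output
crafted = handcrafted_features(sun_elevation_deg=30.0,
                               road_brightness=0.6,
                               has_traffic_sign=True)
features = fuse_feature_vector(embedding, crafted)
```

The downstream model then sees both what the network learned from data and what engineers know matters, which helps on rare cases the training set covers poorly.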
Simulation and Testing
Waymo's engineering team deploys simulations to expose autonomous driving systems to collision scenarios. This method refines algorithms and responses without risking actual vehicles. Accumulating over 20 billion miles in simulation, Waymo identifies challenging situations autonomous cars might face on roads. This ongoing practice, involving simulations adjusted with accurate data and virtual scenario creation, enhances the autonomous driving software.73,74
Commercial Partnerships
In 2022, Waymo and Uber partnered to introduce driverless cars to Uber’s platform, allowing customers to use a specific number of Waymo's AVs for rides and deliveries within a defined area.75,76
Tesla
Founded in 2003 by a group of engineers with the mission of proving that electric cars could be better than gasoline-powered cars, Tesla, Inc. has grown to become not only the most recognizable name in the electric vehicle (EV) market but also a leader at the frontier of autonomous driving technology. Headquartered in Palo Alto, California, Tesla's name pays homage to Nikola Tesla, the renowned inventor and electrical engineer.
Camera Array and Coverage
Tesla's transition to a camera-only approach began in 2021, when the company moved North American production of the Model 3 and Model Y to a pure vision model, removing the RADAR sensors.77 However, the most recent Model X on HW4 is equipped with a new Tesla-built RADAR. The Model 3 and Model Y built for the European and Middle Eastern markets use the internally developed, camera-based Tesla Vision, relying solely on Tesla's advanced suite of cameras and neural-net processing to deliver Autopilot and related features.
Tesla Vision leverages the capabilities of neural networks and machine learning to interpret visual data, a technique similar to the human visual system. This approach relies mainly on camera inputs, eschewing other sensor modalities commonly used in autonomous driving systems, such as LiDAR. The Tesla Vision system utilizes eight external cameras, providing 360-degree visibility around the vehicle at distances of up to 250 meters. These cameras are divided into three categories based on their field of view: main, wide, and narrow. The main front-facing camera is responsible for detecting objects directly ahead of the vehicle, the wide-angle cameras assist with peripheral vision and short-range data, and the narrow-angle cameras focus on distant objects, enabling early recognition of fast-approaching vehicles and other hazards.
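The three forward camera categories trade field of view against range, so which cameras can see an object depends on its distance. The sketch below models that trade-off; Tesla publishes the ~250-meter maximum but not this exact per-camera breakdown, so the FOV and range values here are illustrative assumptions.

```python
from dataclasses import dataclass

@dataclass
class Camera:
    name: str
    fov_deg: float   # horizontal field of view
    range_m: float   # useful detection range

# Illustrative values: narrow cameras see far, wide cameras see broadly.
CAMERAS = [
    Camera("narrow_forward", fov_deg=35.0, range_m=250.0),
    Camera("main_forward", fov_deg=50.0, range_m=150.0),
    Camera("wide_forward", fov_deg=120.0, range_m=60.0),
]

def cameras_covering(distance_m: float) -> list[str]:
    """Which forward cameras can still resolve an object at this distance?"""
    return [c.name for c in CAMERAS if distance_m <= c.range_m]
```

At long range only the narrow camera contributes (early detection of fast-approaching vehicles); near the car all three overlap, which is what gives the system redundancy for close-in maneuvers.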
Coverage zones of Tesla's car cameras. Image credit: Armstrong, K.
In 2022, the company began removing ultrasonic sensors from its vehicles, replacing them with its vision-based occupancy network, currently used in Full Self-Driving (FSD).78
Data Processing and Neural Network Architecture
In 2019, Tesla unveiled a proprietary AI-driven hardware platform, Hardware 3.0 (also called AP3), which is the foundation for its Full Self-Driving (FSD) suite. Since May 2023, its successor, Hardware 4.0 (HW4), has shipped in new Teslas. HW4 uses a RADAR named "Phoenix" that operates in the 76–77 GHz band and supports three sensing modes. This HD Synthetic Aperture Radar (SAR) system improves Tesla's situational comprehension, particularly in conditions where optical systems struggle. Its ability to deliver reliable performance in low-visibility conditions such as darkness, fog, rain, or snow significantly bolsters the safety and reliability of the vehicles' self-driving functions.
This onboard processing unit is equipped with a powerful neural network accelerator capable of performing trillions of operations per second. The neural networks employed by Tesla are trained on vast datasets collected from the fleet, encompassing diverse driving conditions and scenarios.
These networks are designed to perform complex visual recognition tasks such as identifying lane lines, traffic signals, road signs, and obstacles. They can make temporal associations across frames, which is critical for understanding the dynamics of the driving environment, such as the trajectory of moving objects.
Tesla's neural networks are trained in PyTorch using real-world and simulated data gathered from their vehicles. This approach strongly emphasizes feature extraction directly from visual data, making the system reliant on robust image-based feature representations. Tesla's preprocessing techniques include data augmentation, which involves introducing various transformations to the training data, enhancing the model's ability to generalize to different scenarios.79,80
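Data augmentation of the kind described above applies label-preserving transformations so the model generalizes beyond the exact frames it was trained on. The sketch below shows two classic image augmentations on a toy grayscale "image" (a nested list of intensities in [0, 1]); the function names and parameter ranges are illustrative assumptions, not Tesla's pipeline.

```python
import random

def hflip(image: list[list[float]]) -> list[list[float]]:
    """Mirror each row left-to-right. Direction-sensitive labels
    (e.g. 'object moving left') must be flipped alongside the pixels."""
    return [row[::-1] for row in image]

def jitter_brightness(image: list[list[float]], factor: float) -> list[list[float]]:
    """Scale pixel intensities, clamped to [0, 1], simulating lighting changes."""
    return [[min(1.0, max(0.0, p * factor)) for p in row] for row in image]

def augment(image: list[list[float]], rng: random.Random) -> list[list[float]]:
    """Randomly compose the transformations, as in a typical training pipeline."""
    out = image
    if rng.random() < 0.5:
        out = hflip(out)
    return jitter_brightness(out, rng.uniform(0.8, 1.2))

img = [[0.1, 0.5, 0.9],
       [0.2, 0.6, 1.0]]
augmented = augment(img, random.Random(0))
```

Each training epoch sees a slightly different version of every frame, which is what makes the learned features robust to lighting and viewpoint variation rather than memorized pixel patterns.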
Visual Perception and Decision-Making
Tesla Vision's algorithms process the camera feeds to create a coherent picture of the environment around the vehicle. They include detecting and classifying various elements like vehicles, pedestrians, cyclists, and static objects. The system then uses this information to make real-time driving decisions, such as steering, braking, and accelerating, aiming to mimic an attentive and skilled human driver.
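A core quantity behind braking decisions like those described above is time-to-collision (TTC): distance to an object divided by the closing speed. The thresholds below are illustrative assumptions, not Tesla's calibrated values, and a real planner considers far more context.

```python
def time_to_collision(distance_m: float, closing_speed_mps: float) -> float:
    """Seconds until impact if nothing changes; infinite when the gap is opening."""
    if closing_speed_mps <= 0.0:
        return float("inf")
    return distance_m / closing_speed_mps

def braking_decision(distance_m: float, closing_speed_mps: float) -> str:
    """Map TTC to a graded response, mimicking an attentive human driver."""
    ttc = time_to_collision(distance_m, closing_speed_mps)
    if ttc < 1.5:
        return "emergency_brake"
    if ttc < 4.0:
        return "decelerate"
    return "maintain"
```

Grading the response by TTC is what makes autonomous braking feel smooth: early, gentle deceleration for distant hazards, with hard braking reserved for genuinely imminent collisions.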
OTA
In 2020, Tesla provided updates for its Autopilot self-driving system, improving its capabilities and resolving problems.81 Tesla’s software integration allows it to introduce updates that affect various aspects of the vehicle, including multimedia, performance, safety, and even new features like in-car gaming and streaming video.82
Continuous Improvement and Fleet Learning
As Tesla Vision collects data, the neural networks are continually refined and updated, which Tesla deploys to its vehicles through OTA software updates. This process results in a progressively more capable and robust autonomous driving system.
Tesla Vision's reliance on cameras has been a subject of debate in the autonomous vehicle industry. Cameras can be affected by environmental factors such as lighting conditions, weather, and obstructions. However, Tesla asserts that the adaptability and advancement of its neural networks can overcome these challenges, and the continuous learning loop allows the system to adapt to new situations that it may not have encountered before.
Cruise
Cruise started developing on-demand autonomous driving technology in 2013 and was acquired by General Motors (GM) in 2016. The partnership combines GM's resources as a global automotive leader with Cruise's proficiency in advanced software algorithms, sensor fusion, and machine learning.
GM's latest Ultra Cruise driver-assist system is expected to debut in the ultra-luxury Cadillac Celestiq in 2024. The system includes LiDAR among several other sensor technologies, enabling hands-free driving across 95% of driving maneuvers.83
Photo of a Cruise car parked on the street. Image credit: Cruise.
Camera Array and Coverage
Cruise's camera technology forms part of a complex sensor system, including LiDAR, RADAR, and GPS, which collectively provide a comprehensive perception of the vehicle's environment. The bespoke Sensor Placement Tool ensures optimal sensor placement on the Cruise Origin, providing 360-degree coverage for detecting other road users and obstacles. The hardware-accurate CAD-based model allows for precise sensor positioning, avoiding potential occlusions and enabling the testing of various camera configurations for an unobstructed field of view.
LiDAR
Cruise's autonomous vehicles are equipped with LiDAR sensors that contribute to the 360-degree overlapping sensor coverage, which is vital for the safe maneuvering of the Cruise Origin. The LiDAR system's point cloud data, which captures the distribution and intensity of the light reflections, is processed through advanced algorithms, allowing Cruise vehicles to identify objects and their movements with centimeter-level precision.
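Turning a LiDAR point cloud into distinct objects typically starts with Euclidean clustering: returns that are chained within a small radius of one another are grouped into one object. The sketch below shows the idea with a simple O(n²) flood fill; production stacks use spatial indices (k-d trees, voxel grids) and richer features, so treat this as a conceptual illustration, not Cruise's algorithm.

```python
import math

def euclidean_clusters(points: list[tuple[float, float, float]],
                       radius: float) -> list[list[int]]:
    """Group returns whose pairwise chains stay within `radius` meters."""
    unvisited = set(range(len(points)))
    clusters = []
    while unvisited:
        seed = unvisited.pop()
        cluster, frontier = [seed], [seed]
        while frontier:
            i = frontier.pop()
            near = [j for j in unvisited
                    if math.dist(points[i], points[j]) <= radius]
            for j in near:
                unvisited.discard(j)
            cluster.extend(near)
            frontier.extend(near)
        clusters.append(sorted(cluster))
    return clusters

# Two well-separated objects ahead of the vehicle (meters, vehicle frame).
cloud = [(10.0, 0.0, 0.5), (10.1, 0.1, 0.5), (10.2, 0.0, 0.6),
         (25.0, -3.0, 0.4), (25.1, -3.1, 0.5)]
objects = euclidean_clusters(cloud, radius=0.5)
```

Each resulting cluster becomes a candidate object whose centroid and extent feed the tracker, which is where the centimeter-level precision of the raw returns pays off.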
This technology is also pivotal in Cruise's redundancy and safety mechanisms, ensuring that the vehicle can continue to operate safely even in the unlikely event of a sensor failure. Additionally, the simulations used to accelerate sensor development at Cruise include evaluating the LiDAR's range and field of view, ensuring optimal sensor placement and calibration for reliable navigation in various driving conditions.
RADAR
RADAR sensors are placed on the Cruise Origin to ensure comprehensive coverage and to complement the data gathered by LiDAR and cameras. These sensors detect the distance, speed, and angle of objects around the vehicle, contributing to a robust 360-degree understanding of the environment. RADAR sensors allow Cruise vehicles to maintain a constant awareness of nearby objects, right down to the centimeter, which is essential for navigating complex urban environments safely.
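The distance and speed measurements described above come directly from the physics of FMCW radar: the beat frequency between transmitted and received chirps encodes range, and the Doppler shift encodes radial velocity. The sketch below evaluates those standard formulas with magnitudes typical of 77 GHz automotive radar; the specific parameter values are illustrative, not Cruise's hardware specs.

```python
C = 299_792_458.0  # speed of light, m/s

def fmcw_range(beat_hz: float, chirp_s: float, bandwidth_hz: float) -> float:
    """Range from the beat frequency of a linear FMCW chirp: R = c*f_b*T/(2B)."""
    return C * beat_hz * chirp_s / (2.0 * bandwidth_hz)

def doppler_velocity(doppler_hz: float, carrier_hz: float) -> float:
    """Radial velocity from the Doppler shift: v = f_d*c/(2*f_c)."""
    return doppler_hz * C / (2.0 * carrier_hz)

# 77 GHz carrier, 300 MHz chirp bandwidth, 50 µs chirp duration (typical).
r = fmcw_range(beat_hz=2.0e6, chirp_s=50e-6, bandwidth_hz=300e6)
v = doppler_velocity(doppler_hz=5_000.0, carrier_hz=77e9)
```

A 2 MHz beat corresponds to a target roughly 50 meters out, and a 5 kHz Doppler shift to a closing speed near 10 m/s — measurements the radar delivers regardless of rain, fog, or darkness.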
Moreover, Cruise's RADAR technology also includes enhanced night vision capabilities, ensuring clear detection around the clock. The company's computing platforms process the RADAR data alongside inputs from other sensors, facilitating instantaneous and informed decision-making crucial for autonomous driving.
View with LiDAR data of a Cruise AV. Image credit: Cruise.
Computing Platform
Cruise's autonomous vehicle technology relies heavily on its advanced computing platforms, which form the backbone of its operational capabilities. These platforms are designed to handle the enormous amount of data generated by the vehicle's sensors, including LiDAR and RADAR. These computing systems use GPUs and custom-designed chips to ensure that data is processed in real-time, enabling the vehicle to make swift and accurate decisions on the road. The robustness of these computing solutions is crucial for the continuous and intensive demands of autonomous driving, where data processing speed and reliability are non-negotiable for safety and efficiency.
Artificial Intelligence
Cruise uses AI algorithms for object detection, lane detection, obstacle avoidance, and route planning. AI also underpins Cruise's Continuous Learning Machine, which automates data collection, labeling, model training, and deployment, ensuring that the vehicle's driving systems improve over time and handle the unpredictability of real-world driving scenarios with greater accuracy and safety.
The GM subsidiary has also integrated NLP since 2018 in order to enable passengers to communicate with their AVs using voice commands. Expanding on this, in 2021, Cruise introduced "natural language understanding," a feature empowering passengers to pose more intricate queries to the vehicle.84
Cruise employs a fusion of handcrafted attributes and deep learning to enrich its feature extraction process from raw sensor data. Their human-designed features are shaped by expert insights into the environment and driving dynamics. These include the car's position, speed, and proximity to other objects. They have introduced a sophisticated DL framework, which adeptly extracts features from images and RADAR data. The framework learns to recognize pivotal driving-related attributes, encompassing object shapes, distances, and velocities. Cruise also adopts multimodal data fusion to strengthen its approach further, merging information from diverse sensors.
A core challenge identified was accurately predicting the intentions of pedestrians and vehicles for making informed decisions. Two strategies are auto-labeling prediction data using the vehicle's perception system and automated error identification through active learning. These concepts are integral to Cruise's Continuous Learning Machine (CLM), which automates data collection, labeling, model training, and deployment. Despite the rarity of certain scenarios, CLM progressively improves predictions through continuous learning. The CLM approach minimizes human intervention and scales to handle even the most intricate longtail problems.
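The active-learning step in a loop like the CLM can be reduced to a simple idea: predictions the model is least confident about are exactly the ones worth labeling and retraining on. The sketch below shows that selection step; the threshold, field names, and example records are illustrative assumptions, not Cruise's internals.

```python
def select_for_labeling(predictions: list[dict],
                        threshold: float = 0.6) -> list[dict]:
    """Route low-confidence predictions to (auto-)labeling: the model's
    weakest cases drive the next round of training data."""
    return [p for p in predictions if p["confidence"] < threshold]

# One batch of fleet predictions (illustrative records).
fleet_predictions = [
    {"id": "a", "label": "pedestrian", "confidence": 0.97},
    {"id": "b", "label": "cyclist", "confidence": 0.41},   # rare pose
    {"id": "c", "label": "vehicle", "confidence": 0.88},
    {"id": "d", "label": "debris", "confidence": 0.35},    # long-tail class
]
to_label = select_for_labeling(fleet_predictions)
```

Because only the uncertain minority is sent for labeling, the loop concentrates scarce labeling effort on long-tail scenarios, which is what lets it scale with minimal human intervention.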
Cruise’s continuous learning machine loop. Image credit: Cruise
Blockchain
Cruise employs blockchain to secure vehicle data while also facilitating personalized customer experiences. The company has submitted a patent application for a "Decentralized Distributed Map Using Blockchain".85 The patent aims to address the challenge of maintaining dynamic vehicle mapping information without incurring high costs. Their solution involves sensors that assess the vehicle's surroundings and a discrepancy detector that identifies variations compared to a known navigation map. These differences are transmitted to a blockchain map network, leveraging the blockchain's ability to maintain an updated and reliable record.
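The core property the patent relies on, an append-only record in which tampering with any earlier entry invalidates everything after it, can be sketched with a simple hash chain. The block structure and field names below are illustrative assumptions; the patent describes a distributed network, not this toy single-node chain.

```python
import hashlib
import json

def record_discrepancy(chain: list[dict], observation: dict) -> dict:
    """Append a map-discrepancy report linked to the previous block's hash,
    so altering any earlier report breaks every later hash link."""
    prev_hash = chain[-1]["hash"] if chain else "0" * 64
    body = json.dumps({"prev": prev_hash, "obs": observation}, sort_keys=True)
    block = {"prev": prev_hash, "obs": observation,
             "hash": hashlib.sha256(body.encode()).hexdigest()}
    chain.append(block)
    return block

chain: list[dict] = []
record_discrepancy(chain, {"lat": 37.77, "lon": -122.42, "change": "lane closed"})
record_discrepancy(chain, {"lat": 37.78, "lon": -122.41, "change": "new stop sign"})

# Verification: every block must reference the hash of its predecessor.
valid = all(chain[i]["prev"] == chain[i - 1]["hash"] for i in range(1, len(chain)))
```

Any vehicle can replay this verification independently, which is what makes a shared, frequently updated map trustworthy without a central authority re-certifying every change.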
Vehicle Applications
The Cruise Origin, Cruise's latest venture, is a fully autonomous vehicle with no mirrors, pedals, or steering wheel. Its modular design enables it to be upgraded with new sensors or computers without replacing the entire fleet. With room for four to five people, the Origin resembles a small autonomous shuttle and can also be used for deliveries.86,87,89
Illustration of the Cruise Origin. Image credit: Cruise
Volvo
In early January 2022, at the CES consumer electronics show, Volvo unveiled its Level 3 autonomous driving system, Ride Pilot. The name conveys its promise: while the car drives autonomously, Volvo Cars assumes responsibility for the driving, giving the driver comfort and peace of mind.90
The Ride Pilot system will enable fully autonomous, hands-free driving on specific roads under particular traffic conditions. This involves utilizing OTA software updates in tandem with a cutting-edge sensor configuration. The software is a collaborative effort between autonomous driving software company Zenseact, Volvo Cars' in-house developer team, and engineers from Luminar, one of its technology partners.
Volvo's ride pilot hands free system. Image credit: Volvo.
Sensing
Ride Pilot will first be available on the new EX90 SUV. This electric SUV will carry all the hardware needed to operate Ride Pilot, including five RADAR sensors, eight cameras, 16 ultrasonic sensors, a LiDAR unit, and the requisite controlling software. The sensors will be factory-installed, and once Volvo completes its testing and obtains regulatory approvals, it will likely enable the Ride Pilot feature through an over-the-air update.91 The LiDAR sensor will be integrated into the car's roofline, while the other cameras and sensors will be strategically positioned around the rest of the EX90. Together, these sensors can scan the road ahead and identify pedestrians up to 250 meters away, and even small, dark objects, like a tire on a black road, 120 meters ahead. This technology aims to help drivers avoid road hazards or halt the car when necessary, and the company asserts it could reduce accidents causing injuries or fatalities by up to 20%.92
Illustration of a Volvo EX90 detecting obstacles on the street through its Ride Pilot system. Image credit: Volvo.
OTA Updates
In 2022, Volvo's Version 1.7 OTA update delivered bug fixes, multimedia system improvements, and Sirius XM radio updates; for electric vehicles, it also aimed to improve range by modifying the drive system, showcasing the promising potential of vehicle software updates.93
Blockchain
When it comes to bolstering the safety and quality of its products throughout the entire supply chain, Volvo utilizes blockchain. This strategy minimizes recall risks and ensures component adherence to rigorous standards. Volvo Cars introduced global cobalt traceability in its batteries through blockchain technology, making it the first automaker to do so. Blockchain improves supply chain transparency by securely recording material origin and characteristics, making alterations impossible to hide. Volvo has partnered with CATL and LG Chem, as well as blockchain companies like Circulor, Oracle, RSBN, RCS Global, and IBM, to implement traceability across battery supply chains, promoting transparency, trust, and ethical practices.
Report Summary
In the ever-evolving landscape of the automotive industry, the last three years have witnessed remarkable technological strides. These advances encompass various aspects, starting with sensing technology. Recent developments in sensing technology, such as high-resolution cameras, wider field-of-view cameras, and improved image processing algorithms, have significantly enhanced the capabilities of autonomous vehicles. Furthermore, the infusion of AI into vision systems, utilizing deep learning models like convolutional neural networks (CNNs), has revolutionized the way AVs perceive and understand their surroundings, further improving safety and object detection.
LiDAR technology has also undergone noteworthy transformations. Traditional LiDAR systems with moving parts have given way to more compact and precise solid-state and advanced sensors. Emerging Frequency Modulated Continuous Wave (FMCW) LiDAR, in particular, has enabled real-time distance and velocity measurements, elevating AV perception. The integration of hybrid LiDAR systems has bolstered object detection and distance assessment, emphasizing the significance of sensor fusion for robust autonomous driving systems.
RADAR sensors, too, have witnessed substantial enhancements, functioning effectively in diverse weather conditions and providing comprehensive 360° coverage around vehicles. Multi-mode RADAR sensors' ability to switch between detection ranges has boosted adaptability, while innovations like Digital Beamforming RADAR have improved object tracking. The shift towards solid-state RADAR, with its elimination of moving parts, has not only increased reliability but also enhanced energy efficiency. Meanwhile, technologies such as Frequency Modulated Continuous Wave (FMCW) RADAR and 4D RADAR have introduced precise distance and velocity measurements, while Synthetic Aperture RADAR (SAR) has contributed high-resolution imaging for advanced object recognition and perception.
AI and computing have played a central role in AV advancements, particularly with the adoption of Deep Reinforcement Learning and Generative Adversarial Networks (GANs). These technologies support dynamic AV scenarios and facilitate realistic visual generation for object recognition. Robust AI algorithms have become indispensable for AVs to navigate diverse and challenging conditions while maintaining resilience in the face of disturbances and uncertainty.
Furthermore, AI integration with Natural Language Processing (NLP) is transforming vehicle interactions, making them more intuitive and efficient. To ensure seamless operation, cloud-based AI with high-speed connectivity addresses in-vehicle limitations. Edge computing, on the other hand, reduces latency, enhances sensor data handling, and offers reliability and data privacy, optimizing AV operations and traffic management.
Communication technologies are evolving as well, with 5G revolutionizing AV connectivity through faster data speeds, improved safety, and cost reduction. AVs are now equipped to process information swiftly, enhance connectivity, make informed decisions, and prioritize safety communication even in crowded networks. Over-the-Air (OTA) updates have become a critical advancement, allowing remote software enhancements for AVs without the need for recalls. However, this convenience also brings challenges such as data security, reliability, data capacity, integration, and regulatory compliance.
Enhancing AV security is paramount, and blockchain technology has emerged as a robust solution, offering secure storage and sharing of AV data, ensuring integrity and transparency without central control. Moreover, blockchain is revolutionizing insurance processes, simplifying shipping logistics, and expediting self-driving vehicle development through collaborative data exchange.
To further bolster AV security, Intrusion Detection and Prevention Systems (IDPS) have integrated machine learning and AI to enhance detection accuracy and adaptability to evolving threats. These systems utilize sophisticated anomaly detection techniques to identify novel attacks, operate in real-time by analyzing sensor data, and may rely on shared threat intelligence databases for collective threat response based on shared experiences among vehicles.
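The anomaly-detection idea behind such an IDPS can be reduced to a statistical baseline check: learn the normal rate of each message type, then flag observations that deviate by many standard deviations. The sketch below uses a z-score on CAN message rates; the threshold and data are illustrative assumptions, and real IDPS products use far richer models.

```python
import statistics

def is_anomalous(baseline_rates: list[float],
                 observed_rate: float,
                 z_threshold: float = 3.0) -> bool:
    """Flag a message rate that deviates from the learned baseline by more
    than `z_threshold` standard deviations (simple anomaly detection)."""
    mean = statistics.fmean(baseline_rates)
    stdev = statistics.stdev(baseline_rates)
    return abs(observed_rate - mean) / stdev > z_threshold

# Messages/second for one CAN ID during normal driving (illustrative).
baseline = [100.1, 99.8, 100.3, 100.0, 99.9, 100.2, 100.1, 99.7]
```

A flooding attack that injects extra frames shows up as a rate far outside the baseline distribution, while ordinary measurement jitter stays below the threshold, keeping false alarms rare.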
In summary, these interconnected advancements in sensing technology, AI, computing, communication, and cybersecurity are reshaping the landscape of autonomous vehicles, offering safer, smarter, and more sustainable mobility solutions. However, despite the remarkable progress in the engineering of autonomous vehicles, several challenges remain on the horizon. Ensuring robust and reliable cybersecurity measures to protect AVs from cyberattacks is still a significant concern. Additionally, refining the ability of AVs to navigate complex and unpredictable urban environments, handle adverse weather conditions, and effectively communicate with both other vehicles and pedestrians remains a challenge. Balancing AI decision-making with human intervention in critical scenarios presents a persistent ethical and technical challenge.
Leadership Interviews
Interview with Mouser
Interview with Murata
Interview with MacroFab
Interview with Nexperia
Interview with SAE International
Interview with Autoware Foundation
Interview with NVIDIA
References
1. SceneScan Pro User Story: 3D Stereo Vision Cameras for Autonomous Racing Cars. https://nerian.com/news/scenescan-pro-user-story-3d-stereo-vision-cameras-for-autonomous-racing-cars/ (2022).
2. Foresight. 3D Stereo Vision for the Autonomous Vehicle Industry. Foresight https://www.foresightauto.com/3d-stereo-vision-for-the-autonomous-vehicle-industry/ (2023).
3. Bhadoriya, A. S., Vegamoor, V. & Rathinam, S. Vehicle Detection and Tracking Using Thermal Cameras in Adverse Visibility Conditions. Sensors 22, (2022).
4. Computer vision challenges in autonomous vehicles: The future of AI. https://www.superannotate.com/blog/computer-vision-in-autonomous-vehicles.
5. O’Neill, M. A Visionary Leap: Enhancing Computer Vision for Autonomous Vehicles and Cyborgs. SciTechDaily https://scitechdaily.com/a-visionary-leap-enhancing-computer-vision-for-autonomous-vehicles-and-cyborgs/ (2023).
6. AI-Enhanced night vision: A breakthrough for autonomous cars. IO https://innovationorigins.com/en/ai-enhanced-night-vision-a-breakthrough-for-autonomous-cars/ (2023).
7. Cvijetic, N. Panoptic Segmentation Helps Autonomous Vehicles See Outside the Box. NVIDIA Blog https://blogs.nvidia.com/blog/2019/10/23/drive-labs-panoptic-segmentation/ (2019).
8. Bridging to the Autonomous Future with Mobileye SuperVisionTM. Mobileye https://www.mobileye.com/blog/mobileye-supervision-bridge-to-consumer-autonomous-vehicles/.
9. LIDAR Sensor Integration in Autonomous Vehicles Platforms. https://www.mrlcg.com https://www.mrlcg.com/latest-media/lidar-sensor-integration-in-autonomous-vehicles-platforms-300849/ (2023).
10. Innovations of Autonomous Vehicles: Solid-State LiDAR Sensors. EqualOcean https://equalocean.com/analysis/2022031917148.
11. Home. Aeva https://www.aeva.com/ (2023).
12. Rahn, S. Technology. Blickfeld https://www.blickfeld.com/technology/ (2021).
13. Velodyne Lidar lanza un centro de diseño en Bangalore. https://www.businesswire.com/news/home/20210624006043/es/ (2021).
14. Innoviz Technologies. Innoviz Technologies to Present at Upcoming Investor Conferences. The Victoria Advocate https://www.victoriaadvocate.com/innoviz-technologies-to-present-at-upcoming-investor-conferences/article_69c829b2-d474-52b0-825b-7f3591907377.html (2023).
15. Luminar. Luminar (Nasdaq: LAZR). Luminar https://www.luminartech.com/.
16. Home. Aeva https://www.aeva.com/ (2023).
17. Quanergy filing for bankruptcy, going up for sale. Security Info Watch https://www.securityinfowatch.com/perimeter-security/physical-hardening/perimeter-security-sensors/press-release/21290282/quanergy-quanergy-filing-for-bankruptcy-going-up-for-sale (2022).
18. Quanergy Announces Industry-first 3D LiDAR Movement-Based False Alarm Reduction Solution. Quanergy Solutions, Inc. | LiDAR Sensors and Smart Perception Solutions https://quanergy.com/pressrel/quanergy-announces-industry-first-3d-lidar-movement-based-false-alarm-reduction-solution/.
19. [No title]. https://static.mobileye.com/website/corporate/media/radar-lidar-fact-sheet.pdf.
20. Nellis, S. Mobileye looks to build its own lidar to drive down self-driving costs. Reuters (2020).
21. High Resolution 3D Flash LiDARTM. Continental Automotive http://www.continental-automotive.com/en-gl/Passenger-Cars/Autonomous-Mobility/Enablers/Lidars/3D-Flash-Lidar.
22. Continental Aftermarket. http://www.continental-aftermarket.com/us-en/press/press-releases/2021/2021-11-08-continental-releases-hfl110-3d-flash-lidar-to-series-production.
23. Carman, A. Blickfeld Unveils First Smart LiDAR Sensor with Built-in Software Stack. All About Circuits https://www.allaboutcircuits.com/news/blickfeld-unveils-first-smart-lidar-sensor-with-built-in-software-stack/ (2022).
24. Blickfeld. Blickfeld launches Qb2; the world’s first Smart LiDAR sensor, enabling capturing and processing of 3D. Blickfeld https://www.blickfeld.com/blickfeld-launches-qb2/ (2022).
25. Difference between solid state radar and magnetron. https://lidarradar.com/info/difference-between-solid-state-radar-and-magnetron.
26. Hsu, W.-T. & Lin, S.-L. Using FMCW in Autonomous Cars to Accurately Estimate the Distance of the Preceding Vehicle. Int. J. Automot. Technol. Manage. 23, 1755–1762 (2023).
27. Global, B. Bosch Research Blog. Bosch Global https://www.bosch.com/stories/synthetic-aperture-radar/ (2022).
28. Manzoni, M. et al. Automotive SAR imaging: potentials, challenges, and performances. International Journal of Microwave and Wireless Technologies 1–10 (2023).
29. Synthetic Aperture Radar: Towards Automotive Applications. ResearchGate https://www.researchgate.net/publication/334633392_Synthetic_Aperture_Radar_Towards_Automotive_Applications.
30. [No title]. https://arxiv.org/pdf/2306.09784.pdf.
31. Sensors for safe autonomous driving. Omdia https://omdia.tech.informa.com/blogs/2023/sensors-for-safe-autonomous-driving (2023).
32. IEEE Xplore Full-Text PDF: https://ieeexplore.ieee.org/stamp/stamp.jsp?arnumber=9528970.
33. Cvijetic, N. How AI Improves Radar Perception for Autonomous Vehicles. NVIDIA Blog https://blogs.nvidia.com/blog/2021/04/28/drive-labs-ai-radar-perception-autonomous-vehicles/ (2021).
34. Autonomous Vehicles. Navtech Radar https://navtechradar.com/explore/autonomous-vehicles/ (2021).
35. The Importance of Imaging Radar. http://www.nxp.com/company/blog/the-importance-of-imaging-radar:BL-THE-IMPORTANCE-OF-IMAGING-RADAR.
36. eVehicle Technology. New 4D Imaging Radar Sensor to Revolutionise Automotive Safety. eVehicle Technology https://www.evehicletechnology.com/news/4d-imaging-radar-sensor-to-revolutionise-automotive-safety/ (2020).
37. Vayyar. Vayyar Becomes First And Only Company In World To Offer Full-cabin Monitoring With One Radar-on-chip. PR Newswire https://www.prnewswire.com/news-releases/vayyar-becomes-first-and-only-company-in-world-to-offer-full-cabin-monitoring-with-one-radar-on-chip-301251217.html (2021).
38. Maanpää, J. et al. Multimodal End-to-End Learning for Autonomous Steering in Adverse Road and Weather Conditions. https://ieeexplore.ieee.org/document/9413109.
39. Multi-modality 3D object detection in autonomous driving: A review. Neurocomputing 553, 126587 (2023).
40. [No title]. https://arxiv.org/pdf/2002.00444.pdf.
41. Ravi Kiran, B. et al. Deep Reinforcement Learning for Autonomous Driving: A Survey. https://ieeexplore.ieee.org/document/9351818.
42. Wiggers, K. Uber claims its AI enables driverless cars to predict traffic movement with high accuracy. VentureBeat https://venturebeat.com/ai/uber-claims-its-ai-enables-driverless-cars-to-predict-traffic-movement-with-high-accuracy/ (2020).
43. Edge AI Computing Advancements Driving Autonomous Vehicle Potential. Global Semiconductor Alliance https://www.gsaglobal.org/forums/edge-ai-computing-advancements-driving-autonomous-vehicle-potential/ (2021).
44. Snapdragon Ride SDK: a premium platform for developing customizable ADAS applications. https://www.qualcomm.com/news/onq/2022/01/snapdragon-ride-sdk-premium-solution-developing-customizable-adas-and-autonomous.
45. Edge AI Computing for Autonomous Driving System - Lanner Electronics. https://www.lannerinc.com/applications/transportation/edge-ai-computing-for-autonomous-driving-system.
46. AI Accelerator Chips Overview and Comparison. HardwareBee https://hardwarebee.com/ai-accelerator-chips-overview-and-comparison/ (2023).
47. Lardinois, F. Google Cloud announces the 5th generation of its custom TPUs. TechCrunch https://techcrunch.com/2023/08/29/google-cloud-announces-the-5th-generation-of-its-custom-tpus/ (2023).
48. Francisco, A. Intel Brings New Chips, Autonomous Driving, and Artificial Intelligence to CES 2020. Showmetech https://www.showmetech.com.br/en/intel-brings-chips-and-news-to-ces-2020/ (2020).
49. Hajela, R. Looking at 11th Generation Intel® Processor Performance on Intel® DevCloud for Edge Workloads. OpenVINO-toolkit https://medium.com/openvino-toolkit/looking-at-11th-generation-intel-processor-performance-on-intel-devcloud-for-the-edge-213a413fb5e1 (2022).
50. Riley, D. Intel makes a splash at CES with AI, autonomous driving tech and Tiger Lake chips. SiliconANGLE https://siliconangle.com/2020/01/06/intel-makes-splash-ces-autonomous-driving-tech-ai-tiger-lake/ (2020).
51. Self-Driving Cars Technology & Solutions from NVIDIA Automotive. NVIDIA https://www.nvidia.com/en-us/self-driving-cars/.
52. Volvo Cars, Zoox, SAIC and More Join Growing Range of Autonomous Vehicle Makers Using New NVIDIA DRIVE Solutions. NVIDIA Newsroom http://nvidianews.nvidia.com/news/volvo-cars-zoox-saic-and-more-join-growing-range-of-autonomous-vehicle-makers-using-new-nvidia-drive-solutions.
53. Korosec, K. Qualcomm unveils its Snapdragon Ride platform for all levels of automated driving. TechCrunch https://techcrunch.com/2020/01/06/qualcomm-unveils-its-snapdragon-ride-platform-for-all-levels-of-automated-driving/ (2020).
54. VW to use Qualcomm's Snapdragon Ride platform in autonomous driving push. Automotive News Europe https://europe.autonews.com/automakers/vw-use-qualcomms-snapdragon-ride-platform-autonomous-driving-push.
55. Automotive RTOS. https://blackberry.qnx.com/en/ultimate-guides/software-defined-vehicle/automotive-rtos.
56. Saxena, A. Understanding AUTOSAR and its Applications in the Automotive Industry. https://www.einfochips.com/blog/autosar-in-automotive-industry/ (2020).
57. Apex.AI. Apex.AI https://www.apex.ai/blog.
58. Cognata. Cognata Autonomous and ADAS Simulation https://www.cognata.com/ (2022).
59. Ford China deploys C-V2X service in Changchun, Jilin Province. https://autonews.gasgoo.com/m/70024138.html.
60. Qualcomm Introduces Car-to-Cloud Service for Over-the-Air Vehicle Updates and On-Demand Services & Features. https://www.qualcomm.com/news/releases/2020/01/qualcomm-introduces-car-cloud-service-over-air-vehicle-updates-and-demand.
61. Vehicle Telematics, Geofencing & Fleet Tracking Systems for the Connected Car. (2006).
62. Cisco 5G Network Architecture. (2020).
63. Infoshare Systems, Inc. How cyber security threats are impacting the automotive industry. https://www.linkedin.com/pulse/how-cyber-security-threats-impacting-automotive-industry-1f (2023).
64. Website. https://www.sciencedirect.com/science/article/pii/S209580991930503X.
65. [No title]. https://iopscience.iop.org/article/10.1088/1742-6596/1694/1/012024/pdf.
66. ESCRYPT Intrusion detection and prevention solution. https://www.etas.com/en/products/intrusion-detection-and-prevention-solution.php.
67. Transforming security from a limitation to a business value multiplier. C2A Security https://c2a-sec.com/ (2022).
68. C2A. C2A Security Partners with Valeo to Enhance Cybersecurity for the Software Defined Vehicle. PR Newswire https://www.prnewswire.com/news-releases/c2a-security-partners-with-valeo-to-enhance-cybersecurity-for-the-software-defined-vehicle-301713633.html (2023).
69. Dave, P. Dashcam Footage Shows Driverless Cars Clogging San Francisco. WIRED https://www.wired.com/story/dashcam-footage-shows-driverless-cars-cruise-waymo-clogging-san-francisco/ (2023).
70. Waymo Driver. Waymo https://waymo.com/waymo-driver/.
71. How Autonomous Vehicles Work. https://ltad.com/about/how-autonomous-vehicles-work.html.
72. Merchant, B. Column: Self-driving Waymo cars might worsen L.A. traffic. Los Angeles Times (2023).
73. Waymo Driver. Waymo https://waymo.com/waymo-driver/.
74. Rangaiah, M. How Waymo is using AI for autonomous driving? https://www.analyticssteps.com/blogs/how-waymo-using-ai-autonomous-driving.
75. Autonomous vehicle safety guidelines. Uber Blog https://www.uber.com/blog/autonomous-vehicle-safety-guidelines/.
76. Reuters. Uber partners with Alphabet’s Waymo to offer driverless rides. Reuters (2023).
77. Jennewine, T. 6 Ways Tesla’s Autonomous Driving Technology Is Evolving. The Motley Fool https://www.fool.com/investing/2022/01/27/6-ways-teslas-autonomous-driving-technology-is-evo/ (2022).
78. Tesla Vision Update: Replacing Ultrasonic Sensors with Tesla Vision. Tesla https://www.tesla.com/support/transitioning-tesla-vision.
79. Moon, S. How Tesla made autonomous possible without LIDAR. Datahunt - Quality Data with Quality AI https://www.thedatahunt.com/en-insight/how-tesla-autonomous-driving-possible-without-lidar (2023).
80. Autopilot. https://www.tesla.com/autopilot.
81. Fox, E. Tesla 2020.4.11 Software OTA Update To Boost Ranges Of Model S/X. TESMANIAN https://www.tesmanian.com/blogs/tesmanian-blog/tesla-2020-4-11-software-ota-update-to-boost-ranges-of-model-s-x (2020).
82. Cardello, S. What Are Over-the-Air Updates For Cars? MUO https://www.makeuseof.com/what-are-over-the-air-updates-for-cars/ (2022).
83. Hawkins, A. J. Here’s our first full look at the Cadillac Celestiq ultra-luxury electric sedan. The Verge https://www.theverge.com/23273937/gm-cadillac-celestiq-ev-sedan-photos-specs-reveal (2022).
84. Your Volvo car will understand you. Volvo Cars https://www.volvocars.com/intl/news/technology/Your-Volvo-car-will-understand-you/.
85. Pollock, D. General Motors Applies For Decentralized Blockchain Map Patent. Forbes https://www.forbes.com/sites/darrynpollock/2020/04/03/general-motors-applies-for-decentralized-blockchain-map-patent/ (2020).
86. Schoppa, C. Top Autonomous Vehicles Companies to Watch in 2023. AI Time Journal - Artificial Intelligence, Automation, Work and Business https://aitimejournal.com/autonomous-vehicles-companies-to-watch/ (2022).
87. Ammann, D. The Cruise Origin Story - Cruise - Medium. Cruise https://medium.com/cruise/the-cruise-origin-story-b6e9ad4b47e5 (2020).
88. Vara, V. Leading vehicle manufacturing companies in the autonomous vehicles theme. Just Auto http://www.just-auto.com/data-insights/top-ranked-vehicle-manufacturing-companies-in-autonomous-vehicles/ (2022).
89. Autonomous Vehicle Technology. https://getcruise.com/technology/.
90. Pavlik, T. When Volvo will have its first autonomous vehicle might surprise you. MotorBiscuit https://www.motorbiscuit.com/volvo-first-autonomous-vehicle-might-surprise/ (2022).
91. Dyer, N. Volvo’s Ride Pilot autonomous driving tech allows unsupervised driving. CARHP https://www.carhp.com/news/volvo-s-ride-pilot-autonomous-driving-tech-allows-unsupervised-driving (2023).
92. Rodríguez, J., Jr. Volvo EX90 comes with self-driving hardware, accessible by a subscription. Jalopnik https://jalopnik.com/volvo-ex90-subscription-based-self-driving-tech-1850197959 (2023).
93. Kirby, C. & Flear, K. Volvo’s First OTA Update: More Range for Some Models. https://getjerry.com/insights/volvos-first-ota-update-range-models (2022).