The following table provides definitions for terms commonly used in the ADAS/AD industry.
| Term | Definition |
|------|------------|
AD | Autonomous Driving |
ADAS | Advanced Driver Assistance Systems |
Controller Area Network (CAN) | A CAN bus is a standardized communication network that facilitates communication and data exchange between various electronic control units (ECUs) or modules in a vehicle. |
Autonomy 2.0 (AV2.0) | AV2.0 represents the next generation of autonomous vehicle technology. This approach uses advanced artificial intelligence (AI), particularly generative AI, to create more sophisticated, end-to-end AI models that manage various aspects of the vehicle's operations, including perception, planning, and control. |
Closed loop simulation | Closed loop simulation is a type of simulation environment in which the controller (the ECU in this case) interacts with a dynamic virtual world and provides control feedback (acceleration, braking, steering, shifting) based on the ECU’s sensor inputs. |
Data lake | A data lake is a centralized repository that allows organizations to store vast amounts of structured and unstructured data at scale. Unlike traditional databases or data warehouses that require data to be structured before it can be stored and analyzed, a data lake can store data in its raw, native format. It includes data from various sources such as databases, logs, sensors, social media, and more. |
Data logger | A data logger in the ADAS/AD market is a specialized device used to capture, store, and analyze a wide range of data related to the operation and performance of autonomous and semiautonomous vehicles. This data is instrumental in the development, testing, validation, and improvement of ADAS/AD systems, ensuring their safety, reliability, and effectiveness. |
Digital twin | A digital twin refers to a virtual, computer-generated replica or simulation of an autonomous vehicle, its components, and its surrounding environment. This digital representation is created to represent the real-world vehicle (or a physical asset or physical process) and its behavior accurately. Digital twins are used in the development, testing, and validation of ADAS and autonomous vehicle systems for predictive analysis and other use cases. |
Dell Validated Design (DVD) | DVDs are valuable resources that help reduce the complexity and risks associated with deploying complex IT solutions. By following the validated designs and best practices, organizations can be confident that their IT infrastructure is optimized for the intended workload and potential compatibility issues have been addressed during the design and testing phases. This process leads to faster deployments, reduced troubleshooting efforts, and improved overall system performance and reliability. See Generative AI in the Enterprise with AMD Accelerators. |
Electronic Control Unit (ECU) or Sensor Fusion ECU or virtual-ECU (vECU) | A Sensor Fusion ECU or vECU is a specialized component in an automotive or robotics system responsible for processing data from multiple sensors and integrating it into a unified and coherent understanding of the surrounding environment (sensor fusion). The ECU plays a critical role in enhancing perception and decision-making capabilities in autonomous vehicles. A vECU is a software representation of a physical ECU. |
EGO or EGO vehicle | The EGO vehicle is the self-driving car itself: the vehicle equipped with the sensors, cameras, and other technologies that enable it to perceive the environment and make decisions. The term "ego" conveys that the vehicle is self-referential, perceiving the world primarily from its own perspective; its sensors and cameras focus on detecting objects and obstacles in its immediate surroundings, which is essential for safe navigation. The term also implies autonomy: the vehicle is a self-contained entity that can operate and make decisions without human intervention, based on its own perception of the environment. "Ego vehicle" distinguishes the self-driving car from the other vehicles and objects in its environment and highlights its ability to operate autonomously. |
Frames per Second (FPS) | FPS refers to a metric used to measure the rate at which a camera or sensor system captures and processes individual frames or images per second. It is a crucial parameter when assessing the performance and capabilities of sensors, such as cameras or LiDAR, that are used for perception and environmental sensing in autonomous vehicles. |
Generative AI | Generative AI models are trained on vast datasets and learn to capture the underlying patterns and structures of the data. They can then use this knowledge to produce content that is coherent and contextually relevant. |
Global Positioning System (GPS) | GPS is a satellite-based navigation and timing system that allows users to determine their precise geographic location, velocity, and time information anywhere. It has a wide range of applications, from everyday navigation to critical services such as emergency response and financial transactions. |
Global Navigation Satellite System (GNSS) | GNSS describes any satellite constellation that provides positioning, navigation, and timing (PNT) services on a global or regional basis. |
Ground truth data | Ground truth data refers to the authoritative or reference data that is considered the most accurate and reliable information available for a particular task or problem. It serves as a benchmark or gold standard against which other data or measurements can be compared or evaluated. Ground truth data is raw data that has been labeled, tagged, and segmented, and is believed to be the most reliable data available. |
Hardware-in-the-loop (HiL) | HiL is a testing and validation technique used to assess the performance and functionality of hardware components, particularly Sensor Fusion ECUs, which are part of an autonomous vehicle system. HiL testing simulates the behavior of various hardware components in a controlled environment to evaluate how they interact with each other and the overall system. |
Hybrid-in-the-loop (HyiL) or Accelerated SiL (AcSiL) | HyiL or AcSiL combines hardware components, such as Ambarella’s CV3 family of ECU controllers, with a traditional SiL environment to accelerate SiL testing. |
C++ Heterogeneous-Compute Interface for Portability (HIP) or HIPify | HIP is a C++ runtime API and kernel language that lets developers create portable applications for AMD and NVIDIA GPUs from a single source code base. HIPify refers to hipify-clang and hipify-perl, tools that automatically translate CUDA source code into portable HIP C++. |
Intelligent Connected Vehicle (ICV) | An ICV is a technologically advanced automobile designed to offer increased safety, connectivity, and functionality through various sensors, communication systems, and smart features. |
In-Car or In-Vehicle | In-Car or In-Vehicle typically refers to components, technologies, or systems that are integrated in the vehicle itself. It encompasses the hardware, software, sensors, and ECUs that are installed and operate in the vehicle to enable various functions related to safety, automation, infotainment, and vehicle control. |
Ingest | Ingest is the critical process of collecting, processing, and incorporating sensor data and environmental information into the autonomous vehicle's system. This data is the foundation for perceiving the surroundings, making decisions, and safely navigating the vehicle. The ingest process is essential for achieving the high levels of situational awareness required for autonomous and semiautonomous driving. |
Inertial Measurement Unit (IMU) | An IMU is an electronic device that measures and reports specific motion-related parameters and orientation of an object in three-dimensional space. IMUs can play a critical role as a backup or complementary sensor system in autonomous vehicles, particularly in situations where primary sensors like cameras, radar, and LiDAR might fail or encounter limitations. IMUs provide information about the vehicle's motion and orientation, which can help maintain control and safety in rare cases of sensor failure or challenging conditions. |
ISO 26262 | ISO 26262 is an international standard that defines functional safety requirements for the development of electrical and electronic systems in road vehicles. It is specifically focused on the automotive industry and aims to ensure the safety and reliability of electronic systems used in vehicles, particularly in the context of ADAS/AD systems and other safety-critical applications. |
Autonomous Driving SAE levels (L0 – L5) | Each Society of Automotive Engineers (SAE) level represents a step toward higher automation and reduced driver involvement. As the levels progress from Level 0 to Level 5, the vehicle assumes greater responsibility for driving tasks, while the human driver's role diminishes. However, the transition between levels, especially from Level 2 to Level 3 and beyond, presents significant technical, safety, and regulatory challenges that must be addressed before fully autonomous vehicles become a reality on the roads. |
Light Detection and Ranging (LiDAR) | LiDAR is a key technology used in autonomous driving systems to enable vehicles to sense and understand their surroundings. It emits laser pulses or beams of light and measures the time it takes for these pulses to bounce back after hitting objects in the environment. The data collected from these laser reflections is used to create detailed 3D maps and generate a precise understanding of the surrounding objects and terrain. |
Model-in-the-Loop (MiL) | MiL testing is a verification and validation technique used to assess the performance of mathematical models or simulations of various components and subsystems of autonomous vehicles. It focuses on the functional behavior of these models, allowing developers and engineers to evaluate how different components interact and make decisions in a simulated environment before integrating them into the physical vehicle or conducting real-world testing. |
Radio Detection and Ranging (RADAR) | RADAR technology is a fundamental component of the sensor suite in autonomous vehicles, providing valuable information for perception, decision-making, and control systems. It complements other sensors like cameras, LiDAR, and ultrasonic sensors to create a comprehensive understanding of the vehicle's surroundings, helping ensure safe and reliable autonomous driving. |
Replay or Re-Sim | Replay or Re-Sim, also known as open loop simulation, is the process of reproducing and simulating real-world scenarios by replaying previously recorded data into an ECU or vECU and comparing the output of the ECU with ground truth data. |
Rig (that is, HiL Rig) | A "Rig" refers to a testing setup or system that integrates real hardware components with a simulated environment. This setup is used in a data center to test and validate the performance of hardware components (such as sensors, actuators, or control units) in a controlled environment that replicates real-world conditions. |
Roadside Unit (RSU) | An RSU is a communication and data exchange device or enclosure placed along roadways to support intelligent transportation systems. It plays a vital role in enhancing road safety, managing traffic, and enabling various transportation-related applications by facilitating communication between vehicles and the infrastructure. |
SAE | SAE or SAE International is a global professional association and standards organization. Formerly known as the Society of Automotive Engineers, the organization adopted its current name in 2006 to reflect its international membership and its expanded scope beyond automotive engineering. Its activities now include aerospace and other transport industries, as well as commercial vehicles and autonomous vehicles such as self-driving cars, trucks, surface vessels, drones, and related technologies. |
Scenario | A carefully designed and defined situation or environment that is simulated, either in a physical testing environment or a virtual simulation platform, to evaluate the behavior and performance of autonomous vehicles and their components. |
Sound Navigation and Ranging (SONAR) | SONAR uses sound waves, typically ultrasonic waves, for close range object detection and distance measurement. SONAR is an important sensing technology that can complement other sensors like cameras, radar, and LiDAR to enhance the perception capabilities of autonomous vehicles. |
Software-in-the-loop (SiL) | SiL testing is the verification and validation (V&V) of AI models using a software-based ECU and sensor models from simulation. SiL testing is an important step in the development of autonomous driving technology, as it helps identify and address issues in software algorithms early in the development process. It complements other testing methodologies, such as HiL and real-world testing, to ensure the overall safety and reliability of autonomous vehicle systems. |
Synthetic Data Generation (SDG) | SDG is a tool that helps overcome the limitations of real-world data collection while providing a safe, scalable, and cost-effective way to train and test AI and machine learning models for autonomous and semiautonomous driving systems. It plays a crucial role in accelerating the development and validation of these technologies. |
Software Defined Vehicle (SDV) | An SDV is a vehicle that relies heavily on a software-centric architecture like microservices or a Service Oriented Architecture (SOA) to control and manage various aspects of its operation, including navigation, safety, communication, and overall functionality. SDVs represent a shift from traditional, mechanically driven vehicles to vehicles in which software plays a central role in enabling autonomous driving and enhancing vehicle capabilities. |
Tagging or labeling | Tagging or labeling assigns descriptive labels or annotations to elements in sensor data (such as images, LiDAR point clouds, or sensor fusion data) to provide meaningful context and information for training, testing, or interpreting autonomous driving systems. These labels are essential for teaching autonomous vehicles how to understand and respond to their surroundings accurately. |
Validation | The process of evaluating a product, service, or system to determine whether it meets the needs of the customer and other identified stakeholders. Validation occurs after the completion of a specific module or the end of a project. |
Verification | The process of evaluating a product, service, or system to determine whether it meets the specified requirements, exact specifications, or standards. Verification checks that the product is being built correctly and occurs while the product, service, or system is still under development. |
Vision Language Models | Vision Language Models (VLMs) are used in AV2.0 (Autonomy 2.0) systems. These models play a crucial role in enabling vehicles to understand and process complex visual and textual information simultaneously. VLMs can help with tasks such as interpreting traffic signs, understanding scenes, and making decisions based on both visual inputs and contextual information from surrounding environments. This integration enhances the vehicle's ability to navigate more safely and efficiently in various driving conditions. |
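The LiDAR entry above describes the time-of-flight principle: range is derived from how long a laser pulse takes to return. A minimal sketch of that calculation, with an illustrative (not vendor-specific) helper function:

```python
# Illustrative sketch of the LiDAR time-of-flight principle:
# range = (speed of light * round-trip time) / 2.
# The function name and values here are hypothetical examples,
# not part of any specific LiDAR API.

SPEED_OF_LIGHT_M_S = 299_792_458.0  # meters per second

def lidar_range_m(round_trip_s: float) -> float:
    """Distance to a target given a laser pulse's round-trip time in seconds.

    The division by 2 accounts for the pulse traveling to the
    target and back.
    """
    return SPEED_OF_LIGHT_M_S * round_trip_s / 2.0

# A pulse that returns after roughly 66.7 nanoseconds indicates a
# target about 10 meters away.
distance = lidar_range_m(66.7e-9)
print(f"{distance:.2f} m")
```

The same half-round-trip logic applies to RADAR and SONAR ranging; only the propagation speed differs (the speed of sound for SONAR).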