Quick Takeaways
  • High-resolution radar combined with accelerated in-vehicle AI enables reliable hands-off, eyes-off driving at highway speeds.
  • The solution delivers LiDAR-like perception performance with lower cost and all-weather reliability, improving scalability for production vehicles.
On January 5, Arbe Robotics announced a major step forward in AI-based perception for hands-off, eyes-off driving, revealing the integration of its high-definition radar technology with NVIDIA's accelerated computing platforms. The collaboration aims to enable safer, more predictable automated driving by combining high-resolution radar sensing with advanced in-vehicle AI processing.
The automotive-grade radar system is designed to deliver exceptionally high detection accuracy, producing more than 20,000 detections per frame through 2,304 channels. This level of detail significantly enhances object recognition and situational awareness, particularly in complex highway driving environments where reliability is critical.
AI-Based Perception for Hands-Off, Eyes-Off Driving at Highway Speeds
The Arbe HD radar solution is engineered to operate consistently across all weather and lighting conditions. By combining perception-grade radar with NVIDIA's accelerated computing, the system supports AI-based perception for hands-off, eyes-off driving at highway speeds, delivering smooth, human-like driving behavior that enhances safety and builds consumer confidence.
Unlike traditional sensing approaches, the radar-based perception system maintains accuracy in rain, fog, dust, and low-visibility scenarios. This ensures predictable performance and redundancy, which are essential for higher levels of automated driving.
High-Resolution Radar Enables LiDAR-Like Capabilities
Arbe will also demonstrate how its high-resolution radar delivers LiDAR-like performance while reducing overall sensor costs. The technology improves system redundancy by complementing existing sensor suites and minimizing dependence on expensive hardware, making advanced driver assistance and automation more scalable for production vehicles.
Key technical highlights include:
  • More than 20,000 detections per frame for precise object classification
  • 2,304 radar channels enabling fine-grained environmental mapping
  • Reliable operation in all environmental conditions
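To make the scale of this output concrete, the sketch below shows one way a per-frame radar point cloud might be represented and projected into vehicle-frame coordinates. This is an illustrative assumption, not Arbe's actual data format: the field names, units, and conversion are generic radar conventions (range/azimuth/elevation plus Doppler per detection), and a real HD radar frame would carry tens of thousands of such detections.

```python
import math
from dataclasses import dataclass
from typing import List, Tuple

@dataclass
class RadarDetection:
    """One raw detection: spherical position plus radial (Doppler) velocity.
    Illustrative structure only -- not Arbe's actual interface."""
    range_m: float        # radial distance to target (meters)
    azimuth_rad: float    # horizontal angle (radians)
    elevation_rad: float  # vertical angle (radians)
    doppler_mps: float    # radial velocity (m/s)

def to_cartesian(d: RadarDetection) -> Tuple[float, float, float]:
    """Project a spherical detection into vehicle-frame x/y/z coordinates."""
    ground = d.range_m * math.cos(d.elevation_rad)   # projection onto x/y plane
    x = ground * math.cos(d.azimuth_rad)             # forward
    y = ground * math.sin(d.azimuth_rad)             # left
    z = d.range_m * math.sin(d.elevation_rad)        # up
    return (x, y, z)

# A frame is simply a list of detections; an HD radar may emit
# more than 20,000 of these per frame.
frame: List[RadarDetection] = [
    RadarDetection(50.0, 0.0, 0.0, -3.2),                 # straight ahead, closing
    RadarDetection(30.0, math.radians(10), 0.0, 0.0),     # 10 degrees to the left
]
points = [to_cartesian(d) for d in frame]
```

Downstream perception stages (clustering, classification, occupancy mapping) typically consume these Cartesian points rather than the raw spherical measurements.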

AI-Based Occupancy Mapping and In-Vehicle Integration
Live demonstrations will showcase an AI-based occupancy grid developed with Perciv AI, creating a detailed, real-time representation of the vehicle’s surroundings. This grid-based perception enhances path planning and decision-making by accurately identifying free space, obstacles, and dynamic objects.
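The details of the Arbe/Perciv AI occupancy grid are not public, but the general technique is well established: discretize the area around the ego vehicle into cells and mark cells supported by sensor returns. The minimal sketch below illustrates the idea; the grid dimensions, cell size, and binary occupancy flag are assumptions for illustration, and a production system would instead accumulate probabilistic log-odds per cell and ray-trace free space along each return.

```python
import math

# Illustrative parameters -- assumptions, not Arbe/Perciv AI specifications.
CELL_M = 0.5             # cell edge length in meters
GRID_SIZE = 200          # 200 x 200 cells -> a 100 m x 100 m area
ORIGIN = GRID_SIZE // 2  # ego vehicle sits at the grid center

def update_grid(grid, detections):
    """Mark each cell containing a radar detection as occupied.

    `detections` is a list of (x, y) points in the vehicle frame, in meters.
    Detections that fall outside the mapped area are ignored.
    """
    for x, y in detections:
        col = ORIGIN + int(round(x / CELL_M))
        row = ORIGIN + int(round(y / CELL_M))
        if 0 <= row < GRID_SIZE and 0 <= col < GRID_SIZE:
            grid[row][col] = 1
    return grid

grid = [[0] * GRID_SIZE for _ in range(GRID_SIZE)]
update_grid(grid, [(10.0, 0.0), (10.0, 0.4), (-5.0, 2.0)])
occupied = sum(map(sum, grid))  # number of occupied cells
```

A planner can then query this grid directly: free cells define drivable space, occupied cells define obstacles, and comparing grids across frames separates static structure from dynamic objects.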
The system also integrates Arbe’s high-resolution radar data with the NVIDIA DRIVE AGX Orin in-vehicle computing platform. This integration enables real-time processing of complex perception data, supporting advanced automation features while meeting automotive-grade performance and safety requirements.