NVIDIA DRIVE IX gives AV developers the means to interact with the vehicle as occupants. DRIVE IX is the integrated vision, voice, and graphics user experience.


DRIVE IX is suited for the following objectives:

  • Driver Monitoring.
  • Human/Machine Interface (HMI): a user interface using voice, speech, natural language, and gesture for the driver and vehicle occupants.
  • Visualization, both for production and for development.
    • Production visualization may include confidence view in the instrument cluster.
    • Development visualization may include camera/sensor inputs augmented with useful statistics.

Details


AI CoPilot

AI CoPilot provides a driver monitoring system using a driver-facing camera, an IR LED, and deep learning software running on the NVIDIA DRIVE AGX system in the car. This module enables the development of applications that help ensure drivers stay alert, or that take action if a driver is distracted or drowsy. For example, it tracks the head and eyes to understand where the driver is paying attention, and monitors blink frequency to assess fatigue and drowsiness. Key capabilities of AI CoPilot include 3D gaze detection, drowsiness detection, distraction detection, and head pose detection.


Visualization

DRIVE IX takes input from the DRIVE AV and AI Cockpit applications to provide real-time information on the instrument cluster for the driver, as well as visual information about what the autonomous vehicle or driver monitoring systems are viewing. AutoPilot Monitor, or Confidence View, is the visualized view for the instrument cluster of what the autonomous driving application perceives and acts upon. The AV Viz and IX Viz applications take camera and sensor feeds from the DRIVE AV and DRIVE IX applications, augmented with information useful to a software developer (such as bounding boxes and statistics).

Figures: Confidence View, DRIVE Hub, and IX Viz.

AI Assistant

AI Assistant enables voice recognition, lip reading, face identification, and emotion detection for an AI-powered human/machine interface (HMI). Its main capabilities are automatic speech recognition (ASR), natural language understanding (NLU), face detection, and emotion detection.



DRIVE IX In Action

The intelligent user experience cockpit, powered by NVIDIA DRIVE IX, leverages AI to help keep the driver’s attention on the road and to deliver seamless convenience features. With advanced driver monitoring, the system can determine whether a driver is drowsy or distracted, and can recognize the driver for easy vehicle access.

Developing with DRIVE IX


How to set up

You will need:

Steps:


How to develop

Development tasks
Build applications using the APIs for Driver Monitoring

The DriveIX client library provides a C-styled interface for interacting with NVIDIA's AI-driven DriveIX service. Users can obtain the data inferences that DriveIX produces through this interface.

Available features are:

  • Gaze
  • Head pose
  • Face detect
  • Face identification plugin
  • Eye open/close
  • Drowsiness
  • Distraction

Consult the DRIVE IX API section of the DRIVE Software Documentation for details of the NVIDIA DRIVE IX client library APIs.

Use DRIVE Hub to launch visualization applications

NVIDIA DRIVE Hub provides a unified user interface to launch all DRIVE Software applications using the display inside the car.

Requires the DRIVE Hyperion Developer Kit.

Consult the DRIVE IX section included in the DRIVE Software Documentation.

Record your drive using the DRIVE Hub interface: click the "RECORD DRIVE" button on the AppSelect screen to start recording.


Additional Development Resources:

Documentation