
Driver Assistance System

A complete ADAS suite — six integrated safety systems monitoring both the road and the driver, running in real time on edge hardware.

B.Tech Final Year · 2024
MLR Institute of Technology
YOLOv8 · OpenCV · TensorRT · Jetson Orin Nano

Road accidents are among the leading causes of injury and death globally. This project addresses the problem by building a complete Advanced Driver Assistance System (ADAS) that monitors both the road and the driver simultaneously — alerting in real time and even taking corrective action when needed.

System Overview

Six integrated safety components

01 · Drowsiness Detection: Monitors Eye Aspect Ratio (EAR) throughout the drive to detect fatigue and alert the driver before it becomes dangerous.
02 · Lane Detection: An Ultrafast lane detector provides real-time spatial awareness by recognising road markings across varied conditions.
03 · Lane Departure Warning: Triggers an alert the moment the vehicle drifts out of its lane without signalling.
04 · Lane Keeping Assist: Detects sustained lane deviation and feeds corrections to the vehicle's steering system to bring it back into the lane.
05 · Object Detection: YOLO-based recognition of pedestrians, vehicles, and other road objects in real time, at both close and far range.
06 · Collision Warning: Combines OpenCV-based distance measurement with object detection to warn the driver of imminent collision risks.

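The collision-warning component pairs each detection with a distance estimate. A minimal sketch of one common approach — monocular distance from bounding-box width via the pinhole-camera model — where the focal length, object widths, and warning threshold are illustrative assumptions, not calibration values from this project:

```python
# Pinhole-model distance estimate:
#   distance = real_width * focal_length_px / perceived_width_px
# All constants below are hypothetical, for illustration only.

KNOWN_WIDTHS_M = {"car": 1.8, "person": 0.5}  # assumed average widths (m)
FOCAL_LENGTH_PX = 700.0                        # assumed calibrated focal length

def estimate_distance(label: str, bbox_width_px: float) -> float:
    """Rough distance (metres) to a detected object from its box width."""
    return KNOWN_WIDTHS_M[label] * FOCAL_LENGTH_PX / bbox_width_px

def collision_risk(label: str, bbox_width_px: float,
                   threshold_m: float = 10.0) -> bool:
    """Flag any object estimated closer than the warning threshold."""
    return estimate_distance(label, bbox_width_px) < threshold_m
```

For example, a car whose bounding box spans 126 px would be estimated at about 10 m under these constants; a real deployment would calibrate the focal length per camera.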
Architecture

Two points of view

The system operates with two independent camera perspectives running concurrently:

Overall proposed system architecture
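The two concurrent pipelines can be sketched as independent threads, one per camera, so road monitoring (POV 1) and driver monitoring (POV 2) never block each other. Frame capture and inference are stubbed with plain values here; the real system would read camera frames and run detectors in each loop:

```python
# Dual-POV sketch: each camera pipeline runs on its own thread and
# pushes any alerts onto a shared queue. Analysers are stand-in stubs.
import threading
import queue

alerts: "queue.Queue[str]" = queue.Queue()

def pov_worker(name, frames, analyse):
    """Per-camera loop: consume frames, run analysis, queue any alerts."""
    for frame in frames:
        result = analyse(frame)
        if result:
            alerts.put(f"{name}: {result}")

# Stubs standing in for the object/lane detectors and drowsiness detector.
road = threading.Thread(
    target=pov_worker,
    args=("POV1", [1, 2, 3], lambda f: "obstacle ahead" if f == 2 else None),
)
driver = threading.Thread(
    target=pov_worker,
    args=("POV2", [1, 2, 3], lambda f: "driver drowsy" if f == 3 else None),
)
road.start(); driver.start()
road.join(); driver.join()

results = []
while not alerts.empty():
    results.append(alerts.get())
print(sorted(results))  # → ['POV1: obstacle ahead', 'POV2: driver drowsy']
```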

POV 1 — Monitoring the external environment

Handles lane tracking, object detection, and collision warning. Three UML diagrams define the class relationships, collaboration, and use-case flows; the class and sequence diagrams are shown below.
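At its core, the lane-departure check compares the camera's horizontal centre against the midpoint of the two detected lane lines. A minimal sketch, where the lane x-positions would come from the lane detector and the pixel threshold is an assumed tuning constant:

```python
# Lane-departure check reduced to its essentials. Lane-line x-positions
# (pixels, at the bottom of the frame) are supplied by the detector;
# frame width and threshold below are illustrative values.

def lane_offset(left_x: float, right_x: float, frame_width: int) -> float:
    """Signed offset (px) of the vehicle centre from the lane centre."""
    lane_centre = (left_x + right_x) / 2.0
    return frame_width / 2.0 - lane_centre

def departure_warning(left_x: float, right_x: float,
                      frame_width: int = 1280,
                      threshold_px: float = 80.0) -> bool:
    """Warn when the vehicle drifts too far from the lane centre."""
    return abs(lane_offset(left_x, right_x, frame_width)) > threshold_px
```

The same signed offset, fed through a controller rather than a threshold, is what a lane-keeping assist would use to derive a steering correction.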

Class UML — POV 1 external environment
Sequence UML — request-response flow

POV 2 — Monitoring the internal environment

Handles drowsiness detection using Eye Aspect Ratio (EAR). If EAR drops below a threshold for a sustained period, the system triggers an alert.
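The EAR-plus-counter logic described above can be sketched directly. EAR is the standard ratio over six eye landmarks, EAR = (|p2−p6| + |p3−p5|) / (2·|p1−p4|); the threshold and frame count below are illustrative, and the project's tuned values may differ:

```python
# Eye Aspect Ratio (EAR) drowsiness check: a low EAR means the eye is
# closing; a counter requires the closure to be sustained before alerting.
import math

def dist(a, b):
    """Euclidean distance between two (x, y) landmarks."""
    return math.hypot(a[0] - b[0], a[1] - b[1])

def eye_aspect_ratio(eye):
    """eye: six (x, y) landmarks p1..p6 around one eye."""
    p1, p2, p3, p4, p5, p6 = eye
    return (dist(p2, p6) + dist(p3, p5)) / (2.0 * dist(p1, p4))

EAR_THRESHOLD = 0.25   # below this the eye counts as closed (assumed)
CONSEC_FRAMES = 20     # sustained closed frames before alerting (assumed)

def update(counter: int, ear: float) -> "tuple[int, bool]":
    """Advance the closed-eye counter; return (new_counter, alert?)."""
    if ear < EAR_THRESHOLD:
        counter += 1
        return counter, counter >= CONSEC_FRAMES
    return 0, False
```

An open eye yields an EAR well above the threshold, a nearly closed one well below it, and a single blink resets the counter before it can trigger.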

Drowsiness detection UML diagram
Drowsiness detection algorithm pseudocode

Results

System in action

Front Collision Warning System — live detection
Lane Departure Warning System — deviation alert
Lane Keeping Assist System — steering correction
POV 2 — Active, alert driver (normal EAR)
POV 2 — Drowsy driver detected (EAR below threshold)

Hardware

Jetson-powered edge deployment

Jetson Orin Nano
Pi Camera Module 3
Servo Motor
Arduino Nano
ADXL345 Accelerometer
NEO-6M GPS Module
SIM800L GSM Module
LM2596 Step-Down Converter
Power Bank (10,000–20,000 mAh)
12V 2A Power Supply
Full hardware integration schematic

Setup

Running the project

Bash
# Install dependencies
pip install -r requirements.txt

# Run POV_1 — external environment monitoring
python demo.py

# Run POV_2 — drowsiness detection (from Drowsiness detector dir)
python detect.py

YOLO model conversion

Bash
# Convert ONNX model to TensorRT for edge deployment
python convertOnnxToTensorRT.py -i <onnx-model> -o <trt-model>

# Quantize to float16 to reduce model size
python onnxQuantization.py -i <onnx-model>
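
Float16 quantisation roughly halves the model's storage and memory footprint, since each weight drops from 4 bytes to 2. A quick illustration of that arithmetic on a dummy weight tensor — this is not the project's onnxQuantization.py, just the size effect it relies on:

```python
# Each float32 weight takes 4 bytes; float16 takes 2, so a converted
# model's weight storage shrinks by a factor of ~2.
import numpy as np

weights_fp32 = np.random.randn(1000, 1000).astype(np.float32)
weights_fp16 = weights_fp32.astype(np.float16)

print(weights_fp32.nbytes // weights_fp16.nbytes)  # → 2
```

The trade-off is reduced numeric precision, which YOLO-style detectors generally tolerate well — hence its use here for edge deployment.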
Outcome
A safety system that thinks faster than the driver can react.

This project demonstrates that a comprehensive ADAS suite — normally found in premium vehicles — can be built and deployed on affordable edge hardware like the Jetson Orin Nano. The dual-POV architecture means the system simultaneously watches the road and the driver, providing layered protection that neither a human nor a single-camera system can match alone.

Hemanth Sai .M
MS AI · Northeastern University