Decoupled Perception-Actuation Architecture for Low-Cost Level 2 ADAS Prototyping Using YOLO and ESP32
Description
Advanced Driver Assistance Systems (ADAS) are transforming modern transportation, but their proprietary and expensive nature limits accessibility for students and independent researchers. This paper presents a low-cost, functional Level 2 autonomous vehicle prototype that bridges the gap between advanced perception software and reproducible hardware. The architecture decouples high-level computer vision from low-level physical actuation: an external computational unit processes wireless visual telemetry using OpenCV for nearest-pair lane tracking and a fine-tuned YOLO model for dynamic object and traffic-sign recognition. Navigation commands are transmitted via UDP to an ESP32 microcontroller, which executes precise motor control using a Proportional-Integral-Derivative (PID) algorithm. The ESP32 also implements onboard safety reflexes, namely ultrasonic emergency braking and infrared-triggered corrective steering for lane correction, so that the vehicle remains safe regardless of network latency. Experimental results demonstrate a 90% object-detection success rate under optimal lighting and highly reliable obstacle avoidance, while also highlighting the environmental limitations of single-camera visual telemetry. This project demonstrates that complex autonomous-driving logic can be effectively simulated and studied on an affordable, small-scale platform.
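The command path described above (vision unit sends navigation commands over UDP; the microcontroller closes the motor loop with PID) can be sketched in Python. Everything below is an illustrative assumption, not code from the paper: the wire format (`"steer,throttle"` as ASCII), the ESP32 address `192.168.4.1:4210`, and the gain values are placeholders, and the actual PID in this project runs in C++ on the ESP32.

```python
import socket

# Assumed ESP32 soft-AP address and UDP port (hypothetical values).
ESP32_ADDR = ("192.168.4.1", 4210)

class PID:
    """Minimal Proportional-Integral-Derivative controller (illustrative)."""
    def __init__(self, kp, ki, kd):
        self.kp, self.ki, self.kd = kp, ki, kd
        self.integral = 0.0
        self.prev_error = 0.0

    def step(self, error, dt):
        # Accumulate the integral term and estimate the derivative
        # with a backward finite difference over the last time step.
        self.integral += error * dt
        derivative = (error - self.prev_error) / dt
        self.prev_error = error
        return self.kp * error + self.ki * self.integral + self.kd * derivative

def format_command(steer, throttle):
    """Encode one navigation command as a compact ASCII datagram."""
    return f"{steer:.2f},{throttle:.2f}".encode()

def send_command(sock, steer, throttle, addr=ESP32_ADDR):
    """Fire-and-forget UDP send; safety reflexes stay on the ESP32 side,
    so a lost or late datagram degrades responsiveness, not safety."""
    sock.sendto(format_command(steer, throttle), addr)
```

A typical usage would compute a lane-offset error from the vision pipeline, pass it through `PID.step()` to obtain a steering value, and transmit it with `send_command()` over a `socket.socket(socket.AF_INET, socket.SOCK_DGRAM)` socket. UDP is a natural fit here precisely because the ESP32's local reflexes tolerate dropped packets.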
Files
- Ahir_J_Level2_ADAS_Decoupled_Architecture_ESP32_YOLO.pdf (116.4 kB, md5: c811473543a17dbb93f76a6c8ebef8f5)
Additional details
Software
- Repository URL
- https://github.com/ahirjaydeep/Autonomous-AI-Car
- Programming language
- Python, C++, Rust
- Development Status
- Active