Published February 16, 2024 | Version v1
Conference proceeding | Open Access

Analysing Adversarial Threats to Rule-Based Local-Planning Algorithms for Autonomous Driving

  • 1. Tallinn University of Technology

Contributors

Contact person:

  • 1. Tallinn University of Technology

Description

The safety and security of navigation and planning algorithms are essential for the adoption of autonomous driving in real-world operational environments. Adversarial threats to local-planning algorithms are a developing field. Attacks have primarily targeted trajectory-prediction algorithms, which the autonomous vehicle uses to predict the motion of the ego vehicle and other environmental objects when computing a safe planned route. This work extends the attack surface to a rule-based local-planning algorithm, specifically its cost-based planning function, which is used to estimate the safest and most efficient route. Targeting this algorithm, which is used in a real-world, operational autonomous vehicle program, we devise two attacks: (1) deviation of the lateral and longitudinal pose values, and (2) time-delay of the sensed-data input messages to the local-planning nodes. Using a low-fidelity simulation testing environment, we conduct a sensitivity analysis across multiple deviation ranges and time-delay durations. We find that the impact of the adversarial attack cases is visible in the rate of failure to complete the mission and in the occurrence of safety violations. The cost function is sensitive to deviations in lateral and longitudinal pose and to longer message delays. The sensitivity analysis suggests that minor deviations of the lateral and longitudinal pose values are an optimal range for the attacker's search space. One option for mitigating such attacks is for the AV to run a concurrent planning instance as a redundant process.
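
As an illustration of the two attack cases, the minimal Python sketch below shows how a bounded lateral/longitudinal pose deviation and a message time-delay could be injected into the sensed-data stream consumed by the local planner. The message format, field names, and deviation values are illustrative assumptions and are not taken from the paper's implementation.

    import random
    import time
    from dataclasses import dataclass

    @dataclass
    class PoseMessage:
        """Simplified sensed-data message carrying the vehicle pose (illustrative only)."""
        timestamp: float
        lateral: float       # lateral position, metres
        longitudinal: float  # longitudinal position, metres

    def deviate_pose(msg: PoseMessage, max_dev: float) -> PoseMessage:
        """Attack 1: add a bounded random deviation to the lateral and longitudinal pose."""
        return PoseMessage(
            timestamp=msg.timestamp,
            lateral=msg.lateral + random.uniform(-max_dev, max_dev),
            longitudinal=msg.longitudinal + random.uniform(-max_dev, max_dev),
        )

    def delay_message(msg: PoseMessage, delay_s: float) -> PoseMessage:
        """Attack 2: hold a sensed-data message for delay_s seconds before forwarding it."""
        time.sleep(delay_s)
        return msg

    if __name__ == "__main__":
        clean = PoseMessage(timestamp=time.time(), lateral=0.2, longitudinal=35.0)
        # Sensitivity sweep over example deviation ranges (metres) and delay durations (seconds).
        for max_dev in (0.1, 0.5, 1.0):
            print(max_dev, deviate_pose(clean, max_dev))
        for delay_s in (0.05, 0.1):
            print(delay_s, delay_message(clean, delay_s))

A sweep of this kind mirrors the sensitivity analysis described above: each perturbed or delayed message is fed to the planner, and mission failures and safety violations are recorded per deviation range and delay duration.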

Files

vehiclesec2023-23086-paper.pdf (1.2 MB)
md5:bef199bf7c43d09119625745989bc157

Additional details

Funding

IRIS – artificial Intelligence threat Reporting and Incident response System (101021727)
European Commission