Published October 31, 2025 | Version v1
Journal article | Open Access

Testing the limits: exploring adversarial techniques in AI models

  • 1. Foundation for Research and Technology Hellas
  • 2. University of Piraeus
  • 3. SPHYNX Technology Solutions AG

Description

The rising adoption of artificial intelligence and machine learning in critical sectors underscores the pressing need for robust systems capable of withstanding adversarial threats. While deep learning architectures have revolutionized tasks such as image recognition, their susceptibility to adversarial techniques remains an open challenge. This article evaluates the impact of several adversarial methods, including the fast gradient sign method, projected gradient descent, DeepFool, and Carlini & Wagner, on five neural network models: a fully connected neural network, LeNet, a simple convolutional neural network (CNN), MobileNetV2, and VGG11. Using the EVAISION tool, developed explicitly for this research, these attacks were implemented and analyzed in terms of accuracy, F1-score, and misclassification rate. The results revealed varying levels of vulnerability across the tested models, with simpler architectures occasionally outperforming more complex ones. These findings emphasize the importance of selecting the most appropriate adversarial technique for a given architecture and tuning the associated attack parameters to achieve optimal results in each scenario.
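To illustrate the kind of attack and metric the abstract refers to, the sketch below shows the fast gradient sign method and a simple misclassification-rate measurement in PyTorch. This is a minimal illustration only, not the EVAISION tool or the article's evaluation pipeline; the function names, the epsilon value, and the [0, 1] input range are assumptions made for the example.

```python
# Minimal FGSM sketch (illustrative only; not the EVAISION implementation).
# Assumes a classifier `model` and inputs scaled to [0, 1].
import torch
import torch.nn.functional as F

def fgsm_attack(model, images, labels, epsilon=0.03):
    """Generate adversarial examples with the fast gradient sign method."""
    images = images.clone().detach().requires_grad_(True)
    loss = F.cross_entropy(model(images), labels)
    loss.backward()
    # Perturb each input one step in the direction that increases the loss.
    adv_images = images + epsilon * images.grad.sign()
    return adv_images.clamp(0.0, 1.0).detach()

def misclassification_rate(model, images, labels, epsilon=0.03):
    """Fraction of correctly classified samples that the attack flips."""
    model.eval()
    adv = fgsm_attack(model, images, labels, epsilon)
    with torch.no_grad():
        clean_pred = model(images).argmax(dim=1)
        adv_pred = model(adv).argmax(dim=1)
    correct = clean_pred == labels
    flipped = correct & (adv_pred != labels)
    return flipped.sum().item() / max(correct.sum().item(), 1)
```

The other attacks named in the abstract (projected gradient descent, DeepFool, Carlini & Wagner) follow the same pattern of gradient-guided perturbation but iterate or optimize the perturbation rather than taking a single signed step.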

Files

peerj-cs-3330.pdf (2.2 MB)
md5:c724812f5ca43996b92db2707305e14f

Additional details

Funding

European Commission
cPAID - Cloud-based Platform-agnostic Adversarial aI Defence framework 101168407
European Commission
AIAS - AI-ASsisted cybersecurity platform empowering SMEs to defend against adversarial AI attacks 101131292
European Commission
RESCALE - Revolutionised Enhanced Supply Chain Automation with Limited Threats Exposure 101120962
European Commission
ANTIDOTE - AI Attack and Defense for the Smart Healthcare 101183162