Journal article Open Access

An Intelligent Way for Discerning Plastics at the Shorelines and the Seas

Kyriaki Kylili; Constantinos Hadjistassou; Alessandro Artusi

Irrespective of how plastics litter the coastline or enter the sea, they pose a major threat to birds and marine life alike. In this study, an Artificial Intelligence tool was used to create an image classifier based on a Convolutional Neural Network architecture that utilises the Bottleneck Method. The trained Bottleneck Method image classifier was able to categorise plastics encountered either at the shoreline or floating at the sea surface into eight (8) distinct classes, namely, plastic bags, bottles, buckets, straws, derelict nets, fish, and other objects. Discerning objects with a success rate of 90%, the proposed Deep Learning approach constitutes a leap towards the automatic identification of plastics at the coastline and the sea. Training and testing loss and accuracy results for a range of epochs and batch sizes have lent credibility to the proposed method. Results originating from a resolution sensitivity analysis demonstrated that the prediction technique retained its ability to correctly identify plastics even when image resolution was downsized by 75%. Intelligent tools, such as the one suggested here, can replace the manual sorting of macroplastics by human operators, revealing, for the first time, the true scale of plastic pollution on our beaches and in the seas.
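The abstract itself does not include code, but the Bottleneck Method it names is a well-known transfer-learning pattern: a frozen, pre-trained convolutional base turns each image into a "bottleneck" feature vector once, and only a small classifier head is trained on those cached features. The sketch below illustrates this pattern under stated assumptions; the VGG16 backbone, the 224×224 input size, the head layout, and the placeholder data are illustrative choices, not the authors' published architecture.

```python
# Minimal sketch of a bottleneck-feature image classifier.
# Assumptions (not confirmed by the paper): VGG16 backbone pre-trained
# on ImageNet, 224x224 RGB inputs, a Dense(256) head, random stand-in data.
import numpy as np
import tensorflow as tf

NUM_CLASSES = 8            # bags, bottles, buckets, straws, nets, fish, ...
IMG_SHAPE = (224, 224, 3)  # assumed input size

# Frozen convolutional base: its weights are never updated.
backbone = tf.keras.applications.VGG16(
    weights="imagenet", include_top=False, input_shape=IMG_SHAPE)
backbone.trainable = False

def bottleneck_features(images):
    """Run images through the frozen base once and return the feature maps."""
    return backbone.predict(images, verbose=0)

# Small trainable head stacked on the cached bottleneck features.
head = tf.keras.Sequential([
    tf.keras.layers.Flatten(input_shape=backbone.output_shape[1:]),
    tf.keras.layers.Dense(256, activation="relu"),
    tf.keras.layers.Dropout(0.5),
    tf.keras.layers.Dense(NUM_CLASSES, activation="softmax"),
])
head.compile(optimizer="adam",
             loss="categorical_crossentropy",
             metrics=["accuracy"])

# Placeholder arrays stand in for the shoreline / sea-surface image set.
x = np.random.rand(16, *IMG_SHAPE).astype("float32") * 255.0
y = tf.keras.utils.to_categorical(
    np.random.randint(0, NUM_CLASSES, 16), NUM_CLASSES)

feats = bottleneck_features(tf.keras.applications.vgg16.preprocess_input(x))
head.fit(feats, y, epochs=2, batch_size=8, verbose=0)
```

One appeal of this arrangement, and plausibly why the authors could sweep a range of epochs and batch sizes as the abstract reports, is that the expensive convolutional pass runs only once per image; retraining the lightweight head with different hyperparameters is then cheap.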

This work has been partly supported by a project that has received funding from the European Union's Horizon 2020 research and innovation programme under grant agreement No 739578 (RISE – Call: H2020-WIDESPREAD-01-2016-2017-TeamingPhase2) and by the Government of the Republic of Cyprus through the Directorate General for European Programmes, Coordination and Development.
Files (11.5 MB)
Name: Pre-print-J-ESPR-2020.pdf
Size: 11.5 MB
md5: 2d0f6dd83563e3abdf870aafe43b5526
Views: 53
Downloads: 190
Data volume: 2.2 GB
Unique views: 39
Unique downloads: 183
