Published October 27, 2023 | Version v1
Software · Open Access

A computational neuroscience framework for quantifying warning signals

  • 1. University of St Andrews
  • 2. Newcastle University
  • 3. University of Bristol
  • 4. Abertay University


Animal warning signals show remarkable diversity, yet subjectively appear to share certain visual features that make defended prey stand out and look different from more cryptic palatable species. For example, many (but far from all) warning signals involve high contrast elements, such as stripes and spots, and often involve the colours yellow and red. How exactly do aposematic species differ from non-aposematic ones in the eyes (and brains) of their predators?

Here we develop a novel computational modelling approach to quantify prey warning signals and establish which visual features they share. First, we develop a model visual system, made of artificial neurons with realistic receptive fields, to provide a quantitative estimate of the neural activity in the first stages of a predator's visual system in response to a pattern. The system can be tailored to specific species. Second, we build a novel model that defines a 'neural signature': a set of quantitative metrics that measure the strength of stimulation of the neuronal population in response to patterns. This framework allows us to test how individual patterns stimulate the model predator visual system.
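As a hedged illustration of this kind of approach (not the authors' Matlab implementation), a centre-surround receptive-field bank and a toy 'neural signature' can be sketched in Python. The difference-of-Gaussians filter form, the filter sizes, and the two summary metrics below are all illustrative assumptions, not the metrics used in the paper:

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def dog_response(image, sigma_center, surround_ratio=1.6):
    """Centre-surround (difference-of-Gaussians) receptive-field response.
    The surround_ratio value is an illustrative placeholder."""
    center = gaussian_filter(image, sigma_center)
    surround = gaussian_filter(image, sigma_center * surround_ratio)
    return center - surround

def neural_signature(image, sigmas=(1, 2, 4, 8)):
    """Hypothetical population metrics: mean and peak absolute activity
    across a bank of receptive-field sizes (spatial scales)."""
    activity = np.abs(np.stack([dog_response(image, s) for s in sigmas]))
    return {"mean_activity": float(activity.mean()),
            "peak_activity": float(activity.max())}

# A high-contrast striped pattern drives this toy model more strongly
# than a low-contrast (cryptic) version of the same pattern.
x = np.linspace(0, 8 * np.pi, 128)
stripes = np.tile(np.sign(np.sin(x)), (128, 1))
cryptic = 0.1 * stripes
```

Because the filters are linear, scaling pattern contrast down by a factor of ten scales every response, and hence both metrics, down by the same factor.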

For the predator-prey system of birds foraging on lepidopteran prey, we compared the strength of stimulation of a modelled avian visual system in response to a novel database of hyperspectral images of aposematic and undefended butterflies and moths. Warning signals generate significantly stronger activity in the model visual system, setting them apart from the patterns of undefended species. The activity was also very different from that seen in response to natural scenes. Therefore, to their predators, lepidopteran warning patterns are distinct from their non-defended counterparts, and stand out against a range of natural backgrounds.

For the first time, we present an objective and quantitative definition of warning signals based on how the pattern generates population activity in a neural model of the brain of the receiver. This opens new perspectives for understanding and testing how warning signals have evolved, and, more generally, how sensory systems constrain signal design.


Neural model and computation of metrics

The software required for using the model, extracting the metrics from the model response, and generating the figures is Matlab (proprietary; MATLAB and Statistics Toolbox Release 2019b (R2019b), Natick, Massachusetts: The MathWorks Inc.). An open-source alternative for running the Matlab routines is GNU Octave.

Statistical analysis

The software required for the statistical analysis is R (free, open-source; R Core Team (2020). R: A Language and Environment for Statistical Computing. R Foundation for Statistical Computing, Vienna, Austria).

Funding provided by: Maria Zambrano Fellowship for attraction of international talent for the requalification of the Spanish university system—NextGeneration EU (ALRC)*
Crossref Funder Registry ID:
Award Number:

Funding provided by: Biotechnology and Biological Sciences Research Council
Crossref Funder Registry ID:
Award Number: BB/N006569/1

Funding provided by: Biotechnology and Biological Sciences Research Council
Crossref Funder Registry ID:
Award Number: BB/N00602X/1

Funding provided by: Biotechnology and Biological Sciences Research Council
Crossref Funder Registry ID:
Award Number: BB/N005945/1

Funding provided by: Biotechnology and Biological Sciences Research Council
Crossref Funder Registry ID:
Award Number: BB/N007239/1


Database construction

The novel database of lepidopteran patterns of aposematic and non-aposematic species consists of a representative set of 125 species of Lepidoptera across 12 families (96 aposematic and 29 non-aposematic species, with a total of 676 hyperspectral images; see the paper's Supplementary Material 1 for details). Samples of each species were located in museum collections (the Natural History Museum (BMNH), London, UK; the Manchester Museum (MMUE), Manchester, UK; and the American Museum of Natural History (AMNH), New York, USA). Their dorsal and ventral sides were photographed using an ultraviolet hyperspectral camera (Resonon Pika NUV, Resonon Inc., MT, USA) covering the 350–800 nm spectral range, with a spectral resolution of 1 nm. The camera was fitted with a near-ultraviolet 17 mm focal length objective lens. To maximize the homogeneity of the light field, the specimens were illuminated by four blue-enhanced halogen lamps (SoLux, 35 W, 12 V MR16 GU5.3 4700 K, EiKO Global, KS, USA) placed 22 cm apart on a square light fixture and oriented vertically toward the horizontal scanning plane. See the paper's Supplementary Methods 1 for details on the spatial and spectral calibration of the imaging system.
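To illustrate how such hyperspectral cubes might be used downstream (a sketch, not the paper's pipeline), the Python snippet below integrates a 350–800 nm image cube against a photoreceptor sensitivity curve to obtain one receptor catch per pixel. The `quantum_catch` helper and the Gaussian "UV cone" curve are hypothetical placeholders, not the avian cone sensitivities used in the paper:

```python
import numpy as np

def quantum_catch(cube, wavelengths, sensitivity, illuminant=None):
    """Integrate a hyperspectral cube (H x W x lambda) against a
    photoreceptor spectral sensitivity, optionally weighted by an
    illuminant spectrum, to get one catch value per pixel."""
    if illuminant is None:
        illuminant = np.ones_like(wavelengths)
    weights = sensitivity * illuminant
    dlam = np.mean(np.diff(wavelengths))   # 1 nm for this database
    return (cube * weights).sum(axis=-1) * dlam

# Toy example: 350-800 nm at 1 nm resolution, matching the database.
wl = np.arange(350, 801, 1.0)
cube = np.random.default_rng(0).random((4, 4, wl.size))
# Placeholder Gaussian "UV cone" sensitivity peaking at 370 nm.
uv_sens = np.exp(-0.5 * ((wl - 370) / 25) ** 2)
catches = quantum_catch(cube, wl, uv_sens)   # shape (4, 4)
```

Any real analysis would substitute measured cone sensitivities and a measured illuminant for these placeholders.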

The database is freely accessible at 

Image analysis – neural model of predator vision – computation of metrics (summary statistics)

The neural model of a predator visual system and the computation of the metrics of the modelled neural activity were coded in Matlab (MATLAB and Statistics Toolbox Release 2019b (R2019b), Natick, Massachusetts: The MathWorks Inc.). Please see the accompanying file and Supplementary Methods 2 and 3 for details.

Statistical analysis

The statistical analysis was done in R (R Development Core Team 2020) using generalized linear models (function glm) for the logistic regressions and the function glmer in the package lme4 (Bates et al. 2014) for fitting generalized linear mixed models. See Supplementary Method 4 for details.
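For readers without R, the logistic-regression step can be sketched in plain Python. This is a minimal gradient-ascent fit of the same binomial GLM on simulated data, not the authors' code; the "neural-activity metric" predictor and all coefficient values are invented for illustration:

```python
import numpy as np

def fit_logistic(x, y, steps=2000, lr=0.1):
    """Minimal logistic regression (R's glm with a binomial family fits
    the same model by iteratively reweighted least squares).
    Returns (intercept, slope)."""
    b0, b1 = 0.0, 0.0
    for _ in range(steps):
        p = 1 / (1 + np.exp(-(b0 + b1 * x)))
        b0 += lr * np.mean(y - p)          # gradient ascent on the
        b1 += lr * np.mean((y - p) * x)    # mean log-likelihood
    return b0, b1

# Simulated data: probability of being aposematic increases with a
# hypothetical neural-activity metric.
rng = np.random.default_rng(1)
x = rng.normal(size=500)
y = rng.binomial(1, 1 / (1 + np.exp(-(0.5 + 2 * x))))
b0, b1 = fit_logistic(x, y)
```

With a positive simulated slope, the fitted slope `b1` comes out positive, mirroring the qualitative result one would read off the R model summary.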


Files (152.8 kB)

