Published March 12, 2024 | Version v1
Dataset | Open Access

Spike-timing based coding in neuromimetic tactile system enables dynamic object classification

Description

Coding dynamic tactile information in spike timing is essential to human haptic exploration and dexterous object manipulation. Conventional electronic skins generate frames of tactile signals upon interaction with objects and are unfortunately ill-suited for efficient coding of temporal information and rapid feature extraction. Here, we report a neuromorphic tactile system that uses spike timing, especially the first-spike timing, to code dynamic tactile information about touch and grasp. This strategy enables the system to seamlessly code highly dynamic information with millisecond temporal resolution on par with the biological nervous system, yielding dynamic extraction of tactile features. Upon interaction with objects, the system rapidly classifies them in the initial phase of touch and grasp, thus paving the way to fast tactile feedback desired for neuro-robotics and neuro-prosthetics.

Notes

Funding provided by: Swedish Research Council
Crossref Funder Registry ID: https://ror.org/03zttf063
Award Number: 2019-05484

Funding provided by: Swedish Research Council
Crossref Funder Registry ID: https://ror.org/03zttf063
Award Number: 2022-06725

Funding provided by: Swedish Foundation for Strategic Research
Crossref Funder Registry ID: https://ror.org/044wr7g58
Award Number: FUS21-0067

Funding provided by: European Union's Horizon 2020 Research and Innovation Programme
Award Number: 965044

Funding provided by: eSSENCE Research Program
Award Number:

Methods

Full methods are provided in the Supplementary Materials (SM). In brief, raw spike trains (voltage traces) were generated and collected with the e-skin while it touched surfaces and grasped objects. The raw data were converted to binary spike trains, with 1 denoting a spike and 0 denoting no spike. The processed spike trains were analyzed to evaluate the encoding performance of the system and were used to train and test spiking neural networks on classification tasks. The tabular data provided correspond to the main-text figures (Fig. 2, Fig. 3, and Fig. 4).
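The conversion step described above — thresholding a raw voltage trace into a binary spike train and reading out the first-spike timing — can be sketched as follows. This is a minimal illustration, not the authors' actual pipeline: the threshold value, sampling rate, and function names are hypothetical, and the real procedure is given in the SM.

```python
import numpy as np

def to_binary_spikes(voltage, threshold):
    """Threshold a raw voltage trace into a binary spike train
    (1 = spike, 0 = no spike), registering one spike per upward
    threshold crossing. Threshold choice is illustrative."""
    above = voltage > threshold
    # Indices where the trace crosses the threshold from below.
    crossings = np.flatnonzero(above[1:] & ~above[:-1]) + 1
    spikes = np.zeros(voltage.size, dtype=np.uint8)
    spikes[crossings] = 1
    return spikes

def first_spike_latency(spikes, fs):
    """Latency in seconds of the first spike at sampling rate fs (Hz),
    or None if no spike occurred."""
    idx = np.flatnonzero(spikes)
    return idx[0] / fs if idx.size else None

# Toy trace sampled at an assumed 1 kHz.
v = np.array([0.0, 0.2, 0.6, 0.3, 0.7, 0.1])
s = to_binary_spikes(v, threshold=0.5)   # -> [0, 0, 1, 0, 1, 0]
t0 = first_spike_latency(s, fs=1000.0)   # -> 0.002 s
```

First-spike latency is the quantity the dataset's coding scheme emphasizes: it is available within milliseconds of contact, before the full spike train has unfolded, which is what enables classification in the initial phase of touch and grasp.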

Files (768.6 kB)

README.md (preview available)

md5:4e6745645fdd24a4daaa8934e19137f9 (503.2 kB)
md5:03890f30bec304bd4f997c21ed843ed7 (20.1 kB)
md5:1666a4b6677af7bdeb6d95c3b46231d3 (9.0 kB)
md5:788127bcdd9146bff0d938eb1bc471e9 (10.3 kB)
md5:5690d7f389b8affcee950ad99906ce63 (35.8 kB)
md5:5171cbb463cda5ccc53d259c285f85de (38.5 kB)
md5:5ef60bbaae8594b8754ee661f375195b (9.0 kB)
md5:c7215349ca346216e9f06c70c95dce58 (8.9 kB)
md5:3d403d71e0bf4b8807fac4fc82ddc981 (9.0 kB)
md5:1fda3ba40159fa55396fe3e2cf0b8a93 (9.3 kB)
md5:a9f6728051718f653a53ea2a0561e8b0 (9.7 kB)
md5:f8701d44cfd1bf86c594e4e8c39727f0 (14.1 kB)
md5:69aec3327b9188c1226b847522984f28 (26.8 kB)
md5:e2d2c3d6d86e0ce135c731fa41b76b0d (30.2 kB)
md5:6bebbcfc2fc2e3f6b6714c4481dcf61e (30.1 kB)
md5:aa56c29d40499b8f1868a50781f8e268 (4.5 kB)