Conference paper Open Access

A Novel Posit-based Fast Approximation of ELU Activation Function for Deep Neural Networks

Cococcioni, Marco; Rossi, Federico; Ruffaldi, Emanuele; Saponara, Sergio

Nowadays, real-time applications increasingly exploit DNNs for computer vision and image recognition tasks. Such applications impose strict constraints in terms of both fast and efficient information representation and processing. New formats for representing real numbers have been proposed, and among them the Posit format appears to be very promising, providing means to implement fast approximated versions of activation functions widely used in DNNs. Moreover, information processing performance is continuously improving thanks to advanced vectorized SIMD (single-instruction multiple-data) processor architectures and instruction sets such as ARM SVE (Scalable Vector Extension). This paper explores both approaches (Posit-based implementation of activation functions and vectorized SIMD processor architectures) to obtain faster DNNs. The two proposed techniques are able to speed up both the DNN training and inference steps.

Files: ROSSI_C3_smart_computing_2020_su_3_pp.pdf (191.4 kB, md5:c3961c8b727d556a9817d356a46728d3)
