Conference paper Open Access

A Novel Posit-based Fast Approximation of ELU Activation Function for Deep Neural Networks

Cococcioni, Marco; Rossi, Federico; Ruffaldi, Emanuele; Saponara, Sergio

Nowadays, real-time applications increasingly exploit DNNs for computer vision and image recognition tasks. Such applications pose strict constraints on both fast and efficient information representation and processing. New formats for representing real numbers have been proposed, and among them the Posit format appears to be very promising, providing a means to implement fast approximated versions of widely used activation functions in DNNs. Moreover, information processing performance is continuously improving thanks to advanced vectorized SIMD (single-instruction multiple-data) processor architectures and instruction sets such as ARM SVE (Scalable Vector Extension). This paper explores both approaches (Posit-based implementation of activation functions and vectorized SIMD processor architectures) to obtain faster DNNs. The two proposed techniques are able to speed up both the DNN training and inference steps.
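For reference, the exact ELU activation that the paper proposes to approximate is defined piecewise as ELU(x) = x for x > 0 and α(e^x − 1) otherwise. The sketch below shows this standard definition in plain floating point; it is only a baseline, and does not reproduce the paper's Posit-based fast approximation, whose bit-level details are not given in this abstract.

```python
import math

def elu(x: float, alpha: float = 1.0) -> float:
    """Exact ELU baseline: identity for positive inputs,
    alpha * (exp(x) - 1) for non-positive inputs."""
    return x if x > 0.0 else alpha * (math.exp(x) - 1.0)
```

A Posit-based approximation would replace the `math.exp` call with a cheap bit-manipulation on the posit encoding of `x`, trading a small accuracy loss for speed.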

Files: ROSSI_C3_smart_computing_2020_su_3_pp.pdf (191.4 kB, md5:c3961c8b727d556a9817d356a46728d3)