Published September 22, 2020 | Version v1
Conference paper | Open Access

A Novel Posit-based Fast Approximation of ELU Activation Function for Deep Neural Networks

  • 1. University of Pisa
  • 2. MMI spa

Description

Nowadays, real-time applications increasingly exploit DNNs for computer vision and image recognition tasks. Such applications impose strict constraints in terms of both fast and efficient information representation and processing. New formats for representing real numbers have been proposed, and among them the Posit format appears very promising, providing the means to implement fast approximated versions of activation functions widely used in DNNs. Moreover, information processing performance is continuously improving thanks to advanced vectorized SIMD (single-instruction multiple-data) processor architectures and instruction sets such as ARM SVE (Scalable Vector Extension). This paper explores both approaches (Posit-based implementation of activation functions and vectorized SIMD processor architectures) to obtain faster DNNs. The two proposed techniques are able to speed up both the DNN training and inference steps.
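
The record itself does not reproduce the approximation scheme, but for orientation: the ELU activation is defined as ELU(x) = x for x > 0 and α(e^x − 1) otherwise, so a naive implementation pays for an exp evaluation on every negative input. The appeal of posits here is that useful activation functions can be approximated by cheap bit manipulations of the posit encoding instead. The sketch below illustrates that idea with the well-known fast sigmoid trick for posits with no exponent bits (es = 0): flip the sign bit of the bit pattern and shift it logically right by two. This trick predates the paper and is not the ELU approximation proposed in it; the posit8_to_double decoder is an ad hoc helper written only to check the approximation numerically.

    #include <math.h>
    #include <stdint.h>
    #include <stdio.h>

    /* Exact ELU for reference: ELU(x) = x for x > 0, alpha*(exp(x) - 1) otherwise. */
    static double elu(double x, double alpha) {
        return x > 0.0 ? x : alpha * (exp(x) - 1.0);
    }

    /* Decode an 8-bit posit with es = 0 into a double (for checking only). */
    static double posit8_to_double(uint8_t p) {
        if (p == 0x00) return 0.0;
        if (p == 0x80) return NAN;                    /* NaR: not a real */
        double sign = (p & 0x80) ? -1.0 : 1.0;
        uint8_t u = (p & 0x80) ? (uint8_t)(-p) : p;   /* two's-complement absolute value */
        int lead = (u >> 6) & 1;                      /* first regime bit */
        int m = 0, i = 6;
        while (i >= 0 && ((u >> i) & 1) == lead) { m++; i--; }
        int k = lead ? m - 1 : -m;                    /* regime exponent (useed = 2 when es = 0) */
        int fn = (i > 0) ? i : 0;                     /* fraction bits below the regime terminator */
        uint32_t frac = fn ? (u & ((1u << fn) - 1u)) : 0u;
        return sign * ldexp(1.0 + frac / (double)(1u << fn), k);
    }

    /* Fast posit sigmoid (known posit<N,0> trick): flip the sign bit of the
       posit's bit pattern and shift it logically right by 2. The result,
       reinterpreted as a posit, approximates sigmoid(x) with no arithmetic. */
    static uint8_t fast_sigmoid_p8(uint8_t p) {
        return (uint8_t)((p ^ 0x80u) >> 2);
    }

    int main(void) {
        /* Measure the approximation error over every posit<8,0> value. */
        double max_err = 0.0;
        for (int i = 0; i < 256; i++) {
            uint8_t p = (uint8_t)i;
            if (p == 0x80) continue;                  /* skip NaR */
            double x = posit8_to_double(p);
            double approx = posit8_to_double(fast_sigmoid_p8(p));
            double err = fabs(approx - 1.0 / (1.0 + exp(-x)));
            if (err > max_err) max_err = err;
        }
        printf("max |fast_sigmoid - sigmoid| over posit<8,0>: %g\n", max_err);
        printf("exact ELU(-1, alpha=1): %g\n", elu(-1.0, 1.0));
        return 0;
    }

Because such approximations reduce to integer bit operations with no multiplier or exp evaluation, they pair naturally with wide integer SIMD pipelines such as ARM SVE, which, as the abstract suggests, is the combination the paper exploits.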

Files

ROSSI_C3_smart_computing_2020_su_3_pp.pdf (191.4 kB)
md5:c3961c8b727d556a9817d356a46728d3

Additional details

Related works

Is identical to
Conference paper: 10.1109/SMARTCOMP50058.2020.00053 (DOI)

Funding

EPI SGA1 – Specific Grant Agreement 1 of the European Processor Initiative (EPI), Grant No. 826647
European Commission