FAST AND ACCURATE DEEP NETWORK LEARNING BY EXPONENTIAL LINEAR UNITS (ELUs)

22 Feb 2016 | Djork-Arné Clevert, Thomas Unterthiner & Sepp Hochreiter
The paper introduces the "exponential linear unit" (ELU), a new activation function designed to speed up learning in deep neural networks and improve classification accuracy. Unlike rectified linear units (ReLUs), leaky ReLUs (LReLUs), and parametrized ReLUs (PReLUs), ELUs take on negative values, which push mean activations closer to zero, reducing bias shift and speeding up learning. ELUs saturate to a fixed negative value for large negative inputs, decreasing the variation and information propagated to the next layer, which makes them noise-robust and computationally efficient. Experimental results show that ELUs outperform other activation functions on various datasets, including CIFAR-100, achieving the best published result on that dataset without multi-view evaluation or model averaging. ELU networks also outperform ReLU networks with batch normalization, demonstrating superior learning behavior and generalization.
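The ELU itself is simple to state: it is the identity for positive inputs and α(eˣ − 1) for non-positive inputs, saturating to −α. A minimal NumPy sketch (illustrative, not the authors' code; the `elu` and `relu` helpers here are our own names) shows the definition and the mean-shift effect described above:

```python
import numpy as np

def elu(x, alpha=1.0):
    """Exponential linear unit: x for x > 0, alpha * (exp(x) - 1) otherwise.
    Saturates to -alpha for large negative inputs."""
    return np.where(x > 0, x, alpha * np.expm1(x))

def relu(x):
    """Rectified linear unit, for comparison."""
    return np.maximum(x, 0.0)

# Because ELU outputs can be negative, mean activations over
# zero-centered inputs land closer to zero than ReLU's do.
rng = np.random.default_rng(0)
x = rng.standard_normal(10_000)
print(abs(elu(x).mean()) < abs(relu(x).mean()))  # ELU mean is closer to zero
```

The smaller mean activation is what the paper credits for reduced bias shift and faster learning; α controls the saturation level (α = 1 in the paper's main experiments).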