17 Jun 2020 | Vincent Sitzmann*, Julien N. P. Martel*, Alexander W. Bergman, David B. Lindell, Gordon Wetzstein
The paper introduces a novel approach called Sinusoidal Representation Networks (SIRENs) that leverage periodic activation functions to enhance the representation of complex natural signals and their derivatives. SIRENs are designed to address the limitations of traditional neural networks, which struggle with fine details and higher-order derivatives of signals. The authors propose an initialization scheme for SIRENs that ensures the distribution of activations is preserved through the network, leading to faster convergence and better performance. SIRENs are demonstrated to effectively represent images, videos, audio signals, and their derivatives, as well as solve challenging boundary value problems such as the Poisson equation, Helmholtz equation, and wave equations.
The paper also explores the use of SIRENs in solving inverse problems and combining them with hypernetworks to learn priors over the space of implicit functions. The results show that SIRENs outperform ReLU-based networks in terms of accuracy and efficiency, making them a powerful tool for various applications in scientific and engineering fields.
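The core idea described above can be sketched in a few lines of NumPy: a stack of linear layers with sine activations, initialized so the distribution of activations stays stable through depth. This is a minimal illustrative sketch, not the authors' implementation; the frequency scale `OMEGA_0 = 30` and the uniform initialization bounds follow the scheme the paper proposes, while the layer sizes and function names here are arbitrary choices for the example.

```python
import numpy as np

OMEGA_0 = 30.0  # frequency scale used with the sine activation, as proposed in the paper

def init_siren_layer(fan_in, fan_out, is_first, rng):
    # First layer: U(-1/fan_in, 1/fan_in).
    # Hidden layers: U(-sqrt(6/fan_in)/OMEGA_0, sqrt(6/fan_in)/OMEGA_0),
    # which keeps the pre-activation distribution stable through depth.
    bound = 1.0 / fan_in if is_first else np.sqrt(6.0 / fan_in) / OMEGA_0
    W = rng.uniform(-bound, bound, size=(fan_in, fan_out))
    b = rng.uniform(-bound, bound, size=(fan_out,))
    return W, b

def siren_forward(x, layers):
    # Hidden layers apply sin(OMEGA_0 * (x W + b)); the last layer is
    # linear so the output is not confined to [-1, 1].
    *hidden, (W_out, b_out) = layers
    for W, b in hidden:
        x = np.sin(OMEGA_0 * (x @ W + b))
    return x @ W_out + b_out

rng = np.random.default_rng(0)
dims = [2, 64, 64, 1]  # e.g. 2-D pixel coordinates -> 1 grayscale value
layers = [init_siren_layer(dims[i], dims[i + 1], i == 0, rng)
          for i in range(len(dims) - 1)]

coords = rng.uniform(-1.0, 1.0, size=(16, 2))  # sample input coordinates
out = siren_forward(coords, layers)
```

Because sine is smooth and its derivative is again a (shifted) sine, the derivatives of such a network are themselves SIREN-like, which is what lets the method fit signals through their gradients or solve the boundary value problems mentioned above.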