This paper introduces three stable classes of Neural Stochastic Differential Equations (Neural SDEs): the Langevin-type SDE, the Linear Noise SDE, and the Geometric SDE, designed to handle irregular time series data with missing values. Traditional time series methods assume regularly sampled, complete data, assumptions that often fail in real-world scenarios. Neural Ordinary Differential Equations (Neural ODEs) offer an alternative by learning continuous latent representations through parameterized vector fields. Neural SDEs extend Neural ODEs by incorporating a diffusion term, but this addition can lead to instability if not carefully designed. The proposed Neural SDEs are trained based on theoretically well-defined SDEs, ensuring stability and robustness.
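To make the drift/diffusion decomposition concrete, the sketch below simulates a generic SDE dz = f(z, t) dt + g(z, t) dW with the Euler-Maruyama scheme. This is an illustration of the general mechanism, not the paper's implementation: the geometric-type coefficients `f` and `g` (with hypothetical constants `mu` and `sigma`) are chosen only to show how a state-proportional diffusion term keeps the dynamics structurally constrained.

```python
import numpy as np

def euler_maruyama(f, g, z0, t_grid, rng):
    """Simulate dz = f(z, t) dt + g(z, t) dW with the Euler-Maruyama scheme."""
    z = np.asarray(z0, dtype=float)
    path = [z.copy()]
    for t0, t1 in zip(t_grid[:-1], t_grid[1:]):
        dt = t1 - t0
        dW = rng.normal(0.0, np.sqrt(dt), size=z.shape)  # Brownian increment
        z = z + f(z, t0) * dt + g(z, t0) * dW
        path.append(z.copy())
    return np.stack(path)

# Illustrative geometric-type dynamics: drift and diffusion both scale with
# the state, one example of a structural constraint on the coefficients.
mu, sigma = 0.05, 0.2
f = lambda z, t: mu * z          # drift
g = lambda z, t: sigma * z       # state-proportional diffusion

rng = np.random.default_rng(0)
t_grid = np.linspace(0.0, 1.0, 101)  # an irregular grid works the same way
path = euler_maruyama(f, g, np.ones(4), t_grid, rng)
print(path.shape)  # (101, 4): one state vector per time point
```

In a Neural SDE, `f` and `g` would be neural networks; the point of the paper's three classes is that their functional forms are restricted so the resulting SDE is well defined and the training dynamics stay stable.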
The study demonstrates that the proposed Neural SDEs maintain strong performance under distribution shift and effectively prevent overfitting. Extensive experiments on four benchmark datasets for interpolation, forecasting, and classification, as well as 30 public datasets under varying missing rates, show that the proposed method handles real-world irregular time series effectively, achieving state-of-the-art results with robustness to missing data and improved training stability. The study also highlights the importance of carefully designing the drift and diffusion functions to ensure the stability and performance of Neural SDEs. Although the proposed methods are less memory-efficient than CDE-based methods, they significantly enhance the stability of Neural SDE training and improve classification performance under challenging conditions. The paper concludes that the proposed Neural SDEs are a promising approach for handling irregular time series data with missing values.