Modeling complex time series presents unique challenges: high dimensionality, temporal dependencies, non-stationarity, and limited interpretability. In this talk, we present recent advances in deep learning architectures for time series analysis, focusing on two developments: the Deep Structured State Space Model (DS3M) and the functional Neural Tangent Kernel (fNTK). DS3M is a modular, interpretable architecture that combines the expressive power of deep neural networks with the dynamic modeling capabilities of state-space representations. By integrating recurrent latent structures with external inputs and temporal abstractions, DS3M captures both short- and long-term dependencies, enabling accurate forecasting, robust anomaly detection, and causal reasoning in irregular or partially observed systems. Complementing this, the fNTK estimator captures nonlinear interactions across functional time series in a theoretically grounded and computationally efficient manner. We establish its connection to functional kernel regression and demonstrate its empirical advantages on more than 6 million S&P 500 Index option contracts spanning January 2009 to December 2021, where the fNTK consistently improves forecasting accuracy across multiple horizons. Together, these approaches address key challenges in complex time series modeling: empirical results across multiple domains show that DS3M and the fNTK achieve state-of-the-art performance while offering enhanced interpretability, solid theoretical foundations, and scalability.
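To fix ideas on the state-space side: DS3M builds deep networks on top of latent state-space dynamics. The sketch below is a generic textbook linear-Gaussian Kalman filter, not the DS3M architecture itself; it only illustrates the predict/update latent-state recursion that state-space models rely on. All names and the toy data are illustrative assumptions.

```python
import numpy as np

def kalman_filter(y, A, C, Q, R, x0, P0):
    """Filter observations y through x_t = A x_{t-1} + w_t, y_t = C x_t + v_t.

    Generic linear-Gaussian filter (illustration only, not DS3M itself).
    """
    x, P = x0, P0
    means = []
    for obs in y:
        # Predict step: propagate the latent state and its uncertainty
        x = A @ x
        P = A @ P @ A.T + Q
        # Update step: correct the prediction with the new observation
        S = C @ P @ C.T + R                 # innovation covariance
        K = P @ C.T @ np.linalg.inv(S)      # Kalman gain
        x = x + K @ (obs - C @ x)
        P = (np.eye(len(x)) - K @ C) @ P
        means.append(x.copy())
    return np.array(means)

# Toy example: noisy observations of a slowly drifting level
rng = np.random.default_rng(1)
true_level = np.cumsum(rng.normal(scale=0.05, size=100))
y = (true_level + rng.normal(scale=0.5, size=100))[:, None]

A = np.eye(1); C = np.eye(1)
Q = np.array([[0.05**2]]); R = np.array([[0.5**2]])
est = kalman_filter(y, A, C, Q, R, x0=np.zeros(1), P0=np.eye(1))
```

Deep state-space models such as DS3M replace the fixed linear maps `A` and `C` with learned neural transition and emission functions while keeping the same latent-state recursion.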
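On the kernel side, the link between neural networks and kernel regression can be illustrated with the empirical Neural Tangent Kernel of a small network. The following sketch is a generic finite-width NTK computed by hand for a one-hidden-layer network and plugged into kernel ridge regression; it is not the paper's functional NTK (fNTK) estimator, and all sizes and data are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

m, d = 64, 1                       # hidden width, input dimension (toy choices)
W = rng.normal(size=(m, d))        # first-layer weights
a = rng.normal(size=m)             # output weights

def jac(x):
    """Parameter gradient of f(x) = (1/sqrt(m)) * a . tanh(W x)."""
    h = np.tanh(W @ x)                                    # hidden activations
    da = h / np.sqrt(m)                                   # grad w.r.t. a
    dW = (a * (1 - h**2))[:, None] * x / np.sqrt(m)       # grad w.r.t. W
    return np.concatenate([da, dW.ravel()])

# Toy regression data
X = np.linspace(-2, 2, 20)[:, None]
y = np.sin(X[:, 0])

J = np.stack([jac(x) for x in X])   # (n, n_params) Jacobian at initialization
K = J @ J.T                         # empirical NTK Gram matrix: K_ij = <grad_i, grad_j>

lam = 1e-3
alpha = np.linalg.solve(K + lam * np.eye(len(X)), y)   # kernel ridge coefficients
y_hat = K @ alpha                                       # in-sample predictions
```

The fNTK extends this kind of kernel construction from vector-valued inputs to functional time series, which is what yields the connection to functional kernel regression mentioned above.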
Underlying papers:
https://www.tandfonline.com/doi/full/10.1080/07350015.2025.2489087
https://www.sciencedirect.com/science/article/abs/pii/S0169207025000433
Personal website of Ying Chen
