System Identification Through Causal Information Measures

System identification is the process of building a model of a physical system from observed experimental data. System ID is used in a wide variety of scientific and engineering applications, from weather prediction to aircraft design. Although numerous system ID algorithms have been developed to date, many current methods yield poorly performing models when applied to complicated physical systems with many interacting components. Recent advances in data analytics (the study of “big data”), however, have yielded new algorithms that can identify patterns, and specifically causal relationships, in data. This project explores how these new data analysis tools can inform the system ID process and enable a new class of system ID algorithms applicable to large-scale, complex systems. The resulting algorithms may be useful in difficult modeling and prediction problems such as atmospheric and climate prediction, modeling of biological systems, and financial market analysis, and may lead to better predictive models for many of these complex systems.

Despite extensive research in system identification over the past several decades, system ID tools for nonlinear or high-order systems remain underdeveloped and often suffer from convergence or computational issues. The research performed here leverages recent advances in the mathematics and data analytics communities to derive a fundamentally new approach to system identification based on information theory. At the core of this research is the concept of causation entropy, an entropic measure of information transfer within a dynamical system that can be computed directly from measured output data. Our work seeks to derive rigorous, causation entropy-based approaches for nonlinear parameter estimation and model order reduction, as well as to establish a fundamental realization theory for linear Gaussian systems using causation entropy. Furthermore, the problem of identifying input-output dynamics will be addressed from an information-theoretic perspective. Case studies will highlight the practicality of these new system identification methods in a range of real-world examples.
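For reference, causation entropy in the sense of Sun and Bollt (2014) quantifies the reduction in uncertainty about the future state of one variable gained by conditioning on the present state of another, beyond what a chosen conditioning set already explains. In the notation below (a standard form, included here for orientation), j is the candidate driver, i the target, and K the conditioning set:

```latex
% Causation entropy from node j to node i, conditioned on the node set K
% (definition following Sun and Bollt, 2014):
C_{x_j \to x_i \mid \mathbf{x}_K}
  = H\!\left( x_i^{(t+1)} \,\middle|\, \mathbf{x}_K^{(t)} \right)
  - H\!\left( x_i^{(t+1)} \,\middle|\, \mathbf{x}_K^{(t)},\, x_j^{(t)} \right)
```

For jointly Gaussian processes, such as the outputs of linear Gaussian state-space models, these conditional entropies reduce to log-determinants of conditional covariance matrices, which is what makes direct estimation from measured output data tractable.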

Figure: Magnitude of the components of the state update matrix (A) for a coupled harmonic oscillator system.

Figure: Magnitude of the causation entropy matrix components estimated from an output realization of the system.
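The figures above contrast the true state update matrix of a coupled harmonic oscillator model with a causation entropy matrix estimated purely from an output realization. The following is a minimal, self-contained sketch of that kind of comparison, not the project's actual experiment: the oscillator chain, damping, noise level, and Gaussian (log-variance) entropy estimator are all illustrative assumptions. Entries of the estimated entropy matrix should be appreciably nonzero only where the state update matrix has nonzero components.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative system: a damped chain of 3 coupled spring-mass oscillators,
# discretized with forward Euler. State is [positions; velocities].
n_osc, k_s, m, c, dt = 3, 1.0, 1.0, 0.5, 0.05
n = 2 * n_osc
Kstiff = 2.0 * k_s * np.eye(n_osc) - k_s * (np.eye(n_osc, k=1) + np.eye(n_osc, k=-1))
A = np.block([[np.eye(n_osc),          dt * np.eye(n_osc)],
              [-dt * Kstiff / m, (1.0 - c * dt) * np.eye(n_osc)]])

# Simulate one output realization driven by small process noise.
T = 5000
X = np.zeros((T, n))
for t in range(T - 1):
    X[t + 1] = A @ X[t] + 0.01 * rng.standard_normal(n)
x_prev, x_next = X[:-1], X[1:]

def cond_ent(y, Z):
    """Gaussian conditional differential entropy H(y | Z) estimated from samples
    via the Schur complement of the joint sample covariance."""
    S = np.cov(np.column_stack([y, Z]), rowvar=False)
    var = S[0, 0] - S[0, 1:] @ np.linalg.solve(S[1:, 1:], S[0, 1:])
    return 0.5 * np.log(2.0 * np.pi * np.e * var)

# Causation entropy C[i, j]: information transferred from state j at time t to
# state i at time t+1, conditioned on all remaining states at time t.
C = np.zeros((n, n))
for i in range(n):
    for j in range(n):
        rest = [col for col in range(n) if col != j]
        C[i, j] = cond_ent(x_next[:, i], x_prev[:, rest]) - cond_ent(x_next[:, i], x_prev)

np.set_printoptions(precision=3, suppress=True)
print(np.abs(A))   # magnitude of state update matrix components
print(C)           # estimated causation entropy matrix (should mirror the sparsity of A)
```

Conditioning on all remaining states, as done here, is one simple way to separate direct from indirect influences; other conditioning strategies are possible and are part of what the proposed research investigates.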

References:

Sun, J., Bollt, E., “Causation entropy identifies indirect influences, dominance of neighbors and anticipatory couplings,” Physica D, Vol. 267, pp. 49-57, 2014.