Blas Kolic, Juan Sabuco, and J. Doyne Farmer

In this paper we study the problem of inferring the initial conditions of a dynamical system under incomplete information. Studying several model systems, we infer the latent microstates that best reproduce an observed time series when the observations are sparse, noisy, and aggregated under a (possibly) nonlinear observation operator. We do this by minimizing the least-squares distance between the observed time series and a model-simulated time series using gradient-based methods. We validate the method on the Lorenz and Mackey-Glass systems by making out-of-sample predictions. Finally, we analyze the predictive power of our method as a function of the number of available observations. We find a critical transition for the Mackey-Glass system, beyond which it can be initialized with arbitrary precision.
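To illustrate the idea described above, the following is a minimal sketch (not the authors' code) of inferring a latent initial condition of the Lorenz system from partial, noisy observations by gradient descent on a least-squares loss, differentiating through the simulation with JAX. The integrator, step size, observation operator (first coordinate only, every fifth step), noise level, and optimizer settings are all illustrative assumptions, not details taken from the paper.

    # Sketch: gradient-based recovery of a latent Lorenz initial condition
    # from sparse, noisy observations of a single coordinate.
    import jax
    import jax.numpy as jnp

    SIGMA, RHO, BETA = 10.0, 28.0, 8.0 / 3.0   # standard Lorenz parameters
    DT, N_STEPS = 0.01, 200                    # assumed integration settings

    def lorenz_rhs(x):
        return jnp.array([
            SIGMA * (x[1] - x[0]),
            x[0] * (RHO - x[2]) - x[1],
            x[0] * x[1] - BETA * x[2],
        ])

    def rk4_step(x, _):
        # One fixed-step RK4 update of the Lorenz equations.
        k1 = lorenz_rhs(x)
        k2 = lorenz_rhs(x + 0.5 * DT * k1)
        k3 = lorenz_rhs(x + 0.5 * DT * k2)
        k4 = lorenz_rhs(x + DT * k3)
        x_next = x + DT / 6.0 * (k1 + 2 * k2 + 2 * k3 + k4)
        return x_next, x_next

    def simulate(x0):
        # Roll the dynamics forward; shape (N_STEPS, 3).
        _, traj = jax.lax.scan(rk4_step, x0, None, length=N_STEPS)
        return traj

    def observe(traj):
        # Assumed observation operator: first coordinate only, every 5th step.
        return traj[::5, 0]

    def loss(x0, y_obs):
        # Least-squares distance between observed and simulated observations.
        return jnp.sum((observe(simulate(x0)) - y_obs) ** 2)

    # Synthetic experiment: hidden true microstate, noisy sparse observations,
    # then plain gradient descent on the initial condition.
    key = jax.random.PRNGKey(0)
    x_true = jnp.array([1.0, 1.0, 1.0])
    y_obs = observe(simulate(x_true)) + 0.01 * jax.random.normal(key, (N_STEPS // 5,))

    grad_loss = jax.jit(jax.grad(loss))
    x0 = jnp.array([0.5, 1.5, 0.5])            # initial guess for the latent microstate
    for _ in range(2000):
        x0 = x0 - 1e-3 * grad_loss(x0, y_obs)
    print("recovered initial condition:", x0)

The key design point this sketch demonstrates is that the simulator itself is differentiable, so the gradient of the least-squares misfit with respect to the unknown microstate is obtained by automatic differentiation through the integration; any gradient-based optimizer could replace the plain descent loop used here.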