
“Lex ex Machina”
Over the next few years, Rubin/LSST, Euclid, Roman, SKA, and other instruments will produce petascale, information-rich datasets that trace stars, galaxies, and large-scale structure with unprecedented fidelity. Hidden in these data may be regularities that point to new and unexpected physical relationships. Can we build modelling frameworks that discover such relationships accurately, efficiently, and in forms we can interpret?
After a general introduction to optimization, I will present NestyNet, a new modelling and analysis tool we have developed over the last three years. It assembles networks with analytic derivatives and trains them with second-order methods, yielding fast, high-accuracy fits to datasets, as well as solvers for ordinary and partial differential equations, action-angle transformations, Gaussian-mixture inference, and dynamical modelling, with exact gradients and Hessians throughout. It is particularly powerful as a means to represent, analyse, and simulate physical fields and their evolution.
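To make the training idea concrete, here is a minimal sketch in generic JAX; NestyNet itself is not public here, so the tiny network and damped Newton loop below are illustrative assumptions, not its actual API. It fits a smooth network using exact gradients and Hessians from automatic differentiation.

```python
# Minimal sketch only: a stand-in for the second-order, exact-derivative
# training described above, using plain JAX rather than NestyNet's own API.
import jax
import jax.numpy as jnp
from jax.flatten_util import ravel_pytree

def net(params, x):
    w1, b1, w2, b2 = params
    h = jnp.tanh(x @ w1 + b1)        # smooth activations keep every derivative analytic
    return h @ w2 + b2

k1, k2 = jax.random.split(jax.random.PRNGKey(0))
params = (jax.random.normal(k1, (1, 8)), jnp.zeros(8),
          jax.random.normal(k2, (8, 1)), jnp.zeros(1))
flat, unravel = ravel_pytree(params)  # flatten so the Hessian is a single matrix

x = jnp.linspace(-1.0, 1.0, 64).reshape(-1, 1)
y = jnp.sin(3.0 * x)                  # toy one-dimensional "field" to fit

def loss(flat):
    return jnp.mean((net(unravel(flat), x) - y) ** 2)

grad_fn = jax.grad(loss)
hess_fn = jax.hessian(loss)           # exact Hessian, not a quasi-Newton approximation

for _ in range(25):                   # damped Newton (Levenberg-Marquardt-flavoured) steps
    g, H = grad_fn(flat), hess_fn(flat)
    flat = flat - jnp.linalg.solve(H + 1e-2 * jnp.eye(flat.size), g)

print("final MSE:", float(loss(flat)))
```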
I will demonstrate how such field representations of data can be iteratively distilled into compact symbolic equations and differential equations. The machinery can now rediscover textbook results, including challenging vector-valued and complex-valued differential equations (e.g. the Maxwell, Schrödinger, and MOND Poisson equations). This approach is a practical route toward explainable, robust models for the forthcoming data deluge, aimed less at “automating Kepler” than at accelerating analysis while keeping physical insight.
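To illustrate the distillation step, here is a minimal sketch assuming a SINDy/PDE-FIND-style sparse regression over a library of candidate terms (the talk's own machinery may well differ): it recovers the heat equation u_t = D u_xx from a sampled field and its derivatives, which in the pipeline above would be supplied analytically by the fitted network.

```python
# Minimal sketch only: sparse regression over candidate terms (SINDy/PDE-FIND
# style) as a stand-in for the symbolic distillation described above.
import numpy as np

# Synthetic heat-equation field u_t = D*u_xx with two modes; the exact
# derivatives below stand in for a fitted network's analytic derivatives.
D = 0.5
x = np.linspace(0.0, np.pi, 50)
t = np.linspace(0.1, 1.0, 40)
X, T = np.meshgrid(x, t)
u    = np.exp(-D * T) * np.sin(X) + np.exp(-4 * D * T) * np.sin(2 * X)
u_t  = -D * np.exp(-D * T) * np.sin(X) - 4 * D * np.exp(-4 * D * T) * np.sin(2 * X)
u_x  = np.exp(-D * T) * np.cos(X) + 2 * np.exp(-4 * D * T) * np.cos(2 * X)
u_xx = -np.exp(-D * T) * np.sin(X) - 4 * np.exp(-4 * D * T) * np.sin(2 * X)

# Library of candidate right-hand-side terms for u_t = f(u, u_x, u_xx, ...)
names = ["1", "u", "u_x", "u_xx", "u*u_x"]
Theta = np.stack([np.ones_like(u), u, u_x, u_xx, u * u_x], axis=-1).reshape(-1, 5)
target = u_t.ravel()

# Sequentially thresholded least squares: fit, prune small coefficients, refit
coef = np.linalg.lstsq(Theta, target, rcond=None)[0]
for _ in range(5):
    small = np.abs(coef) < 1e-3
    coef[small] = 0.0
    coef[~small] = np.linalg.lstsq(Theta[:, ~small], target, rcond=None)[0]

for n, c in zip(names, coef):
    if abs(c) > 1e-3:
        print(f"u_t = {c:+.3f} * {n}")   # expect a single surviving term: +0.500 * u_xx
```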

