Research Interests

My research in Mathematics follows three main directions. This page lists the publications that I have co-authored.

In particular, my doctoral research with the Chair of Statistical Field Theory at EPFL concluded in 2021 with the defense of my doctoral thesis, the Introduction of which is available here.


Work in progress

Book: Gabriel F., Hongler C., Spadaro F., Lattice Models and Conformal Field Theory, (preliminary version)

This book is devoted to explaining the connection between Conformal Field Theories and lattice models, in light of the progress made in recent years. We present several two-dimensional lattice models (e.g. the Ising model, the FK-Ising model, and the Tricritical Ising model), their phase transitions, and their conjectural connection with CFT. Starting from the notion of lattice models and lattice local fields, the key ideas and results of two-dimensional Conformal Field Theory are introduced, with special emphasis on the unitary minimal models. We detail all the delicate proofs and ideas that lead to the classification of unitary minimal models. In particular, we carefully discuss the nature of the assumptions on lattice models upon which CFTs are built, thus arriving at a definition of CFTs with a probabilistic approach (rather than an axiomatic/algebraic one). These chapters also further detail and expand aspects of CFT that are not dealt with in depth in standard textbooks: in particular, we review in detail the essence of transformation laws at the continuum and at the discrete level, we propose a complete proof of the FQS non-unitarity theorem for the minimal models, and we give both recursive and closed formulas for correlation functions of the stress-tensor.
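For orientation, the family of unitary minimal models around which the book is organized is the standard one; its central charges and Kac weights read (textbook formulas, stated here independently of the book's conventions):

```latex
% Unitary minimal model M(m), m >= 3 (M(3) = Ising, M(4) = Tricritical Ising):
c(m) = 1 - \frac{6}{m(m+1)}, \qquad
h_{r,s}(m) = \frac{\bigl(r(m+1) - s\,m\bigr)^2 - 1}{4\,m(m+1)},
\quad 1 \le r \le m-1,\ 1 \le s \le m .
```

For m = 3 this gives c = 1/2 and the Ising weights 0, 1/2, 1/16; for m = 4 it gives c = 7/10, the Tricritical Ising model discussed below.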

Publications and Preprints

Conformal Field Theory and Lattice models

[9] Spadaro F., Complex Analysis meets Statistical Mechanics: Applications to Kernel Methods and Conformal Field Theory, (2021) (EPFL Infoscience)

[9] Spadaro F., Tricritical Ising semi-local CFT, (2021) (extract of Chapters 13 and 14 of my Ph.D. thesis)

We consider the second unitary minimal model M(4), with central charge c = 7/10, also known as the Tricritical Ising Model (TIM) CFT, which conjecturally describes the scaling limit of the Blume-Capel model at its tricritical point. We propose new formal results that can be derived by means of CFT methods, such as the stress-tensor and super stress-tensor correlation functions in the full plane (n-point) and in bounded domains with Dobrushin boundary conditions (2-point). Unlike the Ising case and the minimal model M(3), to this day we still lack rigorous results about the scaling limit of the discrete model; however, it is quite remarkable how, under the assumption of conformal invariance, which grants convergence to M(4), many aspects of the Blume-Capel model at the tricritical point can be understood (modulo some conjectures), thanks to the clear picture that we have of the Ising model. In fact, as in the Ising case, a more profound understanding of the theory is reached when one introduces fermions and extends the local theory to a semi-local one.
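For reference, the lattice model behind this conjecture is defined by the following Hamiltonian (a common convention; the normalization used in the thesis may differ):

```latex
% Blume-Capel model: spins take values in {-1, 0, +1};
% Delta is the crystal field, controlling the density of vacancies (s_i = 0).
H(\sigma) = -\sum_{\{i,j\}} \sigma_i \sigma_j + \Delta \sum_i \sigma_i^2 ,
\qquad \sigma_i \in \{-1, 0, +1\} .
```

The temperature and Δ (equivalently, the vacancy fugacity) span a two-dimensional phase diagram; the tricritical point, where the line of continuous transitions meets the first-order line, is the point conjecturally described by M(4).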

[5] Spadaro F., On the construction of discrete fermions in the FK-Ising model, (2020) (arXiv:2003.13305).

The Ising field theory is known to possess a free holomorphic fermion ψ, which as such is a non-local field. Whilst in many cases one has a rather natural intuition of how to represent a spinless bosonic field in terms of a lattice-model precursor, this is not the case for fields with spin. Yet in the case of the fermion ψ, one can understand it both in terms of the Ising lattice model, as a pair of spin and disorder operators, and in terms of the FK-Ising model, as a winding observable. We consider the latter case and construct its 2n-point correlation functions: we bypass discrete holomorphicity techniques and show that the observable per se exists at all temperatures, although a scaling limit exists only at criticality.
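Schematically, and only as orientation (conventions and the precise lattice definition used in [5] may differ), the winding observable mentioned above is of the Smirnov fermionic type:

```latex
% FK-Ising fermionic (winding) observable at an edge midpoint z:
% gamma(z) is the interface through z and W_{gamma(z)}(a,z) its winding from
% a marked boundary point a to z; the 1/2 is the spin of the fermion.
F(z) \;=\; \mathbb{E}\!\left[\, e^{-\frac{i}{2}\, W_{\gamma(z)}(a,z)}\;
\mathbf{1}_{\{z \in \gamma\}} \right].
```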

Random Symmetries of Stochastic Differential Equations

[8] Gaeta G., Kozlov R., Spadaro F., Asymptotic symmetry and asymptotic solutions to Ito stochastic differential equations, Mathematics in Engineering, Vol. 4, Issue 5: 1-52, (2022) (arXiv:2110.00670);

[3] Gaeta G., Spadaro F., Symmetry classification of scalar Ito equations with multiplicative noise, J. Nonlinear Math. Phys. 27, Issue 4, (2020) (arXiv:2002.05122);

[2] Gaeta G., Lunini C., Spadaro F., Recent advances in symmetry of stochastic differential equations, Rend. Mat. Appl. (7) 39 (2018) (arXiv:1901.05533);

[1] Gaeta G., Spadaro F., Random Lie-point symmetries of stochastic differential equations, J. Math. Phys. 58 (2017) (arXiv:1705.08873).

Symmetry analysis of differential equations is a powerful and by now standard tool in the study of nonlinear problems. Yet its use in the context of stochastic differential equations is comparatively much less developed. In [1] we study near-identity smooth (random) transformations of random fields and explicitly derive the determining equations for the symmetries. In [2] we focus on the strategy and methods for exploiting symmetry theory as an aid in determining solutions of stochastic (ordinary) differential equations. In [3] we use the theory of stochastic symmetry to provide a symmetry classification of scalar stochastic equations with multiplicative noise, a class of equations that plays a central role in Mathematical Biology and in particular in Population Dynamics. In [8] we consider several aspects of combining symmetry methods, including the method of invariants, with an asymptotic approach. In particular, we consider how to extend to the stochastic setting several ideas that are well established in the deterministic one, such as conditional, partial and asymptotic symmetries. A number of explicit examples are presented.
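To fix ideas, here is the simplest instance of these determining equations, written for a scalar Ito equation and for deterministic 'simple' symmetries (a standard computation, sketched here as background; the random symmetries of [1] generalize it by letting the generator depend on the Wiener process as well):

```latex
% Scalar Ito equation and near-identity transformation x -> x + eps*phi(x,t):
dx = f(x,t)\,dt + \sigma(x,t)\,dW_t .
% Requiring, to first order in eps and via Ito's formula, that the transformed
% process solves the same equation gives the determining equations
\partial_t\varphi + f\,\partial_x\varphi - \varphi\,\partial_x f
  + \tfrac{1}{2}\,\sigma^2\,\partial_x^2\varphi = 0 ,
\qquad
\sigma\,\partial_x\varphi - \varphi\,\partial_x\sigma = 0 .
% Example: geometric Brownian motion (f = a x, sigma = s x) admits the scaling
% symmetry phi(x,t) = x, since both equations reduce to 0 = 0.
```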

Theoretical Machine Learning

[7] Şimşek B., Ged F., Spadaro F., Jacot A., Hongler C., Brea J., Gerstner W., Geometry of the Loss Landscape in Overparameterized Neural Networks: Symmetries and Invariances, Proceedings of the 38th International Conference on Machine Learning, Online, PMLR 139, (2021) (arXiv:2105.12221).


Starting from the general wisdom that critical points influence gradient descent trajectories, in [7] we study how permutation symmetries in over-parameterized multi-layer neural networks generate 'symmetry-induced' critical points. Assuming a minimal network with L layers of widths r_1^*, ..., r_{L-1}^* reaches a unique (up to permutations) zero-loss minimum on some learning task, we show that adding one extra neuron to each layer is sufficient to connect all the r_1^*! ··· r_{L-1}^*! previously discrete minima into a single manifold. For a two-layer network with an overparameterization of r^* + n =: m neurons we explicitly describe the global minima manifold: it consists of T(r^*, m) affine sub-spaces of dimension at least n that are connected to each other. Moreover, for a network of width m, we identify the number G(r, m) of affine sub-spaces containing only symmetry-induced critical points that are related to critical points of a smaller network of width r < r^*. Via a combinatorial analysis, we derive closed-form formulas for T and G and show that the number G of symmetry-induced critical sub-spaces dominates the number T of affine sub-spaces that form the global minima manifold in the mildly over-parameterized regime (small n), and vice versa in the excessively over-parameterized regime (n >> r^*).
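The permutation symmetry at the heart of [7] is elementary to see in code. The following minimal numpy sketch (purely illustrative, not the construction of the paper) checks that permuting the hidden neurons of a two-layer network, together with the corresponding outgoing weights, leaves the network function, and hence the loss, unchanged:

```python
import numpy as np

rng = np.random.default_rng(0)

def two_layer(x, W, b, a):
    """f(x) = sum_i a_i * tanh(w_i . x + b_i) for a hidden layer of width m."""
    return np.tanh(x @ W.T + b) @ a

d, m, n_pts = 3, 5, 10           # input dimension, hidden width, test points
W = rng.normal(size=(m, d))      # hidden weights
b = rng.normal(size=m)           # hidden biases
a = rng.normal(size=m)           # outgoing weights
x = rng.normal(size=(n_pts, d))  # a few test inputs

perm = rng.permutation(m)        # any of the m! relabelings of the neurons
out_original = two_layer(x, W, b, a)
out_permuted = two_layer(x, W[perm], b[perm], a[perm])

# Different points in parameter space, identical function values:
# every critical point thus comes in a whole permutation orbit.
assert np.allclose(out_original, out_permuted)
print("permuted network agrees with the original on all test points")
```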

[9] Spadaro F., Complex Analysis meets Statistical Mechanics: Applications to Kernel Methods and Conformal Field Theory, (2021) (EPFL Infoscience)

[6] Jacot A., Şimşek B., Spadaro F., Gabriel F., Hongler C., Kernel Alignment Risk Estimator: Risk Prediction from Training Data, Advances in Neural Information Processing Systems 33, (2020) (arXiv:2006.09796; poster presented at NeurIPS 2020);

[4] Jacot A., Şimşek B., Spadaro F., Gabriel F., Hongler C., Implicit regularization of Random Feature Models, Proceedings of the 37th International Conference on Machine Learning, Online, PMLR 119, (2020) (arXiv:2002.08404).

Berfin's cool blog post on Implicit Regularization.

Random Feature (RF) models are used as efficient parametric approximations of kernel methods. In [4] we investigate, by means of random matrix theory, the connection between Gaussian RF models and Kernel Ridge Regression (KRR). For a Gaussian RF model with P features, N data points, and a ridge λ, we show that the average (i.e. expected) RF predictor is close to a KRR predictor with an effective ridge µ, and that finite RF sampling has the effect of an implicit regularization. Equivalently, suppose one wants to approximate a kernel method with ridge µ by means of random features with ridge λ: unsurprisingly, taking infinitely many random features with λ = µ approximates the kernel method well; what we show is that one can also approximate the kernel method sufficiently well with a finite number of features, provided the ridge λ is suitably tuned with respect to µ, and we show how to characterize this tuning. We then compare the risk (i.e. test error) of the µ-KRR predictor with the average risk of the λ-RF predictor and obtain a precise and explicit bound on their difference. Finally, we empirically find an extremely good agreement between the test errors of the average λ-RF predictor and the µ-KRR predictor.

In [6] we continue the study of kernel methods in a more general setting (allowing, in particular, infinite-dimensional settings). For this, we introduce two objects: the Signal Capture Threshold (SCT), a suitable generalization of the effective ridge µ of [4], and the Kernel Alignment Risk Estimator (KARE). The SCT ϑ(K,λ) is a function of the data distribution: it can be used to identify the components of the data that the KRR predictor captures, and to approximate the (expected) KRR risk. This then leads to an approximation of the KRR risk by the KARE ρ(K,λ), an explicit function of the training data, agnostic of the true data distribution. We phrase the regression problem in a functional setting; the key results then follow from a finite-size analysis of the Stieltjes transform of general Wishart random matrices. Under a natural universality assumption (that the KRR moments depend asymptotically only on the first two moments of the observations), we capture the mean and variance of the KRR predictor. We numerically investigate our findings on the Higgs and MNIST datasets for various classical kernels: the KARE gives an excellent approximation of the risk, thus supporting our universality assumption. Using the KARE, one can compare choices of kernels and hyper-parameters directly from the training set. The KARE thus provides a promising data-dependent procedure to select kernels that generalize well.
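To see the implicit-regularization effect of [4] numerically, here is a self-contained sketch (simplified conventions; helper names like rf_predict are ours, and the grid search stands in for the closed-form effective ridge of the paper): the average over many draws of a Gaussian random-feature predictor with ridge λ is well matched by a KRR predictor with a larger effective ridge.

```python
import numpy as np

rng = np.random.default_rng(1)

def rbf_kernel(X, Y, bw=1.0):
    d2 = ((X[:, None, :] - Y[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2 * bw**2))

# Toy regression data.
N, N_test, d = 40, 200, 3
X, X_test = rng.normal(size=(N, d)), rng.normal(size=(N_test, d))
y = np.sin(X[:, 0]) + 0.1 * rng.normal(size=N)

# Gaussian features: i.i.d. centered Gaussian processes with covariance K,
# sampled jointly on the train and test points.
X_all = np.vstack([X, X_test])
K_all = rbf_kernel(X_all, X_all)
evals, evecs = np.linalg.eigh(K_all)
L = evecs * np.sqrt(np.clip(evals, 0.0, None))   # L @ L.T = K_all

def krr_predict(ridge):
    alpha = np.linalg.solve(K_all[:N, :N] + N * ridge * np.eye(N), y)
    return K_all[N:, :N] @ alpha

def rf_predict(P, ridge):
    """One draw of a P-feature Gaussian RF predictor with ridge `ridge`."""
    F_all = (L @ rng.normal(size=(len(X_all), P))) / np.sqrt(P)
    F, F_test = F_all[:N], F_all[N:]
    theta = np.linalg.solve(F.T @ F + N * ridge * np.eye(P), F.T @ y)
    return F_test @ theta

lam = 1e-3
avg_rf = np.mean([rf_predict(P=20, ridge=lam) for _ in range(300)], axis=0)

# Grid-search the KRR ridge that best matches the *average* RF predictor.
grid = np.logspace(-4, 0, 60)
mu_eff = grid[np.argmin([np.mean((krr_predict(mu) - avg_rf) ** 2) for mu in grid])]
print(f"lambda = {lam:.1e}  ->  best-matching effective ridge mu ~ {mu_eff:.1e}")
# With few features (here P < N), the best match typically sits at mu > lambda:
# finite feature sampling acts as extra, implicit regularization.
```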

Simulations

Blume-Capel Model and Tricritical Ising

With this JavaScript applet one can play with the different phases of the Blume-Capel model on a square lattice.

Click on the phase diagram to change temperature and fugacity in the simulation.
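For readers who prefer to tinker offline, here is a minimal Python sketch of the same dynamics (single-site Metropolis updates for the Blume-Capel Hamiltonian with unit coupling; the temperature/crystal-field parametrization is a common convention and may differ from the applet's):

```python
import numpy as np

rng = np.random.default_rng(0)

def blume_capel_metropolis(L=32, beta=0.8, delta=0.5, sweeps=50):
    """Single-site Metropolis for H = -sum_<ij> s_i s_j + delta * sum_i s_i^2,
    with spins s_i in {-1, 0, +1} on an L x L torus."""
    s = rng.integers(-1, 2, size=(L, L))          # random initial configuration
    for _ in range(sweeps * L * L):
        i, j = rng.integers(0, L, size=2)
        new = rng.integers(-1, 2)                 # proposed new value of the spin
        nb = s[(i + 1) % L, j] + s[(i - 1) % L, j] + s[i, (j + 1) % L] + s[i, (j - 1) % L]
        dE = -(new - s[i, j]) * nb + delta * (new**2 - s[i, j] ** 2)
        if dE <= 0 or rng.random() < np.exp(-beta * dE):
            s[i, j] = new
    return s

config = blume_capel_metropolis()
print("magnetization per site:", config.mean(), " vacancy density:", (config == 0).mean())
```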

Discrete Gaussian Free Field

The following Mathematica code simulates a discrete Gaussian Free Field.

It is the code that appears in Scott Sheffield's Gaussian free fields for mathematicians (see https://arxiv.org/abs/math/0312099), and it is based on sampling the discrete Fourier modes of the dGFF.
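A Python rendition of the same idea, for one common normalization of the dGFF (zero boundary values, covariance given by the inverse Dirichlet Laplacian); each sine mode receives an independent Gaussian coefficient of variance one over its Laplacian eigenvalue:

```python
import numpy as np

def sample_dgff(n=64, rng=np.random.default_rng(0)):
    """Sample a discrete Gaussian Free Field on the (n-1) x (n-1) interior
    of an n x n box, with zero (Dirichlet) boundary conditions."""
    j = np.arange(1, n)                          # mode indices 1, ..., n-1
    mu = 4 * np.sin(np.pi * j / (2 * n)) ** 2    # 1D Dirichlet Laplacian eigenvalues
    lam = mu[:, None] + mu[None, :]              # 2D eigenvalues lambda_{jk}
    coeffs = rng.normal(size=(n - 1, n - 1)) / np.sqrt(lam)
    # Orthonormal sine eigenvectors f_j(x) = sqrt(2/n) * sin(pi*j*x/n).
    S = np.sqrt(2.0 / n) * np.sin(np.pi * np.outer(j, np.arange(1, n)) / n)
    return S.T @ coeffs @ S                      # h(x, y) on the interior vertices

h = sample_dgff(64)
print("field shape:", h.shape, " value at the centre:", h[31, 31])
```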

Simple Random Walk and its scaling

Here is a small JavaScript program to convince oneself pictorially that the diffusive scaling x ~ √t is the right scaling for Brownian motion. It simply simulates a simple random walk and zooms out accordingly. The parabola is then the 'good frame' for the trace of the BM. Compare also with the law of the iterated logarithm.
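The same point can also be checked numerically rather than pictorially (a small sketch, unrelated to the applet's code): under diffusive rescaling the quantity E|S_n|/√n stabilizes at √(2/π), which is E|B_1| for a standard Brownian motion.

```python
import numpy as np

rng = np.random.default_rng(0)

def mean_abs_endpoint(n, trials=4000):
    """Estimate E|S_n| / sqrt(n) for a simple random walk of n steps of +/-1."""
    steps = 2 * rng.integers(0, 2, size=(trials, n), dtype=np.int8) - 1
    return np.abs(steps.sum(axis=1, dtype=np.int64)).mean() / np.sqrt(n)

for n in (100, 1000, 10000):
    print(f"n = {n:6d}   E|S_n| / sqrt(n) ~ {mean_abs_endpoint(n):.3f}")

# Limiting value for Brownian motion: E|B_1| = sqrt(2/pi) ~ 0.798.
# It is exactly the space scaling by sqrt(t) that makes these numbers stabilize.
print("sqrt(2/pi) =", np.sqrt(2 / np.pi))
```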

Schramm Loewner Evolution

Simulation of SLE(k) flow line (Laurie Field)

Simulation of CLE(3) carpet (David Wilson)