A substantial part of the seminar will be devoted to developing the mathematical foundations of classical models from statistical physics such as Gibbsian systems, bootstrap percolation, and random processes with reinforcement. In the final talks, we discuss more recent articles adapting these models to the features characteristic of neural networks.
Target audience. Master Mathematics, Master Financial and Insurance Mathematics, TMP.
Prerequisite. An introductory course in probability theory.

[1] Aizenman, M.; Lebowitz, J. L. Metastability effects in bootstrap percolation. Journal of Physics A: Mathematical and General 21 (1988), 3801--3813.
[2] Einarsson, H.; Mousset, F.; Lengler, J.; Panagiotou, K.; Steger, A. Bootstrap percolation with inhibition. arXiv:1410.3291.
[3] Møller, J. A review of perfect simulation in stochastic geometry. In: Selected Proceedings of the Symposium on Inference for Stochastic Processes, IMS Lecture Notes Monogr. Ser. 37, 333--355.
[4] Ferrari, P. A.; Fernández, R.; Garcia, N. L. Perfect simulation for interacting point processes, loss networks and Ising models. Stochastic Processes and their Applications 102 (2002), 63--88.
[5] Galves, A.; Löcherbach, E. Infinite systems of interacting chains with memory of variable length -- a stochastic model for biological neural nets. J. Stat. Phys. 151 (2013), 896--921.
[6] Bovier, A. A short course in mean field spin glasses. In: Spin Glasses: Statics and Dynamics, Progress in Probability, vol. 62.
[7] Bovier, A.; Gayrard, V. Rigorous bounds on the storage capacity of the dilute Hopfield model. J. Stat. Phys. 69 (1992), 597--627.
[8] Benaïm, M. Dynamics of stochastic approximation algorithms. In: Séminaire de Probabilités XXXIII, Lecture Notes in Mathematics, vol. 1709, 1--68.