GReCO Seminar |
« Information geometry, a gentle introduction » |
Jean-François Cardoso |
Statistical inference and modeling can be expressed in a geometrical framework where probabilistic models are seen as Riemannian manifolds of probability distributions. The natural (i.e. statistically motivated) metric on such manifolds is given by the Fisher information matrix. But what happens next, when one wants to understand curvature, is more intriguing: the natural connection capturing the most relevant statistical structures is not the Levi-Civita connection. This can be traced back to the fact that the `distance' between two probability distributions has good statistical reasons for *not* being symmetric. In this talk, I will briefly review some basic statistical notions and methods (the maximum likelihood principle, Fisher efficiency, the Kullback divergence a.k.a. relative entropy, exponential families, and a few others) and show how they fit nicely into a differential-geometric framework with a dual pair of connections. There will be a Pythagoras theorem and, hopefully, some fun. In spite of the above pompous wording, the talk will not be technical; it is instead intended to show how geometry can support statistical intuition. |
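The asymmetry mentioned above is easy to verify numerically. The following is a minimal sketch (not from the talk) computing the Kullback divergence D(p‖q) = Σᵢ pᵢ log(pᵢ/qᵢ) in both directions for two hypothetical discrete distributions, showing that the two directions generally disagree:

```python
import math

def kl(p, q):
    """Kullback divergence D(p || q) = sum_i p_i * log(p_i / q_i).

    Terms with p_i == 0 contribute 0 by the usual convention.
    """
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)

# Two example distributions (illustrative values, not from the talk)
p = [0.5, 0.5]
q = [0.9, 0.1]

d_pq = kl(p, q)  # D(p || q)
d_qp = kl(q, p)  # D(q || p)

print(f"D(p||q) = {d_pq:.4f}")
print(f"D(q||p) = {d_qp:.4f}")
# The two values differ: the divergence is not a symmetric distance.
```

Both directions are non-negative and vanish only when p = q, but they weight discrepancies differently, which is one statistical reason the `distance' resists symmetrization.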
Monday 15 May 2017 - 11:00, Salle des séminaires Évry Schatzman, Institut d'Astrophysique de Paris |
Seminar's webpage |