Communications and Signal Processing Seminar
Information Geometry: Geometrizing Statistical Inference
Information geometry is the differential-geometric study of the manifold of probability models, in which each probability distribution is a point on the manifold. Instead of a metric for measuring distances on such manifolds, one typically uses "divergence functions," which measure the proximity of two points without imposing symmetry or the triangle inequality; examples include the Kullback-Leibler divergence, Bregman divergences, and f-divergences. Divergence functions are tied to the generalized entropy (for instance, Tsallis entropy, Rényi entropy, phi-entropy) and cross-entropy functions widely used in machine learning and the information sciences. After a brief introduction to information geometry, I illustrate the geometry of maximum entropy inference and the exponential family. I then consider a general form of entropy/cross-entropy/divergence function and show how the geometry of the underlying probability manifold (the deformed exponential family) reveals an "escort statistics" that is hidden from the standard exponential family.
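For reference (this illustration is not part of the original abstract), two of the divergence functions mentioned above can be written as follows, where phi denotes a strictly convex, differentiable function:

\[
D_{\mathrm{KL}}(p \,\|\, q) = \sum_x p(x)\,\log\frac{p(x)}{q(x)},
\qquad
D_\phi(x, y) = \phi(x) - \phi(y) - \langle \nabla\phi(y),\, x - y \rangle .
\]

Neither expression is symmetric in its two arguments, and the Kullback-Leibler divergence arises as the Bregman divergence generated by the negative Shannon entropy on the probability simplex.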
Jun Zhang, PhD, is Professor of Mathematics and Psychology in the College of Literature, Science, and the Arts at the University of Michigan, Ann Arbor.
Prof. Zhang develops algebraic and geometric methods for data analysis. Algebraic methods are based on theories of topology and partially ordered sets (in particular, lattice theory); an example is formal concept analysis (FCA). Geometric methods include information geometry, which studies the manifold of probability density functions. His interests include mathematical psychology and computational neuroscience, broadly defined to include neural network theory and reinforcement learning, dynamical analysis of the nervous system (single-neuron activity and event-related potentials), computational vision, choice-reaction time models, Bayesian decision theory, and game theory.