Fundamental Information Geometry Problems: Theory and Applications

Joseph A. (Jody) O'Sullivan

Sachs Professor of Electrical Engineering

Washington University in St. Louis

The negative log-likelihood function in many estimation theory
problems can be written in terms of a relative entropy, or
I-divergence, between two distributions: the first constrained to a
linear family (also referred to as an m-flat manifold) and the second
to an exponential (e-flat, or log-linear) family. Considering only
this term, many such problems reduce to a double minimization over
distributions from these two families.
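As a concrete handle on the quantity being minimized, here is a minimal numerical sketch of the I-divergence for nonnegative vectors (the function name and the example vectors are illustrative, not from the talk):

```python
import numpy as np

def i_divergence(p, q):
    """Csiszar I-divergence between nonnegative vectors p and q.

    I(p || q) = sum_i [ p_i * log(p_i / q_i) - p_i + q_i ];
    it reduces to the Kullback-Leibler relative entropy when
    p and q are probability distributions (each summing to one).
    """
    p = np.asarray(p, dtype=float)
    q = np.asarray(q, dtype=float)
    mask = p > 0  # terms with p_i = 0 contribute only +q_i
    return float((p[mask] * np.log(p[mask] / q[mask])).sum()
                 - p.sum() + q.sum())

p = np.array([0.2, 0.3, 0.5])
q = np.array([0.25, 0.25, 0.5])
d = i_divergence(p, q)  # nonnegative; zero only when p == q
```

The double minimization alternates between holding one argument fixed and minimizing over the other, with each family providing one of the two arguments.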
Some basic properties of this minimization are presented,
including computational issues. Model order estimation
involves, in part, consideration of nested families of exponential
distributions. The I-divergence can be decomposed into
estimation and approximation error terms for such nested
families. These problems arise naturally in many Poisson
data models, including emission tomography and optical imaging, in
x-ray imaging, and in several emerging imaging problems.
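As background for the decomposition mentioned above, one common form of the information-geometric Pythagorean identity (stated here as a sketch, not verbatim from the talk) is

    D(p || q) = D(p || p*) + D(p* || q),

where p* is the I-projection of p onto the exponential family containing q; for nested families, the first term plays the role of an approximation error and the second that of an estimation error.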
Each of these problems is introduced and example images are presented.
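For Poisson data models of the kind named above, the alternating-minimization view recovers the familiar ML-EM (Richardson-Lucy) update, whose iterations decrease the I-divergence between the data and the model mean. A minimal sketch, assuming a linear Poisson model y ~ Poisson(H x) (the system matrix and sizes below are toy values for illustration):

```python
import numpy as np

def ml_em(y, H, n_iter=500):
    """ML-EM / Richardson-Lucy iterations for y ~ Poisson(H @ x).

    The multiplicative update keeps the estimate nonnegative and
    monotonically decreases I(y || H @ x).
    """
    x = np.ones(H.shape[1])       # flat nonnegative initialization
    sens = H.sum(axis=0)          # per-component sensitivity
    for _ in range(n_iter):
        ratio = y / (H @ x)       # data over current model mean
        x *= (H.T @ ratio) / sens # multiplicative EM update
    return x

# Toy system: 3 detectors, 2 sources, noiseless data.
H = np.array([[1.0, 0.2],
              [0.3, 1.0],
              [0.5, 0.5]])
x_true = np.array([2.0, 3.0])
y = H @ x_true
x_hat = ml_em(y, H)
```

With noiseless data and a full-column-rank system, the iterations drive the model mean H @ x_hat toward y, so x_hat approaches x_true.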