Applications of proper score and sufficiency

Peter Harremoës

Information divergence (KL divergence, quantum relative entropy) appears in statistics, statistical mechanics, information theory, and portfolio theory. One often talks about applications of information theory, but we will demonstrate that there is a common structure that is shared exactly across these fields. This common structure is related to optimization and leads to proper scoring rules and Bregman divergences. If we place a sufficiency condition on top of this, we are led to information divergence and the logarithmic score; but contrary to the first part of the construction, this second step is often only valid under special conditions that may or may not be fulfilled in practice.
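As a minimal numerical illustration (not taken from the paper itself, and using hypothetical function names), the following sketch checks two facts connecting the notions above: the Bregman divergence generated by negative Shannon entropy coincides with information divergence, and the logarithmic score is proper because its expected-score gap equals the (nonnegative) information divergence.

```python
import numpy as np

def neg_entropy(p):
    # Convex generator F(p) = sum_i p_i * log(p_i)  (negative Shannon entropy)
    return float(np.sum(p * np.log(p)))

def bregman_divergence(p, q):
    # D_F(p, q) = F(p) - F(q) - <grad F(q), p - q>
    grad_q = np.log(q) + 1.0
    return neg_entropy(p) - neg_entropy(q) - float(np.dot(grad_q, p - q))

def information_divergence(p, q):
    # D(p || q) = sum_i p_i * log(p_i / q_i)  (KL divergence)
    return float(np.sum(p * np.log(p / q)))

p = np.array([0.2, 0.3, 0.5])
q = np.array([0.4, 0.4, 0.2])

# The Bregman divergence generated by negative entropy equals KL divergence
# (the linear correction terms cancel because both p and q sum to 1).
print(bregman_divergence(p, q), information_divergence(p, q))

# Propriety of the logarithmic score: the expected score E_p[-log q(X)]
# exceeds E_p[-log p(X)] by exactly D(p || q) >= 0, so the honest report
# q = p minimizes the expected score.
gap = float(np.sum(p * -np.log(q)) - np.sum(p * -np.log(p)))
print(gap)
```

The second printed value equals the first pair, confirming that the penalty for reporting q when p is true is the information divergence itself.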