E-mail: harremoes@ieee.org Mobile: +45 27 82 41 71 Skype: peterharremoes Address:
Research news

At ISIT I will give a talk entitled Information Theory on Spectral Sets. The results strongly indicate that information theory is only possible in mathematical structures that can be represented on Jordan algebras. There are five basic types of Jordan algebras, and density matrices with complex entries are the most important type in the sense that quantum information theory is normally represented on this type of algebra. I expect that many results from quantum information theory can be generalized to Jordan algebras. This work follows up on a paper entitled Divergence and Sufficiency for Convex Optimization that I recently published in Entropy.

I published a long paper in Kybernetika with various bounds on tail probabilities in terms of the signed log-likelihood function. It has just been awarded best paper in Kybernetika 2016. Although the inequalities are quite sharp, I am sure even sharper inequalities of these types can be obtained. I think the results can be generalized to cover all exponential families with simple variance functions. Even more general inequalities may exist, but at the moment I do not know how to attack the general problem.

In the paper arXiv:1601.04255 we obtain a lower bound on the rate of convergence in the law of small numbers that seems to be asymptotically tight.

Lattice theory of causation: I am working on a larger project where I want to see to what extent concepts related to cause and effect can be studied using lattice theory. In 2015 I presented some derived results related to Lattices with non-Shannon Inequalities. In January and February 2016 I was visiting the University of Hawai'i, where several lattice experts are situated, and now I am trying to put all the results together.
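The law of small numbers mentioned above concerns how fast the binomial distribution approaches the Poisson distribution. As a minimal numerical sketch (an illustration only, not the method of the paper), one can compute the informational divergence D(Bin(n, λ/n) ‖ Po(λ)) directly and watch it shrink as n grows:

```python
from math import comb, exp, factorial, log

def binom_pmf(n, p, k):
    # probability that Bin(n, p) equals k
    return comb(n, k) * p**k * (1 - p)**(n - k)

def poisson_pmf(lam, k):
    # probability that Po(lam) equals k
    return exp(-lam) * lam**k / factorial(k)

def kl_binomial_poisson(n, lam):
    """Informational divergence D( Bin(n, lam/n) || Po(lam) ) in nats."""
    p = lam / n
    d = 0.0
    for k in range(n + 1):  # the binomial is supported on 0..n
        b = binom_pmf(n, p, k)
        if b > 0.0:
            d += b * log(b / poisson_pmf(lam, k))
    return d

lam = 1.0
divs = [kl_binomial_poisson(n, lam) for n in (5, 10, 20, 40, 80)]
# the divergence decreases toward 0 as n grows, at a rate of order 1/n^2
```

The exact rate of convergence (and the lower bound of the paper) is of course a sharper statement than this finite table of values.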
Research

My research is centered on information theory. One of my interests is how to use ideas from information theory to derive results in probability theory. Many of the most important results in probability theory are convergence theorems, and many of these convergence theorems can be reformulated so that they state that the entropy of a system increases to a maximum or that a divergence converges to a minimum. These ideas are also relevant in the theory of statistical tests. Recently I have formalized a method for deriving Jeffreys prior as the optimal prior using the minimum description length principle.

Editorial work

I am associate editor of the journal IEEE Transactions on Information Theory, covering probability and statistics. I am an editorial board member of the journals Entropy and Stats. I was Editor-in-Chief of Entropy 2007-2011 and editor 2011-2015. This journal was a pioneer in open access publishing. I serve as reviewer for numerous other journals, as can be seen from my newly created Publons profile.
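The remark on Jeffreys prior in the research section above has a simple concrete case. For the Bernoulli model the Fisher information is I(p) = 1/(p(1-p)), and the Jeffreys prior, proportional to the square root of the Fisher information, normalizes to the Beta(1/2, 1/2) distribution with normalizing constant π. A minimal numerical sketch (an illustration, not the MDL derivation itself):

```python
from math import pi, sqrt

def fisher_info_bernoulli(p):
    # Fisher information of a single Bernoulli(p) observation
    return 1.0 / (p * (1.0 - p))

# Jeffreys prior is proportional to sqrt(I(p)); integrate over (0, 1)
# with a midpoint rule (which avoids the endpoint singularities)
# to recover the normalizing constant.
N = 200_000
h = 1.0 / N
total = sum(sqrt(fisher_info_bernoulli((i + 0.5) * h)) * h for i in range(N))
# total is close to pi, so the normalized Jeffreys prior is Beta(1/2, 1/2):
# (1/pi) * p**(-1/2) * (1 - p)**(-1/2)
```

The midpoint rule converges slowly near the integrable singularities at 0 and 1, but it is enough to confirm the normalizing constant numerically.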
Events

I was visiting the University of Hawai'i Jan.-Feb. 2016. I gave a lecture series on Bregman divergences; the first lecture was held Thursday 21/1 in Holmes Hall. See the flyer for detailed information.
Quantum Lunch 21/12 2016.
Entropy Power Inequalities at AIM, San José, May 1st-5th, 2017.
ISIT 2017, June, Aachen, Germany. I will give a presentation entitled Quantum Information on Spectral Sets.
WITMSE 2017, Paris, 11-13/9. I gave an invited talk.
International Zürich Seminar on Information and Communication (IZS) 2018, 21-23/2. I will give an invited talk entitled Horizon Independent MDL.
From Physics to Information Sciences and Geometry, May 14-16, Barcelona, Spain (or Catalonia?).
ISIT 2018, June 17-22, Vail, USA.
Links

I have made a BibTeX file with a lot of items related to information theory.
Information Theory Society: I am a senior member of the IEEE, and the Information Theory Society is a part of this organization. The page has a lot of useful links related to information theory.

Software

Here are links to some of my favorite software. Dropbox: very convenient for storing and sharing documents.

Colleagues

Here are links to some people that I collaborate with. Andrew Barron