Homepage of Peter Harremoës (Entro-Peter)

Midgårdsormen

E-mail: harremoes@ieee.org

Mobile: +45 27 82 41 71

Skype: peterharremoes

Address:
Rønne Allé 1, st.
2860 Søborg
Denmark
Tel.:
+45 39 56 41 71


Research news

At IGAIA IV and MaxEnt 2016 I presented results that strongly indicate that information theory is only possible in mathematical structures that can be represented on Jordan algebras. There are five basic types of Jordan algebras, and density matrices with complex entries form the most important type, in the sense that quantum information theory is normally represented on this type of algebra. I expect that many results from quantum information theory can be generalized to Jordan algebras. The new results can be found in arXiv:1607.02259, and some related results can be found in Proper Scoring and Sufficiency and in arXiv:1601.07593.

I have finished a long paper with various bounds on tail probabilities in terms of the signed log-likelihood function (arXiv:1601.05179). Although the inequalities are quite sharp, I am sure even sharper inequalities of these types can be obtained. I think the results can be generalized to cover all exponential families with simple variance functions. Even more general inequalities may exist, but at the moment I do not know how to attack the general problem.

In the paper arXiv:1601.04255 we obtain a lower bound on the rate of convergence in the law of small numbers that seems to be asymptotically tight.

Lattice theory of causation I am working on a larger project in which I investigate to what extent concepts related to cause and effect can be studied using lattice theory. Last summer I presented some derived results related to Lattices with non-Shannon Inequalities. In January and February I visited the University of Hawai'i, where several lattice experts are based, and now I am trying to put all the results together.

Link to my page on ResearchGate

List of Publications

ORCID: 0000-0002-0441-6690

arXiv

Research

My research is centered on information theory. One of my interests is how to use ideas from information theory to derive results in probability theory. Many of the most important results in probability theory are convergence theorems, and many of these can be reformulated to state that the entropy of a system increases to a maximum or that a divergence converges to a minimum. These ideas are also relevant in the theory of statistical tests. Recently I have formalized a method for deriving Jeffreys prior as the optimal prior using the minimum description length principle.
I am also interested in quantum information theory, and I think that information theory sheds new light on the problems of the foundations of quantum mechanics. In a sense, the distinction between matter and information about matter disappears at the quantum level. Combining this idea with group representations should be a key to a better understanding of quantum theory.
I have also worked on the relation between Bayesian networks and irreversibility, and my ultimate goal is to build a bridge between these ideas and information theory. I am working on a new theory in which methods from lattice theory are used. I think lattices of functional dependence will provide a more transparent framework for describing causation. Hopefully it will lead to better algorithms for detecting causal relationships, but the most important application might be in our description of quantum systems, where we know that our usual notion of causation breaks down.

Editorial work

I am an Associate Editor of the journal IEEE Transactions on Information Theory, covering probability and statistics. I am an editorial board member of the journal Entropy. I was Editor-in-Chief of Entropy from 2007 to 2011 and an editor from 2011 to 2015.

I serve as a reviewer for numerous other journals, as can be seen from my newly created Publons profile.

Events

Quantum Lunch Talk at Math. Dept. Univ. Copenhagen 6/1 2016 I will give a talk entitled "Determinism and Causality. Which aspects are shared between classical physics and quantum physics?"
I will be visiting University of Hawai'i Jan.-Feb. 2016.
I will give a lecture series on Bregman divergences. The first will be held Thursday 21/1 in Holmes Hall. See flyer for detailed information.
IGAIA IV I have been invited to give a talk at the conference Information Geometry and its Applications IV, to be held on June 1-17, 2016 at Liblice Castle, Czech Republic. The title of my talk will be announced later.
MaxEnt 2016 36th International Workshop on Bayesian Inference and Maximum Entropy Methods in Science and Engineering. 10-15 July 2016, Ghent, Belgium.
ISIT 2016 10-15/7 2016, Barcelona, Spain.
Beyond IID 4 18-22/7 2016, Barcelona, Spain. This is a satellite conference to ISIT.
WITMSE 2016 19-21/9 Helsinki.

Links

I have made a BibTeX file with a lot of items related to information theory.

Information Theory Society I am a senior member of IEEE, and the Information Theory Society is a part of this organization. The page has many useful links related to information theory.
Danish Climbing Federation I am an active climber and have served as president of this organization.
Entropy The journal for which I was Editor-in-Chief.
Ideal scientific equipment A less useful link.
Minimum Description Length on the web This page is devoted to MDL and its applications. It contains links to articles and people working in the field.
Teaching portfolio Here I have collected various material relevant for my teaching activities.
The cross entropy method

Software

Here are links to some of my favorite software.

Dropbox Very convenient for storing and sharing documents.
GeoGebra Software to create mathematical figures. It can output an image as pgf code.
JabRef Program that helps you manage your BibTeX database.
LyX User friendly interface to LaTeX.
PDF-XChange Great for annotating PDF documents. The free version is good, and the professional version is even better.
R Great for statistical programming. With Sweave, R code can be embedded directly into LaTeX documents.
TikzEdt Editor for TikZ/pgf code.
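As an illustration of the Sweave workflow mentioned above, here is a minimal sketch of a `.Rnw` file; the document body and the chunk contents are hypothetical, but the chunk delimiters and `\Sexpr` syntax are standard Sweave.

```latex
\documentclass{article}
\begin{document}

% An R code chunk: everything between <<>>= and @ is executed by R,
% and the code together with its output is typeset in the document.
<<echo=TRUE>>=
x <- rnorm(100)
mean(x)
@

% \Sexpr{} inserts the value of an R expression inline in the text.
The sample mean of the simulated data is \Sexpr{round(mean(x), 3)}.

\end{document}
```

The file is processed with `R CMD Sweave file.Rnw`, which produces a plain `file.tex` that can then be compiled with LaTeX as usual.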

Colleagues

Here are links to some of the people with whom I collaborate.

Andrew Barron
Christophe Vignat
Flemming Topsøe
František Matúš
Gábor Tusnády
Ioannis Kontoyiannis
László Györfi
Łukasz Dębowski
Narayana Prasad Santhanam
Oliver Johnson
Peter Grünwald
Tim van Erven