News

The function H(X) in equation #1 is called the Shannon information entropy, which is maximum when the events x_j have equal probabilities of occurring, p_j.
But if every character is equally probable, Shannon's formula simplifies and becomes exactly the same as Boltzmann's formula for entropy. The physicist John von Neumann supposedly urged Shannon to ...
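That maximum has a simple closed form: with H(X) = -sum_j p_j log p_j and equal probabilities p_j = 1/W, the entropy reduces to H = log W, the same logarithm-of-the-number-of-states shape as Boltzmann's S = k_B ln W. A minimal Python sketch of the check (the distribution values are illustrative, not from the article):

    import math

    def shannon_entropy(probs, base=2):
        """Shannon information entropy H(X) = -sum_j p_j * log(p_j)."""
        return -sum(p * math.log(p, base) for p in probs if p > 0)

    # A uniform distribution over W outcomes gives the maximum H = log(W),
    # matching the form of Boltzmann's S = k_B ln W (up to the constant k_B
    # and the choice of logarithm base).
    W = 8
    uniform = [1 / W] * W
    skewed = [0.7, 0.1, 0.05, 0.05, 0.04, 0.03, 0.02, 0.01]

    print(shannon_entropy(uniform))  # 3.0 bits = log2(8)
    print(shannon_entropy(skewed))   # strictly less than 3.0 bits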
More information: Ginestra Bianconi, Gravity from entropy, Physical Review D (2025). DOI: 10.1103/PhysRevD.111.066001. Provided by Queen Mary, University of ...
The Lefschetz zeta function associated to a continuous self-map f of a compact manifold is a rational function P/Q. According to the parity of the degrees of the polynomials P and Q, we analyze when ...
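For background, the standard definition behind that statement (recalled here for context, not quoted from the abstract) builds the zeta function from the Lefschetz numbers L(f^m) of the iterates of f; rationality then lets it be written as a quotient of polynomials:

    % Lefschetz zeta function of a continuous self-map f of a compact manifold;
    % by standard rationality results it is a quotient of polynomials P and Q.
    \[
      \zeta_f(t) \;=\; \exp\!\left( \sum_{m=1}^{\infty} \frac{L(f^{m})}{m}\, t^{m} \right)
      \;=\; \frac{P(t)}{Q(t)} .
    \]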
Proper scoring rules derive from convex functions and relate to information measures, entropy functions, and Bregman divergences. In the case of categorical variables, we prove a rigorous version of ...
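One concrete instance of that link (an illustrative sketch, not taken from the paper): for the logarithmic scoring rule the associated entropy function is the Shannon entropy, and the expected score a forecaster gives up by reporting q instead of the true categorical distribution p is the Kullback-Leibler divergence, i.e. the Bregman divergence generated by negative entropy.

    import math

    def expected_log_score(p, q):
        """Expected logarithmic score when outcomes follow p and the forecast is q."""
        return sum(p_j * math.log(q_j) for p_j, q_j in zip(p, q) if p_j > 0)

    def kl_divergence(p, q):
        """KL(p || q): the Bregman divergence generated by negative Shannon entropy."""
        return sum(p_j * math.log(p_j / q_j) for p_j, q_j in zip(p, q) if p_j > 0)

    p = [0.6, 0.3, 0.1]    # "true" categorical distribution (illustrative values)
    q = [0.5, 0.25, 0.25]  # a competing forecast

    # Propriety: reporting p itself maximizes the expected score, and the
    # shortfall of any other forecast q equals KL(p || q).
    gap = expected_log_score(p, p) - expected_log_score(p, q)
    print(gap, kl_divergence(p, q))  # the two numbers coincide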
Entropy is surely one of the most intriguing and misunderstood concepts in all of physics. The entropy of the universe must always increase – so says the second law of thermodynamics. It’s a ...