Abstract
The laws of statistical thermodynamics are used to define entropy, and it is shown that the definition of information can be reduced to a problem in Fermi-Dirac statistics or in a generalized Fermi statistics. With these definitions, the entropy of a given message can be defined, and the information contained in the message can be connected directly with the corresponding decrease of entropy in the physical system. This definition leads directly to the formulas proposed by C. E. Shannon for the measure of information, and shows that Shannon's "entropy of information" corresponds to an equal amount of negative entropy in the physical system. The physical background of the whole method is discussed and found to agree with previous discussions.
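The Shannon formula referred to above is the standard measure of information, H = -Σ p_i log₂ p_i. A minimal sketch of this formula, together with the conversion of one bit into the equivalent negative entropy k ln 2 (the physical correspondence the abstract asserts), might look as follows; the function names are illustrative, not from the paper:

```python
import math

# Boltzmann's constant in J/K (2019 SI exact value)
K_B = 1.380649e-23

def shannon_entropy(probs):
    """Shannon's measure of information, H = -sum p_i * log2(p_i), in bits.

    Terms with p == 0 contribute nothing (lim p->0 of p log p is 0).
    """
    return -sum(p * math.log2(p) for p in probs if p > 0)

def negentropy_joules_per_kelvin(bits):
    """Physical negative entropy equivalent to `bits` of information:
    each bit corresponds to k * ln(2) ~ 9.57e-24 J/K."""
    return bits * K_B * math.log(2)

# A fair coin (two equiprobable outcomes) carries one bit per outcome,
# while a certain outcome carries none:
h_coin = shannon_entropy([0.5, 0.5])   # 1.0 bit
h_sure = shannon_entropy([1.0])        # 0.0 bits
```

The second function makes explicit the abstract's claim that Shannon's "entropy of information" has a physical counterpart: acquiring one bit of information is paid for by at least k ln 2 of entropy increase elsewhere in the system.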