Abstract
Information science is badly in need of an information theory. The paper discusses both the need for, and the possibility of developing, such a theory based on the assumption that information is a basic property of the universe. That is, like matter and energy, information has physical reality. Any system which exhibits organisation contains information. Changes in entropy represent changes in the organisational states of systems and, as such, quantify changes in the information content of such systems. Information, like energy, exists in many forms, and these are interconvertible. Likewise, energy and information are readily interconverted: a change of 1 J/K equals approximately 10^23 bits. The paper also considers related phenomena such as "meaning" and "intelligence", and argues that the emergence of machine intelligence in the milieu of human society presages an evolutionary discontinuity.