Information entropy is a concept from information theory. It measures how much information there is in an event. In general, the more uncertain or random an event is, the more information it contains. The concept of information entropy was introduced by the mathematician Claude Shannon.
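Shannon measured this amount of information with a formula. For an event X with possible results x, each happening with probability p(x), the entropy is:

```latex
H(X) = -\sum_{x} p(x) \log_2 p(x)
```

With base-2 logarithms, the result is measured in bits.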
Entropy measures how uncertain a result is before it happens. In a coin flip with a 50-50 probability, the entropy is at its highest value of 1 bit, because the flip does not lean towards one result more than the other, so there is the most to learn from seeing the result. If there is a 100-0 probability that a result will occur, the entropy is 0, because the result is certain and seeing it gives no new information.
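As a sketch of how these numbers come out of the formula (the function name below is chosen just for illustration), a few lines of Python can compute the entropy of a two-result event:

```python
import math

def entropy(probabilities):
    """Shannon entropy in bits: the sum of -p * log2(p) over results with p > 0."""
    return sum(-p * math.log2(p) for p in probabilities if p > 0)

# A fair coin (50-50) has the highest possible entropy for two results: 1 bit.
print(entropy([0.5, 0.5]))   # 1.0

# A certain result (100-0) has entropy 0: seeing it gives no new information.
print(entropy([1.0, 0.0]))   # 0.0

# An unfair coin sits in between.
print(entropy([0.9, 0.1]))   # about 0.469
```

Any unfair split lands between these two extremes, getting closer to 0 the more one result dominates.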
Example
Let's look at an example. If someone is told something they already know, the information they get is very small, because the message was expected. This information would have very low entropy.
If they were told about something they knew little about, they would get much new information and learn something. This information would be very valuable to them, and it would have high entropy.
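A sketch of this idea in Python (the probabilities below are made up for illustration): the information in a single message can be measured as -log2 of its probability, so an expected message scores low and a surprising one scores high:

```python
import math

def information_content(p):
    """Information (surprisal) of a result with probability p, in bits."""
    return -math.log2(p)

# A message they already expect (probability 0.99) carries almost no information.
print(information_content(0.99))  # about 0.0145 bits

# A surprising message (probability 0.01) carries much more information.
print(information_content(0.01))  # about 6.64 bits
```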
Other websites
- Information is not entropy, information is not uncertainty! - a discussion of the use of the terms "information" and "entropy".
- I'm confused: how could information equal entropy? - a similar discussion on the bionet.info-theory FAQ.
- Java "entropy pool" for cryptographically-secure unguessable random numbers
- Description of information entropy from "Tools for Thought" by Howard Rheingold
- An intuitive guide to the concept of entropy arising in various sectors of science – a wikibook on the interpretation of the concept of entropy.