From Wikipedia, the free encyclopedia

Entropy is a measure of disorder, and the second law of thermodynamics says that the total entropy of an isolated system tends to increase over time: things slowly become more disordered. The entropy of an object is a measure of the amount of information needed to know the complete state of that object, atom by atom. It is also a measure of the number of possible arrangements the atoms in a system can have. In this sense, entropy is a measure of uncertainty or randomness. The higher the entropy of an object, the more uncertain we are about the states of its atoms, because there are more possible states to choose from.
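The idea that entropy counts possible arrangements can be shown with a toy example (not part of the article): flipping 10 coins, and counting how many arrangements give a certain number of heads. This is only an illustrative sketch; the choice of 10 coins and of natural-log entropy (Boltzmann's formula with the constant set to 1) are assumptions made here for simplicity.

```python
from math import comb, log

# Toy system: 10 coin flips. Each exact sequence of heads/tails is one
# possible arrangement ("microstate") of the system.
n = 10
for k in (0, 5, 10):
    ways = comb(n, k)    # number of arrangements with exactly k heads
    entropy = log(ways)  # Boltzmann-style entropy S = ln W (constant set to 1)
    print(k, ways, round(entropy, 3))
```

There is only one arrangement with all heads or all tails (entropy 0), but 252 arrangements with 5 heads, so "half heads" is the high-entropy, most disordered outcome.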

The word entropy came from the study of heat and energy between 1850 and 1900. Some very useful mathematical ideas about probability calculations emerged from the study of entropy. These ideas are now used in information theory, chemistry, and other areas of study.

For example, when one animal species, especially a keystone species, is removed from a food chain, the food chain will eventually collapse: an ordered system falls into disorder.

There are two closely related kinds of entropy:

  • Information entropy, which is a measure of the information communicated by systems that are affected by noise.
  • Thermodynamic entropy, which is part of the science of heat energy and is a measure of how organized or disorganized energy is in a system of atoms or molecules.
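Information entropy can be sketched with Shannon's formula, which measures uncertainty in bits. This is a minimal illustration, not part of the article; the function name and the example probabilities are chosen here for demonstration.

```python
from math import log2

def shannon_entropy(probs):
    """Information entropy in bits: H = -sum(p * log2(p))."""
    return -sum(p * log2(p) for p in probs if p > 0)

# A fair coin is maximally uncertain: exactly 1 bit of entropy.
print(shannon_entropy([0.5, 0.5]))  # 1.0
# A heavily biased coin is more predictable, so its entropy is lower.
print(shannon_entropy([0.9, 0.1]))
```

The more evenly the probabilities are spread out, the higher the entropy, which matches the idea above that entropy measures uncertainty.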

In thermodynamics, entropy is a measure of disorder.