Entropy is a measure of disorder; the second law of thermodynamics says that the entropy of an isolated system tends to increase over time, so everything slowly goes into disorder. The entropy of an object is a measure of the amount of information it takes to know the complete state of that object, atom by atom. Entropy is also a measure of the number of possible arrangements the atoms in a system can have. In this sense, entropy is a measure of uncertainty: the higher the entropy of an object, the more uncertain we are about the states of the atoms making up that object, because there are more possible states to choose from.
The word "entropy" came from the study of heat and energy between 1850 and 1900. Some very useful mathematical ideas about probability emerged from this study. These ideas are now used in information theory, chemistry, and other areas of study.
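The link between entropy and probability can be made concrete with Shannon's formula from information theory, H = -Σ p·log₂(p), which gives the uncertainty (in bits) of a set of possible outcomes. Here is a minimal sketch; the function name and the example probabilities are illustrative, not taken from the text above:

```python
import math

def shannon_entropy(probabilities):
    """Uncertainty in bits: H = -sum(p * log2(p)) over nonzero probabilities."""
    return -sum(p * math.log2(p) for p in probabilities if p > 0)

# A fair coin has two equally likely states, so one full bit of uncertainty.
print(shannon_entropy([0.5, 0.5]))   # → 1.0

# A biased coin is more predictable, so its entropy is lower.
print(shannon_entropy([0.9, 0.1]))

# A certain outcome has no uncertainty at all.
print(shannon_entropy([1.0]))        # → 0.0 (negated, prints -0.0)
```

The more equally likely arrangements a system has, the higher this number gets, which matches the idea that higher entropy means more uncertainty about the system's state.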
For example, when one animal species, especially a keystone species, is taken out of a food chain, the food chain will eventually collapse into a more disordered state.