Zipf's law

From Simple English Wikipedia, the free encyclopedia

Zipf's law is an empirical law, formulated using mathematical statistics, named after the linguist George Kingsley Zipf, who first proposed it.[1][2] Zipf's law states that, given a large sample of words, the frequency of any word is inversely proportional to its rank in the frequency table. So the nth most frequent word has a frequency proportional to 1/n.
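
This relationship can be checked directly on any large text by counting the words and sorting them by frequency. The short Python sketch below shows the idea; the placeholder text and the variable names are illustrative, not taken from any particular corpus.

```python
from collections import Counter

# Any large text can be used; this short passage is only a placeholder.
words = """the quick brown fox jumps over the lazy dog and the
cat and the dog watch the fox by the river""".lower().split()

counts = Counter(words)
ranked = counts.most_common()   # words sorted from most to least frequent
top_count = ranked[0][1]        # frequency of the most common word

# Under Zipf's law, the word of rank n should occur about top_count / n times.
for rank, (word, count) in enumerate(ranked[:5], start=1):
    predicted = top_count / rank
    print(f"{rank}. {word!r}: observed {count}, Zipf predicts about {predicted:.1f}")
```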

Thus the most frequent word will occur about twice as often as the second most frequent word, three times as often as the third most frequent word, and so on. For example, in one sample of words in the English language, the most frequently occurring word, "the", accounts for nearly 7% of all the words (69,971 out of slightly over 1 million). True to Zipf's law, the second-place word "of" accounts for slightly over 3.5% of words (36,411 occurrences), followed by "and" (28,852 occurrences). In a large sample, only about 135 distinct words are needed to account for half of all the words.[3]
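
A quick calculation shows how closely these counts follow the 1/n pattern. This is a sketch in Python using only the figures quoted above, with the corpus size rounded to one million words.

```python
# Figures quoted above; the total is rounded to one million words.
total = 1_000_000
counts = {"the": 69_971, "of": 36_411, "and": 28_852}

print(counts["the"] / total)   # ~0.0700 -> nearly 7% for rank 1
print(counts["of"] / total)    # ~0.0364 -> slightly over 3.5% for rank 2

# Zipf's law predicts the rank-2 and rank-3 counts from the rank-1 count:
print(counts["the"] / 2)       # ~34986, versus the observed 36,411
print(counts["the"] / 3)       # ~23324, versus the observed 28,852
```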

The same relationship occurs in many other rankings, unrelated to language, such as the population ranks of cities in various countries, corporation sizes, income rankings, etc. The appearance of the distribution in rankings of cities by population was first noticed by Felix Auerbach in 1913.[4]

It is not known why Zipf's law holds for most languages.[5]

References

  1. Zipf, George K. (1935). The Psychology of Language. Houghton Mifflin.
  2. Zipf, George K. (1949). Human Behavior and the Principle of Least Effort. Addison-Wesley.
  3. Fagan, Stephen; Gençay, Ramazan (2010). "An introduction to textual econometrics". In Ullah, Aman; Giles, David E. A. (eds.). Handbook of Empirical Economics and Finance. CRC Press. pp. 133–153. ISBN 9781420070361. P. 139: "For example, in the Brown Corpus, consisting of over one million words, half of the word volume consists of repeated uses of only 135 words."
  4. Auerbach, Felix (1913). "Das Gesetz der Bevölkerungskonzentration" [The law of population concentration]. Petermanns Geographische Mitteilungen. 59: 74–76.
  5. Brillouin, Léon ([1959] 2004). La science et la théorie de l'information [Science and information theory].