Complexity theory

Complexity theory is a part of computer science. It looks at how hard a problem is for a computer to solve, and how good particular solutions (algorithms) to that problem are. Different algorithms that solve the same problem can be better or worse in different ways. One algorithm may be faster than another, but it may need more resources, such as memory.
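As a small sketch (not part of the original article; the function names and the example list are made up for illustration), the following Python code shows two algorithms that answer the same question, "does this list contain a duplicate?". One is slow but uses almost no extra memory; the other is much faster but needs extra memory.

```python
def has_duplicate_slow(items):
    # Uses almost no extra memory, but compares every pair of items,
    # so it becomes very slow as the list grows.
    for i in range(len(items)):
        for j in range(i + 1, len(items)):
            if items[i] == items[j]:
                return True
    return False

def has_duplicate_fast(items):
    # Much faster for big lists, but needs extra memory to remember
    # every item it has already seen.
    seen = set()
    for item in items:
        if item in seen:
            return True
        seen.add(item)
    return False

numbers = [4, 8, 15, 16, 23, 42, 15]
print(has_duplicate_slow(numbers))  # True
print(has_duplicate_fast(numbers))  # True
```

Both functions give the same answer, so choosing between them is a trade-off between time and memory, which is the kind of comparison complexity theory studies.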

Part of complexity theory is understanding how an algorithm performs in its worst case. An algorithm's worst case is the input that makes it take the longest time, or the most resources, to finish. If people know how good an algorithm is in its worst case, they can guarantee that the algorithm will be at least that good in every case. People are also interested in how an algorithm does on average: many algorithms are very slow in their worst case but do well in most cases. Knowing how good an algorithm is on average lets people know what to expect when they use it.
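As another small sketch (not part of the original article; the count_steps function and the example list are made up for illustration), the following Python code counts how many items a simple search looks at. The worst case looks at every item, while a typical case looks at about half of them.

```python
def count_steps(items, target):
    # Look through the list one item at a time and count how many
    # items are checked before the target is found.
    steps = 0
    for item in items:
        steps += 1
        if item == target:
            return steps
    return steps  # checked everything and did not find the target

numbers = list(range(1, 101))      # the numbers 1 to 100
print(count_steps(numbers, 1))     # best case: 1 step
print(count_steps(numbers, 100))   # worst case: 100 steps
print(count_steps(numbers, 50))    # a typical case: 50 steps
```

The worst case (100 steps) is a guarantee: the search never takes more steps than that. The average over many random searches is about half the list, which is what people usually experience.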

Related pages