The observer effect, or observer bias, means somewhat different things in different fields, although the ideas are related.
Use in science
In science, the observer effect means that the act of observing something can change the thing being observed. For example, for us to "see" an electron, a photon must first interact with it, and this interaction will change the path of that electron. Other, less direct means of measurement can, in theory, also affect the electron; even if the electron is simply put into a position where observing it is possible, without actual observation taking place, it will still (theoretically) alter its position.
In physics, a more mundane observer effect can be the result of instruments that by necessity alter the state of what they measure. In electronics, ammeters and voltmeters usually need to be connected to the circuit, and so by their very presence affect the current or the voltage they are measuring. A standard mercury-in-glass thermometer must absorb some thermal energy to record a temperature, and therefore changes the temperature of the body which it is measuring.
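The ammeter example can be made concrete with a short calculation. A real ammeter has a small internal resistance, so wiring it into the circuit in series reduces the very current it is trying to measure. The component values below are made up purely for illustration; only Ohm's law is assumed.

```python
# Illustration: an ammeter's internal resistance perturbs the current it
# measures. All component values are hypothetical; only Ohm's law (I = V/R)
# is assumed.

V = 9.0          # supply voltage in volts (hypothetical)
R_load = 100.0   # circuit resistance in ohms (hypothetical)
R_meter = 1.0    # ammeter internal resistance in ohms (hypothetical)

I_true = V / R_load                  # current with no meter in the circuit
I_measured = V / (R_load + R_meter)  # current once the meter is in series

error_pct = 100 * (I_true - I_measured) / I_true
print(f"true: {I_true * 1000:.1f} mA, "
      f"measured: {I_measured * 1000:.2f} mA, "
      f"error: {error_pct:.2f}%")
```

With these example values the meter reads about 1% low: the instrument's mere presence has changed the quantity being measured, exactly as the thermometer absorbing heat does.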
The term is sometimes misunderstood to refer to an idea from quantum mechanics: if the outcome of an event has not been observed, it exists in a state of 'superposition', which is something like being in all possible states at once. In the famous thought experiment known as Schrödinger's cat, the cat is empirically neither alive nor dead until observed – until that time, the cat is theoretically both alive and dead (technically half-alive and half-dead in probability terms). However, most quantum physicists, in resolving Schrödinger's seeming paradox, now understand that the acts of 'observation' and 'measurement' must also be defined in quantum terms before the question makes sense. From this point of view, there is no 'observer effect', only one vastly entangled quantum system. A significant minority still find that the equations point to an observer; John Archibald Wheeler, who probably worked more deeply on this subject than any physicist thus far, devised a graphic in which the universe was represented by a "U" with an eye on one end, turned around and viewing itself, to describe his understanding.
The Heisenberg uncertainty principle is also frequently confused with the "observer effect". The uncertainty principle actually describes how precisely we may measure the position and momentum of a particle at the same time – if we increase the precision in measuring one quantity, we are forced to lose precision in measuring the other. Thus, the uncertainty principle deals with measurement, not observation. The idea that the uncertainty principle is caused by disturbance (and hence by observation) is not considered valid by some, although it was discussed in the early years of quantum mechanics and is often repeated in popular treatments.
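The trade-off described above is usually written as an inequality between the uncertainty in position and the uncertainty in momentum:

```latex
% Heisenberg uncertainty principle: the product of the uncertainty in
% position (\Delta x) and the uncertainty in momentum (\Delta p) can
% never be smaller than half the reduced Planck constant (\hbar).
\Delta x \, \Delta p \geq \frac{\hbar}{2}
```

Because the right-hand side is a fixed constant of nature, making \(\Delta x\) smaller forces \(\Delta p\) to grow, and vice versa – no reference to an observer appears anywhere in the relation.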
A related issue in quantum mechanics is whether systems have pre-existing properties – prior to measurement, that is – corresponding to the measurements that could possibly be made on them. The assumption that they do is often referred to as "realism" in the literature, although it has been argued that the word "realism" is being used in a more restricted sense than philosophical realism. A recent experiment in quantum physics has been quoted as meaning that we have to "say goodbye" to realism, although the author of the paper states only that "we would [..] have to give up certain intuitive features of realism". These experiments demonstrate a puzzling relationship between the act of measurement and the system being measured, but it is unclear whether they require a conscious observer.
Use in information technology
In information technology, the observer effect refers to the potential impact of observing a process's output while the process is running. For example, if a process writes its progress to a log file, the act of viewing the file while the process is running could cause an I/O error in the process, which could make it stop.
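Whether simply reading a file can disturb its writer depends on the operating system (on Windows, for instance, files are often opened with mandatory sharing restrictions). The POSIX-only sketch below simulates the collision explicitly with an advisory `fcntl.flock` lock: the running process holds an exclusive lock on its log, and an "observer" that also demands the lock is refused.

```python
# Sketch (POSIX-only): a running process holds an exclusive lock on its log
# file; an "observer" that also asks for the lock is refused. This simulates
# how inspecting a file can collide with the process that is writing it.
# File-locking behaviour is platform-specific; fcntl.flock is used here.
import fcntl
import os
import tempfile

log_path = os.path.join(tempfile.mkdtemp(), "app.log")

with open(log_path, "w") as writer:
    fcntl.flock(writer, fcntl.LOCK_EX)   # the process locks its own log
    observer_blocked = False
    try:
        with open(log_path) as observer:
            # the "observer" requests the same lock without waiting
            fcntl.flock(observer, fcntl.LOCK_EX | fcntl.LOCK_NB)
    except BlockingIOError:
        observer_blocked = True
    fcntl.flock(writer, fcntl.LOCK_UN)

print("observer blocked:", observer_blocked)
```

Here the observer fails rather than the process, but the point is the same: the observation and the observed process contend for one resource, so neither side sees the system as it would behave unobserved.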
Another example would be observing the performance of a CPU by running both the observed and observing programs on the same CPU. This will lead to inaccurate results because the observer program itself affects the CPU performance.
Observing (or rather, debugging) a running program by modifying its source code (for example, adding extra output or generating log files) or by running it in a debugger may sometimes cause certain bugs to diminish or change their behavior. This makes it harder to isolate such a bug, often called a "Heisenbug".
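A deliberately contrived sketch shows the mechanism. The bug is a race: the main thread reads a worker's result without waiting for it. Adding a "debug log" step takes enough extra time that the worker finishes first, so the bug vanishes while being observed. The sleep durations are arbitrary, chosen with wide margins so the outcome is deterministic for illustration.

```python
# Contrived "Heisenbug" sketch: a race between the main thread and a worker.
# Reading the result without waiting is the bug; the time spent on a debug
# log masks it. Sleep durations are arbitrary illustration values.
import threading
import time

def run(debug=False):
    result = {}

    def worker():
        time.sleep(0.2)          # the worker needs some time to finish
        result["value"] = 42

    t = threading.Thread(target=worker)
    t.start()
    if debug:
        # Simulated logging/debugger overhead: the extra delay lets the
        # worker finish first, so the race no longer shows.
        time.sleep(0.5)
    found = "value" in result    # BUG: checked without t.join() -- a race
    t.join()
    return found

print("bug appears without debugging:", not run(debug=False))
print("bug hidden while debugging:   ", run(debug=True))
```

Real Heisenbugs behave the same way for the same reason: the instrumentation changes the program's timing or memory layout, which is exactly what the bug depends on.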
Use in the social sciences
In the social sciences and general usage, the effect refers to how people change their behavior when aware of being watched (see Hawthorne effect). For instance, in the armed forces, an announced inspection is used to see how well soldiers can do when they put their minds to it, while a surprise inspection is used to see how well prepared they generally are.
Observer bias
The related social-science term observer bias is error introduced into measurement when observers overemphasize behavior they expect to find and fail to notice behavior they do not expect. This is why medical trials are normally organized as double-blind tests. Observer bias can also arise when researchers interpret a subject's behavior according to what it would mean to them, when it may mean something quite different to the subject.