An intelligence quotient (IQ) is a total score derived from a set of standardized tests or subtests designed to assess human intelligence. Originally, IQ was computed by dividing a person's mental age score, obtained by administering an intelligence test, by the person's chronological age, both expressed in years and months. The resulting fraction (quotient) was multiplied by 100 to obtain the IQ score. For modern IQ tests, raw scores are transformed so that they follow a normal distribution with a mean of 100 and a standard deviation of 15. As a result, approximately two-thirds of the population scores between IQ 85 and IQ 115, and about 2 percent of the population scores above 130 and about 2 percent below 70.
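To make the two scoring conventions concrete, the following is a brief worked sketch; the ages in the example are hypothetical, and z denotes a test-taker's raw score expressed in standard deviations from the population mean:

\[
\text{IQ}_{\text{ratio}} = \frac{\text{mental age}}{\text{chronological age}} \times 100,
\qquad \text{e.g.}\quad \frac{12\ \text{years}}{10\ \text{years}} \times 100 = 120.
\]

\[
\text{IQ}_{\text{deviation}} = 100 + 15z,
\qquad P(85 \le \text{IQ} \le 115) = P(|z| \le 1) \approx 68\%,
\qquad P(\text{IQ} > 130) = P(z > 2) \approx 2.3\%.
\]

The percentages follow directly from the standard normal distribution, which is why roughly two-thirds of scores fall within one standard deviation of the mean.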
Historically, many proponents of IQ testing have been eugenicists who used pseudoscience to push now-debunked views of racial hierarchy in order to justify segregation and oppose immigration. Such views are now rejected by a strong consensus of mainstream science, though fringe figures continue to promote them in pseudo-scholarship and popular culture. IQ scores have also been used for educational placement, assessment of intellectual ability, and evaluating job applicants.
Scores from intelligence tests are estimates of intelligence. Unlike quantities such as distance and mass, intelligence cannot be measured concretely, given the abstract nature of the concept of "intelligence". IQ scores have been shown to be associated with factors such as nutrition, parental socioeconomic status, morbidity and mortality, and perinatal environment. While the heritability of IQ has been investigated for nearly a century, there is still substantial debate about the significance of heritability estimates and the mechanisms of inheritance.
Raw scores on IQ tests for many populations have been rising since the early 20th century at an average rate equivalent to about three IQ points per decade, a phenomenon called the Flynn effect.