Generally speaking, it’s a bad idea to use culturally loaded tests like the SAT to compare the intelligence of people from very different cultures (e.g., 21st century America vs. 21st century Japan, or 21st century America vs. 19th century America). However, in light of fascinating new claims that genetic intelligence has declined by a full standard deviation (1 SD) in just the last century or so, even as scores on culture-reduced IQ tests have increased by over 2 SD over the same period, I decided to look at some old SAT data with new eyes.

On page 429 of the book The Bell Curve, C. Murray and R. Herrnstein present a chart showing the percentage of American 17-year-olds capable of scoring 700+ on the old verbal SAT and the old math SAT. Even though only about a third of U.S. young adults took the old SAT (on a good year), the authors reasoned that the higher the ability, the greater the odds of taking the test, so virtually every 17-year-old American capable of scoring 700+ presumably took it. This allowed them to express their data as a percentage of an entire generation, not just the college-bound segment.

The data showed that over a 26-year span (1967-1993), the percentage of American 17-year-olds capable of scoring 700+ on the old verbal SAT dropped from 0.8% to 0.3%. Assuming a normal distribution, this implies a 0.33 SD decline in just 26 years, equivalent to a decline of 1.92 verbal IQ points a decade! Recklessly extrapolating, that’s a drop of 19 verbal IQ points in a century!
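
To make the arithmetic explicit, here is a minimal sketch of that conversion, assuming normally distributed scores and the usual 15 IQ points per SD; it roughly reproduces the 0.33 SD and roughly 1.9-points-per-decade figures above, give or take rounding.

```python
# Convert tail percentages (share of 17-year-olds scoring 700+) into an
# implied shift of the whole distribution, assuming normality and 15 IQ
# points per SD. The percentages are the Bell Curve estimates quoted above.
from scipy.stats import norm

top_1967 = 0.008   # fraction scoring 700+ on the verbal SAT in 1967
top_1993 = 0.003   # fraction scoring 700+ on the verbal SAT in 1993
years = 26

# How many SDs above the mean a 700 sits in each year
z_1967 = norm.isf(top_1967)   # ~2.41 SD
z_1993 = norm.isf(top_1993)   # ~2.75 SD

shift_sd = z_1993 - z_1967                       # ~0.34 SD leftward shift of the mean
points_per_decade = shift_sd * 15 / years * 10   # ~1.9-2.0 IQ points per decade

print(f"Implied shift: {shift_sd:.2f} SD")
print(f"Implied decline: {points_per_decade:.2f} verbal IQ points per decade")
```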

However, math SAT scores showed a different pattern: the percentage of 17-year-olds capable of scoring 700+ on the math SAT increased from roughly 1.25% to roughly 1.6%, implying the math distribution moved about 0.27 SD to the right over those same 26 years, a gain of 1.54 math IQ points a decade (or 15 points a century).

While it’s unwise to draw long-term conclusions from a test as culturally loaded as the SAT, in my very humble opinion the data makes perfect sense. Innovative chronometric research by scholars Michael A. Woodley, Guy Madison and Bruce G. Charlton implies that because of dysgenic trends (including declining infant mortality), general intelligence has declined by 1.8 IQ points a decade (at least in women). That is very similar to the 1.92-points-a-decade drop on the verbal SAT (the more g-loaded of the two SAT subscales).

At the same time, however, scholar Richard Lynn showed that better nutrition has been improving brain size and perhaps neurological development, causing scores on spatial tests to increase by over 2 SD over the 20th century. Yet nutrition only seems to consistently improve spatial ability: there has been virtually zero Flynn Effect on tests of working memory (Digit Span, Arithmetic). Since the math SAT requires both spatial reasoning (high nutrition loading) and working memory (zero nutrition loading), it shows a Flynn Effect intermediate between these two abilities.

Of course, one shouldn’t take my analysis too seriously, because I only looked at data from 1967-1993, and only at very high SAT scores, which do not track trends in the general population well (in the general population, verbal IQ has gone up substantially!). However, the gifted might be the best place to look for long-term intelligence trends: the most brilliant people tend to teach themselves, so their scores are more indicative of true ability, while the psychometric scores of more average people are more sensitive to externally imposed cultural opportunities (schooling, educated parents, mass media exposure, etc.) and thus can be spuriously inflated by cultural bias. The pioneer of mental testing, Sir Francis Galton, commonly judged the intelligence of entire populations by the number of highly gifted people they produced.

Future research on dysgenics should continue to focus on the verbal skills of elites, perhaps by analyzing the idea density of the speeches of U.S. presidents from the 18th century to the 21st.
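
As an illustration of what such an analysis might look like, here is a minimal sketch using one common approximation of idea (propositional) density: the number of verbs, adjectives, adverbs, prepositions, and conjunctions per ten words. The spaCy model and the sample sentences are my own illustrative assumptions, not the method or data of any particular study.

```python
# Crude idea-density proxy: propositions per 10 words, approximated by
# counting verbs, adjectives, adverbs, prepositions, and conjunctions.
# Requires: pip install spacy && python -m spacy download en_core_web_sm
import spacy

nlp = spacy.load("en_core_web_sm")

# Universal POS tags treated here as proposition-bearing words
PROPOSITION_TAGS = {"VERB", "ADJ", "ADV", "ADP", "CCONJ", "SCONJ"}

def idea_density(text: str) -> float:
    """Return an approximate propositions-per-10-words score for a text."""
    doc = nlp(text)
    words = [t for t in doc if t.is_alpha]
    if not words:
        return 0.0
    propositions = [t for t in words if t.pos_ in PROPOSITION_TAGS]
    return 10 * len(propositions) / len(words)

# Placeholder excerpts; a real analysis would use full presidential speeches.
samples = {
    "sample A": "We hold it a sacred duty, solemnly and faithfully, to preserve "
                "the liberties which our fathers so dearly won.",
    "sample B": "We will keep working hard to make things better for everyone.",
}

for label, text in samples.items():
    print(f"{label}: {idea_density(text):.2f} propositions per 10 words")
```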
