One of the biggest mysteries in psychology is the Flynn Effect: the fact that over the 20th century, people performed better and better on IQ tests. Of course, the average IQ in Western countries is by definition always about 100; however, because people keep scoring higher every decade, the tests routinely have to be made more difficult and the norms regularly updated. Now if this only happened on culturally loaded tests like General Knowledge and Vocabulary, we could simply conclude that the tests are culturally biased against past generations, who had less access to schooling and media. But some of the biggest gains have been found on tests like the Raven, which were explicitly designed to be culture-fair.

In one study (see figure 2), the top 10% of British people born in 1877 (by definition, those with IQs above 120 for their era) performed the same on the Raven as the bottom 5% of British people born in 1967 (by definition, those with IQs below 75 for their era). In other words, performance on the Raven had increased by the equivalent of 45 points in less than a century! Of course it wasn't a level playing field, because those born in 1877 took the test as somewhat elderly 65-year-olds while those born in 1967 took it as sharp 25-year-olds; however, Flynn cites longitudinal studies showing that Raven-type reasoning declines by no more than 10 points by age 65. That still leaves us with 35 points to explain.
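As a quick sanity check on those cutoffs, the percentile-to-IQ conversion is just normal-distribution arithmetic. Here is a minimal sketch in Python, assuming the conventional mean-100, SD-15 IQ scale (scipy is used only for the normal quantile function):

```python
from scipy.stats import norm

MEAN, SD = 100, 15  # conventional IQ scale

# The top 10% begins at the 90th percentile; the bottom 5% ends at the 5th.
iq_top_10 = MEAN + SD * norm.ppf(0.90)    # ~119.2, i.e. roughly 120
iq_bottom_5 = MEAN + SD * norm.ppf(0.05)  # ~75.3, i.e. roughly 75

print(round(iq_top_10 - iq_bottom_5))     # ~44: the roughly 45-point gap
```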

Another source of inaccuracy was that although the test was not timed for either group, those born in 1877 took the test supervised, while those born in 1967 got to take it home. This could make a large difference; not necessarily because the unsupervised group would cheat, but because, in the comfort of their homes, they would probably take more breaks. They could return to challenging items after relaxing and see them from a fresh perspective, while those who took the test supervised in some strange room were probably more likely to rush through the tasks so they could go home. I would estimate that being allowed to take a test home improves performance by about 5 IQ points on average, though this is just a guess.

But that still leaves a huge difference of 30 IQ points. It's important to note that the British born in 1877 probably completed no more than eight grades of schooling on average, while those born in 1967 probably averaged more than 12 years, and not attending high school may reduce IQ scores (though probably not real intelligence) by 8 points. It may seem unlikely that schooling could influence a test as seemingly culture-fair as the Raven, but some people argue that the Raven is actually culturally biased. Richard Lynn argues that it requires basic math skills like addition and subtraction, and he believes the rise in education explains part of the adult Flynn Effect. At the very least, people with more schooling might be more likely to approach the test from a mathematical perspective, or with more motivation, confidence, and persistence.

So if we subtract the 8-point schooling effect, that still leaves 22 IQ points unexplained. Is it possible that real intelligence could have improved by nearly 1.5 standard deviations since 1877? Physicist Stephen Hsu recently noted that male height in Europe also increased by 1.5 standard deviations since 1870, and he compared this to the Flynn Effect. It may seem unlikely that a variable as heritable as intelligence could be so improved by the environment, but height is even more heritable than intelligence, so it's clearly plausible. This brings us to Richard Lynn's nutrition hypothesis for the Flynn Effect, which claims that better nutrition has made us not only taller but smarter too. Included in Lynn's nutrition hypothesis is disease reduction, since diseases impede the absorption of nutrients.
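To keep the bookkeeping straight, here is the whole chain of subtractions in one place (the 10, 5, and 8 point adjustments are the estimates argued above, the first of them Flynn's upper bound and the second my own guess; the SD conversion assumes the usual 15-point IQ standard deviation):

```python
raw_gap = 45    # Raven gap between the 1877 and 1967 cohorts, in IQ points
aging = 10      # upper bound on age-65 decline in Raven-type reasoning (Flynn)
take_home = 5   # guessed advantage of unsupervised, take-home testing
schooling = 8   # estimated effect of not attending high school

unexplained = raw_gap - aging - take_home - schooling
print(unexplained)                 # 22 IQ points left to explain
print(round(unexplained / 15, 2))  # 1.47, i.e. roughly 1.5 standard deviations
```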

How does nutrition make us smarter? Most obviously by increasing brain size, and cranial capacity may have increased an astonishing 150 cubic centimeters in the last 150 years. Of course, as Arthur Jensen noted in his book The g Factor, brain size is only moderately correlated with intelligence, so the increase in brain size could have caused less than half of the Flynn Effect at most. However, if nutrition has improved brain size by over 1.5 standard deviations, it likely improved other properties of the brain to a similar degree; they're just harder to measure and notice than something as visible as head size, though perhaps IQ tests are detecting them. For example, Lynn notes animal studies showing that nutrition affects the growth and number of glial brain cells, as well as neuron myelination, dendrite growth, and synaptic connections, and he points to human autopsy studies suggesting similar effects.
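Jensen's point can be made concrete with simple regression logic: the expected IQ gain (in SD units) from a brain-size gain is roughly the brain-size/IQ correlation times that gain. A sketch, where the 0.4 correlation is an assumed illustrative value in the "moderate" range, not a figure from this study:

```python
brain_gain_sd = 1.5  # secular gain in brain size, in SDs (from the text)
r_brain_iq = 0.4     # assumed illustrative brain-size/IQ correlation

# Standard regression estimate: expected IQ shift = r * brain-size shift.
expected_iq_gain = r_brain_iq * brain_gain_sd * 15  # in IQ points
print(expected_iq_gain)  # 9.0 points: less than half of the 22-point residual
```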

But if biological intelligence has increased by 1.5 standard deviations, then the average Victorian would have an IQ below 80 by today's standards. An intelligence researcher found this hard to believe, given the Victorians' superior scientific and literary output. In response, I made the following points:

1) The effects of nutrition may have much less impact on the parts of the brain responsible for language (literature) and academic learning. Despite the 20th century rise in schooling, the Flynn Effect appears to be much more pronounced on non-verbal tests of abstract reasoning than on tests of crystallized knowledge, vocabulary, and arithmetic. Lynn discusses the same pattern in identical twins where one twin is born less nourished than the other. By age 15, the twins are equal on verbal-academic abilities, but the less nourished, smaller-brained twin is worse at non-verbal fluid problem solving. In my opinion this makes perfect sense. Human survival depends more on our ability to take advantage of generations of cultural knowledge (crystallized intelligence) than on our ability to figure things out for ourselves (fluid intelligence), so when malnutrition strikes, evolution would prioritize and preserve the crystallized functions of the brain at the expense of fluid abilities. And perhaps no crystallized ability is more important than language and literacy, so it's no surprise that literary abilities would be relatively preserved in the malnourished mind.

2) By what objective standard can we conclude that 19th century humans were more intellectually accomplished? Since the 19th century, humans have invented television, put a man on the moon, grown human ears on mice, used DNA to solve crimes and map ancient human migrations, and invented the internet, iPhones, iPads, GPS systems, military drones, video games, word processors, microwave ovens, magnetic resonance brain imaging, computer animation, 3-D printing, nuclear weapons, holograms…the list goes on and on. Of course we are building on the accomplishments of the 19th century, but every generation builds on past generations. Extrapolating from the rate of accomplishments through most of history, I suspect that 20th century accomplishments would exceed mathematical predictions, though I have no idea how to test such a hypothesis.

In addition to scientific advances, since the 19th century humans have become less superstitious (declining religiosity), less violent, and more moral (e.g. the civil rights movement, feminism). Moral, non-violent behavior indicates intelligence. Also, popular movies and television shows have become far more sophisticated, subtle, and nuanced in the last half century.

3) The fact that average 19th century intellect was much lower does not preclude the existence of 19th century geniuses any more than the fact that average 19th century height was much lower precludes the existence of 19th century giants like John Rogan or Edouard Beaupre.
