There’s been much discussion about a paper by scholars Michael A. Woodley, Jan te Nijenhuis, and Raegan Murphy, published in the prestigious journal Intelligence, arguing that genetic IQ in Western populations has declined by about 1 standard deviation since the 19th century. Although conventional IQ tests indicate people are getting smarter, the paper argues that simple reaction time (measured in milliseconds) is a better yardstick for comparing people across centuries: although it’s a very crude measure of intelligence, it’s much less sensitive to the non-genetic factors that have caused the Flynn Effect.

However, such a large decline in IQ in such a short span of time seems extremely unlikely.

A blogger named HBD Chick argued that the paper went wrong by using poor sampling.  The oldest study the paper cites was conducted by Francis Galton circa 1889.  The paper was criticized for relying on this study because the sample is too elite to represent Victorians: only 4% of the sample were unskilled laborers, even though 75% of Victorians were.  But if the vast majority of Victorians were unskilled laborers, then I suggest we use the mean reaction time of just this occupation to represent Victorians. Yes, excluding the top 25% of occupations might bias the estimate downward, but the fact that these were unskilled workers intellectually curious enough to volunteer for Galton’s study would bias it upward, so the two roughly cancel out.

An analysis of Galton’s sample (see figures 10 & 11) shows that unskilled laborers aged 14-25 averaged reaction times of 195 milliseconds for males and 190 for females (an average of 192.5 ms).

Do we have a modern sample of similarly aged people who are equally representative of a Western population?  Yes.  A 1993 study by W.K. Anger and colleagues (see table 2) found that a sample of American postal, hospital, and insurance workers aged 16-25 (in three different cities) had reaction times of 260 milliseconds in males and 285 in females (an average of about 273 ms).  Just as unskilled labor was an average job in the 19th century, working for the post office, a hospital, or an insurance company seems pretty average in modern times.  Thus, by subtracting 192.5 from 273, we can estimate that the average Western reaction time has slowed by 80.5 milliseconds since the 19th century.  Since the standard deviation for reaction time is estimated to be 160.4 milliseconds (see section 3.2), reaction time has slowed by 0.50 standard deviations in a bit over a century (equivalent to a drop of about 8 IQ points). This is virtually identical to the effect size the paper found using far more data points, but the authors then statistically adjusted the effect size, making it implausibly large. So in my humble opinion, the problem with the paper was not the samples they cited, but the statistical corrections they made.
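To keep the arithmetic transparent, here’s the calculation spelled out as a quick sketch, using the figures quoted above (the 15-point IQ standard deviation is just the usual convention, not a number from the paper itself):

```python
# Back-of-the-envelope calculation of the reaction-time slowdown.
victorian_rt = 192.5   # ms, Galton's unskilled laborers aged 14-25
modern_rt = 273.0      # ms, Anger et al. (1993) workers aged 16-25
rt_sd = 160.4          # ms, reaction-time SD from section 3.2 of the paper
iq_sd = 15             # conventional IQ standard deviation (not from the paper)

slowdown_ms = modern_rt - victorian_rt    # 80.5 ms
slowdown_sd = slowdown_ms / rt_sd         # ~0.50 standard deviations
iq_point_drop = slowdown_sd * iq_sd       # ~7.5, i.e. about 8 IQ points

print(f"{slowdown_ms:.1f} ms = {slowdown_sd:.2f} SD = {iq_point_drop:.1f} IQ points")
```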

The paper argued that since simple reaction time has a true correlation of negative 0.54 with general intelligence, they needed to divide the effect size by 0.54 to estimate the true decline in general intelligence.  The logic was that since reaction time is a very rough measure of intelligence, it underestimates the true decline in genetic intelligence. I disagree.  Such an inference only makes sense if you know a priori that there’s been direct selection for lower intelligence, dragging reaction time along for the ride, but that’s an assumption the paper was supposed to test, not rest upon.  It could be the opposite: there’s been selection for slower reaction time, dragging intelligence along for the ride, in which case the effect size should be multiplied by 0.54, not divided.  Most likely, there’s simply been recent selection for more primitive traits in general, and both reaction time and genetic intelligence reflect this dysgenic effect to parallel degrees, so the change in one equals the change in the other.

To illustrate the point further, consider that height has increased by 1.5 SD since the 19th century.  Height correlates with IQ at only about 0.2.  Would it make sense to argue that since height is such a weak proxy for intelligence, we need to divide the 1.5 SD increase by 0.2 to estimate how intelligence has changed since the 19th century?  By such logic, intelligence would have increased by 7.5 standard deviations since the 19th century (equivalent to about 113 IQ points)!
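To make the reductio concrete, here is the same “divide by the correlation” adjustment applied to both reaction time and height, using the figures above (the 15-point IQ standard deviation is again just the standard convention):

```python
# The "divide by the validity coefficient" logic, applied to reaction time and
# to height. Dividing inflates the estimate; multiplying (the opposite causal
# assumption) shrinks it.
iq_sd = 15  # conventional IQ standard deviation

# Reaction time: raw effect of ~0.50 SD, correlation with general intelligence of 0.54
rt_effect, rt_corr = 0.50, 0.54
print(rt_effect / rt_corr * iq_sd)   # ~13.9 -> the paper's ~14-point decline
print(rt_effect * rt_corr * iq_sd)   # ~4.1  -> the opposite adjustment
print(rt_effect * iq_sd)             # 7.5   -> no adjustment, about 8 points

# Height: 1.5 SD increase since the 19th century, correlation with IQ of 0.2
height_effect, height_corr = 1.5, 0.2
print(height_effect / height_corr * iq_sd)   # 112.5 -> an absurd ~113-point "gain"
```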

So clearly, the paper was unjustified in dividing the effect size by 0.54.

Without the adjustment, Victorians were genetically 8 points smarter than moderns, which sounds a lot more believable than 14 points.

But if Victorians had a genetic IQ of 108 (by modern standards), how could they score only 70 on the Raven IQ test?  In a previous post I argued that the Raven is a culture fair IQ test and thus the low Victorian scores must be biological (Richard Lynn’s nutrition theory).  Citing Richard Lynn, I also argued that malnutrition stunts non-verbal IQ (the Raven) more than verbal-numerical IQ (by a factor of 31, I estimate).  So if sub-optimum nutrition stunted their non-verbal IQ by 38 points, their verbal-numerical IQs would have been stunted by only 38/31 ≈ 1 point.  Thus the Victorians would have had a verbal-numerical IQ of 107.  However, because verbal tests are so culturally biased, their measured verbal scores would be artificially depressed by lack of schooling and exposure to mass media. But on a culture-reduced measure of verbal-numerical ability like Backwards Digit Span, they might have scored the equivalent of 107.
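Spelling out that arithmetic (the 31:1 stunting ratio is my own earlier estimate, as noted above):

```python
# Working through the nutrition-stunting arithmetic from the paragraph above.
genetic_iq = 108        # estimated Victorian genetic IQ (modern norms)
raven_iq = 70           # Victorian score on the Raven
stunting_ratio = 31     # non-verbal IQ stunted ~31x more than verbal-numerical (my estimate)

nonverbal_stunting = genetic_iq - raven_iq              # 38 points
verbal_stunting = nonverbal_stunting / stunting_ratio   # ~1.2, call it 1 point
verbal_numerical_iq = genetic_iq - round(verbal_stunting)  # 107

print(nonverbal_stunting, round(verbal_stunting), verbal_numerical_iq)  # 38 1 107
```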

So with a culture fair verbal-numerical IQ (Backwards Digit Span) of 107, and a culture fair non-verbal IQ (the Raven) of 70, their overall IQs would have been 86, which is about 1.5 standard deviations below their genetic IQ of 108.  That 1.5 SD gap between phenotype and genotype is perfectly explained by Richard Lynn’s nutrition theory, since average height in Western countries was also 1.5 standard deviations lower in the 19th century.  And the great accomplishments of Victorians can be explained by their verbal-numerical IQ of 107.
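For readers wondering why 107 and 70 combine to roughly 86 rather than their simple average of 88.5: composite IQs are formed by summing the standardized deviations and rescaling, which pushes the composite further from the mean. A minimal sketch, assuming the two scales correlate at roughly 0.5 (my assumption for illustration, not a figure from the analysis above):

```python
import math

# Composite of the two culture-fair scores. Assumes the verbal-numerical and
# non-verbal scales correlate at roughly 0.5 -- an illustrative assumption.
iq_sd = 15
verbal, nonverbal = 107, 70
r = 0.5  # assumed correlation between the two scales

z_sum = (verbal - 100) / iq_sd + (nonverbal - 100) / iq_sd  # sum of z-scores
composite_z = z_sum / math.sqrt(2 + 2 * r)  # SD of a sum of two correlated z-scores
composite_iq = 100 + composite_z * iq_sd    # ~87, close to the 86 cited above

gap_sd = (108 - composite_iq) / iq_sd       # ~1.4-1.5 SD below the genetic IQ of 108
print(round(composite_iq), round(gap_sd, 1))
```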