The theory that the Victorians were more intelligent than modern populations leads to the expectation that they were also more innovative, and indeed, a couple of weeks ago an expert helpfully informed me of this extremely impressive paper by Michael A. Woodley, published in the prestigious journal Intelligence, which argues exactly that. Citing scholars such as J. Huebner and Charles Murray, the paper concludes that innovation rates were higher in the 19th century than today, at least in fields like science and technology, which are probably the most important and most cognitively demanding, and thus the most relevant to the paper’s thesis.
Of course, measuring innovation rates is an inexact science requiring subjective judgments on someone’s part. However, the paper argues that when different independent sources using different methodologies all conclude that innovation rates were highest in the 19th century, it’s no longer just a matter of opinion.
As I read the paper I was reminded of a fascinating book called The 100 by Michael H. Hart (1992 edition). In an act of extraordinary intellectual ambition, Hart attempts to identify the 100 most influential people of all time, rank them in order of historical significance, and vigorously defend his choices (see the complete ranking here). I mention this because on page 530 of Hart’s book, he has a table documenting when these 100 people flourished:
Before 600 BC: 3% of the list
600 BC to 201 BC: 13% of the list
200 BC to 1400 AD: 16% of the list
15th century: 4% of the list
16th century: 9% of the list
17th century: 9% of the list
18th century: 12% of the list
19th century: 18% of the list
20th century: 16% of the list
So yet another independent source, using yet another methodology (actually no formal methodology, just his own arguments), converges on the conclusion that the 19th century was a period of enormous cultural significance. It’s especially striking that by Hart’s assessment, the 19th century had more influential people than the 20th century, despite the fact that the latter had far more people with far more opportunity to have an impact.
Ironically, Woodley’s paper did cite Hart, though not for this book, but for his later work on IQ. Perhaps it was best not to cite Hart’s ranking, because it is just one man’s opinion, though one man who claims to have an A.B. from Cornell University, an LL.B. from New York Law School, an M.S. in physics from Adelphi University, and a Ph.D. in astronomy from Princeton University. Given the strong linear relationship between IQ and years of schooling, I’m guessing Hart’s IQ is way up there, especially since the degrees include STEM subjects and come from prestigious schools to boot.
Of course one difficulty with assessing historical influence or innovation rates is that we might not have the historical perspective to fully appreciate how culturally significant the 20th century really was.
Previously I blogged about research showing that Victorians had faster simple reaction times than modern people. Since simple reaction time partly reflects the basic physiological speed of the brain, some folks think Victorians were (genetically) smarter than people today.
In a paper documenting the 20th century decline in reaction speed, scholar Irwin W. Silverman considers the confounding role of height. Height has increased by 1.5 standard deviations over the last 150 years, and this may be producing spuriously slow reaction times because nerve impulses have further to travel in a taller body. However, Silverman seems to dismiss this possibility, citing research showing that taller people have faster reaction times.
However, within generations, taller people tend to be genetically smarter than shorter people. This is thought to be because both height and intelligence (or at least its correlates: money, status) are socially valued, so people who have an above-average amount of both, or either, tend to reproduce with one another, causing the genes for both to become associated. In addition, some of the same genes that influence height may also influence intelligence. A related point is that short stature and low intelligence may both reflect genetic mutation load or inbreeding depression.
So the fact that the nerve impulse has further to travel in tall people may be completely negated by the fact that tall people have genetically faster brains. In other words, tall people may be so mentally quick that they still perform well on reaction time tasks despite the test being physically biased against them.
However, this genetic relationship between height and intelligence probably only holds within generations. Between generations, heights differ for nutritional reasons, probably not genetic reasons, so tall modern people do not have a genetic advantage over short Victorians with which to negate the fact that reaction time tests are physically biased against the tall.
The confounding role of height may also explain why studies investigating the relationship between intelligence and nerve conduction velocity (NCV) have yielded extremely inconsistent results. Speaking of which, has anyone investigated long-term changes in NCV? Measures of human NCV have been collected since the 19th century, though the old studies may be crude.
Now to further confuse the issue, even though the Victorian sample from which scientist Francis Galton collected his reaction time data was short by modern standards, it was actually tall by 19th century standards. This is significant because attempts have been made to correct for the elite nature of Galton’s samples by adjusting for occupational status. However, even when you look at the subsets of Galton’s sample who were not elite (i.e., unskilled men aged 26+), you find they averaged 66.47 inches tall (see table 10 in this HBD Chick blog post), even though the average 19th century man was, according to one major study, 166 cm (65.35 inches).
So why were even the non-elite men in Galton’s sample about 0.43 SD taller than the British average? Perhaps because, as HBD Chick explained, Galton’s sample was not just elite but extremely self-selected, and that may have biased the sample independently of occupational status. They had to come to Galton (he didn’t come to them). These were people who were intellectually curious and literate enough to even read about Galton’s research, and motivated enough to travel (perhaps in some cases from great distances) to the museum, find Galton’s test, and pay good money to take it. And why would one want to take the test so badly unless, deep down, one had reason to believe in one’s own biological superiority? Is it really surprising that people who wanted so badly to demonstrate that they were superior actually were, and that they would be high not just on intelligence, but on its weak genetic correlates: height and reaction time? So if, even adjusting for occupation, Galton’s sample was 0.43 SD taller than other Victorians, then perhaps they were also 0.43 SD mentally faster than other Victorians of the same occupation, skewing Galton’s data.
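For readers who want to check the 0.43 SD figure, here’s a minimal back-of-the-envelope sketch in Python. Note that the 2.6-inch standard deviation for adult male height is my own ballpark assumption, not a figure from the sources above:

```python
# Reproducing the ~0.43 SD height gap discussed above.
# ASSUMPTION: adult male height SD of ~2.6 inches (not stated in the sources).

galton_unskilled_height = 66.47        # inches (HBD Chick, table 10)
victorian_average_height = 166 / 2.54  # 166 cm ≈ 65.35 inches
height_sd = 2.6                        # assumed SD of adult male height, inches

gap_sd = (galton_unskilled_height - victorian_average_height) / height_sd
print(f"{gap_sd:.2f} SD")  # ≈ 0.43
```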
But above I argued that Galton’s sample did better than modern people because they were shorter than us, and now I’m arguing they did better than other Victorians because they were taller than them. Sounds like ad hoc gibberish, and maybe it is. But remember, within generations, good genes make people both taller and mentally quicker, so tall people have faster reaction times than short people. But between generations, nutrition improves height but does not appear to improve reaction time, so shorter generations should have an unfair advantage on reaction time tests because the nerve impulse has less distance to travel.
There’s been much discussion about a paper by scholars Michael A. Woodley, Jan te Nijenhuis, and Raegan Murphy, published in the prestigious journal Intelligence, arguing that genetic IQ in Western populations has declined by about 1 standard deviation since the 19th century. Although conventional IQ tests indicate people are getting smarter, the paper argues that simple reaction time (measured in milliseconds) is better for comparing people across centuries because although it’s a very crude measure of intelligence, it’s much less sensitive to the non-genetic factors that have caused the Flynn Effect.
A blogger named HBD Chick argued the paper went wrong by using poor sampling. The oldest study the paper cites was conducted by Francis Galton circa 1889. The paper was criticized for using this study because the sample is too elite to represent Victorians (only 4% of the sample were unskilled laborers, even though 75% of Victorians were). But if the vast majority of Victorians were unskilled laborers, then I suggest we use the mean reaction time of just this occupation to represent Victorians. Yes, excluding the top 25% of occupations might bias the estimate downward, but the fact that these were unskilled workers intellectually curious enough to volunteer for Galton’s study would bias the estimate upward, so it cancels out.
An analysis of Galton’s sample (see figures 10 & 11) shows that unskilled laborers aged 14-25 averaged reaction times of 195 milliseconds in males and 190 in females (an average of 192.5).
Do we have a modern sample of similarly aged people that is equally representative of a Western population? Yes. A 1993 study by W.K. Anger and colleagues (see table 2) found that a sample of American postal, hospital, and insurance workers aged 16-25 (in three different cities) had reaction times of 260 milliseconds in males and 285 in females (an average of about 273). Just as unskilled labor was an average job in the 19th century, working for the post office, a hospital, or an insurance company seems pretty average in modern times. Thus, by subtracting 192.5 from 273, we can estimate that the average Western reaction time has slowed by 80.5 milliseconds since the 19th century. Since the standard deviation for reaction time is estimated to be 160.4 milliseconds (see section 3.2), reaction time has slowed by 0.50 standard deviations over more than a century (equivalent to a drop of about 8 IQ points). This is actually virtually identical to the effect size the paper found using far more data points, but then they statistically adjusted the effect size, making it implausibly large. So in my humble opinion, the problem with the paper was not the samples they cited, but the statistical corrections they made.
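For anyone who wants to verify the arithmetic, here’s a minimal sketch of the calculation above, using the conventional 15-point IQ standard deviation:

```python
# Back-of-the-envelope version of the reaction time comparison above.
victorian_rt = (195 + 190) / 2  # Galton's unskilled laborers aged 14-25, ms
modern_rt = (260 + 285) / 2     # Anger et al. (1993), ages 16-25, ms
rt_sd = 160.4                   # reaction time SD from the paper (section 3.2)

slowdown_ms = modern_rt - victorian_rt  # ~80 ms
slowdown_sd = slowdown_ms / rt_sd       # ~0.50 SD
iq_points = slowdown_sd * 15            # ~7.5, i.e., about 8 IQ points
print(f"{slowdown_ms} ms = {slowdown_sd:.2f} SD ≈ {iq_points:.1f} IQ points")
```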
The paper argued that since simple reaction time has a true correlation of NEGATIVE 0.54 with general intelligence, they needed to divide the effect size by 0.54 to estimate the true decline in general intelligence. The logic was that since reaction time is a very rough measure of intelligence, it underestimates the true decline in genetic intelligence. I disagree. Such inferences only make sense if you know a priori that there’s been direct selection for lower intelligence, thus dragging reaction time along for the ride, but that’s an assumption the paper was supposed to test, not rest upon. It could be the opposite: there’s been selection for slower reaction time, thus dragging intelligence along for the ride, in which case the effect size should be multiplied by 0.54, not divided. Most likely, there’s just been recent selection for more primitive traits in general, and both reaction time and genetic intelligence reflect this dysgenic effect to parallel degrees, so the change in one equals the change in the other.
To illustrate the point further, consider that height has increased by 1.5 SD since the 19th century. Height correlates only 0.2 with IQ. Would it make sense to argue that since height is such a weak proxy for intelligence, we need to divide the 1.5 SD increase by 0.2 to estimate how intelligence has changed since the 19th century? By such logic, intelligence would have increased by 7.5 standard deviations since the 19th century (equivalent to 113 IQ points)!
So clearly, the paper was unjustified in dividing the effect size by 0.54.
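To make the direction-of-correction issue concrete, here’s a minimal sketch of the three scenarios, plus the height reductio ad absurdum from above:

```python
observed_decline_sd = 0.50  # reaction time slowdown in SD units
r = 0.54                    # magnitude of the RT-g correlation

# Paper's assumption: selection acted on g, RT was dragged along -> divide.
print(observed_decline_sd / r * 15)  # ~14 IQ points

# Opposite assumption: selection acted on RT, g was dragged along -> multiply.
print(observed_decline_sd * r * 15)  # ~4 IQ points

# Parallel-decline assumption: no correction needed.
print(observed_decline_sd * 15)      # 7.5, i.e., about 8 IQ points

# The same divide-by-the-correlation logic applied to height:
print(1.5 / 0.2 * 15)  # ≈112.5, an absurd ~113-point implied IQ gain
```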
Without the adjustment, Victorians were genetically 8 points smarter than moderns, which sounds a lot more believable than 14 points.
But if Victorians had a genetic IQ of 108 (by modern standards), how could they score only 70 on the Raven IQ test? In a previous post I argued that the Raven is a culture fair IQ test and thus the low Victorian scores must be biological (Richard Lynn’s nutrition theory). Citing Richard Lynn, I also argued that malnutrition stunts non-verbal IQ (the Raven) more than verbal-numerical IQ (I estimate by a factor of 31). So if sub-optimum nutrition stunted their non-verbal IQ by 38 points, their verbal-numerical IQs would have been stunted by only 38/31 ≈ 1 point. Thus the Victorians had a verbal-numerical IQ of 107; however, because verbal tests are so culturally biased, their verbal scores would be artificially depressed by lack of schooling and exposure to mass media. But on a culture-reduced measure of verbal-numerical ability like Backwards Digit Span, they might have scored the equivalent of 107.
So with a culture fair verbal-numerical IQ (Backwards Digit Span) of 107, and a culture fair non-verbal IQ (the Raven) of 70, their overall IQs would have been 86, which is 1.5 standard deviations below their genetic IQ of 108. That 1.5 SD gap between phenotype and genotype is perfectly explained by Richard Lynn’s nutrition theory, since average height in Western countries was also 1.5 standard deviations lower in the 19th century. And the great accomplishments of Victorians can be explained by their verbal-numerical IQ of 107.
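For the curious, here’s a minimal sketch of how two subscale scores combine into a composite that is more extreme than their simple average. The 0.5 intercorrelation between the verbal and non-verbal scales is my own assumption, not a figure from the post:

```python
import math

def composite_iq(iq1, iq2, r, mean=100.0, sd=15.0):
    """Composite of two equally weighted subscales with intercorrelation r."""
    z1, z2 = (iq1 - mean) / sd, (iq2 - mean) / sd
    # The SD of a sum of two correlated z-scores is sqrt(2 + 2r).
    z_composite = (z1 + z2) / math.sqrt(2 + 2 * r)
    return mean + sd * z_composite

# ASSUMPTION: r = 0.5 between the verbal-numerical and non-verbal scales.
print(round(composite_iq(107, 70, r=0.5)))  # ~87, close to the 86 cited above
```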
In a previous post about the Flynn Effect, I discussed evidence showing that in Britain the average performance on the Raven IQ test had improved by the equivalent of about 30 IQ points over the 20th century, a figure that implies that the average Victorian, by today’s standards, had a mean IQ of only 70! In an attempt to explain this, I suggested that the average Victorian’s IQ may have been artificially depressed by low education levels; however, James Flynn did an analysis showing that only 5% of the Raven Flynn Effect can be explained by education. Add to that the fact that the Raven was designed to be culture fair, and thus not influenced by schooling, and I think we can virtually exclude education as even a partial explanation.
When studying the Flynn Effect, it’s best to focus on culture fair tests. You wouldn’t use tests that require defining words or describing similarities if you were comparing Bushmen to the British; you would use culture-reduced tests. Similarly, the 21st century British experience a very different culture from the Victorians, so they too can only be compared on culture-reduced tests.
But if the Raven is culture fair, how do I explain why a population as capable as the Victorians scored so low? According to Richard Lynn, past generations had low IQs (and smaller brains and shorter stature) because they suffered from sub-optimum nutrition (especially before birth). However, nutrition does not affect all parts of intelligence equally. Lynn cited data on identical twins showing that the twin born better nourished typically had a non-verbal IQ 5.3 to 7.1 points higher, but a verbal IQ only 0 to 0.4 points higher. In other words, prenatal nutrition affects non-verbal IQ about 31 times more than verbal IQ. In my opinion, this happens because humans evolved to live in groups, so the verbal IQ needed to speak, read, write, and exploit generations of cultural knowledge was more important than visuo-spatial abilities. Given the metabolic expense of the brain, when nutrients are scarce, evolution tends to preserve verbal-cultural abilities by sacrificing non-verbal acumen. Only when living standards skyrocket and nutrients become abundant do visuo-spatial abilities rapidly resurface, and you get a lot of kids who can slaughter their parents at video games and Rubik’s Cubes.
We don’t know what the verbal IQs of Victorians were because, unlike tests like the Raven, verbal IQ tests (with the possible exception of Digit Span, on which they would likely have done as well as us) are far too culturally biased to meaningfully compare across generations. However, if malnutrition stunted their non-verbal IQs by 30 points, we can divide this number by 31 to see that their verbal IQs were stunted by about 1 point. So even though Victorians had non-verbal IQs around 70, they had verbal IQs of 99 and thus were just as culturally sophisticated as we are.
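The factor of 31 falls out of the midpoints of the twin ranges Lynn reported, as this minimal sketch shows:

```python
# Deriving the "31 times" ratio from the twin data cited above.
nonverbal_gap = (5.3 + 7.1) / 2     # 6.2 points (better nourished twin's edge)
verbal_gap = (0.0 + 0.4) / 2        # 0.2 points
ratio = nonverbal_gap / verbal_gap  # ~31

# Applying it to the Victorians' 30-point non-verbal deficit:
verbal_stunting = 30 / ratio        # ~0.97, i.e., about 1 point
print(round(ratio), round(100 - verbal_stunting))  # 31, verbal IQ ≈ 99
```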
What was their overall IQ? It depends how you calculate the composite score, but probably more than one standard deviation below our mean, just as their brain size and height were more than one standard deviation below the modern mean. However, because of their preserved verbal IQs, they were far more culturally advanced than a modern population with an overall IQ in the 80s. The Victorians had the intellectual skills the culture values (literacy, numeracy), so if we could meet them, we might regard them as brain damaged or learning disabled, rather than globally unintelligent.
So the lesson is, even though the Raven IQ test is an excellent measure of overall intelligence in well nourished populations, it dramatically underestimates the cultural functioning of people whose brains have been stunted by sub-optimum nutrition. Still, it’s often the only culture-reduced test available for third world populations who speak little English. In such cases, it should be supplemented by a culture fair measure of verbal IQ like Backwards Digit Span, which, for the poorly nourished, better measures genotypic ability.
One of the biggest mysteries in psychology is the Flynn Effect: the fact that over the 20th century, people have been performing better and better on IQ tests. Of course, the average IQ in Western countries is by definition always about 100; however, because people keep scoring higher every decade, the tests routinely have to be made more difficult and the norms must be regularly updated. Now if this only happened on culturally loaded tests like General Knowledge and Vocabulary, we could simply conclude that the tests are just culturally biased against past generations who had less access to schooling and media. But some of the biggest gains have been found on tests like the Raven, which were explicitly designed to be culture fair.
In one study (see figure 2) the top 10% of British people born in 1877 (by definition, those with IQs above 120 for their era) performed the same on the Raven as the bottom 5% of British people born in 1967 (by definition, those with IQs below 75 for their era). In other words, performance on the Raven had increased by the equivalent of 45 points in less than a century! Of course it wasn’t a level playing field, because those born in 1877 took the test when they were a somewhat elderly 65, while those born in 1967 took the test as young, sharp 25-year-olds. However, Flynn cites longitudinal studies showing that Raven-type reasoning declines by no more than 10 points by age 65. That still leaves us with 35 points to explain.
Another source of inaccuracy was that although the test was not timed for either group, those born in 1877 took the test supervised, while those born in 1967 got to take the test home. This could potentially make a large difference: not necessarily because the unsupervised group would cheat, but because they would probably take more breaks, since they were in the comfort of their homes. They would probably return to challenging items after they had time to relax and see those items from a fresh perspective, while those who took the test supervised in some strange room were probably more likely to rush through the tasks so they could go home. I would estimate that being allowed to take a test home improves test performance by about 5 IQ points on average, though this is just a guess.
But that still leaves a huge difference of 30 IQ points. It’s important to note that the British born in 1877 probably completed no more than eight grades of schooling on average, while those born in 1967 probably averaged more than 12 years of schooling, and not attending high school may reduce IQ scores (though probably not real intelligence) by 8 points. It may seem unlikely that schooling could influence a test that seems as culture fair as the Raven, but some people argue that the Raven is actually culturally biased. Richard Lynn argues that it requires basic math skills like addition and subtraction, and believes the rise in education explains part of the adult Flynn Effect. At the very least, people with more schooling might be more likely to approach the test from a mathematical perspective or with more motivation, confidence, and persistence.
So if we subtract the 8-point schooling effect, that still leaves 22 IQ points unexplained. Is it possible that real intelligence could have improved by 1.5 standard deviations since 1877? Scientist Stephen Hsu recently noted that male height in Europe also increased by 1.5 standard deviations since 1870, and he compared this to the Flynn Effect. It may seem unlikely that a variable as genetic as intelligence could be so improved by the environment, but height is even more genetic than intelligence is, so it’s obviously quite plausible. This brings us to Richard Lynn’s nutrition hypothesis for the Flynn Effect, which claims that better nutrition has made us not only taller, but smarter too. Included in Lynn’s nutrition hypothesis is disease reduction, since diseases impede the absorption of nutrients.
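Putting the whole accounting together in one place (a minimal sketch; the 5-point take-home figure is, as noted, just my guess):

```python
# Step-by-step accounting of the 45-point Raven gap discussed above.
gap = 45   # raw gap between the 1877 and 1967 cohorts, IQ points
gap -= 10  # aging: Raven-type reasoning declines ~10 points by 65 -> 35
gap -= 5   # unsupervised take-home administration (my guess)      -> 30
gap -= 8   # ~4 fewer years of schooling                           -> 22
print(gap, round(gap / 15, 1))  # 22 points ≈ 1.5 SD left to explain
```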
How does nutrition make us smarter? Most obviously by increasing brain size, and cranial capacity may have increased an astonishing 150 cubic centimeters in the last 150 years. Of course, as Arthur Jensen noted in his book The g Factor, brain size is only moderately correlated with intelligence, so the increased brain size could only have caused less than half of the Flynn Effect. However, if nutrition has improved brain size by over 1.5 standard deviations, it likely improved other properties of the brain to the same degree; they’re just harder to measure and notice than something as visible as head size, but perhaps IQ tests are detecting them. For example, Lynn notes animal studies showing that nutrition affects the growth and number of glial cells, as well as neuronal myelination, dendrite growth, and synaptic connections, and points to human autopsy studies suggesting similar effects.
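To see why brain size alone can’t carry the whole effect, here’s a rough sketch. The 0.4 brain-size/IQ correlation is my own assumption in the spirit of Jensen’s “moderate” characterization, not a number from The g Factor itself:

```python
# How much Flynn Effect could a 1.5 SD brain size gain buy on its own?
# ASSUMPTION: brain-size/IQ correlation of ~0.4 ("moderate").
brain_gain_sd = 1.5
r_brain_iq = 0.4

expected_iq_gain = brain_gain_sd * r_brain_iq * 15
print(round(expected_iq_gain, 1))  # 9.0 points, under half of the ~22-30 point gains
```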
But if biological intelligence has increased by 1.5 standard deviations, then the average Victorian would have had a mean IQ below 80 by today’s standards. An intelligence researcher found this hard to believe given their superior scientific and literary output. In response, I made the following points:
1) The effects of nutrition may have much less impact on the parts of the brain responsible for language (literature) and academic learning. Despite the 20th century rise in schooling, the Flynn Effect appears to be much more pronounced on non-verbal tests of abstract reasoning than on tests of crystallized knowledge, vocabulary, and arithmetic. Lynn discusses the same pattern in identical twins, where one twin is born less nourished than the other. By age 15, the twins are equal on verbal-academic abilities, but the less nourished, smaller-brained twin is worse at non-verbal fluid problem solving. In my opinion this makes perfect sense. Human survival depends more on our ability to take advantage of generations of cultural knowledge (crystallized intelligence) than on our ability to figure things out for ourselves (fluid intelligence), so when malnutrition strikes, evolution would prioritize and preserve the crystallized functions of the brain at the expense of fluid abilities. And perhaps no crystallized ability is more important than language and literacy, so it’s no surprise that literary abilities would be relatively preserved in the malnourished mind.
2) By what objective standard can we conclude that 19th century humans were more intellectually accomplished? Since the 19th century, humans have invented television, put a man on the moon, grown human ears on mice, used DNA to solve crimes and map ancient human migrations, and invented the internet, iPhones, iPads, GPS systems, military drones, video games, word processors, microwave ovens, magnetic resonance brain imaging, computer animation, 3-D printing, nuclear weapons, holograms…the list goes on and on. Of course we are building on the accomplishments of those who came before us in the 19th century, but every generation builds on past generations. Extrapolating from the rate of accomplishments through most of history, I suspect that 20th century accomplishments would exceed mathematical predictions, though I have no idea how to test such a hypothesis.
In addition to scientific advances, since the 19th century humans have become less superstitious (declining religiosity), less violent, and more moral (e.g., the civil rights movement, feminism). Moral, non-violent behavior indicates intelligence. Also, popular movies and television shows have become far more sophisticated, subtle, and nuanced in the last half century.
3) The fact that average 19th century intellect was much lower does not preclude the existence of 19th century geniuses any more than the fact that average 19th century height was much lower precludes the existence of 19th century giants like John Rogan or Edouard Beaupre.