Sunday, 9 March 2014

What is the potential evidence AGAINST a one standard deviation decline of intelligence over the past 150-200 years?


Here is a list of some objections to, and evidence against, the assertion that average Victorian IQ would have been measured at one SD higher than moderns - that is, at a modern IQ of 115 or more.

My comments follow [in square brackets]


1. The decline of intelligence is too fast to be accounted for by known mechanisms related to differential reproductive success between the most and the least intelligent people.

[I agree: that mechanism only accounts for about half the rate of decline required to produce a 1 SD slowing in simple reaction times, and hence in intelligence. Another mechanism, or more than one extra mechanism, is required. I favour the accumulation of deleterious (intelligence-damaging) mutations generation upon generation, due to the very low child mortality rates since 1800, compared with all previous times in history.]


2. A 1 SD decline in intelligence since Victorian times would lead to a collapse of high level intellectual activity such as the number of creative geniuses and the rate of major innovations...

[I agree - it would lead to collapse...]

but this collapse has not happened - therefore there cannot have been a 1 SD decline.

[But my interpretation is that collapse has happened: the number of creative geniuses has collapsed and so has the rate of major innovations. Unless we are fooled by hype, or the self-interested self-promotion of insiders, I think this collapse is very obvious indeed across the whole of Western culture. I was writing about this collapse for many years before I came across the evidence of reducing intelligence - but I was trying to explain it in other ways such as the decline in scientific motivation, honesty, institutional factors, modern fashions, bureaucratization, Leftism etc. But the data for intellectual collapse are solid: what is in dispute are the best explanations.]


3. Intelligence has been rising, not falling, in developed countries - as evidenced by the rising average IQ test scores - a phenomenon usually called The Flynn Effect.

[I agree that average IQ test scores rose through the twentieth century - but this was a matter of rising test scores; meanwhile average intelligence was declining. In other words, test scores were subject to inflation - or more accurately stagflation: as when prices are rising but economic production is declining. IQ test scores were rising, but real intelligence was declining.]


4. The evidence of slowing simple reaction times is not valid, because measurements and sampling methods in Victorian times are too different from modern measurement and sampling methods.

[Michael A Woodley and I have argued that these micro-methodological quibbles are inappropriate and invalid - and I think we have refuted them.]


5. Simple reaction times are not a sufficiently accurate, or valid, measurement of intelligence. In fact the idea that reaction times measure intelligence is obvious nonsense, because the best fist fighters and athletes have the quickest reactions, so they would have to be the most intelligent people - but they aren't...

[Simple reaction times have nothing to do with what the general public thinks of as 'quick reactions', and nothing to do with athletics, sports, or that kind of thing. Since the mid-1800s it has been known that differences in simple reaction time - such as seeing a light flash and pressing a button - are positively correlated with differences in intelligence. The correlation is not very tight - there is a lot of scatter around the line - but there is always a correlation; and average sRT differences accurately predict measured intelligence differences between both individuals and groups such as class, sex and race. Nobody who knew the field disputed the robust correlation between sRT and IQ - and many of the main scholars (such as Jensen) have assumed that the reason for the correlation is causal: that sRT reflects speed of neural processing, which is a fundamental aspect of general intelligence. It is dishonest scientific practice to overturn more than a century of good research just because the sRT results go in a direction that you find surprising.]


6. One SD slowing in sRT does not necessarily imply a 15 point reduction in IQ.

[I agree, because IQ is not a 'real' interval scale - which means that the difference in intelligence measured by 1 IQ point is not known, and presumably varies at different points in the scale. Reaction time is, however, an interval scale - measured in milliseconds. I have therefore assumed that sRT should take priority as the most valid scale, and that IQ should be calibrated against sRT. So I argue that if sRT has slowed by about one SD, this should be understood to mean a one SD decline in real intelligence.]
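The calibration argument can be sketched numerically. This is only an illustration of the arithmetic, not a measurement: the ~70 ms total slowing is the figure used in this post, but the sRT standard deviation used below is an assumed placeholder (chosen so the slowing comes out at exactly 1 SD), not an empirical value.

```python
# Sketch: calibrating an IQ decline against simple reaction time (sRT),
# treating sRT (milliseconds) as the interval scale.
# ASSUMED_SRT_SD_MS is a placeholder, not a measured figure.

SRT_SLOWING_MS = 70.0      # total slowing since the 1880s (figure from the post)
ASSUMED_SRT_SD_MS = 70.0   # assumed population SD of sRT (placeholder)
IQ_POINTS_PER_SD = 15.0    # conventional IQ scaling: 1 SD = 15 points

# Express the slowing in SD units, then convert SD units to IQ points.
slowing_in_sd = SRT_SLOWING_MS / ASSUMED_SRT_SD_MS
implied_iq_decline = slowing_in_sd * IQ_POINTS_PER_SD

print(f"slowing = {slowing_in_sd:.2f} SD "
      f"-> implied decline of {implied_iq_decline:.0f} IQ points")
```

With a smaller assumed SD the same 70 ms slowing would imply more than 1 SD of decline; the conclusion is sensitive to that assumption.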


7. An sRT slowing of about 70 milliseconds between the 1880s and nowadays may average at about 1 IQ point per decade, but this does not necessarily imply a linear rate of decline - the rate of change may vary.

[I agree. The actual rate of decline will depend on the main causes of decline, which are not known. Indeed, if I am correct that a generation-upon-generation accumulation of deleterious, intelligence-damaging gene mutations is an important factor - the way that this works is not known. My feeling or hunch is that this kind of effect would not be linear, but that the incremental amount of damage would increase with each generation - perhaps exponentially, or by some other accelerating rate. So that if there were 2 new deleterious mutations per generation, then 4 would be more than twice as harmful as 2; and 8 would be more than twice as harmful as 4 - and so on. So the rate of decline of intelligence (and slowing of sRT) over 150 years need not be linear - but I would guess it is accelerating.]
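The contrast between the two schedules of decline can be made concrete with a toy model. Here both schedules are scaled to the same total loss (~1 SD, i.e. 15 IQ points); the generation count and the per-generation growth factor are illustrative assumptions, not estimates.

```python
# Toy model: linear decline vs. an accelerating decline in which each
# generation's increment of damage doubles (geometric growth).
# All numbers below are illustrative assumptions.

TOTAL_DECLINE = 15.0   # total IQ points lost over the period (~1 SD)
GENERATIONS = 6        # assumed number of generations since the 1880s
GROWTH = 2.0           # assumed per-generation growth of the damage increment

# Linear schedule: equal loss in every generation.
linear = [TOTAL_DECLINE / GENERATIONS] * GENERATIONS

# Accelerating schedule: increments proportional to GROWTH**g,
# rescaled so the cumulative loss matches the same total.
raw = [GROWTH ** g for g in range(GENERATIONS)]
scale = TOTAL_DECLINE / sum(raw)
accelerating = [scale * r for r in raw]

print("linear increments:      ", [round(x, 2) for x in linear])
print("accelerating increments:", [round(x, 2) for x in accelerating])
```

Under the accelerating schedule most of the loss is concentrated in the most recent generations, even though the total over the whole period is identical - which is why a roughly constant "1 IQ point per decade" average is compatible with an accelerating underlying process.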


8. There is just not enough evidence. One historical study with not very many data points is not enough to overturn the consensus from the Flynn effect studies that intelligence is rising.

[Fair point - except that the current consensus is not very secure - since confidence that rising IQ test scores really means rising 'g' (general intelligence) has never been very high. But on the other hand, the sRT historical evidence of declining intelligence is too strong to ignore. The best response is to seek further methods of confirming the decline in intelligence using different data and methods. That is what Michael A Woodley and I are doing, as best we may - but it would be great to have other people also working on the problem.]



Grey said...

#1 Iodine deficiency is fast enough e.g. if a mostly rural and small-town population that got most of its iodine from milk urbanized rapidly and got less milk in their diet (e.g. Britain).

#2 A 1 SD drop *everywhere* might, but if it dropped in northern Europe and not elsewhere then there wouldn't be a collapse - e.g. the US adding iodine to salt.

#3 The hares in northern Europe looked into the causes of why the tortoises were slower, and one answer they came up with was to add iodine - but they didn't apply it to themselves. (Also there's the question of how the tests are normed. If the tests are normed on a population suffering a decline, then that would make everyone else look like they were improving.)

Bernard Brandt said...

Dear Dr. Charlton:

With all due respect, your conjecture that the 1 SD decline in human reaction time signifies a 1 SD decline in human intelligence, and your conjecture that such declines are a result of dysgenic mutations in the human genome since 1800, still remain conjectures.

This is not meant to disparage your work. I would rather suggest that you may wish to find a means of converting your conjectures into testable, falsifiable, hypotheses.

To that end, I would suggest the following:

1) At a time when human genome analysis has become rather less expensive, it might be reasonable to exhume the bones of a cohort of graves from the year 1800 or so, and to subject them to the above-mentioned analysis;

2) It might also be reasonable to examine similar cohorts in subsequent generations, up to the present;

3) When computational speeds have increased and their concomitant expenses have gone down to a reasonable cost, it might be reasonable to compare the respective cohorts for specific changes, or mutations; and finally

4) It might be reasonable to take tests of modern children or adults, both for their reaction times, and as to their genetic profiles, to determine whether those lacking certain mutations have faster reaction times or not.
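Step 4 of this proposed programme could be piloted on simulated data before any genotyping budget is spent. A minimal sketch - where the sample size, the assumed slowing per mutation, and the noise level are all illustrative numbers, not empirical figures:

```python
# Simulated pilot of step 4: do people carrying more of the candidate
# mutations have slower simple reaction times (sRT)?
# All parameters here are illustrative assumptions.

import random
import statistics

random.seed(42)

# Simulate 200 people: a mutation count (0-10) and an sRT in milliseconds,
# with an assumed +3 ms of slowing per mutation plus Gaussian noise.
people = []
for _ in range(200):
    mutations = random.randint(0, 10)
    srt = 250 + 3.0 * mutations + random.gauss(0, 20)
    people.append((mutations, srt))

# Compare mean sRT of the low-mutation and high-mutation halves.
low = [srt for m, srt in people if m <= 5]
high = [srt for m, srt in people if m > 5]

print(f"mean sRT, <=5 mutations: {statistics.mean(low):.1f} ms")
print(f"mean sRT,  >5 mutations: {statistics.mean(high):.1f} ms")
```

A real study would of course replace the simulated mutation counts with genotype data and use a proper significance test, but a sketch like this fixes the analysis in advance of data collection.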

Alternatively, you might want to proceed on the assumption that the changes in reaction time might have an environmental cause, and attempt to examine changes in environment since 1800, conducting tests on the individual changes to determine whether the presence of any of these (e.g., food additives, fluoride in water, exposure to radio waves, etc.) might occasion changes in reaction time sufficient to explain the noted decline.

Finally, you might wish either to review the relationship between reaction time and g, or to conduct experiments to that end.

Of course, the problem of how to bell all those cats (or to put it another way, how to fund such experiments), I leave as an exercise for the professor.

With warmest regards,

Bernard Brandt

Bernard Brandt said...

Dear Dr. Charlton,

I would like to offer an additional factor which might be both falsifiable, and would tend to confirm your present conjectures:

Many people have noted that among the Ashkenazim (or Eastern European Jews), there is a 1+ SD difference in g from the general population of Europeans.

Some have suggested that this difference is due to a genetic selection for intelligence among the Ashkenazim over the last millennium or so.

But what if, instead, something in the culture or environment of East European Jews has prevented the decline in g which has been present in the general population?

Just a thought.

Bruce Charlton said...

@BB- "Just a thought. "

And a very good one!