Better than a Tiger’s Maw

Why the Future is Looking Bright for Dullards

Sep 27, 2016, by Robert Baxter

David H. Freedman’s “The War on Stupid People” (The Atlantic) provides a compelling argument that we collectively assign great weight to the trappings of intelligence. Academic credentials, standardized test scores, a professed interest in technology, and even the right kind of physical appearance (glasses, not too athletic) all serve as easily recognizable markers of intelligence, which is to say that they indicate that a given person conforms to the stereotypes associated with this trait. And superficially, Mr. Freedman appears correct in that we almost certainly do assign people who conform to the brainy stereotype more status than we would have years prior.

To argue that this indicates that the less intellectually gifted among us are doing worse than they did in the past–that they suffer more than they would have before the most recent iteration of the technological era–or even that we genuinely value intelligence itself (rather than just the indicators of it) requires a leap in logic over a canyon of evidence that we would do well not to ignore. The issue here (and the source of my disagreement with Mr. Freedman) is the difference between the trappings of intellect and its actual presence.

That the visible signifiers of intellect–myopia and a lack of athleticism–indicate very little should surprise none of us. The first may well have once been an authentic badge of studiousness, but at present it is just as likely to be gained from interacting with a computer and engaging in tasks that may not require much in the way of intellectual rigor. As for a lack of athleticism, it too might have indicated a life of the mind decades ago. The underdeveloped body and weak eyes of the scholar–of someone who spent years on end poring over tome after tome of esoterica in long-dead tongues–may well have once been legitimate points of feeble-framed pride. Now, such a body is the default in many countries. A life free of brutal exertion (and the poor muscle tone and weak eyes to match this privileged state) indicates nothing more than that one has had a fairly normal first-world existence.

What of education? Although college attendance rates are slightly lower than they were in 2011, the share of Americans enrolled in college as of 2013–the latest year for which I was able to find reliable numbers–remains extraordinarily high, with 40.0% of 18-to-24-year-olds attending college. Certainly, this appears to indicate that intelligence and intellectual rigor are more valued than they were in dark and ignorant industrial and pre-industrial days.

Appearances can be deceiving.

While intellectual capacity does correlate to a certain extent with educational attainment, that relationship varies greatly from one major to the next, and to regard level of higher education as a proxy measure of intelligence (and the growth of higher education as a testament to the increasing importance of intelligence on a societal level) is to hazard making at least one of several errors.

Even if one does take the Flynn effect–the long-term increase in raw (unadjusted) intelligence test scores–as a sign of increasing intelligence in the general population, rather than as a sign of increased testing aptitude, we need not believe that less intelligent people are suffering discrimination, only that their offspring, like the offspring of essentially everyone else, are becoming more intelligent. This isn’t discrimination. It’s a rising tide lifting all boats. Even then, we might do well not to make too much of the Flynn effect, which has only been observed since the 1930s–when IQ tests became commonplace. Extrapolating long-term historical trends from observations made between roughly 1930 and 1980 leads to absurd results with which any sane person is likely to take issue, and considered carefully, an examination of the Flynn effect, even within its established timeframe, raises more questions than answers.

Assuming an average improvement of about three IQ points per decade since the beginning of the industrial revolution leads to the conclusion that an ordinary American living during the Civil War would, if tested on modern instruments, have an IQ of around 55–effectively incapable of performing even the simplest tasks of soldiering. Too far back, you say? Well, what if we go back a mere 86 years, to around the beginning of the modern dataset? If the rate holds, the average American living in 1930 would have had the intellectual capacity of a modern American with an IQ of slightly more than 74, which would suggest that even moderately difficult intellectual tasks would have been beyond the vast majority of the populace. Even if we allow that the Flynn effect stopped in the 1990s (and a limited amount of evidence suggests that it has begun to reverse), the better part of the American population in 1930 would be deemed quite dull by modern standards. Given the politics of 1930s America and the aggressive (and presumably politically progressive) push for eugenics, it is a wonder that any of our forebears (or at least those living in Indiana) were allowed to reproduce.
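For readers who want to check the arithmetic, here is a minimal sketch of the extrapolation; the constant three-points-per-decade rate, the modern mean of 100, and the 2016 reference year are the simplifying assumptions the argument pushes to absurdity, and the specific years are merely illustrative:

```python
# Back-dating the modern mean IQ of 100 under a constant Flynn effect
# of three points per decade (a simplifying assumption, not a fact).
GAIN_PER_DECADE = 3.0
MODERN_MEAN = 100.0
REFERENCE_YEAR = 2016  # the year of this essay

def backdated_mean_iq(year: int) -> float:
    """Estimated mean IQ of a past population, scored on modern norms."""
    decades_elapsed = (REFERENCE_YEAR - year) / 10.0
    return MODERN_MEAN - GAIN_PER_DECADE * decades_elapsed

print(backdated_mean_iq(1863))  # Civil War era: ~54
print(backdated_mean_iq(1930))  # beginning of the modern dataset: ~74
```

The absurdity lies in the outputs, not the arithmetic: taken literally, the model consigns most of our great-grandparents to the borderline range.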

On the contrary, there is some evidence that intellectual ability has decreased significantly over the last few thousand years. Gerald Crabtree, a professor at Stanford University, suggests that intelligence and emotional stability are the result of the interaction of many genes, rather than a few, and that this makes both traits relatively fragile and subject to decline without ongoing selection pressure to eliminate genetic errors. Additional research by Woodley, te Nijenhuis, and Murphy indicates that mean simple reaction time–a relatively straightforward and culturally unbiased measure of intellect–has increased in the developed world, suggesting that we process information more slowly than did our ancestors and are, in effect, somewhat less intelligent than they were. If such is the case, the argument that less intelligent people are penalized more for their dimwittedness than were premodern dullards simply does not hold water. Were the legions of duh truly facing greater disadvantages as society develops, one would expect to see selection pressure in favor of increased intellect. Of course, such a mechanism would probably not be perfect, but it would have at least some effect.
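For those curious about how a slowing in reaction time gets translated into a claim about intelligence, the sketch below shows the general form of such a conversion. Every number in it is an assumption chosen for illustration, not a figure from Woodley, te Nijenhuis, and Murphy:

```python
# Illustrative only: converting a slowing in mean simple reaction time
# (RT) into an implied change in IQ. All figures below are assumptions
# for the sake of the sketch, not data from the cited research.
RT_SLOWING_MS = 70.0      # assumed increase in mean simple RT
RT_SD_MS = 50.0           # assumed population SD of simple RT
RT_IQ_CORRELATION = 0.3   # assumed magnitude of the (negative) RT-IQ correlation
IQ_SD = 15.0              # IQ points per standard deviation

# Slower RT implies lower IQ, scaled by the assumed correlation.
implied_iq_decline = RT_IQ_CORRELATION * (RT_SLOWING_MS / RT_SD_MS) * IQ_SD
print(implied_iq_decline)  # ~6.3 IQ points under these assumptions
```

The conversion simply scales the standardized slowing by the assumed correlation between reaction time and IQ; the conclusion is only as strong as that correlation.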

These two positions are not entirely contradictory. The Flynn effect was observed over a relatively short period of time, whereas Professor Crabtree’s thesis covers thousands of years. Conceivably, both could be correct. As anyone who has carefully studied economic or marketing data can attest, trend lines are rarely perfectly smooth. Even if a pattern looks reasonably consistent over epochs, small variations will occur in short timespans that seem to run against the larger direction of movement. With some evidence of the Flynn effect having stopped or reversed, the 1930s to the 1980s may have been nothing more than a flutter in the larger pattern of degeneration. The findings of Woodley, te Nijenhuis, and Murphy, based on two datasets–the first gathered before the Flynn effect was identified and the second gathered after it appears to have stopped (and possibly begun to reverse)–may have simply skipped over the time period during which the Flynn effect was, well, in effect. Even this doesn’t entirely explain the discrepancy in results, but such might be attributed to 19th-century instrumentation error. Perhaps the older response-time recording devices ran a bit slow, causing the researchers of the era to underreport the time required for their subjects to complete a given task, rendering the data meaningless.

One could also argue that the 1930s just happened to be about the time when the intellectual capacity of Homo sapiens bottomed out–the point of peak idiocy–and that, at least intellectually, things are looking up. The early 20th century itself might have simply been highly anomalous–a few decades in which childhood nutrition was unusually bad or the presence of industrial neurotoxins (lead, namely) was unusually high–and the Flynn effect indicates nothing more than a return to the mean. Which of these hypotheses is correct (or whether all of them are incorrect) is a question not easily answered, but if nothing else, one can very sensibly argue that the evidence of increasing human intelligence is mixed.

All of this discussion of genetics and simple reaction times is a bit dry, so a hypothetical may be in order to clarify which larger pattern seems more plausible across the generations.

As an intellectual exercise, let’s consider the wages of stupidity in modern and pre-modern (pre-20th-century) times and try to answer a highly pertinent question: does the world suffer fools more or less gladly than she did in days of yore?

To keep things simple, we’ll create only two characters. Please keep in mind this is an intellectual exercise, not a proper experiment, so an n of 2 is fine. Both of our characters are from the same family and live in the same place–let’s say the long-settled American Southeast, although it could just as easily be in any number of regions. These people are separated only by time. The first, I.M. Dunce Sr., was born in 1850, supposedly of good Scottish stock, although rumor has it the name was originally (horror of horrors) Duncé. The second, I.M. Dunce IV, was born in 1985.

I.M. Dunce Sr. was a subsistence farmer who raised root vegetables, greens, apples, cotton, and hogs. He also grew tobacco as a cash crop. His wife, Jane Dunce (née Stump), worked on the farm as well, and in addition to fulfilling her agricultural and childrearing duties, she made most of the family’s clothes, largely from cloth woven by Mr. Dunce on a loom of his own construction. Now, before one attacks any of this as being even slightly over the top, I would point out that I am essentially describing the life of my maternal great-grandparents as reported to me by my grandmother, who was from the Kentucky side of Appalachia–pretty close to the Dunce family farm. Such handed-down lore isn’t infallible, but what is?

Both Mr. and Mrs. Dunce were literate, as was the majority of the white population of America in the 19th century (non-white literacy rates were somewhat lower–an unfortunate fact that does not undermine the larger argument). Although they rarely had the time to read, the Dunces had a small library that consisted of the Bible, a farmer’s almanac, a few volumes of poetry, and a copy of Common Sense that had been in the Dunce family collection since colonial days.

In a typical year, the Dunces needed to gauge the appropriate time to plant their crops, plow their fields, plant the aforementioned crops, tend to them, harvest them, raise and slaughter animals (as well as preserve the meat from these animals), and store food for the winter. They also needed to produce their own garments, avoid major infection, avoid snakebites, and maintain their property (including their house and barn).

Here are the natural penalties for failing to do any of these tasks properly:

* miscalculating when to plant crops: death by starvation

* improperly planting crops: death by starvation

* improperly harvesting or storing crops: death by starvation (a popular outcome, it seems)

* receiving an injury and not tending to it in a timely manner: death by infection

* getting bitten by a venomous snake: death by venom/infection

* failing to maintain their property: death by freezing, death by fire, or death by disease

* stepping on a nail while maintaining said property: death by tetanus (a decidedly less-than-wonderful way to expire, smiling or not)

Did I mention that the Dunces’ lives were hard, often terribly so?

Nature–at least that of the senior Dunce’s time–ran a trophy shop and passed out Darwin Awards with about as much hesitancy as modern primary schools do participation medals and martial arts academies do Best Kicker awards and yellow belts.

Now, let’s consider the life of a descendant of the elder Dunce–I.M. Dunce IV. The younger Dunce earned a bachelor’s degree from a department in a university that would rather not claim him (although it had no problem taking his student loan dollars). Young Dunce works in an office. He buys his food at a store–the same one where he buys his clothes and medicine. Although he is incapable of making anything of material worth, the younger Dunce occasionally posts videos of product reviews and political ramblings, of some of which he is quite proud. He nearly led a Twitter mob once, though over what he can no longer recall. He started college reading at the 7th-grade level–somewhat below the level required to read the elder Dunce’s King James Bible (although the younger Dunce could probably make sense of the almanac, assuming he was familiar with the concept of lunar phases)–and, like many of his college-educated peers, left college with the same (academically) virginal mind with which he entered, despite 13 hours a week of study–a truly heroic effort.

In a typical year, Mr. Dunce needs to drive to and from work, enter data and take phone calls at the office, pay bills and taxes, buy food, and periodically take his car in for maintenance. In the event his home suffers from an electrical or plumbing problem, he can call an electrician or a plumber. He does not need to sow, reap, slaughter, or preserve anything; have any knowledge of the seasons or the basics of animal husbandry; or make any of the goods he consumes.

Here are the penalties for failing to do any of these tasks properly:

* entering data and taking phone calls: losing his job, having to get on unemployment

* paying bills and taxes: disconnection of services, fines, fees, and (worst-case scenario, for taxes) prison

* buying food: having to make a late-night run to Domino’s

* taking his car in for maintenance: having to walk to a service station

I’ve saved driving for last, as the consequence for doing it improperly can be severe, although advances in safety technology have made this task easier and safer than it was in years past. (Mr. Dunce III did, after all, need to learn how to operate a manual transmission–something that even a great many of the car thieves of the young Dunce’s generation cannot manage, much to the amusement of elder purloiners everywhere.)

That the use of modern technology requires considerable intellectual effort and ability–that we must be smarter to survive in a technological world–is one of the most commonly made arguments in support of the idea that civilization now demands more of our intellects (and those of the Dunces and their mouth-breathing brethren) than it did in the good old days. Allow me to counter this in no uncertain terms.

The design of modern systems does, of course, require considerable intellect, but the usage thereof is another matter. The elder Dunce would almost certainly be confounded by his descendant’s smartphone, but would the younger Dunce be any less confounded by his forebear’s collection of farm implements and plants, each with its own requirements for care and cultivation? Complex machines do not necessarily make for complex operation, and a great many highly sophisticated machines may actually demand less of their users than did their comparatively primitive antecedents. Modern (2016) computers are several million times more powerful than the mainframes of the 1960s and 1970s, yet they are far easier to use than were their predecessors, and certain modern planes and automobiles are at least moderately automated. Such isn’t always a positive development, however, as it may well lead to dangerous overreliance on automation, and these automated systems may not be as capable as we think.

Considering all of the safety measures built into the machines we use today and the subsequent warnings and cautions attached, we must all but go out of our way to hurt ourselves (which we sometimes do). One wonders how well the modern consumer would fare on the savannah, or even in a factory built with the limited safety equipment of 50 years ago. The modern lawnmower’s blades are better than a tiger’s maw, for at least the lawnmower has an adjacent sticker reading “Do not insert hands here!” I doubt that most tigers would provide such a helpful hint.

Credentials and the intellectual aesthetic–both of these markers of intellectual development are more readily available than they were in years past. Educational credentials are expensive financially, but as for the cost in terms of intellectual effort, one may argue that they have become quite cheap. Even more advanced degrees, such as the JD, generally have lower admissions standards than they did years prior. One may argue that this system is superior to the one that came before it–one in which a great many lawyers were self-taught–yet self-study produced Abraham Lincoln, whereas a great many of our weaker law schools produce graduates who are oftentimes unable to pass the bar. This isn’t an entirely fair comparison, of course, as many of those who studied for the bar independently probably failed as well; however, the investment of community resources in such self-starters was comparatively small, and reading the law does not require the accumulation of significant debt.

All of this is to say that while signifiers of intelligence may be more socially relevant than they were in the past, the importance of intelligence itself, as enforced by social and natural selection pressures, is probably lower than at any previous time in the existence of the species. In fact, the massive increase of credentialing in our society may be due to our increasing hesitancy to evaluate people based upon their intelligence.

Furthermore, educational achievement is affected by factors unrelated to intellectual capacity, and often enough to undermine claims that credentials are reliable systems for sorting people by cognitive ability. Certainly, family socioeconomic status (SES) has some bearing on academic achievement, but even if one subscribes to the (somewhat controversial) theory of Charles Murray that family SES is a meaningful indicator of family intelligence, and one believes that intelligence is highly heritable, as was argued by Benyamin et al., good evidence for the relevance of non-intellectual personal differences in determining academic achievement cannot be readily dismissed. For instance, Duckworth and Seligman found that women do not have higher IQs than men, yet they are more likely to achieve high grades in secondary school and are more likely to earn college degrees. These researchers attribute this to differences in self-discipline, which, although a useful trait, is distinct from intelligence. This would suggest that employers may be, intentionally or not, using academic credentials to select employees at least partially on the basis of their demonstrated self-discipline, rather than their sheer intellectual capacity. If one chooses to view academic credentials with a somewhat more jaundiced eye, they may be seen as proof of willingness to comply with complex bureaucratic procedures without complaint–certificates of conformity, in effect.

The growth of credentialism may have less to do with even needing the appearance of intellectual capacity and more to do with our aforementioned growing hesitancy to rely on genuine intelligence-based assessments. In a wonderfully detailed history of the case, David Garrow notes that prior to Griggs v. Duke Power Company, general measures of intelligence were given as part of employment screenings with little legal restriction, and although such tests are still used from time to time (largely by government agencies), their application is severely limited. From a purely rational and economic standpoint, this necessitates a different sort of screening measure for employers, particularly when hiring for jobs that require little in the way of objectively measurable technical skills, and academic credentials seem to fill the bill. It should be noted, however, that Chief Justice Burger, writing for the majority, criticized “the inadequacy of broad and general testing devices as well as the infirmity of using diplomas or degrees [emphasis mine] as fixed measures of capability.” This suggests that using academic credentials as an arbitrary sorting tool–which appears commonplace–may be of questionable legality as well, although the practice seems to have withstood the test of time far better than has testing itself.

As for common college entrance tests, namely the ACT and the SAT, acting as intelligence screening tools (and thus keeping the weak-minded members of society from attending college, or at least discouraging them), there is some compelling evidence that they once may have been just that, but much (although not all) of the research suggesting this was conducted either during the late 1970s or early 1980s or used datasets gathered during that time. It is important to note that both the SAT and the ACT have been changed several times over the years, with certain components strongly correlated with intelligence, such as analogies, being dropped. Given these restructurings, both of these assessments are probably less IQ-test-like than they were even a generation ago, and certain high-IQ societies will not accept scores from either of these tests if administered within the last twenty years.

To a certain extent, information from ACT, Inc. (the developer of the ACT) agrees with the college readiness information Mr. Freedman cites regarding its competitor–the SAT. Somewhat less than 30% of those who took the ACT were found to be ready for college in all tested domains (English, mathematics, reading, and science reasoning). This may be less of an obstacle than it appears on its face. Even if one does believe that SAT (or ACT) scores are good measures of IQ and that one does need a certain IQ to attend college as an ordinary student, the matter of remedial classes and their rapid growth needs to be taken into account. With 60% of community college students and 20% of four-year students enrolling in what amount to (very expensive) high school classes, it is difficult to argue that only the reasonably qualified are gaining admission to college.

Although Mr. Freedman may not necessarily have intended to make economic inequality the centerpiece of his work, it does appear to be relevant to his argument, and inequality has grown very substantially in the United States, now approaching a level last seen in the Roaring Twenties. To take this as a sign of a developing meritocracy, regardless of how desirable or undesirable one considers such a society to be, is to demonstrate a profoundly naive understanding of the roles of credentials and credentialing institutions in our nation–both of which serve at least as much as parts of an elaborate status-signaling system as they do anything else.

Social connections, wealth, birthplace (urban, suburban, or rural), discrimination (be it officially condoned or condemned), and a willingness to unquestioningly toe the party line may well give some people advantages over others in both the labor and educational marketplaces, and our higher educational system does at least as much to formally recognize and crystallize these status differences as it does to counteract them. There is much to be said for addressing this matter by providing all Americans with educational opportunities and resources commensurate with their abilities, interests, and level of dedication, which would be distinct from the current practice of trying to convince all and sundry that they should aspire to become members of the academic, managerial, or professional class–a practice founded upon a surprisingly old-fashioned (and not particularly wise) disdain for more practically minded pursuits.

That much said, those who would have us believe that higher educational institutions are the only (or even the primary) educational resource in our society and that credentials from such institutions are the only meaningful way to demonstrate intellectual competence are either woefully rigid and uncreative in their thinking or are, worse yet, shills for an ever-growing educational bureaucracy, the existence of which is largely predicated on the argument that you don’t know it unless we say you do!

To address the issues of inequality and the undermining of the wellbeing of America’s hardest-working citizens is both noble and necessary. With this, many will agree; however, such would constitute efforts toward a meritocracy, not away from one. As it stands, Mr. Freedman’s argument gives our educational institutions and our nation both more and less credit than they deserve. The ways in which this argument over-credits us are relatively obvious, as is the potential harm: the myth of the efficiently recognized (and entitled) meritocracy allows the more credentialed members of our society to grow even smugger than they already are. They win because they are better! (Cue Teutonic song of your choice, although preferably not “99 Luftballons.”) But it affords us too little credit as well. As a society, we’ve spent tremendous resources trying to idiot-proof almost every interaction and task, from birth to retirement and everything in between (sometimes with counterintuitive results). This is no mean feat: Idiots can be surprisingly clever. Your lawnmower’s dead man’s switch almost certainly cost an engineer a weekend and an idiot a foot, and your car’s backup proximity alarm may well have cost idiot and engineer alike a great deal more than that. And thanks to advances in the fortification of food, one would almost need to try to become malnourished, whereas in the past, one needed to take substantial and frequent steps to avoid the many horrible ailments of the vitamin- and mineral-deprived.

The point of all of this is not to argue that being an idiot is not a disadvantage–it is. Rather, being an idiot is now less harmful to one’s chances of survival or the survival of one’s offspring than it would have been in any previous time in human history, and there is absolutely no reason to believe that we will not continue to further soften nature’s brutal pummelings of the cognitively challenged. Nature will remain merciless, and woe unto the fool who steps outside his (or her) foolproof environs without adequate preparation, but we’ll keep adding ever more padding until the furious beatdowns of a vicious universe feel as though they are little more than love taps.

As for the credentialing of the nation and tying that to economic success, what is to be done? If the history of secondary school is any guide, I fear that we’ll almost certainly keep lowering our already modest standards until anyone and everyone willing to agree with the ideology of the month gets a feel-good certificate–er, degree–with an ever-larger number of graduates only achieving the most basic levels of literacy (as is sometimes already the case). Until then, remind all those you hold near and dear, particularly the young, that credentials are not always tickets to the upper class.
