History of Intelligence Theories

After money, comfort, and love, Raymond Cattell had to make one more sacrifice…

In Cattell’s (1974) autobiography, we find not only warm gratitude for his mentor (Charles Spearman) but also a taste of the kinds of personal sacrifices many have chosen to make while contributing to our field. After an idyllic childhood and a romantic courtship of his first wife, Cattell was unable to find a permanent academic position in England. After years of near poverty (and neglect from an especially driven husband), his wife, with whom he was still very much in love, left him for “more comfortable circumstances.” After the divorce and further failure to secure an academic position in Britain, Cattell (1974) considered leaving for America,

But England was deep in my bones…The personal crisis, well nigh of despair,…tested the truth of Scawen Blunt’s lines: “He who has once been happy is for aye, out of destruction’s reach.” The broken marriage and the bleak future could be met. But could I disloyally uproot myself from that which had created the fiber of my being? The die was cast one day when I received a persuasive letter from E. L. Thorndike, asking me to be a research associate with him for a year. Of course, I knew of Thorndike’s work and it seemed to me about the most imaginative and fundamental that I knew of in America…I was stirred by the privilege and the possibilities, and after three days of emotional struggle decided to go. After all, it was only for a year. It was characteristic of Thorndike’s perspective, and independence, that he had reached out to a stranger three thousand miles away, possessing no personal “pull.” He had reacted purely to what he had found in my publications. I have tried to do the same in my turn for oncoming psychologists, judging by performance, not the “old school associations.”

(p. 69)

After several temporary positions, Cattell took a position at the University of Illinois at Urbana-Champaign. He was extremely grateful to the taxpayers of Illinois that he was able to spend the next three decades pursuing any question that he deemed important to answer. He said that, for him, life began at 40. He spent his time productively, producing dozens of books and hundreds of articles:

For many years I rarely left the laboratory before 11 P.M., and then was generally so deep in thought or discussion that I could find my car only because it was the last still in the parking lot!

(p. 75)

Cattell, R. B. (1974). Raymond B. Cattell. In G. Lindzey (Ed.), A history of psychology in autobiography (Vol. 6, pp. 59–100). Englewood Cliffs, NJ: Prentice-Hall.

History of Intelligence Theories, Psychometrics, Statistics

Cronbach: Factor analysis is more like photography than chemistry.

Lee Cronbach would later achieve immortality for his methodological contributions (e.g., coefficient α, construct validity, aptitude-by-treatment interactions, and generalizability theory). His first big splash, though, was his 1949 textbook, Essentials of Psychological Testing. Last week I was reading the 1960 edition of his textbook and found this skillfully worded comparison:

“Factor analysis is in no sense comparable to the chemist’s search for elements. There is only one answer to the question: What elements make up table salt? In factor analysis there are many answers, all equally true but not equally satisfactory (Guttman, 1955). The factor analyst may be compared to the photographer trying to picture a building as revealingly as possible. Wherever he sets his camera, he will lose some information, but by a skillful choice he will be able to show a large number of important features of the building.” (p. 259)
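Cronbach’s point that factor analysis has “many answers, all equally true” can be made concrete. Any orthogonal rotation of a factor loading matrix implies exactly the same correlations among the tests, so the data cannot choose between the rotated solutions; the analyst chooses the most revealing “camera angle.” A minimal sketch (the loadings here are made-up numbers for illustration, not from any real dataset):

```python
import numpy as np

# Hypothetical loadings of four tests on two factors (illustrative only).
L = np.array([[0.8, 0.1],
              [0.7, 0.2],
              [0.2, 0.8],
              [0.1, 0.7]])

# Rotate the factor axes by 30 degrees (any orthogonal rotation works).
theta = np.pi / 6
R = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])
L_rotated = L @ R

# The model-implied covariance among the tests, L @ L.T, is unchanged
# by rotation, because R @ R.T is the identity matrix.
print(np.allclose(L @ L.T, L_rotated @ L_rotated.T))
```

Both loading matrices fit the data identically; rotation criteria such as varimax are simply conventions for picking an interpretable angle, much as a photographer picks where to stand.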

History of Intelligence Theories

Fun quote from Raymond Cattell on the importance of taxonomies

Raymond Cattell (1987, p. 61):

A taxonomy of abilities, like a taxonomy anywhere else in science, is apt to strike a certain type of impatient student as a gratuitous orgy of pedantry. Doubtless, compulsions to intellectual tidiness express themselves prematurely at times, and excessively at others, but a good descriptive taxonomy, as Darwin found in developing his theory, and as Newton found in the work of Kepler, is the mother of laws and theories.

Raymond Cattell

Raymond Cattell (1905–1998)

History of Intelligence Theories

William Stern (1871–1938): The Individual Behind the Intelligence Quotient

Lamiell (1996) notes that, if mentioned at all, Stern is known as “the IQ guy,” which in one sense is true enough. He was indeed the one who invented the formula for the intelligence quotient:

$$\text{IQ} = 100 \times \dfrac{\text{Mental Age}}{\text{Chronological Age}}$$
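Stern’s ratio quotient is simple arithmetic, and a quick sketch makes the scaling explicit (the function name is mine, not Stern’s):

```python
def ratio_iq(mental_age, chronological_age):
    """Stern's ratio IQ: mental age divided by chronological age, scaled by 100."""
    return 100 * mental_age / chronological_age

# A 10-year-old who performs like a typical 12-year-old:
print(ratio_iq(12, 10))  # 120.0
```

A child performing exactly at age level scores 100 by construction; note that modern tests report deviation IQs based on norms rather than this ratio.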

What is not typically mentioned is that he was a little embarrassed by his IQ idea and would have been happy if his name were not associated with it (Lamiell, 2003, p. 1). He wrote movingly about how IQ tests should not be used to degrade individuals (Stern, 1933, as cited in Lamiell, 2003):

Under all conditions, human beings are and remain the centers of their own psychological life and their own worth. In other words, they remain persons, even when they are studied and treated from an external perspective with respect to others’ goals….Working “on” a human being must always entail working “for” a human being….The psychotechnician has every good reason to take these considerations seriously. Because if there are places today where the term “psychotechnician” is uttered with something of a disdainful tone, that is due to the implicit or explicit belief that psychotechnicians not only intercede but interfere in the lives and rights of the individuals they deal with. The feeling is that psychotechnicians degrade persons by using them as a means to others’ ends. (pp. 54–55)

Stern wrote extensively about a wide variety of issues about intelligence, personality, individuality, and many other topics. It irked him that the IQ formula was the idea that caught on. Fortunately, scholars are beginning to remember Stern as more than just “The IQ guy.”

Stern’s Humanism & the Limits of Science

Stern used intelligence tests and other scientific approaches to understand people but also wanted to be clear about the limits of such approaches. His work did not constitute a romantic rejection of science but rather a clear-headed delineation of its proper boundaries. In arguably the first book on the psychology of individual differences, Stern (1900, as cited in Lamiell, 2003) provides this thought, which, provided suitably tasteful graphic design, should probably be made into framed posters that psychologists who cherish individuality can hang in their offices:

[E]very individual is a singularity, a one-time existing being, nowhere else and never before present. To be sure, certain law-like regularities apply to him, certain types are embodied in him, but the individual is not exhausted by these laws and types; there remains ever something more, through which the individual is distinct from others who conform to the same laws and types. And this last kernel of being, which reveals the individual to be thus and so, distinct from all others, is not expressible in the language of scientific concepts, it is unclassifiable, incommensurable. In this sense, the individual is a limiting concept, toward which theoretical investigation strives but can never reach; it is, one could say, the asymptote of science. (pp. 15-16)

If a whole poster seems a bit much, “The Individual—The Asymptote Of Science” would fit nicely on a bumper sticker.

Reading Recommendations

Selected publications and comments:

The psychological methods of testing intelligence (Stern, 1914): Besides showing Stern to be extremely sensible and practical, this book is eye-opening in showing the extremely wide variety of very basic questions that had to be answered before IQ tests could be taken seriously.

William Stern (Stern, 1930): For those wishing to acquire some intellectual humility, Stern’s autobiography might do the trick. Stern’s prose is at times dense with ideas that at first seem like gibberish but upon close inspection are seen to be quite profound.

References

Lamiell, J. T. (1996). William Stern: More than “the IQ guy.” In G. A. Kimble, C. A. Boneau, & M. Wertheimer (Eds.), Portraits of pioneers in psychology (Vol. 2, pp. 73–85). Hillsdale, NJ: Erlbaum.

Lamiell, J. T. (2003). Beyond individual and group differences: Human individuality, scientific psychology, and William Stern’s critical personalism. Thousand Oaks, CA: Sage.

Stern, W. (1900). Über Psychologie der individuellen Differenzen (Ideen zu einer “Differentiellen Psychologie”) [On the psychology of individual differences (Toward a “differential psychology”)]. Leipzig: Barth.

Stern, W. (1914). The psychological methods of testing intelligence (G. M. Whipple, Trans.). Baltimore: Warwick & York. (Original work published 1912)

Stern, W. (1930). William Stern. In C. Murchison (Ed.), A history of psychology in autobiography (Vol. 1, pp. 335–388). New York: Russell & Russell.

Stern, W. (1933). Der personale Faktor in Psychotechnik und praktischer Psychologie [The personal factor in psychotechnics and practical psychology]. Zeitschrift für angewandte Psychologie, 44, 52–63.

History of Intelligence Theories

Charles Spearman Reading Recommendations

Selected publications and comments:

“General intelligence, objectively determined and measured” (Spearman, 1904): The work that started it all. Along with the historical review, statistical analysis, and some raw data here and there, you get delicious bits of rhetoric like this:

But when we assert that the decision of Regulus to vote against making peace with Carthage was no more than a conglomeration of visual, auditory, and tactual sensations in various stages of intensity and association, then there is an undeniable risk that some precious psychical elements may have slipped through our fingers. (p. 204)

The nature of “intelligence” and the principles of cognition (Spearman, 1923): Spearman considered this book to be his most important work (Jensen, 1994). The book is easier to appreciate if you think of it as the work of Spearman the philosopher—to whom we grant the privilege of asserting things without really explaining or justifying those assertions. The ideas themselves are fascinating. The empirical justification for them mostly comes in later works.

The abilities of man: Their nature and measurement (Spearman, 1927): Most scholars consider this to be his most important work (Jensen, 1994). Although the Spearman-Brown prophecy formula holds a special place in my heart, I must agree.

C. Spearman (Spearman, 1930): Reading Spearman’s autobiography makes it hard to dislike the man. Funny, moving, and insightful.
History of Intelligence Theories

Francis Galton Reading Recommendations

Here are some sources that begin to explain the positive side of my ambivalence about Galton:

Selected publications and comments:

Hereditary genius (Galton, 1869): Galton attempts to show that talent is hereditary. The methods are crude but entertaining. Hundreds of mini-biographies, strange details, and curious asides (one of many: William Pitt’s talented niece, Lady Hester Stanhope, ended her days in Syria, dressing as a man and claiming supernatural powers). Sarah Austen is given credit for Jane’s novels.

Inquiries into human faculty and its development (Galton, 1883): This book is a romp through every weird place the human mind can go. If you like that sort of thing, you will like this very much.

“Regression towards mediocrity in hereditary stature” (Galton, 1886): It is amusing how much detail about statistics needed to be explained explicitly in 1886. The glorious scatterplot alone makes this article worth a look.

Memories of my life (Galton, 1908): Galton’s mind was sharp right up to the end of his life. Filled with anecdotes, gossip, and rich humor. Fun story about the aftermath of the Sarah/Jane Austen fiasco. Googling the quotations that Galton inserts without sourcing makes for hours of entertainment. There is much pathos here as well.
History of Intelligence Theories

The impractical, intangible, invaluable consolations of studying old, outmoded theories

When Machiavelli wrote The Prince, his stated goal was to distill useful information from history for his patron and, in a compact form, provide practical knowledge about how to govern effectively. Unfortunately, few textbook authors have Machiavelli’s talent for summary, let alone his ability to rise above his sources. Textbook presentations of grand old theories, necessary as they are, often amount to decontextualized lists, stripped of what made these works compelling in the first place. What is quickly presented is too often quickly forgotten. Of course, like C. S. Lewis’s description of the power of myths, some theories are compelling no matter how dull the presentation.

At times I too present concise summaries of historical theories of intelligence but I think that there is no substitute for reading the historical sources for oneself. Pick just one old theory of intelligence, read the original sources thoroughly, and see what happens to your thinking.

I do not wish to give the impression that reading these works is the most efficient manner of learning useful information. If the goal is simply to find the most information-dense sources of practical knowledge available, one should probably stick to contemporary works of scholarship. Alongside the conceptual breakthroughs that made certain older works noteworthy, there are lesser ideas and also (sometimes charming, sometimes distasteful) informational clutter.

A different kind of mindset and a patient, slower pace are required while reading these sources, but something hard to describe happens when they are encountered directly. The ideas we discover in our explorations of history are not only more memorable because they are embedded in personal stories and the overall narrative of the field; they become unforgettable because they work their way into our identity as psychologists.

In every culture, perhaps the most important function of history is to create a sense of group identity amongst otherwise unrelated people who need to cooperate for the common good. In the United States, for example, we tell tales about Washington, Jefferson, and other founders, and in so doing come to feel committed to the core ideas and ideals of our republic. When we come to know and care about Abraham Lincoln, Susan B. Anthony, and Martin Luther King, Jr., we refine and broaden our sense of group identity.

Although there are indeed heroes and villains from the history of our discipline, most contributors cannot so easily be categorized; there is both wisdom and folly embedded in their stories and in their ideas. When we learn who they were and what they thought, not only do we acquire useful information, but their values and their concerns also inform our own. Even their mistakes are instructive and help us acquire much-needed humility. When we know and cherish the stories of how individuals in our discipline have committed themselves to the betterment of our field, we are inspired to do likewise.

Even quirky old theories with deep flaws are worth studying. Usually, a quirky theory was developed by a quirky person with quirky concerns, and just as we value demographic diversity in the classroom, there are good reasons to value diversity of thought in the theoretical domain. With a bit of empathy, exposure to those odd concerns can often provoke creative thoughts relevant to currently overlooked problems.

These various theories, old and new, were and are rivals to some degree. But they also were and are part of a grand collaborative process of advancing our understanding of cognitive abilities. Typically, the differences between the theories are due not so much to mutually exclusive positions as to different emphases. We study diverse theories not so much to figure out which is the right one as to learn from each theory a new set of cognitive tools with which we can unlock new mysteries, both at the broad theoretical level and at the point of contact with individuals who need our help.

History of Intelligence Theories

Galton’s “ridiculous” intelligence tests

As some scholars tell the story, this so-called genius Francis Galton had some pretty stupid ideas about how to measure intelligence. Instead of measuring useful things like reasoning, knowledge, and creativity, Galton measured totally irrelevant things like visual acuity, grip strength, and reaction time! What an idiot! Anyone with the least bit of common sense would realize that the senses are not the same as intelligence. Galton, as the story goes, was blinded by his philosophical allegiance to empiricism and associationism, which emphasized the primacy of the senses in explaining human knowledge and reasoning. The world would have to wait for Alfred Binet to set things right.

As it turns out, the reason that Galton measured basic sensorimotor abilities like visual acuity, grip strength, and reaction time was that he was interested in… basic sensorimotor abilities like visual acuity, grip strength, and reaction time! He was interested in them for roughly the same reasons that medical doctors, developmental psychologists, and neuropsychologists are interested in them. For example, Galton published tables showing the ages at which sensory acuity and grip strength are at their peaks and the ages at which they tend to decline. This is useful, basic research.

In reading Galton’s publications of such findings, one will find no grand pronouncements about the nature of intelligence. Galton never claimed that basic sensorimotor abilities constituted the whole of intelligence. Nevertheless, it is true that, from time to time, he claimed to have found evidence that the most intellectually able tend to have greater sensitivity and sensory discrimination (Galton, 1883, p. 20), though this evidence was never formally presented. Galton had no intention of measuring intelligence directly for diagnostic purposes. He was much more interested in discovering the precursors of intelligence and preferred to infer intelligence itself from measures of eminence.

Many summaries of Galton’s work report that Galton’s hypotheses about sensory acuity and intelligence failed. Within the last few decades, however, Galton’s hypothesis has been revived. The correlation between sensory acuity and higher cognitive abilities is indeed positive, as Galton predicted, but the effect is much smaller than he expected (Jensen, 2006). It is unlikely that the relationship between sensory acuity and reasoning is direct (i.e., better input → better output). More likely, the overall health of the body manifests itself (inconsistently and probabilistically) in both the sense organs and the brain.

What can we learn from Galton’s “ridiculous” tests? We need to understand that there is an important difference between applied research that uses intelligence tests to forecast important life outcomes and more basic research that aims to explain the foundations of intellectual ability. Galton’s research, however crude, was of the latter type. Failing to understand this distinction leads us to scoff at perfectly reasonable research. For example, it is common to come across criticisms of research that aims to understand the relation between brain size and intelligence. The correlation between brain size and intelligence appears to be on the order of ρ = 0.3 to 0.4 (Lange, Froimowitz, Bigler, Lainhart, & Brain Development Cooperative Group, 2010). Reports of such findings bring to mind frightening visions of arbitrary bureaucrats in a future dystopia granting and denying privileges to people based on the size of their heads. No sane scholar is hoping this will happen. No scholar believes that head size per se causes higher or lower intelligence. Rather, head size is an imperfect indicator of one or more developmental processes that do have a more direct influence on intelligence. Studying head size is merely a stepping stone (it is hoped) to the discovery of such processes.

Roughly the same interpretive error occurs when it is announced that IQ tests are soon to be replaced with working memory tests because researchers have found that individual differences in working memory capacity explain much of IQ tests’ predictive validity.

Such announcements miss the point. If we care about individual differences in knowledge, reasoning, and creativity, we should measure these abilities directly. The fact that working memory plays a vital role in such abilities is useful to know, but such tests only supplement our understanding of intellectual processes in individuals. They help explain what might have gone wrong (or especially right) in an individual’s cognitive development. The correlations between working memory capacity and higher-order cognitive abilities are not nearly high enough to make traditional cognitive ability tests redundant. Research on working memory capacity (or processing speed or elementary cognitive tasks) is unlikely to ever lead to a replacement of direct measures of knowledge, reasoning, and creativity. Rather, such research helps us understand the origins of individual differences in these higher-order capacities.

History of Intelligence Theories

Our debt to Francis Galton is great…and embarrassing

Francis Galton (1822–1911) was born to privilege in a highly accomplished family in Great Britain. He was also something of a child prodigy, learning to read at age two and by early childhood aptly quoting from classic poetry and literature, often to humorous effect (Terman, 1917). Galton was not a psychologist nor was he an academic researcher. He was simply a gentleman-scholar who spent his leisure time in pursuit of any scientific question that seemed interesting to him.

There are many anecdotes in circulation about Galton’s zesty and quirky approach to life, numbers, and the female form (e.g., Murdoch, 2007, pp. 10–11). In his autobiography, Galton (1908) comes off as a rather likeable, self-effacing, and witty person, and it is easy to see how he was much admired in his day. It is not for nothing that a first-rank genius like Karl Pearson found time in his busy schedule to write a three-volume biography about him. Galton was a rebel, a rogue, a visionary, and a dynamic force—a rock star geek.

Despite all this, few of us today can express unreserved admiration for him. Although his brilliance is undeniable and his place in history secure, parts of his intellectual legacy are hard to stomach. To be fair, given the times, there was nothing unusual about his negative opinions about women, Africans, or many other groups (including Americans, whom he considered to be middling in their intellectual talents compared to the English). In fact, Galton did not have a high opinion of anyone except the most talented among us (Galton, 1869):

Every tutor knows how difficult it is to drive abstract conceptions, even of the simplest kind, into the brains of most people—how feeble and hesitating is their mental grasp—how easily their brains are mazed—how incapable they are of precision and soundness of knowledge. It often occurs to persons familiar with some scientific subject to hear men and women of mediocre gifts relate to one another what they have picked up about it from some lecture—say at the Royal Institution, where they have sat for an hour listening with delighted attention to an admirably lucid account, illustrated by experiments of the most perfect and beautiful character, in all of which they expressed themselves intensely gratified and highly instructed. It is positively painful to hear what they say. Their recollections seem to be a mere chaos of mist and misapprehension, to which some sort of shape and organization has been given by the action of their own pure fancy, altogether alien to what the lecturer intended to convey. The average mental grasp even of what is called a well-educated audience, will be found to be ludicrously small when rigorously tested. (p. 21)

On the other hand, Galton was noteworthy for going out of his way to express the bigotry of the times in scientific (and pseudoscientific) terms, providing seemingly persuasive intellectual cover for those who wished to justify ghastly acts of imperialism and genocide (e.g., proposing that the British government facilitate the colonization of Africa by Chinese immigrants, displacing the native population; Galton, 1879).

Nevertheless, Galton is rightly given credit for making important scientific advances in many fields. Among his many accomplishments, psychologists remember him primarily for his advances in statistics, behavioral genetics, and cognitive ability research.
