Evolution of human intelligence
The evolution of human intelligence is closely tied to the evolution of the human brain and to the origin of language. The timeline of human evolution spans approximately seven million years,[1] from the separation of the genus Pan until the emergence of behavioral modernity by 50,000 years ago. The first three million years of this timeline concern Sahelanthropus, the following two million concern Australopithecus, and the final two million span the history of the genus Homo in the Paleolithic era. Many traits of human intelligence, such as empathy, theory of mind, mourning, ritual, and the use of symbols and tools, are somewhat apparent in other great apes, although in much less sophisticated forms than in humans, as with great ape language.

History

Hominidae
The great apes (Hominidae) show some cognitive and empathic abilities. Chimpanzees can make tools and use them to acquire food and for social displays; they have mildly complex hunting strategies requiring cooperation, influence and rank; they are status conscious, manipulative and capable of deception; and they can learn to use symbols and understand aspects of human language, including some relational syntax and concepts of number and numerical sequence.[2] One common characteristic of highly intelligent species (e.g. dolphins, great apes, and humans, Homo sapiens) is an enlarged brain. These species also have a more developed neocortex, a folding of the cerebral cortex, and von Economo neurons. Such neurons are linked to social intelligence and the ability to gauge what another individual is thinking or feeling, and are also present in bottlenose dolphins.[3]

Homininae

Around 10 million years ago, the Earth's climate entered a cooler and drier phase, which led eventually to the Quaternary glaciation beginning some 2.6 million years ago. One consequence of this was that the north African tropical forest began to retreat, being replaced first by open grasslands and eventually by desert (the modern Sahara). As their environment changed from continuous forest to patches of forest separated by expanses of grassland, some primates adapted to a partly or fully ground-dwelling life, where they were exposed to predators, such as the big cats, from whom they had previously been safe. These environmental pressures caused selection to favor bipedalism: walking on hind legs. This gave the Homininae's eyes greater elevation, the ability to see approaching danger further off, and a more efficient means of locomotion.[citation needed] It also freed their arms from the task of walking and made the hands available for tasks such as gathering food. At some point the bipedal primates developed handedness, giving them the ability to pick up sticks, bones and stones and use them as weapons, or as tools for tasks such as killing smaller animals, cracking nuts, or cutting up carcasses. In other words, these primates developed the use of primitive technology. Bipedal tool-using primates of the subtribe Hominina date back as far as about 5 to 7 million years ago, with one of the earliest species being Sahelanthropus tchadensis.

From about 5 million years ago, the hominin brain began to develop rapidly in both size and differentiation of function. Brain volume increased gradually as humans progressed along the timeline of evolution (see Homininae), from about 600 cm3 in Homo habilis up to 1500 cm3 in Homo neanderthalensis, so in general there is a positive correlation between brain volume and intelligence.[4] However, modern Homo sapiens have a slightly smaller brain volume (1250 cm3) than Neanderthals, and the Flores hominids (Homo floresiensis), nicknamed "hobbits", had a cranial capacity of about 380 cm3 (considered small for a chimpanzee), about a third of that of Homo erectus. It is proposed that they evolved from H. erectus as a case of insular dwarfism. Despite a brain about a third the size of their ancestor's, the Flores hominids apparently used fire and made tools as sophisticated as those of H. erectus.
Homo

Roughly 2.4 million years ago Homo habilis appeared in East Africa: the first known human species, and the first known to make stone tools. However, disputed findings of signs of tool use from even earlier ages, and from the same vicinity as multiple Australopithecus fossils, call into question how much more intelligent H. habilis was than its predecessors. The use of tools conferred a crucial evolutionary advantage, and required a larger and more sophisticated brain to co-ordinate the fine hand movements this task demands.[5][6] Our knowledge of the complexity of the behaviour of Homo habilis is not limited to stone culture; they also made habitual therapeutic use of toothpicks.[7] A larger brain requires a larger skull, and is thus accompanied by other morphological and biological evolutionary changes. One such change was the need for a wider female birth canal for the newborn's larger skull to pass through. The solution was to give birth at an early stage of fetal development, before the skull grew too large to pass through the birth canal. Other accompanying adaptations were smaller maxillary and mandibular bones, smaller and weaker facial muscles, and a shortening and flattening of the face, resulting in modern humans' complex cognitive and linguistic capabilities as well as the ability to create facial expressions and smile.[6] Consequently, dental issues in modern humans arise from these morphological changes, exacerbated by a shift from nomadic to sedentary lifestyles.[6] Humans' increasingly sedentary lifestyle, adopted to protect their more vulnerable offspring, led them to grow even more dependent on tool-making to compete with other animals and other humans, and to rely less on body size and strength.[6] About 200,000 years ago Europe and the Middle East were colonized by Neanderthals, which went extinct by 39,000 years ago following the appearance of modern humans in the region from 40,000 to 45,000 years ago.

History of humans

In the Late Pliocene, hominins were set apart from modern great apes and other closely related organisms by the anatomical evolutionary changes resulting in bipedalism, the ability to walk upright.[8][9] Characteristics such as a supraorbital torus, or prominent brow ridge, and a flat face also make Homo erectus distinguishable. Their brain size substantially sets them apart from closely related species such as H. habilis, as seen in an increase in average cranial capacity to about 1000 cc. Compared to earlier species, H. erectus developed keels and small crests in the skull, morphological changes that supported increased brain capacity. It is believed that Homo erectus were anatomically similar to modern humans, being comparable in size, weight, bone structure, and nutritional habits. Over time, however, human intelligence developed in phases interrelated with brain physiology, cranial anatomy and morphology, and rapidly changing climates and environments.[9]

Tool-use

The study of the evolution of cognition relies on the archaeological record, made up of assemblages of material culture, particularly from the Paleolithic Period, to make inferences about our ancestors' cognition. Paleoanthropologists of the past half-century have tended to reduce stone tool artifacts to physical products of the metaphysical activity taking place in the brains of hominins.
Recently, a new approach called 4E cognition (see Models for other approaches) has been developed by cognitive archaeologists Lambros Malafouris, Thomas G. Wynn, and Karenleigh A. Overmann to move past the "internal" and "external" dichotomy by treating stone tools as objects with agency, both providing insight into hominin cognition and playing a role in the development of early hominin cognition.[10] The 4E cognition approach describes cognition as embodied, embedded, enactive, and extended, in order to understand the interconnected nature of the mind, body, and environment.[10] There are four major categories of tools created and used throughout human evolution that are associated with the corresponding evolution of the brain and intelligence. Stone tools such as flakes and cores used by Homo habilis for cracking bones to extract marrow, known as the Oldowan culture, make up the oldest major category of tools, from about 2.5 to 1.6 million years ago. The development of stone tool technology suggests that our ancestors had the ability to strike cores with precision, taking into account the force and angle of the strike, and the cognitive planning and capacity to envision a desired outcome.[11]

Acheulean culture, associated with Homo erectus, is composed of bifacial, or double-sided, hand-axes, whose production "requires more planning and skill on the part of the toolmaker; he or she would need to be aware of principles of symmetry".[11] In addition, some sites show evidence that the selection of raw materials involved travel, advanced planning, cooperation, and thus communication with other hominins.[11] The third major category of tool industry, marked by its innovation in tool-making technique and use, is the Mousterian culture. Compared to previous tool cultures, in which tools were regularly discarded after use, Mousterian tools, associated with Neanderthals, were specialized, built to last, and "formed a true toolkit".[11] The making of these tools, called the Levallois technique, involves a multi-step process which yields several tools. In combination with other data, the formation of this tool culture for hunting large mammals in groups evidences the development of speech for communication and of complex planning capabilities.[11] While previous tool cultures did not show great variation, the tools of early modern Homo sapiens are robust in the number of artifacts and the diversity of their uses. Several styles are associated with this Upper Paleolithic category, such as blades, boomerangs, atlatls (spear throwers), and bows and arrows, made from varying materials of stone, bone, teeth, and shell. Beyond practical use, some tools have been shown to have served as signifiers of status and group membership. The role of tools in social uses signals cognitive advancements such as complex language and abstract relations to things.[11]

Homo sapiens
Homo sapiens intelligence

The oldest findings of Homo sapiens, in Jebel Irhoud, Morocco, date back c. 300,000 years.[12][13] Fossils of Homo sapiens found in East Africa are c. 200,000 years old. It is unclear to what extent these early modern humans had developed language, music, religion, etc. The cognitive tradeoff hypothesis proposes that there was an evolutionary tradeoff between short-term working memory and complex language skills over the course of human evolution.[14]

According to proponents of the Toba catastrophe theory, the climate in non-tropical regions of the earth experienced a sudden freezing about 70,000 years ago, because of a huge eruption of the Toba volcano that filled the atmosphere with volcanic ash for several years. This reduced the human population to fewer than 10,000 breeding pairs in equatorial Africa, from which all modern humans are descended. Being unprepared for the sudden change in climate, the survivors were those intelligent enough to invent new tools and ways of keeping warm and finding new sources of food (for example, adapting to ocean fishing based on prior fishing skills used in lakes and streams that became frozen).[citation needed]

Around 80,000–100,000 years ago, three main lines of Homo sapiens diverged: bearers of mitochondrial haplogroup L1 (mtDNA) / A (Y-DNA) colonized Southern Africa (the ancestors of the Khoisan/Capoid peoples), bearers of haplogroup L2 (mtDNA) / B (Y-DNA) settled Central and West Africa (the ancestors of Niger–Congo and Nilo-Saharan speaking peoples), while the bearers of haplogroup L3 remained in East Africa.[citation needed] The "Great Leap Forward" leading to full behavioral modernity set in only after this separation. Rapidly increasing sophistication in tool-making and behaviour is apparent from about 80,000 years ago, and the migration out of Africa follows towards the very end of the Middle Paleolithic, some 60,000 years ago. Fully modern behaviour, including figurative art, music, self-ornamentation, trade, burial rites, etc., is evident by 30,000 years ago. The oldest unequivocal examples of prehistoric art date to this period, the Aurignacian and Gravettian periods of prehistoric Europe, such as the Venus figurines and cave paintings (Chauvet Cave), and the earliest musical instruments (the bone pipe of Geissenklösterle, Germany, dated to about 36,000 years ago).[15]

The human brain has evolved gradually over time through a series of incremental changes occurring in response to external stimuli and conditions. It is crucial to keep in mind that evolution operates within a limited framework at any given point in time. In other words, the adaptations that a species can develop are not infinite and are constrained by what has already taken place in its evolutionary timeline. Given the immense anatomical and structural complexity of the brain, its evolution (and the congruent evolution of human intelligence) can be reorganized in only a finite number of ways. The majority of such changes occur either in terms of size or in terms of developmental timeframes.[16] The cerebral cortex is divided into four lobes (frontal, parietal, occipital, and temporal), each with specific functions.
The cerebral cortex is significantly larger in humans than in any other animal and is responsible for higher thought processes such as reasoning, abstract thinking, and decision making.[17] Another characteristic that makes humans special and sets them apart from any other species is our ability to produce and understand complex, syntactic language. The cerebral cortex, particularly in the temporal, parietal, and frontal lobes, is populated with neural circuits dedicated to language. There are two main areas of the brain commonly associated with language: Wernicke's area and Broca's area. The former is responsible for the understanding of speech and the latter for the production of speech. Homologous regions have been found in other species (e.g. areas 44 and 45 have been studied in chimpanzees) but they are not as strongly related to or involved in linguistic activities as in humans.[18]

Models

Massive modularity of mind

In 2004, psychologist Satoshi Kanazawa argued that g was a domain-specific, species-typical, information-processing psychological adaptation,[19] and in 2010, Kanazawa argued that g correlated only with performance on evolutionarily unfamiliar rather than evolutionarily familiar problems, proposing what he termed the "Savanna-IQ interaction hypothesis".[20][21] In 2006, Psychological Review published a comment on Kanazawa's 2004 article by psychologists Denny Borsboom and Conor Dolan arguing that Kanazawa's conception of g was empirically unsupported and purely hypothetical, and that an evolutionary account of g must address it as a source of individual differences.[22] In response to Kanazawa's 2010 article, psychologists Scott Barry Kaufman, Colin G. DeYoung, Deirdre Reis, and Jeremy R. Gray gave 112 subjects a 70-item computerized version of the Wason selection task (a logic puzzle) in a social-relations context, as proposed by Leda Cosmides and John Tooby in The Adapted Mind,[23] and found instead that "performance on non-arbitrary, evolutionarily familiar problems is more strongly related to general intelligence than performance on arbitrary, evolutionarily novel problems".[24][25] Peter Cathcart Wason originally demonstrated that not even 10% of subjects found the correct solution to the task, and his finding has been replicated.[26][27] Psychologists Patricia Cheng, Keith Holyoak, Richard E. Nisbett, and Lindsay M.
Oliver demonstrated experimentally that subjects who have completed semester-long college courses in propositional calculus do not perform better on the Wason selection task than subjects who have not completed such courses.[28] Tooby and Cosmides originally proposed a social-relations context for the Wason selection task as part of a larger computational theory of social exchange after they began reviewing previous experiments on the task in 1983.[23] Although other experimenters had found that some contexts elicited more correct responses than others, no theoretical explanation differentiating between them was identified until Tooby and Cosmides proposed that disparities in subjects' performance on contextualized versus non-contextualized variations of the task were an artifact of the task measuring a specialized cheater-detection module.[29][30] Tooby and Cosmides later noted that whether there are evolved cognitive mechanisms for the content-blind rules of logical inference is disputed,[31][32] and consistently noted that a body of research on the Wason selection task had concluded that cognitive adaptations for social exchange are not a by-product of general-purpose reasoning mechanisms, domain-general learning mechanisms, or g.[33][34][35] Relatedly, economist Thomas Sowell has noted that numerous studies finding disparities between the mean test scores of ethnic groups on intelligence tests have found that ethnic groups with lower mean test scores have tended to perform worst on spatial, non-verbal, or abstract reasoning test items.[36][37] Writing after the completion of the Human Genome Project in 2003, psychologist Earl B. Hunt noted in 2011 that no genes related to differences in cognitive skills across various racial and ethnic groups had ever been discovered.[38] In 2012, American Psychologist published a review by Nisbett, psychologists Joshua Aronson, Clancy Blair, Diane F. Halpern, and Eric Turkheimer, economist William Dickens, and philosopher James R. Flynn of findings since the publication of the 1995 American Psychological Association report on intelligence, which concluded that almost none of the single-nucleotide polymorphisms discovered so far are consistently associated with variation in IQ in the normal range, and that adoption research on race and intelligence showed that differences could be entirely accounted for by environmental factors.[39][40] In 2021, subsequent research using polygenic scores for educational attainment and cognitive performance in African and European samples from the 1000 Genomes Project found no evidence of divergent selection by race and a statistically insignificant contribution to racial differences in IQ.[41][42] Flynn had argued earlier that the Flynn effect presented multiple paradoxes for g as a psychological trait with a heritable basis, because the increases in statistical average scores among later birth cohorts born in the 20th century occurred without sufficient increases in vocabulary size, general knowledge, and ability to solve arithmetical problems, and because the increases were so large that they would imply that the statistically average members of birth cohorts in the late 19th and early 20th centuries (the Lost Generation and the Greatest Generation) would have been intellectually disabled (as would more distant human ancestors).[43] Hunt noted that the latter paradox would imply that half of the soldiers who served in the U.S.
military during World War II would not pass the Armed Services Vocational Aptitude Battery in 2008.[44] Flynn proposed that these paradoxes could be answered by the increasing use of abstraction, logic, and scientific reasoning to address problems,[45] while Nisbett argued that the Flynn effect was largely attributable to increases in formal education among human populations during the 20th century.[46] In 2010, psychologist David Marks found through eight statistical analyses that average population IQ scores across race, time, and nationality correlated with literacy rates at between 0.79 and 0.99, leading to the conclusion that both the Flynn effect and racial differences in mean scores on intelligence tests were statistical artifacts of uncontrolled variation in literacy rates, since test performance requires literacy.[47][48] However, in reference to theoretical issues with constructivism in mathematics education and the failure of whole language in literacy education, psychologist David C. Geary and cognitive scientist Steven Pinker have noted that literacy, numeracy, and formal mathematical and logical reasoning are not psychological adaptations but biologically secondary cognitive skills (i.e. acquired characteristics) that require extensive practice after formal, explicit, and direct instruction, in contrast with natural language and number sense, since language acquisition and numerosity develop automatically and unconsciously owing to specialized neurobiological systems for language and numerical cognition that the biologically secondary cognitive skills lack.[53] Pinker has also noted that writing is not a cultural universal, since writing systems were independently invented only a few times in human history and most societies documented by ethnographers lacked writing systems,[54] while literacy rates in European countries did not begin to exceed 50 percent until the 17th century, as the movable-type printing press was not invented until the 15th century.[55] Similarly to the lack of improvement in performance on the Wason selection task by college students who take courses in propositional calculus, Pinker referenced the response by professional mathematicians and statisticians to the solution to the
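For readers unfamiliar with the Wason selection task discussed above, the sketch below illustrates its underlying logic: a conditional rule of the form "if P then Q" can be falsified only by cases showing P or not-Q, so only those cards need to be turned over. This is a minimal illustration, not code from any of the cited studies; the particular card labels and the drinking-age framing are stand-ins for the kinds of abstract and social-contract versions used in the literature.

```python
def cards_to_turn(cards, shows_p, shows_not_q):
    """Return the cards that must be checked to test a rule of the form 'if P then Q'."""
    # Only cards that could falsify the rule need turning: those showing P
    # (the hidden side might lack Q) and those showing not-Q (the hidden side might show P).
    return [c for c in cards if shows_p(c) or shows_not_q(c)]

# Abstract (evolutionarily novel) version:
# rule: "if a card has a vowel on one side, it has an even number on the other".
abstract_cards = ["A", "K", "4", "7"]
print(cards_to_turn(
    abstract_cards,
    shows_p=lambda c: c in "AEIOU",
    shows_not_q=lambda c: c.isdigit() and int(c) % 2 == 1,
))  # -> ['A', '7']; most subjects incorrectly pick 'A' and '4'

# Social-contract (evolutionarily familiar) version:
# rule: "if a person is drinking beer, they must be over 18".
social_cards = ["drinking beer", "drinking cola", "age 25", "age 16"]
print(cards_to_turn(
    social_cards,
    shows_p=lambda c: c == "drinking beer",
    shows_not_q=lambda c: c == "age 16",
))  # -> ['drinking beer', 'age 16']; most subjects answer this version correctly
```

Both framings share exactly the same logical structure, which is why the large gap in human performance between them is taken by Cosmides and Tooby as evidence for a specialized cheater-detection mechanism rather than general-purpose logical reasoning.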