The central, dazzling argument this book presents is that human language isn't something we just _learn_ like we learn to tell time or bake a cake; it's actually a deeply ingrained **instinct**. Think of it like spiders spinning webs. They don't learn it in spider school, and it wasn't invented by a super-smart spider way back when. They spin webs because they have spider brains that are just _built_ to do it. Pinker suggests that humans talk and understand language in much the same fundamental sense. It's a distinct, complex, specialized skill, part of our biological makeup, that develops spontaneously in children with no conscious effort or formal teaching. This idea turns a lot of common beliefs on their head, especially views often found in the humanities and social sciences. Language is seen here not as just a cultural invention or simply a manifestation of general intelligence. While it's magnificent and unique to our species among living ones, seeing it as a biological adaptation integrates the study of humans back into the broader domain of biology, just like studying a bat's sonar or a bird's migratory navigation. Seeing language this way, as one of nature's amazing engineering feats – like the "organ of extreme perfection and complication" Darwin so admired – can give us a whole new appreciation for how remarkable this everyday ability truly is.

The sheer ability to shape events in another person's brain with exquisite precision, simply by making noises with our mouths, is one of the wonders of the natural world. It comes so naturally that we often forget just how miraculous it is. We effortlessly cause precise new combinations of ideas to arise in each other's minds. When we understand sentences, the stream of words is so transparent that we see right through to the meaning, automatically. This effortlessness, this transparency, is actually an illusion masking a system of great richness and beauty.

Back in 1871, none other than Charles Darwin articulated a similar notion, concluding that language ability is "an instinctive tendency to acquire an art". He noted that while every language has to be learned (like brewing or baking), humans have an instinctive tendency to speak, as seen in babies' babbling. No child, after all, has an instinctive tendency to brew or bake.

Fast forward to the 20th century, and the most famous proponent of language as an instinct is the linguist Noam Chomsky. Chomsky pointed out two fundamental facts about language that strongly suggest an innate basis. First, nearly every sentence we utter or understand is brand new, appearing for the first time ever. This means language can't just be a collection of learned responses; the brain must have a "recipe or program" – a mental grammar – that can build an unlimited set of sentences from a finite set of words. Second, children develop these complex grammars quickly, without formal teaching, and consistently interpret novel sentences they've never heard before. Chomsky argued that this implies children must be innately equipped with a universal plan underlying the grammars of all languages, a Universal Grammar, which helps them figure out the syntactic patterns in what they hear. Pinker's story is deeply influenced by Chomsky, but he approaches it somewhat differently.
While Chomsky has sometimes been skeptical about whether Darwinian natural selection explains the origins of this language organ, Pinker finds it fruitful to view language as an evolutionary adaptation, like the eye, with parts designed for important functions. Also, while Chomsky often uses technical linguistic analysis, Pinker aims for a more eclectic approach, drawing evidence from various fields, from genetics to psychology.

Of course, the idea that language is universal across human societies is gripping, but as skeptics like philosopher Hilary Putnam have noted, not everything universal is innate. Eating with one's hands is universal, but we don't need a special instinct for that. Language is incredibly useful, perhaps invented multiple times out of necessity, and universal grammar could just reflect universal human experiences and limits on information processing. So, the universality alone isn't absolute proof.

To truly convince us of a language instinct, Pinker takes us down the trail of evidence, particularly focusing on how children develop language. The core argument here is that complex language is universal because children **reinvent** it generation after generation. They don't do it because they are taught, or are generally smart, or because it's useful – they do it because, essentially, they can't help it.

A powerful piece of evidence comes from situations where children are exposed to very simplified language, like pidgins. Pidgins are makeshift communication systems used by adults who speak different languages. They are often fragmentary, lacking consistent grammar. But linguist Derek Bickerton has presented evidence that when children are exposed to a pidgin as their native language, they spontaneously inject grammatical complexity where none existed before, creating a brand-new, richly expressive language called a creole. This process of creolization can happen remarkably fast, sometimes in a single generation. Bickerton suggests that the grammar of creoles offers a clear look at the brain's innate grammatical machinery. He notes striking resemblances between creoles from unrelated language mixtures.

This innate grammar also seems to surface in the creative "errors" children make when learning established languages. When English-speaking children say things like _Why he is leaving?_, _Nobody don't likes me_, or _We holded the baby rabbits_, they aren't just imitating adults. They are applying rules and logic that are part of a basic grammar, rules that happen to be grammatical in many creoles. These aren't random mistakes; they show the child's mind actively constructing grammar based on an underlying design.

Recent observations of the creation of new sign languages offer stunning corroboration for Bickerton's idea in real time. Sign languages used by deaf communities are not just pantomimes or codes based on spoken language; they are full, complex languages with their own grammars. For example, American Sign Language has complex grammatical features reminiscent of Navajo and Bantu, not English. Crucially, most deaf children are born to hearing parents who may not know sign language well, if at all. These children often acquire sign language from parents who learned it incompletely, or even from crude sign systems invented by educators. Yet, deaf children convert these limited inputs into full, natural languages, showing the creolization process at work.
Simon, a deaf child exposed to a pidgin-like sign system by his hearing parents, spontaneously developed signing with grammatical features his parents' input lacked. His case is remarkable mainly because the process unfolded where psycholinguists could document it; there must be thousands of similar, unrecorded cases.

Another key argument, associated with Chomsky, is the "poverty of the input". Children learn complex grammatical rules, like how to form questions with auxiliary verbs (e.g., turning _The doggie that is eating the flower is in the garden_ into _Is the doggie that is eating the flower in the garden?_). The correct rule involves understanding the structure of the sentence and moving the auxiliary verb of the main clause. A simpler rule – just move the _first_ auxiliary verb you find – goes wrong in complex sentences like this one. Children rarely, if ever, hear sentences that would explicitly teach them that the complex, structure-sensitive rule is right and the simpler, linear rule is wrong. Yet they consistently apply the correct structure-sensitive rule. This suggests the principle that grammatical rules operate on structures, not just linear strings of words, is wired into children's minds.

The universal underlying plan of languages, with features like auxiliaries, inversion rules, nouns, verbs, subjects, objects, phrases, clauses, case, and agreement, seems to point to a commonality in human brains. It's as if isolated inventors miraculously came up with identical standards for traffic signals.

Further evidence comes from children's mastery of things like the seemingly superfluous English "-s" for the third-person singular present tense (He walk**s**). This small suffix requires keeping track of four grammatical details at once, yet preschoolers tacitly know how to use it, often even with brand-new or unusual words like _faxes_ or _wugs_, showing they are applying a rule defined over abstract categories like "verb stem" and "noun stem" rather than just memorizing word lists.
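To make the structure-dependence point concrete, here is a minimal sketch in Python (my own illustration, with a hand-built toy sentence; nothing like this appears in the book) contrasting the tempting linear rule with the structure-sensitive rule children actually obey:

```python
# Contrasting two candidate rules for forming yes/no questions from:
#   "the doggie that is eating the flower is in the garden"
# The subject noun phrase contains a relative clause with its own "is".
SUBJECT = ["the", "doggie", "that", "is", "eating", "the", "flower"]
MAIN_AUX = "is"
PREDICATE = ["in", "the", "garden"]
SENTENCE = SUBJECT + [MAIN_AUX] + PREDICATE

def linear_rule(words):
    """Move the first auxiliary in the word string to the front (the wrong rule)."""
    idx = words.index("is")                     # first "is" -- inside the relative clause
    return [words[idx]] + words[:idx] + words[idx + 1:]

def structural_rule(subject, aux, predicate):
    """Move the main clause's auxiliary to the front (the rule children follow)."""
    return [aux] + subject + predicate

print(" ".join(linear_rule(SENTENCE)))
# -> "is the doggie that eating the flower is in the garden"   (ungrammatical)
print(" ".join(structural_rule(SUBJECT, MAIN_AUX, PREDICATE)))
# -> "is the doggie that is eating the flower in the garden"   (correct)
```

Note that the correct rule can only even be stated over phrase structure: it treats the whole subject noun phrase, however long, as a single unit to skip over, which is exactly the kind of structure-sensitivity the input alone does not teach.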
If language is an instinct, it should ideally have a specific location in the brain, maybe even genes that help wire it in. While no single "language organ" or "grammar gene" has been definitively found and labeled, the search is on, and evidence points to specific brain regions. There are neurological and genetic impairments that affect language while leaving other cognitive abilities relatively intact, and vice versa. This suggests language is not just a byproduct of general intelligence. For over a century, it's been known that damage to certain areas in the left frontal lobe can cause Broca's aphasia, a syndrome primarily affecting grammar. Patients with Broca's aphasia often struggle to produce grammatically complex sentences, though their understanding of content words and overall intelligence can be preserved. Mr. Ford, a Broca's aphasia patient, had significant grammatical impairment but was alert, intelligent, and aware of his condition, with high average nonverbal intelligence. This is a linguistic deficit that doesn't make the person "stupid across the board". Studies using brain imaging techniques like PET and fMRI show that areas in the left perisylvian region, particularly Broca's area in the front, light up when people process sentence structure. This region can be considered the primary language organ. Another area, Wernicke's area, located farther back in the perisylvian region, seems more involved in storing and retrieving the sounds and meanings of words, especially nouns. Damage to Wernicke's area can cause anomia, difficulty retrieving words, even while grammar is relatively intact. There's also genetic evidence, like studies of Specific Language Impairment (SLI), which runs in families, hinting at a genetic basis for language abilities, perhaps even grammar. While some might find the idea of a "grammar gene" preposterous, especially if they see the brain as a blank slate, if there is a language instinct, its physical basis must lie in brain circuits, and brain circuits are influenced by genes.

Let's shift gears and think about **how language actually works** to convey meaning. One key principle, identified by Ferdinand de Saussure, is the **arbitrariness of the sign**. The word "dog" doesn't look, sound, or act like a dog. It means "dog" simply because every English speaker has memorized this conventional pairing of sound and meaning. This shared memorization is a huge benefit, allowing a community to convey precise information efficiently. The second major trick, captured by Wilhelm von Humboldt and echoed by Chomsky, is that language **"makes infinite use of finite media"**. We can understand the difference between "Dog bites man" and "Man bites dog" not just because of the words, but because of their order. This order, governed by a set of rules called a **generative grammar**, translates word arrangements into combinations of thoughts. This mental grammar is what allows us to produce and understand an infinite variety of sentences.

Generative grammar is fundamentally different from a simple **word-chain device**, which just predicts the next word based on the previous ones. As Chomsky showed with sentences like "Colorless green ideas sleep furiously," a sentence can be grammatically correct even if the sequence of words is statistically improbable. Actual word chains generated by probability tables sound eerie but aren't proper sentences. The mind uses a more sophisticated system based on **phrase structure**, grouping words into units like noun phrases (NP) and verb phrases (VP). These phrases are like modular components that can be plugged into different positions in a sentence according to rules. This structure is "invisible," not obvious from the linear order of words alone, but it's crucial for determining meaning. For instance, in "The cat in the hat came back," phrase structure tells us that "the cat in the hat" is a single unit acting as the subject. Ambiguities like "Groucho Marx shot an elephant in his pajamas" often arise because the same string of words can be parsed into different phrase structures, leading to different meanings.
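To see how a handful of phrase-structure rules can assign two different structures (and two different meanings) to one word string, here is a minimal sketch (a toy grammar and parser of my own, not anything from the book) that enumerates every parse a small grammar allows for the Groucho sentence:

```python
from itertools import product

# Toy grammar: each rule is (left-hand side, right-hand side).
# Binary rules combine two phrases; unary rules introduce a word.
RULES = [
    ("S",  ("NP", "VP")),
    ("NP", ("Det", "N")),
    ("NP", ("NP", "PP")),     # "an elephant in his pajamas"
    ("VP", ("V", "NP")),
    ("VP", ("VP", "PP")),     # "shot an elephant ... in his pajamas"
    ("PP", ("P", "NP")),
    ("NP", ("Groucho",)), ("V", ("shot",)), ("P", ("in",)),
    ("Det", ("an",)), ("Det", ("his",)),
    ("N", ("elephant",)), ("N", ("pajamas",)),
]

def parses(words, symbol, i, j):
    """Return every parse tree deriving the span words[i:j] from `symbol`."""
    trees = []
    if j - i == 1 and (symbol, (words[i],)) in RULES:
        trees.append((symbol, words[i]))
    for lhs, rhs in RULES:
        if lhs != symbol or len(rhs) != 2:
            continue
        left, right = rhs
        for k in range(i + 1, j):               # try every split point
            for lt, rt in product(parses(words, left, i, k),
                                  parses(words, right, k, j)):
                trees.append((symbol, lt, rt))
    return trees

sentence = "Groucho Marx shot an elephant in his pajamas".replace("Marx ", "").split()
for tree in parses(sentence, "S", 0, len(sentence)):
    print(tree)
# Exactly two trees come out: one where "in his pajamas" attaches to the verb
# phrase (Groucho wore the pajamas) and one where it attaches to the noun
# phrase (the elephant wore them) -- same words, two invisible structures.
```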
Besides the mental grammar, we also have a **mental dictionary**, or lexicon. This is where we store tens of thousands of words, each arbitrarily linked to a meaning. These memorized units are called **listemes**. Interestingly, listemes aren't just single words; they can be multi-word phrases like idioms (e.g., "kick the bucket") whose meaning can't be figured out from the individual words using grammatical rules. Idioms have to be memorized as single units of meaning, making the lexicon a repository of these "lawless" chunks, as some grammarians put it. But memorizing all these words and their arbitrary links is itself an impressive feat. A key insight here is that babies grasp words as shared, bidirectional symbols, a universal currency within a community, not just as behaviors tied to specific actors or targets.

Words themselves have internal structure too, built from smaller meaningful units called **morphemes**. Roots and affixes combine to form stems and complex words. For example, the root "Darwin" can become the stem "Darwinian" by adding a suffix, and then "Darwinianism" by adding another. Below the level of morphemes are **phonemes**, the basic units of sound. Unlike words and morphemes, phonemes don't carry meaning themselves. They combine to form morphemes and words, but the meaning of "dog" isn't a combination of the meanings of 'd', 'o', and 'g'. Phonemes are linked outward to sound production, while morphemes, words, and phrases are linked inward to meaning (mentalese). This division into two independent combinatorial systems – meaningless sounds combining into meaningful units, and meaningful units combining into larger meaningful structures – is a fundamental design feature called **duality of patterning**. Converting the discrete symbols of language into the continuous stream of speech sounds (and back again) requires complex "digital-to-analog" conversion, another marvel of the system.

When we hear or read a sentence, our mind doesn't just passively absorb it; it actively **parses** it. Parsing is the unconscious process of analyzing the sentence's structure, grouping words into phrases and identifying subjects, verbs, objects, and their relationships. This process is guided by the mental grammar, and the program in our mind that carries it out is called the **parser**. It's incredibly fast and powerful, allowing us to work out sentence structure on the fly. Parsing also helps us resolve **ambiguity**. Sometimes a single word has multiple meanings (like "bug" meaning "insect" or "surveillance device"), or a sentence can have multiple possible structures. Research suggests the brain may briefly activate multiple meanings of an ambiguous word, even unlikely ones, before selecting the one that fits the context. The demands of verbs also guide the parser, pushing it to look for the subjects and objects they require.

Understanding also relies heavily on **context**, shared knowledge, and inference. When we talk, we rely on the listener to fill in missing information and connect sentences based on mutual understanding. This cooperative principle in communication, first pointed out by the philosopher Paul Grice, assumes speakers are being informative, truthful, relevant, clear, unambiguous, brief, and orderly. These expectations help us interpret ambiguous sentences, understand incomplete utterances (like telegrams or headlines), and figure out what pronouns refer to. Legal language, with its extreme explicitness, shows what happens when you _can't_ rely on a cooperative receiver.

Now, let's tackle a really big question: **Do we think in language?** Or, as George Orwell's _Nineteen Eighty-Four_ explored with Newspeak, can a language limit our thoughts? The idea that thought _is_ the same thing as language is quite widespread, but Pinker argues it's a "conventional absurdity" – something people believe because they've vaguely heard it, even though it defies common sense. Think about it: Haven't you ever struggled to find the right words to express a thought? Or remembered the gist of something someone said, but not the exact words? How could we coin new words if thought depended entirely on existing ones? How could translation be possible if languages carved up reality in fundamentally different, untranslatable ways?
These everyday experiences suggest that there must be a "what we meant to say" that is different from what we actually said. Pinker argues that our thoughts are couched in a separate, silent medium in the brain – a **language of thought**, or **mentalese**. Evidence for this comes from people like Mr. Ford, the intelligent aphasic who lost his grammatical ability but retained non-linguistic cognitive skills. More strikingly, there are cases of deaf adults who grew up without _any_ form of language (spoken or signed) but still demonstrate complex abstract thinking, like handling money, solving problems, and entertaining with pantomimes. Einstein himself described his productive thought as a "combinatory play" of visual and muscular elements, with conventional words sought only secondarily.

The computational theory of mind, sometimes called the "physical symbol system hypothesis," provides a framework for understanding how thinking might work without relying on words or a "little man" inside the head. Alan Turing's hypothetical machine showed how a simple device with fixed reflexes could manipulate symbols according to rules and thereby perform reasoning. Mentalese, in this view, uses symbols to represent concepts and arrangements of symbols for logical relations, and it need not look anything like English or Chinese. It's likely richer in concepts than any spoken language (more concepts than words) but simpler in structure, lacking conversation-specific elements like articles or information about pronunciation and word order. Pinker suggests it's probably a universal mentalese, the same for everyone. So knowing a language is really knowing how to translate between mentalese and strings of words, and vice versa. Babies presumably have simpler dialects of mentalese, and nonhuman animals might too. Without mentalese, it's hard to even imagine how learning a language could get started.

This view also suggests that concepts like freedom and equality would remain thinkable even if the words for them were removed from a language like Newspeak, and new meanings would quickly attach to existing words. The creativity of children in reinventing grammar would likely turn Newspeak into a natural language anyway.
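As a cartoon of what "symbol manipulation without words" might mean, here is a minimal sketch (my own toy, not Pinker's or Turing's formalism) in which a fixed rule applied to arrangements of symbols yields a new conclusion, with no English anywhere in the machinery:

```python
# Mentalese-style propositions as bare symbol triples: (relation, x, y).
FACTS = {
    ("ISA", "rex", "dog"),
    ("ISA", "dog", "animal"),
}

def infer(facts):
    """Blindly apply one fixed rule, ISA(x, y) & ISA(y, z) => ISA(x, z), to a fixed point."""
    facts = set(facts)
    while True:
        new = {("ISA", a, d)
               for (_, a, b) in facts
               for (_, c, d) in facts
               if b == c and ("ISA", a, d) not in facts}
        if not new:
            return facts
        facts |= new

for fact in sorted(infer(FACTS)):
    print(fact)
# The machine derives ('ISA', 'rex', 'animal') purely from the arrangement of
# symbols -- no homunculus, and no particular human language, is involved.
```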
How did this incredible language instinct evolve? The complexity and apparent design of language – the discrete combinatorial systems of syntax and morphology, the huge lexicon, specialized brain areas, parsing algorithms, and so on – strongly suggest it's an **adaptation shaped by natural selection**. Natural selection favors traits that help organisms survive and reproduce, and language is clearly useful for conveying vital information (like dangers or how to build things) and for coordinating social interactions, both of which would enhance survival and reproduction. Chomsky, however, has been skeptical that natural selection alone explains the language organ, sometimes suggesting it might have arisen as a "concomitant" of other evolutionary pressures (like bigger brains) through physical law. Pinker disagrees, arguing that the intricate, functional design of language points specifically to natural selection, just as the design of the eye does. Reconstructing the exact steps of language evolution is challenging: what would an intermediate, partially grammatical form look like, and how would it be useful?

Derek Bickerton's idea of "protolanguage" (like pidgins or early child language) offers a possible intermediate step, a system less complex than modern language but still capable of conveying basic information. While there's a huge gap between protolanguage and full language, the idea that language evolved gradually through a series of useful intermediate forms – perhaps driven by selection favoring both speakers who could be decoded more easily and hearers who could decode better – is consistent with natural selection. The fact that systems like pidgins, child language, and aphasic speech exist and are viable, though less efficient, demonstrates a continuum of possible language systems.

Finally, let's think about what the language instinct tells us about the **nature of the human mind itself**. The existence of such a complex, specialized instinct challenges the idea that the brain is a completely general-purpose learning device, a "blank slate" shaped entirely by culture and experience. Language development looks less like general-purpose learning and more like a biological function, developing like other instincts; far from ruling out learning, innate constraints of this kind are what _enable_ the kind of flexible generalization needed to master language. This suggests the mind might be composed of various **adapted computational modules**, specialized instincts or faculties for dealing with different domains, rather than being a single, undifferentiated learning machine. Besides language, candidates for such modules, based on engineering analysis, evolutionary history, child development patterns, and neuroscience, might include intuitive mechanics, intuitive biology, number sense, spatial mapping, face recognition, and social cognition (like intuitive psychology and a sense of justice). This perspective, sometimes called the Integrated Causal Model or evolutionary psychology, doesn't discount learning or culture; it seeks to explain _how_ they work, rooted in the brain's evolved design. And crucially, claims about the language instinct and other mental modules are about the commonalities shared by _all normal people_. They don't negate the principle of human equality; focusing on these universal aspects is different from studying individual differences in ability, which is like studying differences in lung capacity instead of how lungs work.

The book leaves plenty of questions open for further exploration:

- **Can researchers keep watching new languages emerge (as with Nicaraguan Sign Language) to see which grammatical structures reliably emerge without explicit instruction?**
- **What exactly are the "symbols" and "processors" of mentalese?** The computational theory of mind offers an abstract model. Can neuroscience ever directly observe or measure these internal representations and the computations that manipulate them?
- **How does the brain's neural network physically implement grammatical rules?** The network Pinker sketches for the -s suffix is admittedly a fantasy model. What does the actual intricate wiring of millions of neurons in the perisylvian region do to compute grammar?
- **What are the ongoing debates about the evolution of language?** We touched on Pinker's view (natural selection) vs. Chomsky's skepticism. What are the latest findings and arguments in this rapidly developing field? How plausible are the different scenarios for intermediate steps?
- **How do specific brain injuries reveal the modularity of language?** We heard about Broca's aphasia and anomia. What other highly specific language deficits have been observed, and what do they tell us about how language is organized in the brain? Are there even more precise brain imaging techniques now available to map language functions?
- **Beyond language, what other cognitive abilities show signs of being specialized modules or instincts?** The book lists candidates like intuitive physics, biology, number, and social cognition. What kind of evidence (from child development, brain studies, or cross-cultural comparisons) supports the idea that these are distinct mental tools?
- **How does language actually influence thought, if we don't _think_ in language?** The book argues against radical linguistic determinism but acknowledges some influence. In what more subtle ways might the structure or vocabulary of our native language shape our perception, memory, or reasoning?
- **What is the latest understanding of how children acquire the meanings of words?** We saw that children assume words label kinds of things. Is word learning purely a matter of general intelligence and theory of mind, or are there dedicated mechanisms for it?