Pinker, S. & Bloom, P. (1990). Natural language and natural selection. Behavioral and Brain Sciences 13(4): 707-784.
Many people have argued that the evolution of the human language faculty cannot be explained by Darwinian natural selection. Chomsky and Gould have suggested that language may have evolved as the by‐product of selection for other abilities or as a consequence of as‐yet unknown laws of growth and form. Others have argued that a biological specialization for grammar is incompatible with every tenet of Darwinian theory ‐‐ that it shows no genetic variation, could not exist in any intermediate forms, confers no selective advantage, and would require more evolutionary time and genomic space than is available. We examine these arguments and show that they depend on inaccurate assumptions about biology or language or both. Evolutionary theory offers clear criteria for when a trait should be attributed to natural selection: complex design for some function, and the absence of alternative processes capable of explaining such complexity. Human language meets this criterion: grammar is a complex mechanism tailored to the transmission of propositional structures through a serial interface. Autonomous and arbitrary grammatical phenomena have been offered as counterexamples to the position that language is an adaptation, but this reasoning is unsound: communication protocols depend on arbitrary conventions that are adaptive as long as they are shared. Consequently, language acquisition in the child should systematically differ from language evolution in the species and attempts to analogize them are misleading. Reviewing other arguments and data, we conclude that there is every reason to believe that a specialization for grammar evolved by a conventional neo‐Darwinian process.
Tomasello, M., & Call, J. (2018). Thirty years of great ape gestures. Animal Cognition, 1-9.
Graham, K. E., Hobaiter, C., Ounsley, J., Furuichi, T., & Byrne, R. W. (2018). Bonobo and chimpanzee gestures overlap extensively in meaning. PLoS Biology.
The link to the reading does not seem to be working ("this page could not be found"). Also, are the 2 other readings optional or mandatory? Last question: did we come to an agreement for due weeks of skywritings? Thank you in advance.
Hey, did you ever receive a response to your questions? @Amelie
I've fixed the link for P&B, but there are 21 versions accessible through Google Scholar (and probably another via Sci-Hub, and more if you search with Google). So by now you should have more experience with finding one!
I am not too sure if I understood this article, but here are my thoughts on it:
This article presents an exhaustive overview of the arguments put forward by various thinkers on whether the human capacity to develop language appeared on its own or gradually evolved in our genes through natural selection.
Those who believe that language appeared by itself would also endorse the idea that language is just a consequence of some other adaptation. However, the authors argue that although it is sensible to see it this way, it does a disservice to the theory of evolution overall. Although they do not come up with any empirical proof, they put forth the notion that language came gradually: each subsequent generation of human beings has shaped and mutated a few of the genes responsible for our language capacities. Additionally, their claim is supported by classifying communication as a tool serving either a reproductive or a survival function (communicating effectively with others saves lives).
I am still not entirely certain whether I understood the article. Feel free to add what I have missed!
Thank you so much for this response!
I have also noticed the lack of explanation about how UG is ingrained within our cognitive systems.
From my understanding, the "poverty of the stimulus" refers to the idea that a child's learning of a language cannot come from environmental inputs alone. These inputs do not tell the child how to use language correctly; children already know how to do it innately. The environment only plays a role in teaching how to use OG properly.
Josie, spot-on.
Alexei, please distinguish UG from OG (and from "language," which has many features, among them UG and OG).
Hi!
UG is not something that is learned (children never make any UG mistakes). We are simply born with it.
OG is learned by children as they grow up, through environmental inputs (unsupervised learning) and feedback from others (supervised learning).
As for languages, I believe those evolve with OG across time, but UG stays the same regardless.
Reposting for Josie
Compiling all my skywritings and just realized this one got deleted so here's a repost: Hi Alexei, I agree that the environment plays a role in learning OG but not UG, since UG cannot be learned. OG is learnable directly (through supervised and unsupervised learning) or indirectly (through instruction). In this way, children make OG errors and get corrected, so OG is continually updated. On the other hand, UG does not get updated or changed. UG is not learnable due to the "poverty of the stimulus": there are no negative examples of UG, meaning children cannot make UG errors and get corrected on them. They must already have UG innately. This reading did not separate UG from OG, and instead it lumped them together to avoid addressing the evolution of UG. This makes me wonder, is there a way to determine when UG came into the equation during the evolution of language?
The authors themselves note that this paper is "entirely conventional." So at least they are aware of its shortcomings! We know that the mechanisms behind language production have evolved over time. We know that language is complex and has allowed humans to progress far beyond other species. We know that something happens when we evolve from "show" (pantomiming) to "tell" (verbal communication). The paper describes these ideas in detail. What we don't know are the circumstances that made language possible for humans, and how this plays into Chomsky's "Universal Grammar." The idea that some innate, evolved mechanism for grammar exists within human development is a compelling intersection between evolution and cognition, one which warrants more attention than this paper paid it.
Teegan, yes, exactly. But what is UG, and how does it differ from OG? And what evolutionary problem does UG pose?
I have been trying to consolidate the difference between ordinary grammar (discussed in depth in this article) and universal grammar. UG (universal grammar) differs from OG (ordinary grammar) in that one never runs across counterexamples to UG in the world. OG can be learned: children make mistakes, are corrected, and receive feedback (through supervised learning and verbal instruction). For OG, there are examples of what is and is not in the category of OG (members and non-members).
However, UG is complex and universal (French and English UG do not differ), AND one never runs into a non-member, a phrase that does not follow the rules of universal grammar; no one commits an error of UG. It must therefore be innate. I would be very interested in learning how this language property evolved, as it is unclear to me, but the authors of this paper do not go into this.
To add onto Kayla's response, UG poses an evolutionary problem because all children follow UG rules without ever making mistakes or receiving corrective feedback. Indeed, UG cannot be learned, because children receive neither supervised nor verbal learning for it. Nonetheless, UG rules appear complex, and they are not only generalized but also standardized across languages. As such, UG must be innate (evolved), since children follow its complex rules across languages. This raises the evolutionary issue of how and why UG evolved. I was a bit disappointed that this was not at all addressed in the reading, despite its domain of interest being evolution and language. This left me feeling like the 'black box' of UG had been avoided by Pinker and Bloom.
Kayla and Amelie did a nice job of answering the question. The only thing I would add to the part about why UG poses an evolutionary problem:
We don't even know what evolutionary function UG serves! What is the practical benefit of having innate grammar rules "built in" to the human brain, rather than just learning the rules (as with OG)?
Suppose we accept the idea that UG is the result of evolution, just like any other biological trait. In that case, we could argue that UG's emergence responds to our need to schematize our sensorimotor experiences in our environment. For instance, navigating our environment requires us to distinguish singularity from plurality, pose questions (reducing ambiguity), negate (territorial defence?), etc. Such requirements are universal, given that the constraints of the physical world dictating our sensorimotor experiences are universal. Therefore, the grammatical features need to be shared in order to communicate said sensorimotor experiences (i.e., when referring to one Homo sapiens vs. a group of Homo sapiens, the difference between the two underlies the same sensorimotor differences regardless of where you are from). Depending on where you are from, the ways to express the difference between "one Homo sapiens" and "a group of Homo sapiens" may differ; but that is OG, which is learnable via supervised and unsupervised learning. On the other hand, UG perhaps refers to the underlying schema that implicitly (the hard problem: how?) differentiates our sensorimotor experiences when seeing/saying "a Homo sapiens" vs. "a group of Homo sapiens." By the same logic, we could also argue that OGs are just derivatives/variations of a common UG.
UG stresses that most languages share plenty of grammatical features. Perhaps this sharing suggests that "crude" human sensorimotor experiences with the physical world are universal and geographically independent. It makes evolutionary sense that having a universal set of schemas to communicate and categorize our sensorimotor experiences helped us survive throughout evolution.
Follow-up question:
I guess UG serves an evolutionary function and helps us survive in ways similar to certain mental heuristics. But we all know that mental heuristics can also lead to cognitive biases (e.g., the availability heuristic: though it helps us make rapid, above-chance decisions, it can lead us to errors in various situations if used mindlessly). Similarly, what counts as a universal sensorimotor experience is a matter of degree. Though UG may refer to schemas that approximately underlie humans' shared sensorimotor experiences, there are definitely exceptions. For instance, speakers of Guugu Yimithirr in North Queensland, Australia, use cardinal directional cues (north, west) instead of relative directional cues (left, right) like most parts of the world. My question is: though UG may suggest the existence of a set of universal sensorimotor schematic systems that all humans share, are statistical outliers such as the Guugu Yimithirr speakers enough to prove the non-existence of UG?
Hi Yucen! I do not believe that speakers of Guugu Yimithirr are an exception to UG. Although the cardinal directional system and the relative directional system are different, they are still both directional systems. The schema may be that simple: people have the innate capacity for a sense of direction in general, but the exact directional system is not specified. A true exception would be a language that had no terms for direction at all, whose speakers were unable to comprehend direction or come up with terms for it.
Kayla, the only UG errors encountered in the world are the ones deliberately made by Chomskian linguists, in trying to infer the rules of UG. Children never make UG errors, and the only children who might hear them are the eavesdropping children of Chomskian linguists. (And, yes, P & B say nothing about this special problem that UG poses for evolution.)
Amélie, yes, P & B ignored the evolutionary problem of UG. If they had addressed it, they would have had to say that no one has any idea how or why UG evolved.
See Week 9a's supplementary readings (short):
Harnad, S. (2008) Why and How the Problem of the Evolution of Universal Grammar (UG) is Hard. Behavioral and Brain Sciences 31: 524-525
Harnad, S. (2014) Chomsky's Universe -- L'Univers de Chomsky. À bâbord! Revue sociale et politique 52.
Teegan, yes, that’s why the problem of the evolution of language is not as straightforward as P & B suggest…
Yucen, you are speculating about what might be properties of UG, but you can have no idea, because you have not studied UG. The properties are not sensorimotor; nor are they spatial. If a paper cited Guugu Yimithirr speakers' direction cues as evidence against UG, then the paper's authors had no idea what UG was either. A whole course could be devoted to multiple decades' worth of critiques of UG based on no knowledge of UG. If your interest is serious, you should take a linguistics course on syntax.
(I have not taken a syntax course either! But I know enough to be able to detect when critiques of UG miss the mark.)
Kimberly, that’s right.
I find the idea that UG is a necessity of thought (as related in the optional reading 9a) to be a compelling alternative to the functional, evolutionary explanation of UG. There is an immediate regress, where we have to explain the evolutionary advantage of developing thought the way we did, but it seems like an analysis of our thought and why it solves the problems it does might be easier than explaining why the rules of language have some specific advantage. I don't understand the specifics of UG well enough to understand how it might be related to thought this way (via the proto-Merge / Merge / unbounded Merge that Chomsky proposed), but it certainly seems to unburden the evolutionary psychologists in a way that makes sense: language expresses our thoughts as propositions, there is regularity to the logic of thought, and so there is bound to be regularity in the expression of it.
According to last week's reading (7a), an argument to the effect that a cognitive capacity X evolved by natural selection involves 3 stages:
1. Identifying an adaptive problem and proposing an adaptation that could have evolved to solve that problem, which should include a specification of the design features of the hypothesized mechanism.
2. Testing for evidence of the hypothesized mechanism’s design features using one or more of the methods and measures already familiar to psychological researchers, such as laboratory experiments, cross-cultural studies, observational studies, etc.
3. Evaluating the degree of fit between prediction and observation and ruling out alternative hypotheses, such as competing adaptation hypotheses and incidental by-product hypotheses.
My question to you all is, do you think Pinker & Bloom succeed at each of these 3 steps? In particular, do you think they have (1) provided a compelling hypothesis as to what evolutionary problem language was meant to solve, and how (by means of what mechanisms) it solves it; (2) offered evidence that such mechanisms did (or could have) evolved, or at the very least proposed ways of empirically testing their claims; (3) successfully refuted competing explanations, especially the "side effect" / "spandrel" explanation.
Gabriel, excellent questions. We will address them in the next two weeks. All have plausible answers except the question of the origins and the adaptive advantages of UG -- and how they evolved.
This article reflects on how humans have adapted language and how words are learned either by direct experience (induction) or by dictionaries or others explaining concepts to us (instruction). I personally hadn't considered how vital a role induction could have played early on, with instruction now being our primary form of communication, as in the simulated mushroom scenario. It was interesting to consider that the only reason we developed language is that we had the motivation to, when I had always considered our evolution to be dumb luck. I felt this paper helped me critically challenge things, concepts and subjects I had taken for granted.
Helen, P & B's paper did highlight many of the evolutionary and cultural/historical aspects of language (omitting a few of the more speculative ones, such as how it began, and whether it began in the gestural or the vocal modality). But the most important omission was the special problem of the origin of UG.
I just realized I commented on the wrong article page, my bad! My comment was supposed to be for Symbol Grounding and the Origin of Language: From Show to Tell.
This paper argues that there is an evolutionary explanation for the capacity for language, because of its adaptive benefits through learning and cultural change, but it makes no distinction between UG and OG. However, I don't quite understand this quote from section 3.3: "Instead of positing that there are multiple languages, leading to the evolution of a mechanism to learn the differences among them, one might posit that there is a learning mechanism, leading to the development of multiple languages. That is, some aspects of grammar might be easily learnable from environmental inputs by cognitive processes that may have been in existence prior to the evolution of grammar." From what I understand, this argues for the existence of UG, which allows distinctive OGs of various languages to develop, and this supports the evolutionary theory of OG development. Nevertheless, the paper does not mention an origin or development theory for UG.
The paper does not make the distinction between OG and UG.
OG rules can be learned by the language-learning child through unsupervised learning, supervised learning or instruction (explanation). Humans have certain innate biases that facilitate their language acquisition. But although there is an innate mechanism in place to facilitate language acquisition, the child still needs input from which to extract which rules to follow, by getting feedback.
UG rules can only be learned by language-speaking adults (or adolescents) by instruction.
While other species have communication, they do not have language; they do not have a "universal" code. Language enables us to acquire new categories by recombining the names of different categories. Language is not just computational, because, unlike mathematics (which is), syntax cannot be independent of meaning/semantics. Humans are unique in that they have a "universal code" that allows the formation of propositions which can describe anything there is, or any fiction.
Jenny, OG, like chess, can be learned. That has no bearing on UG, with which all languages comply, and the child complies with it without errors or feedback.
Melis, adolescents can be taught the rules of UG, but their brains already "knew" them, because they have already been following them for years, error-free, ever since they began to speak.
I’m not sure what you mean by “universal code.” Any natural language can express any possible proposition. But why syntax has to be UG-compliant, no one so far knows (not even Noam Chomsky!).
This article was very extensive, and it seemed to me that the only point they were trying to make after all was that grammar is a result of natural selection, despite what some big names in this field (such as Chomsky or Fodor) may say. After reading some skywritings, especially the professor's response to Alexei, I feel like I now understand more of the logical argument as to why a part of grammar, UG, 'needs' to be innate. To me, the logical propositions shown by Stevan make a lot of sense. Furthermore, I question how this article, or perhaps the discussion of whether grammar evolved by natural selection or not, relates to the class material. Or how does it relate to the hard problem of consciousness? I am confused about that.
Language, or more specifically the capacity for language, is a central aspect of cognition, as can be readily shown by Turing's T2 test, which evaluates the lifelong ability for verbal behavior indistinguishable from that of humans. This is why distinguishing between learned aspects of language capacity (OG) and innate/evolved facets of it (UG) is interesting in cognitive science. The easy problem of cognitive science notably involves explaining how and why humans have the ability for language (which is a doing ability), which includes the ability to follow UG rules. As for the hard problem of consciousness, I'm not sure it's related directly to evolutionary issues with UG, but anyone is more than welcome to pitch in about this!
Thanks Amelie. I can now start to see how this is related to the easy problem, and to knowing whether this ability is innate; even more, if it is innate, can we teach or even learn it? This seems crucial if we ever want to build a T2-passing robot.
Adding onto Amélie's response: distinguishing between OG and UG is important for answering the Easy Problem of cognition, as the "how" and "why" of a human's capacity for behavior is connected to all language behavior, language being a cognitive capacity. While evolutionary explanations posit an answer to "why" humans have language (except for UG, as stated by my peers above), there is still the question of "how" we have these capacities, which can potentially be answered through the reverse-engineering of language behavior in the TT.
On this note, one thing I was wondering is whether we will ever be able to model UG properly when reverse-engineering our cognitive capacities in the Turing Test, or whether we can accurately model human abilities through technology such as neural networks. OG can be learned through unsupervised learning, supervised learning, or instruction (explanation), which can be modeled in neural networks by exposing a machine to a lot of input so it can make correlations through unsupervised learning, or through supervised learning models where the weights of associations are altered based on corrective feedback on the output. However, since UG is innate, is there any way it can currently be modeled to figure out the "how" of its innateness, besides coding what we know UG to be into a computer as a given, before its exposure to input? I'm not sure if this question makes sense; if not, let me know and I can try to clarify.
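The contrast between learnable OG and unlearnable UG can be put in miniature code form. Below is a toy sketch of my own (not from P & B, and the "grammar" here is just a numeric threshold rule, a deliberately crude stand-in for a grammatical category): with labeled members and non-members, a rule can be induced from corrective feedback; with positive examples only, which is the poverty-of-the-stimulus situation, many hypotheses remain consistent forever, so nothing in the data can select among them without an innate prior.

```python
# Toy illustration (not a model of real grammar): why corrective feedback
# matters. An OG-like rule can be induced from labeled examples
# (supervised learning), but with positive-only data (the UG situation,
# per the "poverty of the stimulus") several hypotheses stay consistent.

def learn_supervised(examples):
    """Infer a threshold rule 'member iff x >= t' from (x, label) pairs."""
    t = 0
    for x, label in examples:
        if label and x < t:          # false negative: lower the threshold
            t = x
        elif not label and x >= t:   # false positive: raise the threshold
            t = x + 1
    return t

def consistent_thresholds(positives, candidates):
    """With positive-only data, every threshold at or below all positive
    examples survives: no evidence ever rules the others out."""
    return [t for t in candidates if all(x >= t for x in positives)]

labeled = [(5, True), (2, False), (7, True), (3, False), (4, True)]
t = learn_supervised(labeled)   # corrective feedback converges on t == 4

survivors = consistent_thresholds([5, 7, 4], candidates=range(10))
# survivors == [0, 1, 2, 3, 4]: positive-only data leaves five rules
# equally consistent, with no way to choose among them from input alone.
```

The design point is only the asymmetry: the supervised learner is pushed to the boundary by its errors, while the positive-only learner never receives an error signal, so the extra constraint has to come from somewhere other than the data.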
Hi Darcy! I had similar thoughts when considering how UG and OG relate to the Turing test. If I am understanding your last question correctly, you are asking whether UG can ever truly be reverse-engineered, since it is hypothesized to be an innate capacity in humans. This concern is why UG poses such a problem for cognitive science and the evolutionary argument. Hypothetically, I think a machine would need UG to be able to pass the Turing test, as it would have to have the same ability as humans to acquire language quickly with limited feedback. The machine should be able to grasp any human language, as human beings also have this capacity.
Hi Kimberly and Darcy. I think you're both speaking of the difficulty of reverse-engineering UG. Personally, I don't think it would be any different from reverse-engineering any other innate process. For example, memory is innate: we remember things but never learn how to do that. Obviously this is not to say it's easy to reverse-engineer memory, but I don't think that UG is distinct in its ability to be understood just because it is innate. Perhaps a model could be created that is able to simulate the way language evolved, and then we could see to what extent UG is necessary to that process, or how it develops (if at all). Maybe UG would evolve independently, or maybe it would need to be preprogrammed in order to simulate any language evolution.
Amélie, bravo (and thanks for being my T.A., answering Vitoria's question!).
Darcy, good question: We learn to detect the features of animals, but the feature-detectors are innate for colors (we just have to learn color names: colors already "pop out" in the rainbow). Well, UG requires feature-detectors to recognize what is and is not UG-compliant, but it also requires "feature-producers" to produce UG-compliant speech. These are "mirror capacities," as in phoneme perception and production. They can be innate, as they are for imitation or emotion perception/production. Either way, learned or innate, they can be reverse-engineered for T-testing to solve the Easy Problem.
But there is no special evolutionary problem for evolving color-detectors or phoneme-detectors or emotion perception/production mirror capacity, whether learned or innate.
Evolving UG feature detectors and producers is a much bigger challenge: What was their adaptive advantage? And how did they evolve? If the adaptive advantage of language was the power to produce any proposition, it is not clear why a much simpler (and learnable) syntax like 2nd Order Predicate Calculus could not have done the job, without UG. (See Chomsky’s hunch in Chomsky’s Universe, Week 9a.)
Kimberly, the problem for reverse-engineering is not that a capacity is innate and evolved. With UG the problem is that no one knows why UG was needed to be able to produce every proposition, and if so, how it would have evolved.
Sophie, you’re right, but UG is still a special problem: What is it for? Could it evolve gradually? How? 5% UG, 10% UG….? And if not gradually, then how could it be an all-or-none mutation? Nor is it credible that it is a side-effect (“spandrel”) of some other capacity: Which capacity? And what was its adaptive advantage? and how and why did it evolve?
Trying to answer what UG is for led me to question: does UG have to be ingrained within language? Considering its universality, I would have to take a superficial stance (based on my small deposits of knowledge on the topic) and say yes, which further leads me to the conclusion that there is no potential for a language to exist without UG. To say this is to say that maybe the 'innateness' of language in our brains, as Chomsky describes it, really comes from the mechanisms built into the construction of language itself that make it so, and we have just evolved to understand language based on these mechanisms.
DeleteOr, I could take a different stance and say that since it was our brains that developed language over time, we built its rules around what seemed 'natural' to us aka around innate pre-existing structures/ideas that make it seem like language itself is innate, but in fact, it is just built on structures that are themselves innate.
This reading was particularly interesting as it posed a very well-thought-out view (at least in my opinion) that contradicted one I had previously learned about in many classes: Chomsky's view that language is an innate human property and not the result of natural selection. Pinker and Bloom refute the claim that grammar does not come from natural selection, which is what I had previously believed as well. I too believed that, due to the lack of genetic variation or intermediate forms, grammar was indeed universal and not related to Darwinian evolution. What this paper has made clear to me is that grammar is complex and meant to communicate complicated propositions, and that evolutionary theory provides criteria for attributing traits to natural selection. These criteria include complex design for a function (which fits grammar's purpose of allowing complex communication) and the lack of alternative mechanisms to explain the complexity, which grammar evidently fits. I had also failed to previously think of how a child's language acquisition differs from the language's overall evolution in the species, as well as how language evolves conventionally so long as it is shared between members of a species. By discussing these points, it is clear to me that language evolution does demonstrate elements of natural selection. A further point that refutes the initial counterexample of grammar lacking a selective advantage: throughout history, as language use has evolved, those who did not follow the newer "norms" or ways of putting together propositions could have been outcast, demonstrating that there is indeed a selective preference in language evolution as well.
Karina, not so fast! How did P & B explain how and why UG evolved? They spoke about "language" in general, both its learned and evolved aspects – except UG (which they did not distinguish from OG). Please look at the other replies in this and other threads for Weeks 8a,b and 9a,b.
As Dr. Harnad and the authors of this long piece have mentioned, such a discussion is a mere convention rather than any groundbreaking step in cognitive science. I was confused as to why the authors would refer to language as "complex computational systems employing the same basic kinds of rules," as this oversimplifies our in-class discussion about language being much more than rule-based symbol manipulation, and about its having to stem from sensorimotor interaction between symbols and their referents in the real world. What's more, in their paper the authors refute certain theories about the development of our language capacities, like the possibility that language evolved as a side effect of "overall brain size" or of "constraints of as-yet unknown laws of structure and growth." We can all agree that adding more neurons to an existing circuit cannot create language capacities on its own. I believe language is not a spandrel, because it is a completely functional capacity: it facilitates second-hand learning in a way that makes our lifespans that much more efficient. Indeed, in our communities we have the gift of cooperating through language, which spares us time and energy, allowing us not only to focus on more important things but also to interest others and consider their own states, all of which endow evolutionary and competitive advantage.
Tess, you will see the problem more clearly if you don't think of it as the problem of the evolution of "language" (which has many features, learned and innate), but as the evolution of UG in particular (distinguishing it from OG, which is learnable, learned, and not a problem).
I found this reading to be quite dry and repetitive. Many of Pinker and Bloom's replies to non-selectionist theories seemed to be very trivial responses to the effect that "we have no reason to believe this over selectionism," which, although it may be true, does not provide a very solid foundation on which to build a case for natural selection. Overall, and as discussed in the second reading, the most compelling case for language is that its advent provided a huge opportunity to make learning largely more efficient and thus survival more viable. I am surprised Pinker did not touch on this much; he seemed more intent on disproving non-selectionist supporters. His weak conclusion is that language has a complex function and design within our cognitive capacities such that it could only be explained by natural selection.
Laura, P & B were begging the only nontrivial question when it comes to evolution and language: the evolution of UG. Please read the other comments and replies. P & B were doing what is called "breaking down open doors": talking about the features of language that are not a problem for evolution, and ignoring the one feature that is: UG.
In this article, Pinker and Bloom argue against the nonselectionist arguments, which state that the evolutionary framework is not suited to explaining the phenomenon of language. Instead, they argue that language, since it has a complex design serving a function and no alternative explanations for its complexity, can only be explained by Darwinian theory. One argument they counter is the claim that the mind is a single general-purpose computer. The authors answer by arguing that there will most likely never exist a program that can acquire language. This is because, according to this paper, the types of generalizations needed to acquire grammar are not compatible with the generalizations used to acquire other systems of knowledge from examples. While I am not entirely sure what this part of the paper means, it seems relevant to our discussion of Turing-testing, even at the language level of the T2 computer.
Mathilda, see the other comments and replies.
In this paper Pinker and Bloom make the case that while the surface presentation of the grammar we consciously understand (OG) is variable and subject to cultural variation, the universal grammar proposed by Chomsky, whose theory proposes that all human language has shared constraints and structure that govern OG, could only have arisen from evolution. This is a solid argument, and I don't see any apparent flaws in Pinker's objections to Chomsky's logic. Indeed, if all humans' linguistic behaviour is reducible to some invariable substrate, then there is no way that could have come about except through evolution.
However, it seems to me that this paper only really works if you accept prima facie that Chomsky is right about universal grammar, just not about how it arose. And ultimately, it is hard to prove that universal grammar is a 'real thing' when we are studying diffuse cognitive architecture. Universal grammar is pretty easily disproven in its complex form (Chomsky himself only supports recursion).
So I wonder whether we could formulate another coherent way of understanding the evolution of language (which seems indisputable) without relying on a contentious universal grammar. Perhaps it is enough to say that language evolved (which P&B show well) and work backwards from there.
As many before me have mentioned, the focus of this article was to defend grammar specialization by neo-Darwinian processes. As also mentioned by many others and the professor, the main problem with this paper is the omission of Universal Grammar (UG). It speaks of language's evolution, but hardly of where UG evolved from. As language is a crucial part of cognition, this would be a difficult approach if we had only OG to rely on when considering the Turing Test.
Interesting puzzle for the whole class: How come GPT-3 (which certainly can't pass T2) speaks/writes UG-compliantly?
From my understanding, GPT-3 composes sentences by selecting the "best" next word as it goes, based on the previous words and all the words across the internet and books that it was immersed in. I think its vast exposure to sentences used on the internet and in texts allows GPT-3 to estimate the probability of what word might occur next in a sentence, so it can accurately predict the correct word. Just from being exposed to language, GPT-3 can speak/write UG-compliantly. It learns UG-compliant sentences through unsupervised learning, whereas humans know UG innately and cannot learn/change it. Although GPT-3 can compose UG-compliant sentences, it does not do so using the rules of UG; it does not have the same innate mechanism that humans share. UG has not been reverse-engineered in GPT-3, and we do not know how or why UG evolved.
To answer "How come GPT-3 (which certainly can't pass T2) speaks/writes UG-compliantly?":
GPT-3 is able to write UG-compliantly because it is trained to write OG-compliantly. GPT-3 is trained on 45TB of text content created by humans (who innately always write and speak UG-compliantly) to generate its own written content. Because all language follows the rules of UG, and GPT-3 was trained on an immense amount of content that follows those rules, GPT-3 is able to write UG-compliantly. As Josie said, even though GPT-3 is able to compose UG-compliant content, it does not do so using the rules of UG.
GPT-3 is interesting because, unlike its predecessors, it is far more capable of generating text in languages other than English (German, Russian, Japanese, etc.). Because of its greater flexibility in handling multiple languages, it would be interesting to see whether GPT-3 could be used to find shared features among all the languages it is exposed to. If GPT-3 can find shared features across a massive multilingual dataset, it may provide hints about the rules of UG.
Josie and Sophearah: Sophearah, you seem to contradict yourself in saying that GPT-3 can't learn UG from just reading UG-compliant sentences in one language, but maybe could do so from reading UG-compliant sentences in many languages. How? It still has the poverty of the stimulus (positive examples only): no UG-violating sentences in any language, hence no corrections.
What GPT-3 might do, though, if you give it texts in multiple languages, is produce typical second-language-learner errors, some of which might violate not just OG but UG in one of the languages. But GPT-3 does not get corrections, so the errors would never get fixed.
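The next-word-prediction mechanism described above can be illustrated with a toy bigram model (a hypothetical, vastly simplified sketch; GPT-3 itself uses a large neural network trained on 45TB of text, not raw word-pair counts over a tiny corpus):

```python
from collections import Counter, defaultdict

# Toy sketch of statistical next-word prediction. The tiny corpus and the
# bigram (word-pair) counting are illustrative stand-ins for GPT-3's
# training data and neural network.
corpus = "the cat sat on the mat . the cat ate the fish .".split()

# Count how often each word follows each preceding word.
following = defaultdict(Counter)
for prev, nxt in zip(corpus, corpus[1:]):
    following[prev][nxt] += 1

def predict_next(word):
    """Return the word that most often followed `word` in the corpus."""
    return following[word].most_common(1)[0][0]

print(predict_next("the"))  # "cat": the most frequent continuation of "the"
```

Such a model only mimics the regularities already present in its training text: it produces UG-compliant strings only because every string it was trained on was UG-compliant, which is exactly the point made in the replies above.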
While I find this reading interesting in the sense that Pinker and Bloom have approached the evolution of language from a different perspective, I am surprised that they only briefly mention Chomsky's poverty-of-the-stimulus argument and therefore how universal grammar evolved. Since this paper refutes many claims that language is not the product of natural selection, I would honestly expect more engagement with the poverty-of-the-stimulus argument. The paper is therefore missing arguments about some of the most important theories of language, and I would have preferred to know what these authors think about this topic and how they would have argued against these concepts.
The authors seem to argue that language evolved through natural selection in the same way other human biological features evolved: because they were advantageous. The authors reject the view that language evolved only as a side effect of natural selection because, apart from universal grammar (UG), which seems to be innate, the different aspects of language, such as speech and auditory features, are characteristics that have genetically evolved over time. UG is different because, as previously said, children seem never to make UG mistakes, unlike OG mistakes. That is surprising if we consider that they never learned UG nor received feedback on it. Universal grammar is found in every language and cannot be learned or unlearned. One thing I am wondering is why there is not one universal language for the human species, instead of the many different languages spoken around the world. Is there an evolutionary advantage to that?
I was also wondering why there is no universal language. However, after consideration, I would assume the answer has to do with efficiency. Specifically, perhaps the origin of the various existing languages can be traced back to a survival instinct. In this sense, it is easier to communicate with those in the same area, or with kin, in a language that fits their needs (e.g., protection of resources, words that fit the environment, location of threats). Hence, humans simply cannot function with a single universal language, or else their survival tactics would be jeopardized.
Another theory I have on why there is no one universal language is the notion that "evolution is lazy." On this view, evolution will never go off in a direction that merely enhances a current scenario (here, converging on one language); evolution will instead accept a viable solution that simply works (having various languages).
Maira, you’re right, it’s because evolution is lazy. Evolving the capacity to learn language rather than building in one specific language for all would be both costly and inflexible. (Same thing applies to everything else for which lazy evolution found it cheaper and easier to evolve the capacity to learn it, rather than building it all in.)
This is definitely an interesting discussion. I too was pondering the question of why there isn't a universal language while there is a universal grammar. Evolution being lazy makes sense as an answer, and makes me think of other instances where evolution is lazy because the alternative would be too inflexible. This paper gave an interesting rundown of, and arguments on, language as an advantageous product of natural selection and UG as an innate ability.
As my peers have said above, Pinker and Bloom argue in favour of language/grammar being an evolved function. In the last part of the paper, they outline arguments that have tried to refute this. I found their response addressing the reproductive advantages of language quite enlightening. It is not obvious how having a recursive grammar is beneficial evolutionarily; cavemen speaking in complex sentences would not thereby be helped in hunting for food or finding a mate. As Pinker and Bloom point out, this is a bit of a "Granny argument": just because it does not seem like cave-people needed grammar does not mean it did not evolve. Evolution does not work on a small scale like that; it occurs over many generations and is contingent on certain probabilities. They further discuss the role of grammar in technology and socialization, outlining why it is important for the 'survival of the fittest.'
Sophie, yes, but P & B still skirt the only nontrivial question: How and why did UG evolve?
As Pinker illustrates, humans discovered their ability to acquire information about the world second-hand, through other individuals who have gone through the trial-and-error process to win that knowledge. Language, therefore, is probably the most important factor that contributed to the development and success of human beings. Since we could teach our young and others in our group our ways of living, no knowledge was lost, and the new techniques of hunting, tool-making, campsites, etc., were passed on from generation to generation. As this knowledge grew, so too did the information being passed on, and we were able to continuously learn better and more nuanced ways to survive.
Alexander, those are adaptive advantages of language in general -- but what about UG in particular?
This reading argues for the theory that our use of language and grammar was developed through evolution primarily by refuting arguments against the theory. I do agree that an innate feature such as universal grammar that greatly aids in survival would be a product of evolution. I was most interested by the discussion of the development of more complex features of grammar, as it helped me realize how complex grammar must be to convey even basic concepts and resolve common ambiguities. Furthermore, the resulting 'cognitive arms race' helped explain why our grammar is perhaps more advanced than would be strictly necessary for survival.
Elena, how does language with UG help survival better than language with just OG?
I think that with UG, children are somehow prepared for language, since there is already a device in their heads that tells them innately what the rules of languages should look like, and thus the structure of languages will be familiar to them to some extent (the language acquisition device). The presence of UG makes language acquisition possible despite the poverty of the stimulus, where children acquire their first language without any errors heard, and it lets humans acquire a completely foreign language by immersion. Without UG, children would have to try to learn their first language by supervised/unsupervised learning, which is much more costly than with UG. It is actually possible that many people would not learn any language in their lives if there were no UG, and communication, and thus survival, would be harder. Also, the only way to learn another language would be by learning all of its OG rules, and learning by immersion would not be possible, which would also increase the cost of inter-group communication. This too would hinder survival.
I would agree with Han's response. I think the primary benefit of UG is that it makes language learning more efficient. It also makes miscommunication less likely, as it restricts the possible interpretations of a string of words in a particular order.
The benefits of UG are clear and have been described by Han above. Without UG, humans would not have language capacities that are even close to what we have, because we would have to learn all the rules of language for all cases (super inefficient and probably impossible). While we don't know exactly what provided the biological basis for the emergence of UG in the human brain, I still think it is safe to assume that there is an evolutionary explanation. Human beings' language capacities are the greatest differentiator between us and other species, and are probably more than anything else responsible for the domination of our species on this planet.
UG is an innate ability that cannot be learned through: supervised learning (trial-and-error with corrective feedback), hearsay (subject-predicate explanations), or unsupervised learning (exposure without feedback), since children never make UG errors nor hear others make them. This lack of exposure to UG errors is referred to as 'the poverty of the stimulus'. Thus, UG is a genetic, evolved ability, and there is no clear explanation for how/why it evolved.
Pinker and Bloom suggest that language is no more special than any other evolved trait. I like this argument because I think it's silly to privilege language above other doing capacities. For example, a friend of mine is in a 500-level biology course on animal communication this semester and told me yesterday that prairie dogs can recognize different coloured shirts on humans and communicate this with their community. If prairie dogs have evolved to have rudimentary language (i.e., gestures and sounds from one prairie dog are messages understood by another), then surely language is no more special than any other adaptive trait selected for because of its survival and reproductive utility.
This discussion made me initially think that a T4-passing mechanism (i.e., neural indistinguishability) would be needed to explain how/why UG evolved, however, we would run into Fodor's functional localization problem: even if we knew the genetic seat of UG, we could still not explain how it works nor why it evolved. I'm looking forward to discussing UG's evolutionary origins further over the coming weeks to see if we can shed any light on this mystery.
Polly, a language is a communicative code for conveying information, but not every communicative code for conveying information is a language (unless you are using the word "language" very loosely!). A language is a communicative code that can convey any and every proposition. Prairie dogs (whom I respect and admire) are brilliant, but they do not have a communicative code that can convey any and every proposition. (And their code does not have, or need, OG, let alone UG!)
A reverse-engineered mechanism that can pass T4 can also pass T3 and T2, so Fodor's problem is irrelevant, whether or not we know how to map the T3 capacity (how-and-why) onto the where-and-when of the neural function. We have still provided a causal explanation of T3 capacity.
The paper presents some good arguments for WHY language is the result of evolution. It is a very complex thing to analyze logically: the language capacity has many components; some are explained by biological/evolutionary causal mechanisms, and others require exposure to language to be learned. UG is special because it faces the poverty-of-the-stimulus problem; it seems to be an "innate" part of language.
Monica, yes, but the problem is not just that UG is unlearnable, hence innate, but: how is it evolvable?
This paper's general argument is that human language evolved by natural selection. Human language evolved because of its evolutionary advantages, partly through culture and learning, partly through genetic evolution. The paper's aim is to show that there is no contradiction between language abilities and evolutionary principles. As we've discussed, we can essentially distinguish two types of grammar. On one hand there is ordinary grammar (OG), which is learned by children at their youngest age. It differs from language to language. It consists essentially of the rules we learn as children and define as "grammar." On the other hand, there is universal grammar (UG), which is not learnable by children. They can't make mistakes with UG and learn from them. It is just innately there. Furthermore, UG does not evolve. Thus, it poses a problem for the evolutionary explanation of language capacities. This reading does not distinguish which language features are UG and which are OG; it treats language as a whole.
Ines, UG poses a problem for evolutionary explanation not because it does not evolve but because it is there: How (and why) did it evolve?
From this paper, and from reading other skywritings and replies above, my understanding is that universal grammar is an innate mechanism that helps us acquire language, which is rule-based. The motivating question here is how humans have the capacity for language-learning, and what makes language-learning so special. Universal grammar cannot be learned by unsupervised learning, because it is not solely a matter of algorithmic mechanisms identifying patterns in language and stimuli. Nor can it be learned by supervised learning, because that would require corrective feedback and trial and error. If no errors are being made, then UG must be innate, which is why a child cannot learn UG by instruction, and therefore must already know it. Ordinary grammar refers to the specific rules of grammar that a child learns, which are distinct in each language. This is what P & B's paper fails to address: how UG evolved. Instead, it treats OG and UG in unison when referring to language.
I think that an explanation of how UG evolved, despite adding a deeper sense of understanding, is not strictly necessary for the authors to make the point this paper makes. However, solving the origin of UG is definitely going to involve linguistic, anthropological and biological evidence. If we are to believe in Darwin's theory of evolution, it stands to reason that natural selection must have dictated at some point in human history that a certain set of communication rules was advantageous. The question, however, is how this was inherited and passed down genetically, since Chomsky's claim is that it is not learned but innate, and so far we have yet to find a neural mechanism that resembles this.
Brandon, yup, that's why explaining the evolution of UG is problematic (which P & B never mentioned when they explained how explaining the evolution of "language" is unproblematic).
I'd like to take another crack at this. The reason why it is hard to explain the evolution of UG is the following: UG seems to be an essential part of language, but nobody knows why. We understand grammar as being the rules for language, and these rules vary between languages. If we're going to have variation in rules, we need to have language first, and only then can we establish variation in its rules. We have very strong evidence that every spoken language complies with UG. And as humans, we are born with the natural constraints imposed by UG. In class, we discussed the difference between illegal and legal sentences under UG. That a child learning a language will never utter a sentence that is illegal under UG (because children never make UG mistakes, no correction is required, and they already know it innately) is what we refer to as the poverty-of-the-stimulus argument. And so UG, unlike OG, is a set of rules that we follow but that can't be learned.
Sara, good summary. The only thing to add is that without (innate) UG, language could not be learned, and hence language presumably could not have evolved. But the details are not yet clear to anyone.
The notion that grammar in language is "good design," as evidenced by the universal linguistic facts provided in the reading, such as lexical categories, phrase-structure rules and wh-movement, makes sense in my opinion, especially in regard to what I learned in my linguistics class last year. Apparently, variation in language occurs for many reasons, but there is some evidence that environmental adaptation plays somewhat of a role; for example, in Inuit languages, there are many words for snow, because being able to classify snow in different ways is useful for hunting, tools and survival. Since adaptation is the root of natural selection, it would make sense that language and its universal grammar rules are a product of evolution.
Brandon, see (by the author of the skyreading for Week 9b):
Pullum, G. K. (1989). The great Eskimo vocabulary hoax. Natural Language & Linguistic Theory, 275-281.
(though that story is not entirely over yet...)
While the paper presented a comprehensive evolutionary explanation for language, it omitted the origin of universal grammar. Other features of the capacity for language, like ordinary grammar, are explained by their adaptive function, both through the physical specifications that enable speech production and through their cultural and social relevance; universal grammar, however, presents a problem for this line of explanation. While ordinary grammar is learned directly through unsupervised and supervised learning, the rules of universal grammar are so complex that they appear to be innate. This indicates an ontological difference between universal and ordinary grammar. P&B do not make this distinction, nor do they explain why this distinct feature of language might have evolved.
Emma, "ontological difference"? What does that mean? The problem with UG is not that it is innate. Lots of complex capacities are innate (spider-web weaving, for example). The problem is how and why UG evolved.
Nadila, explaining the evolution of UG is hard, but it is not unsolvable, and will no doubt be solved (see "Chomsky's Universe"). It is "a" hard problem, but not "the hard problem" (of how and why organisms FEEL rather than just DO). That may turn out to be unsolvable.
Nadila Asikaer commented on "8a. Pinker, S. & Bloom, P. (1990). Natural language and natural selection":
(my comment got omitted again) From what I read in the article and what was discussed in the replies, I better understood why the evolution of language is not a unique evolutionary problem; only UG is. Any UG must have evolved like every other human capacity. The rules of UG must be innate, because no one makes UG errors, and no correction is needed. Even children master UG rules even though those rules cannot be learned. Therefore, the rules of UG must be innate. The whole article talked about the evolution of language. Nevertheless, P&B skipped the problem of the evolution of UG, namely how and why it evolved, because, like everyone else, they don't know the answer. It seems we are trying to find an answer for something unsolvable: the evolutionary origin of the complex and abstract rules of UG, which are unlearnable because of the poverty of the stimulus.
I re-posted Nadila's earlier posting because a bug in blogger deleted it.
Sky8a:
There are some interesting discussions in the replies about reverse-engineering UG (universal grammar) capacity. But first, what might UG capacity be? If we were to reverse-engineer it, there must be a way to test it using T2, T3 or T4. Further, is UG capacity a "spandrel" of language evolution? How is UG capacity related to other language capacities such as OG (ordinary grammar)? Finally, why does evolution not explain UG capacity? Why does evolution explain OG but not UG? (I think I understand the difference between UG and OG, but these last two questions seem clear to the class yet really confuse me.)
We know that Universal Grammar (UG) exists and has evolved, because it has stayed with us and is still here. However, it is hard to explain how and why UG evolved.
This question has not been thoroughly answered in the text by Pinker and Bloom, even though it is the only nontrivial problem in language. Universal grammar is said to be innate because it cannot be learned. Indeed, it is not possible (unless you are deliberately trying to, like Chomsky at MIT) to make UG-incorrect sentences. There can never be a case where someone is corrected and receives feedback about their mistake. So the complex rules that form UG cannot be learned through supervised learning (trial-and-error with corrective feedback). In addition, they cannot be learned by unsupervised learning or hearsay. This is the "poverty of the stimulus" argument: children do not hear UG errors, so they cannot learn the rules of UG, which must therefore be an innate capacity that evolved over time. The way I understand the powerful reasoning behind Chomsky's argument is that anything for which you cannot be given incorrect examples, you cannot learn. In other words, since there are no UG-violating sentences heard, children are exposed only to positive stimuli, so they never get feedback and cannot possibly learn. I wonder whether there are other rules defining categories for which you cannot find any counterexample? And if so, can we use Chomsky's rigorous argument to infer that those categories are innate too?
Étienne, linguists are not trying to make UG-violating sentences. They are trying to test potential rules of UG by seeing whether they produce UG+ or UG- sentences. The way they test this is by consulting their own innate knowledge (implicit, not explicit) of UG through their "grammaticality judgments": "If I violate this potential rule, does it make this sentence sound wrong (the way it does when I violate an OG rule)?" (See the discussion of grammaticality judgments in other threads of 9a and 9b.)
UG (or at least as much of it as linguists have so far discovered explicitly) can be taught verbally to those who already have language, but only by taking syntax courses. And it cannot be taught to the language-learning child, who does not yet have enough of a grip on language to understand the instruction!
You cannot learn the features or rules that distinguish members from non-members of a (nontrivial) category by supervised learning (trial, error, corrective feedback) unless you have some trials with members and some with non-members (C+ and C-). (The same is true for unsupervised learning, but for UG+/UG-, unsupervised learning is far too underdetermined.)
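A toy illustration of the point above (my own hypothetical example, not from the reading): two different candidate "grammars" can both be consistent with every legal (C+) example the learner ever sees, so without negative (C-) examples there is no way to choose between them.

```python
# Two candidate rules ("grammars") for which numbers are "legal".
def is_even(n):               # broad rule
    return n % 2 == 0

def is_power_of_two(n):       # narrow rule
    return n > 0 and n & (n - 1) == 0

# The learner only ever sees positive (legal) examples:
positives = [2, 4, 8, 16]

# Both rules fit all the positive data, so the data cannot decide.
assert all(is_even(n) for n in positives)
assert all(is_power_of_two(n) for n in positives)

# Only a negative example (being told that 6 is illegal, say) could
# eliminate the broad rule; the rules disagree on 6, but the learner
# never finds out.
print(is_even(6), is_power_of_two(6))  # True False
```

This is the C+/C- problem in miniature: children hear only UG+ sentences, so the rules of UG could not be induced from the data alone.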
We've discussed the differences between OG and UG at length at this point, and the way children do not make UG errors because UG is innate. I still find myself intrigued by the place of second languages in the discussion of OG and UG. Obviously a second language, especially one learned outside the critical period, is not learned in the same way a native language is, so UG mistakes may be possible for a second-language learner; but I wonder whether the relation of the second language to the native language has any bearing on the likelihood of UG mistakes. If, for example, one is learning a second language closely related to one's first language, with a similar grammar structure (say, both have SVO order), would the UG errors made be significantly reduced? Or if the languages were highly mutually intelligible, like Swedish and Norwegian, would a native speaker of one even be vulnerable to UG errors when learning the other?
I wouldn't say that UG mistakes are possible for a second-language learner either, considering the understanding of UG we've been using in this class. From my understanding (from this course and as a linguistics student), UG provides the general structural possibilities which each particular language (OG) can then instantiate. In previous traditions in linguistics ("Principles & Parameters"), UG provided the principles which all languages follow, while each language (OG) sets particular parameter settings, and learning a language is essentially just learning its parameter settings. For example, as you mentioned SVO languages: whether the verb precedes the object (SVO) or vice versa (SOV) within the verb phrase is a parametric difference; each language will do one of these, and then may also move the verb or the object somewhere else (OVS, OSV, VSO...). But UG provides the cognitive architecture for representing and manipulating these structures, which supplies those possibilities in the first place. The mistakes a second-language learner makes are just going to be mistakes related to the parameter settings: not having fully acquired them, or accidentally using the parameter settings of their first language. But theoretically any mistake they make will accord with the parameter settings of some possible language; thus they will never actually be UG errors, as learners wouldn't be cognitively able to produce UG-violating sentences when trying to speak in a natural context.
It seems my comment for 8A was removed by Blogspot. My comment was the following:
What I've gathered from this reading is that UG is innate: it is a set of rules that we follow that cannot be learned through any sort of feedback. The Poverty of the Stimulus argument addresses this: children never make UG errors, so there is no feedback system or any sort of learning involved, because there are no errors! OG, on the other hand, is learned. What this paper fails to address is the question of the evolutionary value of UG: how and (more importantly) WHY did UG get into our brains? In a way, it is much like the hard problem of cognitive science. The (so far) unanswerable question of the hard problem is how and why we feel what we feel. UG is the same.
One of the goals of cognitive science is to reverse-engineer human beings' capacity for language. In this endeavour it is important to differentiate between learned language capacity and innate language capacity. Unfortunately, the paper by Pinker & Bloom does not do a very good job of differentiating the two. The authors argue that language evolved like any other trait because natural selection deemed it valuable for survival. For anyone who believes in Darwin's theory of evolution, this conclusion seems somewhat obvious. Things start to get less obvious when we look specifically at UG capacities, though, which are a built-in feature and not learned. How then did UG capacities manifest in our brains? I do not know, but UG's value in jumpstarting language acquisition in children is clear, and provides a big evolutionary advantage.
I think language must be my favorite topic in cognition. Pinker and Bloom just about acknowledge what I find most interesting in language without discussing the structure behind it (UG being innate): "Humans acquire a great deal of information during their lifetimes. Since this acquisition process occurs at a rate far exceeding that of biological evolution…" So while they discussed all the reasons why language would have been an advantage, whether the desirable vocal-auditory channel or the rich set of propositions it allows us, they still failed to address a foundation of language. As language is one of the doing capacities of cogsci that we would need to include in reverse-engineering, it would be important to explain its how and why (UG).
Universal grammar is the innate component of language that poses a problem for evolutionary explanation, because all children follow UG without breaking its rules or receiving feedback. It is innate because children receive no verbal or supervised instruction about it, meaning they get no feedback. Without UG, people would be unable to grasp the aspects of language that need to be taught. An issue with this paper, however, is that it fails to delve into UG. The authors focus on dissecting language through its aspects that are learned and have evolved. UG cuts against Pinker and Bloom's stance on language, because UG is the innate system within all humans that helps humans acquire language.
The downfall of Pinker and Bloom's text is that they fail to differentiate OG and UG. UG, as argued by Chomsky, cannot be learned, because children can't make mistakes in it (they can't violate UG). It seems to be an essential structure in learning language. Without it, children would have to learn grammar rules from scratch, requiring much more supervised learning, unsupervised learning, and correction than children receive. OG, on the other hand, is learnable in every language and requires supervised and unsupervised learning.
UG poses a problem because of the poverty of the stimulus. Children can't make errors or receive instruction about UG; they are never exposed to negative examples. This supports its being innate, meaning it must have evolved over time. But the big question is how and why it came to be.