{"title":"SRGAP2 and the gradual evolution of the modern human language faculty","authors":"Pedro Tiago Martins,Maties Marí,Cedric Boeckx","doi":"10.1093/jole/lzx020","DOIUrl":"https://doi.org/10.1093/jole/lzx020","url":null,"abstract":"","PeriodicalId":37118,"journal":{"name":"Journal of Language Evolution","volume":"359 ","pages":"67-78"},"PeriodicalIF":2.6,"publicationDate":"2018-01-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"138515168","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"What aDNA can (and cannot) tell us about the emergence of language and speech","authors":"R. DeSalle, I. Tattersall","doi":"10.1093/JOLE/LZX018","DOIUrl":"https://doi.org/10.1093/JOLE/LZX018","url":null,"abstract":"","PeriodicalId":37118,"journal":{"name":"Journal of Language Evolution","volume":"3 1","pages":"59-66"},"PeriodicalIF":2.6,"publicationDate":"2018-01-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"https://sci-hub-pdf.com/10.1093/JOLE/LZX018","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"61534574","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Rethinking the relationship between pronoun-drop and individualism with Bayesian multilevel models","authors":"Sean Lee","doi":"10.1093/JOLE/LZX003","DOIUrl":"https://doi.org/10.1093/JOLE/LZX003","url":null,"abstract":"Can the language we speak determine how we represent the world around us? To those familiar with the theory of linguistic relativity, this may seem like an age-old question about which everyone has their own answer. Although the evidence supporting linguistic relativity remains controversial, the long reach of language into our perception and behavior is nevertheless an intriguing possibility that deserves further investigation. Here I take a closer look at a case of linguistic relativity that had a particularly strong impact on cross-cultural research: the pronoun-drop effect. The theory of pronoun-drop effect posits that languages that allow their speakers to drop subject pronouns in verbal communication would lead their speakers to create collectivistic culture. It was argued that the absence of pronouns necessitates the speakers to embed their self-identities in the context of social interaction, so the linguistic practice of omitting pronouns would reduce the sense of individuality in the minds of speakers. After conducting a series of Bayesian multilevel analyses on the original dataset, however, the current study concludes that the pronoun-drop effect is unlikely to be a robust, universal phenomenon. The analyses revealed that the majority of statistical signal supporting the phenomenon comes from the Indo-European language family, and other families provided little or inconsistent evidence. It was also observed that the Indo-European languages alone made up 61 per cent of the original dataset, and dropping them from analysis completely nullified the pronoun-drop effect. 
These observations suggest that the pronoun-drop effect is a consequence of failing to account for (i) varying effects among language families and (ii) the overrepresentation of the Indo-European languages. In light of these results, this article argues that the theory of the pronoun-drop effect should be thoroughly revised. It also offers several suggestions for similar cross-cultural studies that suffer from the same problems as the original pronoun-drop study.","PeriodicalId":37118,"journal":{"name":"Journal of Language Evolution","volume":"2 1","pages":"188-200"},"PeriodicalIF":2.6,"publicationDate":"2017-07-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"https://sci-hub-pdf.com/10.1093/JOLE/LZX003","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"45544050","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Gorillas may use their laryngeal air sacs for whinny-type vocalizations and male display","authors":"Marcus Perlman, Roberta Salmi","doi":"10.1093/JOLE/LZX012","DOIUrl":"https://doi.org/10.1093/JOLE/LZX012","url":null,"abstract":"Great apes and siamangs—but not humans—possess laryngeal air sacs, suggesting that they were lost over hominin evolution. The absence of air sacs in humans may hold clues to speech evolution, but little is known about their functions in extant apes. We investigated whether gorillas use their air sacs to produce the staccato ‘growling’ of the silverback chest beat display. This hypothesis was formulated after viewing a nature documentary showing a display by a silverback western gorilla (Kingo). As Kingo growls, the video shows distinctive vibrations in his chest and throat under which the air sacs extend. We also investigated whether other similarly staccato vocalizations—the whinny, sex whinny , and copulation grunt —might also involve the air sacs. To examine these hypotheses, we collected an opportunistic sample of video and audio evidence from research records and another documentary of Kingo’s group, and from videos of other gorillas found on YouTube. Analysis shows that the four vocalizations are each emitted in rapid pulses of a similar frequency (8–16 pulses per se-cond), and limited visual evidence indicates that they may all occur with upper torso vibrations. Future research should determine how consistently the vibrations co-occur with the vocalizations, whether they are synchronized, and their precise location and timing. Our findings fit with the hypothesis that apes—especially, but not exclusively males—use their air sacs for vocalizations and displays related to size exaggeration for sex and territory. 
Thus changes in social structure, mating, and sexual dimorphism might have led to the obsolescence of the air sacs and their loss in hominin evolution.","PeriodicalId":37118,"journal":{"name":"Journal of Language Evolution","volume":"2 1","pages":"126-140"},"PeriodicalIF":2.6,"publicationDate":"2017-07-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"https://sci-hub-pdf.com/10.1093/JOLE/LZX012","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"45977185","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Usage context influences the evolution of overspecification in iterated learning","authors":"Peeter Tinits, Jonas Nölle, S. Hartmann","doi":"10.1093/JOLE/LZX011","DOIUrl":"https://doi.org/10.1093/JOLE/LZX011","url":null,"abstract":"This article investigates the influence of contextual pressures on the evolution of overspecification, i.e. the degree to which communicatively irrelevant meaning dimensions are specified, in an iterated learning setup. To this end, we combine two lines of research: In artificial language learning studies, it has been shown that (miniature) languages adapt to their contexts of use. In experimental pragmatics, it has been shown that referential overspecification in natural language is more likely to occur in contexts in which the communicatively relevant feature dimensions are harder to discern. We test whether similar functional pressures can promote the cumulative growth of referential overspecification in iterated artificial language learning. Participants were trained on an artificial language which they then used to refer to objects. The output of each participant was used as input for the next participant. The initial language was designed such that it did not show any overspecification, but it allowed for overspecification to emerge in 16 out of 32 usage contexts. Between conditions, we manipulated the referential context in which the target items appear, so that the relative visuospatial complexity of the scene would make the communicatively relevant feature dimensions more difficult to discern in one of them. The artificial languages became overspecified more quickly and to a significantly higher degree in this condition, indicating that the trend toward overspecification was stronger in these contexts, as suggested by experimental pragmatics research. 
These results add further support to the hypothesis that linguistic conventions can be partly determined by usage context and show that experimental pragmatics can be fruitfully combined with artificial language learning to offer valuable insights into the mechanisms involved in the evolution of linguistic phenomena.","PeriodicalId":37118,"journal":{"name":"Journal of Language Evolution","volume":"2 1","pages":"148-159"},"PeriodicalIF":2.6,"publicationDate":"2017-07-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"https://sci-hub-pdf.com/10.1093/JOLE/LZX011","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"49467925","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"The emergence of linguistic structure in an online iterated learning task","authors":"Clay Beckner, J. Pierrehumbert, J. Hay","doi":"10.1093/JOLE/LZX001","DOIUrl":"https://doi.org/10.1093/JOLE/LZX001","url":null,"abstract":"Previous research by Kirby, Cornish & Smith (2008) has found that strikingly compositional language systems can be developed in the laboratory via iterated learning of an artificial language. However, our reanalysis of the data indicates that while iterated learning prompts an increase in language compositionality, the increase is followed by an apparent decrease. This decrease in compositionality is inexplicable, and seems to arise from chance events in a small dataset (4 transmission chains). The current study thus investigates the iterated emergence of language structure on a larger scale using Amazon Mechanical Turk, encompassing 24 independent chains of learners over 10 generations. This richer dataset provides further evidence that iterated learning causes languages to become more compositional, although the trend levels off before the 10th generation. Moreover, analysis of the data (and reanalysis of Kirby, Cornish & Smith, 2008) reveals that systematic units arise along some meaning dimensions before others, giving insight into the biases of learners.","PeriodicalId":37118,"journal":{"name":"Journal of Language Evolution","volume":"2 1","pages":"160-176"},"PeriodicalIF":2.6,"publicationDate":"2017-07-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"https://sci-hub-pdf.com/10.1093/JOLE/LZX001","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"44659349","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"In the beginning: <i>A review of Robert C. Berwick and Noam Chomsky's Why Only Us</i>.","authors":"Michael Studdert-Kennedy, Herbert Terrace","doi":"10.1093/jole/lzx005","DOIUrl":"10.1093/jole/lzx005","url":null,"abstract":"<p><p>We review Berwick and Chomsky's <i>Why Only Us, Language and Evolution,</i> a book premised on language as an instrument primarily of thought, only secondarily of communication. The authors conclude that a Universal Grammar can be reduced to three biologically isolated components, whose computational system for syntax was the result of a single mutation that occurred about 80,000 years ago. We question that argument because it ignores the origin of words, even though Berwick and Chomsky acknowledge that words evolved before grammar. It also fails to explain what evolutionary problem language uniquely solved (Wallace's question). To answer that question, we review recent discoveries about the ontogeny and phylogeny of words. Ontogenetically, two modes of nonverbal relation between infant and mother begin at or within 6 months of birth that are crucial antecedents of the infant's first words: intersubjectivity and joint attention. Intersubjectivity refers to rhythmic shared affect between infant and caretaker(s) that develop during the first 6 months. When the infant begins to crawl, they begin to attend jointly to environmental objects. Phylogenetically, Hrdy and Bickerton describe aspects of <i>Homo erectus</i>' ecology and cognition that facilitated the evolution of words. Hrdy shows how cooperative breeding established trust between infant and caretakers, laying the groundwork for a community of mutual trust among adults. Bickerton shows how 'confrontational scavenging' led to displaced reference, whereby an individual communicated the nature of a dead animal and its location to members of the group that could not see it. 
Thus, both phylogenetically and ontogenetically, the original function of language was primarily an instrument of communication. Rejecting Berwick and Chomsky's answer to Wallace's question that syntax afforded better planning and inference, we endorse Bickerton's view that language enabled speakers to refer to objects not immediately present. Thus arose context-free mental representations, unique to human language and thought.</p>","PeriodicalId":37118,"journal":{"name":"Journal of Language Evolution","volume":"2 2","pages":"114-125"},"PeriodicalIF":2.1,"publicationDate":"2017-07-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"https://www.ncbi.nlm.nih.gov/pmc/articles/PMC6715309/pdf/nihms-1000540.pdf","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"9649700","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"OA","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Infinitely productive language can arise from chance under communicative pressure","authors":"S. Piantadosi, Evelina Fedorenko","doi":"10.1093/JOLE/LZW013","DOIUrl":"https://doi.org/10.1093/JOLE/LZW013","url":null,"abstract":"Human communication is unparalleled in the animal kingdom. The key distinctive feature of our language is productivity: we are able to express an infinite number of ideas using a limited set of words. Traditionally, it has been argued or assumed that productivity emerged as a consequence of very specific, innate grammatical systems. Here we formally develop an alternative hypothesis: productivity may have rather solely arisen as a consequence of increasing the number of signals (e.g. sentences) in a communication system, under the additional assumption that the processing mechanisms are algorithmically unconstrained. Using tools from algorithmic information theory, we examine the consequences of two intuitive constraints on the probability that a language will be infinitely productive. We prove that under maximum entropy assumptions, increasing the complexity of a language will not strongly pressure it to be finite or infinite. In contrast, increasing the number of signals in a language increases the probability of languages that have—in fact—infinite cardinality. 
Thus, across evolutionary time, the productivity of human language could have arisen solely from algorithmic randomness combined with a communicative pressure for a large number of signals.","PeriodicalId":37118,"journal":{"name":"Journal of Language Evolution","volume":"2 1","pages":"141-147"},"PeriodicalIF":2.6,"publicationDate":"2017-07-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"https://sci-hub-pdf.com/10.1093/JOLE/LZW013","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"43988158","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Language for $200: Success in the environment influences grammatical alignment","authors":"S. Lev-Ari, S. Peperkamp","doi":"10.1093/JOLE/LZW012","DOIUrl":"https://doi.org/10.1093/JOLE/LZW012","url":null,"abstract":"Speakers constantly learn language from the environment by sampling their linguistic input and ad-justing their representations accordingly. Logically, people should attend more to the environment and adjust their behavior in accordance with it more the lower their success in the environment is. We test whether the learning of linguistic input follows this general principle in two studies: a corpus analysis of a TV game show, Jeopardy, and a laboratory task modeled after Go Fish. We show that lower (non-linguistic) success in the task modulates learning of and reliance on linguistic patterns in the environment. In Study 1, we find that poorer performance increases conformity with linguistic norms, as reflected by increased preference for frequent grammatical structures. In Study 2, which consists of a more interactive setting, poorer performance increases learning from the immediate social environment, as reflected by greater repetition of others’ grammatical structures. We propose that these results have implications for models of language production and language learning and for the propagation of language change. In particular, they suggest that linguistic changes might spread more quickly in times of crisis, or when the gap between more and less successful people is larger. The results might also suggest that innovations stem from successful individuals while their propagation would depend on relatively less successful individuals. 
We provide a few historical examples that are in line with the first suggested implication, namely, that the spread of linguistic changes is accelerated during difficult times, such as wartime and economic downturns.","PeriodicalId":37118,"journal":{"name":"Journal of Language Evolution","volume":"2 1","pages":"177-187"},"PeriodicalIF":2.6,"publicationDate":"2017-01-26","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"https://sci-hub-pdf.com/10.1093/JOLE/LZW012","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"47512896","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}