M. Elsner, Andrea D. Sims, Alexander Erdmann, A. Hernandez, Evan Jaffe, Lifeng Jin, Martha Booker Johnson, Shuan O. Karim, David L. King, Luana Lamberti Nunes, Byung-Doh Oh, Nathan Rasmussen, Cory Shain, Stephanie Antetomaso, Kendra V. Dickinson, N. Diewald, Michelle Mckenzie, S. Stevens-Guille
Modeling morphological learning, typology, and change: What can the neural sequence-to-sequence framework contribute?
Journal of Language Modelling, published 2019-12-19. DOI: 10.15398/jlm.v7i1.244 (https://doi.org/10.15398/jlm.v7i1.244)
Citations: 11
Abstract
We survey research using neural sequence-to-sequence models as computational models of morphological learning and learnability. We discuss their use in determining the predictability of inflectional exponents, in making predictions about language acquisition and in modeling language change. Finally, we make some proposals for future work in these areas.