{"title":"由生成式人工智能聊天机器人生成的名字中的偏见评估","authors":"Jaime E. Mirowsky*, ","doi":"10.1021/acs.jchemed.4c0084410.1021/acs.jchemed.4c00844","DOIUrl":null,"url":null,"abstract":"<p >Generative artificial intelligence (GenAI) is becoming more prevalent in higher education, and with that comes opportunities and challenges. One opportunity is using this technology to help create educational material, but one challenge is that the output of these tools might produce biased content. Thus, for this work, three text-based GenAI tools (ChatGPT-4o, Microsoft Copilot, and Google Gemini) were used to develop an activity for an analytical chemistry laboratory course. In each response, the student names provided by the chatbots were quantified with respect to gender and broadly assessed for cultural representation. All three chatbots generated an equal percentage of female (“she/her”) and male (“he/him”) student names, but none of the chatbots used “they/them” pronouns, signaling a lack of inclusivity for nonbinary, gender-neutral, or gender-nonconforming individuals. The names provided by the chatbots were dominated by those popular in English-speaking countries, highlighting a lack of cultural diversity in the output provided. Both these biases could be mitigated by asking that the chatbots provide gender-inclusive names and names that represent diverse cultural backgrounds. As educators begin to utilize GenAI tools to create classroom materials or have students use this technology in their assignments, it is important to think about the potential biases that might emerge, share this limitation with those using these tools, and work to not perpetuate them.</p>","PeriodicalId":43,"journal":{"name":"Journal of Chemical Education","volume":"101 12","pages":"5142–5146 5142–5146"},"PeriodicalIF":2.5000,"publicationDate":"2024-11-16","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":"{\"title\":\"Assessing Biases in the Names Generated by Generative Artificial Intelligence Chatbots\",\"authors\":\"Jaime E. Mirowsky*, \",\"doi\":\"10.1021/acs.jchemed.4c0084410.1021/acs.jchemed.4c00844\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"<p >Generative artificial intelligence (GenAI) is becoming more prevalent in higher education, and with that comes opportunities and challenges. One opportunity is using this technology to help create educational material, but one challenge is that the output of these tools might produce biased content. Thus, for this work, three text-based GenAI tools (ChatGPT-4o, Microsoft Copilot, and Google Gemini) were used to develop an activity for an analytical chemistry laboratory course. In each response, the student names provided by the chatbots were quantified with respect to gender and broadly assessed for cultural representation. All three chatbots generated an equal percentage of female (“she/her”) and male (“he/him”) student names, but none of the chatbots used “they/them” pronouns, signaling a lack of inclusivity for nonbinary, gender-neutral, or gender-nonconforming individuals. The names provided by the chatbots were dominated by those popular in English-speaking countries, highlighting a lack of cultural diversity in the output provided. Both these biases could be mitigated by asking that the chatbots provide gender-inclusive names and names that represent diverse cultural backgrounds. 
As educators begin to utilize GenAI tools to create classroom materials or have students use this technology in their assignments, it is important to think about the potential biases that might emerge, share this limitation with those using these tools, and work to not perpetuate them.</p>\",\"PeriodicalId\":43,\"journal\":{\"name\":\"Journal of Chemical Education\",\"volume\":\"101 12\",\"pages\":\"5142–5146 5142–5146\"},\"PeriodicalIF\":2.5000,\"publicationDate\":\"2024-11-16\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"0\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"Journal of Chemical Education\",\"FirstCategoryId\":\"92\",\"ListUrlMain\":\"https://pubs.acs.org/doi/10.1021/acs.jchemed.4c00844\",\"RegionNum\":3,\"RegionCategory\":\"教育学\",\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"Q2\",\"JCRName\":\"CHEMISTRY, MULTIDISCIPLINARY\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"Journal of Chemical Education","FirstCategoryId":"92","ListUrlMain":"https://pubs.acs.org/doi/10.1021/acs.jchemed.4c00844","RegionNum":3,"RegionCategory":"教育学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q2","JCRName":"CHEMISTRY, MULTIDISCIPLINARY","Score":null,"Total":0}
Assessing Biases in the Names Generated by Generative Artificial Intelligence Chatbots
Generative artificial intelligence (GenAI) is becoming more prevalent in higher education, and with that comes opportunities and challenges. One opportunity is using this technology to help create educational material, but one challenge is that the output of these tools might produce biased content. Thus, for this work, three text-based GenAI tools (ChatGPT-4o, Microsoft Copilot, and Google Gemini) were used to develop an activity for an analytical chemistry laboratory course. In each response, the student names provided by the chatbots were quantified with respect to gender and broadly assessed for cultural representation. All three chatbots generated an equal percentage of female (“she/her”) and male (“he/him”) student names, but none of the chatbots used “they/them” pronouns, signaling a lack of inclusivity for nonbinary, gender-neutral, or gender-nonconforming individuals. The names provided by the chatbots were dominated by those popular in English-speaking countries, highlighting a lack of cultural diversity in the output provided. Both these biases could be mitigated by asking that the chatbots provide gender-inclusive names and names that represent diverse cultural backgrounds. As educators begin to utilize GenAI tools to create classroom materials or have students use this technology in their assignments, it is important to think about the potential biases that might emerge, share this limitation with those using these tools, and work to not perpetuate them.
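The gender tabulation described in the abstract can be illustrated with a short sketch. The snippet below is a hypothetical example, not the procedure used in the study: it assumes the chatbot responses label each generated student name with an explicit pronoun set, such as "Emily Carter (she/her)", and simply reports how often each pronoun set appears.

```python
import re
from collections import Counter

# Illustrative sketch only (not the study's actual tabulation procedure):
# tally the pronoun sets attached to chatbot-generated student names,
# assuming each response labels names as "Name (she/her)", "Name (he/him)",
# or "Name (they/them)".

PRONOUN_SETS = ("she/her", "he/him", "they/them")

def tally_pronouns(responses):
    """Return the percentage of labeled names using each pronoun set."""
    counts = Counter({p: 0 for p in PRONOUN_SETS})
    for text in responses:
        for pronouns in PRONOUN_SETS:
            counts[pronouns] += len(re.findall(re.escape(pronouns), text, flags=re.IGNORECASE))
    total = sum(counts.values())
    return {p: round(100 * n / total, 1) if total else 0.0 for p, n in counts.items()}

# Hypothetical chatbot output for a lab-activity roster:
sample = ["Students: Emily Carter (she/her), James Smith (he/him), Sarah Lee (she/her)"]
print(tally_pronouns(sample))
# {'she/her': 66.7, 'he/him': 33.3, 'they/them': 0.0}
```

A zero count for "they/them" in such a tally would reflect the kind of inclusivity gap the paper reports; the suggested mitigation is to prompt the chatbot explicitly for gender-inclusive names and names drawn from diverse cultural backgrounds.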
About the Journal:
The Journal of Chemical Education is the official journal of the Division of Chemical Education of the American Chemical Society, co-published with the American Chemical Society Publications Division. Launched in 1924, the Journal of Chemical Education is the world’s premier chemical education journal. The Journal publishes peer-reviewed articles and related information as a resource to those in the field of chemical education and to those institutions that serve them. JCE typically addresses chemical content, activities, laboratory experiments, instructional methods, and pedagogies. The Journal serves as a means of communication among people across the world who are interested in the teaching and learning of chemistry. This includes instructors of chemistry from middle school through graduate school, professional staff who support these teaching activities, as well as some scientists in commerce, industry, and government.