Gender Bias Detection in Court Decisions: A Brazilian Case Study
Raysa Benatti, Fabiana Severi, Sandra Avila, Esther Luna Colombini
arXiv:2406.00393 · arXiv - CS - Computers and Society · 2024-06-01
Data from the social sciences is often produced as digital text, which
motivates its use as a source for natural language processing methods.
Researchers and practitioners have developed and relied on
artificial intelligence techniques to collect, process, and analyze documents
in the legal field, especially for tasks such as text summarization and
classification. While increasing procedural efficiency is often the primary
motivation behind natural language processing in the field, several works have
proposed solutions for human rights-related issues, such as assessment of
public policy and institutional social settings. One such issue is the presence
of gender biases in court decisions, which has been widely studied in the
social sciences; biased institutional responses to gender-based violence
violate international human rights provisions, since they prevent gender
minorities from accessing their rights and undermine their dignity. Natural
language processing-based approaches can help detect these biases at a larger scale.
Still, the development and use of such tools require researchers and
practitioners to be mindful of legal and ethical aspects concerning data
sharing and use, reproducibility, domain expertise, and value-charged choices.
In this work, we (a) present an experimental framework developed to
automatically detect gender biases in court decisions issued in Brazilian
Portuguese and (b) describe and elaborate on features we identify to be
critical in such a technology, given its proposed use as a support tool for
research and assessment of court activity.
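The abstract does not detail the framework's implementation. As a hedged illustration of the simplest form such detection could take, the sketch below flags stereotyping cues in a decision's text with a hand-built lexicon; the Portuguese expressions, function names, and scoring rule are all invented placeholders, not the authors' actual method or data.

```python
import re
from collections import Counter

# Hypothetical lexicon of Portuguese expressions that gender-bias research
# often flags as stereotyping cues in judicial language. These entries are
# illustrative placeholders only, not the paper's lexicon.
BIAS_CUES = {
    "comportamento da vítima",   # scrutinizing the victim's behaviour
    "mulher honesta",            # "honest woman", a notorious legal stereotype
    "provocou o agressor",       # implying the victim provoked the aggressor
}

def flag_bias_cues(decision_text: str) -> Counter:
    """Count case-insensitive occurrences of each lexicon cue in a decision."""
    text = decision_text.lower()
    return Counter({cue: len(re.findall(re.escape(cue), text))
                    for cue in BIAS_CUES})

def bias_score(decision_text: str) -> int:
    """Total cue hits: a crude per-document signal for triaging decisions."""
    return sum(flag_bias_cues(decision_text).values())
```

A lexicon like this only surfaces surface-level wording; the support tool the paper proposes would additionally require labeled court decisions, domain expertise, and the legal and ethical safeguards the abstract emphasizes.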