Title: Google Search Results: Buried If Not Forgotten
Author: Allyson Haynes Stuart
Journal: North Carolina Journal of Law & Technology, vol. 15, no. 1, p. 463
Published: 2013-10-21 (Journal Article)
DOI: 10.2139/SSRN.2343398 (https://doi.org/10.2139/SSRN.2343398)
Source platform: Semantic Scholar
Citations: 5
Abstract
The right to be forgotten, or to require that online information be deleted, squarely confronts the First Amendment right to free speech. But the underlying problem giving rise to this right is only growing: harmful information posted online has the real potential to destroy a person's reputation or livelihood. In addition, the way Internet users get their information – through search engines, primarily Google – amplifies harmful information when it is "popular" under Google's algorithm. Google's response to removal requests is that it cannot control the underlying websites, so removing information from its results is pointless. But in fact, the search results themselves are of crucial importance. And those results are already being altered. If Internet users' primary access to the vast amount of online information is filtered – and hand-edited – by a search engine, why shouldn't that editing take into consideration the harmful nature of some information? This Article proposes that Google consider "demoting" references in its search results where the information falls within one of several sensitive categories and the party requesting removal has unsuccessfully exhausted her remedies with respect to the website publisher of the information. This amounts not to censorship, but to factoring in the nature of the information itself when determining its relevance in response to search requests.