Accountability in ordinary action

P. Tolmie, Andy Crabtree
{"title":"Accountability in ordinary action","authors":"P. Tolmie, Andy Crabtree","doi":"10.1049/pbse014e_ch3","DOIUrl":null,"url":null,"abstract":"ly, various concepts of privacy posit axioms that essentially revolve around the disclosure of personal information or ‘data’. Often cited definitions thus inform us that privacy is the ability to control the disclosure of personal information [27], to create and manage interpersonal boundaries [28] and to employ contextual norms to regulate the ad hoc flow of personal information between people [29]. Concretely, the studies make it visible that privacy ordinarily and accountably revolves around managing a polymorphous array of mundane activities in which the digital is embedded in the ongoing conduct of manifold human relationships. Within this lively context, the disclosure or sharing of personal data is accountably organised in terms of managing members’ access to devices, applications and content through situated practices, procedures or methods that exploit the local ecology, device visibility and recipient design. The studies make it perspicuous that members’ concern with privacy is a concern to manage their accountability in the digital world and that manifest in data sharing practices is an evolving calculus of accountability employed to manage the potential attack surface the digital creates in everyday life. That the digital poses a threat to privacy and therefore undermines societal trust in the digital is broadly acknowledged. Much less wellknown and understood is how and why this happens and what kinds of steps beyond demonstrable compliance with the law might need to be taken to remedy the situation. 4 Privacy by design for the internet of things 3.2 The naturally accountable organisation of digital privacy in the home Managing digital privacy is intimately bound up with the observability and reportability of one’s digital activities and how other people might be able to see them in the first place. Developers and security analysts alike recommend passwords as the first line of defense to protect oneself from prying eyes. Yet there is a more fundamental naturally accountable organisation to privacy in the real world, as one of our participants, Paul (not his real name), tells us: Paul: I’m not particularly fussed about setting up passwords and things. I mean there’s no threat of network hijacking here. We live in the middle of the countryside, miles away from another house, it’s just not an issue. So as Paul makes accountable, one of the simplest and most ubiquitous ways to constrain access and protect privacy is by controlling access to the environments in which our digital activities occur, i.e., controlling access to the places in which digital devices are kept. Paul’s was not the only example of password suspension for devices considered inaccessible that we encountered in our studies. It was commonplace for devices that always stayed in the home, such as desktop PCs, tablets and media servers. The reasoning generally bound up with this is that the people who have rights of access to the network and the devices on it are the people who have rights of access to the environment in which those devices are located. This point was underscored by Christine, a 63yearold reflexologist who runs her practice from home: Christine: I’m big on making sure that the whole house isn’t open to their view. I’m forever closing doors and shutting things off. I make sure everything is clear that I consider sensitive. 
When I used to be in that smaller room particularly, I did not want my laptop there open while I was about to start a treatment. Fieldworker: So that window into your digital world you prefer to have closed to clients as well? Christine: Closed off. Absolutely. Absolutely. And obviously I have to make sure that I don’t leave information about previous clients lying around for clients to see. Fieldworker: So it’s both professional and personal reasons kind of wrapped in together? Christine: Well absolutely because you know, I could be sued if people felt it was a breach of data protection. So I have to be aware of that. I don’t have anything lying around that in any way gives any window into anything. I don’t keep my purse or my phone in the room with me either. Fieldworker: OK, and with regard to the network, if you had clients backing up, where one was waiting for another one to go, would you give them access to the network? Accountability in ordinary action 5 Christine: No. Absolutely not. No, no. I wouldn’t even give them access to the televisionunless Brian’s [her husband] in the room watching it! This vignette elaborates the fairly blanket operation of a situated privacy practice designed to minimise the risk of the incidental sharing of personal information of any kind – analogue or digital – with visitors to the house where the relationship is confined to business. Despite the account being directed towards breaches of data protection, Christine is also closing doors and generally constraining the movements of her clients. It is not just data relating to other clients but also of her own personal life she is keeping from view. What this makes particularly clear, however, is that physical barriers such as doors (and curtains, and hedges, etc.) are the principal gateway controlling access to personal information. Privacy is also managed through device visibility, which is not only controlled through physical barriers but by members proximity to screens. This is observably and reportably the case for participants who wanted to share very specific content where that content was a feature of activities and repositories described as private. Here Evelyn, a 21yearold student is talking about how she might occasionally share content with others around her on Tumblr: Evelyn: I don’t like anybody seeing what’s going on on Tumblr. Occasionally I’ll show Susan [her sister] a picture or a gif of a cat. But I don’t. Fieldworker: You show it to her? Evelyn: Yeah. Fieldworker: Rather than you let her watch you doing the posting? Evelyn: Yeah. Being on a phone it’s a smaller screen so there is less oversight. I tend to keep it quite close to me anyway. So I’m fine with browsing Tumblr when, like, I’m sitting on a chair downstairs and there are people walking about, or when we’re about to watch something. Evelyn can and does manage to keep what she is doing private by relying on the size of the screen. However, the way she positions the screen in relation to herself and others is not incidental. Indeed, insofar as screens are positioned not to disclose content then there is a mutual orientation to that content that it is not for general consumption. This of course is common practice, found in all kinds of settings. Dourish et al. [30] found, for example, that computer screens are often positioned such that visitors to an office cannot see them, and Klasnja et al. 
[31] found that people try to preserve their privacy when using devices in public places by ‘finding a seat against the wall’ or ‘tilting or dimming the screen’. Members have a mutually understood right to keep things from view that they routinely trade upon, such that, what might at first sight be viewed as guarded behaviour is nothing of the sort, but rather an everyday method for making manifest the status of particular kinds of personal data as ‘not to be shared’. So, as a matter of method, privacy can be and is managed by controlling how one physically enables others in the environment to see one’s devices. There are a number of permutations 6 Privacy by design for the internet of things whereby this can be brought about. It might be that the recipient is in an adjacent position and it is just a matter of locating the thing to be shared and then reangling the device so that they can see it. It may also be the case that the device is not mobile, in which case sharing may involve calling the recipient over to the device. Locating the data may also be something that is done prior to sharing it or may be done once the recipient is actually copresent. But, however it occurs, privacy and data disclosure are routinely managed through the social positioning of devices and content in interaction. Now none of this is to say that passwords are not used. It is simply to recognise what we all take for granted: that privacy is principally managed through the methodical use of physical barriers to control members’ access to the places where devices are located; and by controlling the access to devices themselves through their social positioning and making content selectively available to others by allowing them in various ways to see screens. That said, our participants did have ‘occasion’ to use passwords, as Mike and Alice elaborate: Mike: The PC upstairs occasionally has a password. It usually doesn’t. It’s in the back room. The last time I put a code on was when we had a decorator in to do that room. I’ve done it once or twice when we’ve had guests staying. Alice: Yeah, when my nephew comes, ‘cause he just logs into everything. Fieldworker: It kind of depends on the guest? Mike: Yeah. Fieldworker: ‘cause you see it as a potential risk? Mike: Yeah. Fieldworker: What would be a potential risk? Mike: Basically, er, adult material on there. So potential embarrassment I guess. Mike: With the decorator guy, it was more the general principle. There’s personal information on there. Alice: Bank details and stuff, so we don’t want them. Mike: Yeah, whereas if it was like family staying there, it’s more like the scenario where they just use the PC for something and stumble across a folder I’d rather they don’t stumble across. It would be easy to render what Mike is saying here as being that he and Alice use passwords to ensure data privacy. They do, of course, want to protect their ‘bank details and stuff’, but there is more to it than that. Insofar as they do use passwords to ensure data privacy, then they do so on a selective basis, rather than as a blanket policy. This will no doubt cause security advisors to throw their hands in the air. 
However, it is accountably","PeriodicalId":179291,"journal":{"name":"Privacy by Design for the Internet of Things: Building accountability and security","volume":"561-565 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2021-10-05","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"1","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"Privacy by Design for the Internet of Things: Building accountability and security","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1049/pbse014e_ch3","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Citations: 1

Abstract

Various concepts of privacy posit axioms that essentially revolve around the disclosure of personal information or ‘data’. Often-cited definitions thus inform us that privacy is the ability to control the disclosure of personal information [27], to create and manage interpersonal boundaries [28], and to employ contextual norms to regulate the ad hoc flow of personal information between people [29]. Concretely, the studies make it visible that privacy ordinarily and accountably revolves around managing a polymorphous array of mundane activities in which the digital is embedded in the ongoing conduct of manifold human relationships. Within this lively context, the disclosure or sharing of personal data is accountably organised in terms of managing members’ access to devices, applications and content through situated practices, procedures or methods that exploit the local ecology, device visibility and recipient design. The studies make it perspicuous that members’ concern with privacy is a concern to manage their accountability in the digital world, and that manifest in data sharing practices is an evolving calculus of accountability employed to manage the potential attack surface the digital creates in everyday life. That the digital poses a threat to privacy and therefore undermines societal trust in the digital is broadly acknowledged. Much less well-known and understood is how and why this happens, and what kinds of steps beyond demonstrable compliance with the law might need to be taken to remedy the situation.

3.2 The naturally accountable organisation of digital privacy in the home

Managing digital privacy is intimately bound up with the observability and reportability of one’s digital activities and how other people might be able to see them in the first place. Developers and security analysts alike recommend passwords as the first line of defence to protect oneself from prying eyes. Yet there is a more fundamental naturally accountable organisation to privacy in the real world, as one of our participants, Paul (not his real name), tells us:

Paul: I’m not particularly fussed about setting up passwords and things. I mean there’s no threat of network hijacking here. We live in the middle of the countryside, miles away from another house, it’s just not an issue.

So, as Paul makes accountable, one of the simplest and most ubiquitous ways to constrain access and protect privacy is by controlling access to the environments in which our digital activities occur, i.e., controlling access to the places in which digital devices are kept. Paul’s was not the only example of password suspension for devices considered inaccessible that we encountered in our studies. It was commonplace for devices that always stayed in the home, such as desktop PCs, tablets and media servers. The reasoning generally bound up with this is that the people who have rights of access to the network and the devices on it are the people who have rights of access to the environment in which those devices are located. This point was underscored by Christine, a 63-year-old reflexologist who runs her practice from home:

Christine: I’m big on making sure that the whole house isn’t open to their view. I’m forever closing doors and shutting things off. I make sure everything is clear that I consider sensitive. When I used to be in that smaller room particularly, I did not want my laptop there open while I was about to start a treatment.
Fieldworker: So that window into your digital world you prefer to have closed to clients as well?

Christine: Closed off. Absolutely. Absolutely. And obviously I have to make sure that I don’t leave information about previous clients lying around for clients to see.

Fieldworker: So it’s both professional and personal reasons kind of wrapped in together?

Christine: Well absolutely because you know, I could be sued if people felt it was a breach of data protection. So I have to be aware of that. I don’t have anything lying around that in any way gives any window into anything. I don’t keep my purse or my phone in the room with me either.

Fieldworker: OK, and with regard to the network, if you had clients backing up, where one was waiting for another one to go, would you give them access to the network?

Christine: No. Absolutely not. No, no. I wouldn’t even give them access to the television unless Brian’s [her husband] in the room watching it!

This vignette elaborates the fairly blanket operation of a situated privacy practice designed to minimise the risk of the incidental sharing of personal information of any kind – analogue or digital – with visitors to the house where the relationship is confined to business. Despite the account being directed towards breaches of data protection, Christine is also closing doors and generally constraining the movements of her clients. It is not just data relating to other clients but also of her own personal life that she is keeping from view. What this makes particularly clear, however, is that physical barriers such as doors (and curtains, and hedges, etc.) are the principal gateway controlling access to personal information.

Privacy is also managed through device visibility, which is not only controlled through physical barriers but by members’ proximity to screens. This is observably and reportably the case for participants who wanted to share very specific content where that content was a feature of activities and repositories described as private. Here Evelyn, a 21-year-old student, is talking about how she might occasionally share content with others around her on Tumblr:

Evelyn: I don’t like anybody seeing what’s going on on Tumblr. Occasionally I’ll show Susan [her sister] a picture or a gif of a cat. But I don’t.

Fieldworker: You show it to her?

Evelyn: Yeah.

Fieldworker: Rather than you let her watch you doing the posting?

Evelyn: Yeah. Being on a phone it’s a smaller screen so there is less oversight. I tend to keep it quite close to me anyway. So I’m fine with browsing Tumblr when, like, I’m sitting on a chair downstairs and there are people walking about, or when we’re about to watch something.

Evelyn can and does manage to keep what she is doing private by relying on the size of the screen. However, the way she positions the screen in relation to herself and others is not incidental. Indeed, insofar as screens are positioned not to disclose content, then there is a mutual orientation to that content: that it is not for general consumption. This of course is common practice, found in all kinds of settings. Dourish et al. [30] found, for example, that computer screens are often positioned such that visitors to an office cannot see them, and Klasnja et al. [31] found that people try to preserve their privacy when using devices in public places by ‘finding a seat against the wall’ or ‘tilting or dimming the screen’.
Members have a mutually understood right to keep things from view that they routinely trade upon, such that what might at first sight be viewed as guarded behaviour is nothing of the sort, but rather an everyday method for making manifest the status of particular kinds of personal data as ‘not to be shared’. So, as a matter of method, privacy can be and is managed by controlling how one physically enables others in the environment to see one’s devices. There are a number of permutations whereby this can be brought about. It might be that the recipient is in an adjacent position and it is just a matter of locating the thing to be shared and then re-angling the device so that they can see it. It may also be the case that the device is not mobile, in which case sharing may involve calling the recipient over to the device. Locating the data may also be something that is done prior to sharing it, or it may be done once the recipient is actually co-present. But, however it occurs, privacy and data disclosure are routinely managed through the social positioning of devices and content in interaction.

Now none of this is to say that passwords are not used. It is simply to recognise what we all take for granted: that privacy is principally managed through the methodical use of physical barriers to control members’ access to the places where devices are located; by controlling access to devices themselves through their social positioning; and by making content selectively available to others by allowing them in various ways to see screens. That said, our participants did have ‘occasion’ to use passwords, as Mike and Alice elaborate:

Mike: The PC upstairs occasionally has a password. It usually doesn’t. It’s in the back room. The last time I put a code on was when we had a decorator in to do that room. I’ve done it once or twice when we’ve had guests staying.

Alice: Yeah, when my nephew comes, ‘cause he just logs into everything.

Fieldworker: It kind of depends on the guest?

Mike: Yeah.

Fieldworker: ‘cause you see it as a potential risk?

Mike: Yeah.

Fieldworker: What would be a potential risk?

Mike: Basically, er, adult material on there. So potential embarrassment I guess.

Mike: With the decorator guy, it was more the general principle. There’s personal information on there.

Alice: Bank details and stuff, so we don’t want them.

Mike: Yeah, whereas if it was like family staying there, it’s more like the scenario where they just use the PC for something and stumble across a folder I’d rather they don’t stumble across.

It would be easy to render what Mike is saying here as being that he and Alice use passwords to ensure data privacy. They do, of course, want to protect their ‘bank details and stuff’, but there is more to it than that. Insofar as they do use passwords to ensure data privacy, they do so on a selective basis, rather than as a blanket policy. This will no doubt cause security advisors to throw their hands in the air. However, it is accountably