{"title":"Accountability in ordinary action","authors":"P. Tolmie, Andy Crabtree","doi":"10.1049/pbse014e_ch3","DOIUrl":null,"url":null,"abstract":"ly, various concepts of privacy posit axioms that essentially revolve around the disclosure of personal information or ‘data’. Often cited definitions thus inform us that privacy is the ability to control the disclosure of personal information [27], to create and manage interpersonal boundaries [28] and to employ contextual norms to regulate the ad hoc flow of personal information between people [29]. Concretely, the studies make it visible that privacy ordinarily and accountably revolves around managing a polymorphous array of mundane activities in which the digital is embedded in the ongoing conduct of manifold human relationships. Within this lively context, the disclosure or sharing of personal data is accountably organised in terms of managing members’ access to devices, applications and content through situated practices, procedures or methods that exploit the local ecology, device visibility and recipient design. The studies make it perspicuous that members’ concern with privacy is a concern to manage their accountability in the digital world and that manifest in data sharing practices is an evolving calculus of accountability employed to manage the potential attack surface the digital creates in everyday life. That the digital poses a threat to privacy and therefore undermines societal trust in the digital is broadly acknowledged. Much less wellknown and understood is how and why this happens and what kinds of steps beyond demonstrable compliance with the law might need to be taken to remedy the situation. 4 Privacy by design for the internet of things 3.2 The naturally accountable organisation of digital privacy in the home Managing digital privacy is intimately bound up with the observability and reportability of one’s digital activities and how other people might be able to see them in the first place. Developers and security analysts alike recommend passwords as the first line of defense to protect oneself from prying eyes. Yet there is a more fundamental naturally accountable organisation to privacy in the real world, as one of our participants, Paul (not his real name), tells us: Paul: I’m not particularly fussed about setting up passwords and things. I mean there’s no threat of network hijacking here. We live in the middle of the countryside, miles away from another house, it’s just not an issue. So as Paul makes accountable, one of the simplest and most ubiquitous ways to constrain access and protect privacy is by controlling access to the environments in which our digital activities occur, i.e., controlling access to the places in which digital devices are kept. Paul’s was not the only example of password suspension for devices considered inaccessible that we encountered in our studies. It was commonplace for devices that always stayed in the home, such as desktop PCs, tablets and media servers. The reasoning generally bound up with this is that the people who have rights of access to the network and the devices on it are the people who have rights of access to the environment in which those devices are located. This point was underscored by Christine, a 63yearold reflexologist who runs her practice from home: Christine: I’m big on making sure that the whole house isn’t open to their view. I’m forever closing doors and shutting things off. I make sure everything is clear that I consider sensitive. 
When I used to be in that smaller room particularly, I did not want my laptop there open while I was about to start a treatment. Fieldworker: So that window into your digital world you prefer to have closed to clients as well? Christine: Closed off. Absolutely. Absolutely. And obviously I have to make sure that I don’t leave information about previous clients lying around for clients to see. Fieldworker: So it’s both professional and personal reasons kind of wrapped in together? Christine: Well absolutely because you know, I could be sued if people felt it was a breach of data protection. So I have to be aware of that. I don’t have anything lying around that in any way gives any window into anything. I don’t keep my purse or my phone in the room with me either. Fieldworker: OK, and with regard to the network, if you had clients backing up, where one was waiting for another one to go, would you give them access to the network? Accountability in ordinary action 5 Christine: No. Absolutely not. No, no. I wouldn’t even give them access to the televisionunless Brian’s [her husband] in the room watching it! This vignette elaborates the fairly blanket operation of a situated privacy practice designed to minimise the risk of the incidental sharing of personal information of any kind – analogue or digital – with visitors to the house where the relationship is confined to business. Despite the account being directed towards breaches of data protection, Christine is also closing doors and generally constraining the movements of her clients. It is not just data relating to other clients but also of her own personal life she is keeping from view. What this makes particularly clear, however, is that physical barriers such as doors (and curtains, and hedges, etc.) are the principal gateway controlling access to personal information. Privacy is also managed through device visibility, which is not only controlled through physical barriers but by members proximity to screens. This is observably and reportably the case for participants who wanted to share very specific content where that content was a feature of activities and repositories described as private. Here Evelyn, a 21yearold student is talking about how she might occasionally share content with others around her on Tumblr: Evelyn: I don’t like anybody seeing what’s going on on Tumblr. Occasionally I’ll show Susan [her sister] a picture or a gif of a cat. But I don’t. Fieldworker: You show it to her? Evelyn: Yeah. Fieldworker: Rather than you let her watch you doing the posting? Evelyn: Yeah. Being on a phone it’s a smaller screen so there is less oversight. I tend to keep it quite close to me anyway. So I’m fine with browsing Tumblr when, like, I’m sitting on a chair downstairs and there are people walking about, or when we’re about to watch something. Evelyn can and does manage to keep what she is doing private by relying on the size of the screen. However, the way she positions the screen in relation to herself and others is not incidental. Indeed, insofar as screens are positioned not to disclose content then there is a mutual orientation to that content that it is not for general consumption. This of course is common practice, found in all kinds of settings. Dourish et al. [30] found, for example, that computer screens are often positioned such that visitors to an office cannot see them, and Klasnja et al. 
[31] found that people try to preserve their privacy when using devices in public places by ‘finding a seat against the wall’ or ‘tilting or dimming the screen’. Members have a mutually understood right to keep things from view that they routinely trade upon, such that, what might at first sight be viewed as guarded behaviour is nothing of the sort, but rather an everyday method for making manifest the status of particular kinds of personal data as ‘not to be shared’. So, as a matter of method, privacy can be and is managed by controlling how one physically enables others in the environment to see one’s devices. There are a number of permutations 6 Privacy by design for the internet of things whereby this can be brought about. It might be that the recipient is in an adjacent position and it is just a matter of locating the thing to be shared and then reangling the device so that they can see it. It may also be the case that the device is not mobile, in which case sharing may involve calling the recipient over to the device. Locating the data may also be something that is done prior to sharing it or may be done once the recipient is actually copresent. But, however it occurs, privacy and data disclosure are routinely managed through the social positioning of devices and content in interaction. Now none of this is to say that passwords are not used. It is simply to recognise what we all take for granted: that privacy is principally managed through the methodical use of physical barriers to control members’ access to the places where devices are located; and by controlling the access to devices themselves through their social positioning and making content selectively available to others by allowing them in various ways to see screens. That said, our participants did have ‘occasion’ to use passwords, as Mike and Alice elaborate: Mike: The PC upstairs occasionally has a password. It usually doesn’t. It’s in the back room. The last time I put a code on was when we had a decorator in to do that room. I’ve done it once or twice when we’ve had guests staying. Alice: Yeah, when my nephew comes, ‘cause he just logs into everything. Fieldworker: It kind of depends on the guest? Mike: Yeah. Fieldworker: ‘cause you see it as a potential risk? Mike: Yeah. Fieldworker: What would be a potential risk? Mike: Basically, er, adult material on there. So potential embarrassment I guess. Mike: With the decorator guy, it was more the general principle. There’s personal information on there. Alice: Bank details and stuff, so we don’t want them. Mike: Yeah, whereas if it was like family staying there, it’s more like the scenario where they just use the PC for something and stumble across a folder I’d rather they don’t stumble across. It would be easy to render what Mike is saying here as being that he and Alice use passwords to ensure data privacy. They do, of course, want to protect their ‘bank details and stuff’, but there is more to it than that. Insofar as they do use passwords to ensure data privacy, then they do so on a selective basis, rather than as a blanket policy. This will no doubt cause security advisors to throw their hands in the air. 
However, it is accountably","PeriodicalId":179291,"journal":{"name":"Privacy by Design for the Internet of Things: Building accountability and security","volume":"561-565 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2021-10-05","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"1","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"Privacy by Design for the Internet of Things: Building accountability and security","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1049/pbse014e_ch3","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Citations: 1
Abstract
Abstractly, various concepts of privacy posit axioms that essentially revolve around the disclosure of personal information or ‘data’. Often-cited definitions thus inform us that privacy is the ability to control the disclosure of personal information [27], to create and manage interpersonal boundaries [28] and to employ contextual norms to regulate the ad hoc flow of personal information between people [29]. Concretely, the studies make it visible that privacy ordinarily and accountably revolves around managing a polymorphous array of mundane activities in which the digital is embedded in the ongoing conduct of manifold human relationships. Within this lively context, the disclosure or sharing of personal data is accountably organised in terms of managing members’ access to devices, applications and content through situated practices, procedures or methods that exploit the local ecology, device visibility and recipient design. The studies make it perspicuous that members’ concern with privacy is a concern to manage their accountability in the digital world, and that manifest in data sharing practices is an evolving calculus of accountability employed to manage the potential attack surface the digital creates in everyday life. That the digital poses a threat to privacy and therefore undermines societal trust in the digital is broadly acknowledged. Much less well-known and understood is how and why this happens, and what kinds of steps beyond demonstrable compliance with the law might need to be taken to remedy the situation.

3.2 The naturally accountable organisation of digital privacy in the home

Managing digital privacy is intimately bound up with the observability and reportability of one’s digital activities and how other people might be able to see them in the first place. Developers and security analysts alike recommend passwords as the first line of defence to protect oneself from prying eyes. Yet there is a more fundamental naturally accountable organisation to privacy in the real world, as one of our participants, Paul (not his real name), tells us:

Paul: I’m not particularly fussed about setting up passwords and things. I mean there’s no threat of network hijacking here. We live in the middle of the countryside, miles away from another house, it’s just not an issue.

So as Paul makes accountable, one of the simplest and most ubiquitous ways to constrain access and protect privacy is by controlling access to the environments in which our digital activities occur, i.e., controlling access to the places in which digital devices are kept. Paul’s was not the only example of password suspension for devices considered inaccessible that we encountered in our studies. It was commonplace for devices that always stayed in the home, such as desktop PCs, tablets and media servers. The reasoning generally bound up with this is that the people who have rights of access to the network and the devices on it are the people who have rights of access to the environment in which those devices are located. This point was underscored by Christine, a 63-year-old reflexologist who runs her practice from home:

Christine: I’m big on making sure that the whole house isn’t open to their view. I’m forever closing doors and shutting things off. I make sure everything is clear that I consider sensitive. When I used to be in that smaller room particularly, I did not want my laptop there open while I was about to start a treatment.
Fieldworker: So that window into your digital world you prefer to have closed to clients as well?

Christine: Closed off. Absolutely. Absolutely. And obviously I have to make sure that I don’t leave information about previous clients lying around for clients to see.

Fieldworker: So it’s both professional and personal reasons kind of wrapped in together?

Christine: Well absolutely, because you know, I could be sued if people felt it was a breach of data protection. So I have to be aware of that. I don’t have anything lying around that in any way gives any window into anything. I don’t keep my purse or my phone in the room with me either.

Fieldworker: OK, and with regard to the network, if you had clients backing up, where one was waiting for another one to go, would you give them access to the network?

Christine: No. Absolutely not. No, no. I wouldn’t even give them access to the television unless Brian’s [her husband] in the room watching it!

This vignette elaborates the fairly blanket operation of a situated privacy practice designed to minimise the risk of the incidental sharing of personal information of any kind – analogue or digital – with visitors to the house where the relationship is confined to business. Despite the account being directed towards breaches of data protection, Christine is also closing doors and generally constraining the movements of her clients. It is not just data relating to other clients but also her own personal life that she is keeping from view. What this makes particularly clear, however, is that physical barriers such as doors (and curtains, and hedges, etc.) are the principal gateway controlling access to personal information.

Privacy is also managed through device visibility, which is controlled not only through physical barriers but by members’ proximity to screens. This is observably and reportably the case for participants who wanted to share very specific content where that content was a feature of activities and repositories described as private. Here Evelyn, a 21-year-old student, is talking about how she might occasionally share content with others around her on Tumblr:

Evelyn: I don’t like anybody seeing what’s going on on Tumblr. Occasionally I’ll show Susan [her sister] a picture or a gif of a cat. But I don’t.

Fieldworker: You show it to her?

Evelyn: Yeah.

Fieldworker: Rather than you let her watch you doing the posting?

Evelyn: Yeah. Being on a phone it’s a smaller screen so there is less oversight. I tend to keep it quite close to me anyway. So I’m fine with browsing Tumblr when, like, I’m sitting on a chair downstairs and there are people walking about, or when we’re about to watch something.

Evelyn can and does manage to keep what she is doing private by relying on the size of the screen. However, the way she positions the screen in relation to herself and others is not incidental. Indeed, insofar as screens are positioned not to disclose content, there is a mutual orientation to that content: that it is not for general consumption. This of course is common practice, found in all kinds of settings. Dourish et al. [30] found, for example, that computer screens are often positioned such that visitors to an office cannot see them, and Klasnja et al. [31] found that people try to preserve their privacy when using devices in public places by ‘finding a seat against the wall’ or ‘tilting or dimming the screen’.
Members have a mutually understood right to keep things from view that they routinely trade upon, such that what might at first sight be viewed as guarded behaviour is nothing of the sort, but rather an everyday method for making manifest the status of particular kinds of personal data as ‘not to be shared’. So, as a matter of method, privacy can be and is managed by controlling how one physically enables others in the environment to see one’s devices. There are a number of permutations whereby this can be brought about. It might be that the recipient is in an adjacent position and it is just a matter of locating the thing to be shared and then re-angling the device so that they can see it. It may also be the case that the device is not mobile, in which case sharing may involve calling the recipient over to the device. Locating the data may also be something that is done prior to sharing it, or may be done once the recipient is actually co-present. But, however it occurs, privacy and data disclosure are routinely managed through the social positioning of devices and content in interaction.

Now none of this is to say that passwords are not used. It is simply to recognise what we all take for granted: that privacy is principally managed through the methodical use of physical barriers to control members’ access to the places where devices are located; by controlling access to devices themselves through their social positioning; and by making content selectively available to others by allowing them, in various ways, to see screens. That said, our participants did have ‘occasion’ to use passwords, as Mike and Alice elaborate:

Mike: The PC upstairs occasionally has a password. It usually doesn’t. It’s in the back room. The last time I put a code on was when we had a decorator in to do that room. I’ve done it once or twice when we’ve had guests staying.

Alice: Yeah, when my nephew comes, ‘cause he just logs into everything.

Fieldworker: It kind of depends on the guest?

Mike: Yeah.

Fieldworker: ‘cause you see it as a potential risk?

Mike: Yeah.

Fieldworker: What would be a potential risk?

Mike: Basically, er, adult material on there. So potential embarrassment I guess.

Mike: With the decorator guy, it was more the general principle. There’s personal information on there.

Alice: Bank details and stuff, so we don’t want them.

Mike: Yeah, whereas if it was like family staying there, it’s more like the scenario where they just use the PC for something and stumble across a folder I’d rather they don’t stumble across.

It would be easy to render what Mike is saying here as being that he and Alice use passwords to ensure data privacy. They do, of course, want to protect their ‘bank details and stuff’, but there is more to it than that. Insofar as they do use passwords to ensure data privacy, they do so on a selective basis, rather than as a blanket policy. This will no doubt cause security advisors to throw their hands in the air. However, it is accountably