Incorporating Social Trust into Design Practices for Secure Systems

P. Cofta, H. Lacohée, Paul Hodgson
DOI: 10.4018/jdtis.2010100101 (also published as book chapter DOI: 10.4018/978-1-61520-837-1.ch010)

International Journal of Dependable and Trustworthy Information Systems, 1(4), 1-24, October-December 2010. Copyright © 2010, IGI Global.

Abstract

Companies are increasingly dependent on modern information and communication technology (ICT), yet the successful adoption of ICT systems stubbornly hovers at only around 50%, adding disappointment to business losses. Trust (both inter-personal and technology-related) has significant explanatory power when it comes to technology adoption, but only as part of a systematic methodology. Therefore, understanding more fully the interaction between human processes and technology, by adding the richness of socio-technical considerations to the design process of ICT systems, should significantly improve adoption rates. At the same time, trust-based design has to demonstrate the (often neglected) business value of trust. ‘Designing for trust’, discussed in this chapter, is a design framework that consolidates trust governance and security management. Trust governance is a complete proposition that makes trust relevant to business practices, including the design and deployment of ICT systems. It combines the business justification of trust with an analytical framework, a set of relevant tools and methods, and a maturity model. This chapter discusses how ‘designing for trust’ brings trust governance into the design practices of ICT systems by complementing security-based methodologies, demonstrating the value of this approach.

[…] (2006), adding frustration and damaged reputation to lost investment and missed revenues.
While analysing the reasons for such a lack of success, it is apparent that failures can often be attributed to the lack of social adoption of such new systems. This lack of adoption often originates in inappropriately designed and applied security measures (Cranor & Garfinkel, 2005), which are either too lax (so that they expose vulnerabilities) or too stringent (so that they inspire creative rejection), or which are appropriate in strength but entirely ignore established practices. Note that quite often such security measures are designed in full accordance with requirements or specifications, yet they miss the importance of the social context of practical application (Lippert & Davis, 2006).

A system that fails to achieve adoption represents a business loss, but a system that is not fully or willingly adopted represents a significant security vulnerability, specifically if its users circumvent security controls by means of creative social practices. For example, even the most sophisticated access control does not provide security if users choose to use their access cards according to their perception of social relationships (and value systems) rather than according to security policies (Collins, 2007), or if the PIN code for a credit card is shared (Lacohee, Cofta, Phippen, & Furnell, 2008).

This phenomenon of ‘unintended consequences’ is best described in terms of affordance, a concept coined by Gibson (1986) and popularised in the field of HCI and design by Norman (1988), who applied it to everyday artefacts. Norman defined affordance as “the perceived and actual properties of the thing, primarily those fundamental properties that determine just how the thing could possibly be used.” Affordance, therefore, determines what an ICT system can be used for, following the intentions of its users, while specification and design concentrate on how the system is intended to be used by its original designers.
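The access-card example can be made concrete. Even a correctly specified control fails once cards are shared or passed back; operators therefore often supplement policy with anomaly detection. The sketch below (all names, locations and thresholds are hypothetical) flags a badge that appears at two distant readers closer together in time than anyone could walk, a common symptom of card sharing:

```python
from dataclasses import dataclass

@dataclass
class Swipe:
    badge_id: str
    reader: str       # reader location name
    timestamp: float  # seconds since some epoch

# Hypothetical minimum walking times (seconds) between reader locations.
MIN_TRAVEL_SECONDS = {
    ("lobby", "lab"): 120,
    ("lab", "lobby"): 120,
}

def flag_shared_badges(swipes):
    """Return badge IDs whose consecutive swipes imply impossible travel."""
    flagged = set()
    last_seen = {}  # badge_id -> most recent Swipe
    for s in sorted(swipes, key=lambda s: s.timestamp):
        prev = last_seen.get(s.badge_id)
        if prev is not None and prev.reader != s.reader:
            needed = MIN_TRAVEL_SECONDS.get((prev.reader, s.reader), 0)
            if s.timestamp - prev.timestamp < needed:
                flagged.add(s.badge_id)
        last_seen[s.badge_id] = s
    return flagged

swipes = [
    Swipe("b1", "lobby", 0.0),
    Swipe("b1", "lab", 30.0),    # 30 s apart, but the walk takes 120 s
    Swipe("b2", "lobby", 0.0),
    Swipe("b2", "lab", 300.0),   # plausible travel time
]
print(flag_shared_badges(swipes))  # {'b1'}
```

Note that such a check detects only the symptom; as the text argues, it does nothing to address the social relationships that motivate the sharing in the first place.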
The disparity between these two intentions creates a tension that eventually undermines system adoption. While the ‘answer’ to such ‘user challenges’ of ‘unintended consequences’ may lie partly in better education, improved usability or more stringent supervision, the underlying truth is that the deployment of an ICT system is a cause and enabler of a planned change (Lippert & Davis, 2006) and should therefore be designed with its immediate social environment in mind. Successful technologies owe a large part of their success to the fact that they fulfil or enhance an existing human need, or fit well into an already well-established social context. In common with other types of change, any unsubstantiated demand for a radical change of social practices will be met with rejection, creative re-use or even abuse. Therefore, a successful socio-technical approach to design should take into account the social relationships and practices that surround a given system, leading to improved acceptance rates.

Considering established software development practices, best design practice (Anderson, 2001) stresses that features such as security should be designed into the system as early as possible. This is corroborated by the general observation that changes made at the design stage are easier, faster and cheaper than changes undertaken later in the system’s lifetime (Boehm, 1981). Such approaches are well accepted, and existing tools and methodologies support them (e.g. Mouratidis & Giorgini, 2007). However, addressing security alone does not allow for the proper modelling of relationships between social agents, or between such agents and technology, where assurance can be achieved by means other than control.
Adding the richness of social considerations (specifically the notion of affordance) to the design process of ICT systems should therefore significantly improve design outcomes, decreasing potential security vulnerabilities as well as improving system adoption. It has been identified that trust (both inter-personal and technology-related) has significant explanatory power when it comes to technology adoption (Lippert & Davis, 2006).
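For trust to serve as an explanatory variable, it must be made measurable. One standard, minimal way to do this (an illustration of the general idea, not the trust-governance framework this chapter describes) is the beta reputation model, which estimates trustworthiness from counts of positive and negative past interactions:

```python
def beta_trust(positive, negative):
    """Expected trustworthiness under the beta reputation model:
    E[p] = (r + 1) / (r + s + 2), where r and s count good and bad
    outcomes. With no evidence at all the estimate is a neutral 0.5."""
    return (positive + 1) / (positive + negative + 2)

print(beta_trust(0, 0))  # 0.5  -- no evidence: neutral prior
print(beta_trust(8, 2))  # 0.75 -- mostly positive history
```

Even this toy model captures a property the chapter relies on: trust is built incrementally from evidence of behaviour, which is why it can be governed and designed for rather than merely hoped for.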
Citations: 5
