Many employees regard information security as secondary to their normal day-to-day work, which often leaves their organisation vulnerable to cyber attacks, particularly when they are stressed or tired. When users perform tasks that fit their own mental models (i.e. the way they view the world and how they expect it to work), the activities present less of a cognitive challenge than those that work against these models. If people can apply their previous knowledge and experience to a problem, less energy is required to solve it in a secure manner, and they are less mentally depleted by the end of the day.

For example, research on disk sanitisation highlighted the importance of secure file removal from the hard disk.[1] It is not obvious to users that emptying the ‘Recycle Bin’ is insufficient and that files can easily be recovered. However, there are software products that build on users’ mental models: they employ a ‘shredding’ analogy to indicate that files are being removed securely, echoing an activity they would perform with paper documents at work. Such an interface design might help lighten the burden on users; a short code sketch below illustrates what this kind of ‘shredding’ actually involves. Security professionals should pay attention to the usability of security mechanisms, aligning them with users’ existing mental models.

In The Laws of Simplicity,[2] John Maeda underlines the importance of making design more user-friendly by relating it to an existing experience. He cites the desktop metaphor introduced by Xerox researchers in the 1980s: people could relate to the graphical computer interface in a way they could not to the command line, manipulating objects much as they did on a physical desk – storing and categorising files in folders, as well as moving or renaming them, or deleting them by placing them in the recycle bin.
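To make the ‘shredding’ analogy concrete, here is a minimal Python sketch of the overwrite-before-delete approach such tools take (the function name, parameters and example file name are illustrative, not taken from any particular product or from the research cited above). Unlike emptying the Recycle Bin, which merely marks the space as free, it overwrites the file’s contents with random bytes before unlinking it. Bear in mind that on SSDs and on copy-on-write or journaling filesystems, an in-place overwrite is not a guarantee that the old data is unrecoverable.

import os


def overwrite_and_delete(path: str, passes: int = 1, chunk_size: int = 64 * 1024) -> None:
    """Overwrite a file with random bytes, then remove it.

    A simplified illustration of what 'file shredder' utilities do. On SSDs
    (wear levelling) and copy-on-write or journaling filesystems, overwriting
    in place does not guarantee destruction; full-disk encryption or vendor
    secure-erase commands are more reliable there.
    """
    length = os.path.getsize(path)
    with open(path, "r+b") as f:
        for _ in range(passes):
            f.seek(0)
            remaining = length
            while remaining > 0:
                n = min(chunk_size, remaining)
                f.write(os.urandom(n))   # replace old contents with random bytes
                remaining -= n
            f.flush()
            os.fsync(f.fileno())         # push the overwrite down to the device
    os.remove(path)                      # only then unlink the file


if __name__ == "__main__":
    # Hypothetical usage: dispose of a temporary working file.
    overwrite_and_delete("draft_report.tmp")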
Using mental models
Building on existing mental models makes it easier for people to adopt new technologies and ways of working. However, such mappings must take cultural background into account: the metaphor will not work if it is not part of the user’s existing mental model. For instance, the Apple Macintosh’s original trash icon was impossible to recognise in Japan, where users were not accustomed to metal bins of this kind.

Good interface design not only lightens the burden on users but can also complement security. Traditionally, security and usability have been assumed to contradict each other – security makes things more complicated, while usability aims to improve the user experience. In reality, they can support each other: effective design distinguishes constructive activities from destructive ones, making the former simple to perform while hindering the latter. This can be achieved by incorporating security activities into the natural workflow of productive tasks, which requires the involvement of security professionals early in the design process.

Security and usability should not be extra features introduced as an afterthought once the system has been developed; they should be an integral part of the design from the beginning. Security professionals can provide input into the design process through several methods, such as iterative or participatory design.[3] In the iterative method, each development cycle is followed by testing and evaluation; the participatory method ensures that key stakeholders, including security professionals, have an opportunity to be involved.
About the Author: Leron Zinatullin (@le_rond) is an experienced risk consultant specialising in cyber security strategy, management and delivery. He has led large-scale, global, high-value security transformation projects with a view to improving cost performance and supporting business strategy. He has extensive knowledge and practical experience in solving information security, privacy and architectural issues across multiple industry sectors. Visit Leron’s blog here: https://zinatullin.com/ To find out more about the psychology behind information security, read Leron’s book, The Psychology of Information Security.

Editor’s Note: The opinions expressed in this guest author article are solely those of the contributor, and do not necessarily reflect those of Tripwire, Inc.
[1] S. L. Garfinkel and A. Shelat, “Remembrance of Data Passed: A Study of Disk Sanitization Practices”, IEEE Security & Privacy, 1, 2003, 17–27.
[2] J. Maeda, The Laws of Simplicity, MIT Press, 2006.
[3] For iterative design, see J. Nielsen, “Iterative User Interface Design”, IEEE Computer, 26(11), 1993, 32–41; for participatory design, see D. Schuler and A. Namioka, Participatory Design: Principles and Practices, CRC Press, 1993.