Cyber Security in the Age of Large-Scale Adversaries

Key achievements

  1. improve the usability of security and privacy mechanisms for engineers such as cryptographers, software developers, and system administrators
  2. develop novel methods that improve the usability of security and privacy mechanisms for end-users

Research Hub D: Usability

The Usability Hub is driven by the following Principal Investigators (PIs): Asia J. Biega, Markus Dürmuth, Sascha Fahl, Karola Marky, Veelasha Moonsamy, Alena Naiakshina, Nikol Rummel, M. Angela Sasse and Yixin Zou.

An important goal of CASA is to integrate human behavior and technical security research. We will focus both on end-user behavior and on the systems engineering process, covering everyone involved, from cryptographers to developers and engineers. Two research challenges will be pursued:

RC 10: Engineers and Usability

In this Research Challenge, we will focus on methods that enable us to improve the usability of security and privacy mechanisms for engineers such as cryptographers and software developers, but also for system administrators.

RC 11: End-Users and Usability

We plan to develop novel methods that enable us to improve the usability of security and privacy mechanisms for end-users. Our goal is also to enhance the usability of computer systems and environments with high security and privacy requirements.

Research Projects in HUB D

One of the most intractable security problems is usability: the art of designing security systems and policies so that it is easier to follow them correctly than to bypass them and create new vulnerabilities. Working on solutions to this problem is the research focus of Hub D.

Users Are Not the Enemy

Hub D leader Angela Sasse launched the field of usable security in 1999 with Users Are Not the Enemy, written with Anne Adams. The paper highlighted the problem of poorly implemented password requirements in a large company and made recommendations for designing systems and policies to make it easier for users to comply.

"In the 20 years since we've made some progress in terms of getting more usable forms of authentication," Sasse says. "It's not deployed everywhere yet, but it is happening."

What has become clear, however, is how many other vulnerabilities are a result of difficulties caused by poor design - for example, by tools and documentation that make it hard for developers to get security right in the first place.

Poor use of encryption

Encryption is an important example and the focus of one project. Even though it's a crucial technology for securing data, encryption has never been adopted as widely as it should be. The first paper highlighting this problem is also 20 years old: Why Johnny Can't Encrypt, by Alma Whitten and J. D. Tygar.

"When you look at encryption, I don't think there has been a lot of progress. In terms of it being deployed and people using it, it's not improved - and ultimately, security mechanisms that aren't used don't offer any protection," Sasse says. Most email is still not encrypted, even where companies clearly would benefit - for example, because the resulting authentication would block many phishing attacks.

Sasse is leading a project to understand why encryption is not more widely adopted. As a first step, the project is developing a map of the cryptography ecosystem, including its many stakeholders and researchers and what they do.

"We've tried to really look down the chain at who is involved in deciding to use it, then deploying it, and so on, to identify blockers and enablers to adoption."

One of the early findings concerns the decision of when to use secure communications. "If you have to use something special, that already means it's doomed to fail. It will always be an extra-special and difficult effort, and even if people are willing to do it they are more likely to make mistakes. It has to be a routine action, a habit."

In the first strand of this research, Sasse's group is interviewing veteran researchers about their experiences and where they think the barriers lie; the more provocative statements are then used in further interviews to untangle conflicting views. In addition, the group is asking both the companies that build and sell encryption software and the reluctant customers who would clearly benefit from it to identify the arguments, prejudices, and misunderstandings cited when companies decline or hesitate over adoption.

How media and research create false mental models

Key insights are already emerging. First, the implementation guidelines the authorities set for regulations such as Germany's Bundesdatenschutzgesetz (BDSG) data protection law are so complex and their restrictions so overwhelming that companies shy away unless they are legally required to act. Second, a meta-review of the literature on the usability of encryption reveals that even among developers - let alone ordinary users - mental models of how encryption works are sparse. Worse, the one piece of misinformation that dominates is the belief that encryption is easily broken. This is partly due to the greater attention the media and the research community give to reports of successful cracking, which overshadow the vast landscape of solidly working cryptography.

The belief that encryption is easily broken is, Sasse says, poisonous because, "It fosters the futility argument that actually it's not worth bothering. Also, awareness of end-to-end encryption is particularly low, so they think that their email provider or anyone writing the software can read it anyway." The project intends to build a glossary and some basic models for communication to help change these mistaken beliefs.

Lack of clarity about decision-making processes among developers

Another project, a collaboration between Sasse and Sascha Fahl, investigates the process by which developers make decisions in implementing security policies and protections. This is foundational research; perhaps surprisingly, there is no scientific or academic knowledge about this area.

"We know how security systems should be designed so end users can work with them better than they can with most systems implemented today," Fahl says. "But we don't know as a research community how developers make usable security decisions for their end users." On an ordinary login form, for example, who decides what the constraints on password formation should be, and how? What is the process for writing the warning messages in anti-virus software? Who  - designers, developers, product managers, user experience experts - makes the decisions? "The question basically is that for some reason bad security decisions end up in all sorts of products, but we don't know why."

The first part of the project consists of about 40 interviews to find out whether there are structured processes in software development life cycles and who makes the decisions. Based on the findings from the interviews, the second part will be to design and research new interventions.

In a third project, Fahl is working with Lucas Davi from Hub C on software security, exploring the results of adopting the ten-year-old Rust systems programming language. Rust was intended to replace the C and C++ languages with one designed to produce fewer vulnerabilities while offering similar run-time performance. This project has two aspects: first, whether the tools and documentation actually are better from a developer perspective, and second, whether the resulting code is better, assessed using static code analysis and other techniques.
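As a minimal, hypothetical sketch of what "fewer vulnerabilities" means in practice (it is not code from the project), the example below shows how Rust handles two classic C/C++ memory-safety hazards: an out-of-bounds read becomes a checked operation instead of undefined behavior, and a dangling reference is rejected by the compiler before the program can run.

```rust
fn main() {
    // Out-of-bounds access: instead of silently reading adjacent memory, as a
    // C/C++ buffer over-read would, `get` returns None for an invalid index.
    let buffer = vec![1u8, 2, 3, 4];
    match buffer.get(10) {
        Some(byte) => println!("byte at index 10: {byte}"),
        None => println!("index 10 is out of bounds; no undefined behavior"),
    }

    // Dangling references never make it past the compiler. The commented-out
    // lines below fail the borrow check: `s` is dropped at the end of the inner
    // block while `dangling` would still refer to it.
    //
    // let dangling: &String;
    // {
    //     let s = String::from("temporary");
    //     dangling = &s; // error[E0597]: `s` does not live long enough
    // }
    // println!("{dangling}");
}
```

Whether such compiler guarantees actually translate into better tools, documentation, and code in everyday development is precisely what the project sets out to measure.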

All these projects illustrate Hub D's special role within basic research in information security. Human factors are an essential part of security, but remain patchily researched. CASA hopes to set new milestones in this field.

Other projects of CASA

Hub D's work on usable security is one of the four research areas that make up the Cluster of Excellence CASA - Cyber Security in the Age of Large-Scale Adversaries. The other three are "Future Cryptography", led by Eike Kiltz (Hub A); "Embedded Security", led by Christof Paar (Hub B); and "Secure Systems", led by Thorsten Holz (Hub C).

Author: Wendy M. Grossman

General note: where gender-specific terms are used, they are intended to include all people who identify with that gender, regardless of their biological sex.