
Donald Norman

The Design of Everyday Things

Nonfiction | Book | Adult | Published in 1988


Chapter 5 Summary and Analysis: “Human Error? No, Bad Design”

Chapter 5 argues that bad design, not human error, causes accidents and mishaps. Seeking the root causes of failures and redesigning systems to address them is the only way to avoid repeating disasters. Norman promotes human-centered design (HCD) as the path forward because it takes human behavior, including inattention, into account.

Understanding Why There Is Error

Norman argues that machines require people to behave in unnatural ways, resulting in errors. Staying alert for hours while multitasking, for example, can result in mistakes that lead to financial loss, injury, and even death. Investigators typically seek the root cause of failures, assign blame, and punish the guilty with fines, more training, job loss, or jail time. As Norman observes, however, accidents result from a chain of events and can only be prevented by disrupting the chain.

Norman objects to solely blaming people for errors, urging investigators to look more closely for root causes. To this end, he advocates a technique called the “Five Whys,” which stresses the need to ask as many questions as necessary to determine the root cause of a problem. This technique prompts investigators to look beyond human error and discover the underlying issues behind failures. Humans are creative, exploratory beings who are prone to error. Designing systems and products that take human behavior into account can minimize accidents.

Deliberate Violations

Norman distinguishes between intentional and unintentional deviations. Some failures result from the deliberate violation of rules and procedures, such as running a red light. Norman brackets out deliberate violations, deeming them beyond the scope of his book.

Two Types of Errors: Slips and Mistakes

Norman divides errors into two categories: slips and mistakes. Slips are action-based, while mistakes are rule- or knowledge-based. Slips occur when actions are not performed correctly, or when a person intends to do one action but ends up doing another. By contrast, mistakes happen when the goal or plan itself is wrong. Both types of errors can involve memory lapses, most of which are caused by interruptions. Memory lapses at the higher, conscious levels of cognition produce mistakes, while lapses at the lower, subconscious levels produce slips. Slips and mistakes can be understood in relation to the seven stages of action described in Chapter 2: Mistakes are errors in the establishment of goals and plans, while slips occur unconsciously during the execution of a plan.

The Classification of Slips

This section explains the difference between memory-lapse slips and action slips. Forgetting one’s briefcase on the way to work is an example of a memory-lapse slip, while unbuckling one’s wristwatch instead of the seatbelt is an action slip. Drawing on the work of psychologists, Norman calls the study of slips “the study of the psychology of everyday errors” (173). A slip might involve, for instance, being told one thing but hearing another. Slips generally result from a lack of attention. As such, they are more common among experts, who perform actions automatically, than among novices, who must pay close attention to their tasks.

Three types of slips are relevant to design: Capture slips involve performing a recent activity instead of the desired activity; description-similarity slips involve erroneously acting upon an item similar to the target; and mode errors occur when the same controls have different meanings, depending on a device’s mode or state.

Norman offers techniques to minimize slips. Avoiding procedures with identical opening steps is key to minimizing capture slips, while clearly distinguishing controls that serve different functions can minimize description-similarity slips. Although financial and space constraints prompt designers to assign multiple purposes to a single control, Norman recommends eschewing this approach to avoid user confusion. If modes are indispensable, designers should make it obvious to users which mode is currently active. Designers can also combat memory-lapse errors by minimizing the number of steps in a procedure and by providing reminders of the steps that remain to be completed.
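To make the advice about modes concrete, the following is a minimal Python sketch (an illustration only, not anything from Norman’s text) of a hypothetical device whose single dial means different things in different modes; the device announces its active mode with every action so the user never has to infer it:

```python
# Hypothetical clock radio: one dial, several modes. Because the same control
# has different meanings per mode, the interface states the active mode with
# every action, which is the remedy for mode errors described above.

class ClockRadio:
    MODES = ("set_time", "set_alarm", "set_volume")  # invented modes for illustration

    def __init__(self):
        self.mode = "set_time"  # the active mode is explicit, visible state

    def switch_mode(self, mode):
        if mode not in self.MODES:
            raise ValueError(f"unknown mode: {mode}")
        self.mode = mode
        print(f"[mode: {self.mode}]")  # announce the change immediately

    def turn_dial(self, amount):
        # Echoing the mode alongside the result keeps the user's mental model
        # aligned with the device's actual state.
        print(f"[mode: {self.mode}] dial turned by {amount}")


radio = ClockRadio()
radio.turn_dial(5)              # [mode: set_time] dial turned by 5
radio.switch_mode("set_alarm")  # [mode: set_alarm]
radio.turn_dial(5)              # [mode: set_alarm] dial turned by 5
```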

The Classification of Mistakes

This section distinguishes between rule-based, knowledge-based, and memory-lapse mistakes. Rules are learned through experience or from formal guides, such as users’ manuals. Some rule-based mistakes involve misinterpreting a situation and therefore following the wrong rule. Others involve correctly interpreting the situation but applying a rule that is itself faulty. Rule-based mistakes are hard to detect and avoid. To minimize them, designers must provide users with clear and complete guidance, ideally in graphical form.

In contrast to rule-based mistakes, knowledge-based mistakes occur when a situation is so novel that it requires a new procedure. Knowledge-based behavior involves problem-solving in unknown situations without relying on available rules or skills. Minimizing knowledge-based mistakes requires devising an appropriate conceptual model based on a good understanding of the situation. Cooperative problem-solving and the development of intelligent computer systems can also minimize knowledge-based mistakes.

Memory-lapse mistakes occur when people forget the goal or action plan. Interruption is the most common cause of memory lapses. Norman’s solutions for this type of mistake include ensuring that the relevant information is continuously available to users and providing guidance to users who must resume their activities after interruptions.
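As a purely illustrative sketch (the procedure, class, and method names are invented for this example), a system might guard against interruption-driven memory lapses by recording the user’s place in a multi-step task and restating the plan when the user returns:

```python
# Illustrative only: the system, not the user's memory, keeps track of which
# steps of a procedure are done, so work can resume cleanly after an interruption.

class ResumableTask:
    def __init__(self, steps):
        self.steps = steps
        self.next_index = 0  # persisted position in the procedure

    def complete_step(self):
        self.next_index += 1

    def resume_prompt(self):
        if self.next_index >= len(self.steps):
            return "All steps complete."
        done = ", ".join(self.steps[:self.next_index]) or "nothing yet"
        return (f"Completed so far: {done}. "
                f"Next step: {self.steps[self.next_index]}.")


# Invented example procedure for a medication workflow.
task = ResumableTask(["enter patient ID", "verify dosage",
                      "scan medication", "confirm administration"])
task.complete_step()
task.complete_step()
# ...an interruption happens here; on return, the system restates the plan:
print(task.resume_prompt())
# Completed so far: enter patient ID, verify dosage. Next step: scan medication.
```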

Social and Institutional Pressures

This section posits a relationship between social and institutional pressures and commercial accidents. Airline companies, for instance, are under immense pressure to keep their systems running, sometimes with disastrous consequences. Norman cites two aviation accidents to make this point. In 1977, a KLM Boeing 747 collided with a taxiing Pan American plane because the pilot took off without clearance in hopes of minimizing delays. Economic pressures, combined with poor weather conditions, contributed to the accident. Similarly, in 1982, an Air Florida flight departing Washington, DC, crashed into the Fourteenth Street Bridge shortly after takeoff because rushing crews did not de-ice the plane correctly. This accident occurred despite warnings from the first officer, who expressed concerns to the captain during the takeoff roll. Norman hypothesizes that the captain ignored his first officer’s warnings because of the pressure to stay on schedule, though he lacks evidence to support this claim.

Norman argues that placing safety above economic pressures is key to minimizing commercial accidents. Although he offers design strategies to increase safety, he does not address the pressure to maximize profits in capitalistic systems. Studies show that checklists reduce accidents, especially when performed collaboratively, as in commercial aviation (191). An effective checklist must be developed iteratively, continually refined, and designed according to human-centered principles. Electronic checklists are beneficial because they keep track of skipped items and cannot be marked as complete until all items are addressed. Despite the research attesting to their effectiveness, many industries continue to resist checklists.
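The electronic-checklist behavior described above can be sketched in a few lines; this is an illustrative example rather than anything from the book, and the checklist items are invented:

```python
# Illustrative electronic checklist: skipped items stay visible, and the list
# refuses to report itself complete while any item remains unaddressed.

class Checklist:
    def __init__(self, items):
        self.pending = list(items)  # items not yet addressed
        self.skipped = []           # items deliberately deferred, kept in view
        self.done = []

    def complete_item(self, item):
        self.pending.remove(item)
        self.done.append(item)

    def skip_item(self, item):
        self.pending.remove(item)
        self.skipped.append(item)

    def is_complete(self):
        # Cannot be marked complete until every item has been addressed.
        return not self.pending and not self.skipped


takeoff = Checklist(["flaps set", "de-icing verified", "trim set"])
takeoff.complete_item("flaps set")
takeoff.skip_item("de-icing verified")  # deferred, not forgotten
takeoff.complete_item("trim set")
print(takeoff.is_complete())  # False: the skipped item still blocks completion
```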

Reporting Error

Errors are often hard to detect. Additionally, social pressures make it difficult for people to admit their errors or report those of others. Individuals who make errors can be fined, mocked, or imprisoned, while corporations can be sued. Norman argues that the only way to reduce errors is to admit that they happen and to take action to reduce their occurrence. People should be encouraged to report errors, not stigmatized or punished for committing them.

Norman presents NASA’s Aviation Safety Reporting System as a model for other industries. This program lets pilots submit reports of their own errors to NASA personnel, which are then used to enact safety changes. The names of pilots are removed during this process, preventing airline companies and the Federal Aviation Administration (FAA) from punishing individuals. When patterns emerge in the error reports, NASA makes recommendations to the airlines and the FAA to address the problems. Many fields would benefit from similar oversight but lack a neutral body like NASA.

Detecting Error

Detecting errors quickly is key to minimizing harm. Action slips are generally easier to detect than memory-lapse slips and mistakes, both of which lack error signals. As Norman notes, the absence of something that should have been done is harder to detect than the presence of something that should not have been done (195). Memory-lapse slips entail forgetting a component of the plan, while memory-lapse mistakes entail forgetting the plan in its entirety. People are often slow to detect errors because they explain away clues. However, the clues are generally clear in hindsight. Accident analysis requires analysts to put themselves in the shoes of those who made the error, to consider similar events and training, and to make recommendations to avoid similar errors in the future.

Designing for Error

This section argues that good design both reduces opportunities for error and offers users opportunities to correct errors. Most products and systems are designed to be used correctly and cease to function when users take inappropriate action. Many designs compound the problem by making it easy to err but hard to discover the problem and recover. Norman urges designers to understand the causes of errors and to create products and systems that minimize these causes.

Designers must strive to prevent errors and allow users to detect and correct errors when they occur. Adding constraints can block some errors. For example, automobile manufacturers separate the filling points for different liquids, such as gas and oil, making it unlikely that users will confuse the two. Allowing users to undo wrong actions is also important, as is providing confirmation and error messages. Indeed, many computer systems require confirmation before executing certain commands, such as deleting files. Designers can improve confirmations by making the item being acted upon more prominent and by making the actions reversible.
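A minimal, hypothetical sketch of the two safeguards mentioned here, a confirmation that names the exact item being acted on and an undo step that keeps the action reversible, might look like this:

```python
# Illustrative file manager: destructive actions require a confirmation that
# makes the affected item prominent, and deletions remain reversible.

class FileManager:
    def __init__(self, files):
        self.files = set(files)
        self.trash = []  # deleted items are recoverable, so the action is reversible

    def delete(self, name, confirm):
        if name not in self.files:
            return False
        # Name the exact object of the action in the confirmation prompt.
        if not confirm(f"Delete '{name}' from your documents?"):
            return False
        self.files.remove(name)
        self.trash.append(name)
        return True

    def undo_delete(self):
        if self.trash:
            self.files.add(self.trash.pop())


fm = FileManager(["thesis_final.doc", "notes.txt"])
fm.delete("thesis_final.doc", confirm=lambda prompt: True)  # user confirms reflexively
fm.undo_delete()  # the slip is recoverable because deletion was reversible
print(sorted(fm.files))  # ['notes.txt', 'thesis_final.doc']
```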

In addition to “undo” features, Norman recommends creating electronic systems with sensibility or common-sense checks, such as banking systems that detect obvious currency errors. Minimizing mode errors requires eliminating modes or making modes as distinct from one another as possible. Similarly, designers can minimize slips by ensuring that actions and controls are either dissimilar or placed far apart. According to Norman, the most effective way of mitigating slips is to provide rapid and perceptible feedback describing the error and ways of rectifying the problem. Since interruptions are key sources of error, designers must create products and systems that allow users to resume the action cycle easily after an interruption.
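The banking example of a sensibility check can be illustrated with a short sketch; the threshold and transaction history below are assumptions made for the example, not figures from the text:

```python
# Illustrative sensibility check: before executing a transfer, compare the
# amount against recent activity and question anything wildly out of range.

def looks_sensible(amount, recent_amounts, factor=100.0):
    """Return True if the amount seems plausible, False if it needs review."""
    if not recent_amounts:
        return True
    typical = sum(recent_amounts) / len(recent_amounts)
    # An amount two orders of magnitude above typical activity is more likely
    # a slip (a misplaced decimal point, say) than a real intention.
    return amount <= typical * factor


history = [45.00, 120.50, 60.25]           # invented recent transactions
print(looks_sensible(100.00, history))     # True: ordinary payment
print(looks_sensible(100000.00, history))  # False: likely a keying slip, ask the user
```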

When Good Design Isn’t Enough

People, and not designs, are sometimes responsible for errors. Alcohol, drugs, and sleep deprivation can cause people to misuse even the best designs. Drunk driving, for example, remains a major cause of car accidents, even in vehicles with advanced safety features (211). Such errors fall into the category of deliberate violations and lie outside the scope of Norman’s study.

Resilience Engineering

Norman presents resilience engineering as an approach in keeping with HCD because it takes human error as a given and plans accordingly. Industrial infrastructure, such as electrical plants and oil refineries, must be resilient enough to withstand extreme weather events and external attacks. Resilience engineers design systems, procedures, and training to protect this sensitive infrastructure. Assessment, testing, and updating are key aspects of resilience engineering. For example, some computer providers deliberately cause system errors to test how well their systems and staff respond. These small, controlled tests are relatively safe and allow companies to improve their security.

The Paradox of Automation

According to Norman, automation can reduce, but not eliminate, errors. Automation has become more capable over time. However, automated tasks are becoming increasingly complex and thus more likely to fail. When automated systems fail, they tend to do so without warning, giving people few chances to respond in a timely manner.

Design Principles for Dealing with Error

This section advocates bridging the gap between people and machines. People are creative, versatile, and flexible, while machines are inflexible, precise, and rigid. Norman urges designers to create machines with people in mind, rather than forcing people to adapt to machines. He characterizes human error as nothing more than human action that is inappropriate for the needs of machines.

Norman recommends eliminating the concept of error and acknowledging that people need help translating their goals into the appropriate form for machines. Errors are inevitable. However, it is the designer’s job to minimize opportunities for error by putting the necessary knowledge about their products in the world, harnessing the power of natural and artificial constraints, and bridging the gulfs of execution and evaluation through perceivability. In sum, designers must embrace error, understand its causes, and take steps to minimize it.
