The architecture of human error
Briefly

"Take a look at the picture above. Countless dials, each presumably conveying critical information about the health of a nuclear reactor. Is this well designed? From the "functionality" perspective - yes, it works. It does what it is intended to do. But from a perspective of human error it couldn't be worse. The design makes it almost impossible to detect changes, to identify critical components, and most importantly, to make decisions based on the information."
"The result will be an inevitable catastrophe. There will be finger pointing. "Human Error" will be the cause. But to credit human error as the cause is to miss the point entirely. Human Error is a response to design. Not all mistakes are created equal. The type of mistake that occurs depends fundamentally on how the individual is thinking about things when when things go wrong."
Different stakeholders prioritize different aspects of good design: engineers emphasize functionality, while UX designers emphasize customer experience. Good design must also ensure consistent operation, reduce human error, and include safeguards for recovery. Complex, information-dense interfaces, such as a nuclear power plant control wall covered in countless dials, can function as intended yet be highly error-prone because they hinder change detection, component identification, and decision making. The consequences can be catastrophic failures whose blame is misplaced under the label of "human error." In reality, human error often arises from poor design, and the kinds of mistakes people make depend on how they are thinking when systems fail.
Read at Medium