Humans in the Loop and Education Don't Really Mix
Briefly

"The need for humans in the loop when automated systems are doing the bulk of the work is obvious. When the automation breaks, we need human judgment to set things right. The challenge for the humans in the loop is to make sure you understand the loop and to maintain sufficient attention over the automated loop to detect when intervention is necessary."
"Autopilot on planes is an obvious example of a human-in-the-loop system that seems to work. In this particular case, the human pilots are literally trained to maintain vigilance over these systems, and the systems are designed to require active input before changing something like heading or altitude."
"There are other human-in-the-loop systems where the human is not trained to practice vigilance and where the use of automation over time lulls the human into inattention because the automation appears to work so well, until it suddenly doesn't."
Human-in-the-loop systems position humans as monitors of automated processes, expected to intervene when the automation fails. The challenge lies in two areas: operators must understand how the system works, and they must sustain enough attention to detect when intervention is needed. Well-designed systems like airplane autopilot succeed because pilots are rigorously trained in vigilance and because the systems require active human input before making critical changes. Many automated systems lack these safeguards: because the automation appears reliable, operators grow complacent, which becomes dangerous when a failure finally occurs. This complacency is a significant risk in AI and education applications, where human oversight is assumed but not properly supported.