The Valley of Dangerous Complacency occurs when a system works often enough that you let your guard down around it, even though the system still fails often enough that full vigilance is required.
- If a robotic car made the correct decision 99% of the time, you'd need to grab the steering wheel on a daily basis; you'd stay alert, and your robot-car-overriding skills would stay sharp.
- If a robotic car made the correct decision 100% of the time, you'd relax and let your guard down, but there wouldn't be anything wrong with that.
- If a robotic car made the correct decision 99.99% of the time, so that on only 1 day in 100 would you need to grab the steering wheel to avert a crash (the arithmetic is sketched below), the task of monitoring the car would feel very unrewarding and the car would seem pretty safe. You'd let your guard down and your driving skills would get rusty. After a couple of months, the car would crash.
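To make the bullets above concrete, here is the back-of-the-envelope arithmetic they rely on. This is a minimal sketch: the rate of 100 safety-relevant decisions per day is an assumption (not from the scenario itself), chosen so that 99% reliability works out to roughly one needed intervention per day.

```python
# Back-of-the-envelope arithmetic behind the bullets above.
DECISIONS_PER_DAY = 100  # assumed decision rate, not from the original scenario

for reliability in (0.99, 0.9999):
    errors_per_day = DECISIONS_PER_DAY * (1 - reliability)
    days_between_errors = 1 / errors_per_day
    print(f"{reliability:.2%} reliable: one needed intervention "
          f"every {days_between_errors:g} day(s)")

# 99.00% reliable: one needed intervention every 1 day(s)
# 99.99% reliable: one needed intervention every 100 day(s)
```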
Compare "Uncanny Valley" where a machine system is partially humanlike - humanlike enough that humans try to hold it to a human standard - but not humanlike enough to actually seem satisfactory when held to a human standard. This means that in terms of user experience, there's a valley as the degree of humanlikeness of the system increases where the user experience actually gets worse before it gets better. Similarly, if users become complacent, a 99.99% reliable system can be worse than a 99% reliable one, even though, with enough reliability, the degree of safety starts climbing back out of the valley.
Comments
Eric Rogstad
I'm reminded of this article: http://www.macroresilience.com/2011/12/29/people-make-poor-monitors-for-computers/, which provides some interesting examples.