Tolerance For Ambiguity

Tolerance For Ambiguity is both a people property and potentially a software property, at least once software becomes better able to handle the "real world" than it currently can.

Ambiguity and even contradiction are common elements in real life. We seem to deal with them by compartmentalizing our cognitive domains, sometimes to the point that we are not even aware of the contradictions. This is a marked survival property. I think we need to teach machines how to do it too. Some people do it better than others. I think people who are comfortable with uncertainty handle ambiguity better than those who need a single right answer.

Here's an idea for what sort of software property "tolerance" might be:

"Tolerance" is a good name for the degree to which different implementations of a software interface or protocol can diverge from what the concrete applications in a system already do before compatibility problems arise. It is related to generality and fragility. Ideally you wouldn't want any divergence at all, but in practice there can be constraints that are more complex than simple adherence to all the interface signatures. Contracts get at this notion.
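One way to picture how a contract narrows tolerance beyond what the signatures say: a minimal Python sketch, with hypothetical names (Stack, BoundedStack, the capacity rule) invented for illustration.

```python
class Stack:
    """An interface whose signatures alone have loose tolerance:
    any object offering push and pop satisfies them."""
    def push(self, item): ...
    def pop(self): ...


class BoundedStack(Stack):
    """A contract beyond the signatures tightens the tolerance:
    implementations must also honor the capacity constraint,
    which is invisible in the interface itself."""
    def __init__(self, capacity):
        self._items = []
        self._capacity = capacity

    def push(self, item):
        # Precondition: room must remain -- a constraint that no
        # signature check would ever catch.
        assert len(self._items) < self._capacity, "stack full"
        self._items.append(item)

    def pop(self):
        assert self._items, "stack empty"
        return self._items.pop()
```

A caller written against plain Stack works with BoundedStack only as long as it also respects the tighter contract.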

Late binding is about not having to be sure about what object is going to answer your messages, after all. That has a ring of ambiguity. An abstraction with tight tolerances would make more demands on those who want to implement it than one with looser tolerances.
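The late-binding point can be shown in a few lines of Python: the sender has no idea which object will answer the message, only that something will. The class and function names here are made up for the example.

```python
class EnglishGreeter:
    def greet(self):
        return "hello"


class FrenchGreeter:
    def greet(self):
        return "bonjour"


def announce(speaker):
    # Late binding: we don't know (or care) what kind of object
    # answers the 'greet' message; anything that responds to it
    # will do. The ambiguity is resolved only at call time.
    return speaker.greet().upper()
```

An abstraction with tight tolerances would demand more of `speaker` than just "responds to greet".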

Tolerance could be considered a metric for the degree of specialization of a class. It would be related to the number of operations and transitively to the number of operations on all associated classes. As the specialization of a class or interface increases, its tolerance decreases.
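The operation-counting idea above can be sketched as a toy metric. This is purely illustrative: the `tolerance` function, its inverse-of-operation-count formula, and the two sample classes are all assumptions, not an established measure, and it ignores the "associated classes" part of the idea.

```python
def tolerance(cls):
    # Hypothetical metric: the fewer operations an interface
    # demands, the looser (higher) its tolerance. Counts only
    # public callables defined directly on the class.
    ops = [name for name, member in vars(cls).items()
           if callable(member) and not name.startswith("_")]
    return 1.0 / max(len(ops), 1)


class Readable:
    def read(self): ...


class RandomAccessFile:
    def read(self): ...
    def write(self, data): ...
    def seek(self, pos): ...
    def tell(self): ...
```

As the example suggests, the more specialized RandomAccessFile scores a lower tolerance than the one-operation Readable.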

What we need is a way of making software do something sensible even when it is used wrongly. People do that all the time, but software doesn't. A minimum tolerance for ambiguity as now practiced is something along the lines of saying "I don't understand" and sending information on how to use the software correctly. But to be robust, the implementation has to be deep.
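That minimum tolerance might look like the following sketch, where `set_volume` and its 0..10 range are invented for the example: on input it can't understand, the function answers with usage guidance instead of failing opaquely.

```python
def set_volume(level):
    """Accepts 0-10. On input it can't interpret, it replies with
    'I don't understand' plus usage help -- the minimum tolerance
    for ambiguity described above, not deep robustness."""
    usage = "I don't understand {!r}. Usage: set_volume(0..10)"
    try:
        value = int(level)
    except (TypeError, ValueError):
        return usage.format(level)
    if not 0 <= value <= 10:
        return usage.format(level)
    return "volume set to {}".format(value)
```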

It's bad practice for a system to guess what was meant when a bad value is received. It's good practice to validate inputs, process the ones that pass the edits, and requeue the rest for user intervention.
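The validate/process/requeue pattern just described can be sketched in a few lines. The function name and parameters are hypothetical; the point is only that failing items are kept for later rather than guessed at or dropped.

```python
def process_batch(transactions, is_valid, apply):
    """Validate each transaction, apply the ones that pass the
    edits, and return the rest to be requeued after user
    intervention -- never guess at what a bad value meant."""
    requeue = []
    for txn in transactions:
        if is_valid(txn):
            apply(txn)
        else:
            requeue.append(txn)
    return requeue
```

For example, posting only positive amounts and holding the rest:

```python
applied = []
leftover = process_batch([5, -1, 3], lambda t: t > 0, applied.append)
# applied now holds the valid transactions; leftover awaits review
```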

In AboutFace, AlanCooper suggests that in many or most cases, user interfaces which do not allow entries like "13/13/13" in a date field are just being paranoid; the application should not complain until the value is actually needed and found wanting, and only then if it causes a problem the software can't resolve internally without bothering the user.

His claim is that often this value will never be used for anything internally, and a human will be able to figure out the correct value, and in the meantime the interface won't be giving the user annoying feedback. This is similar to YouArentGonnaNeedIt; implementing it would certainly require ToleranceForAmbiguity in the supporting code.
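Cooper's deferral might be sketched like this: store whatever the user typed, and validate only at the point of use. The class name, the M/D/YY format, and the 2000+year assumption are all hypothetical choices made for the example.

```python
import datetime


class DeferredDate:
    """Accepts any raw entry without complaint; validation is
    deferred until the value is actually needed (a sketch of
    Cooper's suggestion, not his implementation)."""

    def __init__(self, raw):
        self.raw = raw  # "13/13/13" is accepted here, silently

    def as_date(self):
        # Validation happens only now, at the point of use.
        # Assumes M/D/YY; datetime.date raises ValueError for
        # impossible months or days.
        month, day, year = (int(p) for p in self.raw.split("/"))
        return datetime.date(2000 + year, month, day)
```

Entry never complains; only code that finally calls `as_date()` discovers that "13/13/13" is unusable.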

And a tolerance for getting an incorrect check at some future time, or a system crash at 3 AM. Sometimes I think Cooper is on drugs.

Some well-meaning validations can be damaging. An example: an inventory control system sees that a stock issue transaction would drive the balance on hand negative. Knowing that negative inventory is not a feature of the real world, it rejects the transaction. But the problem with the stock count probably occurred days ago, when some other issue or receipt was posted incorrectly. Refusing to process the current transaction just ensures that at least two real-world transactions are not reflected in the current inventory balance.

An inventory system that rejects transactions like that probably has other problems. It's making the assumption that it's fully synchronized with any other systems that affect inventory - no network problems or delays, no messages pending in a queue, etc.; it's assuming that its number is the correct one matching the real-world count; it's making assumptions about shrink, preorders, backorders, etc. In the real world of business, there can be a surprising amount of ambiguity in something as simple as a counter.
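The tolerant alternative to the rejection described above can be sketched as: accept the transaction anyway, and flag the discrepancy for investigation. The function name and its tuple return are assumptions made for illustration.

```python
def post_issue(balance, quantity):
    """Post a stock issue even if it drives the balance negative.
    Instead of rejecting the transaction (and losing it), flag the
    discrepancy -- the real error probably happened days earlier."""
    new_balance = balance - quantity
    flagged = new_balance < 0  # signal a probable earlier posting error
    return new_balance, flagged
```

The current transaction stays in the books, and the negative balance becomes a visible symptom for a human to investigate rather than a silent gap.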

{Often the customer doesn't know what they want until they see the negative impact of a missing feature or a problematic existing feature in their domain. A lot of things can be anticipated up front, but the customer will often get overwhelmed or agitated if too many questions and what-if failure scenarios are brought to their attention by an analyst. Sometimes the customer just has to learn the hard way. The "negotiation" of these can be sticky. I try to cover my ass by documenting a potential future issue or gap, but otherwise try not to become a "pest" about them. I once overheard a customer saying, "After dealing with nerds all day, I need a f___ing drink."}

View edit of December 8, 2012 or FindPage with title or text search