Fog Creek Software
Discussion Board




Joel's latest post about strict clients

What was his point except to show there is an exception to every piece of common wisdom?

Here's the counter example:

http://diveintomark.org/archives/2004/01/14/thought_experiment

Gregg Tavares
Monday, July 26, 2004

I disagree with that "thought experiment" counter example.  If the XML/HTML world actually became strict when it said strict (currently it just wags a finger and makes tsk-tsk noises), of course those sorts of problems would come out, but in time they would be ironed out. Plenty of systems already screen input data for invalid sequences and reject them; avoiding SQL injection is a good example. This wouldn't be any different.
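To make that concrete, here's roughly what the screening looks like in Python with sqlite3 (the table and column names are invented for illustration): the parameter never gets a chance to become part of the SQL, so the "invalid sequence" is just harmless data.

import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (id INTEGER PRIMARY KEY, name TEXT)")
conn.execute("INSERT INTO users (name) VALUES (?)", ("alice",))

def find_user(conn, username):
    # The driver treats `username` strictly as data, so hostile input such as
    # "x'; DROP TABLE users; --" cannot change the shape of the statement.
    cur = conn.execute("SELECT id, name FROM users WHERE name = ?", (username,))
    return cur.fetchone()

print(find_user(conn, "alice"))                     # (1, 'alice')
print(find_user(conn, "x'; DROP TABLE users; --"))  # None, and the table survives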

HTML on the web is quite a mess due to how liberally the browsers accept it. With a set of closed systems in a corporate environment, I'm sure the gain from Postel's Law outweighs the bad. But with open standards, eventually the standards are replaced with something rather viscous, similar in texture to hot toffee, but without the sweet taste.

The fundamental point is that Postel's Law essentially dilutes standards, and later on it creates conflicts between the different (undocumented) "liberal" dialects that each implementation ends up supporting.

Still, it's not something that I think can be realistically avoided. It only takes one liberal client to start the whole thing off. Once one product accepts bad data, everyone has to join in; otherwise, say, you wouldn't be able to "surf the web fully" the way the competing product can. It's not possible to put the toothpaste back into the tube, especially that stripey one.

Joel Goodwin
Monday, July 26, 2004

I think you need to split it into phases:

1) Development: everything should be set to zero tolerance.  At this stage you want to spot every fault as early as possible, because the earlier you catch a bug, the cheaper it is to fix.

2) Production: everything should be set to maximum tolerance.  Everything should be able to limp along whenever possible, because once you've gone live, the longer it takes to get things up and running again, the more it costs you in lost revenue.
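A rough sketch of what that split might look like in Python, with a single invented toggle doing the work (a real system would want this configurable per subsystem rather than a module constant):

import logging

STRICT = True   # development build; flip to False for production


def validate(record):
    # Toy check standing in for real validation: every record needs a positive "id".
    if record.get("id", 0) <= 0:
        raise ValueError("missing or non-positive id")


def handle(record):
    try:
        validate(record)
    except ValueError as err:
        if STRICT:
            raise                # development: fail at the earliest, cheapest moment
        logging.warning("skipping bad record %r: %s", record, err)
        return None              # production: log it and limp along
    return record["id"]


handle({"id": 7})               # fine in either mode
# handle({"name": "oops"})      # raises under STRICT; logged and skipped in production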

I think the best analogy is training with weights.  When training for a race you can make the training more effective by strapping weights on, because then you're going to have to work much harder.

On the day of the race, you do not want to be wearing those weights.

Ged Byrne
Monday, July 26, 2004

My interpretation of "being liberal with what you accept" has a slight twist to it.  Your program should be able to catch whatever crap is thrown at it.  Now that doesn't mean it should cheerfully and silently digest it and try to interpret what should be done with the dreck.

Rather, no input should be able to kill the program. Take it, look at it, and if it is crap, don't process it, but advise the sender that the input was invalid and exactly what was wrong with it.
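In code that boils down to something like the following Python sketch (the temperature field and its plausible range are made up for illustration): nothing kills it, but crap gets a specific complaint back instead of a best guess.

def parse_temperature(text):
    """Accept any input without dying, but reject bad input with a precise reason."""
    text = text.strip()
    if not text:
        return None, "empty input"
    try:
        value = float(text)
    except ValueError:
        return None, "not a number: %r" % text
    if not -90.0 <= value <= 60.0:
        return None, "outside plausible range -90..60 C: %s" % value
    return value, None


for raw in ["21.5", "", "banana", "9999"]:
    value, error = parse_temperature(raw)
    if error:
        print("rejected %r: %s" % (raw, error))   # tell the sender exactly what was wrong
    else:
        print("accepted %s" % value)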

Ken Ray
Monday, July 26, 2004

At the point when this was first discussed, all those years ago, it was considered sane to take a standard which says...

The first byte after the type marker MUST be either 00 (meaning use type A behaviour) or 01 (meaning type B); other bytes MUST NOT be sent.

...and interpret that as an OK to use the byte after the type marker as an offset into your two-entry function pointer array. Because the standard says MUST, and anyone who disobeys the standard deserves to have the system fall apart around their ears, right?
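The fix is a few lines of checking before the dispatch. Here's the shape of it in Python (the message layout is invented, and of course the original sin was C, where a bad index really does jump through arbitrary memory rather than raise an error):

HANDLERS = {
    0x00: lambda payload: ("type A behaviour", payload),
    0x01: lambda payload: ("type B behaviour", payload),
}


def dispatch(message: bytes):
    # Assumed layout: message[0] is the type marker, message[1] is the byte
    # the standard constrains, and the rest is payload.
    if len(message) < 2:
        raise ValueError("message too short to contain a type byte")
    type_byte = message[1]
    handler = HANDLERS.get(type_byte)
    if handler is None:
        # MUST in the standard binds the sender, not us; a buggy or hostile
        # peer can still send 0x37, so check before dispatching instead of
        # indexing a two-entry table with whatever arrived off the wire.
        raise ValueError("invalid type byte 0x%02x" % type_byte)
    return handler(message[2:])


# dispatch(b"\xff\x00hello")  -> ("type A behaviour", b"hello")
# dispatch(b"\xff\x37hello")  -> ValueError instead of a wild jump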

Indeed, it appears that even Cutler's NT4 kernel team felt this way, and it took a lot of service packs, and eventually W2K, to fix all the barge-sized holes in the system call API and network stack that resulted.

But today we know (or we should know) not to do this. The principle of robustness augments the idea of defensive programming. When driving, you always need to be ready to stop, even if the traffic is moving smoothly; when opening a file, you always need to be ready to handle a "file not found" or even a "too many symlinks", even if you just created that same file; when parsing an XML element the closing angle bracket just might never come.

Handling all possibilities and blundering on regardless are not the same thing. Buying a monster truck so that you'll just go right over smaller cars is not defensive driving, and accepting </p< h1> as meaning </p><h1> is not robustness. Sometimes if you can't open a file, the right thing to do is say "I couldn't open $thatfile". Rarely is it better to just pick any recently used file and hope that will do.
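For the file case, the whole of the right behaviour fits in a dozen lines of Python (the function and the error wording are mine, purely illustrative): name the file and the actual failure, and don't guess.

import errno


def read_config(path):
    # Be ready for the file to be missing, unreadable, or a symlink loop even
    # if we wrote it a moment ago: the filesystem can change underneath us.
    try:
        with open(path, "r", encoding="utf-8") as fh:
            return fh.read()
    except FileNotFoundError:
        print("I couldn't open %s: file not found" % path)
    except OSError as err:
        if err.errno == errno.ELOOP:
            print("I couldn't open %s: too many levels of symbolic links" % path)
        else:
            print("I couldn't open %s: %s" % (path, err.strerror))
    return None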

The reason Mosaic accepted broken HTML is not because of the principle of robustness. A brief examination of the code, or a simple experiment (sit a ten-year-old kid down with an early web era Mosaic and let them bash away...), will quickly confirm that robustness was not a high priority. Mosaic accepted, and in some cases even encouraged, broken HTML because it was easier: simple programmer laziness.

Tialaramex
Monday, July 26, 2004
