The problem with that is twofold:
- Browsers can afford to be tolerant because it usually doesn’t matter if a web page looks a little wonky. When a computer misunderstands a program because of a misplaced comma and it doesn’t say so right away, space rockets fall from the sky. I say “when”, not “if”, because it’s a thing that actually happened in history. More than once.
- Even in browsers, it could be super-frustrating when a web page just didn’t look right and there was no way to tell what you did wrong, because the oh-so-tolerant browser wouldn’t tell you. (See also: the infamous ed text editor, which just displays a ? when something is wrong.) Sure enough, modern browsers have consoles that can be used to check for errors.
You can’t learn from mistakes without feedback. Even cars have a “check engine” light.
True, but if you read the entire post you’ll find that Terence Eden isn’t just writing about why XHTML was a bad idea. He’s saying that people should be more tolerant than compilers and that we shouldn’t tolerate the intolerant.
Well, sure, machines are machines and people are people. We shouldn’t try to turn one into the other (which is exactly what technologists have been trying to do for decades). I often say we should be forgiving precisely because the laws of physics aren’t.
The specific examples, however, awaken the cynic in me that I keep trying to bury. Yes, humans are all too happy to use and abuse the creations of those they torture to death. Aaron Swartz also springs to mind. But keep in mind that ours is a culture that deems autistic people socially handicapped for (checks notes) having a strong sense of justice and refusing to be unfair or to suffer unfairness. Most people want to be sloppy, or worse, without suffering consequences, and sometimes that’s simply not an option.
There’s a time for muddling through things all happy-go-lucky and a time for self-discipline. And we should know when to demand the latter.