Doh! Of course it was easier to implement. IETF wants a working open source implementation before standardising.

Have you ever tried to implement an ITU standard from just reading the specs? It's hard. First you have to spend a lot of money just to buy the specs. Then you find the spec was written by somebody with a proprietary product, tiptoeing along a line that reveals just enough information to keep the standards body happy (i.e., enough to make the specification worth purchasing) while never revealing the secret sauce in their implementation.

I've done it, and it's an absolute nightmare. The IETF RFCs are a breath of fresh air in comparison. Not only are the specs free to read, there are example implementations!

And if you think that didn't lead to a better outcome, you're kidding yourself. The ITU process naturally favours a small number of large engineering orgs publishing just enough information to interoperate with each other, while keeping enough hidden that the investment required discourages the rise of smaller competitors. The result is that even now I can (and do) run my own email server. If the overly complicated, bureaucratic ITU standards had won the day, I'm sure email would have been run by a small number of CompuServe-like rent-seeking parasites for decades.

Given that the general public uses social network services for electronic messaging today, and those don't even pretend to want to be interoperable, we've got parasites of a totally different class on top of the Internet infrastructure.

Remember Jabber/XMPP? At least they tried to interoperate. Google Talk had interoperability as its main feature at the beginning, but Google quickly scrapped that.

UPDATE: some say that's because XMPP was too all-encompassing a standard (a format that allows too much loses usefulness, like saying a binary file format can store anything). IMO that's not the reason; they could have just supported their own subset. They scrapped interoperability purely for competitive reasons, IMO.

> IETF wants a working open source implementation before standardising.

I don't think that's IETF policy. Individual IETF working groups decide whether to request publication of an RFC, and the availability of open source implementations is a strong argument in favour of publication, but not a hard requirement.

If the IETF standards are sometimes useful, it's more a matter of culture than of policy.

A great example of this was PKIX, whose policy was "we'll publish it as a standard and someone else will have to figure out how to make it work". There are 20-year-old standards-track PKIX documents that have no known implementations.

I have been told that ITU specifications are deliberately confusing so that their authors can sell consulting services.

However, I think DER is good (and better than BER, PER, etc., in my opinion). (I did make up a variant with a few additional types, though.)
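
For anyone who hasn't worked with ASN.1: the property that makes DER attractive is that, unlike BER, every value has exactly one valid encoding, which is why it's used where byte-for-byte comparison or signing matters (certificates, for instance). Here's a minimal sketch in Python of the tag-length-value discipline for INTEGER; the function names are my own and this ignores the many other universal types:

```python
def der_length(n: int) -> bytes:
    """Length octets: short form below 128, otherwise minimal long form."""
    if n < 0x80:
        return bytes([n])
    body = n.to_bytes((n.bit_length() + 7) // 8, "big")
    return bytes([0x80 | len(body)]) + body

def der_integer(value: int) -> bytes:
    """Tag 0x02 INTEGER: two's complement, no redundant leading octets."""
    if value >= 0:
        nbytes = value.bit_length() // 8 + 1
    else:
        nbytes = (value + 1).bit_length() // 8 + 1
    body = value.to_bytes(nbytes, "big", signed=True)
    return b"\x02" + der_length(len(body)) + body

# DER admits exactly one encoding per value; BER would also accept
# e.g. 02 02 00 7F for 127, which DER forbids.
assert der_integer(0) == b"\x02\x01\x00"
assert der_integer(127) == b"\x02\x01\x7f"
assert der_integer(128) == b"\x02\x02\x00\x80"   # leading 0x00 keeps it positive
assert der_integer(-128) == b"\x02\x01\x80"
```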

OID is also a good idea, although I had thought they should add another arc for identifiers derived from various kinds of other identifiers (telephone numbers, domain names, etc.), together with a date for which that identifier is valid (to avoid issues with reassigned identifiers), as well as the possibility of automatic delegation for some types (so that, e.g., if you register an account on another system you can get a free OID from it too; there is a bit of difficulty in some cases, but it might be possible). (I have written a file about how to do this, although I have not published it yet.)
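
Since OIDs came up: the wire format is simple enough to sketch. In BER/DER the first two arcs are folded into one subidentifier as 40*arc1 + arc2, and each later arc is written base-128 with the high bit set on every octet but the last (per X.690); a hypothetical new arc like the one described above would serialize the same way. A rough Python illustration of my own (short-form length only, so it assumes the content stays under 128 bytes):

```python
def encode_oid(*arcs: int) -> bytes:
    """Content octets for an OBJECT IDENTIFIER, wrapped in tag 0x06."""
    first = 40 * arcs[0] + arcs[1]        # first two arcs share one subidentifier
    body = bytearray()
    for arc in (first, *arcs[2:]):
        chunk = [arc & 0x7F]
        arc >>= 7
        while arc:
            chunk.append(0x80 | (arc & 0x7F))  # continuation bit on all but the last octet
            arc >>= 7
        body.extend(reversed(chunk))
    assert len(body) < 0x80               # short-form length only, for brevity
    return b"\x06" + bytes([len(body)]) + body

assert encode_oid(2, 5, 4, 3) == bytes.fromhex("06 03 55 04 03")                   # id-at-commonName
assert encode_oid(1, 2, 840, 113549) == bytes.fromhex("06 06 2a 86 48 86 f7 0d")   # rsadsi
```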