"Any serious usage" starts at "it just works".

JSON just works. Every language worth giving a damn about has a half-decent parser, and the syntax is simple enough that you can write valid JSON by hand. You won't hit the nasty edge cases, or need things like schemas, until well down the line, by which point you're already rolling with JSON.

XML doesn't "just work". There are like four decent libraries total, all of them heavyweight, with bindings in the common languages, and the syntax itself is heavy and verbose. And by the time you could possibly get to the "advanced features that make XML worth using", you've already bounced off the upfront cost of having to put up with XML.

Frontloading complexity ain't great for adoption - who would have thought.

> JSON just works.

Until it doesn't: underspecified numeric and string types; parses poorly if there's a missing bracket; no built-in comments.
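Two of those sharp edges are easy to demonstrate with Python's stdlib `json` (the JavaScript rounding figure below follows from JS storing all numbers as IEEE-754 doubles; the `id` field is just a made-up example):

```python
import json

# JSON leaves numeric precision up to the consumer. Python keeps this
# integer exact, but a JavaScript consumer of the same bytes would
# round it to 9007199254740992, since it exceeds
# Number.MAX_SAFE_INTEGER = 2**53 - 1.
parsed = json.loads('{"id": 9007199254740993}')
print(parsed["id"])  # exact in Python; lossy in JS

# And comments are simply not part of the grammar:
comment_error = None
try:
    json.loads('{"a": 1}  // not allowed')
except json.JSONDecodeError as exc:
    comment_error = exc
print(comment_error.msg)
```

Two parsers agreeing on the bytes but not the value is exactly the kind of thing "underspecified" buys you.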

For many applications it's fine. I personally think it's a worse basis for a DSL, though.

That's my point. By the time you hit "until it doesn't", you've already been doing JSON for a while.

Also, is "parse well if there's a missing bracket" even a desirable property? If you get files with mangled syntax, something has already gone horribly wrong. And, chances are, there is no way to parse them that would be correct.

By "parses well" in that case I mean "can identify where the error is, and maybe even infer the missing closing bracket if desirable"; i.e. error reporting and recovery.

If you've ever debugged a JSON parse error where the location of the error was the very end of a large document, and you're not sure where the missing bracket was, you'll know what I mean. (S-exprs have similar problems, BTW; LISPers rely on their editors so as not to come to grief, and things still sometimes go pear-shaped.)
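To make that failure mode concrete, here's a small sketch with Python's stdlib `json` (the document shape is invented for illustration): the `]` goes missing near the start, but the parser only gives up at the very end of the input.

```python
import json

# Build a large document whose closing "]" and "}" were never written:
doc = '{"users": [' + ", ".join('{"n": %d}' % i for i in range(1000))

error = None
try:
    json.loads(doc)
except json.JSONDecodeError as exc:
    error = exc

# The reported position is the end of the document, nowhere near
# where the brackets were actually forgotten:
print(f"error at char {error.pos} of {len(doc)} chars")
```

The parser isn't wrong, exactly: every prefix of a truncated array is still a valid prefix, so "expected ',' or ']'" at end-of-input is all it can honestly say. That's what makes recovery hard.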