Hacker News

The fact that lots of people want to comment their package.json file and can’t easily due to Crockford’s decision does not mean that Crockford’s argument is nonsense. Crockford’s argument is not that there shouldn’t be comments in JSON because no one would want to use them. And again, I would also like to have comments in JSON.


My point is that I don't buy Crockford's argument, at all, against having comments in JSON. "I removed comments from JSON because I saw people were using them to hold parsing directives, a practice which would have destroyed interoperability".

So instead you have people coming up with a million different bad hacks to support comments in non-standard ways (e.g. "fake comment" properties, duplicate keys, etc.). Coupled with that, since it's not in the standard, you never really know whether the file you're creating may be read by something that won't strip comments. It's the worst of all possible worlds in my opinion, with zero benefit.
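To make the hack concrete: a common workaround is to abuse a throwaway key as a "comment". A minimal Python sketch (the document contents here are my own invented example):

```python
import json

# A common hack: abuse a throwaway key as a "comment" (hypothetical example).
doc = '''
{
  "//": "this is a fake comment, visible to every consumer",
  "name": "my-package",
  "version": "1.0.0"
}
'''

data = json.loads(doc)
# The "comment" is just ordinary data; any consumer that doesn't know
# to strip the "//" key will see it as a real property.
print("//" in data)  # True
```

This is exactly the interoperability problem: the "comment" survives parsing as data, so every tool in the pipeline needs to agree on the stripping convention out of band.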


I’ve personally never encountered any combination of JSON-encoded data and JSON parser that didn’t work perfectly together. I don’t know whether the right trade-offs were made between interoperability and other features, but I’d say it’s very clear that if interoperability was a chief design goal, it was executed extremely well.


I don't understand why you keep making the same circular argument. If the lowest-level standard for JSON-encoded data also included comments and trailing commas, then I have no doubt that every combination of JSON-encoded data and JSON parser would be able to parse that. Instead, it wasn't put in the lowest-level spec, so what you have are lots of individual parsers with non-compatible custom flags. If Crockford's argument was "I got rid of comments because people were using them for preprocessor directives", then he just made the problem a million times worse: now there are a million different, incompatible "preprocessor directives", implemented as what are basically command-line args, just so a user can have a very basic feature, which is comments.


> If Crockford's argument was "I got rid of comments because people were using them for preprocessor directives", then he just made the problem a million times worse because now there are a million different, incompatible "preprocessor directives"

No, because his goal was to make JSON a highly-compatible interchange format, which it is! There are roughly zero incompatible preprocessor directives, and roughly zero problems using JSON to exchange data between parties. Seriously, when have you received JSON that you couldn’t immediately parse with your standard JSON parser of preference? The fact that some people might devise their own tools to store incompatible JSON on their end with things like comments, and strip those things out when transmitting it in order to make it compatible JSON, does not qualify as an incompatible preprocessor directive. On the contrary, that’s precisely the intent of the choice not to have comments in JSON.


> Seriously, when have you received JSON that you couldn’t immediately parse with your standard JSON parser of preference?

Every single time, for any non-mediocre definition of parsing. JSON has no way to transmit the meaning of values: no datetimes, no units, no sets, nor any other semantic attribute. That means your "parsed" result is always useless and wrong on its own, literally a lesser-dimensional projection of the original information, until you interpret it again in an entirely ad-hoc, custom manner. Wouldn't it be nice if you could store instructions inline for how to do that?
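The "lesser-dimensional projection" point can be seen in a few lines of Python: a timestamp arrives as a plain string and a set arrives as a list, so recovering the intended meaning takes a second, ad-hoc interpretation step. (The payload and the choice of ISO 8601 here are my own assumptions for illustration; `datetime.fromisoformat` needs Python 3.7+ for this string.)

```python
import json
from datetime import datetime

payload = '{"created_at": "2021-02-23 12:46:37.070+00:00", "tags": ["a", "a", "b"]}'
data = json.loads(payload)

# The timestamp arrives as a plain string and the intended "set" as a
# list with a duplicate; JSON itself carries neither semantic.
print(type(data["created_at"]))  # <class 'str'>

# Step two, entirely out-of-band: the receiver must already know the
# string is an ISO 8601 timestamp and the list is really a set.
created = datetime.fromisoformat(data["created_at"])
unique_tags = set(data["tags"])
```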


JSON Schema exists exactly for that reason; hacks like adding types are just nonsense. If somebody needs comments and types, XML has all that. Also, how hard is it to write {"type": "datetime", "value": "2021-02-23 12:46:37.07"}?


> JSON schema exist exactly for that reason, hacks like adding type is just nonsense

Creating schemas for declaring types is literally a hack for adding types! I think it's very bizarre to say "You don't need to do X because you can just do X."

> Also, how hard is it to write {"type": "datetime", "value": "2021-02-23 12:46:37.07"}?

"datetime" isn't a universal format specification. 2021-01-10 could be October 1 or January 10. What time zone is this in? Local to the sender, UTC, local to the recipient, somewhere else? Is this a 24 hour clock or something that gets a modifier for the other half of the day? You're making some classic mistakes that people make when they don't think about the complexity of the problem domain of communicating information unambiguously.


> Creating schemas for declaring types is literally a hack for adding type!

It is a "communication protocol" - base of any messaging system. Message has header with protocol name and version: HTTP, TCP, DB connection, SOAP, any message queue - everything working that way.

> "datetime" isn't a universal format specification. than use another (see the comment above)

A self-contained message is an anti-pattern.


> I’ve personally never encountered any combination of JSON-encoded data and JSON parser that didn’t work perfectly together.

As recently as 2017, the Google Translate API used to return completely invalid JSON, with empty array slots instead of nulls ([,,,] instead of [null,null,null,null]). This broke several JSON parsers until they were patched to work around Google's cavalier abuse of the format. Your experience of everything always working perfectly doesn't mean that everything always works perfectly.
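A strict, spec-conforming parser rejects that response outright. A quick Python check (the strings are reconstructed from the comment above, not verified against the real API):

```python
import json

# What a conforming API would return vs. the reported invalid variant.
valid = "[null,null,null,null]"
invalid = "[,,,]"

print(json.loads(valid))  # [None, None, None, None]

try:
    json.loads(invalid)
except json.JSONDecodeError as e:
    # The elided-element form is simply not JSON; a strict parser refuses it,
    # which is why client libraries had to add special-case workarounds.
    print("strict parser rejects it:", e)
```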


That sounds more like a bug that developers of JSON parsers had to work around on their end, because the group responsible for it was extremely important to the community and didn’t fix it. I’m not familiar with the details of that event, but it doesn’t sound like anyone was actually accepting this as a new valid variant of JSON, and AFAIK it didn’t lead to the propagation of different variants of JSON that are both in significant use but are incompatible. Or to put it another way, I don’t think people need to worry about whether they’re using “Google Translate JSON” or “Standard JSON.”


> I’m not familiar with the details of that event, but it doesn’t sound like anyone was actually accepting this as a new valid variant JSON

When they accept it, it becomes de facto valid. That's how acceptance works.

See also http://seriot.ch/parsing_json.php#41 for a big table of "JSON-encoded data and JSON parser that didn’t work perfectly together". Read the whole page though. It's quite enlightening.



