Under Community Review

Built-in JSON parser restriction - Key duplicates

Hi, 

Currently, the built-in JSON parser in Passolo has a restriction: when a key is used more than once in the same object, only the last occurrence is parsed and displayed as a translatable string:

{
    "latitude": {
        "INVALID_LAT": "Latitude - Version 1",
        "INVALID_LAT": "Latitude - Version 2",
        "INVALID_LAT": "Latitude - Version 3"
    },
    "longitude": {
        "INVALID_LAT": "longitude - Version 1",
        "INVALID_LAT": "longitude - Version 2",
        "INVALID_LAT": "longitude - Version 3"
    }
}

In that case, only two strings will be parsed: "Latitude - Version 3" and "longitude - Version 3". This happens because Passolo only parses the last occurrence of each key. Also, even though the strings under "longitude" use the same key as the ones above, Passolo still parses one of them, because that key sits in a different object structure in the file.
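
For reference, this "last key wins" behaviour is what most general-purpose JSON parsers do by default. The snippet below uses Python's standard json module purely as an illustration (it is unrelated to Passolo's own parser) and reproduces the effect with the example above:

import json

sample = '''
{
    "latitude": {
        "INVALID_LAT": "Latitude - Version 1",
        "INVALID_LAT": "Latitude - Version 2",
        "INVALID_LAT": "Latitude - Version 3"
    }
}
'''

# The earlier duplicates are silently discarded; only the last value survives.
data = json.loads(sample)
print(data["latitude"])  # {'INVALID_LAT': 'Latitude - Version 3'}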

The issue is that in large JSON files developers can mistakenly reuse a key within the same object. The earlier occurrences are then silently dropped and never parsed at all.

It would be great if Passolo displayed some kind of error message when a duplicate key is found within the same object structure of a JSON file, so that we can be sure all strings are translated. Or maybe also offer an option in the parser that allows duplicate keys to be parsed anyway.

Thank you,

  • Thank you, Achim.

    Just one quick comment: it can sometimes be hard to ask the developers to change keys on the spot. Rather than throwing an error in Passolo (and thus preventing the Passolo user from translating the file), could this be only a warning message, like the one we get when the source JSON is ANSI encoded?

    That would still allow us to process the file and not lose time.

    Thank you for the validator link; I already use another one (codebeautify.org/jsonviewer), which has nice features like exporting as XML, minifying the code, etc.

    Jerome Selinger

  • As an interim solution, I would suggest that the developers use tools like www.jsonlint.com to test the localization files for duplicate keys. That way, the duplicate keys can be found upfront.

    It might also be a good idea to build such a test into the JSON Parser Add-In and then stop parsing with an error message if duplicate keys are detected; a sketch of such a check follows below.
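
    A minimal sketch of what such a test could look like, written in Python and completely independent of Passolo's own implementation (the file name "strings.json" is only a placeholder): it inspects every object while the file is being parsed and flags keys that occur more than once, before the "last key wins" rule discards them. Whether duplicates are then reported as a hard error or only as a warning could be left as a parser option.

    import json
    import sys

    def detect_duplicates(pairs):
        # Called once per JSON object with its raw (key, value) pairs,
        # so duplicate keys are still visible at this point.
        seen = {}
        for key, value in pairs:
            if key in seen:
                print(f'WARNING: duplicate key "{key}" in the same object', file=sys.stderr)
            seen[key] = value
        return seen

    with open("strings.json", encoding="utf-8") as f:
        data = json.load(f, object_pairs_hook=detect_duplicates)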