
I'd like to ask a question about JSON to get a feel for priorities of coders here

The Lounge (tags: question, json, help) - 54 posts, 24 posters

• honey the codewitch (original post):

Let's say you wanted to write a fast JSON parser. You could do a pull parser that does well-formedness checking, or you could do one that's significantly faster but skips well-formedness checking during search/skip operations, which can lead to later error reporting or missed errors. You can't make an option to choose one or the other, but you can avoid using the skip/search functions that do this in the latter case. Which do you do? Are you a stomp-the-pedal type or a defensive driver? (Seriously, this is more about getting a read of the room than anything - I want a feel for priorities)

    Real programmers use butterflies
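
To make the tradeoff concrete, here is a minimal C# sketch (hypothetical code, not from any parser mentioned in this thread) of the "stomp-the-pedal" style of skip: it only balances delimiters and respects strings, so anything malformed inside the skipped region sails through unreported. A defensive-driver skip would instead tokenize everything it passes over and fail on the first bad token.

```csharp
using System;
using System.IO;

static class SkipSketch
{
    // Fast skip over a JSON object or array value. Precondition (assumed, not
    // validated): the reader is positioned at the opening '{' or '['.
    // Only brace/bracket depth and string boundaries are tracked, so
    // ill-formed content inside the value (e.g. "[1,,2 tru]") is accepted silently.
    public static void SkipValueFast(TextReader r)
    {
        int depth = 0;
        bool inString = false, escaped = false;
        int c;
        while ((c = r.Read()) != -1)
        {
            char ch = (char)c;
            if (inString)
            {
                if (escaped) escaped = false;
                else if (ch == '\\') escaped = true;
                else if (ch == '"') inString = false;
            }
            else if (ch == '"')
            {
                inString = true;
            }
            else if (ch == '{' || ch == '[')
            {
                depth++;
            }
            else if (ch == '}' || ch == ']')
            {
                if (--depth == 0) return;   // value fully skipped
            }
        }
        throw new EndOfStreamException("unterminated value");
    }
}
```

The checked variant pays for classifying every token it skips; this one pays nothing, but also notices nothing.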

• BabyYoda (#2), replying to the original post:

    honey the codewitch wrote:

    Which do you do?

    Use NewtonSoft's. ;P

• PIEBALDconsult (#3), replying to the original post:

      I already told you a bit about mine. It's probably a bit permissive. I think your assumption of "search/skip operations" is not one which most others will even consider. I assume that most would not implement either of those, but instead want to have the whole entire document, because why else would you be parsing the thing anyway? As to well-formedness checking -- "You Ain't Gonna Need It" (the same as with XML). In my case, I had to "stomp-the-pedal" because I was given a short deadline to have a working solution for reading JSON files (75GB worth) and loading the data into SQL Server.

• PIEBALDconsult (#4), replying to BabyYoda (#2):

        Fortunately I'm not allowed to use third-party add-ins. I am awaiting access to the JSON support built into .net 4.7 and newer to see whether or not it can do what I require.

• Jorgen Andersson (#5), replying to the original post:

          Whichever lets me stream the file into a database, without clogging any system resources.

          Wrong is evil and must be defeated. - Jeff Ello Never stop dreaming - Freddie Kruger
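
One common shape of that approach, sketched here with Newtonsoft (which comes up later in the thread); the file name is made up. A streaming reader walks a huge JSON array so only the current element is ever held in memory, and each element is handed to the database writer as it appears.

```csharp
using System.IO;
using Newtonsoft.Json;
using Newtonsoft.Json.Linq;

class StreamingLoad
{
    static void Main()
    {
        using (var file = File.OpenText("dump.json"))       // hypothetical file
        using (var reader = new JsonTextReader(file))
        {
            while (reader.Read())
            {
                if (reader.TokenType != JsonToken.StartObject) continue;

                JObject record = JObject.Load(reader);       // exactly one array element
                // push 'record' to the database here; memory use stays flat
                // no matter how big dump.json is.
            }
        }
    }
}
```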

• Matthew Dennis (#6), replying to PIEBALDconsult (#4):

            JSON.NET, along with a large number of OSS projects, has been given to the [.NET Foundation Projects](https://dotnetfoundation.org/projects). This might reduce your company's reluctance to use it, plus the fact that until recently, JSON.NET was the package that Microsoft was using in their OSS projects (.NET Core, ASP.NET Core, ...). Also, System.Text.Json, the new MS JSON support, is a NuGet package, not part of the framework, and can be used as far back as .NET 4.6.1.
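
For reference, a minimal sketch of what that looks like once the System.Text.Json NuGet package is referenced (it ships a .NET Standard 2.0 build, which is why it reaches back to .NET Framework 4.6.1). The Show type and values here are invented for illustration, borrowing the Burn Notice example from later in the thread.

```csharp
using System;
using System.Text.Json;

class Show
{
    public string Name { get; set; }
    public int Seasons { get; set; }
}

class Demo
{
    static void Main()
    {
        string json = "{\"Name\":\"Burn Notice\",\"Seasons\":7}";

        // DOM-style access, no backing type needed:
        using (JsonDocument doc = JsonDocument.Parse(json))
        {
            Console.WriteLine(doc.RootElement.GetProperty("Seasons").GetInt32());
        }

        // Or bind straight to a POCO:
        Show show = JsonSerializer.Deserialize<Show>(json);
        Console.WriteLine(show.Name);
    }
}
```

Whether a NuGet package clears the "no third-party add-ins" bar is, of course, a policy question for PIEBALDconsult's shop.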

            "Time flies like an arrow. Fruit flies like a banana."

• Marc Clifton (#7), replying to the original post:

I would want the "text" of the JSON to be well-formed (proper braces, quotes, commas, colons, brackets, etc.), but as to the contents, whether they map or not to the backing entity doesn't much matter, though obviously things would break if a collection is expected and it's not a collection, or vice versa. Same with automatic data type conversion. So, yeah, basically I would want the "defensive driver" approach.

              Latest Articles:
              Thread Safe Quantized Temporal Frame Ring Buffer
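
One way to get that split - strict about the text, lenient about the mapping - is the default behavior of Newtonsoft, which is discussed below; the Widget type here is invented for illustration.

```csharp
using System;
using Newtonsoft.Json;

class Widget
{
    public string Name { get; set; }
    public int Count { get; set; }
}

class Demo
{
    static void Main()
    {
        // Truncated or unbalanced text fails at parse time:
        // JsonConvert.DeserializeObject<Widget>("{ \"Name\": \"x\" ");   // throws

        // Well-formed text maps leniently by default: unknown members are ignored,
        // missing members keep their defaults.
        var w = JsonConvert.DeserializeObject<Widget>("{ \"Name\": \"x\", \"Extra\": true }");
        Console.WriteLine($"{w.Name} {w.Count}");   // prints "x 0"

        // Strictness about the mapping can be opted into where it matters:
        var strict = new JsonSerializerSettings
        {
            MissingMemberHandling = MissingMemberHandling.Error
        };
        try
        {
            JsonConvert.DeserializeObject<Widget>("{ \"Oops\": 1 }", strict);
        }
        catch (JsonSerializationException ex)
        {
            Console.WriteLine("strict mapping rejected it: " + ex.Message);
        }
    }
}
```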

• CPallini (#8), replying to the original post:

                It depends on context, of course.

                "In testa che avete, Signor di Ceprano?" -- Rigoletto

• Slacker007 (#9), replying to the original post:

                  why are you not using Newtonsoft? Not sure why you are re-inventing the wheel here. :confused: NuGet Gallery| Newtonsoft.Json 12.0.3[^]

• honey the codewitch (#10), replying to Slacker007 (#9):

First of all, this is a hypothetical. Second, hosting the .NET CLR in C++ just to use a .NET package from C++ to parse a little JSON seems heavy-handed and horribly inefficient. Plus, C# won't run on Arduinos.

                    Real programmers use butterflies

• honey the codewitch (#11), replying to Marc Clifton (#7):

So what I'm hearing is: if it wasn't well-formed, you'd want it to error out as soon as the problem is caught, even if that meant a slower parse.

                      Real programmers use butterflies

• Slacker007 (#12), replying to honey the codewitch (#10):

                        fair enough.

• honey the codewitch (#13), replying to PIEBALDconsult (#3):

                          PIEBALDconsult wrote:

                          I assume that most would not implement either of those, but instead want to have the whole entire document, because why else would you be parsing the thing anyway?

                          In my JSON on Fire[^] article I present several cases where you only need a little data from a much larger dataset. Consider querying any mongoDB repository online. You don't need to parse everything you get back because the data they return is very large grained/chunky. You don't get fine grained query results with it. You get kilobytes of data at least, and on an IoT device you may just not have the room. The show information for Burn Notice from tmdb.com is almost 200kB. I know that because I'm using it as a test data set.

                          Real programmers use butterflies
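
Illustrating the "only a little data from a much larger dataset" point - this is a generic System.Text.Json sketch, not the article's library, and the payload and field names are made up: skim the tokens and pull out one value without ever building a DOM or copying the rest.

```csharp
using System;
using System.Text;
using System.Text.Json;

class Demo
{
    // Walk the token stream and return the first "name" property's value.
    // Nothing else in the (potentially very large) payload is materialized.
    static string FindName(ReadOnlySpan<byte> utf8Json)
    {
        var reader = new Utf8JsonReader(utf8Json);
        while (reader.Read())
        {
            if (reader.TokenType == JsonTokenType.PropertyName &&
                reader.ValueTextEquals("name"))
            {
                reader.Read();              // move onto the property's value
                return reader.GetString();
            }
        }
        return null;
    }

    static void Main()
    {
        byte[] payload = Encoding.UTF8.GetBytes(
            "{\"id\":1234,\"name\":\"Burn Notice\",\"overview\":\"imagine ~200kB more here\"}");
        Console.WriteLine(FindName(payload));   // Burn Notice
    }
}
```

That pattern is what makes this workable on a memory-constrained device: the reader keeps only a small amount of state (position, depth), not a copy of the document.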

• PIEBALDconsult (#14), replying to Matthew Dennis (#6):

Yeah, no, we can't deploy any third-party stuff to the servers; it has to be either built-in .NET or stuff we implement.

• honey the codewitch (#15), replying to Slacker007 (#12):

I should add that I originally wrote it in C# and then ported it to C++. Why did I write it in C#? Because I didn't know about NewtonSoft's JSON on the day I wrote it, and when I found out about it, it turned out NewtonSoft's pull parser sucks and is slow. I'm glad I did. People are religious about never reinventing the wheel, but it's not always such a bad thing - it depends on the wheel.

                              Real programmers use butterflies

• Slacker007 (#16), replying to honey the codewitch (#15):

                                we use Newtonsoft with all of our Web APIs, etc. never had any noticeable issues with performance. I guess if you are parsing big json files then, perhaps that is an issue, but we don't do that. so....

• PIEBALDconsult (#17), replying to honey the codewitch (#13):

                                  My needs are simple -- some other team sends us some number of JSON files and I need to load the data into SQL Server. In most cases, each JSON file contains one "table" of data so loading it into a table is simple. At most I may want to filter out large binary values which are of no use to us. And we trust the sender to have provided well-formed JSON -- if it isn't, we find out real fast and throw it back to them to fix. Well-formedness is one of those things you shouldn't be concerned about once you get your application to PROD. At this time, I'm consuming two sets of files from third-party products which those products also have to be able to read -- they're the configuration files for those products. The only untrustworthy set of data I consume is one which is generated by a utility I wrote, so if it's broken it's my fault and I can fix it.
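
A rough sketch of that kind of job - stream one "table" file into SQL Server while dropping oversized values along the way. Everything here (file path, column and property names, the 8 KB cutoff, connection string, batch size) is invented; it only shows the general shape of a streaming read feeding SqlBulkCopy, not PIEBALDconsult's actual loader.

```csharp
using System;
using System.Data;
using System.Data.SqlClient;
using System.IO;
using Newtonsoft.Json;
using Newtonsoft.Json.Linq;

class Loader
{
    const int MaxValueLength = 8192;   // arbitrary cutoff for "large binary values"

    static void Main()
    {
        var table = new DataTable();
        table.Columns.Add("Id", typeof(long));
        table.Columns.Add("Payload", typeof(string));

        using (var conn = new SqlConnection("Server=.;Database=Staging;Integrated Security=true"))
        using (var bulk = new SqlBulkCopy(conn) { DestinationTableName = "dbo.Incoming" })
        using (var file = File.OpenText(@"C:\drops\incoming.json"))
        using (var reader = new JsonTextReader(file))
        {
            conn.Open();
            while (reader.Read())
            {
                if (reader.TokenType != JsonToken.StartObject) continue;

                JObject record = JObject.Load(reader);        // one element at a time
                string payload = (string)record["payload"];
                if (payload != null && payload.Length > MaxValueLength)
                    payload = null;                           // filter out the big blobs

                table.Rows.Add(
                    (object)(long?)record["id"] ?? DBNull.Value,
                    (object)payload ?? DBNull.Value);

                if (table.Rows.Count == 10000)                // flush in batches so memory stays flat
                {
                    bulk.WriteToServer(table);
                    table.Clear();
                }
            }
            if (table.Rows.Count > 0) bulk.WriteToServer(table);   // final partial batch
        }
    }
}
```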

• honey the codewitch (#18), replying to Slacker007 (#16):

                                    If you ever find yourself bulk loading JSON dumps into a database, you can do better. Hell, you could use my tiny JSON C# lib which is around here at CP somewhere.

                                    Real programmers use butterflies

• honey the codewitch (#19), replying to PIEBALDconsult (#17):

This lib I wrote was originally in C#, and I ported it. I originally designed it (the C# version) to do bulk loads of data - basically exactly what you're doing, but perhaps a lot more of it.

                                      Real programmers use butterflies

• PIEBALDconsult (#20), replying to honey the codewitch (#15):

                                        If people didn't constantly reinvent the wheel, we'd still be using wooden wheels several feet in diameter. :laugh: Use the right wheel for the right job. Don't try to adapt to an existing wheel if it just doesn't do the job.

• honey the codewitch (#21), replying to PIEBALDconsult (#20):

                                          agreed!

                                          Real programmers use butterflies
