Exasperated

The Lounge · help · design · sysadmin · hosting · cloud
8 Posts · 4 Posters
  • kmoorevs
    #1

    Several months back, we got a new customer using a cloud-based POS system that we hadn't worked with before. They are actually the first vendor we have encountered that had a real API; they gave us a key and basically just said 'have fun!'. I'm used to figuring things out myself, so no problem; after all, it's just a web request that gets a response and does something with it. I dug in and, using their Swagger UI, found 90% of what I was looking for and started pulling data by the month. There are 2 different requests/pulls: 1 returns around 120K records/month and the other around 240K records/month.

    These ran fine until last weekend, when they both started timing out. I decided to break up the requests into smaller parts to maybe help things along, so I added an outer loop to request one date at a time. Nope, still timing out. Increase the read timeout in the header. Still times out, so double it...still times out, so double it... Now I'm up to a read timeout of 3,000,000 ms (50 minutes!!!) and it's finally running, albeit very slowly and still with some timeouts that I am currently re-running manually. It won't do any good to increase that value any more, as the timeouts are coming at 2 minutes, likely due to a server setting beyond my control.

    It kind of worked up to 38K recs, then flat out refused to go any further. I just wrapped the request in a conditional loop so it can't get out until it actually finishes, or I manually stop it. 7 hours later, it's been hitting their server all night with no more progress...oh well, just keep on trying then. I wonder if they ever check their logs?

    "Go forth into the source" - Neal Morse "Hope is contagious"

  • In reply to kmoorevs' #1 (quoted above)

  • Rage
    #2

    I wonder when you'll get the DoS attack attempt notification. :-D /edit: forgot to mention: I feel your pain!

    Do not escape reality : improve reality !

  • In reply to kmoorevs' #1 (quoted above)

  • abmv
    #3

    What's the back-end DB?

    Caveat Emptor. "Progress doesn't come from early risers – progress is made by lazy men looking for easier ways to do things." Lazarus Long

    We are in the beginning of a mass extinction. - Greta Thunberg

  • In reply to abmv's #3 (quoted above)

  • Rage
    #4

    Probably this[^].

    Do not escape reality : improve reality !

  • In reply to kmoorevs' #1 (quoted above)

  • Lost User
    #5

    I guess it raises the question: what does a client do with 120-240K records that a query on the server can't handle? And how many fields and bytes are we talking about?

    "Before entering on an understanding, I have meditated for a long time, and have foreseen what might happen. It is not genius which reveals to me suddenly, secretly, what I have to say or to do in a circumstance unexpected by other people; it is reflection, it is meditation." - Napoleon I

  • In reply to abmv's #3 (quoted above)

  • kmoorevs
    #6

    abmv wrote:

    What's the back-end DB?

    I have absolutely no idea. The tables are exposed as objects in the Swagger UI, requests are made using OData (v1 apparently, as aggregate extensions are not supported), and responses come back as JSON. The query/request I'm trying to run is not complicated (7 columns from 2 tables joined on a PK/FK field). Maybe they need to wind up the rubber bands? :laugh:

    "Go forth into the source" - Neal Morse "Hope is contagious"

  • In reply to Lost User's #5 (quoted above)

  • kmoorevs
    #7

    Gerry Schmitz wrote:

    what does a client do with 120-240K records that a query on the server can't handle?

    I didn't really elaborate on the fact that the records come in chunks of 2K. I've tried aggregates, but the server doesn't recognize the $apply directive, so I'm guessing they are running an old version of OData on the server.

    Gerry Schmitz wrote:

    how many fields and bytes are we talking about?

    7 fields from 2 joined tables (highly normalized DB, btw).

    "Go forth into the source" - Neal Morse "Hope is contagious"

  • In reply to kmoorevs' #7 (quoted above)

  • Lost User
    #8

    Maybe it's the wrong door. I would try FTP'ing the whole thing.

    "Before entering on an understanding, I have meditated for a long time, and have foreseen what might happen. It is not genius which reveals to me suddenly, secretly, what I have to say or to do in a circumstance unexpected by other people; it is reflection, it is meditation." - Napoleon I
