Exasperated
-
Several months back, we got a new customer using a cloud-based POS system that we hadn't worked with before. They're actually the first vendor we've encountered with a real API: they gave us a key and basically just said 'have fun!'. I'm used to figuring things out myself, so no problem; after all, it's just a web request that gets a response and does something with it. I dug in, found 90% of what I was looking for in their Swagger UI, and started pulling data by the month. There are two different requests/pulls: one returns around 120K records/month and the other around 240K records/month.

These ran fine until last weekend, when they both started timing out. I decided to break the requests into smaller parts to maybe help things along, so I added an outer loop to request one date at a time. Nope, still timing out. Increase the read timeout in the header. Still times out, so double it... still times out, so double it... Now I'm up to a read timeout of 3,000,000 (50 minutes!!!) and it's finally running, albeit very slowly and still with some timeouts that I'm currently re-running by hand. Increasing that value any further won't do any good, as the timeouts are coming at 2 minutes, likely due to a server setting beyond my control.

It kind of worked up to 38K records, then flat out refused to go any further. I just wrapped the request in a conditional loop so it can't get out until it actually finishes, or I manually stop it. Seven hours later it's been hitting their server all night with no more progress... oh well, just keep on trying then. I wonder if they ever check their logs?
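For the curious, the one-date-at-a-time pull looks roughly like this -- a minimal sketch with a bounded retry/backoff swapped in for my unconditional loop (the endpoint, key, and field names are made up):

```python
# Minimal sketch of the per-date pull, with bounded exponential backoff
# swapped in for the unconditional retry loop.
# Endpoint, API key, and field names are hypothetical.
import time
import requests

BASE = "https://pos.example.com/api/v1/SalesLines"  # hypothetical endpoint
HEADERS = {"Authorization": "Bearer YOUR_API_KEY"}

def pull_day(day, retries=5, timeout=120):
    """Fetch one business day's records, backing off between attempts."""
    params = {"$filter": f"BusinessDate eq {day}"}
    for attempt in range(retries):
        try:
            resp = requests.get(BASE, headers=HEADERS,
                                params=params, timeout=timeout)
            resp.raise_for_status()
            return resp.json()["value"]
        except (requests.Timeout, requests.HTTPError):
            if attempt == retries - 1:
                raise                    # give up after the last attempt
            time.sleep(2 ** attempt)     # back off: 1s, 2s, 4s, ...
```

At least that way a struggling server gets some breathing room instead of a hammering, and a day that never succeeds surfaces as an error instead of an all-night loop.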
"Go forth into the source" - Neal Morse "Hope is contagious"
-
I wonder when you'll get the DoS attack attempt notification. :-D /edit: forgot to mention: I feel your pain!
-
What's the back-end DB?
Caveat Emptor. "Progress doesn't come from early risers – progress is made by lazy men looking for easier ways to do things." Lazarus Long
-
I guess it raises the question: what does a client do with 120-240K records that a query on the server can't handle? And how many fields and bytes are we talking about?
"Before entering on an understanding, I have meditated for a long time, and have foreseen what might happen. It is not genius which reveals to me suddenly, secretly, what I have to say or to do in a circumstance unexpected by other people; it is reflection, it is meditation." - Napoleon I
-
abmv wrote:
whats the back end db? ............
I have absolutely no idea. The tables are exposed as objects in the Swagger UI, requests are made using OData (v1, apparently, as the aggregate extensions are not supported), and responses come back as JSON. The query/request I'm trying to run is not complicated: 7 columns from 2 tables joined on a PK/FK field. Maybe they need to wind up the rubber bands? :laugh:
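For what it's worth, the query is roughly this shape in OData terms -- a minimal sketch, with the endpoint, entity, and field names made up, and assuming v4-style nested $select inside $expand (an older server would want a flat $select=Customer/Name alongside $expand=Customer instead):

```python
# Minimal sketch of the 7-column, 2-table pull as an OData request.
# Endpoint, entity, and field names are hypothetical; the nested $select
# is the v4 spelling, older versions use $select=...,Customer/Name.
import requests

params = {
    "$select": "OrderId,BusinessDate,Qty,Price",
    "$expand": "Customer($select=CustomerId,Name,Region)",
    "$filter": "BusinessDate eq 2024-01-15",
}
resp = requests.get(
    "https://pos.example.com/api/v1/Orders",   # hypothetical endpoint
    headers={"Authorization": "Bearer YOUR_API_KEY"},
    params=params,
    timeout=120,
)
rows = resp.json()["value"]
```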
"Go forth into the source" - Neal Morse "Hope is contagious"
-
Gerry Schmitz wrote:
what does a client do with 120-240K records that a query on the server can't handle?
I didn't really elaborate on the fact that the records come in chunks of 2K. I've tried aggregates, but the server doesn't recognize the $apply directive, so I'm guessing they're running an old version of OData on the server.
Gerry Schmitz wrote:
how many fields and bytes are we talking about?
7 fields from 2 joined tables (highly normalized DB, BTW).
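Walking the 2K chunks is just a matter of following the server's continuation link until it stops handing one back -- a rough sketch, assuming a nextLink-style server (the property name varies by OData version):

```python
# Rough sketch: drain the 2K-record pages by following the continuation
# link. OData v4 spells it "@odata.nextLink"; v3 used "odata.nextLink",
# hence the fallback.
import requests

def pull_all(url, headers, timeout=120):
    rows = []
    while url:
        resp = requests.get(url, headers=headers, timeout=timeout)
        resp.raise_for_status()
        page = resp.json()
        rows.extend(page.get("value", []))
        url = page.get("@odata.nextLink") or page.get("odata.nextLink")
    return rows
```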
"Go forth into the source" - Neal Morse "Hope is contagious"
-
Maybe it's the wrong door. I would try FTP'ing the whole thing.
"Before entering on an understanding, I have meditated for a long time, and have foreseen what might happen. It is not genius which reveals to me suddenly, secretly, what I have to say or to do in a circumstance unexpected by other people; it is reflection, it is meditation." - Napoleon I