Sybase Access Violation Error
-
I am using Qlik to write back to a Sybase IQ database, but the insert query fails after inserting 3500 rows with an "access violation" error, even though those first 3500 rows do reach the target table. I have verified that the data beyond 3500 rows also inserts fine if I run the load separately for rows 3500 to 7000, so it cannot be a data issue.

I have looked around for this error and tried setting the BufferCacheSize, PrefetchRows, EnableBulkInsert, etc. properties, but to no avail. If I execute the query with 1000 rows in a single insert, it fails with the same error. If, however, I insert the same data 100 or 500 rows at a time, the load succeeds.

Is there a limitation on how much data the Sybase IQ ODBC driver can handle over an open connection? Any ideas/solutions would be much appreciated. Thanks.
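For reference, the writeback is roughly this shape, shown here as a minimal VBScript/ADO sketch with the DSN, table, and column names as placeholders:

    ' Simplified sketch of the failing load (DSN, table, and column names are placeholders)
    Set conn = CreateObject("ADODB.Connection")
    conn.Open "DSN=SybaseIQ;UID=myuser;PWD=mypassword"

    conn.BeginTrans
    For i = 1 To 7000
        conn.Execute "INSERT INTO target_table (id, val) VALUES (" & i & ", 'x')"
    Next                 ' fails with "access violation" around row 3500
    conn.CommitTrans

    conn.Close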
-
Ankit RS wrote:
Any ideas/solutions would be much appreciated.
It sure does sound like you are hitting some sort of memory threshold. I'd suggest batching your records in smaller amounts.
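Something along these lines, committing every few hundred rows; the batch size and names here are only illustrative:

    ' Batching sketch: commit every batchSize rows so a single huge
    ' transaction never accumulates on the driver side (names are illustrative)
    batchSize = 500
    Set conn = CreateObject("ADODB.Connection")
    conn.Open "DSN=SybaseIQ;UID=myuser;PWD=mypassword"

    conn.BeginTrans
    For i = 1 To 7000
        conn.Execute "INSERT INTO target_table (id, val) VALUES (" & i & ", 'x')"
        If i Mod batchSize = 0 Then
            conn.CommitTrans   ' flush this batch
            conn.BeginTrans    ' start the next one
        End If
    Next
    conn.CommitTrans
    conn.Close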
There are only 10 types of people in the world, those who understand binary and those who don't.
-
I second RyanDev's suggestion. I've seen a benchmark somewhere comparing insert speed against batch size, and it showed diminishing returns starting around a batch size of 50. I think it was for SQL Server, but I'd imagine it's similar for other DBMSes, which would mean you won't gain much with a batch size of 1000 anyway.
If the brain were so simple we could understand it, we would be so simple we couldn't. — Lyall Watson
-
Thanks Ryan and Sascha. I think I figured out the issue: it most certainly looks to be a threshold issue. On executing an insert inside a loop from 1 to 2000 in the query editor, Sybase throws up a pop-up stating (paraphrased) that the maximum number of entries for the editor is 1000 and that successful logs should be removed from the log file, with options to either remove or keep the successful entries. See attached snapshot. Surely this is what causes the insert to fail when executed through Qlik, since there is no handler for the pop-up. (Funny, I can't locate the attach option here for the snapshot.)

Edit: Turns out the solution is:
1) Keep the number of rows in each insert below 1000 per execution (loop or otherwise), and
2) Include the begin and commit transaction statements as part of the query itself instead of using connection.BeginTrans and connection.CommitTrans in the VBScript code (which is what I was doing initially).
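In code, the working version looks roughly like this; a sketch assuming the driver accepts semicolon-separated statements in one batch (which is what worked here), with names as placeholders:

    ' Working sketch: transaction control lives in the SQL text itself and
    ' each Execute stays under 1000 rows (names are placeholders)
    batchSize = 500
    Set conn = CreateObject("ADODB.Connection")
    conn.Open "DSN=SybaseIQ;UID=myuser;PWD=mypassword"

    For b = 0 To 13                      ' 14 batches of 500 rows = 7000 rows
        sql = "BEGIN TRANSACTION;" & vbCrLf
        For i = b * batchSize + 1 To (b + 1) * batchSize
            sql = sql & "INSERT INTO target_table (id, val) VALUES (" & i & ", 'x');" & vbCrLf
        Next
        sql = sql & "COMMIT TRANSACTION;"
        conn.Execute sql
    Next
    conn.Close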
-
Glad you solved it! Cheers :)
If the brain were so simple we could understand it, we would be so simple we couldn't. — Lyall Watson