Code Project
Sybase Access Violation Error

Ankit RS
#1

I am using Qlik to write back to the database (Sybase IQ), but the insert query fails after inserting 3500 rows with an "access violation" error, even though those 3500 rows do get inserted into the target table. I have verified that the data beyond 3500 rows also inserts correctly if I run rows 3500 to 7000 separately, so it cannot be a data issue.

I have looked around for this error and tried setting the BufferCacheSize, PrefetchRows, and EnableBulkInsert properties, among others, but to no avail. If I execute the query with 1000 rows in a single insert, it fails with the same error; if, however, I insert the same data 100 or 500 rows at a time, the load succeeds.

Is there a limit on how much data the Sybase IQ ODBC driver can handle over an open connection? Any ideas/solutions would be much appreciated. Thanks.

ZurdoDev
#2

      Ankit RS wrote:

      Any ideas/solutions would be much appreciated.

      It sure does sound like you are hitting some sort of memory threshold. I'd suggest batching your records in smaller amounts.

      There are only 10 types of people in the world, those who understand binary and those who don't.
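Batching in smaller amounts, as suggested, can be sketched roughly like this. A minimal Python sketch, using an in-memory SQLite database as a stand-in for Sybase IQ (the `writeback` table and its columns are hypothetical, not from the original setup):

```python
import sqlite3

def insert_in_batches(conn, rows, batch_size=500):
    """Insert rows in fixed-size batches, committing after each batch
    so no single statement grows past the driver's comfort zone."""
    cur = conn.cursor()
    for start in range(0, len(rows), batch_size):
        batch = rows[start:start + batch_size]
        cur.executemany("INSERT INTO writeback (id, val) VALUES (?, ?)", batch)
        conn.commit()  # one commit per batch

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE writeback (id INTEGER, val TEXT)")
rows = [(i, f"row{i}") for i in range(7000)]
insert_in_batches(conn, rows, batch_size=500)
print(conn.execute("SELECT COUNT(*) FROM writeback").fetchone()[0])  # 7000
```

The same chunk-then-commit loop translates directly to the VBScript/ODBC side; only the connection and execute calls differ.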

Sascha Lefevre
#3

I second RyanDev's suggestion. I've seen a benchmark somewhere comparing insert speed against batch size, and it showed diminishing returns starting around a batch size of 50. I think it was for SQL Server, but I'd imagine it's similar for other DBMSes, which would mean you won't gain much with a batch size of 1000 anyway.

        If the brain were so simple we could understand it, we would be so simple we couldn't. — Lyall Watson
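For illustration, the batch-size trade-off can be measured with a sketch like this, again using an in-memory SQLite database as a stand-in (absolute numbers will differ from any real SQL Server or Sybase IQ benchmark; only the shape of the curve is the point):

```python
import sqlite3
import time

def time_batch_size(batch_size, n_rows=20000):
    """Time chunked executemany inserts at a given batch size,
    committing once per batch, against an in-memory SQLite table."""
    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE t (id INTEGER, val TEXT)")
    rows = [(i, f"v{i}") for i in range(n_rows)]
    start = time.perf_counter()
    for i in range(0, n_rows, batch_size):
        conn.executemany("INSERT INTO t VALUES (?, ?)", rows[i:i + batch_size])
        conn.commit()
    return time.perf_counter() - start

for size in (10, 50, 500, 1000):
    print(size, round(time_batch_size(size), 4))
```

Past a modest batch size, most of the per-row cost is gone and larger batches mostly buy risk rather than speed.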


Ankit RS
#4

Thanks Ryan and Sascha. I think I figured out the issue; it most certainly looks to be a threshold issue. On executing an insert inside a loop from 1 to 2000 in the query editor, Sybase throws up a pop-up stating (paraphrased) "The maximum number of entries for the editor is 1000; you should remove successful logs from the log file." There are options to either remove or keep the successful entries in the log. See the attached snapshot. Surely this is what causes the insert to fail when executed through Qlik, as there's no handler for the pop-up. (Funny, I can't locate the attach option here for the snapshot.)

Edit: Turns out the solution is to: 1) keep the number of rows in each insert execution (loop or otherwise) under 1000, and 2) include the begin- and commit-transaction statements as part of the query itself instead of using connection.BeginTrans and connection.CommitTrans in the VBScript code (which is what I was doing initially).
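The two-part fix described above can be sketched as follows: build each batch as a single SQL script with the transaction statements embedded in the text, keeping every batch under the 1000-row threshold. A hedged Python sketch (the `writeback` table name is hypothetical, and `BEGIN TRANSACTION`/`COMMIT` is generic syntax that should be checked against the Sybase IQ dialect in use):

```python
def build_batched_sql(rows, table, max_rows=999):
    """Build SQL scripts that embed BEGIN TRANSACTION / COMMIT around
    each batch of INSERTs, keeping every batch under 1000 rows."""
    scripts = []
    for start in range(0, len(rows), max_rows):
        batch = rows[start:start + max_rows]
        stmts = ["BEGIN TRANSACTION"]
        for row in batch:
            values = ", ".join(repr(v) for v in row)
            stmts.append(f"INSERT INTO {table} VALUES ({values})")
        stmts.append("COMMIT")
        scripts.append(";\n".join(stmts) + ";")
    return scripts

scripts = build_batched_sql([(i, f"v{i}") for i in range(2500)], "writeback")
print(len(scripts))  # 3 scripts: 999 + 999 + 502 rows
```

Each script is then sent as one execute call, so the transaction lives entirely inside the query text rather than in the client-side BeginTrans/CommitTrans calls.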


Sascha Lefevre
#5

            Glad you solved it! Cheers :)

            If the brain were so simple we could understand it, we would be so simple we couldn't. — Lyall Watson
