Making a row-based Iteration of a Bulk-Insertion

Posted in: Database
Tags: database, performance, help, question, announcement
SkyRunner (#1)

We are looking to change this code to make its progress watchable and to improve its performance. We want to turn the SQL query below into a loop that does the `Select-Insert` in batches of 1000 records per iteration. On each iteration we'd like a printed message showing the last inserted item and, if possible, the elapsed time for that iteration. The code:

INSERT INTO [tData2] ([Key], Info1, Info2)
SELECT [Key], Info1, Info2
FROM [tData]
-- Conditions were removed as they weren't related to this question

Your help is really appreciated.
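
For reference, a minimal sketch of the kind of batched loop being asked for, in T-SQL. It assumes `[Key]` is a unique, ordered `INT` and that `tData2` starts empty; the table and column names come from the question, while the variable names and message format are illustrative.

```sql
-- Minimal sketch of a batched Select-Insert loop. Assumes [Key] is a unique,
-- ordered INT and that tData2 starts empty; table/column names are from the
-- question, everything else is illustrative.
DECLARE @BatchSize INT = 1000;
DECLARE @LastKey   INT = NULL;       -- highest key copied so far
DECLARE @Rows      INT = 1;
DECLARE @ElapsedMs INT;
DECLARE @Start     DATETIME2;
DECLARE @Copied    TABLE ([Key] INT);

WHILE @Rows > 0
BEGIN
    SET @Start = SYSDATETIME();

    -- Copy the next batch of keys above the last one already copied
    INSERT INTO [tData2] ([Key], Info1, Info2)
    OUTPUT inserted.[Key] INTO @Copied
    SELECT TOP (@BatchSize) [Key], Info1, Info2
    FROM [tData]
    WHERE @LastKey IS NULL OR [Key] > @LastKey
    ORDER BY [Key];

    SET @Rows = @@ROWCOUNT;

    IF @Rows > 0
    BEGIN
        SELECT @LastKey = MAX([Key]) FROM @Copied;
        DELETE FROM @Copied;

        SET @ElapsedMs = DATEDIFF(MILLISECOND, @Start, SYSDATETIME());
        -- RAISERROR ... WITH NOWAIT flushes each message immediately;
        -- PRINT output is buffered until the whole batch finishes.
        RAISERROR('Last inserted key %d, batch took %d ms', 0, 1,
                  @LastKey, @ElapsedMs) WITH NOWAIT;
    END
END
```

Seeking past `@LastKey` keeps each iteration's `WHERE` clause cheap when `[Key]` is indexed, rather than re-scanning rows that were already copied.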

Lost User (#2)

Performance will suffer if you're going to print each item individually. Insert 50 (or so) records, see whether more than a second has passed, and write to the terminal that you've done the next batch.

Bastard Programmer from Hell :suss: if you can't read my code, try converting it here.
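
A compact sketch of that throttled reporting, under the same assumed schema as above; the 50-row batch size follows the suggestion, and the `NOT EXISTS` check (cheap only if `[Key]` is indexed) stands in for whatever "not yet copied" test fits the real data:

```sql
-- Sketch of time-throttled progress reporting: copy small batches, but only
-- print once a second or more has passed since the last message.
DECLARE @BatchSize INT = 50, @Total INT = 0, @Rows INT = 1;
DECLARE @LastPrint DATETIME2 = SYSDATETIME();

WHILE @Rows > 0
BEGIN
    INSERT INTO [tData2] ([Key], Info1, Info2)
    SELECT TOP (@BatchSize) s.[Key], s.Info1, s.Info2
    FROM [tData] AS s
    WHERE NOT EXISTS (SELECT 1 FROM [tData2] AS d WHERE d.[Key] = s.[Key]);

    SET @Rows = @@ROWCOUNT;
    SET @Total += @Rows;

    IF DATEDIFF(SECOND, @LastPrint, SYSDATETIME()) >= 1
    BEGIN
        RAISERROR('Copied %d rows so far', 0, 1, @Total) WITH NOWAIT;
        SET @LastPrint = SYSDATETIME();
    END
END
```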

Michael Potter (#3)

Limiting the loop will slow your performance. There really isn't a nice way of doing it without resorting to pre-import queries or using a cursor, and neither choice will result in faster performance. Given the query you listed, proper indexes on the condition columns will improve performance. If you are moving lots of data into an empty table and you can fully control its integrity, then remove all constraints (except identities) and indexes on the new table, load the table, and put them back on. This can greatly speed up a large insert. You can also take a look at BULK INSERT if your SQL Server version allows it.
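
As an illustration of the disable-then-rebuild advice, a sketch assuming the target has a nonclustered index with the hypothetical name `IX_tData2_Info1`; the identity/primary-key constraint stays in place, as suggested, and only nonclustered indexes should be disabled (disabling a clustered index makes the table inaccessible):

```sql
-- Sketch of the disable-indexes-then-reload advice. IX_tData2_Info1 is a
-- hypothetical nonclustered index name; PK/identity constraints are left
-- alone, per the advice above.
ALTER INDEX IX_tData2_Info1 ON [tData2] DISABLE;

INSERT INTO [tData2] ([Key], Info1, Info2)
SELECT [Key], Info1, Info2
FROM [tData];

-- Rebuilding re-enables the index and recreates it in one sorted pass
ALTER INDEX IX_tData2_Info1 ON [tData2] REBUILD;
```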

SkyRunner (#4)

@Michael thanks for the info. I recently put together a solution for this using some of your earlier advice, and I'll take your latest advice into account as it's needed.
