Code Project > The Lounge
Data Imports and Exports - My Demon

Tags: career, database, testing, beta-testing, help
7 Posts, 4 Posters
Brady Kelly (#1)

I've spent a large part of my programming career doing these, and either I make them complicated (generic, with mapping files and a designer) or something else does. Today, after testing on two DBs at home and conquering some model validation using EF, I deployed to the remote test site, and lo, I get a duplicate key error on AccountNum for employee xxxx while importing from CSV. There is only one AccountNum like xxxx's in the entire CSV file, and no AccountNum like xxxx's already in the DB. About 100 other duplicate AccountNum issues are caught and reported by my code before going to the DB. One entry is inserted/updated per transaction, so where the frakking hell is this duplicate being detected? I quoted an hour for this job because I'm using the wonderful FileHelpers library, and everyone was happy, all was working; all I had to do was revert to CSV from fixed length. :mad: :mad: :mad:
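(For context, the revert from fixed-length to CSV is mostly an attribute swap in FileHelpers. A minimal sketch; the record type and field names here are hypothetical, not from the actual import:)

```csharp
using FileHelpers;

// Hypothetical CSV record type. With FileHelpers the format lives in
// attributes, so reverting from fixed-length to CSV means replacing
// [FixedLengthRecord] + [FieldFixedLength(n)] with the ones below.
[DelimitedRecord(",")]
public class EmployeeRecord
{
    public string AccountNum;
    public string Name;
}

// Reading the whole file into typed records:
//   var engine = new FileHelperEngine<EmployeeRecord>();
//   EmployeeRecord[] records = engine.ReadFile("employees.csv");
```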


Corporal Agarn (#2), replying to Brady Kelly

It is most likely truncating to one character, thus the matches :laugh: Importing into SQL 2008 is currently driving me deeper into insanity, as what worked in SQL 2000 does not work in 2008. So I feel your pain. Good luck!


Brady Kelly (#3), replying to Corporal Agarn

I've just taken a 2 km walk to buy supper and get some air, and within the first 500 m it struck me. I'm using EF, and if I add a new entity to an entity set, then detect that it has a duplicate field (or whatever) and don't remove it from the entity set, then the next call to SaveChanges includes two updates: not just the one the error is reported on, but also the previous one, which actually causes the error.
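(The fix, then, is to detach the failed entity before the next SaveChanges. A rough sketch of the pattern, assuming a DbContext-style model; the Employees set and helper names are made up for illustration:)

```csharp
foreach (var row in csvRows)
{
    var employee = MapToEmployee(row);        // hypothetical mapper
    context.Employees.Add(employee);

    if (HasDuplicateAccountNum(employee))     // pre-insert validation
    {
        ReportDuplicate(employee);
        // Without this line the invalid entity stays tracked, and the
        // NEXT SaveChanges tries to insert it again alongside the next
        // row, producing a duplicate-key error that appears to blame
        // the wrong record.
        context.Employees.Remove(employee);
        continue;
    }

    context.SaveChanges();  // one row per transaction, as in the import
}
```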


Mycroft Holmes (#4), replying to Brady Kelly

Ah, stop yer bitching. I have spent the last 3 weeks getting 400 GB of text files into a semi-reasonable data structure by trial and error. I have one file with 79 million rows that I have loaded twice :sigh: Then the transaction log blows out to 39 GB and crashes 12 hours into the processing. Aaahhh. It took 6 hours to apply an index, which reduced a process from 22 hours to 4. SQL Server is just not up to this volume of data; we keep blowing the drive or the log files.

          Never underestimate the power of human stupidity RAH
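(One common mitigation for that kind of log blowout, not necessarily available in Mycroft's setup, is to commit in batches instead of one giant transaction; with ADO.NET's SqlBulkCopy that is the BatchSize property. A hedged sketch, with the connection string and table name as placeholders:)

```csharp
using System.Data.SqlClient;

// Bulk-load a large IDataReader source in separately committed batches,
// so the transaction log can truncate between batches (under the
// simple recovery model) instead of growing for the whole load.
using (var bulk = new SqlBulkCopy("Server=...;Database=...;"))
{
    bulk.DestinationTableName = "dbo.BigTable";  // placeholder
    bulk.BatchSize = 100000;    // commit every 100k rows
    bulk.BulkCopyTimeout = 0;   // no timeout for long-running loads
    // bulk.WriteToServer(sourceReader);
}
```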


Harvey Saayman (#5), replying to Brady Kelly

            Hey dude, did you perhaps present a session at Devs4Devs a few weeks back?

            Harvey Saayman - South Africa Software Developer .Net, C#, SQL you.suck = (you.Occupation == jobTitles.Programmer && you.Passion != Programming) 1000100 1101111 1100101 1110011 100000 1110100 1101000 1101001 1110011 100000 1101101 1100101 1100001 1101110 100000 1101001 1101101 100000 1100001 100000 1100111 1100101 1100101 1101011 111111


Brady Kelly (#6), replying to Harvey Saayman

              I was supposed to but pranged my scooter that morning and didn't make it.


Harvey Saayman (#7), replying to Brady Kelly

                Ah that sux! I was looking out for you to say hi. I was there :)

