Bug in SQLBulkCopy? Truncates decimal digits after 26 records

Allan Watkins
#1

I believe I've found a very unexpected bug in SqlBulkCopy that took me a while to track down. My test program reads records from a CSV file with exactly one column, which is defined as "numeric(19,5)" for statistical uploads.

Condition 1: If the input file has 25 rows containing only the digit zero, followed by a number with a decimal (like '1.23456'), the data is imported perfectly.

Condition 2: If the input file has 26 rows containing only the digit zero (or any non-decimal number), followed by a number with a decimal (like '1.23456'), the data is imported but the last (and any subsequent) rows have the field's decimal digits truncated (like '1.00000').

Condition 3: If the input file has 26 rows and the first row is '0.0', all of the rows are imported perfectly.

I've checked the table/field definition for Condition 2 and it's the same as for Conditions 1 and 3, so there's no manipulation of the field type (as far as I can tell). Is this a bug, or am I missing something?


jschell
#2

You say the test file has "1 column". Then you say that each row has "the digit zero, followed by a number with a decimal (like '1.23456')". Does that mean it has two columns, or that the single number is '01.23456'? Also, you should post the following:
1. The database version you are using.
2. Example data.
3. The relevant code.
4. The database schema you are using.


Allan Watkins
#3

Re: the numbers. The test only had a single column, although I first noticed the behavior with 100 columns. Each row has only one value (zero) until the 27th row, which is the decimal number, i.e. 0, 0, 0, ... 0, 1.23456.

Re: the other questions.
1. The destination was SQL Server 2008 and the SqlBulkCopy came from the .NET 4.5 library. This raises the question: is SqlBulkCopy the culprit, or the destination SQL Server instance?
2. Example data: see above.
3. Relevant code:

using (SqlConnection connection = new SqlConnection(strConn))
{
    connection.Open();

    using (SqlBulkCopy bulkCopy = new SqlBulkCopy(connection))
    {
        bulkCopy.DestinationTableName = strSQL_table_name;
        bulkCopy.BatchSize = GetBulkCopySize();
        bulkCopy.BulkCopyTimeout = 2000;
        bulkCopy.NotifyAfter = GetNotifyAfter();
        bulkCopy.SqlRowsCopied += new SqlRowsCopiedEventHandler(OnSqlRowsCopied);

        try
        {
            bulkCopy.WriteToServer(results);
        }
        catch (Exception ex)
        {
            MessageBox.Show("Error(CSV_To_SQL-a): " + ex.Message);
            return;
        }
    }
}

4. The schema: a single column in a table, name="BigNumeric", data type="numeric(18,5)".
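If `results` is a DataTable whose column type was inferred from the data, the truncation would happen before SqlBulkCopy ever saw the values. A minimal standalone sketch (hypothetical names, no database needed) of declaring the column type explicitly instead of letting it be inferred:

```csharp
using System;
using System.Data;

class TypedTableDemo
{
    static void Main()
    {
        // Declare the column type up front instead of letting
        // adapter.Fill() infer it from the first rows of data.
        var table = new DataTable();
        table.Columns.Add("BigNumeric", typeof(decimal));

        // 26 integer-looking rows followed by a decimal value --
        // with an explicit decimal column, nothing is truncated.
        for (int i = 0; i < 26; i++)
            table.Rows.Add(0m);
        table.Rows.Add(1.23456m);

        Console.WriteLine(table.Rows[26]["BigNumeric"]); // prints 1.23456
    }
}
```

With the column typed as decimal from the start, the 27th value keeps all of its digits regardless of what the first 26 rows contained.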
jschell
#4

Where is the reader ('results') closed?

Mycroft Holmes
#5

I presume results is a DataTable; have you inspected the contents of the table before the bulk copy?

            Never underestimate the power of human stupidity RAH


Allan Watkins
#6

No, but that was the answer. I was doing an adapter.Fill(datatable) and didn't realize how poorly the DataType was being established. I :^) assumed :^) that Fill() was doing more work than it was. I used the .Clone() method (found at http://stackoverflow.com/questions/9028029/how-to-change-datatype-of-a-datacolumn-in-a-datatable), which explained how to overcome poorly initialized DataTypes. I've certainly learned a good lesson, though. Thanks.
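The Clone() trick from that Stack Overflow answer can be sketched roughly like this (hypothetical names; `filledTable` stands in for the table produced by adapter.Fill()). The linked answer uses ImportRow; here each row's values are re-added via ItemArray so the conversion to decimal is explicit:

```csharp
using System;
using System.Data;

class CloneFixDemo
{
    static void Main()
    {
        // Simulate what adapter.Fill() produced: the column type was
        // inferred as an integer type from the leading rows of zeros.
        var filledTable = new DataTable();
        filledTable.Columns.Add("BigNumeric", typeof(long));
        for (int i = 0; i < 26; i++)
            filledTable.Rows.Add(0L);

        // Clone() copies the structure but no rows, so the column
        // type can still be changed on the empty copy.
        DataTable fixedTable = filledTable.Clone();
        fixedTable.Columns["BigNumeric"].DataType = typeof(decimal);

        // Re-add each row; values are coerced to the new column type.
        foreach (DataRow row in filledTable.Rows)
            fixedTable.Rows.Add(row.ItemArray);

        // New decimal values added to the fixed table keep their digits.
        fixedTable.Rows.Add(1.23456m);
        Console.WriteLine(fixedTable.Rows[26]["BigNumeric"]); // prints 1.23456
    }
}
```

DataColumn.DataType can only be changed while the table holds no rows, which is why the empty clone is needed at all.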


Allan Watkins
#7

As it turns out, the process still isn't working. The .Clone() approach didn't work, so I tried the .FillSchema() method, and it's failing too.

adapter.FillSchema(table, SchemaType.Source);
table.Columns[2].DataType = typeof(decimal);
adapter.Fill(table);

The column still gets cast as an Int32 by FillSchema(); I then try to change it to typeof(decimal), followed by Fill(), and the decimal digits are still truncated in the resulting DataTable. I even tried to set the DataType to String, but that didn't work either. Next, I discovered another practically hidden Microsoft "feature" called TypeGuessRows (and IMEX=1), which supposedly can be modified in the registry or in the OleDbConnection string. I haven't gotten it to work yet, but at least I know that others have seen this behavior too.
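For the Jet/ACE text driver specifically, the type-guessing scan is reportedly limited to the first 25 rows by default (the MaxScanRows setting, which would match the 26-row threshold in this thread), and a schema.ini file placed next to the CSV can declare the column type outright, sidestepping the guess entirely. A sketch, assuming the input file is named test.csv:

```ini
[test.csv]
ColNameHeader=True
Format=CSVDelimited
Col1=BigNumeric Double
```

With the column declared as Double (or another documented schema.ini type), the driver no longer infers an integer type from the leading rows of zeros.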

Aadhar Joshi
#8

Keep your column in a numeric format, or convert it to numeric manually, so it doesn't corrupt the data.
