Bug in SQLBulkCopy? Truncates decimal digits after 26 records
-
I believe I've found a very unexpected bug in SqlBulkCopy that took me a while to track down. My test program reads records from a CSV file with exactly one column, which is defined as numeric(19,5) for statistical uploads.

Condition 1: If the input file has 25 rows containing only the digit zero, followed by a number with a decimal (like '1.23456'), the data is imported perfectly.

Condition 2: If the input file has 26 rows containing only the digit zero (or any non-decimal number), followed by a number with a decimal (like '1.23456'), the data is imported but that last row (and any subsequent rows) have the field imported with the decimal digits truncated (like '1.00000').

Condition 3: If the input file has 26 rows and the first row is '0.0', all of the rows are imported perfectly.

I've checked the table/field definition for Condition 2 and it's the same as in 1 and 3, so there's no manipulation of the field type (as far as I can tell). Is this a bug or am I missing something?
-
You say the test file has "1 column", then you say that each row has "the digit zero, followed by a number with a decimal (like '1.23456')". Does that mean the file has two columns, or that the single number is '01.23456'? You should also post the following:
1. The database version you are using.
2. Example data.
3. The relevant code.
4. The database schema you are using.
-
Re: the numbers. The test only had a single column, although I first noticed the behavior with 100 columns. Each row has only one value (zero) until the 27th row, which is the decimal number, i.e. 0, 0, 0, ... 0, 1.23456.

Re: the other questions:
1) The destination was SQL Server 2008 and the SqlBulkCopy came from the .NET 4.5 library. This raises the question: is SqlBulkCopy the culprit, or the destination SQL Server instance?
2) Example data: see above.
3) Relevant code:
using (SqlConnection connection = new SqlConnection(strConn))
{
    connection.Open();
    using (SqlBulkCopy bulkCopy = new SqlBulkCopy(connection))
    {
        bulkCopy.DestinationTableName = strSQL_table_name;
        bulkCopy.BatchSize = GetBulkCopySize();
        bulkCopy.BulkCopyTimeout = 2000;
        bulkCopy.NotifyAfter = GetNotifyAfter();
        bulkCopy.SqlRowsCopied += new SqlRowsCopiedEventHandler(OnSqlRowsCopied);
        try
        {
            bulkCopy.WriteToServer(results);
        }
        catch (Exception ex)
        {
            MessageBox.Show("Error(CSV_To_SQL-a): " + ex.Message);
            return;
        }
    }
}
4) The database schema: a single column in a table, name="BigNumeric", data type="numeric(18,5)".
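Since the destination column is already numeric, the type of the column feeding WriteToServer is worth ruling out. A minimal sketch (the method name and CSV path are placeholders, and it assumes results is a DataTable with the single "BigNumeric" column from this thread) that pins the column to decimal up front instead of relying on inference:

```csharp
using System;
using System.Data;
using System.Globalization;
using System.IO;

class CsvToTable
{
    // Hypothetical helper: builds the DataTable with an explicit decimal
    // column so SqlBulkCopy sends real decimals, not inferred integers.
    static DataTable LoadCsv(string csvPath)
    {
        var results = new DataTable();
        results.Columns.Add("BigNumeric", typeof(decimal)); // matches numeric(18,5)

        foreach (string line in File.ReadLines(csvPath))
        {
            // Parse with the invariant culture so '1.23456' keeps its fraction.
            results.Rows.Add(decimal.Parse(line, CultureInfo.InvariantCulture));
        }
        return results;
    }
}
```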
-
I presume results is a DataTable; have you inspected the contents of the table before the bulk copy?
Never underestimate the power of human stupidity RAH
-
No, but that was the answer. I was doing an adapter.Fill(datatable) and didn't realize how poorly the DataType was being inferred. I :^) assumed :^) that Fill() was doing more work than it was. I used the .Clone() method (found at http://stackoverflow.com/questions/9028029/how-to-change-datatype-of-a-datacolumn-in-a-datatable), which explains how to overcome poorly initialized DataTypes. I've certainly learned a good lesson, though - thanks.
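For anyone hitting the same thing, the Clone() technique from that linked answer looks roughly like this (a sketch; results and the "BigNumeric" column name are taken from this thread):

```csharp
using System.Data;

class CloneFix
{
    // Sketch of the linked Stack Overflow approach: a DataColumn's DataType
    // can't be changed once the table contains rows, so clone the empty
    // schema, override the column type, and re-import the rows.
    static DataTable WithDecimalColumn(DataTable results)
    {
        DataTable fixedTable = results.Clone();   // copies schema only, no rows
        fixedTable.Columns["BigNumeric"].DataType = typeof(decimal);

        foreach (DataRow row in results.Rows)
        {
            fixedTable.ImportRow(row);            // values are converted on import
        }
        return fixedTable;
    }
}
```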
-
As it turns out, the process still isn't working. The .Clone() approach didn't work, so I tried the .FillSchema() method and it's failing too.
adapter.FillSchema(table, SchemaType.Source);
table.Columns[2].DataType = typeof(Decimal);
adapter.Fill(table);
The column still gets cast as an Int32 by FillSchema(); then I try to change it to typeof(Decimal) followed by Fill(), and the decimal digits are still truncated in the resulting DataTable. I even tried to set the DataType to String, but that didn't work either. Next I discovered another practically hidden Microsoft "feature" called TypeGuessRows (and IMEX=1), which supposedly can be modified in the registry or the OleDbConnection string. I haven't gotten it to work yet, but at least I know that others have seen this behavior too.
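The 25/26-row boundary described earlier is consistent with the Jet/ACE text driver scanning only the first rows of the file to guess each column's type. For CSVs read through OleDb, the usual way to pin the type is a schema.ini file in the same folder as the CSV. A sketch, assuming the file is named data.csv and contains one headerless column (note the text driver uses its own type names - there is no numeric(18,5), so Double or Currency is the nearest fit):

```ini
[data.csv]
ColNameHeader=False
Format=CSVDelimited
MaxScanRows=0
Col1=BigNumeric Double
```

With Col1 declared explicitly, the type is no longer guessed from the data at all.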
-
Keep your column in a numeric format, or convert it manually to a numeric format, so that it doesn't cause data corruption.