Code Project
Allan Watkins (@Allan Watkins)

Posts: 5 · Topics: 2 · Shares: 0 · Groups: 0 · Followers: 0 · Following: 0

Posts

  • SQL BulkCopy using IDataReader - how can you cast the fields?
    Allan Watkins

    I'm looking for a method/example/pseudo-code for importing a CSV file into a SQL Server table, with several requirements:

    1. I don't know the number of columns, the column headers, or the data type of each column before the program runs. Alongside the header fields, I have another file which describes the type and size of each column (integer, decimal, string, etc.), and I can already create the destination table based on this input.
    2. Because I don't know the format of the input table before running the program (i.e. the input table changes every time), I can't create a class specifically designed to describe the contents of each record from the CSV file.
    3. I already have a working method where a DataTable gets fully populated and SqlBulkCopy's .WriteToServer() method inserts all of the records. HOWEVER, there's a problem when the source file gets too large (over 500 MB or so), and the result is an OutOfMemoryException.
    4. The AdHocGeek has a partial solution which works with strings only but doesn't address integers, decimals, dates, et al.

    The problem is that I can't see a way to use an IDataReader against a complex (i.e. many field types) table on the fly. I've added the .ColumnMappings collection to map the input columns onto the destination SQL Server table by name. When the application is run, however, it fails with the error "The given value of type String from the data source cannot be converted to type decimal of the specified target column". I can't determine why the DataTable works while the DataReader doesn't, and I can't find any examples to prove otherwise. Any ideas or examples would be appreciated.

    Database help database sql-server com sysadmin
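That error usually means the reader is handing SqlBulkCopy raw strings. One way around it is an IDataReader whose GetValue() converts each field to the destination type, using the type list from the companion format file. This is only a sketch under that assumption — the class name, field names, and naive comma splitting are all illustrative, not the poster's code:

```csharp
using System;
using System.Data;
using System.Globalization;
using System.IO;

// Sketch: an IDataReader over a CSV file whose GetValue() hands SqlBulkCopy
// already-typed values, so it never has to coerce String into decimal itself.
public class TypedCsvReader : IDataReader
{
    private readonly StreamReader _file;
    private readonly string[] _names;   // from the CSV header row
    private readonly Type[] _types;     // from the companion format file
    private string[] _current;
    private bool _closed;

    public TypedCsvReader(string path, Type[] columnTypes)
    {
        _file = new StreamReader(path);
        _names = _file.ReadLine().Split(',');
        _types = columnTypes;
    }

    public bool Read()
    {
        string line = _file.ReadLine();
        if (line == null) return false;
        _current = line.Split(',');     // no quoted-field handling in this sketch
        return true;
    }

    // The crucial member: convert the raw string to the column's declared type.
    public object GetValue(int i) =>
        string.IsNullOrEmpty(_current[i])
            ? (object)DBNull.Value
            : Convert.ChangeType(_current[i], _types[i], CultureInfo.InvariantCulture);

    public int FieldCount => _types.Length;
    public string GetName(int i) => _names[i];
    public int GetOrdinal(string name) => Array.IndexOf(_names, name);
    public Type GetFieldType(int i) => _types[i];
    public bool IsDBNull(int i) => string.IsNullOrEmpty(_current[i]);
    public int GetValues(object[] v) { for (int i = 0; i < FieldCount; i++) v[i] = GetValue(i); return FieldCount; }
    public object this[int i] => GetValue(i);
    public object this[string name] => GetValue(GetOrdinal(name));

    public int Depth => 0;
    public bool IsClosed => _closed;
    public int RecordsAffected => -1;
    public bool NextResult() => false;
    public DataTable GetSchemaTable() => null;
    public void Close() { _file.Dispose(); _closed = true; }
    public void Dispose() => Close();

    // Typed getters simply delegate to GetValue(); SqlBulkCopy does not need them.
    public bool GetBoolean(int i) => (bool)GetValue(i);
    public byte GetByte(int i) => (byte)GetValue(i);
    public long GetBytes(int i, long o, byte[] b, int bo, int len) => throw new NotSupportedException();
    public char GetChar(int i) => (char)GetValue(i);
    public long GetChars(int i, long o, char[] b, int bo, int len) => throw new NotSupportedException();
    public IDataReader GetData(int i) => throw new NotSupportedException();
    public string GetDataTypeName(int i) => _types[i].Name;
    public DateTime GetDateTime(int i) => (DateTime)GetValue(i);
    public decimal GetDecimal(int i) => (decimal)GetValue(i);
    public double GetDouble(int i) => (double)GetValue(i);
    public float GetFloat(int i) => (float)GetValue(i);
    public Guid GetGuid(int i) => (Guid)GetValue(i);
    public short GetInt16(int i) => (short)GetValue(i);
    public int GetInt32(int i) => (int)GetValue(i);
    public long GetInt64(int i) => (long)GetValue(i);
    public string GetString(int i) => (string)GetValue(i);

    static void Main()
    {
        string path = Path.GetTempFileName();
        File.WriteAllText(path, "BigNumeric\n0\n1.23456\n");
        using (var r = new TypedCsvReader(path, new[] { typeof(decimal) }))
            while (r.Read())
                Console.WriteLine(r.GetValue(0));
    }
}
```

Passing such a reader to bulkCopy.WriteToServer() streams rows one at a time, which also avoids buffering the whole file in a DataTable — the cause of the OutOfMemoryException in requirement 3.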

  • Bug in SQLBulkCopy? Truncates decimal digits after 26 records
    Allan Watkins

    As it turns out, the process still isn't working. The .Clone() didn't work so I tried the .FillSchema() method and it's failing too.

    adapter.FillSchema(table, SchemaType.Source);
    table.Columns[2].DataType = typeof(Decimal);
    adapter.Fill(table);

    The column still gets cast as an Int32 from FillSchema(), then I try to change it to typeof(Decimal) followed by Fill() and the decimal points are still truncated in the resulting DataTable. I even tried to set the DataType to String but that didn't work either. Next I've discovered another practically hidden Microsoft "feature" called TypeGuessRows (and IMEX=1) which supposedly can be modified in the registry or the OleDbConnection string. I haven't gotten it to work yet but at least I know that others have seen this behavior too.

    Database help question
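If the OleDb Text driver is doing the reading, a schema.ini file placed next to the CSV is the usual way to override the type guessing, rather than the registry: MaxScanRows controls how many rows are sampled (the Text driver's default scan is 25 rows, which lines up with the 26-row symptom), or the column type can simply be declared outright. The file and column names below are assumptions:

```ini
; schema.ini, in the same directory as the CSV file
[stats.csv]
ColNameHeader=True
Format=CSVDelimited
MaxScanRows=0            ; 0 = scan the whole file instead of the default 25 rows
Col1=BigNumeric Float    ; or declare the column type outright, no guessing at all
```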

  • Bug in SQLBulkCopy? Truncates decimal digits after 26 records
    Allan Watkins

    No, but that was the answer - I was doing an adapter.Fill(datatable) and didn't realize how naively the DataType was being established. I :^) assumed :^) that Fill() was doing more work than it was. I used the .Clone() method (found at http://stackoverflow.com/questions/9028029/how-to-change-datatype-of-a-datacolumn-in-a-datatable) which explains how to overcome poorly initialized DataTypes. I've certainly learned a good lesson though - thanks.

    Database help question
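For reference, the Clone() approach from that Stack Overflow answer works because Clone() copies the schema but no rows, so the column type can still be changed before rows are imported. A minimal sketch (the table and column names are assumptions matching this thread):

```csharp
using System;
using System.Data;

class CloneTypeDemo
{
    static void Main()
    {
        // A table whose column was guessed as Int32 (e.g. by adapter.Fill).
        var guessed = new DataTable("Stats");
        guessed.Columns.Add("BigNumeric", typeof(int));
        for (int i = 0; i < 26; i++) guessed.Rows.Add(0);

        // Clone() copies schema only, so the DataType can be changed while
        // the table is still empty; then the original rows are re-imported.
        DataTable fixedTable = guessed.Clone();
        fixedTable.Columns["BigNumeric"].DataType = typeof(decimal);
        foreach (DataRow row in guessed.Rows)
            fixedTable.ImportRow(row);

        // Decimal digits now survive instead of being truncated to Int32.
        fixedTable.Rows.Add(1.23456m);
        Console.WriteLine(fixedTable.Columns["BigNumeric"].DataType);
        Console.WriteLine(fixedTable.Rows[26]["BigNumeric"]);
    }
}
```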

  • Bug in SQLBulkCopy? Truncates decimal digits after 26 records
    Allan Watkins

    Re: the numbers - The test only had a single column, although I first noticed the behavior with 100 columns. Each row has only one value (zero) until the 27th row, which is the decimal number, i.e. 0 0 0 ... 0 1.23456

    Re: the other questions:

    1. The destination was SQL Server 2008 and the SqlBulkCopy came from the .NET 4.5 library. This raises the question: is SqlBulkCopy the culprit, or the destination SQL Server instance?
    2. Example data: see above.
    3. Relevant code:

    using (SqlConnection connection = new SqlConnection(strConn))
    {
        connection.Open();

        using (SqlBulkCopy bulkCopy = new SqlBulkCopy(connection))
        {
            bulkCopy.DestinationTableName = strSQL_table_name;
            bulkCopy.BatchSize = GetBulkCopySize();
            bulkCopy.BulkCopyTimeout = 2000;
            bulkCopy.NotifyAfter = GetNotifyAfter();
            bulkCopy.SqlRowsCopied += new SqlRowsCopiedEventHandler(OnSqlRowsCopied);

            try
            {
                bulkCopy.WriteToServer(results);
            }
            catch (Exception ex)
            {
                string error = ex.Message;
                MessageBox.Show("Error(CSV_To_SQL-a): " + ex.Message);
                return;
            }
        }
    }

    Table definition: a single column, name "BigNumeric", data type numeric(18,5).
    Database help question

  • Bug in SQLBulkCopy? Truncates decimal digits after 26 records
    Allan Watkins

    I believe I've found a very unexpected bug in SqlBulkCopy that took me a while to track down. My test program reads in records from a CSV file with exactly one column, which is defined as numeric(19,5) for statistical uploads.

    Condition 1: If the input file has 25 rows whose only value is the digit zero, followed by a number with a decimal (like '1.23456'), the data is imported perfectly.

    Condition 2: If the input file has 26 rows whose only value is the digit zero (or any non-decimal number), followed by a number with a decimal (like '1.23456'), the data is imported but the last row (and any subsequent rows) has the field imported with truncated decimal digits (like '1.00000').

    Condition 3: If the input file has 26 rows and the first row is '0.0', all of the rows are imported perfectly.

    I've checked the table/field definition for Condition 2 and it's the same as for 1 and 3, so there's no manipulation of the field type (as far as I can tell). Is this a bug, or am I missing something?

    Database help question
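As the later replies in this thread conclude, the truncation happens on the read side rather than inside SqlBulkCopy: the Jet/ACE Text driver samples the leading rows of the file to guess each column's type, so 26 leading integers make it pick an integer type before SqlBulkCopy ever runs. A sketch of the read path where this would happen (the connection string, provider, and file name are assumptions, and this requires the ACE OleDb provider to be installed):

```csharp
using System;
using System.Data;
using System.Data.OleDb;

class ReadSideDemo
{
    static void Main()
    {
        // Text-driver connection string (illustrative values).
        string strConn = @"Provider=Microsoft.ACE.OLEDB.12.0;Data Source=C:\data\;"
                       + "Extended Properties=\"text;HDR=Yes;FMT=Delimited\"";

        using (var conn = new OleDbConnection(strConn))
        using (var adapter = new OleDbDataAdapter("SELECT * FROM [stats.csv]", conn))
        {
            var table = new DataTable();
            adapter.Fill(table);

            // With 26 leading zeros in the file, the guessed type is an
            // integer type, not Decimal - the truncation has already
            // happened before WriteToServer() is called.
            Console.WriteLine(table.Columns["BigNumeric"].DataType);
        }
    }
}
```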