Opinions on loading large file into database?

Tags: csharp, database, sysadmin, announcement, dotnet
ThomasH1 wrote:

Hello everyone, I'd like your opinions! I'm rewriting a program I wrote back in 1996. It takes a large "external-system" text file and compares it against our database. If there's a difference between a file entry and the corresponding database entry, we take the entry from the file, put it into the DB, and note it on a report. We don't run this program often, and we always treat the file as the "correct" data.

I'd like to load the file into an ADO.NET DataSet, but I'm a little hesitant because of the average file size: the flat file is usually between 2 and 4 MB. Is that an acceptable size for a DataSet in memory? I'd load the entire file into a DataSet at once, set up a DataAdapter and a second DataSet for my database, and compare the two. Then I could just call DataAdapter.Update on the database's DataSet to make the changes. So not only would I have the 2-4 MB flat file in memory, I'd also potentially have the same amount (2-4 MB) from the database! The benefit is that I'd really be "using" ADO.NET: comparisons would be easier, and the update would be cleaner.

My alternative is to open the file and read it one record at a time. I'd do an ExecuteScalar to check our database, and an ExecuteNonQuery if I had to update it. This method seems memory-friendly, but database-intensive and network-intensive because of all the individual transactional updates; I'd have to update the database one record at a time.

Do any of you load 2-4 MB text files into a DataSet? If so, have you had any problems with the approach? I really want to use DataSets, but this program will run on workstations on an as-required basis, not on a server, so I don't want to write something that takes forever to run. The workstations are semi-powerful, but this might have to run on an XP Pro box with only 256 MB of memory. (Most of the XP Pro machines have 512 MB of RAM.)

Oh, and I'll be using C# and ADO.NET under .NET Framework 1.1. The file isn't XML, and I think it would be pointless to convert it to XML first and then process it. Correct me if I'm wrong, though!

Thanks for any ideas! -Thomas
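
To make the two options concrete, here are minimal sketches of what I have in mind. Both assume a hypothetical layout of one comma-separated key/value pair per line, a matching Records table, and a placeholder connection string; my real records are wider than this, but the shape is the same.

The DataSet approach: load the whole file into one DataTable, fill a second one from the database, overwrite any rows that differ, and push everything back in a single DataAdapter.Update call.

```csharp
using System;
using System.Data;
using System.Data.SqlClient;
using System.IO;

class DataSetSync
{
    static void Main()
    {
        // Build an in-memory table from the flat file.
        DataTable fileTable = new DataTable("FileData");
        fileTable.Columns.Add("RecordKey", typeof(string));
        fileTable.Columns.Add("RecordValue", typeof(string));

        using (StreamReader reader = new StreamReader("external.txt"))
        {
            string line;
            while ((line = reader.ReadLine()) != null)
            {
                string[] parts = line.Split(',');
                fileTable.Rows.Add(new object[] { parts[0], parts[1] });
            }
        }

        // Pull the current rows from the database into a second DataSet.
        // Fill and Update open and close the connection themselves.
        SqlConnection conn = new SqlConnection("...connection string...");
        SqlDataAdapter adapter = new SqlDataAdapter(
            "SELECT RecordKey, RecordValue FROM Records", conn);
        SqlCommandBuilder builder = new SqlCommandBuilder(adapter); // derives INSERT/UPDATE commands

        DataSet dbData = new DataSet();
        adapter.Fill(dbData, "Records");
        DataTable dbTable = dbData.Tables["Records"];
        dbTable.PrimaryKey = new DataColumn[] { dbTable.Columns["RecordKey"] };

        // The file is authoritative: overwrite anything that differs.
        foreach (DataRow fileRow in fileTable.Rows)
        {
            DataRow dbRow = dbTable.Rows.Find(fileRow["RecordKey"]);
            if (dbRow == null)
            {
                dbTable.Rows.Add(fileRow.ItemArray);           // record missing from DB
            }
            else if (!dbRow["RecordValue"].Equals(fileRow["RecordValue"]))
            {
                dbRow["RecordValue"] = fileRow["RecordValue"]; // record changed
                // ...note the difference on the report here...
            }
        }

        // One batched pass instead of per-record round trips.
        adapter.Update(dbData, "Records");
    }
}
```

And the record-at-a-time alternative, which keeps almost nothing in memory but pays two round trips for every record that differs:

```csharp
using System;
using System.Data;
using System.Data.SqlClient;
using System.IO;

class RecordAtATimeSync
{
    static void Main()
    {
        SqlConnection conn = new SqlConnection("...connection string...");
        conn.Open();

        SqlCommand check = new SqlCommand(
            "SELECT RecordValue FROM Records WHERE RecordKey = @key", conn);
        check.Parameters.Add("@key", SqlDbType.VarChar, 20);

        SqlCommand update = new SqlCommand(
            "UPDATE Records SET RecordValue = @value WHERE RecordKey = @key", conn);
        update.Parameters.Add("@key", SqlDbType.VarChar, 20);
        update.Parameters.Add("@value", SqlDbType.VarChar, 100);

        using (StreamReader reader = new StreamReader("external.txt"))
        {
            string line;
            while ((line = reader.ReadLine()) != null)
            {
                string[] parts = line.Split(',');
                check.Parameters["@key"].Value = parts[0];
                object current = check.ExecuteScalar(); // null when the key isn't there

                if (current == null || !current.Equals(parts[1]))
                {
                    // A key missing from the DB would need an INSERT instead;
                    // omitted to keep the sketch short.
                    update.Parameters["@key"].Value = parts[0];
                    update.Parameters["@value"].Value = parts[1];
                    update.ExecuteNonQuery();
                    // ...note the difference on the report here...
                }
            }
        }
        conn.Close();
    }
}
```

Even at 4 MB the DataSet version should fit alongside the database copy on a 256 MB box, I think, but that's exactly the assumption I'd like you to sanity-check.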
