SQL server/ ADO.NET performance

Tags: csharp, database, performance, sql-server, sysadmin
#1 Paul Horstink wrote:
    I have an application (C#/ADO.NET) that collects alarms from equipment at around 100 sites, through concurrent TCP/IP listeners. Per day I receive about 1 million messages (~10/sec). I used to insert them directly into the destination table, but that would lock up the table so that user access (read/reporting) was hardly possible. Now I insert the data into an intermediate table and upload it to the final table every 5 minutes with SQL Agent. First of all, this doesn't look like an elegant solution. Furthermore, performance is still quite bad. Even though I have just this one DB/table, SQL Server uses up all available memory (1.5 GB out of 2 GB) and all available CPU, 24x7; I'm just waiting for a melt-down... Anybody have any experience with this volume of transactions, and/or any suggestions? Thanks.
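    For illustration, one common way to take per-message INSERT overhead off the server in this kind of setup is to buffer incoming alarms in memory and flush them in one round trip with SqlBulkCopy. The sketch below is not the poster's code; the dbo.AlarmStaging table and its SiteId/ReceivedAt/Message columns are hypothetical stand-ins for the real schema:

    ```csharp
    using System;
    using System.Data;
    using System.Data.SqlClient;

    // Minimal sketch: buffer incoming alarms in a DataTable and flush
    // them in one bulk operation instead of one INSERT per message.
    class AlarmBuffer
    {
        private readonly DataTable _buffer = new DataTable("AlarmStaging");
        private readonly string _connectionString;

        public AlarmBuffer(string connectionString)
        {
            _connectionString = connectionString;
            _buffer.Columns.Add("SiteId", typeof(int));
            _buffer.Columns.Add("ReceivedAt", typeof(DateTime));
            _buffer.Columns.Add("Message", typeof(string));
        }

        // Called by the TCP/IP listeners as messages arrive.
        public void Add(int siteId, DateTime receivedAt, string message)
        {
            lock (_buffer)
            {
                _buffer.Rows.Add(siteId, receivedAt, message);
            }
        }

        // Call periodically (e.g. every few seconds) from a timer.
        public void Flush()
        {
            DataTable batch;
            lock (_buffer)
            {
                if (_buffer.Rows.Count == 0) return;
                batch = _buffer.Copy();
                _buffer.Clear();
            }
            using (var connection = new SqlConnection(_connectionString))
            using (var bulk = new SqlBulkCopy(connection))
            {
                connection.Open();
                bulk.DestinationTableName = "dbo.AlarmStaging";
                bulk.WriteToServer(batch);  // one round trip for the whole batch
            }
        }
    }
    ```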

#2 Steven Campbell wrote:

      CPU should not be high on the SQL box for plain data inserts. Check that your inserts are as efficient as possible:

      • avoid INSERT ... WHERE ... statements
      • avoid a clustered index (good for selecting, often bad for inserting)
      • minimize the number of indexes on the tables you insert into

      Regarding memory usage, SQL Server tends to do that (grab as much memory as it can); it does not necessarily mean anything. However, it is worthwhile checking that you have appropriate indexes so that user reporting and other reads are optimized.
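      To illustrate the "efficient inserts" point, here is a minimal sketch of one way to keep per-row cost down when individual inserts are unavoidable: a single prepared, parameterized command reused for every row, with the batch wrapped in one transaction so the log is flushed once. The Alarm type and the dbo.AlarmStaging table/columns are hypothetical, matching the earlier sketch:

      ```csharp
      using System;
      using System.Collections.Generic;
      using System.Data;
      using System.Data.SqlClient;

      // Hypothetical message type.
      class Alarm
      {
          public int SiteId;
          public DateTime ReceivedAt;
          public string Message;
      }

      static class AlarmWriter
      {
          // The INSERT plan is compiled once (Prepare) and reused for
          // every row, instead of sending a freshly built SQL string
          // per message -- this keeps CPU on the SQL box low.
          public static void InsertBatch(string connectionString,
                                         IEnumerable<Alarm> alarms)
          {
              using (var connection = new SqlConnection(connectionString))
              using (var command = new SqlCommand(
                  "INSERT INTO dbo.AlarmStaging (SiteId, ReceivedAt, Message) " +
                  "VALUES (@SiteId, @ReceivedAt, @Message)", connection))
              {
                  command.Parameters.Add("@SiteId", SqlDbType.Int);
                  command.Parameters.Add("@ReceivedAt", SqlDbType.DateTime);
                  command.Parameters.Add("@Message", SqlDbType.NVarChar, 4000);

                  connection.Open();
                  using (var transaction = connection.BeginTransaction())
                  {
                      command.Transaction = transaction;
                      command.Prepare();

                      foreach (var alarm in alarms)
                      {
                          command.Parameters["@SiteId"].Value = alarm.SiteId;
                          command.Parameters["@ReceivedAt"].Value = alarm.ReceivedAt;
                          command.Parameters["@Message"].Value = alarm.Message;
                          command.ExecuteNonQuery();
                      }
                      transaction.Commit();  // one log flush for the whole batch
                  }
              }
          }
      }
      ```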


