SQL Server / ADO.NET performance

Tags: csharp, database, performance, sql-server, sysadmin
#1 Paul Horstink wrote:

I have an application (C#/ADO.NET) that collects alarms from equipment at around 100 sites, through concurrent TCP/IP listeners. I receive about 1 million messages per day (~10/sec). I used to insert them directly into the destination table, but that locked up the table so badly that user access (reads/reporting) was hardly possible. Now I insert the data into an intermediate table and upload it to the final table every 5 minutes with SQL Agent. First of all, this doesn't look like an elegant solution. Furthermore, performance is still quite bad: even though I have just this one database/table, SQL Server uses up all available memory (1.5 GB out of 2 GB) and all available CPU, 24x7. I'm just waiting for a meltdown... Does anybody have experience with this volume of transactions, and/or any suggestions? Thanks.
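For reference, a minimal sketch of doing the batching in the application itself instead of via a staging table and SQL Agent job: the TCP/IP listeners enqueue alarms in memory, and a timer flushes them to the destination table in a single SqlBulkCopy round trip every few seconds. The dbo.Alarms table name, its column layout, and the 5-second interval are illustrative assumptions, not details from the post.

    using System;
    using System.Collections.Concurrent;
    using System.Data;
    using System.Data.SqlClient;
    using System.Threading;

    // Listeners call Add() instead of issuing one INSERT per message.
    class AlarmBuffer
    {
        private readonly ConcurrentQueue<(DateTime Received, int SiteId, string Text)> _queue
            = new ConcurrentQueue<(DateTime, int, string)>();
        private readonly object _flushLock = new object();
        private readonly string _connectionString;
        private readonly Timer _flushTimer;

        public AlarmBuffer(string connectionString)
        {
            _connectionString = connectionString;
            // Flush every 5 seconds; at ~10 msg/sec that is ~50 rows per batch.
            _flushTimer = new Timer(_ => Flush(), null,
                TimeSpan.FromSeconds(5), TimeSpan.FromSeconds(5));
        }

        public void Add(DateTime received, int siteId, string text)
            => _queue.Enqueue((received, siteId, text));

        private void Flush()
        {
            lock (_flushLock) // prevent overlapping timer callbacks
            {
                // Drain the queue into a DataTable whose columns match
                // the assumed dbo.Alarms layout.
                var table = new DataTable();
                table.Columns.Add("ReceivedUtc", typeof(DateTime));
                table.Columns.Add("SiteId", typeof(int));
                table.Columns.Add("Message", typeof(string));

                while (_queue.TryDequeue(out var alarm))
                    table.Rows.Add(alarm.Received, alarm.SiteId, alarm.Text);

                if (table.Rows.Count == 0) return;

                using (var connection = new SqlConnection(_connectionString))
                {
                    connection.Open();
                    // One bulk insert per batch holds locks far more briefly
                    // than a million row-by-row INSERTs per day.
                    using (var bulk = new SqlBulkCopy(connection))
                    {
                        bulk.DestinationTableName = "dbo.Alarms";
                        bulk.WriteToServer(table);
                    }
                }
            }
        }
    }

Batches this small keep reads and reports responsive because each flush is a short, append-only transaction rather than a continuous stream of single-row locks.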

#2 David Salter wrote, in reply to Paul Horstink:

Do you have the right clustered index on your table? It should be a clustered index on an ever-increasing key (e.g. an IDENTITY column or the arrival timestamp), so that new rows are appended at the end of the index rather than causing page splits and reordering in the middle.
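A hedged sketch of that advice, executed through ADO.NET against the same assumed dbo.Alarms table as above; the index name and key columns are guesses, chosen so the key only ever increases:

    using System.Data.SqlClient;

    class IndexSetup
    {
        // Clusters the table on an ever-increasing key, so each new alarm
        // row lands on the last page of the index instead of splitting
        // pages in the middle.
        public static void EnsureClusteredIndex(string connectionString)
        {
            const string ddl = @"
                CREATE CLUSTERED INDEX IX_Alarms_ReceivedUtc
                ON dbo.Alarms (ReceivedUtc, SiteId);";

            using (var connection = new SqlConnection(connectionString))
            using (var command = new SqlCommand(ddl, connection))
            {
                connection.Open();
                command.ExecuteNonQuery();
            }
        }
    }

Keying on the arrival timestamp also means reports that scan older date ranges rarely touch the pages the inserts are writing to.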
