Code Project

Maintaining DB

Tags: sharepoint, database, question
Hum Dum
#1

I have a folder where files arrive every two hours. So I created a process that reads the files from that folder every two hours and moves them to a separate folder. Reading the files inserts approximately 11k-12k rows into a table, and this has to run every two hours, every day. What measures should I take to keep queries from slowing down as days, months and years pass? One idea I have is to run an SP at a 10-15 day interval which copies this table's data into another table and removes the rows. Your ideas?
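
A minimal sketch of that archive-and-purge idea, assuming a hypothetical live table dbo.FileRows with a LoadDate column and an archive table dbo.FileRows_Archive with the same columns but no IDENTITY property; none of these names come from the thread:

    CREATE PROCEDURE dbo.ArchiveOldFileRows
        @DaysToKeep int = 10
    AS
    BEGIN
        SET NOCOUNT ON;
        DECLARE @Cutoff datetime = DATEADD(DAY, -@DaysToKeep, GETDATE());

        BEGIN TRANSACTION;

        -- copy the old rows into the archive table
        INSERT INTO dbo.FileRows_Archive
        SELECT * FROM dbo.FileRows
        WHERE LoadDate < @Cutoff;

        -- then remove them from the live table
        DELETE FROM dbo.FileRows
        WHERE LoadDate < @Cutoff;

        COMMIT TRANSACTION;
    END

Scheduled from SQL Server Agent every 10-15 days, something like this keeps only recent rows in the live table; doing the DELETE in smaller batches would reduce locking if the purge ever grows large.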

Wendelius
#2

If I understood correctly, you're concerned about performance when the amount of data grows. Based on the rough information you provided, a few opinions:

- a correct indexing strategy must be implemented (depending on the needs)
- I wouldn't use separate tables, since it would affect the program logic
- think carefully about the table structure and optimize it for typical use cases
- consider using partitioning
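
To make the table-structure point concrete, here is a rough sketch of one possible layout; all names and columns are assumptions for illustration, not something stated in the thread (the primary key is left nonclustered so the clustered index stays available for a date-based key, which comes up later in the thread):

    CREATE TABLE dbo.FileRows
    (
        FileRowId  bigint IDENTITY(1,1) NOT NULL,
        LoadDate   datetime       NOT NULL,   -- when the import ran
        SourceFile nvarchar(260)  NOT NULL,   -- which file the row came from
        Payload    nvarchar(4000) NULL,       -- the imported line itself
        CONSTRAINT PK_FileRows PRIMARY KEY NONCLUSTERED (FileRowId)
    );

    -- an index to support queries that filter on the load date
    CREATE INDEX IX_FileRows_LoadDate ON dbo.FileRows (LoadDate);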

The need to optimize rises from a bad design. My articles[^]

Hum Dum
#3

You got it right. I'm thinking about moving data to a separate table because the query that has to run on the table only needs information for the previous day, the current day, and the following day. For example, if I run my queries today, i.e. 09 Mar 11 17:04, the query will look for data from 8, 9 and 10 March. That's why I'm thinking about moving the data to a separate table after 10 days.
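
A sketch of that three-day window query, reusing the hypothetical dbo.FileRows / LoadDate names from above:

    -- rows for yesterday, today and tomorrow, relative to the current date
    DECLARE @Today date = CAST(GETDATE() AS date);

    SELECT FileRowId, LoadDate, SourceFile, Payload
    FROM dbo.FileRows
    WHERE LoadDate >= DATEADD(DAY, -1, @Today)   -- start of yesterday
      AND LoadDate <  DATEADD(DAY,  2, @Today);  -- start of the day after tomorrow

With an index that leads on LoadDate, a query like this only has to touch the rows for those three days, no matter how large the table grows.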

Mika Wendelius wrote:

- a correct indexing strategy must be implemented (depending on the needs)
- I wouldn't use separate tables, since it would affect the program logic
- think carefully about the table structure and optimize it for typical use cases

If I apply all of these, will it still work after 5 years (assuming 12000 * 12 * 365 * 5 rows)?

Mika Wendelius wrote:

consider using partitioning

I don't have any clue about this. It would be helpful if you could point me to an article (targeted at someone with less than a DBA's experience).

Wendelius
#4

Hum Dum wrote:

I'm thinking about moving data to a separate table because the query that has to run on the table only needs information for the previous day, the current day, and the following day.

This makes things even easier and goes back to good indexing. So if the new rows are always added at the end of your timeline, you could consider using a clustered index. This would make queries that select the near past very efficient; see: http://msdn.microsoft.com/en-us/library/ms190639.aspx[^]
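
One way to read that suggestion, continuing with the hypothetical dbo.FileRows sketch from above: put the clustered index on the load date, with the identity column appended to keep the key unique, so rows arriving in time order are added at the end and a recent-days range scan reads only a few contiguous pages.

    -- assumes the table has no clustered index yet (the earlier sketch made the
    -- primary key nonclustered for exactly this reason)
    CREATE UNIQUE CLUSTERED INDEX CX_FileRows_LoadDate
        ON dbo.FileRows (LoadDate, FileRowId);

    -- the separate nonclustered index on LoadDate is now redundant
    DROP INDEX IX_FileRows_LoadDate ON dbo.FileRows;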

Hum Dum wrote:

If I apply all of these, will it still work after 5 years (assuming 12000 * 12 * 365 * 5 rows)?

That's roughly 260 million rows (12000 * 12 * 365 * 5 = 262,800,000). As long as the structure and indexing are well done you should be safe, of course depending on the requirements :) Basically, the earlier rows are more or less dead mass if you don't query them, as long as you have a good access path to the few rows you need.

Hum Dum wrote:

I don't have any clue about this. It would be helpful if you could point me to an article (targeted at someone with less than a DBA's experience).

Here's one starting point: http://msdn.microsoft.com/en-us/library/ms178148.aspx[^]. Hopefully these help, mika
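
For a rough idea of what that looks like, a minimal partitioning sketch for the hypothetical table; the monthly boundary dates and the single filegroup are placeholders, not recommendations:

    -- monthly partition boundaries (placeholder dates)
    CREATE PARTITION FUNCTION pfFileRowsByMonth (datetime)
        AS RANGE RIGHT FOR VALUES ('2011-04-01', '2011-05-01', '2011-06-01');

    -- map every partition to PRIMARY just to keep the sketch simple
    CREATE PARTITION SCHEME psFileRowsByMonth
        AS PARTITION pfFileRowsByMonth ALL TO ([PRIMARY]);

    -- a partitioned version of the table, split on the load date
    CREATE TABLE dbo.FileRows_Partitioned
    (
        FileRowId  bigint IDENTITY(1,1) NOT NULL,
        LoadDate   datetime       NOT NULL,
        SourceFile nvarchar(260)  NOT NULL,
        Payload    nvarchar(4000) NULL
    ) ON psFileRowsByMonth (LoadDate);

Old months can then be switched out to a staging table and removed in one cheap metadata operation instead of row-by-row deletes.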

The need to optimize rises from a bad design. My articles[^]

PIEBALDconsult
#5

What I have done in the past is to periodically delete rows older than some threshold. If you like, you can use a trigger to also write them to some archive table.
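
A minimal sketch of that trigger idea, again using the hypothetical table names from earlier in the thread (the archive table is assumed to have the same columns but no IDENTITY property):

    CREATE TRIGGER trgFileRows_ArchiveDeleted
    ON dbo.FileRows
    AFTER DELETE
    AS
    BEGIN
        SET NOCOUNT ON;
        -- copy every row being removed into the archive table
        INSERT INTO dbo.FileRows_Archive (FileRowId, LoadDate, SourceFile, Payload)
        SELECT FileRowId, LoadDate, SourceFile, Payload
        FROM deleted;
    END

The periodic cleanup is then a single DELETE, and the archiving happens automatically:

    -- remove rows older than the retention threshold; the AFTER DELETE trigger
    -- copies them into dbo.FileRows_Archive as part of the same statement
    DELETE FROM dbo.FileRows
    WHERE LoadDate < DATEADD(DAY, -10, GETDATE());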
