Code Project
What's the best way to temporarily persist results of a long running SP?

Forum: Database
Tags: database, csharp, css, asp-net, sharepoint
24 Posts, 7 Posters
jschell wrote:

    Trekstuff wrote:

    and return a few million records, I need to display that data in an ASP.NET Grid

    Your requirements are badly flawed. If you have one million records and a user looks at each record for only 1 second it would take them 7 weeks (8 hour days at 5 days a week) to look at the list.

Trekstuff (#10):

I apologize, I must not have made my intentions clear. The user won't need to see all those few million records at once. He/she will need to make ad-hoc requests against this dataset: filter, sort, and search it on the fly, and expect results fast. Thanks for the math lesson though :)

Trekstuff wrote:

I have a T-SQL stored procedure that can run for a few minutes and return a few million records. I need to display that data in an ASP.NET grid (Infragistics WebDataGrid, to be precise). Obviously I don't want to return all the data at once and need to set up some kind of paging: every time the user selects another page, another portion of data is loaded from the DB. But I can't run the SP every time a new page is requested; it would take too much time. What would be the best way to persist data from the SP, so that when the user selects a new page, the new data portion is loaded by a simple SELECT ... WHERE from that temp data storage?

R Giskard Reventlov (#11):

This seems like a bad idea; no one is ever going to look at a million-plus records; at best they'll look at a small subset. At the very least get them to filter before going to the database, and only ever return as many records as they actually need to see, perhaps restricting to a manageable number.

      "If you think it's expensive to hire a professional to do the job, wait until you hire an amateur." Red Adair. nils illegitimus carborundum me, me, me

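The paging the OP asks about, serving one grid page at a time with a cheap SELECT against a temporarily persisted result set, can be sketched as follows. This is a minimal illustration using Python and sqlite3 standing in for SQL Server; the table and column names (`cached_results`, `id`, `name`) are invented for the example, not taken from the thread.

```python
import sqlite3

# Pretend the expensive proc ran once and its output was dumped into a cache
# table; pages are then served with cheap LIMIT/OFFSET queries against it.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE cached_results (id INTEGER PRIMARY KEY, name TEXT)")
conn.executemany(
    "INSERT INTO cached_results (id, name) VALUES (?, ?)",
    [(i, "row%d" % i) for i in range(1, 101)],
)

def fetch_page(conn, page, page_size=10):
    """Serve one grid page from the cache; the proc itself is never re-run."""
    cur = conn.execute(
        "SELECT id, name FROM cached_results ORDER BY id LIMIT ? OFFSET ?",
        (page_size, (page - 1) * page_size),
    )
    return cur.fetchall()

page2 = fetch_page(conn, 2)  # rows 11..20
```

On SQL Server the same idea would use `ORDER BY ... OFFSET/FETCH`; for deep pages, keyset paging (`WHERE id > last_seen_id`) scales better than OFFSET.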
R Giskard Reventlov wrote:

This seems like a bad idea; no one is ever going to look at a million-plus records; at best they'll look at a small subset. At the very least get them to filter before going to the database, and only ever return as many records as they actually need to see, perhaps restricting to a manageable number.

        "If you think it's expensive to hire a professional to do the job, wait until you hire an amateur." Red Adair. nils illegitimus carborundum me, me, me

Trekstuff (#12):

That's the point: I don't know in advance what filters are going to be used. I can only restrict the data somewhat by the initial call to the SP, but I cannot call the SP for every filter request; the call is too expensive. For example, the user needs to work with data for the past 3 months. That date filter I can pass to the SP, and the call can return several million records. Once the data is obtained, the user may want to get only rows whose "Name" field starts with "D", or whose SSN ends with 74, or group by LastName and get the count, etc. These requests will limit the data to a much smaller subset, but they are unpredictable and are made against the original large result set returned by the SP, not via the SP itself.

Trekstuff wrote:

That's the point: I don't know in advance what filters are going to be used. I can only restrict the data somewhat by the initial call to the SP, but I cannot call the SP for every filter request; the call is too expensive. For example, the user needs to work with data for the past 3 months. That date filter I can pass to the SP, and the call can return several million records. Once the data is obtained, the user may want to get only rows whose "Name" field starts with "D", or whose SSN ends with 74, or group by LastName and get the count, etc. These requests will limit the data to a much smaller subset, but they are unpredictable and are made against the original large result set returned by the SP, not via the SP itself.

R Giskard Reventlov (#13):

Not getting your reasoning here: if they first filter on the last 3 months, why can't they also filter on other criteria at the same time? If it doesn't give them what they want, they start over. Again, only retrieve the records they really need; it's more sensible to build a filter screen and get what you need than to retrieve far more than they'll ever look at.

          "If you think it's expensive to hire a professional to do the job, wait until you hire an amateur." Red Adair. nils illegitimus carborundum me, me, me

R Giskard Reventlov wrote:

Not getting your reasoning here: if they first filter on the last 3 months, why can't they also filter on other criteria at the same time? If it doesn't give them what they want, they start over. Again, only retrieve the records they really need; it's more sensible to build a filter screen and get what you need than to retrieve far more than they'll ever look at.

            "If you think it's expensive to hire a professional to do the job, wait until you hire an amateur." Red Adair. nils illegitimus carborundum me, me, me

Trekstuff (#14):

Again, the only condition they know in advance is "last 3 months". It is passed to the SP, which is expensive to run; it may take several minutes (and please don't ask me to optimize it, that's not an option at the moment). Once the data is there, it needs to present a flexible dashboard view. The user clicks the Name column to filter data, observes the result, uses it elsewhere, removes the filter. The user searches for an SSN that ends in 74, uses that data, removes the filter. The user sorts the data by date, jumps to the first page, jumps to the last page. Again, these requests cannot be passed to the SP (it would take too long); they need to be run against the subset of data returned by the SP (yes, those several million records), which presumably is stored in some local storage.

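The unpredictable ad-hoc filters described above (name starts with "D", SSN ends in 74, ...) can be translated into a parameterized WHERE clause against the cached result set at run time. A minimal sketch, again using Python/sqlite3 as a stand-in; the table, columns, and operator names are illustrative assumptions:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE cached_results (last_name TEXT, ssn TEXT)")
conn.executemany(
    "INSERT INTO cached_results VALUES (?, ?)",
    [("Davis", "123456774"), ("Smith", "987654321"), ("Doyle", "111111174")],
)

# Whitelisted operators; column names must also be whitelisted in real code
# (they are interpolated into SQL), while values always go in as parameters.
OPS = {"starts_with": "{} LIKE ? || '%'", "ends_with": "{} LIKE '%' || ?"}

def apply_filters(conn, filters):
    """filters: list of (column, op, value) tuples chosen by the user on the fly."""
    clauses, params = [], []
    for col, op, val in filters:
        clauses.append(OPS[op].format(col))
        params.append(val)
    sql = "SELECT last_name, ssn FROM cached_results"
    if clauses:
        sql += " WHERE " + " AND ".join(clauses)
    return conn.execute(sql, params).fetchall()

rows = apply_filters(conn, [("last_name", "starts_with", "D"),
                            ("ssn", "ends_with", "74")])
```

With the cache table indexed on the commonly filtered columns, each such query is a cheap index seek rather than a multi-minute proc run.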
Trekstuff wrote:

Yup, I think this is what it's shaping up to be: a dedicated DB that'd hold permanent tables with temporarily cached data. Thanks!

Mycroft Holmes (#15):

I'll add my support to this one; we have used it successfully. We found that there were only a very limited number of users performing this task, so we would flag a dataset with their ID and store the results in a single table. We then gave the user the option to either query the "cache" or refresh it with a new run of the proc. We then, of course, indexed the damned table to death, which killed the insert/delete performance but made queries acceptably fast.

              Never underestimate the power of human stupidity RAH

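The pattern Mycroft Holmes describes, one shared table holding each user's cached proc results flagged with their ID, with an option to query the cache or refresh it, can be sketched like this. Python/sqlite3 is used for illustration; `run_expensive_proc` is a stand-in for the minutes-long SP, and all names are assumptions:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE cache (owner_id INTEGER, name TEXT, amount REAL)")
# "Indexed to death": slows inserts/deletes, makes cache reads fast.
conn.execute("CREATE INDEX ix_cache_owner ON cache (owner_id, name)")

def run_expensive_proc():
    # Stand-in for the stored procedure that takes several minutes.
    return [("alpha", 1.0), ("beta", 2.0)]

def refresh_cache(conn, owner_id):
    """Drop this user's slice and repopulate it from a fresh proc run."""
    conn.execute("DELETE FROM cache WHERE owner_id = ?", (owner_id,))
    conn.executemany(
        "INSERT INTO cache VALUES (?, ?, ?)",
        [(owner_id, n, a) for n, a in run_expensive_proc()],
    )

def query_cache(conn, owner_id, name):
    """Cheap indexed read; never touches the expensive proc."""
    return conn.execute(
        "SELECT amount FROM cache WHERE owner_id = ? AND name = ?",
        (owner_id, name),
    ).fetchall()

refresh_cache(conn, owner_id=42)
result = query_cache(conn, 42, "beta")
```

Keeping the owner ID as the leading index column means one user's refresh (delete + bulk insert) touches only their own slice of the table.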
Trekstuff wrote:

I have a T-SQL stored procedure that can run for a few minutes and return a few million records. I need to display that data in an ASP.NET grid (Infragistics WebDataGrid, to be precise). Obviously I don't want to return all the data at once and need to set up some kind of paging: every time the user selects another page, another portion of data is loaded from the DB. But I can't run the SP every time a new page is requested; it would take too much time. What would be the best way to persist data from the SP, so that when the user selects a new page, the new data portion is loaded by a simple SELECT ... WHERE from that temp data storage?

foxyland (#16):

Trekstuff: we might have the same case as you. You didn't describe what 'the expensive SP' is doing, but in our case it is a complex cross-tab query over about 5 million rows of data. In our case the time period is always one month, so we have a SQL Agent job that runs the query at 23:00 on the last day of each month. The results are then stored in permanent tables in a different database, and users run their queries against those tables. The result is satisfying; we just have to 'torture' the server for about 3 to 5 hours that night. Of course, the downside is that there is always some specific requirement the available data warehouse can't satisfy. But we have a policy that users must first submit any 'custom' need to the IT department and wait at least one day for the result. That way, we can still have our tea time :) hth, foxyland

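foxyland's refresh runs via SQL Agent at 23:00 on the last day of each month. If no job scheduler were available, computing that next run time in application code is a small exercise; the sketch below is a pure assumption, not something from the thread:

```python
import calendar
import datetime

def next_month_end_run(now):
    """Next 23:00-on-the-last-day-of-a-month instant at or after `now`."""
    last_day = calendar.monthrange(now.year, now.month)[1]
    run = datetime.datetime(now.year, now.month, last_day, 23, 0)
    if now >= run:  # this month's slot already passed: roll to next month
        if now.month == 12:
            year, month = now.year + 1, 1
        else:
            year, month = now.year, now.month + 1
        last_day = calendar.monthrange(year, month)[1]
        run = datetime.datetime(year, month, last_day, 23, 0)
    return run

run_at = next_month_end_run(datetime.datetime(2024, 1, 15, 12, 0))
```

`calendar.monthrange` handles month lengths and leap years, which is the only fiddly part of "last day of the month" scheduling.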
Trekstuff wrote:

I apologize, I must not have made my intentions clear. The user won't need to see all those few million records at once. He/she will need to make ad-hoc requests against this dataset: filter, sort, and search it on the fly, and expect results fast. Thanks for the math lesson though :)

jschell (#17):

                  Then the best way is to ask the user for the criteria and then build an appropriate query from that.

Mycroft Holmes wrote:

I'll add my support to this one; we have used it successfully. We found that there were only a very limited number of users performing this task, so we would flag a dataset with their ID and store the results in a single table. We then gave the user the option to either query the "cache" or refresh it with a new run of the proc. We then, of course, indexed the damned table to death, which killed the insert/delete performance but made queries acceptably fast.


Trekstuff (#18):

Yup, looks like this is the route we're going to take as well. Thanks!

jschell wrote:

                      Then the best way is to ask the user for the criteria and then build an appropriate query from that.

Trekstuff (#19):

The user may have dozens of different criteria he/she would want to run ad hoc. Those criteria aren't known in advance and need to be applied to the data on the fly.

foxyland wrote:

Trekstuff: we might have the same case as you. You didn't describe what 'the expensive SP' is doing, but in our case it is a complex cross-tab query over about 5 million rows of data. In our case the time period is always one month, so we have a SQL Agent job that runs the query at 23:00 on the last day of each month. The results are then stored in permanent tables in a different database, and users run their queries against those tables. The result is satisfying; we just have to 'torture' the server for about 3 to 5 hours that night. Of course, the downside is that there is always some specific requirement the available data warehouse can't satisfy. But we have a policy that users must first submit any 'custom' need to the IT department and wait at least one day for the result. That way, we can still have our tea time :) hth, foxyland

Trekstuff (#20):

This helps indeed. More and more I am getting convinced that a separate DB with permanent tables is the way to go. Thanks, guys, you're the best.

jschell wrote:

                          Trekstuff wrote:

                          and return a few million records, I need to display that data in an ASP.NET Grid

                          Your requirements are badly flawed. If you have one million records and a user looks at each record for only 1 second it would take them 7 weeks (8 hour days at 5 days a week) to look at the list.

Lost User (#21):

                          jschell wrote:

                          If you have one million records and a user looks at each record for only 1 second it would take them 7 weeks (8 hour days at 5 days a week) to look at the list.

...assuming that they're looking at the records themselves, and not some kind of visualization of those records (like a chart). That's assuming they're looking at the data at all; for all I know they could be preparing an export to some Microsoft Access database.

                          Bastard Programmer from Hell :suss:

Trekstuff wrote:

The user may have dozens of different criteria he/she would want to run ad hoc. Those criteria aren't known in advance and need to be applied to the data on the fly.

jschell (#22):

                            Trekstuff wrote:

                            User may have dozes of different criteria he/she would want to run ad-hoc.

May and will are different, and standard business users will not have those requirements. For example, anyone dealing with customers only needs a couple of criteria to find the customer/order. And if you really meant "ad hoc", then you have a power user; they should be proficient in SQL and use a SQL tool, not a custom-built app.

jschell wrote:

                              Trekstuff wrote:

                              User may have dozes of different criteria he/she would want to run ad-hoc.

May and will are different, and standard business users will not have those requirements. For example, anyone dealing with customers only needs a couple of criteria to find the customer/order. And if you really meant "ad hoc", then you have a power user; they should be proficient in SQL and use a SQL tool, not a custom-built app.

Trekstuff (#23):

                              Yes I am convinced. I can see it now: A bunch of execs at a board meeting firing up SSMS and just querying happily away :)

Trekstuff wrote:

                                Yes I am convinced. I can see it now: A bunch of execs at a board meeting firing up SSMS and just querying happily away :)

jschell (#24):

                                Trekstuff wrote:

                                Yes I am convinced. I can see it now: A bunch of execs at a board meeting firing up SSMS and just querying happily away

If they want unlimited queries, that is exactly what is needed, and wrapping it in another app doesn't change that. That is exactly the situation at a bank I worked at: the VP in charge of merchant credit services often (more than once a week) ran custom queries directly against the production database.
