What's the best way to temporarily persist the results of a long-running SP?
-
Yup, I think this is what it's shaping up to be: a dedicated DB that'd hold permanent tables with temporarily cached data. Thanks!
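(A minimal sketch of that setup, assuming hypothetical names throughout: a dedicated ResultsCache database holding a permanent table that caches one user's results per run.)

-- Sketch only: all database, table, and column names are hypothetical.
CREATE DATABASE ResultsCache;
GO
USE ResultsCache;
GO
CREATE TABLE dbo.CachedResult
(
    UserId       INT           NOT NULL,              -- which user the cached rows belong to
    RowId        INT           NOT NULL,              -- stable key for paging
    CustomerName NVARCHAR(100) NULL,                  -- ...columns matching the SP's result set...
    Amount       DECIMAL(18,2) NULL,
    RefreshedAt  DATETIME      NOT NULL DEFAULT GETDATE()
);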
I'll add my support to this one; we have used it successfully. We found that only a very limited number of users performed this task, so we would flag a dataset with their ID and store the results in a single table. We then gave the user the option to either query the "cache" or refresh it with a fresh run of the proc. We then, of course, indexed the damned table to death, which killed insert/delete performance but made queries acceptably fast.
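(A minimal sketch of that cache/refresh flow against a table like the dbo.CachedResult above; the proc name and column list are hypothetical.)

-- "Index the table to death": slows the refresh, speeds every later read.
CREATE CLUSTERED INDEX IX_Cache_User ON dbo.CachedResult (UserId, RowId);
CREATE INDEX IX_Cache_Customer ON dbo.CachedResult (UserId, CustomerName);

-- "Refresh the cache": drop the user's old rows and re-run the expensive proc.
DECLARE @UserId INT = 42;

DELETE dbo.CachedResult WHERE UserId = @UserId;

INSERT dbo.CachedResult (UserId, RowId, CustomerName, Amount)
EXEC dbo.ExpensiveProc @UserId;   -- assumes the proc's result set matches these columns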
Never underestimate the power of human stupidity. - RAH
-
I have a T-SQL stored procedure that can run for a few minutes and return a few million records; I need to display that data in an ASP.NET grid (Infragistics WebDataGrid, to be precise). Obviously I don't want to return all the data at once, so I need to set up some kind of paging: every time the user selects another page, another portion of the data is loaded from the DB. But I can't run the SP every time a new page is requested; it would take too much time. What would be the best way to persist the data from the SP, so that when the user selects a new page, the new data portion can be loaded with a simple SELECT ... WHERE from that temporary data storage?
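(A minimal sketch of the paging side, assuming the SP's output has already been dumped into a hypothetical dbo.CachedResult table; ROW_NUMBER() paging, SQL Server 2008+ syntax.)

-- Sketch only: page through rows already cached for this user.
DECLARE @UserId INT = 42,
        @PageNumber INT = 3,
        @PageSize INT = 50;

WITH Numbered AS
(
    SELECT RowId, CustomerName, Amount,
           ROW_NUMBER() OVER (ORDER BY RowId) AS rn
    FROM dbo.CachedResult
    WHERE UserId = @UserId              -- plus any ad-hoc filter criteria
)
SELECT RowId, CustomerName, Amount
FROM Numbered
WHERE rn BETWEEN (@PageNumber - 1) * @PageSize + 1
             AND @PageNumber * @PageSize
ORDER BY rn;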
-
Trekstuff: We might have the same case as you. You didn't describe what 'the expensive SP' is doing, but in our case it is a complex cross-tab query over about 5 million rows of data. For us the time period is always one month, so we have a SQL Agent job that runs the query at 23:00 on the last day of each month. The results are then stored in permanent tables in a different database, and users run their queries against those tables. The result is satisfying; we just have to 'torture' the server for about 3 to 5 hours on that night. Of course, the downside is that there is always some specific requirement the available data warehouse can't satisfy. But we have a policy that users have to submit such 'custom' requests to the IT department first, and wait at least one day for the result. That way, we can still have our tea time :) hth, foxyland
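(A minimal sketch of that kind of month-end refresh, with hypothetical table and proc names; the proc would hang off a SQL Agent job scheduled for 23:00 on the last day of each month.)

-- Sketch only: rebuild the month's cross-tab snapshot in a separate reporting DB.
CREATE PROCEDURE dbo.RefreshMonthlyCrossTab
AS
BEGIN
    SET NOCOUNT ON;

    TRUNCATE TABLE ReportDB.dbo.MonthlyCrossTab;   -- discard the old snapshot

    INSERT ReportDB.dbo.MonthlyCrossTab (PeriodMonth, Category, Amount)
    SELECT DATEADD(MONTH, DATEDIFF(MONTH, 0, t.TxnDate), 0),  -- first day of month
           t.Category,
           SUM(t.Amount)
    FROM dbo.Transactions AS t                     -- hypothetical ~5M-row source
    GROUP BY DATEADD(MONTH, DATEDIFF(MONTH, 0, t.TxnDate), 0),
             t.Category;
END;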
-
I apologize; I must not have made my intentions clear. The user won't need to see all those few million records at once. He/she would need to make ad-hoc requests against this dataset: filter/sort/search it on the fly and expect results fast. Thanks for the math lesson though :)
-
Then the best way is to ask the user for the criteria and then build an appropriate query from that.
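(A minimal sketch of building such a query safely with sp_executesql and parameters, rather than concatenating raw user input; all names are hypothetical.)

-- Sketch only: assemble WHERE clauses from whichever criteria the user supplied.
DECLARE @CustomerName NVARCHAR(100) = N'Acme',   -- NULL = criterion not used
        @MinAmount    DECIMAL(18,2) = NULL;

DECLARE @sql NVARCHAR(MAX) =
    N'SELECT OrderId, CustomerName, Amount FROM dbo.Orders WHERE 1 = 1';

IF @CustomerName IS NOT NULL
    SET @sql += N' AND CustomerName = @CustomerName';
IF @MinAmount IS NOT NULL
    SET @sql += N' AND Amount >= @MinAmount';

EXEC sp_executesql @sql,
     N'@CustomerName NVARCHAR(100), @MinAmount DECIMAL(18,2)',
     @CustomerName = @CustomerName, @MinAmount = @MinAmount;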
-
Trekstuff wrote:
and return a few million records; I need to display that data in an ASP.NET grid
Your requirements are badly flawed. If you have one million records and a user looks at each record for only one second, it would take them 7 weeks (8-hour days, 5 days a week) to look at the list.
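(Checking the arithmetic: 1,000,000 records × 1 s = 1,000,000 s ≈ 278 hours; at 8 hours a day that is about 35 working days, i.e. roughly 7 five-day weeks.)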
-
jschell wrote:
If you have one million records and a user looks at each record for only one second, it would take them 7 weeks (8-hour days, 5 days a week) to look at the list.
...assuming that they're looking at the records themselves, and not some kind of visualization of those records (like a chart) - and that's assuming they're looking at the data at all; for all I know they could be preparing an export to some Microsoft Access database.
Bastard Programmer from Hell :suss:
-
The user may have dozens of different criteria he/she would want to run ad hoc. Those criteria aren't known in advance and need to be applied to the data on the fly.
-
Trekstuff wrote:
The user may have dozens of different criteria he/she would want to run ad hoc.
May and will are different, and standard business users will not have those requirements. For example, anyone dealing with customers only needs a couple of criteria to find the customer/order. And if you really meant "ad hoc", then you have a power user, and they should be proficient in SQL and have a SQL application, not a custom-built app.
-
Yes, I am convinced. I can see it now: a bunch of execs at a board meeting firing up SSMS and just querying happily away :)
-
Trekstuff wrote:
Yes, I am convinced. I can see it now: a bunch of execs at a board meeting firing up SSMS and just querying happily away
If they want unlimited queries, that is exactly what is needed, and wrapping it in another app doesn't change that. And that is exactly the situation at a bank I worked at: the VP in charge of merchant credit services often (more than once a week) ran custom queries directly against the production database.