SQL vs Code
-
I just made a tough decision, opting for complex SQL (with recursive CTEs) over complex logic in code. I hope it does not bite me later.
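For anyone who hasn't used recursive CTEs, here is a minimal sketch of the idea (hypothetical `parts` table, shown in Python with SQLite so it's easy to run; T-SQL uses the same shape but omits the RECURSIVE keyword):

```python
import sqlite3

# Hypothetical parts hierarchy -- not the actual schema from the post.
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE parts (id INTEGER PRIMARY KEY, parent_id INTEGER, name TEXT);
INSERT INTO parts VALUES
  (1, NULL, 'engine'),
  (2, 1,    'piston'),
  (3, 1,    'crankshaft'),
  (4, 2,    'piston ring');
""")

# The CTE references itself: the anchor SELECT seeds the roots,
# the recursive SELECT joins children onto rows found so far.
rows = conn.execute("""
WITH RECURSIVE subtree(id, name, depth) AS (
    SELECT id, name, 0 FROM parts WHERE parent_id IS NULL
    UNION ALL
    SELECT p.id, p.name, s.depth + 1
    FROM parts p JOIN subtree s ON p.parent_id = s.id
)
SELECT name, depth FROM subtree ORDER BY depth, id
""").fetchall()

for name, depth in rows:
    print("  " * depth + name)
# engine
#   piston
#   crankshaft
#     piston ring
```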
-
Rama Krishna Vavilala wrote:
I hope it does not bite me later.
You? No. The poor sap who has to enhance it two years down the line? Yes.
"WPF has many lovers. It's a veritable porn star!" - Josh Smith
As Braveheart once said, "You can take our freedom but you'll never take our Hobnobs!" - Martin Hughes.
-
Rama Krishna Vavilala wrote:
with recursive CTEs
That's about the only wicked use of SQL I have ever seen (and used).
xacc.ide
IronScheme - 1.0 RC 1 - out now!
((λ (x) `(,x ',x)) '(λ (x) `(,x ',x))) The Scheme Programming Language – Fourth Edition
-
Rama Krishna Vavilala wrote:
with recursive CTEs
That's about the only wicked use of SQL I have ever seen (and used).
Move over to Oracle. The amount of stuff I've moved into the database because it fits in better there - it's wonderful, no longer having to retrieve items to perform complex processing only to put the output back into the database.
"WPF has many lovers. It's a veritable porn star!" - Josh Smith
As Braveheart once said, "You can take our freedom but you'll never take our Hobnobs!" - Martin Hughes.
-
You. No. The poor sap who has to enhance it 2 years down the line. Yes.
Maybe, maybe not. The code would have been around 400 lines; the SQL is about 50 significant lines. The main thing with SQL is performance (a set operation on the database server vs. step-by-step operations on the middle tier). The performance difference came to around 10 times (approx. 30 mins vs. 6 hrs).
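The set-based vs. row-by-row distinction can be sketched with a toy example (Python/SQLite stand-in with a made-up `orders` table, not the actual schema from the post): both approaches produce the same result, but one is a single statement evaluated inside the engine while the other shuttles every row through application code.

```python
import sqlite3

def make_db():
    # Made-up table: five orders with totals 100..500.
    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE orders (id INTEGER PRIMARY KEY, total REAL)")
    conn.executemany("INSERT INTO orders VALUES (?, ?)",
                     [(i, 100.0 * i) for i in range(1, 6)])
    return conn

# Middle-tier style: pull each row out, compute in app code, write it back.
row_by_row = make_db()
for oid, total in row_by_row.execute("SELECT id, total FROM orders").fetchall():
    row_by_row.execute("UPDATE orders SET total = ? WHERE id = ?", (total * 0.9, oid))

# Set-based style: one statement, evaluated entirely inside the engine.
set_based = make_db()
set_based.execute("UPDATE orders SET total = total * 0.9")

# Same answer either way; the set-based form avoids a round trip per row.
assert (row_by_row.execute("SELECT total FROM orders ORDER BY id").fetchall()
        == set_based.execute("SELECT total FROM orders ORDER BY id").fetchall())
```

With millions of rows, those per-row round trips are where the hours go.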
-
Rama Krishna Vavilala wrote:
The performance difference came to be around 10 times (30 mins vs 6 hrs approx).
You don't have to tell me - I once managed to get an app that took 180 hours to run down to half an hour by recreating the processing in a stored procedure.
"WPF has many lovers. It's a veritable porn star!" - Josh Smith
As Braveheart once said, "You can take our freedom but you'll never take our Hobnobs!" - Martin Hughes.
-
Rama Krishna Vavilala wrote:
The performance difference came to be around 10 times (30 mins vs 6 hrs approx).
The code could be wrong. Both 6 hours and 30 minutes seem like a lot. How many records are we talking about here?
Need custom software developed? I do custom programming based primarily on MS tools with an emphasis on C# development and consulting. A man said to the universe: "Sir I exist!" "However," replied the universe, "The fact has not created in me A sense of obligation." --Stephen Crane
-
Ennis Ray Lynch, Jr. wrote:
How many records are we talking about here?
5 million! But the processing is super complicated (recursion/trees).
Ennis Ray Lynch, Jr. wrote:
The code could be wrong
No, my experience with previous projects has been the same: code is usually a lot slower than SPs. Which makes sense, as no data needs to be transferred over the wire or via IPC.
-
Just make sure you've documented it (including the slower code) so the code monkey who comes in a few years later to patch it doesn't kill performance after testing both ways with only 20 records. :rolleyes:
3x12=36 2x12=24 1x12=12 0x12=18
Dan Neely wrote:
after testing both ways with only 20 records
LOL! They will not have that option, luckily. But you are right: any major decision like this has to be documented.
-
Rama Krishna Vavilala wrote:
No my experience with previous projects has been the same code is usually lot slower than SPs. Which makes sense as no data needs to be transfered over wire or via IPC.
I've made huge performance improvements by changing queries to just stream raw table data to an app instead of doing any aggregation in SQL Server. As soon as the memory required for the aggregation on SQL Server overflowed the available server memory, paging dragged performance to a standstill, with hundreds of other queries backed up behind it. Streaming 5-12 GB of data from SQL Server required much less memory, and we could use more efficient structures in C#, so it was thousands of times faster.
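That streaming-plus-app-side-aggregation approach can be sketched like this (Python/SQLite stand-in with a made-up `sales` table; the real setup described above was C# against SQL Server):

```python
import sqlite3

# Made-up table standing in for the large production one.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE sales (region TEXT, amount INTEGER)")
conn.executemany("INSERT INTO sales VALUES (?, ?)",
                 [("east", 10), ("west", 20), ("east", 5), ("west", 1)])

# Instead of "SELECT region, SUM(amount) ... GROUP BY region" on the server,
# stream raw rows and aggregate in the application with a plain dict.
# The cursor yields rows one at a time, so memory stays bounded by the
# number of groups, not the number of rows.
totals = {}
for region, amount in conn.execute("SELECT region, amount FROM sales"):
    totals[region] = totals.get(region, 0) + amount

print(totals)  # {'east': 15, 'west': 21}
```

The same pattern in C# would be a `SqlDataReader` loop feeding a `Dictionary`; the point is that the server only has to ship rows, never to hold the whole aggregation in memory.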
I can imagine the sinking feeling one would have after ordering my book, only to find a laughably ridiculous theory with demented logic once the book arrives - Mark McCutcheon
-
Rama Krishna Vavilala wrote:
They will not have that option luckily.
Your dev/test environment has a dataset as massively huge as the production one? On the one hand I'm impressed that you've gone so far towards realistic testing; OTOH, having a dataset small enough to check results in under a minute is very useful too.
3x12=36 2x12=24 1x12=12 0x12=18
-
I haven't seen the code nor do I know the problem domain, but the kind of numbers you posted (6 hours vs. 30 minutes) indicates that something else is going on other than comparing two optimum approaches. I used to think transferring data over the wire is a big deal, but it really isn't. Try the following exercise for fun one day:
// using System.Data.SqlClient;
DateTime startDate = DateTime.Now;
using (SqlConnection connection = new SqlConnection(connectionString))
{
    connection.Open();
    SqlCommand sqlCommand = connection.CreateCommand();
    sqlCommand.CommandText = "SELECT * FROM myTable";
    using (SqlDataReader reader = sqlCommand.ExecuteReader())
    {
        while (reader.Read())
        {
            // Don't really have to do anything for a test
        }
    }
}
TimeSpan result = DateTime.Now - startDate;
6 million records shouldn't take more than a few minutes at worst.
Need custom software developed? I do custom programming based primarily on MS tools with an emphasis on C# development and consulting. A man said to the universe: "Sir I exist!" "However," replied the universe, "The fact has not created in me A sense of obligation." --Stephen Crane
-
I've made huge performance improvements by changing queries to just stream raw table data to an app instead of doing any aggregation in SQL server. As soon as the memory required for the aggregation on sql server overflowed the available server memory, paging dragged performance to a standstill with hundreds of other queries backed up behind it. Streaming several 5-12Gigs of data from SQL required much less memory and we could use more efficient structures in C# so it was thousands of times faster.
Seconded.
Need custom software developed? I do custom programming based primarily on MS tools with an emphasis on C# development and consulting. A man said to the universe: "Sir I exist!" "However," replied the universe, "The fact has not created in me A sense of obligation." --Stephen Crane
-
I've made huge performance improvements by changing queries to just stream raw table data to an app instead of doing any aggregation in SQL server. As soon as the memory required for the aggregation on sql server overflowed the available server memory, paging dragged performance to a standstill with hundreds of other queries backed up behind it. Streaming several 5-12Gigs of data from SQL required much less memory and we could use more efficient structures in C# so it was thousands of times faster.
Rather than remain silent and appear intelligent, I'll ask a question... Can't you use C# in SQL Server now? Wouldn't those C# optimizations work within the db itself? I haven't used any .NET in the db, so I'm interested to see what the answer is.
-
Ennis Ray Lynch, Jr. wrote:
I used to think transferring data over the wire is a big deal, but it really isn't.
Well, it is not just streaming but processing too. When you include processing, it is streaming back and forth, which affects the overall performance.
-
Dan Neely wrote:
with in under a minute is very useful too.
Yes. All I need to do is enter different date ranges.
-
Ennis Ray Lynch, Jr. wrote:
6 million records shouldn't take more than a few minutes at worse.
On a good connection, perhaps... but what if they're across the state and have to go through several firewalls in the process? Keeping a connection open for each user for 'a few minutes' would be horrible. Plus, a few minutes is pretty hard for a user to accept. I mean, if your computer took a few minutes to bring up your homepage, would you consider a new computer, or at least a new homepage? :D
-
Plus, a few minutes is pretty hard for a user to accept.
Whoever the user is, they are already accustomed to a 30-minute wait. Heaven forbid I suggest a 2-minute wait is more usable.
Need custom software developed? I do custom programming based primarily on MS tools with an emphasis on C# development and consulting. A man said to the universe: "Sir I exist!" "However," replied the universe, "The fact has not created in me A sense of obligation." --Stephen Crane