SQL vs Code

The Lounge | Tags: database, visual-studio | 43 Posts, 19 Posters
  • P Pete OHanlon

    You. No. The poor sap who has to enhance it 2 years down the line. Yes.

    "WPF has many lovers. It's a veritable porn star!" - Josh Smith

    As Braveheart once said, "You can take our freedom but you'll never take our Hobnobs!" - Martin Hughes.

    My blog | My articles | MoXAML PowerToys | Onyx

Rama Krishna Vavilala
    #5

Maybe, or maybe not. The code would have been around 400 lines; the SQL is about 50 significant lines. The main thing with SQL is performance (a set operation on the database server vs. step-by-step operations on the middle tier). The performance difference came to around 10x (roughly 30 minutes vs. 6 hours).

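For illustration, here is a minimal sketch of the contrast being described: step-by-step processing on the middle tier versus one set operation on the database server. It runs both ways purely for comparison, and the table, column, and connection names are hypothetical.

using System.Collections.Generic;
using System.Data.SqlClient;

class SetVersusRowByRow
{
    static void Main()
    {
        using (var conn = new SqlConnection("Server=.;Database=Demo;Integrated Security=true"))
        {
            conn.Open();

            // Step-by-step on the middle tier: pull the keys, then issue one
            // UPDATE per row -- every row costs its own round trip to the server.
            var ids = new List<int>();
            using (var selectCmd = new SqlCommand("SELECT OrderId FROM Orders WHERE Total IS NULL", conn))
            using (var reader = selectCmd.ExecuteReader())
                while (reader.Read())
                    ids.Add(reader.GetInt32(0));

            foreach (int id in ids)
                using (var update = new SqlCommand(
                    "UPDATE Orders SET Total = (SELECT SUM(Amount) FROM OrderLines WHERE OrderId = @id) " +
                    "WHERE OrderId = @id", conn))
                {
                    update.Parameters.AddWithValue("@id", id);
                    update.ExecuteNonQuery();
                }

            // The same work as a single set operation on the database server:
            // one statement, one round trip, and the optimizer sees the whole set.
            using (var setBased = new SqlCommand(
                @"UPDATE o
                  SET Total = l.LineTotal
                  FROM Orders o
                  JOIN (SELECT OrderId, SUM(Amount) AS LineTotal
                        FROM OrderLines
                        GROUP BY OrderId) l ON l.OrderId = o.OrderId
                  WHERE o.Total IS NULL", conn))
            {
                setBased.ExecuteNonQuery();
            }
        }
    }
}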
    • L leppie

      Rama Krishna Vavilala wrote:

      with recursive CTEs

      That's about the only wicked use of SQL I have ever seen (and used).

      xacc.ide
      IronScheme - 1.0 RC 1 - out now!
      ((λ (x) `(,x ',x)) '(λ (x) `(,x ',x))) The Scheme Programming Language – Fourth Edition

Distind
      #6

Check out CUBE. I had to replace one; while I'm sure it's wonderful in certain applications (or so I keep telling myself), it sure didn't belong where I found it.

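For reference, a minimal sketch of the kind of CUBE aggregation being mentioned, issued through ADO.NET; the Sales table, its columns, and the connection string are hypothetical.

using System;
using System.Data.SqlClient;

class CubeSketch
{
    static void Main()
    {
        using (var conn = new SqlConnection("Server=.;Database=Demo;Integrated Security=true"))
        using (var cmd = new SqlCommand(
            // WITH CUBE adds subtotal rows for every combination of the grouping
            // columns plus a grand total; NULLs mark the rolled-up levels.
            @"SELECT Region, Product, SUM(Amount) AS Total
              FROM Sales
              GROUP BY Region, Product WITH CUBE", conn))
        {
            conn.Open();
            using (var reader = cmd.ExecuteReader())
                while (reader.Read())
                    Console.WriteLine("{0} | {1} | {2}",
                        reader.IsDBNull(0) ? "(all)" : reader.GetString(0),
                        reader.IsDBNull(1) ? "(all)" : reader.GetString(1),
                        reader.GetValue(2));
        }
    }
}

WITH CUBE (or the newer GROUP BY CUBE(...) form) produces every subtotal combination in one pass, which is also why it tends to turn up in places it doesn't belong.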
      • R Rama Krishna Vavilala

Maybe, or maybe not. The code would have been around 400 lines; the SQL is about 50 significant lines. The main thing with SQL is performance (a set operation on the database server vs. step-by-step operations on the middle tier). The performance difference came to around 10x (roughly 30 minutes vs. 6 hours).

Pete OHanlon
        #7

        You don't have to tell me - I once managed to get an app that took 180 hours to run down to half an hour by recreating the processing in a stored procedure.

        "WPF has many lovers. It's a veritable porn star!" - Josh Smith

        As Braveheart once said, "You can take our freedom but you'll never take our Hobnobs!" - Martin Hughes.

        My blog | My articles | MoXAML PowerToys | Onyx

        • R Rama Krishna Vavilala

Maybe, or maybe not. The code would have been around 400 lines; the SQL is about 50 significant lines. The main thing with SQL is performance (a set operation on the database server vs. step-by-step operations on the middle tier). The performance difference came to around 10x (roughly 30 minutes vs. 6 hours).

Ennis Ray Lynch Jr
          #8

The code could be wrong. Both 6 hours and 30 minutes seem like a lot. How many records are we talking about here?

          Need custom software developed? I do custom programming based primarily on MS tools with an emphasis on C# development and consulting. A man said to the universe: "Sir I exist!" "However," replied the universe, "The fact has not created in me A sense of obligation." --Stephen Crane

          • R Rama Krishna Vavilala

Maybe, or maybe not. The code would have been around 400 lines; the SQL is about 50 significant lines. The main thing with SQL is performance (a set operation on the database server vs. step-by-step operations on the middle tier). The performance difference came to around 10x (roughly 30 minutes vs. 6 hours).

Dan Neely
            #9

            Just make sure you've documented it (including the slower code) so the code monkey who comes in a few years later to patch it doesn't kill performance after testing both ways with only 20 records. :rolleyes:

            3x12=36 2x12=24 1x12=12 0x12=18

            • E Ennis Ray Lynch Jr

The code could be wrong. Both 6 hours and 30 minutes seem like a lot. How many records are we talking about here?

              Need custom software developed? I do custom programming based primarily on MS tools with an emphasis on C# development and consulting. A man said to the universe: "Sir I exist!" "However," replied the universe, "The fact has not created in me A sense of obligation." --Stephen Crane

Rama Krishna Vavilala
              #10

              Ennis Ray Lynch, Jr. wrote:

How many records are we talking about here?

              5 million! But the processing is super complicated (recursion/trees).

              Ennis Ray Lynch, Jr. wrote:

              The code could be wrong

No, my experience with previous projects has been the same: code is usually a lot slower than SPs. Which makes sense, as no data needs to be transferred over the wire or via IPC.

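Since recursive CTEs are what let this kind of tree walk stay a set operation on the server, here is a minimal sketch of one; the Nodes table, its columns, and the connection string are hypothetical.

using System;
using System.Data.SqlClient;

class RecursiveCteSketch
{
    static void Main()
    {
        using (var conn = new SqlConnection("Server=.;Database=Demo;Integrated Security=true"))
        using (var cmd = new SqlCommand(
            // The anchor member selects the roots; the recursive member joins each
            // level back to the CTE, so the whole tree is expanded as one set
            // operation on the server instead of row-by-row calls from the app.
            @"WITH Tree AS
              (
                  SELECT NodeId, ParentId, Name, 0 AS Depth
                  FROM Nodes
                  WHERE ParentId IS NULL
                  UNION ALL
                  SELECT n.NodeId, n.ParentId, n.Name, t.Depth + 1
                  FROM Nodes n
                  JOIN Tree t ON n.ParentId = t.NodeId
              )
              SELECT NodeId, Name, Depth FROM Tree", conn))
        {
            conn.Open();
            using (var reader = cmd.ExecuteReader())
                while (reader.Read())
                    Console.WriteLine("{0}{1}",
                        new string(' ', reader.GetInt32(2) * 2),
                        reader.GetString(1));
        }
    }
}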
              • D Dan Neely

                Just make sure you've documented it (including the slower code) so the code monkey who comes in a few years later to patch it doesn't kill performance after testing both ways with only 20 records. :rolleyes:

                3x12=36 2x12=24 1x12=12 0x12=18

Rama Krishna Vavilala
                #11

                Dan Neely wrote:

                after testing both ways with only 20 records

LOL! Luckily, they will not have that option. But you are right, any major decision like this has to be documented.

                • R Rama Krishna Vavilala

                  Ennis Ray Lynch, Jr. wrote:

How many records are we talking about here?

                  5 million! But the processing is super complicated (recursion/trees).

                  Ennis Ray Lynch, Jr. wrote:

                  The code could be wrong

No, my experience with previous projects has been the same: code is usually a lot slower than SPs. Which makes sense, as no data needs to be transferred over the wire or via IPC.

Andy Brummer
                  #12

                  Rama Krishna Vavilala wrote:

No, my experience with previous projects has been the same: code is usually a lot slower than SPs. Which makes sense, as no data needs to be transferred over the wire or via IPC.

I've made huge performance improvements by changing queries to just stream raw table data to an app instead of doing any aggregation in SQL Server. As soon as the memory required for the aggregation on SQL Server overflowed the available server memory, paging dragged performance to a standstill, with hundreds of other queries backed up behind it. Streaming 5-12 GB of data from SQL Server required much less memory, and we could use more efficient structures in C#, so it was thousands of times faster.

                  I can imagine the sinking feeling one would have after ordering my book, only to find a laughably ridiculous theory with demented logic once the book arrives - Mark McCutcheon

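A minimal sketch of the streaming approach described above: read raw rows through a SqlDataReader and aggregate client-side in a Dictionary, so server memory is never asked to hold the aggregation. The Sales table, its columns, and the connection string are hypothetical.

using System;
using System.Collections.Generic;
using System.Data.SqlClient;

class StreamAndAggregate
{
    static void Main()
    {
        var totals = new Dictionary<int, decimal>();

        using (var conn = new SqlConnection("Server=.;Database=Demo;Integrated Security=true"))
        using (var cmd = new SqlCommand(
            "SELECT CustomerId, Amount FROM Sales", conn)) // raw rows, no GROUP BY on the server
        {
            conn.Open();
            using (var reader = cmd.ExecuteReader())
            {
                while (reader.Read())
                {
                    int customerId = reader.GetInt32(0);
                    decimal amount = reader.GetDecimal(1);

                    // Aggregate client-side; memory grows with the number of
                    // distinct keys, not with the number of rows streamed.
                    decimal running;
                    totals.TryGetValue(customerId, out running);
                    totals[customerId] = running + amount;
                }
            }
        }

        foreach (var pair in totals)
            Console.WriteLine("{0}: {1}", pair.Key, pair.Value);
    }
}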
                  • R Rama Krishna Vavilala

                    Dan Neely wrote:

                    after testing both ways with only 20 records

LOL! Luckily, they will not have that option. But you are right, any major decision like this has to be documented.

Dan Neely
                    #13

Your dev/test environment has a dataset as massively huge as the production one? On the one hand I'm impressed that you've gone so far towards realistic testing; OTOH, having a dataset small enough to check results in under a minute is very useful too.

                    3x12=36 2x12=24 1x12=12 0x12=18

                    • R Rama Krishna Vavilala

                      Ennis Ray Lynch, Jr. wrote:

How many records are we talking about here?

                      5 million! But the processing is super complicated (recursion/trees).

                      Ennis Ray Lynch, Jr. wrote:

                      The code could be wrong

No, my experience with previous projects has been the same: code is usually a lot slower than SPs. Which makes sense, as no data needs to be transferred over the wire or via IPC.

Ennis Ray Lynch Jr
                      #14

I haven't seen the code, nor do I know the problem domain, but the kind of numbers you posted (6 hours vs. 15 minutes) indicates that something else is going on other than a comparison of two optimal approaches. I used to think transferring data over the wire is a big deal, but it really isn't. Try the following exercise for fun one day:

DateTime startDate = DateTime.Now;
...
sqlCommand.CommandText = "SELECT * FROM myTable";
...
SqlDataReader reader = sqlCommand.ExecuteReader();
while (reader.Read()) {
    ... // Don't really have to do anything for a test
}
TimeSpan result = DateTime.Now - startDate;

6 million records shouldn't take more than a few minutes at worst.

                      Need custom software developed? I do custom programming based primarily on MS tools with an emphasis on C# development and consulting. A man said to the universe: "Sir I exist!" "However," replied the universe, "The fact has not created in me A sense of obligation." --Stephen Crane

                      • A Andy Brummer

                        Rama Krishna Vavilala wrote:

No, my experience with previous projects has been the same: code is usually a lot slower than SPs. Which makes sense, as no data needs to be transferred over the wire or via IPC.

I've made huge performance improvements by changing queries to just stream raw table data to an app instead of doing any aggregation in SQL Server. As soon as the memory required for the aggregation on SQL Server overflowed the available server memory, paging dragged performance to a standstill, with hundreds of other queries backed up behind it. Streaming 5-12 GB of data from SQL Server required much less memory, and we could use more efficient structures in C#, so it was thousands of times faster.

                        I can imagine the sinking feeling one would have after ordering my book, only to find a laughably ridiculous theory with demented logic once the book arrives - Mark McCutcheon

Ennis Ray Lynch Jr
                        #15

                        Seconded.

                        Need custom software developed? I do custom programming based primarily on MS tools with an emphasis on C# development and consulting. A man said to the universe: "Sir I exist!" "However," replied the universe, "The fact has not created in me A sense of obligation." --Stephen Crane

                        • A Andy Brummer

                          Rama Krishna Vavilala wrote:

No, my experience with previous projects has been the same: code is usually a lot slower than SPs. Which makes sense, as no data needs to be transferred over the wire or via IPC.

I've made huge performance improvements by changing queries to just stream raw table data to an app instead of doing any aggregation in SQL Server. As soon as the memory required for the aggregation on SQL Server overflowed the available server memory, paging dragged performance to a standstill, with hundreds of other queries backed up behind it. Streaming 5-12 GB of data from SQL Server required much less memory, and we could use more efficient structures in C#, so it was thousands of times faster.

                          I can imagine the sinking feeling one would have after ordering my book, only to find a laughably ridiculous theory with demented logic once the book arrives - Mark McCutcheon

ChrisC ncmail
                          #16

                          Rather than remain silent and appear intelligent, I'll ask a question... Can't you use C# in SQL Server now? Wouldn't those C# optimizations work within the db itself? I haven't used any .NET in the db, so I'm interested to see what the answer is.

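To the question above: yes, SQL Server 2005 and later can host .NET code through CLR integration. Here is a minimal sketch of a C# scalar function that can be registered and called from T-SQL; the class and function names are hypothetical.

using System.Data.SqlTypes;
using Microsoft.SqlServer.Server;

public class ClrFunctions
{
    // Compiled into an assembly, loaded with CREATE ASSEMBLY, and exposed via
    // CREATE FUNCTION ... EXTERNAL NAME, this becomes a scalar UDF callable from T-SQL.
    [SqlFunction(IsDeterministic = true)]
    public static SqlString ReverseString(SqlString input)
    {
        if (input.IsNull)
            return SqlString.Null;

        char[] chars = input.Value.ToCharArray();
        System.Array.Reverse(chars);
        return new SqlString(new string(chars));
    }
}

CLR integration is off by default (it has to be enabled with sp_configure 'clr enabled'), and the assembly must be registered before T-SQL can call the function.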
                          • E Ennis Ray Lynch Jr

I haven't seen the code, nor do I know the problem domain, but the kind of numbers you posted (6 hours vs. 15 minutes) indicates that something else is going on other than a comparison of two optimal approaches. I used to think transferring data over the wire is a big deal, but it really isn't. Try the following exercise for fun one day:

DateTime startDate = DateTime.Now;
...
sqlCommand.CommandText = "SELECT * FROM myTable";
...
SqlDataReader reader = sqlCommand.ExecuteReader();
while (reader.Read()) {
    ... // Don't really have to do anything for a test
}
TimeSpan result = DateTime.Now - startDate;

6 million records shouldn't take more than a few minutes at worst.

                            Need custom software developed? I do custom programming based primarily on MS tools with an emphasis on C# development and consulting. A man said to the universe: "Sir I exist!" "However," replied the universe, "The fact has not created in me A sense of obligation." --Stephen Crane

Rama Krishna Vavilala
                            #17

Well, it is not just streaming but processing too. When you include processing, it is streaming back and forth, which affects the overall performance.

                            • D Dan Neely

Your dev/test environment has a dataset as massively huge as the production one? On the one hand I'm impressed that you've gone so far towards realistic testing; OTOH, having a dataset small enough to check results in under a minute is very useful too.

                              3x12=36 2x12=24 1x12=12 0x12=18

Rama Krishna Vavilala
                              #18

                              Dan Neely wrote:

in under a minute is very useful too.

Yes. All I need to do is enter different date ranges.

                              • E Ennis Ray Lynch Jr

I haven't seen the code, nor do I know the problem domain, but the kind of numbers you posted (6 hours vs. 15 minutes) indicates that something else is going on other than a comparison of two optimal approaches. I used to think transferring data over the wire is a big deal, but it really isn't. Try the following exercise for fun one day:

DateTime startDate = DateTime.Now;
...
sqlCommand.CommandText = "SELECT * FROM myTable";
...
SqlDataReader reader = sqlCommand.ExecuteReader();
while (reader.Read()) {
    ... // Don't really have to do anything for a test
}
TimeSpan result = DateTime.Now - startDate;

6 million records shouldn't take more than a few minutes at worst.

                                Need custom software developed? I do custom programming based primarily on MS tools with an emphasis on C# development and consulting. A man said to the universe: "Sir I exist!" "However," replied the universe, "The fact has not created in me A sense of obligation." --Stephen Crane

Andrew Rissing
                                #19

On a good connection, perhaps... but what if they're across the state and have to go through several firewalls in the process? Keeping a connection open for each user for 'a few minutes' would be horrible. Plus, a few minutes is pretty hard for a user to accept. I mean, if your computer took a few minutes to bring up your homepage, would you consider a new computer, or at least a new homepage? :D

                                • A Andrew Rissing

On a good connection, perhaps... but what if they're across the state and have to go through several firewalls in the process? Keeping a connection open for each user for 'a few minutes' would be horrible. Plus, a few minutes is pretty hard for a user to accept. I mean, if your computer took a few minutes to bring up your homepage, would you consider a new computer, or at least a new homepage? :D

Ennis Ray Lynch Jr
                                  #20

Whoever the user is, they are already accustomed to a 30-minute wait. Heaven forbid I suggest a 2-minute wait is more usable.

                                  Need custom software developed? I do custom programming based primarily on MS tools with an emphasis on C# development and consulting. A man said to the universe: "Sir I exist!" "However," replied the universe, "The fact has not created in me A sense of obligation." --Stephen Crane

                                  • E Ennis Ray Lynch Jr

                                    Seconded.

                                    Need custom software developed? I do custom programming based primarily on MS tools with an emphasis on C# development and consulting. A man said to the universe: "Sir I exist!" "However," replied the universe, "The fact has not created in me A sense of obligation." --Stephen Crane

Dan Mos
                                    #21

Seconded the second time :~ If, and almost only if, it's on a local network/Ethernet, or a very fast internet connection (8+ MBytes/sec). At my old workplace I managed to make a Core2Duo at ~2 GHz with 2 GB of RAM outperform a server with 20+ times the processing power, dedicated to the task and not shared with other apps. But for that I did lots of optimizations on the data layer: no automated ORM tools; in the worst case a binary search, and most of the time O(1) with Dictionaries and the like, instead of complex joins and calculations on the server. But I knew that the lowest-spec PC would be a Core2 at approximately 2 GHz with 2 GB of RAM. :)

                                    I bug

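A minimal sketch of the dictionary-instead-of-join idea described above: load the small reference table once into a Dictionary and resolve it per row on the client, rather than joining on the server. Table, column, and connection names are hypothetical.

using System;
using System.Collections.Generic;
using System.Data.SqlClient;

class ClientSideLookup
{
    static void Main()
    {
        var productNames = new Dictionary<int, string>();

        using (var conn = new SqlConnection("Server=.;Database=Demo;Integrated Security=true"))
        {
            conn.Open();

            // Load the small reference table once into an O(1) lookup structure.
            using (var cmd = new SqlCommand("SELECT ProductId, Name FROM Products", conn))
            using (var reader = cmd.ExecuteReader())
                while (reader.Read())
                    productNames[reader.GetInt32(0)] = reader.GetString(1);

            // Stream the big table without a join and resolve names client-side.
            using (var cmd = new SqlCommand("SELECT ProductId, Quantity FROM OrderLines", conn))
            using (var reader = cmd.ExecuteReader())
                while (reader.Read())
                {
                    string name;
                    if (!productNames.TryGetValue(reader.GetInt32(0), out name))
                        name = "(unknown product)";
                    Console.WriteLine("{0} x {1}", reader.GetInt32(1), name);
                }
        }
    }
}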
                                    • E Ennis Ray Lynch Jr

Whoever the user is, they are already accustomed to a 30-minute wait. Heaven forbid I suggest a 2-minute wait is more usable.

                                      Need custom software developed? I do custom programming based primarily on MS tools with an emphasis on C# development and consulting. A man said to the universe: "Sir I exist!" "However," replied the universe, "The fact has not created in me A sense of obligation." --Stephen Crane

Andrew Rissing
                                      #22

                                      Correct, with respect to the situation at hand. I was responding to the general statement about the 2 minutes being fine. Sorry, I'm having flashbacks from my reporting days. ;)

                                      • C ChrisC ncmail

                                        Rather than remain silent and appear intelligent, I'll ask a question... Can't you use C# in SQL Server now? Wouldn't those C# optimizations work within the db itself? I haven't used any .NET in the db, so I'm interested to see what the answer is.

Mycroft Holmes
                                        #23

I have not heard of any experienced developer here who is actually using the CLR in SQL Server; I can't see adding another layer of abstraction over T-SQL as being useful. IMHO, anyone who uses a database to store data should learn the database's language (I hate LINQ as well).

                                        Never underestimate the power of human stupidity RAH

                                        • P Pete OHanlon

                                          Move over to Oracle. The amount of stuff I've moved into the database because it fits in better there - it's wonderful, no longer having to retrieve items to perform complex processing only to put the output back into the database.

                                          "WPF has many lovers. It's a veritable porn star!" - Josh Smith

                                          As Braveheart once said, "You can take our freedom but you'll never take our Hobnobs!" - Martin Hughes.

                                          My blog | My articles | MoXAML PowerToys | Onyx

Mycroft Holmes
                                          #24

                                          Pete O'Hanlon wrote:

                                          retrieve items to perform complex processing

But... why? Surely T-SQL (or the PL/SQL flavour) can do basically anything you can do in code. I have yet to run across any requirement that I have had to go out to C# to meet. And this week we have to implement cubic splines in our procs!

                                          Never underestimate the power of human stupidity RAH
