Is there a max row count for SqlDataReader?

Forum: C#
Tags: database, csharp, sharepoint, sql-server, visual-studio
14 posts, 6 posters

Ben Cocker
#1

Hi there, I am reading a (rather large) MS-SQL stored procedure into C# using the SqlDataReader. I expect this to return slightly over 8,000,000 records, but it consistently maxes out at 726,301. I cannot for the life of me see why - when running the stored procedure (which has no input parameters) in SQL Server Management Studio it returns the correct number of records. I am running VS 2008 Professional with SQL Server 2008 Developer Ed. The code is paraphrased as follows:

// Set up a list of data type Record (my own) to hold the output of the sp
List<Record> records = new List<Record>();

System.Data.SqlClient.SqlConnection Connection = new System.Data.SqlClient.SqlConnection(ConnectionString);
System.Data.SqlClient.SqlCommand Command;
try
{
    Connection.Open();
    Command = new System.Data.SqlClient.SqlCommand("StoredProcedure", Connection);
    Command.CommandType = CommandType.StoredProcedure;
    System.Data.SqlClient.SqlDataReader Reader = Command.ExecuteReader();
    while (Reader.Read())
    {
        Record NewRecord = new Record();
        NewRecord.PAR1 = Reader.GetInt32(Reader.GetOrdinal("PAR1"));
        NewRecord.PAR2 = Reader.GetInt32(Reader.GetOrdinal("PAR2"));
        // ... etc
        records.Add(NewRecord);
    }
}
catch { }
finally
{
    Connection.Close();
}

Thanks a million for any help! Ben

In reply to Ben Cocker:

Luc Pattyn
#2

Hi, why don't you catch an Exception and look at all its information, using Exception.ToString()? Chances are the problem has been reported but you showed no interest. Could be a timeout, an out-of-memory condition, ... Since the number of rows returned is constant, I do expect an OOM. BTW: if 8 million is what you expect, how many records are there? And if you were to want all existing records, why use an SP in the first place? :)

In reply to Ben Cocker:

Jon Rista
#3

Well, my guess is that the sheer number of records you're processing in-memory is just flat out insane. :P The problem is most likely not a row count limitation of SqlDataReader... but rather a memory limitation. You are trying to create a massive List of Record objects... the overhead for that is going to be fairly high compared to just displaying rows of text in, say, SQL Management Studio. Depending on exactly how many columns you are setting on your Record object, the data types of each, and the lengths of strings that may exist in your result set... you could be looking at GIGS of memory usage here. I am not exactly sure what you're doing or what your needs are... but you should look into batching your work. Either chunk it up into significantly smaller data sets (say, 10,000 records each) and process them one at a time... or distribute those chunks out to a server farm to process them in parallel. If you absolutely need to process all 8 million records at once... then you are probably doing something that calls for LAPACK or some other matrix or vector processor that can efficiently handle large data sets.
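For illustration, a minimal (untested) sketch of the chunking idea - read in batches of 10,000 and clear the list between batches so memory stays flat. ProcessBatch is a placeholder for whatever the real work is, and Record/ConnectionString are borrowed from the original post:

// assumes: using System.Collections.Generic; using System.Data; using System.Data.SqlClient;
const int BatchSize = 10000;
List<Record> batch = new List<Record>(BatchSize);

using (SqlConnection connection = new SqlConnection(ConnectionString))
using (SqlCommand command = new SqlCommand("StoredProcedure", connection))
{
    command.CommandType = CommandType.StoredProcedure;
    connection.Open();
    using (SqlDataReader reader = command.ExecuteReader())
    {
        int par1 = reader.GetOrdinal("PAR1");   // look the ordinals up once, outside the loop
        int par2 = reader.GetOrdinal("PAR2");
        while (reader.Read())
        {
            Record record = new Record();
            record.PAR1 = reader.GetInt32(par1);
            record.PAR2 = reader.GetInt32(par2);
            batch.Add(record);

            if (batch.Count == BatchSize)
            {
                ProcessBatch(batch);   // placeholder: do the real work on this chunk
                batch.Clear();         // release the processed rows
            }
        }
        if (batch.Count > 0)
            ProcessBatch(batch);       // handle the final partial batch
    }
}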

In reply to Luc Pattyn:

Ben Cocker
#4

Hi Luc, Thanks for your reply. In answer to your questions: I don't get an exception - it just terminates the while (Reader.Read()) loop at 726,301 records. I have actually slightly increased the field count of the query (which also makes it take much longer) - but this doesn't affect the number of records it tops out at, which made me think that timeout and OOM might not be the reason. I used an SP in order to be able to test and maintain the query on the server rather than just having a long text string select statement in my C# code. Cheers, Ben

In reply to Jon Rista:

Ben Cocker
#5

Hi Jon, Thanks for your reply, and the suggestions. Could I point you to my reply to Luc for a discussion of this out-of-memory limitation? Also, I have been very careful to limit the number of fields I use and I don't have any string columns, so although I take your point that 8 million rows is a beast, in SQL Management Studio the size of the comma-delimited output file is less than 1.5 GB. I've got a 32-bit machine with maximum memory, so I am skating on thin ice, but I still expected it to work out memory-wise. I'll look into LAPACK - thanks for the suggestion. Ben
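Back-of-the-envelope, and purely an assumption since the real Record layout isn't shown here - say Record is a class with about six int fields:

// very rough estimate only
long rows = 8000000;
long bytesPerRecord = 8 + (6 * 4) + 4;      // 32-bit object header + six int fields + List<Record> reference slot
long totalBytes = rows * bytesPerRecord;    // ~288 MB of Record objects alone
Console.WriteLine(totalBytes / (1024 * 1024) + " MB");
// on top of that, List<T> doubles its backing array as it grows, so near 8M entries it briefly
// holds two large contiguous arrays (~16 MB + ~32 MB) - the kind of allocation that tends to
// fail first in a fragmented 32-bit address space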

In reply to Ben Cocker:

Luc Pattyn
#6

              Hi Ben,

              Ben Cocker wrote:

              I don't get an exception

              you may have misunderstood me. What I meant is your code (catch{}) just swallows exceptions if and when they occur, so it may seem you don't get any. Did you try a

...
catch (Exception exc)
{
    log(exc.ToString());
}
...

              where log is some logging method, a Console.WriteLine, a MessageBox.Show, whatever you fancy. Also, I'm not familiar with the Record class, is it standard .NET stuff or is it yours? any idea what the size of its objects is? I hope it isn't duplicating information, such as table metadata all over. BTW: "only 1.5GB" (from your other reply) is a phrase I never used so far :)

In reply to Ben Cocker:

Dave Kreskowiak
#7

                Ben Cocker wrote:

                SQL management studio the size of comma delimited output file is less than 1.5gig

So what? That's a file on disk that doesn't have the overhead of a storage class managing the data. There is no limit on the Reader, since it only reads a single record at a time and returns the fields in that record. The real question is: where is this data going? Is it going into an ever-expanding collection? If so, then you're running the machine out of memory.
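To make that concrete, a minimal sketch of consuming the reader row by row with no growing collection at all - here just streaming each row straight to a file. The column names, output path and ConnectionString are placeholders reused from the original post, not tested code:

// assumes: using System; using System.Data; using System.Data.SqlClient; using System.IO;
using (SqlConnection connection = new SqlConnection(ConnectionString))
using (SqlCommand command = new SqlCommand("StoredProcedure", connection))
using (StreamWriter writer = new StreamWriter("output.csv"))
{
    command.CommandType = CommandType.StoredProcedure;
    command.CommandTimeout = 0;              // no timeout for a long-running procedure
    connection.Open();
    using (SqlDataReader reader = command.ExecuteReader())
    {
        int par1 = reader.GetOrdinal("PAR1");
        int par2 = reader.GetOrdinal("PAR2");
        long count = 0;
        while (reader.Read())
        {
            // each row is written out and then forgotten - memory use stays constant
            writer.WriteLine(reader.GetInt32(par1) + "," + reader.GetInt32(par2));
            count++;
        }
        Console.WriteLine(count + " rows written");
    }
}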

In reply to Ben Cocker:

Luc Pattyn
#8

                  Hi Ben, adding a field to the query does not change the memory situation if that field eventually gets stored in a value type inside Record, such as an int or Int32, since such members get allocated memory whether you fill them or not. If adding some NEW members to the Record class still keeps it failing at record 726,301 then I might be inclined to say something is abnormal with that record (say a NULL value, so your GetInt32 fails). My odds are still very much in favor of O-O-M though. :)
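One quick way to tell those two apart: count the rows as you read and log whatever exception comes out instead of swallowing it. A rough sketch only, reusing the Reader and records names from the original code:

long rowCount = 0;
try
{
    while (Reader.Read())
    {
        rowCount++;
        Record NewRecord = new Record();
        NewRecord.PAR1 = Reader.GetInt32(Reader.GetOrdinal("PAR1"));
        NewRecord.PAR2 = Reader.GetInt32(Reader.GetOrdinal("PAR2"));
        // ... etc
        records.Add(NewRecord);
    }
}
catch (Exception exc)
{
    // a SqlNullValueException or InvalidCastException here points at bad data in that row;
    // an OutOfMemoryException points at the sheer size of the list
    Console.WriteLine("Failed at row " + rowCount + ": " + exc);
    throw;
}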

In reply to Luc Pattyn:

Jon Rista
#9

I have to agree with Luc. I think you're running into a problem on that particular record, it's throwing an exception, and the exception is getting swallowed. Stick a breakpoint in your catch clause and see what is going on.

In reply to Jon Rista:

Luc Pattyn
#10

                      ... and never ever swallow exceptions like that. :sigh:

In reply to Ben Cocker:

Pete OHanlon
#11

Ben - what could you possibly do with this many records? You certainly couldn't present this to a user - are you meant to be passing it to some other process (say, to store to a file)? I suspect you've got two problems: one, the record at position 726,301 is invalid in some way which you aren't catching in your code, and two, the number of records is too great. BTW - it's never a good idea to just consume exceptions. You should actually do something with the exception, even if it's only to log it. I normally use a variation of this function to parse the items (I'm typing this from memory so it may need a bit of tidying up):

public static T ParseValue<T>(this SqlDataReader reader, string fieldName)
{
    int ordinal = reader.GetOrdinal(fieldName);
    if (reader.IsDBNull(ordinal))
        return default(T);
    return (T)reader.GetValue(ordinal);
}

                        Then you use this as

NewRecord.PAR1 = Reader.ParseValue<int>("PAR1");

In reply to Luc Pattyn:

Jon Rista
#12

                          ....never ever.....ever.....ever. :suss:

In reply to Ben Cocker:

Dragonfly_Lee
#13

Well, maybe you are confusing the DataReader and the DataSet in ADO.NET. A DataReader only moves forward ONE record at a time. Generally, a single record will not take up much memory, right?

In reply to Jon Rista:

Ben Cocker
#14

Guys, I've been astonished by the help I've received with this problem - thank you very much everyone. I'm afraid to say that the problem was as stupid as me consuming the exception on a particular record. Best-practice point to self, and I'm really sorry for wasting people's time on such a simple thing. I normally go out of my way to Google for problems, but since 726,301 is not a magic number/ceiling (as it was just an exception) I was really scratching my head over this one without seeing the obvious. Anyway, thanks again for everyone's time, and for the suggestions. Ben
