Dynamic Arrays

Forum: C#
Tags: csharp, dotnet, data-structures, question
22 Posts, 6 Posters
Lost User wrote:

    You could use a List<BigInteger>. Or do what the List does: double the size every time you need more room. Or, since it seems that numbers_temp is an array, you are already done: just change numbers to point to numbers_temp and allocate a new numbers_temp array next time. But you didn't give much information, so I may not have answered your question properly.

    Bassam Abdul Baki wrote (#7):

    Thanks! I'll probably end up using the scaling. My list array increases by roughly a factor of three at each iteration.
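    The doubling strategy mentioned above can be sketched as follows. This is a minimal illustration, not production code, and the names (GrowableArray, Append) are made up. Because the capacity doubles, appending n items copies at most about 2n elements in total, so appends are amortized O(1):

    ```csharp
    using System;
    using System.Numerics;

    // Minimal sketch of the doubling strategy: grow the backing array
    // only when it is full, so appends are amortized O(1).
    class GrowableArray
    {
        private BigInteger[] _items = new BigInteger[4];
        private int _count;

        public void Append(BigInteger value)
        {
            if (_count == _items.Length)
            {
                // Double the capacity and copy once, as List<T> does internally.
                var bigger = new BigInteger[_items.Length * 2];
                Array.Copy(_items, bigger, _count);
                _items = bigger;
            }
            _items[_count++] = value;
        }

        public int Count => _count;
        public BigInteger this[int i] => _items[i];
    }
    ```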

    Bassam Abdul Baki wrote (#8):

      harold aptroot wrote:

      Or, since it seems that numbers_temp is an array, you are already done. Just change numbers to point to numbers_temp and allocate a new numbers_temp array next time.

      BTW, this is what I'm doing. I'm trying to get rid of this.

        Lost User wrote (#9):

        If you were doing that, you wouldn't need any CopyTo, and no Resize either.

          Bassam Abdul Baki wrote (#10):

          What? In my code? That's how the example I saw had it. Like I said, it's been a while since I've done .NET programming. But I think I'll switch to the list, since it seems better suited than this.

            Lost User wrote (#11):

            Like this:

            numbers_temp = new BigInteger[some number you apparently know];
            // fill the array..
            numbers = numbers_temp;
            // repeat until finished

              Bassam Abdul Baki wrote (#12):

              Yes, but I don't know what the size is in the loop, and I'm using Array.Resize to make it grow. That seems slow and inefficient. I would rather work with only one array, since all I'm doing with the temp one is using it to recreate the actual array when I create a new copy with a bigger size. The generic list would take care of all that, I hope.
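              Switching to the generic list looks roughly like this; the loop body and values are made up for illustration. List<BigInteger> grows its backing array itself (by doubling), so the numbers/numbers_temp pair, Array.Resize, and CopyTo all go away:

              ```csharp
              using System;
              using System.Collections.Generic;
              using System.Numerics;

              class Program
              {
                  static void Main()
                  {
                      // One collection replaces the numbers / numbers_temp pair:
                      // the list resizes its own backing array internally.
                      var numbers = new List<BigInteger>();

                      for (int i = 0; i < 10; i++)
                      {
                          numbers.Add(BigInteger.Pow(2, i)); // append: amortized O(1)
                      }

                      numbers.RemoveAt(0); // removal shifts later elements left: O(n)

                      Console.WriteLine(numbers.Count); // prints 9
                  }
              }
              ```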

                Lost User wrote (#13):

                Is the maximum number of elements that could be produced in the loop known in advance?

                  Bassam Abdul Baki wrote (#14):

                  Nope, but my estimate says it's bigger than a long. :)

                    Lost User wrote (#15):

                    Then I'd probably use a List; resizing an array manually is hardly worth the effort.

                    Bassam Abdul Baki wrote:

                      I haven't done .NET programming in a while so I'm not sure what the best approach is for this. I'm using the .NET Framework 4.0 and the BigInteger class. Before the end of my loop, I have:

                      Array.Resize(ref numbers, numbers_temp.Length);
                      numbers_temp.CopyTo(numbers, 0);

                      Is there a better approach than this? I have an array of BigInteger that I add and delete numbers from at various times in the loop. Towards the end of my loop, this array grows quite a bit. It seems such a waste to create another array to copy all members from one to the other and then destroy the older one at every loop iteration. Thanks!

                      JasonPSage wrote (#16):

                      In all the languages I've coded, there seems to be a general rule of thumb when comparing linked lists with arrays and dynamic arrays. Doubly linked lists are better at frequently adding and removing items, but searching and navigation are slower than navigating an array; an array is much faster to navigate, hands down, which makes searching, sorting, and so on just faster. Where dynamic arrays get slow is when adding an item exceeds the "chunk" and the array needs to make a full copy of itself in a new memory location with a new empty chunk. By chunk I mean the number of array elements that are allocated but not yet used; as you fill the array, you eventually hit this limit, and adding one more item causes this "thunking" I'm talking about. (Look up what thunking is; the definition kind of applies here.) How .NET implements its dynamic arrays under the hood I'm not certain, but I imagine it doesn't differ by much, as these are pretty standard data structures in any system I've used so far. The fact that .NET has both lists and arrays suggests this is probably spot on. It seems like everything in programming is a choice of the right tool for the job. Deciding can be tricky, especially when both approaches apply. Good luck!

                      Know way too many languages... master of none!
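                      The trade-off described above can be illustrated with the two .NET collections; the workload here is made up. LinkedList<T> gives O(1) insertion and removal at a node you already hold but no random access, while List<T> (a dynamic array) gives O(1) indexing and faster scans:

                      ```csharp
                      using System;
                      using System.Collections.Generic;
                      using System.Numerics;

                      class Program
                      {
                          static void Main()
                          {
                              var linked = new LinkedList<BigInteger>(); // doubly linked list
                              var array = new List<BigInteger>();        // dynamic array

                              for (int i = 0; i < 1000; i++)
                              {
                                  linked.AddLast(i);
                                  array.Add(i);
                              }

                              // O(1) removal at a node you already hold -- the linked list's strength.
                              linked.Remove(linked.First);

                              // O(1) random access -- the dynamic array's strength; a linked
                              // list would have to walk 500 nodes to reach this element.
                              BigInteger middle = array[array.Count / 2];

                              Console.WriteLine($"{linked.Count} {middle}"); // prints "999 500"
                          }
                      }
                      ```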

                        AspDotNetDev wrote (#17):

                        I created a data structure, SlimList, and wrote an article about it; read that. A SlimList is like a .NET List, except it avoids copying any elements on resize. A .NET List will double the size of the underlying array, then copy the items over from the old array to the new array that is twice the size. A SlimList will instead create an additional array that is the same size as the existing arrays combined, but will not copy any elements over. It does this by creating a "major" array to hold all the references to the "minor" arrays. All operations are on par with List with respect to big-O notation (e.g., an index operation takes O(1) time). However, as currently implemented, SlimList is a lot slower than List (about 600x slower in some cases). With a little assembly code and some other modifications (which I've outlined in the article), it could be made only about 2x slower than List. Probably not worth it, which is why I didn't implement the optimizations myself, but it's an interesting data structure that does overcome the array copy issue in current implementations of List. So I'd say look at SlimList to see what route you could take, but then use List, because that's going to be the most efficient route if speed is important.

                        Also, if you are going to create a very large list (as you indicated in one of your replies), I recommend creating lists of lists rather than a single list. The reason is that memory fragmentation and other limits may prevent you from creating a single contiguous chunk of memory above a certain size (say, 256MB, a limit I've run into before due to fragmentation). That also helps remedy the array copy problem: the sublists can always be created with a known capacity (e.g., 10,000,000 elements), so only the "major" list would grow, and it would not grow very often. You would then need only two index operations to access a BigInteger within your list of lists, and you wouldn't hit the memory issues.

                        [Forum Guidelines]
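                        The major/minor-array idea above can be sketched as follows. This is not the actual SlimList code: SlimList, as described, grows its minor arrays, while this sketch uses a fixed chunk size (an assumption) for brevity, and the names (ChunkedList, ChunkSize) are made up. The key property survives: element i is reached with two index operations, and when capacity runs out only the small major array is reallocated, so no element is ever copied.

                        ```csharp
                        using System;
                        using System.Numerics;

                        // Two-level list: a "major" array of fixed-size "minor" arrays.
                        class ChunkedList
                        {
                            private const int ChunkSize = 1024;
                            private BigInteger[][] _chunks = new BigInteger[4][]; // the "major" array
                            private int _count;

                            public void Add(BigInteger value)
                            {
                                int major = _count / ChunkSize;
                                int minor = _count % ChunkSize;
                                if (major == _chunks.Length)
                                {
                                    // Only chunk references are copied here, never the elements.
                                    Array.Resize(ref _chunks, _chunks.Length * 2);
                                }
                                if (_chunks[major] == null)
                                {
                                    _chunks[major] = new BigInteger[ChunkSize]; // a new "minor" array
                                }
                                _chunks[major][minor] = value;
                                _count++;
                            }

                            // Two index operations, O(1), just like a flat array.
                            public BigInteger this[int i] => _chunks[i / ChunkSize][i % ChunkSize];
                            public int Count => _count;
                        }
                        ```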

                          Bassam Abdul Baki wrote (#18):

                          All I care about is speed. I've already implemented the list and it is several orders of magnitude faster.

                            Bassam Abdul Baki wrote (#19):

                            aspdotnetdev wrote:

                            Also, if you are going to create a very large list (as you indicated in one of your replies), then I recommend creating lists of lists, rather than a single list. The reason is that memory fragmentation and other limits may prevent you from creating a single contiguous chunk of memory above a certain size (say, 256MB, a limit I've run into before due to fragmentation). That also helps to remedy the array copy problem. The sublists can always be created with a known capacity(e.g., 10,000,000 elements). Then it would only be the "major" list that would grow, and it would not grow very often. You would then only require two index operations to access a BigInteger within your list of lists and you wouldn't hit the memory issues.

                            I'm already over 4 million items in my list, and it increases by a factor of three at each loop. I'll have to think about this one, since I expect it to grow much, much higher than this. Thanks!

                              Bassam Abdul Baki wrote (#20):

                              aspdotnetdev wrote:

                              The reason is that memory fragmentation and other limits may prevent you from creating a single contiguous chunk of memory above a certain size (say, 256MB, a limit I've run into before due to fragmentation).

                              What would happen if you don't have a chunk large enough? Does it fail or wait?

                              aspdotnetdev wrote:

                              The sublists can always be created with a known capacity(e.g., 10,000,000 elements). Then it would only be the "major" list that would grow, and it would not grow very often.

                              Sadly, I need something much, much larger than this size for my array at its largest point in the loop. I think I'm reaching my limit much earlier than that.

                                AspDotNetDev wrote (#21):

                                Bassam Abdul-Baki wrote:

                                What would happen if you don't have a chunk large enough? Does it fail or wait?

                                You get an OutOfMemoryException.

                                Bassam Abdul-Baki wrote:

                                I need something much, much larger than this size

                                Then I think you misunderstood. If you have a "major" list that holds 10,000,000 lists that EACH hold 10,000,000 BigIntegers, that's a total of 100,000,000,000,000 BigIntegers. If each BigInteger is at least 10 bytes, that's about 1 petabyte (read: 1,000,000 gigabytes) of data. The largest commercial hard drives sold today are about 1,000th of that size, and even some 64-bit operating systems don't go that high in virtual memory.

                                Also, do you understand that you would treat this list of lists of BigIntegers as a single collection of BigIntegers? The advantage is that it is physically spread into small chunks across the RAM. And if you need more data (for some strange reason), you could create a third level of lists: a list of 10,000,000 lists of 10,000,000 lists of BigIntegers. There is probably not enough storage on the planet to hold all that data. In case you still aren't getting it, here is basically how the petabyte list would look:

                                List<List<BigInteger>> ints = new List<List<BigInteger>>();
                                for (int i = 0; i < 10000000; i++)
                                {
                                    List<BigInteger> minorInts = new List<BigInteger>(10000000);
                                    ints.Add(minorInts);
                                }

                                [Forum Guidelines]
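                                The "two index operations" mentioned above can be sketched like this. The Get helper is made up for illustration, and the sublist capacity is kept small (the post uses 10,000,000) so the example runs quickly:

                                ```csharp
                                using System;
                                using System.Collections.Generic;
                                using System.Numerics;

                                class Program
                                {
                                    const int SubSize = 1000; // sublist capacity (illustrative)

                                    // Treat the list of lists as one flat collection:
                                    // major index, then minor index.
                                    static BigInteger Get(List<List<BigInteger>> lists, long index)
                                        => lists[(int)(index / SubSize)][(int)(index % SubSize)];

                                    static void Main()
                                    {
                                        var lists = new List<List<BigInteger>>();
                                        for (long n = 0; n < 5000; n++)
                                        {
                                            // Start a new fixed-capacity sublist when the last one is full.
                                            if (n % SubSize == 0) lists.Add(new List<BigInteger>(SubSize));
                                            lists[lists.Count - 1].Add(n);
                                        }

                                        Console.WriteLine(Get(lists, 4321)); // prints 4321
                                    }
                                }
                                ```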

                                  Bassam Abdul Baki wrote (#22):

                                  Thanks! I'm not getting an out-of-memory exception yet, so I must be good. I might be able to modify my program to only use a BigInteger array of integers or strings (of a maximum length) if I make a few changes. I'm hoping that after a number of iterations the rate of removing numbers exceeds the rate of adding them, so the collection eventually shrinks to zero. We'll see. Even if it works, I may change it for the learning experience.
