Dynamic Arrays
aspdotnetdev wrote:
The reason is that memory fragmentation and other limits may prevent you from creating a single contiguous chunk of memory above a certain size (say, 256MB, a limit I've run into before due to fragmentation).
What would happen if you don't have a chunk large enough? Does it fail or wait?
aspdotnetdev wrote:
The sublists can always be created with a known capacity (e.g., 10,000,000 elements). Then only the "major" list would grow, and it would not grow very often.
Sadly, I need something much, much larger than this for my array at its largest point in the loop. I think I'm reaching my limit much earlier than that.
Bassam Abdul-Baki wrote:
What would happen if you don't have a chunk large enough? Does it fail or wait?
You get an OutOfMemoryException.
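For illustration, here is a minimal sketch of that failure mode (the allocation size is an assumption, chosen to sit just above the CLR's roughly 2 GB single-object limit; the exact threshold varies with process bitness and fragmentation):
using System;

class AllocationDemo
{
    static void Main()
    {
        try
        {
            // Request one contiguous block. int.MaxValue bytes is just
            // above what the CLR allows for a single array, so this
            // throws rather than blocking until memory frees up.
            byte[] huge = new byte[int.MaxValue];
            Console.WriteLine("Allocated {0:N0} bytes", huge.Length);
        }
        catch (OutOfMemoryException)
        {
            Console.WriteLine("Allocation failed immediately; nothing waits.");
        }
    }
}
The failure is immediate: the runtime does not queue or retry the request, so your own code has to catch the exception or avoid the oversized contiguous allocation in the first place.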
Bassam Abdul-Baki wrote:
I need something much, much larger than this size
Then I think you misunderstood. If you have a "major" list that holds 10,000,000 lists that EACH hold 10,000,000 BigIntegers, that is a total of 100,000,000,000,000 BigIntegers. If each BigInteger takes at least 10 bytes, that's about 1 petabyte (read: 1,000,000 gigabytes) of data. The largest commercial hard drives sold today are about 1/1,000th of that size, and even some 64-bit operating systems don't offer that much virtual memory.

Also, do you understand that you would treat this list of lists of BigIntegers as a single collection of BigIntegers? The advantage is that it is physically spread across RAM in small chunks. And if you needed even more capacity (for some strange reason), you could add a third level of lists, giving a list of 10,000,000 lists of 10,000,000 lists of BigIntegers; there is probably not enough storage on the planet to hold that much data.

In case you still aren't getting it, here is basically how the petabyte list would look:
// Build the "major" list; each sublist is created with its full
// capacity up front, so only the outer list ever has to grow.
List<List<BigInteger>> ints = new List<List<BigInteger>>();
for (int i = 0; i < 10000000; i++)
{
    List<BigInteger> minorInts = new List<BigInteger>(10000000);
    ints.Add(minorInts);
}
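To make the "single collection" point above concrete, here is a hedged sketch of the index arithmetic; the ChunkedBigIntList name, the ChunkSize value, and the indexer are illustrative assumptions, not anything from the posts above. Logical index i lives in chunk i / ChunkSize at offset i % ChunkSize:
using System.Collections.Generic;
using System.Numerics;

// Hypothetical wrapper that presents the list of lists as one
// logical collection of BigIntegers.
class ChunkedBigIntList
{
    const int ChunkSize = 10000000;
    private readonly List<List<BigInteger>> chunks = new List<List<BigInteger>>();

    public void Add(BigInteger value)
    {
        // Start a new fixed-capacity chunk when the last one is full,
        // so only small (per-chunk) contiguous allocations ever happen.
        if (chunks.Count == 0 || chunks[chunks.Count - 1].Count == ChunkSize)
            chunks.Add(new List<BigInteger>(ChunkSize));
        chunks[chunks.Count - 1].Add(value);
    }

    public BigInteger this[long i]
    {
        get { return chunks[(int)(i / ChunkSize)][(int)(i % ChunkSize)]; }
        set { chunks[(int)(i / ChunkSize)][(int)(i % ChunkSize)] = value; }
    }

    public long Count
    {
        // All chunks except the last are full by construction.
        get
        {
            return chunks.Count == 0
                ? 0
                : (long)(chunks.Count - 1) * ChunkSize + chunks[chunks.Count - 1].Count;
        }
    }
}
Callers can then write list[123456789] without knowing which sublist holds the element; only the individual chunks need to be contiguous in memory.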
Thanks! I'm not getting an OutOfMemoryException yet, so I must be good. I might be able to modify my program to use only a BigInteger array of integers or strings (of a maximum length) if I make a few changes. I'm hoping that after enough iterations the rate of removing numbers exceeds the rate of adding them, so the program shrinks, eventually down to zero. We'll see. Even if it works, I may change it for the learning experience.