
Memory leak trouble with Linq to Sql and multiple threads

Luc Pattyn wrote:

	OK, the change to usings is no problem; once it works it should be fine. Yes, if anything is still the matter, I will want access to the code, of which you will tell us the results later on. This will be my last message for today. CU tomorrow. :)

	Luc Pattyn [My Articles] Nil Volentibus Arduum

JD86 (#27):

Luc, sorry for the delay. Yesterday I discovered another issue in the code: the Quartz library I'm using would let the job fire multiple times, because it didn't stop the timer while the job was executing and restart it after the job was done. After fixing that, I discovered the memory leak still exists, and I'm trying your suggestions now. HOWEVER, the memory leak is MUCH slower now because the jobs are not occurring so often. From yesterday morning to now, the memory has grown to 1.5GB.
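For reference, a minimal sketch of one way to prevent overlapping firings in Quartz.NET: decorating the job class with [DisallowConcurrentExecution] makes the scheduler refuse to start a second instance of the job while one is still executing (the job class name here is hypothetical):

using Quartz;

// Quartz.NET will not fire this job again while a previous
// instance of it is still running.
[DisallowConcurrentExecution]
public class SyncMailboxesJob : IJob  // hypothetical name
{
	public void Execute(IJobExecutionContext context)
	{
		// long-running work; overlapping runs can no longer occur
	}
}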


JD86 (#28):

OK, the code is updated and running now. I changed the triggers that were every 60 minutes to every 30 minutes, to hopefully reproduce it quicker. Updated code is here: KnowMoreIT/CloudPanel-Service · GitHub[^]


JD86 (#29):

OK, it is running with the new code and is currently at 677MB. Just a few minutes ago it went up to 1.5GB and back down... however, it is slowly climbing.


Luc Pattyn (#30):

Hi,

1. Once again I must ask you to be more specific; "slowly climbing" doesn't offer much info.

2. Changing the code and changing run-time parameters too much makes it extremely hard, if not impossible, to understand what sporadic measurement results actually mean.

3. I won't be in tomorrow.

4. Today I have been, and still am, researching the behavior of the Large Object Heap, and will probably write a CP article about it; if so, that will take a few weeks though. In summary, while there may be a way to make it perform as required (with a lot of conditions), I am still inclined to avoid large objects as much as possible. And hence:

5. I developed a little class that keeps your huge message lists in the regular heap, avoiding the LOH fragmentation you are currently bound to get. It does not offer an initial capacity; I don't think that would be very useful. It has been tested with foreach and with LINQ. Here it is:

using System;
using System.Collections;
using System.Collections.Generic;

namespace LargeObjectsHeapTester {
	//
	// This class collects a sequence of items of type T (value or reference type)
	// and keeps them in the order they get added.
	// A list-of-lists technique is used to keep everything in the regular heap,
	// i.e. this class should not challenge the large object heap unless
	// the number of items collected gets really big (above 100 million).
	// To control what is inside the collection:
	// - use Add to add an item at the end;
	// - use Clear to clear the collection.
	// To obtain information about its content:
	// - use the Count property;
	// - use the IEnumerable<T> interface (or the old-fashioned IEnumerable if you must)
	// (both foreach and LINQ rely on the IEnumerable<T> interface when present)
	//
	// Note: value types such as structs should be small (max 8B) otherwise
	// the level1 lists may not fit in the regular heap.
	public class ListOfLists<T> : IEnumerable<T>, IEnumerable {
		public const int MAX_ITEMS_IN_LEVEL1 = 10000;
		//public static ILog Logger;
		private List<List<T>> level0;
		private List<T> currentLevel1;
		private int count;

		// constructor
		public ListOfLists() {
			Clear();
		}

		// logging utility
		private void log(string msg) {
			//if(Logger!=null) Logger.log(msg);
		}

		// empty the collection
		public void Clear() {
			level0 = new List<List<T>>();
			currentLevel1 = new List<T>();
			level0.Add(currentLevel1);
			count = 0;
		}

		// add an item at the end of the collection
		// (the original post was cut off at this point; the remainder below is
		// reconstructed from the class description in the header comment)
		public void Add(T item) {
			if (currentLevel1.Count >= MAX_ITEMS_IN_LEVEL1) {
				currentLevel1 = new List<T>(MAX_ITEMS_IN_LEVEL1);
				level0.Add(currentLevel1);
			}
			currentLevel1.Add(item);
			count++;
		}

		// number of items currently in the collection
		public int Count {
			get { return count; }
		}

		// enumerate all items, one level1 list at a time
		public IEnumerator<T> GetEnumerator() {
			foreach (List<T> level1 in level0)
				foreach (T item in level1)
					yield return item;
		}

		IEnumerator IEnumerable.GetEnumerator() {
			return GetEnumerator();
		}
	}
}
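For illustration, a minimal usage sketch of the class above (as reconstructed): items are spread over many small level1 lists, and both foreach and LINQ consume them through IEnumerable<T>, so no single backing array ever approaches LOH size.

using System;
using System.Linq;
using LargeObjectsHeapTester;

class ListOfListsDemo {
	static void Main() {
		ListOfLists<int> items = new ListOfLists<int>();
		for (int i = 0; i < 1000000; i++)
			items.Add(i);                        // fills many small level1 lists
		Console.WriteLine(items.Count);          // 1000000
		long evenSum = items.Where(n => (n & 1) == 0)
		                    .Sum(n => (long)n);  // LINQ via IEnumerable<int>
		Console.WriteLine(evenSum);
	}
}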
          
JD86 (#31):

Hi Luc! I think after reading your questions/answers and researching, I have a much better understanding of what is going on. Basically, when a List holds more than roughly 10,000 object references, its backing array crosses the 85,000-byte threshold and goes into the LOH. From what I understand, the garbage collector doesn't normally "compact" these, as it expects to reuse the space? I also read that in 4.5.1 they introduced LargeObjectHeapCompactionMode[^], which gives you the ability to tell it TO compact once; then it resets to the default, which doesn't compact the LOH. What you posted creates a list of lists (as named) and keeps each list under 10,000 items, so it never enters the LOH. I'm testing your class right now and recompiling.

Edit (sorry, forgot to answer your questions):

a) How many users satisfy x.MailboxPlan > 0? Right now it is under 5,000, but technically it could be well over 10,000 one day.

b) What would be a good upper limit for the Count of allMailboxes? Same thing: right now under 5,000, but it could be over 10,000 one day.

c) How many users would there typically be in a.Users (sent or received)? This is how many messages a single user has sent. Most of the time it will be under 1,000, but if someone spams it could be larger. The list of ALL users for sent messages is well over 10,000. I can put in some logs to get the exact data.


Luc Pattyn (#32):

1. Yes, your first paragraph is correct.

2. I have almost zero experience with the improved LOH GC in .NET 4.5/4.5.1/4.5.2/4.6 (yes, it came incrementally!). I've read it all and I'm working on some experiments; however, I do not fully trust it because of the potential side effects. I'd rather avoid the fragmentation if there happens to be a reasonable way to do so. Keep in mind an LOH compaction is a potentially huge operation, rumored to take maybe ten seconds or so, during which your app probably doesn't respond to anything.

3. Once you get an OutOfMemoryException, you're stuck, unless you put try-catch AND retry logic everywhere! As an OOM exception could occur in many places, you would have to (a) either include a lot of GCSettings.LargeObjectHeapCompactionMode=...Once statements, (b) or trust that setting it once at the start of some/all intervals would suffice; but that assumption might be hard to prove correct. Anyway, I'd be more interested in the newer forms of GC.Collect, see here[^], but that requires 4.6. You could also set LargeObjectHeapCompactionMode to once AND call GC.Collect(2) to force an LOH collect right away; that should work on 4.5.1. BTW: I've seen an article suggesting a timer that periodically causes an LOH compaction, but that sounds horrible; it would not synchronize to your app at all.

4. I'm inclined to recommend you eventually switch to a 64-bit app if that is at all possible. Yes, pointers become twice as large (so lists become LOH candidates sooner), but the usable virtual address space theoretically grows from 2 GB (maybe 3 or 4) up to 8 TB (other limits may apply; the Windows memory system is pretty complex!). A 64-bit app needs a CPU that supports x64, a Windows OS version that does the same (check under MyComputer/Properties; not sure how virtual machines handle it), and a .NET app that is built for "Any CPU", an option in Build/ConfigurationManager (no problem, unless you are referencing libraries/DLLs that are built explicitly for 32-bit).

5. Warning: every little step you take to solve a problem like this may make it less probable, so it becomes harder to detect whether anything more needs to be done. It is essential you find and use a way to "stress test" your code, maybe by shortening intervals, entering duplicates of actual list elements, etc.

6. Finally, depending on ho…
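For reference, a minimal sketch of the compact-once approach from point 3, assuming .NET 4.5.1 or later; the CompactOnce setting applies to the next full blocking collection and then reverts to Default on its own:

using System;
using System.Runtime;

static class LohCompactor {
	// Request a one-shot LOH compaction and trigger it immediately.
	public static void CompactLargeObjectHeapOnce() {
		GCSettings.LargeObjectHeapCompactionMode =
			GCLargeObjectHeapCompactionMode.CompactOnce;
		GC.Collect(2);  // full blocking collection, including the LOH
	}
}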


JD86 (#33):

I have no issues moving it to x64, except it seems like it would use way more memory than it should even if I switched, right? I have shortened the intervals (just an XML file) to every 30 minutes; I'm querying all of this from Exchange, so I don't want to run it too often. I may need to just create some fake data, seed it into a list, and bypass Exchange for testing. This is a Windows Service... a restart would technically help, but I don't like that idea, lol. I'm testing your ListOfLists class right now. Worst case, I could combine my PowerShell command SQL insert statements so I'm not passing around Lists. However, the PowerShell command does return an ICollection of PSObjects, so most likely we would end up in the same situation?


Luc Pattyn (#34):

1. The C# code remains the same; after JIT compilation your code is somewhat larger, but this won't be relevant. IIRC the smallest object grows from 32 to 48B when switching to x64, and obviously every reference grows from 4 to 8B. So memory usage is bound to be less than twice the original, and typically much less than that, as value types, text, etc. don't grow at all.

2. A piece of code that returns a collection and doesn't care about fragmentation issues probably is based on an array; so is the ToList() you're using in LINQ. An array (or an array-based class) is just the easiest to produce and consume. There are alternatives, such as streaming (produce while consuming, never have it all in memory; that is still likely to contain an array internally, just a smaller one), or simply asking for less data at once (you could keep start and end datetimes closer to each other, possibly in a loop, inside ExchActions.Get_TotalSentMessages()). BTW: my ListOfLists class is an IEnumerable, i.e. to the consumer it only shows one element at a time. That is extreme streaming! Fortunately the compiler converts "yield return" statements into all the code required to keep track of where the consumer is currently fetching the data.

3. What the PowerShell interface gives you is an ICollection, which is slightly more than an IEnumerable (e.g. it has a Count property and an Add method). From this one can only gamble how it is implemented, or look at it using Reflector or some other tool.

4. I'm not sure you need ToList() in your LINQ statements. Did you try without? Select() returns an IEnumerable, and I expect that is all you need. I don't know how Select is implemented; I would hope it works in some kind of streaming mode (IEnumerable in, IEnumerable out, produce while being consumed). If this holds true, that is a number of potentially big objects you no longer need. You would have to declare differently, and count yourself (while at it, I also summed the bytes!):

IEnumerable<MessageTrackingLog> totalSentLogs = /* LINQ statement without .ToList() */;
int sentCount = 0;
int sentBytes = 0;
foreach (MessageTrackingLog item in totalSentLogs) {
	sentCount++;
	sentBytes += item.TotalBytes;
}

which is much cheaper than having the CLR hand you a List first and then using LINQ to process it. :)

                  Luc Pattyn [My Articles] Nil Volentibus Arduum
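For illustration, a minimal sketch of the streaming idea from point 2: a producer built on yield return hands the consumer one element at a time, so the full result set never has to exist as a single array (the method here is hypothetical):

using System;
using System.Collections.Generic;

static class StreamingSketch {
	// Hypothetical producer: instead of materializing all values into one
	// (potentially LOH-sized) list, the compiler-generated state machine
	// produces them while the consumer iterates.
	static IEnumerable<int> ProduceValues(int total) {
		for (int i = 0; i < total; i++)
			yield return i;
	}

	static void Main() {
		long sum = 0;
		foreach (int v in ProduceValues(1000000))
			sum += v;  // only one element is "in flight" at any moment
		Console.WriteLine(sum);
	}
}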


JD86 (#35):

Just wanted to update you that I've tried a couple of different things. The first was completely disabling all Exchange actions (message tracking logs, mailbox sizes, etc.). Now it basically just processes the Active Directory options, which have a total of 4,533 items that can be put in a list. What I am finding is that the memory usage is still up to 1GB now, even with all the tasks disabled, and growing. I've had this service working without memory issues in the past. I completely rewrote it, changing from Entity Framework to Linq to SQL, because I didn't want to worry about the "context" being different; my goal was to make the scheduler version last across multiple versions of the primary application. I'm really starting to wonder if it's Linq to SQL, because nothing should be over 5,000 items in a list now after disabling those other options. I may try switching to SqlConnection and SqlCommand for a test. (BTW, I updated my code if you want to check it out again in its current state.)
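For reference, a minimal sketch of the SqlConnection/SqlCommand test mentioned above, with hypothetical connection string, table, and column names; the using blocks dispose the connection and command deterministically, so nothing lingers between scheduler runs:

using System.Data.SqlClient;

static class RawSqlTest {
	static void InsertMailboxCount(string connectionString, int mailboxCount) {
		using (SqlConnection conn = new SqlConnection(connectionString))
		using (SqlCommand cmd = new SqlCommand(
				"INSERT INTO MailboxStats (MailboxCount) VALUES (@count)", conn)) {
			cmd.Parameters.AddWithValue("@count", mailboxCount);
			conn.Open();
			cmd.ExecuteNonQuery();
		}  // both objects disposed here, even if an exception is thrown
	}
}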


JD86 (#36):

Luc, I've been running ANTS Memory Profiler 8.8 on the service. The Large Object Heap size is actually only about 40MB according to this profiler; it shows "Private Bytes" and "Working Set - Private" as the counters holding all the memory. When I took a snapshot, this is what it reported:

- Generation 1: 0 bytes
- Generation 2: 1.105MB
- Large Object Heap: 2.645MB
- Unused memory allocated to .NET: 108.6MB
- Unmanaged: 618.6MB

It also shows this for the class list (live size, in bytes):

- ConditionalWeakTable+Entry<Object, PSMemberInfoInternalCollection>: 8,073,936
- Int32[]: 2,726,312
- string: 337,040
- AdsValueHelper: 151,956

and it just goes down from there. It also shows that "string" has 4,782 live instances and "AdsValueHelper" has 4,221 live instances.


Luc Pattyn (#37):

1. I have no idea what all that means.

2. I'm not a PowerShell user, nor will I become one any time soon. I have been reading up on it a bit, and seem to have hit on two reasons for it to leak memory: one of the first results googling "C# powershell memory leak"[^], and leaky PowerShell scripts[^].

3. If I were to expect lots of output from something like PowerShell, and having seen the number of questions and complaints on it after a one-minute Google, I would opt for a file interface: launch it with Process.Start() and have it create a file, hence avoiding most potential trouble.

4. I recommend you reduce your program to a fraction of the intended functionality, make the memory consumption numbers very visible, and work on it till your "climbing slowly" is completely gone. Then iteratively add code and functionality, keeping a sharp eye on the memory situation at all times. :)
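A minimal sketch of the file-interface idea from point 3, with a hypothetical script path and output file: the PowerShell work runs in a separate process, so every byte it allocates is returned to the OS when that process exits.

using System.Diagnostics;
using System.IO;

static class PowerShellViaFile {
	static string[] RunScript(string scriptPath, string outputFile) {
		ProcessStartInfo psi = new ProcessStartInfo {
			FileName = "powershell.exe",
			// the (hypothetical) script writes its results to outputFile
			Arguments = "-NoProfile -File \"" + scriptPath + "\" \"" + outputFile + "\"",
			UseShellExecute = false,
			CreateNoWindow = true
		};
		using (Process p = Process.Start(psi)) {
			p.WaitForExit();
		}
		// the PowerShell memory is gone with the process; just read the file
		return File.ReadAllLines(outputFile);
	}
}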

                        Luc Pattyn [My Articles] Nil Volentibus Arduum
