Memory consumption problem
-
Oops, I guess I found the problem... I'm dividing files into chunks of 1024 bytes. The code does something with each chunk and then fires an event which signals that the chunk is complete. The event handler then calls a method to start processing the next chunk. This created a nasty piece of recursive code which consumed a lot of memory. I solved this and memory usage is now more stable. I will, however, run a profiler later to check for (and find) additional issues. For now everything seems fine! Thanks for the help, appreciate it!!
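The fix described above — driving the chunks from a loop instead of from the completion handler — can be sketched roughly like this (Java for illustration, since the thread itself is .NET; all names are hypothetical):

```java
// Iterative chunking: each 1024-byte chunk is handled in a loop, so no
// chain of "handler starts next chunk" frames builds up.
public class ChunkProcessor {
    static final int CHUNK_SIZE = 1024;

    public static int processIteratively(byte[] file) {
        int chunksDone = 0;
        for (int offset = 0; offset < file.length; offset += CHUNK_SIZE) {
            int len = Math.min(CHUNK_SIZE, file.length - offset);
            // ... do something with file[offset .. offset + len) ...
            chunksDone++; // the "chunk complete" notification would fire here
        }
        return chunksDone;
    }
}
```

With the loop, each chunk's state becomes unreachable before the next chunk starts, which is exactly what the recursive version prevented.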
.: I love it when a plan comes together :. http://www.zonderpunt.nl
Hello, I followed the discussion and have a wild guess now!
Eduard Keilholz wrote:
The code does something with each chunk and then fires an event which tells the chunk is complete. The event handler then calls a method to start processing the next chunk.
If you are referencing this chunk from an object which also holds the event... and if you register (subscribe to) this event for every chunk from your "managing" class and never unregister (unsubscribe) after it completes... your "managing" class would hold references to all the objects which hold the references to the chunks, and therefore the GC would not be able to free your objects. Just a wild guess! Hope you find it. P.S.: I'm sure with this[^] memory profiler you will find it quickly!
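A minimal sketch of that leak pattern (Java for illustration; the thread is .NET and all names here are made up): subscriptions that are never removed keep every chunk reachable.

```java
import java.util.ArrayList;
import java.util.List;

// The event's handler list is the hidden reference chain: as long as a
// handler stays subscribed, the chunk it captures cannot be collected.
class ChunkCompleteEvent {
    final List<Runnable> handlers = new ArrayList<>();
}

class ManagingClass {
    final ChunkCompleteEvent chunkComplete = new ChunkCompleteEvent();

    void startChunk(byte[] chunk) {
        Runnable handler = () -> { int length = chunk.length; /* start next chunk */ };
        chunkComplete.handlers.add(handler); // subscribe per chunk...
        // ...but never chunkComplete.handlers.remove(handler): the list
        // grows by one captured chunk per call, and the GC can free none of them.
    }
}
```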
All the best, Martin
modified on Thursday, January 21, 2010 5:46 AM
-
Well, in that case, check whether you are holding references to any of the huge objects you created. Run a memory profiler and look at the results; that should tell you what consumes that much memory while the service is idle. Also check whether you use static objects. AFAIK they exist throughout the lifetime of your service.
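The static-object caveat takes only a couple of lines to demonstrate (Java for illustration; the class and field names are made up):

```java
import java.util.ArrayList;
import java.util.List;

// A static field is reachable for the whole lifetime of the process (for
// a service, essentially forever): everything it references survives
// every GC pass until it is removed explicitly.
class ServiceCache {
    static final List<byte[]> HELD = new ArrayList<>();

    static void remember(byte[] buffer) { HELD.add(buffer); }
    static void release() { HELD.clear(); } // only now can the buffers be collected
}
```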
50-50-90 rule: Anytime I have a 50-50 chance of getting something right, there's a 90% probability I'll get it wrong...!!
I would probably agree here. The objects you have are probably still referenced within the main application, either through events (the little devils) or member variables. Setting these variables to null, rather than just dropping the reference to the object, may allow the garbage collector to work its magic, or you can try calling it directly to see if it's just being lazy. Event disposal: How to: Subscribe to and Unsubscribe from Events. Just noticed that Covean has posted about the GC; that should be useful.
modified on Thursday, January 21, 2010 5:50 AM
-
Hello, I followed the discussion and have a wild guess now!
Eduard Keilholz wrote:
The code does something with each chunk and then fires an event which tells the chunk is complete. The event handler then calls a method to start processing the next chunk.
If you are referencing this chunk from an object which also holds the event... and if you register (subscribe to) this event for every chunk from your "managing" class and never unregister (unsubscribe) after it completes... your "managing" class would hold references to all the objects which hold the references to the chunks, and therefore the GC would not be able to free your objects. Just a wild guess! Hope you find it. P.S.: I'm sure with this[^] memory profiler you will find it quickly!
Hey Martin, thanks for your reply. This is what happens: a file becomes 'active'. As soon as a file becomes active, the 'managing' class registers for three events (Completed, ProcessChanged and ErrorOccured). As soon as ProcessChanged is called, the 'active' file should start handling the next file part. As soon as Completed or ErrorOccured is fired, the 'managing' class unregisters from the active file's events and a new file becomes 'active'. This process starts over and over again until all files are completely processed. Now, ProcessChanged immediately called the 'next file part' method, which caused a recursive bunch of code. This caused the extremely large memory consumption, since some files are fairly large. I changed this so the recursion doesn't take place anymore. My Windows Service now claims about 27 Mb tops, instead of the previous 500 Mb. I'll be running the memory profiler soon. Thanks for the interest!
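The lifecycle described here — register while a file is active, drive the parts from a loop rather than from inside the progress handler, unregister on completion — might look roughly like this (a Java sketch with hypothetical names; the real code is .NET):

```java
import java.util.ArrayList;
import java.util.List;

class ActiveFile {
    final List<Runnable> completedHandlers = new ArrayList<>();
    final int parts;
    ActiveFile(int parts) { this.parts = parts; }
}

class FileManager {
    int partsProcessed = 0;

    void run(ActiveFile file) {
        Runnable onCompleted = () -> { /* pick the next file here */ };
        file.completedHandlers.add(onCompleted);    // register while active
        for (int p = 0; p < file.parts; p++) {      // a loop, not recursion:
            partsProcessed++;                       // each part unwinds before the next
        }
        for (Runnable h : file.completedHandlers) h.run();
        file.completedHandlers.remove(onCompleted); // unregister when done
    }
}
```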
.: I love it when a plan comes together :. http://www.zonderpunt.nl
-
Hey guys, I'm developing an application (a Windows Service, in fact) which occasionally creates a large number of objects. It's reasonable that the service then consumes a lot of memory (up to 500 Mb). However, when the service is done processing all objects (which are disposed after processing), the service still claims the full 500 Mb. I have spent an entire day looking for a memory leak in the code, but that does not seem to be the problem. Can anybody explain to me (or point me to a good, reliable document that explains) when memory is 'released' after a process has claimed it? I don't want to mess with the garbage collector, because the GC should figure out when to clean things up. Thanks!
.: I love it when a plan comes together :. http://www.zonderpunt.nl
Before you spend a lot of time, just perform a simple experiment:
- run the app, don't let it exit
- watch the working set in Task Manager
- minimize the main form
- restore the main form
- watch the working set again in Task Manager

If it went way down, then nothing is wrong; the memory IS available for use by other processes, and minimizing makes Windows reclaim memory from an app. The effect will be most noticeable on recent Windows versions (Vista, 7). If not, you are holding on to something, or have a real leak. :)
Luc Pattyn [Forum Guidelines] [Why QA sucks] [My Articles]
I only read code that is properly formatted, adding PRE tags is the easiest way to obtain that.
[The QA section does it automatically now, I hope we soon get it on regular forums as well]
-
Hey Martin, thanks for your reply. This is what happens: a file becomes 'active'. As soon as a file becomes active, the 'managing' class registers for three events (Completed, ProcessChanged and ErrorOccured). As soon as ProcessChanged is called, the 'active' file should start handling the next file part. As soon as Completed or ErrorOccured is fired, the 'managing' class unregisters from the active file's events and a new file becomes 'active'. This process starts over and over again until all files are completely processed. Now, ProcessChanged immediately called the 'next file part' method, which caused a recursive bunch of code. This caused the extremely large memory consumption, since some files are fairly large. I changed this so the recursion doesn't take place anymore. My Windows Service now claims about 27 Mb tops, instead of the previous 500 Mb. I'll be running the memory profiler soon. Thanks for the interest!
Hello, if you really unregister all the events (and I'm sure you do), I think you do not have a memory leak at all. It's just, as you found out, that you were holding on to a lot of your chunk objects because of the recursive calls. This forces the GC to move the objects into a higher generation. Once your action is done and no more new objects have to be created, the GC sees no need to clean up the generations. I would assume that if you started that action again (in the first implemented version), the 500 Mb would not be exceeded, unless you have forgotten to unregister the events or are holding the objects in a collection, for example! Here[^] is an article which explains how the GC moves objects through the generations!
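Martin's generational point can also be observed programmatically. In Java (used here for illustration; the thread is .NET, where the mechanism is analogous), the collectors are split by generation and report separate counts, showing that surviving objects are promoted rather than eagerly reclaimed:

```java
import java.lang.management.GarbageCollectorMXBean;
import java.lang.management.ManagementFactory;

public class GcReport {
    public static void main(String[] args) {
        // Typically one young-generation and one old-generation collector.
        for (GarbageCollectorMXBean gc : ManagementFactory.getGarbageCollectorMXBeans()) {
            System.out.println(gc.getName() + ": " + gc.getCollectionCount() + " collections");
        }
    }
}
```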
All the best, Martin
-
Before you spend a lot of time, just perform a simple experiment:
- run the app, don't let it exit
- watch the working set in Task Manager
- minimize the main form
- restore the main form
- watch the working set again in Task Manager

If it went way down, then nothing is wrong; the memory IS available for use by other processes, and minimizing makes Windows reclaim memory from an app. The effect will be most noticeable on recent Windows versions (Vista, 7). If not, you are holding on to something, or have a real leak. :)
-
Hey Martin, thanks for your reply. This is what happens: a file becomes 'active'. As soon as a file becomes active, the 'managing' class registers for three events (Completed, ProcessChanged and ErrorOccured). As soon as ProcessChanged is called, the 'active' file should start handling the next file part. As soon as Completed or ErrorOccured is fired, the 'managing' class unregisters from the active file's events and a new file becomes 'active'. This process starts over and over again until all files are completely processed. Now, ProcessChanged immediately called the 'next file part' method, which caused a recursive bunch of code. This caused the extremely large memory consumption, since some files are fairly large. I changed this so the recursion doesn't take place anymore. My Windows Service now claims about 27 Mb tops, instead of the previous 500 Mb. I'll be running the memory profiler soon. Thanks for the interest!
Also, keep in mind that if you're looking at Task Manager to tell you how much memory your app is using, it's lying to you. You're seeing how much memory is RESERVED by the .NET CLR for your app, NOT how much your app is actually using. The .NET CLR keeps a managed memory pool that your objects are allocated from. If you free an object, the memory goes back into the pool for future use. It is NOT returned to Windows! The .NET CLR will return memory to Windows if Windows wants it back. Otherwise, it'll keep the memory in the managed pool.
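The "reserved vs actually used" distinction Dave makes is easy to see from inside the process. A Java analog (shown for illustration; the .NET CLR keeps the same kind of managed pool):

```java
public class PoolVsUsed {
    public static void main(String[] args) {
        Runtime rt = Runtime.getRuntime();
        long reserved = rt.totalMemory();        // what Task Manager-style tools see
        long used = reserved - rt.freeMemory();  // what live objects actually occupy
        long pooled = reserved - used;           // freed memory sitting in the pool
        System.out.println("reserved=" + reserved + " used=" + used + " pooled=" + pooled);
    }
}
```

The gap between the two numbers is exactly the memory that has gone back to the runtime's pool without being returned to the OS.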
A guide to posting questions on CodeProject[^]
Dave Kreskowiak Microsoft MVP Visual Developer - Visual Basic
2006, 2007, 2008
But no longer in 2009...
-
Wouldn't it be better if he used some kind of memory profiler? That would, IMO, give more specific results.
50-50-90 rule: Anytime I have a 50-50 chance of getting something right, there's a 90% probability I'll get it wrong...!!
I haven't used memory profilers yet; I include what I deem necessary in my own code. So I can't advise on them, as I don't know how intelligible their output would be for a novice. And that is why I prefer a simple experiment over installing yet another tool. :)
Luc Pattyn [Forum Guidelines] [Why QA sucks] [My Articles]
I only read code that is properly formatted, adding PRE tags is the easiest way to obtain that.
[The QA section does it automatically now, I hope we soon get it on regular forums as well]
-
I haven't used memory profilers yet; I include what I deem necessary in my own code. So I can't advise on them, as I don't know how intelligible their output would be for a novice. And that is why I prefer a simple experiment over installing yet another tool. :)
I haven't used a profiler either. This morning I downloaded one to find the cause of my problem. The profiler, however, gives me loads and loads of information which I don't know how to 'read'. I tried your tip and yes, the amount of used memory dramatically decreases when minimized. That is, however, with my test application, which is a Windows Forms app. The actual app is a Windows Service, and since that has no GUI I cannot minimize it ;) Thanks for the tip!
.: I love it when a plan comes together :. http://www.zonderpunt.nl
-
I haven't used a profiler either. This morning I downloaded one to find the cause of my problem. The profiler, however, gives me loads and loads of information which I don't know how to 'read'. I tried your tip and yes, the amount of used memory dramatically decreases when minimized. That is, however, with my test application, which is a Windows Forms app. The actual app is a Windows Service, and since that has no GUI I cannot minimize it ;) Thanks for the tip!
yes, "loads of information a novice won't understand" often is the outcome of a tool, even a good one. your app (as a WinForms app) going down to acceptable and repeatable levels should tell you all is well. :)
Luc Pattyn [Forum Guidelines] [Why QA sucks] [My Articles]
I only read code that is properly formatted, adding PRE tags is the easiest way to obtain that.
[The QA section does it automatically now, I hope we soon get it on regular forums as well]
-
Hello, if you really unregister all the events (and I'm sure you do), I think you do not have a memory leak at all. It's just, as you found out, that you were holding on to a lot of your chunk objects because of the recursive calls. This forces the GC to move the objects into a higher generation. Once your action is done and no more new objects have to be created, the GC sees no need to clean up the generations. I would assume that if you started that action again (in the first implemented version), the 500 Mb would not be exceeded, unless you have forgotten to unregister the events or are holding the objects in a collection, for example! Here[^] is an article which explains how the GC moves objects through the generations!
The Windows Service now starts out claiming about 26 Mb of RAM. If I make the service work (really work) it grows to about 28 Mb and then stabilizes (did I write that correctly?). It seems that my problem WAS the recursion, and not a memory leak. Pretty happy after a day of stress testing; thanks for the help! :thumbsup:
.: I love it when a plan comes together :. http://www.zonderpunt.nl
-
Also, keep in mind that if you're looking at Task Manager to tell you how much memory your app is using, it's lying to you. You're seeing how much memory is RESERVED by the .NET CLR for your app, NOT how much your app is actually using. The .NET CLR keeps a managed memory pool that your objects are allocated from. If you free an object, the memory goes back into the pool for future use. It is NOT returned to Windows! The .NET CLR will return memory to Windows if Windows wants it back. Otherwise, it'll keep the memory in the managed pool.
Erhm, actually... that WAS the tool I was using to monitor the memory consumption. I'll try some performance counters; maybe I can find a more accurate result there. Thanks for sharing your view! :thumbsup:
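For a reading more accurate than Task Manager, the runtime can also be asked directly from inside the process. A Java analog (for illustration; conceptually similar to reading the ".NET CLR Memory" performance counters):

```java
import java.lang.management.ManagementFactory;
import java.lang.management.MemoryUsage;

public class HeapReport {
    public static void main(String[] args) {
        MemoryUsage heap = ManagementFactory.getMemoryMXBean().getHeapMemoryUsage();
        System.out.println("used:      " + heap.getUsed());      // live objects + not-yet-collected garbage
        System.out.println("committed: " + heap.getCommitted()); // reserved from the OS
    }
}
```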
.: I love it when a plan comes together :. http://www.zonderpunt.nl