Looking for ideas on how to release memory right before a generation 3 garbage collection [modified]

Posted in C# · Tags: visual-studio, question, csharp, dotnet, com · 21 Posts · 9 Posters
  • GWBas1c wrote:

    Like I said, I'm deploying on Mono + Linux, so I can't use COM or CLR-specific APIs.

    Martin 0 replied, #6:

    ok, sorry!

    All the best, Martin

  • GWBas1c wrote:

    This is a question for GC experts out there. [Please don't respond telling me to use Dispose; it's not what I'm looking for.]

    I'm currently looking for ideas on how to release memory right before a generation 3 garbage collection. Specifically, I have a bunch of objects that all hold the contents of corresponding files on disk. I'd like to hold as many objects as possible in RAM and only release them as more RAM is needed.

    The naive approach is to use a Dictionary of WeakReferences. This is what I'm currently doing, but the problem is that some objects never get out of generation 0, and thus are collected and then re-loaded a few seconds later. So are there any ideas with regard to getting a good estimate of when a garbage collection is coming, so I can move unused objects into WeakReferences?

    Some constraints:
    1 - My deployment scenario is Mono + Linux, so I can't use the Win32 API, hidden CLR functions, COM, etc.
    2 - I'm developing with Visual Studio 2005 Pro. I can't afford VS 2008 Pro, and 2005 has some vital threading features in the debugger that VS 2008 Express lacks.

    modified on Friday, December 4, 2009 5:32 AM
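
    For reference, the "Dictionary of WeakReferences" approach described above could be sketched roughly as follows. This is an illustrative sketch, not code from the thread; the WeakCache class name and the CacheLoader delegate are made up, standing in for whatever reads a file's contents from disk.

        using System;
        using System.Collections.Generic;

        // Rough sketch of a weak-reference cache: entries survive only until the GC
        // collects them, which is exactly the behavior the poster found too
        // aggressive for young (generation 0) objects.
        public delegate TValue CacheLoader<TKey, TValue>(TKey key);

        public class WeakCache<TKey, TValue> where TValue : class
        {
            private readonly Dictionary<TKey, WeakReference> entries =
                new Dictionary<TKey, WeakReference>();
            private readonly CacheLoader<TKey, TValue> loader;

            public WeakCache(CacheLoader<TKey, TValue> loader)
            {
                this.loader = loader;
            }

            public TValue Get(TKey key)
            {
                WeakReference weak;
                if (entries.TryGetValue(key, out weak))
                {
                    TValue cached = weak.Target as TValue;
                    if (cached != null)
                        return cached;          // still alive: no disk access needed
                }

                TValue value = loader(key);     // collected (or never cached): re-load
                entries[key] = new WeakReference(value);
                return value;
            }
        }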

    puri keemti replied, #7:

    The best thing is to use the Dispose() pattern to release memory, using code like: Dispose() { base.Dispose(); GC.SuppressFinalize(this); }
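
    (For context only: the standard Dispose pattern the poster seems to be referring to generally looks something like the sketch below. This is a generic illustration, not code from the thread, and, as the replies below point out, it does not address the caching question.)

        using System;

        // Generic illustration of the IDisposable / Dispose pattern:
        // Dispose(true) releases resources deterministically, and
        // GC.SuppressFinalize tells the GC it no longer needs to run the finalizer.
        public class FileBackedResource : IDisposable
        {
            private bool disposed;

            public void Dispose()
            {
                Dispose(true);
                GC.SuppressFinalize(this);
            }

            protected virtual void Dispose(bool disposing)
            {
                if (disposed)
                    return;

                if (disposing)
                {
                    // release managed resources here
                }
                // release unmanaged resources here

                disposed = true;
            }

            ~FileBackedResource()
            {
                Dispose(false);
            }
        }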

  • GWBas1c replied to puri keemti (#7), #8:

        This has nothing to do with Dispose. I'm trying to release references right before a garbage collection occurs. It's for a cache that should grow to occupy as much RAM as possible.

  • puri keemti replied to GWBas1c (#8), #9:

    Please review the purpose of the Dispose pattern and the SuppressFinalize method of GC once more.

  • GWBas1c replied to puri keemti (#9), #10:

    Dispose isn't what I'm looking for. I fully understand the pattern, and it's not appropriate in this situation. I already described why it won't work here: http://www.codeproject.com/Messages/3293314/Re-Looking-for-ideas-on-how-to-release-memory-righ.aspx

  • puri keemti replied to GWBas1c (#10), #11:

    Kindly change the question and state the correct problem there.

  • GWBas1c replied to puri keemti (#11), #12:

                Look, I appreciate that you're trying to be helpful. My initial question does state the correct problem: I'm trying to release memory right before a generation 3 garbage collection.

  • Lost User replied to puri keemti (#11), #13:

                  The question is already correct and unambiguous. It seems you got it wrong.

  • Dave Kreskowiak replied to the original question, #14:

                    AFAICT, there is no way to tell when a GC is going to occur without diving into the CLR-specific API. Short of creating your own version of the CLR, I don't see how you can do it.

                    A guide to posting questions on CodeProject
                    Dave Kreskowiak, Microsoft MVP Visual Developer - Visual Basic (2006, 2007, 2008)
                    But no longer in 2009...
  • GWBas1c replied to Dave Kreskowiak (#14), #15:

                      Yeah, I agree. The techniques described for .Net 3.0 seem a bit sketchy. I ended up using an algorithm that checks GC.GetTotalMemory() on a periodic basis. As a consequence, the system administrator will have to tune the program to give it a target RAM amount to occupy.
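
    (A minimal sketch of the kind of periodic check described here, assuming a hypothetical MemoryWatcher class, an administrator-configured target in bytes, and a CacheTrimmer callback supplied by the cache; GC.GetTotalMemory(false) reads the runtime's current estimate of allocated managed memory without forcing a collection.)

        using System;
        using System.Threading;

        // Callback that releases strong references held by the cache.
        public delegate void CacheTrimmer();

        public class MemoryWatcher
        {
            private readonly long targetBytes;      // tuned by the administrator
            private readonly CacheTrimmer trimCache;
            private readonly Timer timer;

            public MemoryWatcher(long targetBytes, CacheTrimmer trimCache)
            {
                this.targetBytes = targetBytes;
                this.trimCache = trimCache;
                // Poll every five seconds instead of trying to predict the GC.
                this.timer = new Timer(Check, null, 0, 5000);
            }

            private void Check(object state)
            {
                // false = just read the current estimate, do not force a collection
                if (GC.GetTotalMemory(false) > targetBytes)
                    trimCache();
            }
        }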

  • Alan Balkany replied to the original question, #16:

    Idea: Maintain a queue of strong references to the most recently used objects. This prevents them from being garbage collected. When an object is used, move it to the front of the queue. When you insert a new object at the front of the queue, delete the strong reference at the tail of the queue if the queue size is above a certain threshold. If recently-referenced objects are more likely to be referenced again, this cache will avoid re-reading them from disk.

    An enhancement: Instead of just a plain queue, keep the items in keyed storage (hash table, binary tree, etc.), and maintain the queue as a doubly-linked list, implemented by Previous/Next fields in your items. This will let you quickly look up an object which may be in your cache. Adjust the Previous/Next fields to move it to the head of the queue. You'll also want to maintain external pointers to the head and tail of the queue.

    Hope this helps.
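
    (A rough sketch of the keyed-storage-plus-linked-list variant described above, using the framework's Dictionary and LinkedList rather than hand-rolled Previous/Next fields; the LruCache name and the fixed capacity are illustrative placeholders, not code from the thread.)

        using System;
        using System.Collections.Generic;

        // Least-recently-used cache: strong references keep up to 'capacity' items
        // alive; the least recently used item is dropped when the cache is full.
        public class LruCache<TKey, TValue>
        {
            private readonly int capacity;
            private readonly Dictionary<TKey, LinkedListNode<KeyValuePair<TKey, TValue>>> map =
                new Dictionary<TKey, LinkedListNode<KeyValuePair<TKey, TValue>>>();
            private readonly LinkedList<KeyValuePair<TKey, TValue>> order =
                new LinkedList<KeyValuePair<TKey, TValue>>();

            public LruCache(int capacity)
            {
                this.capacity = capacity;
            }

            public bool TryGet(TKey key, out TValue value)
            {
                LinkedListNode<KeyValuePair<TKey, TValue>> node;
                if (map.TryGetValue(key, out node))
                {
                    // Most recently used items live at the head of the list.
                    order.Remove(node);
                    order.AddFirst(node);
                    value = node.Value.Value;
                    return true;
                }
                value = default(TValue);
                return false;
            }

            public void Put(TKey key, TValue value)
            {
                LinkedListNode<KeyValuePair<TKey, TValue>> node;
                if (map.TryGetValue(key, out node))
                {
                    order.Remove(node);
                }
                else if (map.Count >= capacity)
                {
                    // Evict the least recently used item (the tail of the list).
                    LinkedListNode<KeyValuePair<TKey, TValue>> tail = order.Last;
                    order.RemoveLast();
                    map.Remove(tail.Value.Key);
                }

                node = new LinkedListNode<KeyValuePair<TKey, TValue>>(
                    new KeyValuePair<TKey, TValue>(key, value));
                order.AddFirst(node);
                map[key] = node;
            }
        }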

  • Gideon Engelberth replied to the original question, #17:

    It's your app, so you know if this is possible, but I would try to restructure the logic so I do not need to be accessing the same file repeatedly to the point where I'm keeping a cache of files. If you are only accessing a file once every 10 minutes, I would assume that the performance gain from caching the string will be small.

    If the Mono garbage collector works like the .NET one, the Large Object Heap may be what you need. (I do not claim to be a Large Object expert. I've only read the first article that comes up on Google about it. [1][2]) If the resulting string from the file is >85K in size, the string should get placed on the LOH and will thus not get collected except in a full collection. If the files are smaller than that and you still need to cache them, you may be able to hack something together with a large finalizable object, but I wouldn't recommend it.

    If that doesn't work, you probably need a more defined caching policy than "keep them as long as the process has memory."
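
    (On the Microsoft CLR, the 85K threshold mentioned here can be observed via GC.GetGeneration, which reports large-object-heap allocations as generation 2 immediately after allocation; Mono's collector may behave differently. A small illustrative check, not code from the thread:)

        using System;

        class LohCheck
        {
            static void Main()
            {
                byte[] small = new byte[80 * 1000];  // below the ~85,000 byte threshold
                byte[] large = new byte[85 * 1000];  // at or above the threshold

                // On the Microsoft CLR the large array is allocated on the Large
                // Object Heap, which is logically part of generation 2.
                Console.WriteLine(GC.GetGeneration(small)); // typically 0
                Console.WriteLine(GC.GetGeneration(large)); // typically 2
            }
        }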

  • GWBas1c replied to Alan Balkany (#16), #18:

                            That's pretty much what I decided to implement, although I'm experimenting with GC.GetTotalMemory() to determine how many items to pull off of the queue. I keep building the queue until GC.GetTotalMemory() reaches a target size, and if GC.GetTotalMemory() goes above a certain threshold, I remove two or three items. I'm not quite sure how the linked list plays in: I'm keeping references with a dictionary, and the queue is used to increment / decrement an access count. Anyway, thanks!

  • GWBas1c replied to Gideon Engelberth (#17), #19:

    Thanks, but this is for a web server. (I'm also caching more than just text files, but I had to give a simplified example.) The program needs to dynamically adapt to whatever becomes popular without manual tuning from a system administrator. I ended up doing something similar to what was suggested here: http://www.codeproject.com/Messages/3293896/Re-Looking-for-ideas-on-how-to-release-memory-righ.aspx

  • Natza Mitzi replied to the original question, #20:

    Hi, did you think about implementing a paging algorithm (LRU, clock page replacement)? As for Dispose, I recommend that in your Dispose you set the members to null where possible, for faster memory reclaims.

                                Natza Mitzi Analysis Studio Statistical Analysis Software
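
    (The clock / second-chance replacement policy mentioned above could be sketched roughly like this; the ClockCache name, frame count, and PageLoader delegate are illustrative placeholders, not code from the thread.)

        using System;
        using System.Collections.Generic;

        public delegate TValue PageLoader<TKey, TValue>(TKey key);

        // Sketch of a clock (second-chance) cache: each slot has a "referenced" bit;
        // the clock hand clears referenced bits as it passes and evicts the first
        // unreferenced slot it finds.
        public class ClockCache<TKey, TValue>
        {
            private struct Slot
            {
                public bool Used;
                public bool Referenced;
                public TKey Key;
                public TValue Value;
            }

            private readonly Slot[] slots;
            private readonly Dictionary<TKey, int> index = new Dictionary<TKey, int>();
            private readonly PageLoader<TKey, TValue> loader;
            private int hand;

            public ClockCache(int frameCount, PageLoader<TKey, TValue> loader)
            {
                this.slots = new Slot[frameCount];
                this.loader = loader;
            }

            public TValue Get(TKey key)
            {
                int i;
                if (index.TryGetValue(key, out i))
                {
                    slots[i].Referenced = true;   // give it a second chance
                    return slots[i].Value;
                }

                TValue value = loader(key);

                // Advance the hand until an evictable (unreferenced) slot is found.
                while (slots[hand].Used && slots[hand].Referenced)
                {
                    slots[hand].Referenced = false;
                    hand = (hand + 1) % slots.Length;
                }

                if (slots[hand].Used)
                    index.Remove(slots[hand].Key);  // evict the old entry

                slots[hand].Used = true;
                slots[hand].Referenced = true;
                slots[hand].Key = key;
                slots[hand].Value = value;
                index[key] = hand;
                hand = (hand + 1) % slots.Length;

                return value;
            }
        }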

  • GWBas1c replied to Natza Mitzi (#20), #21:

                                  I ended up using a combination of a queue and access counters. Whenever an object is accessed, I call GC.GetTotalMemory() to determine how many objects to de-reference. Anyway, this thread has nothing to do with Dispose.
