Code Project
Virtual Memory

Posted in The Lounge
Tags: question, csharp, asp-net, visual-studio, architecture
31 Posts, 23 Posters
  • R Rob Philpott

I started a new job recently and took with me what I thought was a reasonably high-spec machine: an AMD 4-core chip, 8GB of RAM (64-bit Windows 7) and a very fast solid-state disk. But Visual Studio (with ReSharper) seems to be killing it. Our solution does have about 130 projects in it. I thought, with 8GB and all, I might try turning virtual memory off. When I do this, it asks for a reboot, and I found it's still on afterwards. So quite frankly I don't know how to do this. Anyway, I wonder why we still have VM in an age of super-cheap real RAM. I think various things like memory-mapped files use it behind the scenes, but otherwise I can't see the purpose. Interested in opinions here, or a definitive answer. The paging file is 8GB. What would happen if I went up to 16GB [of RAM]? How can I make this thing faster?

    Regards, Rob Philpott.
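[Editor's note] Since Rob asks how to actually turn the page file off: on Windows 7 the setting lives under Control Panel > System > Advanced system settings > Advanced > Performance Settings > Advanced > Virtual memory, and the change only sticks if automatic page-file management is disabled first. The same steps can be sketched from an elevated command prompt (Windows-only commands, shown for illustration; as replies in this thread note, disabling the page file is generally inadvisable):

```shell
REM Show the current page file configuration
wmic pagefileset list /format:list

REM See whether Windows is managing the page file automatically
wmic computersystem get AutomaticManagedPagefile

REM To truly disable the page file, first turn off automatic management...
wmic computersystem where name="%computername%" set AutomaticManagedPagefile=False

REM ...then delete the page file entries (takes effect after a reboot)
wmic pagefileset delete

REM While you're at it, check how much memory Visual Studio is using
tasklist /FI "IMAGENAME eq devenv.exe"
```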

moon_stick (#16):

I was running ReSharper against a similarly sized solution recently but had to uninstall it as it was impacting performance too heavily (45 seconds for a context menu in the code editor, although my machine is less powerful than yours). One of the developers on my team still uses ReSharper but has turned off automatic code scanning, and this makes performance more reasonable. Have you looked in Task Manager to see how much memory VS and ReSharper are taking up? Could you move the drive for your VM to the SSD? I suspect that the problem is that ReSharper is hogging the CPU rather than it being a memory problem, but it might be worth a try...

    Sarchasm : The gulf between the author of sarcastic wit and the person who doesn't get it.

    • B BobJanova

Both Visual Studio in general and ReSharper in particular are notorious memory hogs, so it's possible that they have eaten the whole thing. Alternatively, perhaps the slowness is not memory-related? ReSharper probably does lots of project scanning and it might just be CPU-bound. Are you sure that memory allocation is the problem? You can almost certainly make things much faster by (i) not using VS and (ii) not using ReSharper, heh.

_beauw_ (#17):

I noticed a big jump in memory utilization when I upgraded to VS2008. At the time, I had created some WPF projects in VS2005. It was necessary to install some extra things (a CTP?) to do WPF in VS2005, but I installed them and memory usage was still reasonable. When I later converted these projects to VS2008, with its built-in WPF support, memory usage by the IDE grew by a factor of 2 to 4. This might be due to a decision to rewrite parts of the IDE and tool suite in .NET; I seem to recall that one of Microsoft's marketing claims at the time was that big parts of Visual Studio had begun using .NET code.

• In reply to Rob Philpott

HalfHuman (#18):

It's almost never a good idea to disable the swap file. If you have plenty of RAM you could set it to something lower, like 4GB in your case (if you are space-limited). Look in Resource Monitor at the memory usage of Visual Studio. Having 8GB of RAM should be a good start, and the fast SSD definitely helps. If you have 4 cores then I do not see how Visual Studio kills the machine. Usually ReSharper (I use 5.1) does its scanning at the beginning, builds a cache (hogging the CPU for a while) and is then quite idle. The cache is reused after IDE/OS restarts. Be careful not to delete the ReSharper cache, as it will have to rebuild it. Also do not go crazy with registry/OS cleaners, optimizers, etc. If you have automatically generated code that is large, you can exclude it from the analysis. Other than that, ReSharper is priceless. Take a look at the antivirus you are using, as some are renowned for hogging the CPU. If needed, disable live scanning of the projects folder and/or the IDE process.

• In reply to Rob Philpott

Member 3156407 (#19):

Hi. Check out the ReSharper forum; you are not alone. There has been a heated debate about performance (by me, at least) since 6.0 was released. They released 6.1 to overcome some of these issues, including a specific issue with writing and processing files. I have an i7 quad / SSD, BUT 32-bit Windows and 4GB. There is nothing I can do about this, as it's an "issue machine" from my employers. BUT why should I have to? It's not a bad spec. I code for about an hour and eventually get an Out Of Memory error and VS 2010 crashes. I have had to uninstall R#. I have reverted to using Visual Assist X as a much lower-demand add-in, but with much lower functionality to go with it. I firmly believe that R# has simply got too big and now really demands a 64-bit PC and loads of RAM if it's to function at all. This is sad, since compared to all the add-ins out there it's streets ahead in the way it blends into VS. The love/hate relationship continues. What is really sad is that users like me sit with a "redundant" product. I can use it, but it means restarting VS every hour or so to clear out memory, hardly ideal. Even on small projects I see DevEnv.exe usage in Task Manager climb from 300MB to 700-800MB within an hour. Even sadder is that JetBrains are remarkably quiet on the subject.

          Mike

          • B BobJanova

Windows only uses the page file once the whole of physical memory is allocated, so there is an alternative, but that alternative is 'out of memory' error messages, which most people would say are worse.

Jules H (#20):

That's not true: Windows will swap things out in order to make space for its disk cache, which it grows if you're doing a lot of IO. Some would say this is good, but I find the disk-cache algorithm is actually less accurate at predicting future hits than "I loaded this program, so I'm likely to use its memory again". On Linux, you can disable this behaviour with 'echo 0 > /proc/sys/vm/swappiness'; I don't know if there's a similar control for Windows.
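[Editor's note] For reference, the Linux knob Jules mentions can be read and set like this (a sketch; note that swappiness=0 strongly discourages swapping rather than disabling it outright, and changing it needs root):

```shell
# Read the current value (0-100; lower means less eager to swap)
cat /proc/sys/vm/swappiness

# Set it for the running system
sudo sysctl vm.swappiness=0

# Make it persist across reboots
echo "vm.swappiness = 0" | sudo tee -a /etc/sysctl.conf
```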

• In reply to Rob Philpott

Stefan_Lang (#21):

As has been pointed out, VM != page file. You may be confused because when virtual memory was introduced its main point was to increase the available amount of memory, and that in turn required a page file. Nowadays the main point of VM is to provide seemingly contiguous blocks of memory in a system whose physical memory may be fragmented into bits of unusable size. This is a real concern at a time when hundreds of services and processes may run on a single machine, some of them for hours. Without VM, memory allocation functions would quickly deteriorate to uselessness as it gets increasingly harder to find contiguous blocks of memory. You can't even 'defrag' it, because without VM, moving a memory block would pull the data out from under the programs that access it: every pointer currently pointing inside that block would become invalid, and the program would likely crash. With VM, however, you can move memory blocks around if you must: the system can just copy the block to another place and then change the virtual mapping for that memory range.

On 32-bit systems the VM also provides another advantage: the ability to use more than 4GB of RAM. While it's still true that a single process cannot use more memory than it can address through a 4-byte pointer, i.e. 4GB, different processes may use different physical memory blocks, and those are not necessarily limited by the same boundary. At least that's the theory; apparently the Win32 VM enforces much stricter limits. The ability to 'outsource' chunks of memory to a page file is now just a sidekick of the VM. Like you said, if you have a sufficient amount of RAM, you don't even need a page file, and in that case you can of course turn it off entirely. However, there is no point in doing that: the system will use the page file only if it must, so there is no slowdown from having it enabled as long as it isn't actually used. But if you run out of physical RAM and there is no page file, then in all likelihood the whole system will freeze or crash.

With regard to your problem, there should be a compiler option for using multiple cores. I don't recall where to find it, but you should ensure that it is set; it may be disabled. You could also just take a look at Task Manager and verify that all cores are active when you start a full compile. Note that, AFAIK, VS 2003 and older can't use multiple cores for all purposes, or even at all. At the very least, VS 2003 has no option for using multiple cores.
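[Editor's note] The point above about contiguous virtual addresses mapping onto fragmented physical memory can be illustrated with a toy page-table simulation (illustrative only; a real MMU does this in hardware, and the page numbers here are made up):

```python
# Toy model: a contiguous virtual range backed by scattered physical pages.
PAGE_SIZE = 4096

# Physical memory is fragmented: the free pages are nowhere near each other.
free_physical_pages = [7, 2, 9]

# Build a page table for a 3-page virtual allocation starting at virtual page 0.
page_table = {vpn: free_physical_pages[vpn] for vpn in range(3)}

def translate(virtual_address):
    """Translate a virtual address to a physical one via the page table."""
    vpn, offset = divmod(virtual_address, PAGE_SIZE)
    return page_table[vpn] * PAGE_SIZE + offset

# The program sees addresses 0..12287 as one contiguous block...
print(translate(0))     # -> 28672 (physical page 7)
print(translate(8195))  # -> 36867 (physical page 9, offset 3)

# 'Defragmenting': the OS can copy virtual page 1's data to physical page 4
# and just update the mapping; the virtual address the program uses never changes.
page_table[1] = 4
print(translate(4096))  # -> 16384 (now physical page 4)
```

This is also why pointers stay valid when the OS moves memory: the program only ever holds virtual addresses, and only the mapping changes.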

• In reply to BobJanova

Fabio Franco (#22):

                BobJanova wrote:

                You can almost certainly make things much faster by (i) not using VS

Switch to Notepad++ and a command-line compiler. It will be much faster to type, indeed.

                BobJanova wrote:

                and (ii) not using Resharper

                I actually found that Resharper makes things much slower and I don't think that it's worth it. I prefer vanilla VS.

                "To alcohol! The cause of, and solution to, all of life's problems" - Homer Simpson

                • R Roger Wright

                  This may be entirely unrelated, but it's an interesting coincidence. While trying to speed up a slow-moving system of a similar configuration, I bought a registry scanner/cleaner from Uniblue. It didn't solve the problem, but it seemed to help. What is coincidental, though, is that I woke this morning to find a message that an automated scan had found 86 registry errors. Instead of simply instructing it to fix the errors, I took the time to view them for a change. All of them were Invalid Path errors pointing to components of Visual Studio, and appeared to be references to parts of the .Net 4.0 library. I've used Visual Studio exactly once in the past month, and that only to open a small project I haven't had time to complete. As I said, this may be entirely irrelevant, but it also might be a clue to solving the problem, since it specifically involves Visual Studio.

                  Will Rogers never met me.

johannesnestler (#23):

Reasons for my bad vote on your comment: 1. "...bought a registry cleaner". 2. "registry errors". You have no clue what a "registry cleaner" does, have you? 3. Yes, entirely irrelevant.

• In reply to Rob Philpott

patbob (#24):

On XP, you could disable the page file and live perfectly fine. On the embedded Windows OSes, you can still do that. Windows 7 may not let you, though, probably because it likes to fill all available RAM with cached junk, just in case you might need it again someday. That feature might require page-file backing, either to copy cached data to in order to free up RAM, or just for warm fuzzies. Here's what I did on Windows 7 to make my (2GB RAM (!)) dev box run faster:

* I turned off the caching of boot files (I'm going to use it once to boot and Windows is going to keep it in RAM forever? How stupid). Task Manager shows that I now always have some free RAM, and VS performance doesn't start to slow down significantly until I've used most of that up.
* I also periodically clean out C:\Windows\Prefetch before a reboot. The task set I use on a regular basis morphs over time, and I suspect Windows doesn't quite keep the prefetch contents appropriate... the system feels a bit faster afterwards, but it's probably just placebo.
* Lastly, VS or Win 7 must be leaking something... periodic restarts/reboots are required to maintain performance, sometimes multiple times in a day if I'm doing whatever it is that's exacerbating the problem.

Hope that helps.

                    We can program with only 1's, but if all you've got are zeros, you've got nothing.

• In reply to Rob Philpott

Sasha Laurel (#25):

Have you given ReadyBoost a try? I am using it with a 4GB SD card that I just keep plugged in. I'm not entirely sure how it works; apparently it acts as a cache for SuperFetch. Anyway, it may make a noticeable difference, or it may not. As cheap as USB drives and SD cards have become, I can't see how it would hurt to give it a try. Good luck.

• In reply to Rob Philpott

jschell (#26):

                        Rob Philpott wrote:

                        Our solution does have about 130 projects in it.

                        If you can't create other solutions that contain only a subset of that, say 10 projects or less, then the design is flawed.

• In reply to Rob Philpott

msvbdev (#27):

Here: http://superuser.com/questions/205114/is-there-anyway-to-stop-windows-xp-from-using-the-page-file/379669 and here: http://superuser.com/questions/14795/windows-swap-page-file-enable-or-disable are relevant posts on SuperUser.

• In reply to Rob Philpott

Lost User (#28):

I have been running without a page file since Win7 and an upgrade from 4 to 6GB. I now have 12GB at work and 16GB at home. With 12GB, VS2010 is happy running on both the physical Win7 64-bit machine and a Win7 32-bit VMware VM, with MATLAB (where the OS bit count must match that of external tools for some toolboxes) and LabVIEW in there somewhere from time to time. Not because I must, but because I can. And waiting for the tools to start up is not what I like. And booting a VM is much faster than booting the work PC (now over 9 minutes).

• In reply to Rob Philpott

daaren (#29):

I personally faced the same problem some time ago, in the company where I work. We realized that Visual Studio has trouble as the number of projects (not the number of source files, however) in the solution grows. We had a solution with more or less the same number of projects, and it took 10 minutes to load. We reduced the number of projects to 30 by grouping DLLs, while keeping separate source folders, and this fixed the issue. Hope this helps. PS: at the time we used VS 2008. We have now migrated to VS2010, and all still works fine. Our average dev PC is similar to yours (Core 2, 8GB, Win7 64-bit), and I regularly work with three or four instances of VS open.

• In reply to Rob Philpott

KbrKnight (#30):

In the simplest words, all I can say is that, as far as my knowledge goes, VM cannot be turned off in a modern OS with a GUI. There is a minimum amount of VM you HAVE to set. You can increase your RAM as much as you want, but "turning off" VM makes no sense to me. Keep your registry optimized and periodically clean up junk and temp files from the %temp%, temp and prefetch folders. Make sure your display driver is not a resource hog (this is a very common issue, and very easy to overlook). Hope this helps.

                                Getting information off the Internet is like taking a drink from a fire hydrant. In three words I can sum up everything I've learned about life: it goes on.

• In reply to Rob Philpott

Paulo_JCG (#31):

Hi, I think the OS has a limit on memory usage for each process (2GB if I'm not mistaken). Check if this helps: http://msdn.microsoft.com/en-us/library/windows/desktop/aa366778%28v=vs.85%29.aspx
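[Editor's note] The limit Paulo refers to is the default user-mode virtual address space on 32-bit Windows: of the 4GB a 32-bit pointer can address, half is reserved for the kernel by default; a /LARGEADDRESSAWARE executable can get 3GB with the /3GB boot switch. The arithmetic, as a quick sanity check (the figures are the documented defaults, not measurements):

```python
# Address space reachable with a 32-bit pointer, in bytes.
total_32bit = 2 ** 32

GB = 2 ** 30

# Default split on 32-bit Windows: half for user mode, half for the kernel.
default_user_space = total_32bit // 2

# With the /3GB boot switch and a /LARGEADDRESSAWARE executable.
large_address_user_space = 3 * GB

print(total_32bit // GB)               # -> 4
print(default_user_space // GB)        # -> 2
print(large_address_user_space // GB)  # -> 3
```

So a 32-bit devenv.exe process hits a wall well before the machine's 8GB of RAM is exhausted, which is consistent with the out-of-memory crashes reported earlier in the thread.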

                                  Paulo Gomes Over and Out :D
