Virtual Memory
-
I started a new job recently and took with me what I thought was a reasonably high-spec machine: an AMD 4-core chip, 8GB of RAM (64-bit Windows 7) and a very fast solid state disk. But Visual Studio (with ReSharper) seems to be killing it. Our solution does have about 130 projects in it. I thought, having 8GB and all, I might try turning virtual memory off. When I do this, it asks for a reboot, and I found it's still on afterwards. So quite frankly I don't know how to do this. Anyway, I wonder why we still have VM in an age of super-cheap real RAM. I think various things like memory-mapped files use it behind the scenes, but otherwise I can't see the purpose. Interested in opinions here, or a definitive answer. The paging file is 8GB. What would happen if I went up to 16GB [of RAM]? How can I make this thing faster?
Regards, Rob Philpott.
As has been pointed out, VM != page file. You may be confused because when virtual memory was introduced, its main point was to increase the available amount of memory, and that in turn required a page file. Nowadays the main point of VM is to provide seemingly contiguous blocks of memory on a system whose physical memory may be fragmented into bits of unusable size. This is a real concern at a time when hundreds of services and processes may run on a single machine, some of them for hours.
Without VM, memory allocation functions would quickly deteriorate to uselessness, as it would get increasingly hard to find contiguous blocks of memory. You couldn't even 'defrag' it, because without VM, moving a memory block would pull the data out from under the programs that access it: every pointer currently pointing inside that block would become invalid, and the program would likely crash. With VM, however, the system can move memory blocks around if it must: it can simply copy a block to another place and then change the virtual mapping for that memory range.
On 32-bit systems, VM also provides another advantage: the ability to use more than 4GB of RAM! While it's still true that a single process cannot address more memory than fits through a 4-byte pointer, i.e. 4GB (address spaces are per process, not per thread), different processes may be mapped to different physical memory blocks, and those are not necessarily limited by the same boundary. At least that's the theory - apparently the Win32 VM enforces much stricter limits.
The ability to 'outsource' chunks of memory to a page file is now just a side benefit of VM. Like you said, if you have a sufficient amount of RAM, you don't even need a page file, and in that case you can of course turn it off entirely. However, there is no point in doing that: the system will use the page file only if it must! So there's no slowdown from having it activated as long as it isn't actually used. But if you run out of physical RAM and there is no page file, then, in all likelihood, the whole system will freeze or crash.
With regard to your problem, there should be a build option for using multiple cores. I don't recall where to find it, but you should make sure it is set - it may be disabled. You could also just take a look at Task Manager and verify that all cores are active when you start a full compile. Note that AFAIK VS 2003 and older can't use multiple cores for all purposes, or even at all; at the very least, VS 2003 has no option for using multiple cores.
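To see the 'contiguous virtual, fragmented physical' point concretely, here is a minimal sketch (untested, assuming a Win32 C toolchain). It reserves a single contiguous virtual range far larger than any contiguous physical block is likely to be; the OS backs it with whatever scattered physical pages it has, and only for the part that is actually committed:

#include <windows.h>
#include <stdio.h>
#include <string.h>

int main(void)
{
    /* Reserve 1GB of contiguous *virtual address space*. No physical
       memory is assigned yet; the OS just sets the range aside. */
    SIZE_T size = (SIZE_T)1024 * 1024 * 1024;
    void *p = VirtualAlloc(NULL, size, MEM_RESERVE, PAGE_NOACCESS);
    if (!p) { printf("reserve failed: %lu\n", GetLastError()); return 1; }

    /* Commit just the first 64KB. Only now does the memory manager
       promise backing store (scattered physical pages or page file). */
    if (!VirtualAlloc(p, 64 * 1024, MEM_COMMIT, PAGE_READWRITE)) {
        printf("commit failed: %lu\n", GetLastError());
        return 1;
    }

    /* To the program this looks like one flat block starting at p,
       no matter how fragmented physical RAM actually is. */
    memset(p, 0xAB, 64 * 1024);

    VirtualFree(p, 0, MEM_RELEASE);
    return 0;
}

The MEM_RESERVE/MEM_COMMIT split is exactly the indirection described above: address space and physical storage are separate resources. (On the multi-core build option: if memory serves, in VS 2010 it's under Tools > Options > Projects and Solutions > Build and Run, 'maximum number of parallel project builds', and msbuild accepts a /m switch on the command line - but verify on your version.)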
-
Both Visual Studio in general and ReSharper in particular are notorious memory hogs, so it's possible that they have eaten the whole thing. Alternatively, perhaps the slowness is not memory-related? ReSharper probably does lots of project scanning, and it might just be CPU-bound. Are you sure that memory allocation is the problem? You can almost certainly make things much faster by (i) not using VS and (ii) not using Resharper, heh.
BobJanova wrote:
You can almost certainly make things much faster by (i) not using VS
To Notepad++ and a command-line compiler? It will be much faster to type, indeed.
BobJanova wrote:
and (ii) not using Resharper
I actually found that Resharper makes things much slower, and I don't think it's worth it. I prefer vanilla VS.
"To alcohol! The cause of, and solution to, all of life's problems" - Homer Simpson
-
This may be entirely unrelated, but it's an interesting coincidence. While trying to speed up a slow-moving system of a similar configuration, I bought a registry scanner/cleaner from Uniblue. It didn't solve the problem, but it seemed to help. What is coincidental, though, is that I woke this morning to find a message that an automated scan had found 86 registry errors. Instead of simply instructing it to fix the errors, I took the time to view them for a change. All of them were Invalid Path errors pointing to components of Visual Studio, and appeared to be references to parts of the .NET 4.0 library. I've used Visual Studio exactly once in the past month, and that only to open a small project I haven't had time to complete. As I said, this may be entirely irrelevant, but it might also be a clue to solving the problem, since it specifically involves Visual Studio.
Will Rogers never met me.
Reasons for my bad vote on your comment: 1. "...bought a registry cleaner"; 2. "registry errors" - you have no clue what a "registry cleaner" actually does, do you?; 3. yes - entirely irrelevant.
-
On XP, you could disable the page file and live perfectly fine. On the embedded Windows OSs, you can still do that. Windows 7 may not let you, though - probably more because it likes to fill all available RAM with cached junk, just in case you might need it again someday. That feature might require page file backing, either to copy cached data to in order to free up RAM, or just for warm fuzzies. Here's what I did on Win 7 to make my (2GB RAM!) dev box run faster:
* I turned off the caching of boot files (I'm going to use it once to boot and Windows is going to keep it in RAM forever? How stupid). Task Manager shows that I now always have some free RAM, and VS performance doesn't start to slow down significantly until I've used most of that up.
* I also periodically clean out C:\Windows\Prefetch before a reboot. The task set I use on a regular basis morphs over time, and I suspect Windows doesn't quite keep the prefetch contents appropriate... the system feels a bit faster afterwards, but it's probably just placebo.
* Lastly, VS or Win 7 must be leaking something... periodic restarts/reboots are required to maintain performance - sometimes multiple times in a day if I'm doing whatever it is that's exacerbating the problem.
Hope that helps.
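If you want to watch the free-RAM figure without keeping Task Manager open, here is a minimal sketch (untested, assuming a Win32 C toolchain) that prints the same physical-RAM and page-file numbers via GlobalMemoryStatusEx:

#include <windows.h>
#include <stdio.h>

int main(void)
{
    MEMORYSTATUSEX ms;
    ms.dwLength = sizeof(ms);   /* must be set before the call */
    if (!GlobalMemoryStatusEx(&ms)) {
        printf("query failed: %lu\n", GetLastError());
        return 1;
    }
    printf("Memory load:    %lu%%\n", ms.dwMemoryLoad);
    printf("Physical free:  %llu / %llu MB\n",
           ms.ullAvailPhys / (1024 * 1024),
           ms.ullTotalPhys / (1024 * 1024));
    printf("Page file free: %llu / %llu MB\n",
           ms.ullAvailPageFile / (1024 * 1024),
           ms.ullTotalPageFile / (1024 * 1024));
    return 0;
}

If "physical free" stays healthy while everything slows down, the bottleneck probably isn't memory at all.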
We can program with only 1's, but if all you've got are zeros, you've got nothing.
-
Have you given ReadyBoost a try? I am using it with a 4GB SD card that I just keep plugged in. I'm not entirely sure how it works; apparently it acts as a cache for SuperFetch. Anyway, it may make a noticeable difference, or it may not. As cheap as those USB drives and SD cards have become, I can't see how it would hurt to give it a try. Good luck.
-
Here: http://superuser.com/questions/205114/is-there-anyway-to-stop-windows-xp-from-using-the-page-file/379669 and here: http://superuser.com/questions/14795/windows-swap-page-file-enable-or-disable are relevant posts on SuperUser.
-
I have been running without a page file since Win7 and an upgrade from 4 to 6 GB. I now have 12 GB at work and 16 GB at home. With 12 GB, VS2010 runs happily on both the physical Win7 64-bit machine and a Win7 32-bit VMware guest, with MATLAB (where the OS bit count must match the external tools for some toolboxes) and LabVIEW in there somewhere from time to time. Not because I must, but because I can - and waiting for the tools to start up is not what I like. Besides, booting a VM is much faster than booting the work PC (now over 9 minutes).
-
I personally faced the same problem some time ago in the company I am working for. We realized that Visual Studio struggles as the number of projects (not the number of source files, though) in the solution grows. We had a solution with more or less the same number of projects, and it took 10 minutes to load. We reduced the number of projects to 30 by grouping DLLs, while keeping separate source folders, and this fixed the issue. Hope this helps. PS: at the time we used VS 2008. We have since migrated to VS 2010, and everything still works fine. Our average dev PC is similar to yours (Core 2, 8GB, Win7 64-bit), and I regularly work with three or four instances of VS open.
-
In the simplest terms, all I can say is that, as far as my knowledge goes, VM cannot be turned off in a modern OS with a GUI. There is a minimum amount of VM you HAVE to set. You can increase your RAM as much as you want, but "turning off" VM makes no sense to me. Keep your registry optimized, and periodically clean up junk and temp files from the %temp%, temp and prefetch folders. Make sure your display driver is not a resource hog (this is a very common issue, and one that's very easily overlooked). Hope this helps.
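If you do go hunting for temp junk, a quick way to gauge how much has piled up (a minimal sketch, untested, assuming a Win32 C toolchain; it only lists, it deletes nothing) is to resolve %temp% and total up what's sitting in it:

#include <windows.h>
#include <stdio.h>
#include <string.h>

int main(void)
{
    char dir[MAX_PATH], pattern[MAX_PATH + 2];
    WIN32_FIND_DATAA fd;
    HANDLE h;
    unsigned count = 0;
    unsigned long long bytes = 0;

    /* Resolve %TEMP%; the returned path ends with a backslash. */
    if (GetTempPathA(MAX_PATH, dir) == 0) return 1;
    strcpy(pattern, dir);
    strcat(pattern, "*");

    h = FindFirstFileA(pattern, &fd);
    if (h == INVALID_HANDLE_VALUE) return 1;
    do {
        if (!(fd.dwFileAttributes & FILE_ATTRIBUTE_DIRECTORY)) {
            count++;
            bytes += ((unsigned long long)fd.nFileSizeHigh << 32)
                     | fd.nFileSizeLow;
        }
    } while (FindNextFileA(h, &fd));
    FindClose(h);

    printf("%s holds %u files, about %llu MB\n",
           dir, count, bytes / (1024 * 1024));
    return 0;
}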
Getting information off the Internet is like taking a drink from a fire hydrant. In three words I can sum up everything I've learned about life: it goes on.