How is Core2Duo speeding up compile-time?
-
My C++ project has >500 .cpp files and takes >20 min. to build. Configuration: Visual C++ (VS2005) on a Pentium 4 at 3 GHz with 2 GB RAM. I wonder whether a new machine could significantly cut down my compile times. The Core2Duo is advertised as performing far better than all previous processors. Can anyone estimate what gain I could expect from moving to an up-to-date machine (especially with respect to C++ compilation)? Thanks, Christof
-
Christof Schardt wrote:
I wonder, whether a new machine could significantly cut down my compile-times.
I guess it all depends on what you consider "significant". Somewhere in the history of this board, which I don't feel like searching for at the moment, is a nice long description I wrote of the compile/link process. CPU is only one of several factors that affect build speed; disk speed is another, and memory a third. Make sure that when you upgrade you make significant changes in all three, and you should see significant results. I don't have the latest version of VS, so you will have to wait for others to respond about dual-core and multi-threaded compile capability, but if memory serves, VS still doesn't compile multi-threaded, so you are only using a single core and are limited to its speed. Even so, when the compile saturates one processor, all the other system processes and programs that would otherwise be "stealing" even minor time from your compile get pushed to the second core. So you still get a benefit from a dual core on a multi-tasking operating system like Windows XP or Vista.
_________________________ Asu no koto o ieba, tenjo de nezumi ga warau. Talk about things of tomorrow and the mice in the ceiling laugh. (Japanese Proverb)
-
Jeffry J. Brickley wrote:
VS still isn't multi-threaded compile, so you are still only using a single core to compile with and limited to the speed of a single core.
VS2k5 is. Although it's not as good at it as make on Unix, where parallelized builds can cut compile times by a huge factor (depending on your hardware, of course - but even on uniprocessor hardware a parallel build is quicker, because compilation is so I/O-intensive). I haven't timed VS2k5, but I'm sure using nmake could speed things up considerably. It does for VS2k3 anyway, and last time I checked, VS2k5 is significantly slower than its predecessor.
-- From the network that brought you "The Simpsons"
-
Joergen Sigvardsson wrote:
and last time I checked VS2k5 is significantly slower than its predecessor.
well, something to look forward to I guess....
_________________________ Asu no koto o ieba, tenjo de nezumi ga warau. Talk about things of tomorrow and the mice in the ceiling laugh. (Japanese Proverb)
-
I'm finding that the new processors open up more possibilities. For example, instead of buying a monster laptop, I buy a monster desktop for much cheaper, load VMware Server on it, and run many operating systems at the same time: a VS 2003 instance, a VS 2005 instance, and so on. My instances are up 24/7 and backed up by my corporation, and I don't have a single thing loaded on my laptop besides Office! When Vista comes out, I can adopt it twenty times quicker than anybody else. The possibilities are unlimited. You couldn't do this a few years ago.
-
Having managed C++ projects of that magnitude and more, I was appalled when I stepped into a project and discovered people had build times like that. The first thing I did was rearchitect the project so that there were discrete modules that were decoupled from each other. Yes, that took rearchitecting, but the payoff was tremendous. Consider a project whose development time is 2-3 years: when you take a build that used to run 20-30 minutes down to 1 minute, because you are only rebuilding the module you're working on, that is a major boost to productivity. So, the point being, don't look to better processors. Fix the root of the problem. Marc
People are just notoriously impossible. --DavidCrow
There's NO excuse for not commenting your code. -- John Simmons / outlaw programmer
People who say that they will refactor their code later to make it "good" don't understand refactoring, nor the art and craft of programming. -- Josh Smith
-
Marc Clifton wrote:
So, the point being, don't look to better processors. Fix the root of the problem.
Now we know you're not an Intel/AMD sales rep. ;)
-- Broadcast simultaneously one year in the future
-
Not by much for the current Visual Studio. As I understand it, the build process is very constrained when it comes to using multiple threads, and without that you're not going to see much of an improvement on multi-core processors. I watched a video about the Concurrency and Coordination Runtime (CCR) on the Channel9 MSDN website, and they said one of the guys on the MSBuild team used the CCR to achieve a 1:1 speedup per processor core; he got roughly 4x faster build times on a quad-core processor by utilizing the CCR in MSBuild. I'm hoping we'll see those improvements in VS Orcas.
Tech, life, family, faith: Give me a visit. I'm currently blogging about: God-as-Judge, God-as-Forgiver The apostle Paul, modernly speaking: Epistles of Paul Judah Himango
-
Judah Himango wrote:
I watched a video of the Concurrency and Coordination Runtime (CCR) on the Channel9 MSDN website, and they said one of the guys on the MSBuild team used the CCR to achieve a 1:1 speedup per processor core; he roughly got 4x faster build times performance on a quad-core processor by utilizing the CCR in MSBuild. I'm hoping we'll see those improvements in VS Orcas.
Yeah, but the IDE will be so much slower that you won't notice the difference... ;-P
-
I'm thinking about doing exactly this kind of thing (with VMware Workstation), but do you find much of a slowdown? Any other issues? I've sometimes found that if I have to bounce my machine because of problems, without shutting down VMware properly, the VMs forget recent activity. While this may just be a consequence of turning off the VM without shutting down Windows first, I'm a bit suspicious, and a bit reluctant to put everything into VMs because of it.
0 bottles of beer on the wall, 0 bottles of beer, you take 1 down, pass it around, 4294967295 bottles of beer on the wall. Awasu 2.2.3 [^]: A free RSS/Atom feed reader with support for Code Project.
-
Taka Muraoka wrote:
While this may be just a consequence of turning off the VM without shutting down Windows first
If Windows doesn't survive this on a REAL machine, what makes people think that it will somehow magically survive it on a VM? Always power down or suspend your VM, just as you would shut down a real machine before unplugging it. Think of your host machine as the AC system: when you throw the master breaker, you had better have your hardware turned off already. Similarly, when you power down your computer you should have already taken care of your VM.
_________________________ Asu no koto o ieba, tenjo de nezumi ga warau. Talk about things of tomorrow and the mice in the ceiling laugh. (Japanese Proverb)
-
Joergen Sigvardsson wrote:
Yeah, but the IDE will be so much slower that you won't notice the difference... ;-P
:laugh: Seriously though, now that CPU clock speeds have topped off, I think the only software that will keep getting faster over time is software that uses parallel processing well (i.e. multiple threads in the right places) and does so scalably, so that locking doesn't kill the gains of parallel execution. It's nice to see that MS is investing a lot of research into this area (the CCR, Software Transactional Memory, and COmega, to name a few), especially given that the current multithreading story is very hairy and difficult to get right.
Tech, life, family, faith: Give me a visit. I'm currently blogging about: God-as-Judge, God-as-Forgiver The apostle Paul, modernly speaking: Epistles of Paul Judah Himango
-
Jeffry J. Brickley wrote:
If windows doesn't survive this on a REAL machine what makes people think that it will somehow magically survive this on a VM? Always power down, or suspend your VM just as if you would shut down a real machine before unplugging it.
Sometimes I can't. I run a virtual desktop manager, and occasionally another program goes wonky and stops me from switching desktops, which means I can't get at the VMware window to shut it down :doh: Because of this it's hard to be sure, but I get the nagging feeling that things that happened recently on the VM get forgotten - things that wouldn't have been lost on a real PC. For example, changes to a file would get committed to disk almost immediately on a real PC, but I'm sure I've lost such changes by dropping a VM long after actually making them. I'm thinking maybe VMware is caching disk activity itself and not getting a chance to commit the changes. If I was hitting the big red button on the host machine, that would be understandable, but after a clean shutdown of the host machine you'd think VMware would have the sense to flush its caches before exiting. It would be much nicer if VMware suspended your VMs rather than just powering them off :rolleyes:
0 bottles of beer on the wall, 0 bottles of beer, you take 1 down, pass it around, 4294967295 bottles of beer on the wall. Awasu 2.2.3 [^]: A free RSS/Atom feed reader with support for Code Project.
-
Taka Muraoka wrote:
I'm thinking maybe VMware is caching disk activity itself and not getting a chance to commit the changes.
Parallels doesn't. ;P There is good with the bad... a virtual disk heavily cached in memory will run much faster than a virtual disk represented by a single image file. I just peeked at the Parallels options:
VM Shutdown behavior Default action to perform on application exit: _X_ Suspend VM ___ Power Off ___ Ask me what to do
Obviously there will be cases where the host machine is so locked up that the VM can't be powered off gracefully. As with any hardware machine, the more that sits in your cache, the more danger of corruption. You play Russian roulette with every occurrence: any single one may or may not bite you, but given enough chances for corruption... well, we all know the result.
_________________________ Asu no koto o ieba, tenjo de nezumi ga warau. Talk about things of tomorrow and the mice in the ceiling laugh. (Japanese Proverb)
-
Jeffry J. Brickley wrote:
a virtual disk heavily cached in memory will run much faster than a virtual disk represented by a single image file.
VMware has an option to use a real disk instead of a virtualized one, although they don't seem to have a lot of faith in the feature. One would assume it would run at nearly full speed. I was looking at a Mac laptop, but they're way overpriced for what you get and not highly enough spec'ed for what I want. I wasn't really keen to spend all that money to get a Mac just to run Windows on it :-) Maybe I should just run Linux on the host ;P
0 bottles of beer on the wall, 0 bottles of beer, you take 1 down, pass it around, 4294967295 bottles of beer on the wall. Awasu 2.2.3 [^]: A free RSS/Atom feed reader with support for Code Project.
-
Taka Muraoka wrote:
Maybe I should just run Linux on the host
go for it! My only virtual windows is the work edition of windows... talk about keeping work and home separate now... I have all my work on a VM that I take here and there. :)
_________________________ Asu no koto o ieba, tenjo de nezumi ga warau. Talk about things of tomorrow and the mice in the ceiling laugh. (Japanese Proverb)
-
Your 2 GB of RAM is actually very good, but for compilation you should check how much of that RAM is really available (or how much remains free): verify it. Your CPU has hyper-threading, which could help your compile times. I have another dual-core machine, a model D830, and with a sample C++ application of several files (though not 500, like yours), I've seen that it is noticeably faster than my previous Opteron 142. Other variables to look at: 1) memory type (DDR2 is faster) and dual-channel configuration; 2) CPU clock; 3) first- and second-level cache; 4) chipset. If you always work with applications of this type, with over 500 files, you could consider not just a dual-core but a dual-processor motherboard and 4 GB of RAM. Check the latest processors and benchmarks. Achille
-
With something like IncrediBuild (http://www.xoreax.com) you'll probably get a pretty decent speedup with your C++ compiles (assuming you have enough memory to support a couple of compiler processes on your machine). It mostly uses network distribution to speed up compiles but can also take advantage of multiple cores. With plain Visual Studio 2005 you'll also be getting some improvement. VS2005 can build two different configurations in parallel on a dual-core machine. The downside is that each project compiles only on one CPU, so if you have one big project that wouldn't be getting any speedup. - Dan