How is Core2Duo speeding up compile-time?

The Lounge
Tags: c++, question, workspace
25 Posts, 17 Posters
Christof Schardt (#1)

My C++ project has more than 500 .cpp files and takes more than 20 minutes to build. Configuration: Visual C++ (VS2005) on a Pentium 4 at 3 GHz with 2 GB RAM. I wonder whether a new machine could significantly cut down my compile times. The Core2Duo is advertised as performing far better than all previous processors. Can anyone estimate what gain I could expect from switching to an up-to-date machine, especially with respect to C++ compilation? Thanks, Christof


El Corazon (#2)

Christof Schardt wrote:

I wonder whether a new machine could significantly cut down my compile times.

I guess it all depends on what you consider "significant". Somewhere in the history of this board, which I don't feel like searching for at the moment, is a nice long description I wrote of the compile->link process. CPU is one of several aspects that affect the resulting speed; disk speed is another, and memory a third. Make sure that when you upgrade you make significant changes in all three, and you should see significant results. I don't have the latest versions of VS, so you will have to wait for others to respond about dual-core and multi-threaded compile capability, but if memory serves, VS still isn't multi-threaded compile, so you are still only using a single core to compile with and limited to the speed of a single core. Even so, when the compile takes a full processor, all the other system processes and programs that are "stealing" even minor time from your compile get pushed to the second core. So even then you get a benefit from a dual core on multi-tasking operating systems like Windows XP and Vista.


Jorgen Sigvardsson (#3)

Jeffry J. Brickley wrote:

VS still isn't multi-threaded compile, so you are still only using a single core to compile with and limited to the speed of a single core.

VS2k5 is. Although it's not as good at it as make on Unix, where parallelized builds can cut compilation times by a huge factor (depending on your hardware, of course - but even on uniprocessor hardware it's quicker, due to the I/O intensity). I haven't timed VS2k5, but I'm sure using nmake could speed things up considerably. It does for VS2k3 anyway, and last time I checked VS2k5 is significantly slower than its predecessor.


El Corazon (#4)

          Joergen Sigvardsson wrote:

          and last time I checked VS2k5 is significantly slower than its predecessor.

          well, something to look forward to I guess....


SamStange (#5)

I'm finding that the new processors open up more possibilities. For example, instead of buying a monster laptop, I buy a monster desktop for much cheaper, load VMware Server on it, and run many operating systems at the same time: I'll have a VS 2003 instance, a VS 2005 instance, etc. So my instances are up 24/7, backed up by my corporation, and I don't have a single thing loaded on my laptop besides Office! When Vista comes out, I can adopt it twenty times quicker than anybody else. The possibilities are unlimited. You couldn't do this a few years ago.


Marc Clifton (#6)

Having managed C++ projects of that magnitude and more, I was appalled when I stepped into a project and discovered people had build times like that. The first thing I did was rearchitect the project so that there were discrete modules, decoupled from each other. Yes, that took rearchitecting, but the payoff was tremendous. Consider a project whose development time is 2-3 years: taking a build process from 20-30 minutes down to 1 minute, because you rebuild only the module you're working on, is a major boost to productivity. So, the point being, don't look to better processors. Fix the root of the problem. Marc
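A minimal sketch, not from the thread, of the decoupling Marc describes: most of the win comes from keeping each module's public header small and stable, so editing an implementation file recompiles one translation unit instead of cascading through every client. One common way to do that is a forward-declared implementation (the pimpl idiom). The Logger class and file names below are invented for illustration, and the code uses modern C++ for brevity; on VS2005 you would use a raw pointer instead of std::unique_ptr.

```cpp
// logger.h - the only header client modules include. It exposes a small,
// stable interface; everything heavy hides behind a forward-declared Impl,
// so edits to logger.cpp never force clients to rebuild.
#pragma once
#include <memory>
#include <string>

class Logger {
public:
    Logger();
    ~Logger();                       // defined in logger.cpp, where Impl is complete
    void Write(const std::string& message);

private:
    struct Impl;                     // forward declaration only
    std::unique_ptr<Impl> pimpl;
};

// logger.cpp - all the expensive includes live here. Changing this file
// recompiles a single translation unit, not every module that logs.
#include "logger.h"
#include <fstream>

struct Logger::Impl {
    std::ofstream out{"app.log", std::ios::app};
};

Logger::Logger() : pimpl(std::make_unique<Impl>()) {}
Logger::~Logger() = default;

void Logger::Write(const std::string& message) {
    pimpl->out << message << '\n';
}
```

The same split also makes it natural to build each module as its own static library, which is essentially what pg az describes further down the thread.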


Jorgen Sigvardsson (#7)

                Marc Clifton wrote:

                So, the point being, don't look to better processors. Fix the root of the problem.

Now we know you're not an Intel/AMD sales rep. ;)


Judah Gabriel Himango (#8)

Not by much for the current Visual Studio. As I understand it, the build process is very constrained when it comes to using multiple threads, and without that you're not gonna see much of an improvement on multi-core processors. I watched a video about the Concurrency and Coordination Runtime (CCR) on the Channel9 MSDN website, and they said one of the guys on the MSBuild team used the CCR to achieve a 1:1 speedup per processor core; he got roughly 4x faster build times on a quad-core processor by utilizing the CCR in MSBuild. I'm hoping we'll see those improvements in VS Orcas.


J Dunlap (#9)

Judah Himango wrote:

I watched a video about the Concurrency and Coordination Runtime (CCR) on the Channel9 MSDN website, and they said one of the guys on the MSBuild team used the CCR to achieve a 1:1 speedup per processor core; he got roughly 4x faster build times on a quad-core processor by utilizing the CCR in MSBuild. I'm hoping we'll see those improvements in VS Orcas.

Yeah, but the IDE will be so much slower that you won't notice the difference... ;-P


Taka Muraoka (#10)

I'm thinking about doing exactly this kind of thing (with VMware Workstation), but do you find much of a slowdown? Any other issues? I've found sometimes that if I have to bounce my machine because of problems, without shutting down VMware properly, the VMs sometimes forget recent activity. While this may just be a consequence of turning off the VM without shutting down Windows first, I'm a bit suspicious and a bit reluctant to put everything into VMs because of it.



El Corazon (#11)

Taka Muraoka wrote:

While this may be just a consequence of turning off the VM without shutting down Windows first

If Windows doesn't survive this on a REAL machine, what makes people think that it will somehow magically survive it on a VM? Always power down or suspend your VM, just as you would shut down a real machine before unplugging it. Think of your host machine as the AC system: when you power down the master breaker, you had better have your hardware turned off already. Similarly, when you power down your computer you should already have taken care of your VM.


Judah Gabriel Himango (#12)

:laugh: Seriously though, I think that now that CPU clock speed has topped out, only software that utilizes parallel processing well (i.e. multiple threads in the right places), and does so scalably so that locks aren't killing the gains of parallel execution, will keep getting faster over time. It's nice to see that MS is investing a lot of research in this area (the CCR, software transactional memory, and COmega, to name a few), especially given that the current multithreading story is very hairy and difficult to get right.
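A small sketch, mine rather than anything from the CCR video Judah mentions, of what "scalable" means in practice: each thread gets a disjoint slice of the work and its own accumulator, so the hot loop takes no locks and the partial results are merged once at the end. It uses std::thread from modern C++ for brevity; in the VS2005 era you would reach for Win32 threads or OpenMP instead.

```cpp
#include <algorithm>
#include <cstddef>
#include <iostream>
#include <numeric>
#include <thread>
#include <vector>

// Sum 'data' with 'workers' threads. Each thread owns a disjoint slice and
// writes only its own slot in 'partial', so no lock is needed in the loop.
long long parallel_sum(const std::vector<int>& data, unsigned workers) {
    std::vector<long long> partial(workers, 0);
    std::vector<std::thread> threads;
    const std::size_t chunk = (data.size() + workers - 1) / workers;

    for (unsigned w = 0; w < workers; ++w) {
        threads.emplace_back([&, w] {
            const std::size_t begin = std::min<std::size_t>(w * chunk, data.size());
            const std::size_t end   = std::min<std::size_t>(begin + chunk, data.size());
            long long local = 0;                 // thread-private accumulator
            for (std::size_t i = begin; i < end; ++i)
                local += data[i];
            partial[w] = local;                  // one uncontended write per thread
        });
    }
    for (auto& t : threads) t.join();
    return std::accumulate(partial.begin(), partial.end(), 0LL);
}

int main() {
    const unsigned workers = std::max(1u, std::thread::hardware_concurrency());
    std::vector<int> data(1000000, 1);
    std::cout << parallel_sum(data, workers) << '\n';   // prints 1000000
}
```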


Jun Du (#13)

When I changed to dual CPUs at 3.21 GHz, I didn't find a significant cut in compile time. It's not surprising when you check Task Manager: the compiler uses only one CPU most of the time.

                            Best, Jun


Taka Muraoka (#14)

Jeffry J. Brickley wrote:

If Windows doesn't survive this on a REAL machine, what makes people think that it will somehow magically survive it on a VM? Always power down or suspend your VM, just as you would shut down a real machine before unplugging it.

Sometimes I can't. I run a virtual desktop manager, and occasionally another program goes wonky and stops me from switching desktops, which means I can't get at the VMware window to shut it down :doh: Because of this it's hard to be sure, but I get the nagging feeling that things that happened recently on the VM get forgotten that wouldn't have been forgotten on a real PC. For example, changes to a file would get committed to disk almost immediately on a real PC, but I'm sure I've lost them by dropping a VM long after actually making the changes. I'm thinking maybe VMware is caching disk activity itself and not getting a chance to commit the changes. If I were hitting the big red button on the host machine that would be understandable, but after a clean shutdown of the host machine you'd think VMware would have the sense to flush its caches before exiting. It would be much nicer if VMware suspended your VMs rather than just powering them off :rolleyes:



El Corazon (#15)

Taka Muraoka wrote:

I'm thinking maybe VMware is caching disk activity itself and not getting a chance to commit the changes.

Parallels doesn't. ;P There is good with the bad... a virtual disk heavily cached in memory will run much faster than a virtual disk represented by a single image file. I just peeked at the Parallels options:

VM shutdown behavior - default action to perform on application exit:
[X] Suspend VM
[ ] Power Off
[ ] Ask me what to do

Obviously there will be cases where the host machine is so locked up that the VM can't be powered off gracefully. As with any hardware machine, the more in your cache, the greater the danger of corruption. You play Russian roulette with every occurrence; any one of them may or may not corrupt things, but given enough chances... well, we all know the result.


Taka Muraoka (#16)

Jeffry J. Brickley wrote:

a virtual disk heavily cached in memory will run much faster than a virtual disk represented by a single image file.

VMware has an option to use a real disk instead of a virtualized one, although they don't seem to have a lot of faith in the feature. One would assume it would run at nearly full speed. I was looking at a Mac laptop, but they're way overpriced for what you get and not highly enough spec'ed for what I want. I wasn't really keen to spend all that money to get a Mac just to run Windows on it :-) Maybe I should just run Linux on the host ;P



El Corazon (#17)

Taka Muraoka wrote:

Maybe I should just run Linux on the host

Go for it! My only virtual Windows is the work edition of Windows... talk about keeping work and home separate: now I have all my work on a VM that I take here and there. :)


achillepaoloni (#18)

In fact your 2 GB of RAM is very good, but for compilation you should check whether all of that RAM is really available (or how much remains): verify it. Your CPU is hyper-threaded, which could help your compile time. I have a dual-core machine, a D830 model, and with a sample C++ app with several files (though not over 500 like yours), compared with my previous Opteron 142 it is noticeably faster. Look at other variables as well: 1) memory type (DDR2 is faster) and dual-channel configuration, 2) CPU clock, 3) first- and second-level cache, 4) chipset. If you always work with this kind of application, with over 500 files, you could consider not just a dual-core but a dual-processor motherboard and 4 GB of RAM. Check the latest processors and benchmarks. Achille


Dan Berger (#19)

With something like IncrediBuild (http://www.xoreax.com) you'll probably get a pretty decent speedup on your C++ compiles (assuming you have enough memory to support a couple of compiler processes on your machine). It mostly uses network distribution to speed up compiles, but it can also take advantage of multiple cores. With plain Visual Studio 2005 you'll also get some improvement: VS2005 can build two different configurations in parallel on a dual-core machine. The downside is that each project compiles on only one CPU, so if you have one big project it won't get any speedup. - Dan


pg az (#20)

Several months back I decided to go to the trouble of creating my own statically-linked libraries for common stuff. I got maybe a factor of two from that, and a factor of two on 20 minutes is good!
