Considering starting a software project - anything like this exist?

The Lounge · Tags: csharp, visual-studio, c++, com, help · 62 Posts, 25 Posters
  • J Judah Gabriel Himango

    Jörgen Sigvardsson wrote:

    How much C# code do you have really?

    Last I checked, 270,000 LOC. It's probably closer to 300k now.

    Jörgen Sigvardsson wrote:

    I do know that the C# code compiles blazingly fast though.

    It does. The compiling isn't really the issue. You do a compile, it finishes near instantly, then the disk churns as it spits out assemblies and dependencies. Then dependent projects compile, spit out assemblies and dependencies, etc., until all changed/dependent-on-changed projects are rebuilt. This takes significant time when done locally. One thing I've found is that if you have fewer projects but larger assemblies, build times decrease. However, often you can't combine projects: separate assemblies for client and server, separate assemblies for shared data between client and server, utility libraries, add-in projects that require their own assemblies, independently maintained libraries (e.g. a .NET project including the Lucene.NET open source search engine), and so on. What I'm saying is, you often cannot combine many projects. This causes build/run times to slow way down.

    Religiously blogging on the intarwebs since the early 21st century: Kineti L'Tziyon Judah Himango

    Jorgen Sigvardsson replied (#29):

    I understand your predicament. Basically, it's the overhead of not compiling (shuffling stuff into place) that increases build time. I think you would have to partition the solution in such a way that the distribution nodes can work independently of each other. This may be hard if the projects are tightly coupled to each other. Perhaps you could find a subset of projects that you could build on a single node, and then parallelize the resulting forest of projects over several nodes. Post some updates here in the Lounge if you come up with something :)
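
    As a rough sketch of that partitioning idea (the project names and graph below are invented, not the actual solution), the forest could be cut into build "waves": every project whose dependencies are already built goes into the current wave, and the projects within a wave can then be handed to different build nodes.

    using System;
    using System.Collections.Generic;
    using System.Linq;

    class BuildWavePlanner
    {
        // Group projects into waves; projects in the same wave do not depend on
        // each other, so each wave can be spread across several build nodes.
        static List<List<string>> PlanWaves(Dictionary<string, List<string>> dependsOn)
        {
            var remaining = new HashSet<string>(dependsOn.Keys);
            var built = new HashSet<string>();
            var waves = new List<List<string>>();

            while (remaining.Count > 0)
            {
                // Ready = every dependency is either built already or external to the solution.
                var wave = remaining
                    .Where(p => dependsOn[p].All(d => built.Contains(d) || !dependsOn.ContainsKey(d)))
                    .ToList();
                if (wave.Count == 0)
                    throw new InvalidOperationException("Cyclic project references");

                waves.Add(wave);
                foreach (var p in wave) { remaining.Remove(p); built.Add(p); }
            }
            return waves;
        }

        static void Main()
        {
            // Hypothetical solution layout, purely for illustration.
            var dependsOn = new Dictionary<string, List<string>>
            {
                { "Shared", new List<string>() },
                { "Client", new List<string> { "Shared" } },
                { "Server", new List<string> { "Shared" } },
                { "Client.Tests", new List<string> { "Client" } },
                { "Server.Tests", new List<string> { "Server" } }
            };

            var waves = PlanWaves(dependsOn);
            for (int i = 0; i < waves.Count; i++)
                Console.WriteLine("Wave " + i + ": " + string.Join(", ", waves[i].ToArray()));
            // Wave 0: Shared
            // Wave 1: Client, Server              <- two nodes can build these at once
            // Wave 2: Client.Tests, Server.Tests
        }
    }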

    • J Judah Gabriel Himango

      Jörgen Sigvardsson wrote:

      How much C# code do you have really?

      Last I checked, 270,000 LOC. It's probably closer to 300k now.

      Jörgen Sigvardsson wrote:

      I do know that the C# code compiles blazingly fast though.

      It does. The compiling isn't really the issue. You do a compile, it finishes near instantly, then the disk churns as it spits out assemblies and dependencies. Then dependent projects compile, spit out assemblies and dependencies, etc., until all changed/dependent-on-changed projects are rebuilt. This takes significant time when done locally. One thing I've found is that if you have fewer projects but larger assemblies, build times decrease. However, often you can't combine projects: separate assemblies for client and server, separate assemblies for shared data between client and server, utility libraries, add-in projects that require their own assemblies, independently maintained libraries (e.g. a .NET project including the Lucene.NET open source search engine), and so on. What I'm saying is, you often cannot combine many projects. This causes build/run times to slow way down.

      Religiously blogging on the intarwebs since the early 21st century: Kineti L'Tziyon Judah Himango

      Rob Graham replied (#30):

      If you really have a lot of assemblies, particularly ones that have dependencies on each other, and you are not compiling to a common directory and marking the references "Copy Local = false", then you are wasting a lot of time duplicating references into each project's build directory. File I/O is eating you alive, not to mention wasting a lot of disk real estate. Switching to a common output folder, and making sure references aren't copied, could save a lot of time. As a test, write a little utility to edit all the project files to fix up the references and change the output directory, and then try a build. Surely your project files are in some source control system, so if it doesn't help enough, no harm done, just revert all the projects.
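
      For anyone reading along, a rough sketch of such a throwaway utility is below. The shared output path and the old-style (MSBuild 2003 schema) .csproj layout are assumptions; keep the projects under source control so the change is easy to revert, as suggested above.

      using System;
      using System.IO;
      using System.Xml.Linq;

      // Sketch only: point every configuration's OutputPath at one shared folder
      // and mark references as Copy Local = false by writing <Private>False</Private>.
      class FixProjectOutputs
      {
          static readonly XNamespace Ns = "http://schemas.microsoft.com/developer/msbuild/2003";

          static void Main(string[] args)
          {
              string root = args.Length > 0 ? args[0] : ".";
              string commonOutput = @"..\..\build\bin\";   // assumed shared output folder

              foreach (string path in Directory.GetFiles(root, "*.csproj", SearchOption.AllDirectories))
              {
                  XDocument proj = XDocument.Load(path);

                  // Redirect every <OutputPath> (Debug, Release, ...) to the common folder.
                  foreach (XElement output in proj.Descendants(Ns + "OutputPath"))
                      output.Value = commonOutput;

                  // <Private>False</Private> is the project-file spelling of Copy Local = false.
                  foreach (XElement reference in proj.Descendants(Ns + "Reference"))
                      reference.SetElementValue(Ns + "Private", "False");
                  foreach (XElement projectRef in proj.Descendants(Ns + "ProjectReference"))
                      projectRef.SetElementValue(Ns + "Private", "False");

                  proj.Save(path);
                  Console.WriteLine("Patched " + path);
              }
          }
      }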

      • J Judah Gabriel Himango

        This project would do .NET compilations - C#, VB.NET, F#, etc.

        Religiously blogging on the intarwebs since the early 21st century: Kineti L'Tziyon Judah Himango

        Rick York replied (#31):

        Yes, and it seems that the actual compiling is not your bottleneck and that is what the include file thing addresses.

        • R Rob Graham

          If you really have a lot of assemblies, particularly ones that have dependencies on each other, and you are not compiling to a common directory and marking the references "Copy Local = false", then you are wasting a lot of time duplicating references into each project's build directory. File I/O is eating you alive, not to mention wasting a lot of disk real estate. Switching to a common output folder, and making sure references aren't copied, could save a lot of time. As a test, write a little utility to edit all the project files to fix up the references and change the output directory, and then try a build. Surely your project files are in some source control system, so if it doesn't help enough, no harm done, just revert all the projects.

          Judah Gabriel Himango replied (#32):

          Yep, we know. We figure that's a big source of the build time. We're going to do a project reorganization next week that will combine many assemblies into fewer, larger assemblies. (Nonetheless, there are some projects that cannot be merged.) I'll additionally push for a common output directory as you suggest. Still worth pursuing a distributed build system for .NET projects?

          Religiously blogging on the intarwebs since the early 21st century: Kineti L'Tziyon Judah Himango

          • J Judah Gabriel Himango

            Driven by really slow build times in our .NET project here at work, and slow build times on the projects I've done consulting for, I'm considering building a Visual Studio add-in + web service that would allow you to perform a distributed build for .NET projects. Performing a build would go like this:
            - You click Build->Distributed Build in Visual Studio.
            - The VS add-in detects which files have changed since the last build.
            - It sends only the changed bytes of those files, compressed, to the web service.
            - The web service, having previously analyzed your project dependence hierarchies, figures out which projects need to be rebuilt.
            - It spreads the builds across multiple machines, each building some independent project(s).
            Then:
            - If you're just trying to build the software, it sends back the build result: success or error messages.
            - If you're trying to actually run the software, it sends only the changed bytes of the assemblies, compressed, back to the VS add-in.
            I'm thinking I could get super fast build times doing this, much faster than doing a build locally. Obviously some proof-of-concept is in order. Two questions for you CPians:
            - Are there any existing distributed build systems for .NET? I see lots for C/C++, but can't find any for .NET projects.
            - Is this a worthwhile project? .NET projects generally build fast... until you have a large solution with lots of projects.

            Religiously blogging on the intarwebs since the early 21st century: Kineti L'Tziyon Judah Himango

            Daniel Grunwald replied (#33):

            Are you using WPF? If so, you're probably aware of the issue that cyclic references between XAML and C# cause the project to be compiled twice. Unfortunately such cyclic references are usually unavoidable. Recently I've been trying to figure out a way to do XAML compilation without the second C# compile run, instead injecting the BAML into the compiled assembly. I've been able to make room inside the Resources segment of an assembly (which means updating a lot of RVAs that point to stuff after the Resources segment); and indeed assemblies "enlarged" this way run on .NET; but for some reason PEVerify doesn't like them and I don't know why. Still, for local development it might be interesting to use this approach, falling back to the double compile run only for release builds. I'll try to continue working on this...

            • J Judah Gabriel Himango

              Yep, we know. We figure that's a big source of the build time. We're going to do a project reorganization next week that will combine many assemblies into fewer, larger assemblies. (Nonetheless, there are some projects that cannot be merged.) I'll additionally push for a common output directory as you suggest. Still worth pursuing a distributed build system for .NET projects?

              Religiously blogging on the intarwebs since the early 21st century: Kineti L'Tziyon Judah Himango

              Rob Graham replied (#34):

              Judah Himango wrote:

              Still worth pursuing a distributed build system for .NET projects?

              I don't see how you could efficiently resolve dependency issues - you might end up losing just as much time shuttling/copying dependent assemblies around. An SSD for the output directory would probably help more, and possibly cost about the same as deploying a distributed build system.

              • D Daniel Grunwald

                Are you using WPF? If so, you're probably aware of the issue that cyclic references between XAML and C# cause the project to be compiled twice. Unfortunately such cyclic references are usually unavoidable. Recently I've been trying to figure out a way to do XAML compilation without the second C# compile run, instead injecting the BAML into the compiled assembly. I've been able to make room inside the Resources segment of an assembly (which means updating a lot of RVAs that point to stuff after the Resources segment); and indeed assemblies "enlarged" this way run on .NET; but for some reason PEVerify doesn't like them and I don't know why. Still, for local development it might be interesting to use this approach, falling back to the double compile run only for release builds. I'll try to continue working on this...

                Judah Gabriel Himango replied (#35):

                Daniel Grunwald wrote:

                Are you using WPF?

                Not much. We're mostly WinForms on the client, but we're integrating with some WPF components now.

                Religiously blogging on the intarwebs since the early 21st century: Kineti L'Tziyon Judah Himango

                • R Rob Graham

                  Judah Himango wrote:

                  Still worth pursuing a distributed build system for .NET projects?

                  I don't see how you could efficiently resolve dependency issues - you might end up losing just as much time shuttling/copying dependent assemblies around. An SSD for the output directory would probably help more, and possibly cost about the same as deploying a distributed build system.

                  Judah Gabriel Himango replied (#36):

                  Rob Graham wrote:

                  I don't see how you could efficiently resolve dependency issues

                  It will be tough. I'll see what can be done. I do plan on having 10k RPM HDs or SSDs in a RAID on the build machines. So even if the projects don't have many or any independent bits of work to be done, we can still at least do faster builds than on a typical dev machine.

                  Religiously blogging on the intarwebs since the early 21st century: Kineti L'Tziyon Judah Himango

                  • J Judah Gabriel Himango

                    Rob Graham wrote:

                    I don't see how you could efficiently resolve dependency issues

                    It will be tough. I'll see what can be done. I do plan on having 10k RPM HDs or SSDs in a RAID on the build machines. So even if the projects don't have many or any independent bits of work to be done, we can still at least do faster builds than on a typical dev machine.

                    Religiously blogging on the intarwebs since the early 21st century: Kineti L'Tziyon Judah Himango

                    Rob Graham replied (#37):

                    How will you arbitrate the builds - that is, who gets to build and who doesn't?

                    • R Rob Graham

                      How will you arbitrate the builds - that is, who gets to build and who doesn't?

                      Judah Gabriel Himango replied (#38):

                      Which projects get to build, you're asking? On the client (VS add-in) side, I'd monitor changed source files. When the server (web service) gets these changes, it figures out which projects need to be built based on the project dependence hierarchy tree generated on the server during the one-time initialization of the project.

                      Religiously blogging on the intarwebs since the early 21st century: Kineti L'Tziyon Judah Himango
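
                       For what it's worth, a minimal sketch of that server-side decision (with invented project names, and assuming the dependency graph was captured during that one-time initialization) is just a reverse-dependency walk:

                       using System;
                       using System.Collections.Generic;
                       using System.Linq;

                       class RebuildSet
                       {
                           // project -> projects it references; invert it once, then walk downstream
                           // from the projects that own the changed files.
                           static HashSet<string> ProjectsToRebuild(
                               Dictionary<string, List<string>> dependsOn,
                               IEnumerable<string> changedProjects)
                           {
                               var dependents = new Dictionary<string, List<string>>();
                               foreach (var entry in dependsOn)
                                   foreach (var dep in entry.Value)
                                   {
                                       List<string> users;
                                       if (!dependents.TryGetValue(dep, out users))
                                           dependents[dep] = users = new List<string>();
                                       users.Add(entry.Key);
                                   }

                               // Breadth-first walk: anything downstream of a changed project is dirty.
                               var dirty = new HashSet<string>();
                               var queue = new Queue<string>(changedProjects);
                               while (queue.Count > 0)
                               {
                                   string project = queue.Dequeue();
                                   if (!dirty.Add(project)) continue;   // already marked
                                   List<string> downstream;
                                   if (dependents.TryGetValue(project, out downstream))
                                       foreach (string user in downstream)
                                           queue.Enqueue(user);
                               }
                               return dirty;
                           }

                           static void Main()
                           {
                               var dependsOn = new Dictionary<string, List<string>>
                               {
                                   { "Shared", new List<string>() },
                                   { "Client", new List<string> { "Shared" } },
                                   { "Server", new List<string> { "Shared" } }
                               };
                               // A changed file in Shared dirties Shared, Client and Server.
                               Console.WriteLine(string.Join(", ", ProjectsToRebuild(dependsOn, new[] { "Shared" }).ToArray()));
                           }
                       }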

                      • J Judah Gabriel Himango

                        Rocky Moore wrote:

                        Sure must be some kind of application. If you often have to rebuild so many projects that it takes that long to build, perhaps more thought needs to go into the dependencies to make them not quite so tight. Independence can be a good thing

                        Agreed. And we are doing some project-combining next week to make fewer but larger assemblies. Nonetheless, we have unit test projects that need to be separate assemblies, we have client projects separate from server projects (these also cannot be combined), add-in framework assemblies which cannot be combined, shared libraries between client and server, and some server-side components that deliberately are separated from the server assembly so they can be run on a separate machine, and so on. What I'm trying to say is, on larger projects, solutions tend to have many projects, some of which cannot be combined, yet many could be built in parallel.

                        Rocky Moore wrote:

                        I do think the idea of a simple distributed build is a good idea though. It is amazing that this functionality was not built in years ago.

                        Cool, thanks, that's encouraging. Alright, I will pursue this idea further.

                        Religiously blogging on the intarwebs since the early 21st century: Kineti L'Tziyon Judah Himango

                        Rocky Moore replied (#39):

                        Judah Himango wrote:

                        What I'm trying to say is, on larger projects, solutions tend to have many projects, some of which cannot be combined, yet many could be built in parallel.

                        Yeah, I was not talking as much about combining them, but rather about how the different assemblies interact, so that they can be more loosely coupled. I understand that it depends on the application.

                        Rocky <>< Recent Blog Post: Chocolate Chip Cookies!

                        • J Judah Gabriel Himango

                          Yeah, I'm aware of that. I know .NET doesn't do things this way; the best parallelism we can get is having MSBuild build each project using multiple cores. There isn't file-level parallelism. However, in my anecdotal experience, many .NET solutions have several projects that can be built independently. Potential speed up there. And while MSBuild supports building projects in parallel to a degree, the local disk becomes a bottleneck. By spreading the work across machines, I theoretically could save significant build times.

                          Religiously blogging on the intarwebs since the early 21st century: Kineti L'Tziyon Judah Himango

                          modified on Tuesday, September 15, 2009 12:27 PM

                          Adriaan Davel replied (#40):

                          I have found that running RAID 0 on the source code folders (and bin folders) has a very noticeable impact on build times... The HDD is a huge bottleneck, especially when running top-end CPUs and plenty of RAM. I haven't played with SSDs yet, but RAID 0 works very well. Sounds like a very interesting project though, one worth doing as a "learning curve" if you have the time, even if it isn't hugely successful...

                          ____________________________________________________________ Be brave little warrior, be VERY brave

                          • B Brady Kelly

                            I have never worked on a project that took more than a minute to build. Where have I been all my life?

                            I have been trying for weeks to get this little site indexed. If you wonder what it is, or would like some informal accommodation for the 2010 World Cup, please click on this link for Rhino Cottages.

                            dot_sent replied (#41):

                            I'm now working on a project which requires ~15 minutes to build on my machine. Rather inconvenient, I would say.

                            • J Judah Gabriel Himango

                              Our current project here at work takes about 2:30 to build. Even 1:00 is insane, considering we do many builds per day. Make a change to some low-level component, then suddenly all these dependent projects need to build, and you go on a coffee break as VS thrashes your disk for a while. The idea is to make even large solutions (30-50 projects) build in seconds, rather than minutes. The idea is to eliminate coffee breaks. :)

                              Religiously blogging on the intarwebs since the early 21st century: Kineti L'Tziyon Judah Himango

                              Filip C replied (#42):

                              NEVER EVER eliminate coffee breaks! :omg:

                              • J Judah Gabriel Himango

                                Our current project here at work takes about 2:30 to build. Even 1:00 is insane, considering we do many builds per day. Make a change to some low-level component, then suddenly all these dependent projects need to build, and you go on a coffee break as VS thrashes your disk for a while. The idea is to make even large solutions (30-50 projects) build in seconds, rather than minutes. The idea is to eliminate coffee breaks. :)

                                Religiously blogging on the intarwebs since the early 21st century: Kineti L'Tziyon Judah Himango

                                dojohansen replied (#43):

                                Judah Himango wrote:

                                Even 1:00 is insane, considering we do many builds per day.

                                 Or perhaps what is "insane" is the number of builds you do per day. What purpose does that serve? I don't know how VS does its "code analysis" without building on the fly in the background, but anyway it does for the most part do a very good job of immediately indicating invalid parameters, members that don't exist, whether or not a type is recognized (missing the using directive?), and so on. So it seems to me the only time you really need to build is when you need to *run* the code. On the other hand, VS sometimes wants to build stuff on its own initiative, and often stupidly. For example, if I rename a private method in a "normal" (non-partial) class in the business layer, it sets off to rebuild the web project... I find, however, that simply using search-and-replace takes care of those cases. The few times it is desirable to rename more accessible members are slow, but it's costing me a few minutes a month. I "lose" *far* more time to CodeProject and coffee breaks than to the compiler. When debugging you may of course want to change something and go again. But for that we have edit-and-continue, which is nearly instantaneous. Admittedly it doesn't work in web applications, but there are other ways to work around that. In ours I wrote a little console where you can log on to the system the same way you do when using the web UI, and then issue commands to load classes and call methods on them. When our business layer (in class libraries) is run in the context of the console app, edit-and-continue does work, which simplifies life for the developer. And it can of course also be used as a crude framework for unit testing (by writing test classes and console scripts that load them and execute test methods). All of that having been said, it can never build too quickly, so I'm sure a lot of people would be interested in any tool that boosted build times 100x. In any event, a programmer's job is, in my view, more about thinking than about typing the code. So there's little reason to stop working because you're building.
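
                                 A bare-bones sketch of the kind of console harness described here (assembly path, type and method names are invented; it assumes the business layer lives in ordinary class libraries and the method takes no parameters):

                                 using System;
                                 using System.Reflection;

                                 class BusinessConsole
                                 {
                                     // Hypothetical usage:
                                     //   BusinessConsole.exe MyCompany.Business.dll MyCompany.Business.OrderService PlaceTestOrder
                                     static void Main(string[] args)
                                     {
                                         Assembly assembly = Assembly.LoadFrom(args[0]);   // the business-layer class library
                                         Type type = assembly.GetType(args[1], true);      // throws if the type name is wrong
                                         object instance = Activator.CreateInstance(type);

                                         MethodInfo method = type.GetMethod(args[2], BindingFlags.Public | BindingFlags.Instance);
                                         object result = method.Invoke(instance, null);    // assumes a parameterless method

                                         Console.WriteLine(result ?? "(no return value)");
                                     }
                                 }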

                                • D dojohansen

                                  Judah Himango wrote:

                                  Even 1:00 is insane, considering we do many builds per day.

                                   Or perhaps what is "insane" is the number of builds you do per day. What purpose does that serve? I don't know how VS does its "code analysis" without building on the fly in the background, but anyway it does for the most part do a very good job of immediately indicating invalid parameters, members that don't exist, whether or not a type is recognized (missing the using directive?), and so on. So it seems to me the only time you really need to build is when you need to *run* the code. On the other hand, VS sometimes wants to build stuff on its own initiative, and often stupidly. For example, if I rename a private method in a "normal" (non-partial) class in the business layer, it sets off to rebuild the web project... I find, however, that simply using search-and-replace takes care of those cases. The few times it is desirable to rename more accessible members are slow, but it's costing me a few minutes a month. I "lose" *far* more time to CodeProject and coffee breaks than to the compiler. When debugging you may of course want to change something and go again. But for that we have edit-and-continue, which is nearly instantaneous. Admittedly it doesn't work in web applications, but there are other ways to work around that. In ours I wrote a little console where you can log on to the system the same way you do when using the web UI, and then issue commands to load classes and call methods on them. When our business layer (in class libraries) is run in the context of the console app, edit-and-continue does work, which simplifies life for the developer. And it can of course also be used as a crude framework for unit testing (by writing test classes and console scripts that load them and execute test methods). All of that having been said, it can never build too quickly, so I'm sure a lot of people would be interested in any tool that boosted build times 100x. In any event, a programmer's job is, in my view, more about thinking than about typing the code. So there's little reason to stop working because you're building.

                                  dojohansen replied (#44):

                                  By the way "VS" in the above refers to VS-2008.

                                  • J Judah Gabriel Himango

                                    Yeah, I'm aware of that. I know .NET doesn't do things this way; the best parallelism we can get is having MSBuild build each project using multiple cores. There isn't file-level parallelism. However, in my anecdotal experience, many .NET solutions have several projects that can be built independently. Potential speed up there. And while MSBuild supports building projects in parallel to a degree, the local disk becomes a bottleneck. By spreading the work across machines, I theoretically could save significant build times.

                                    Religiously blogging on the intarwebs since the early 21st century: Kineti L'Tziyon Judah Himango

                                    modified on Tuesday, September 15, 2009 12:27 PM

                                    Stephen Hewitt replied (#45):

                                    Yeah. I've no trouble accepting that some gain could be achieved.

                                    Steve

                                    • J Judah Gabriel Himango

                                      Our current project here at work takes about 2:30 to build. Even 1:00 is insane, considering we do many builds per day. Make a change to some low-level component, then suddenly all these dependent projects need to build, and you go on a coffee break as VS thrashes your disk for a while. The idea is to make even large solutions (30-50 projects) build in seconds, rather than minutes. The idea is to eliminate coffee breaks. :)

                                      Religiously blogging on the intarwebs since the early 21st century: Kineti L'Tziyon Judah Himango

                                      Stephen Hewitt replied (#46):

                                      Beware. Quick build time means less solitaire.

                                      Steve

                                      • B Brady Kelly

                                        I have never worked on a project that took more than a minute to build. Where have I been all my life?

                                        I have been trying for weeks to get this little site indexed. If you wonder what it is, or would like some informal accommodation for the 2010 World Cup, please click on this link for Rhino Cottages.

                                        Gary Wheeler replied (#47):

                                        Lucky you. I've been writing control software for commercial ink-jet printers for nearly 20 years now. Our initial hardware environment was MS-DOS 3.3 on a 16MHz 386-SX processor with 4MB of RAM and a 200MB hard drive. Our development tools were Watcom C 9.0 and Microsoft C 6.0. Build time for the product was approximately 40-50 minutes, from the time it started to the time the 3 archive floppies were written. Our current build machine is a 4 dual-core processor beast with 16G of RAM and 2TB of disk in a RAID 5, running Windows 2003 Server. Our development tools are Visual Studio .NET 2003 (legacy stuff) and Visual Studio 2008 (the Big New Thing™). Build time for the products ranges from 45 minutes to 75 minutes, from the time they start to the time the archive DVD is written. At least we're consistent... :rolleyes:

                                        Software Zen: delete this;

                                        • J Judah Gabriel Himango

                                          Driven by really slow build times in our .NET project here at work, and slow build times on the projects I've done consulting for, I'm considering building a Visual Studio add-in + web service that would allow you to perform a distributed build for .NET projects. Performing a build would go like this:
                                          - You click Build->Distributed Build in Visual Studio.
                                          - The VS add-in detects which files have changed since the last build.
                                          - It sends only the changed bytes of those files, compressed, to the web service.
                                          - The web service, having previously analyzed your project dependence hierarchies, figures out which projects need to be rebuilt.
                                          - It spreads the builds across multiple machines, each building some independent project(s).
                                          Then:
                                          - If you're just trying to build the software, it sends back the build result: success or error messages.
                                          - If you're trying to actually run the software, it sends only the changed bytes of the assemblies, compressed, back to the VS add-in.
                                          I'm thinking I could get super fast build times doing this, much faster than doing a build locally. Obviously some proof-of-concept is in order. Two questions for you CPians:
                                          - Are there any existing distributed build systems for .NET? I see lots for C/C++, but can't find any for .NET projects.
                                          - Is this a worthwhile project? .NET projects generally build fast... until you have a large solution with lots of projects.

                                          Religiously blogging on the intarwebs since the early 21st century: Kineti L'Tziyon Judah Himango

                                          KramII replied (#48):

                                          The project sounds interesting, but... Considering the man-hour cost of writing your software, I honestly wonder if you wouldn't be better off putting your money into better (faster) hardware?

                                          KramII
