Code Project
Future Real-Time Development?

Forum: .NET (Core and Framework)
Tags: csharp, c++, question, html, visual-studio
3 Posts, 2 Posters
#1 BrianEllis wrote:
    Thanks in advance for any assistance. My question is really a general .NET question, but since I'm a C#/C++ person and C# is the flagship .NET language, I'm taking a chance posting here.

    I work for a government contractor, and much of what we handle is real-time: data collection, analysis, derived data, and so on, all coming down the pipe in real time. I am a HUGE .NET fan and advocate; I went to training for C# and use it whenever I can. But whenever I run into something real-time, I always end up going back to VC++ (probably developed in the .NET IDE, but not managed code).

    Example: I had to rewrite a program that took real-time data from a main program, calculated derived data from the input, then passed the original data along with the derived data back into the main program. The program was much simpler to write in C# using the CodeDOM, but it hit us with 60-100% CPU usage once we fed it real data. After the rewrite (in VC++ 6 and MFC), there is virtually no limit on the speed at which we can process samples, and CPU usage is back to 3-6%.

    I have heard the same stories and worries from people at NASA, GM, and other groups. In other words, what happens to all of the real-time applications when Longhorn, Indigo, Yukon, and the rest make managed code the only game in town and real-time is not possible? I know Microsoft has said it is eventually moving to an all-managed-code scheme. That would leave a lot of organizations, including a huge part of the U.S. government, looking for new solutions. :((
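    The pump Brian describes, in its unmanaged C++ form, might look something like this minimal sketch. All names here (`Sample`, `derive`, `pump`) are illustrative assumptions, not code from the actual program; the point is that the per-sample path does no allocation, which is a large part of why the rewrite sits at a few percent CPU:

    ```cpp
    #include <cstdio>
    #include <vector>

    // Hypothetical sample record: raw value in, derived value filled by
    // the processing stage before both are passed back to the main program.
    struct Sample {
        double raw;
        double derived;
    };

    // Stand-in for the engineer-supplied derived calculation. In the real
    // system this is loaded once at startup (a CodeDOM-compiled assembly
    // in the C# version, a DLL in the C++ rewrite).
    double derive(double raw) { return raw * 2.0 + 1.0; }

    // One pass of the pump: pull each sample, compute its derived value,
    // leave the batch ready to push back on schedule. No allocation and
    // no copies inside the loop.
    void pump(std::vector<Sample>& batch) {
        for (Sample& s : batch) {
            s.derived = derive(s.raw);
        }
    }

    int main() {
        std::vector<Sample> batch = {{1.0, 0.0}, {2.0, 0.0}};
        pump(batch);
        std::printf("%.1f %.1f\n", batch[0].derived, batch[1].derived);
        return 0;
    }
    ```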

#2 Mike Dimmick wrote (replying to BrianEllis):
      Hang on, what were you using CodeDOM for? It's for generating code at run time. I have to defer to Rico Mariani: know what things cost. It sounds a little like you may have used some 'cool' techniques that were too slow for what you needed.

      .NET isn't really suitable for truly real-time systems, because you can never predict when a garbage collection will occur, nor how long it will take.

      You'll still be able to write and use unmanaged C++ programs on Longhorn, IIRC. The new Avalon GUI system is .NET-only, but your data collection and manipulation code is in a separate process already, right? You can keep that architecture, using some form of communication channel or shared memory to talk to the presentation process if necessary. You can also have hybrid managed/unmanaged processes, with some threads managed and others unmanaged, if you wish.
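      The channel Mike suggests between a real-time producer and a presentation consumer is often built as a lock-free single-producer/single-consumer ring buffer. This is a sketch of the idea, not anything from the thread: a real cross-process version would place the structure in named shared memory (e.g. `CreateFileMapping`/`MapViewOfFile` on Windows), but the indexing and memory-ordering logic is the same:

      ```cpp
      #include <array>
      #include <atomic>
      #include <cstddef>
      #include <cstdio>

      // Single-producer/single-consumer ring buffer: the real-time side
      // calls push(), the presentation side calls pop(). No locks, so the
      // producer is never blocked by a slow (or garbage-collecting) consumer.
      template <std::size_t N>
      struct RingBuffer {
          std::array<double, N> slots{};
          std::atomic<std::size_t> head{0};  // next slot the producer writes
          std::atomic<std::size_t> tail{0};  // next slot the consumer reads

          bool push(double v) {
              std::size_t h = head.load(std::memory_order_relaxed);
              std::size_t next = (h + 1) % N;
              if (next == tail.load(std::memory_order_acquire))
                  return false;  // full: drop or retry, but never block
              slots[h] = v;
              head.store(next, std::memory_order_release);
              return true;
          }

          bool pop(double& out) {
              std::size_t t = tail.load(std::memory_order_relaxed);
              if (t == head.load(std::memory_order_acquire))
                  return false;  // empty
              out = slots[t];
              tail.store((t + 1) % N, std::memory_order_release);
              return true;
          }
      };

      int main() {
          RingBuffer<8> ch;
          ch.push(1.5);
          ch.push(2.5);
          double v;
          while (ch.pop(v)) std::printf("%.1f\n", v);
          return 0;
      }
      ```

      The acquire/release pairing is what makes this safe with one thread on each end; anything with multiple producers or consumers needs a different design.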

#3 BrianEllis wrote (replying to Mike Dimmick):
        Sorry, I should have been a little clearer. The derived calculations in that particular program are loaded in by an engineer, who writes his derived calcs in C#. That happens only once, at the start of the program; the code is then compiled and loaded into memory as a separate assembly (all using the CodeDOM). The new version does the same thing using DLLs. The real-time part of the program, though, is basically a message/data/timing pump that just pulls the data, fires processing if necessary, and pumps it back on schedule. No 'cool' techniques, really, in either version.

        You said: "You'll still be able to write and use unmanaged C++ programs on Longhorn, IIRC. The new Avalon GUI system is .NET-only, but your data collection and manipulation code is in a separate process already, right?"

        Is that so? I really haven't heard that. I read that there would be an entirely new Windows API (not Win32), and I thought the new API was built on .NET. If it will instead just be an updated API that we can interface with natively, that will be better than .NET managed code under the hood. Granted, it's impossible to get true real-time out of Windows anyway, since it has a roughly 12-millisecond window it may or may not hit for your instructions, but with .NET under the hood, even our soft real-time stuff would probably have to be ported.

        Do you know of any references on this subject (real-time references, the new Longhorn API, etc.) that might be helpful to me?

        Brian
