Future of C++ and Visual C++ within MS
-
C++ is "not yet extinct" at Microsoft. Quite the contrary: "Central to the success of these customers, as well as Microsoft's own internal development, is Visual C++." Hard to believe ... :suss: http://blogs.msdn.com/sripod/archive/2007/06/26/future-of-c-and-visual-c-within-ms-and-elsewhere.aspx[^]
-
[Message Deleted]
-
Duncan Edwards Jones wrote:
Hmm - it may not be extinct but there's a great big meteorite with "Web 2.0" and "Ruby" and things like that written all over it that you might want to check the trajectory of...just in case
But it does not cover ALL uses of C++. People act like web programming is the only programming on the planet. Whatever happened to hardware drivers? Direct hardware interfacing? Real-time programming? Operating-system-level development? Network protocol development? Video and audio signal processing, analysis, conversion, manipulation and augmentation? Game programming? Computational physics/math? Massively parallel operations? 3D graphics from the high level to the low level? Yes, C# will replace C++, and has, for certain types of development. But the world is a biiiiig place! Ruby replaces other niches, but again, not all. C++ is the successor to C. The only thing that can hope to fully replace C++ is something that replaces its whole range of application uses, not just a single one. I am not saying anything bad about these languages; they are good, but C++ covers a lot of ground. Just because you no longer need it doesn't mean that anyone doing computational physics and weather modelling is going to suddenly switch to Web 2.0 to calculate the drafting effect of a canyon on a weather front moving in from the east. :) Although we're already talking about CUDA for some of that. Still, there is a lot left; physics is not all that we do, and I barely touched the list above.
_________________________ Asu no koto o ieba, tenjo de nezumi ga warau. Talk about things of tomorrow and the mice in the ceiling laugh. (Japanese Proverb)
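As a concrete instance of the numeric work mentioned above (the kind of computational-physics code that is unlikely to move to Web 2.0), here is a minimal, hypothetical C++ finite-difference kernel. The function name and grid are illustrative, not from any post in the thread:

```cpp
#include <vector>

// One explicit finite-difference step of the 1-D diffusion equation:
//   u_new[i] = u[i] + r * (u[i-1] - 2*u[i] + u[i+1])
// Toy kernel for illustration; r <= 0.5 is required for stability.
std::vector<double> diffuse_step(const std::vector<double>& u, double r) {
    std::vector<double> out(u);                 // boundary values stay fixed
    for (std::size_t i = 1; i + 1 < u.size(); ++i)
        out[i] = u[i] + r * (u[i - 1] - 2.0 * u[i] + u[i + 1]);
    return out;
}
```

Kernels like this are exactly the sort of tight inner loop that later moved to CUDA, as the post notes.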
-
lol 32k... i think my program uses that as a buffer :)
-
[Message Deleted]
-
Duncan Edwards Jones wrote:
such as a domain specific modelling language
Perhaps, but that is a lot of domains. Perhaps game programming will develop its own language, but why? They are using C++ and they are C++ programmers. 3D graphics has expanded its language infrastructure into shading languages, which do not replace C++ but augment it in parallel. From my perspective C++ has been growing, even with these augmenting languages supporting its operation. Of course, from my perspective we're still killing Fortran. :laugh: But then a fellow programmer thinks we should all jump to Java.... :doh:
-
Ed.Poore wrote:
32K
I think I have a texture somewhere that small.... ;) but I might have to rummage and search for a while....
-
StevenWalsh wrote:
i think my program uses that as a buffer
I think a module within my program uses that as a buffer. I have world textures and physics buffers that dwarf that.
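For scale, the 32K being joked about works out neatly in texture terms: a 128x64 RGBA8 texture is exactly 32 KiB, while a 2048x2048 one (a plausible "world texture") is 512 times larger. The sizes here are illustrative, not taken from the thread:

```cpp
#include <cstddef>

// Bytes occupied by an uncompressed texture of w x h pixels at bpp bytes
// per pixel (4 = RGBA8). Sizes chosen for illustration only.
constexpr std::size_t texture_bytes(std::size_t w, std::size_t h,
                                    std::size_t bpp = 4) {
    return w * h * bpp;
}

static_assert(texture_bytes(128, 64) == 32 * 1024, "exactly 32 KiB");
static_assert(texture_bytes(2048, 2048) / texture_bytes(128, 64) == 512,
              "512x larger than the 32K buffer");
```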
-
Surprisingly, lately we've been using quite a few of these in small systems, because they tend to be quite efficient when they're not doing much. The main processor used in our embedded systems has about 138K of RAM & ROM combined, and operates at 20MHz. Most of the time it's drawing <5mW of power, so compare that to your systems :-D, now who's green? :rolleyes: The reason I say "most of the time" is that when it's not in standby it usually draws around 100mW, but it's switched off most of the time because most processing is now being done in FPGAs, so much, much, much faster (think speed of light :cool: and you're not far off).
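Those power figures imply a simple duty-cycle average, sketched below. The 25% active fraction in the usage note is an assumed example, not a number from the post:

```cpp
// Average power draw for a duty-cycled micro: active_fraction of the time
// at active_mw, the remainder in standby at standby_mw.
double average_power_mw(double active_mw, double standby_mw,
                        double active_fraction) {
    return active_mw * active_fraction + standby_mw * (1.0 - active_fraction);
}
```

At the quoted ~100 mW active and ~5 mW standby, even a 25% duty cycle averages under 30 mW.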
-
Ed.Poore wrote:
The main processor used in our embedded systems has about 138K of RAM & ROM combined, and operates at 20MHz.
Yup, used them before. I didn't always do 3D graphics on, err... green... machines. ;) I've done embedded work for various projects. When you are hanging a package from a 3-mile length of Kevlar rope and aiming a missile at it (hoping you take the target hanging below, not the package above), you want it all to be cheap, light, low-power and... well... inexpensive (in labor, not just materials: plug-n-err... pray), just in case you have a... near-miss that hits the package instead of the target (which has happened). Also, if you are putting a telecommunications package on a mini-helicopter, all the same things apply. :)
-
[Message Deleted]
-
Ed.Poore wrote:
Isn't that standard for Windows?
True... but I was thinking even of hardware interfacing. Hehe, though we did have an embedded Windows machine come crashing down from about 14 feet up... with a camera attached and USB connections to hardware.... Amazingly enough, everything but the USB hub survived the fall, though only because we had just shut down the hardware at the time.... I am trying to imagine a disk seek/write on a 14-foot pole that decided it wanted to play hammer-time....
-
are you sure that's not cache ram and it needs an external memory connection?
-- You have to explain to them [VB coders] what you mean by "typed". their first response is likely to be something like, "Of course my code is typed. Do you think i magically project it onto the screen with the power of my mind?" --- John Simmons / outlaw programmer
-
[Message Deleted]
-
Well, it could be true of C++ itself rather than of the VC++ dev environment experience. I gather a lot of MS devs don't use Visual Studio for C++ development.
Kevin
-
No, that'd be program RAM & ROM. You could probably write a small boot loader that goes off and reads more program code from an external EEPROM, but why not shell out a few more pence and get a bigger part? There are still applications for such small micros, for example some simple switch devices.
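The boot-loader idea can be sketched as below. `eeprom_read_fn` is a hypothetical stand-in for whatever bus access (SPI/I2C) the real part would use, not any specific device's API:

```cpp
#include <cstddef>
#include <cstdint>

// Sketch of a tiny boot loader: copy a program image byte-by-byte from an
// external EEPROM into RAM. The read callback abstracts the actual bus.
using eeprom_read_fn = std::uint8_t (*)(std::size_t addr);

std::size_t load_image(eeprom_read_fn read, std::uint8_t* ram,
                       std::size_t len) {
    for (std::size_t i = 0; i < len; ++i)
        ram[i] = read(i);   // each byte fetched over the (slow) external bus
    return len;             // a real loader would then jump into the image
}
```

As the post suggests, the copy loop itself is trivial; the real cost is the extra part and the slow external bus, which is why a slightly bigger micro is often the better trade.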