University, C, C++, Java and .NET
-
We all know that most "good" applications are still written in C or C++. Java is almost never used for desktop applications. .NET rocks, but it isn't used much for "big" commercial applications.

During my first year of university I learned ANSI C, including network programming on the Linux platform. During the second year I learned Java (not very well, actually) and the most basic fundamentals of C++, Ada (don't ask!) and Python. In the third (current) year I have no more coding courses. Since the first year I have also been learning .NET, and now I even have a job as a web programmer and designer on the ASP.NET platform. Of course I write many .NET desktop applications too, mostly for my own use (tools and utilities).

This year I have three university projects:
- Software Engineering (I have been authorized to use C# and .NET instead of Java :cool:)
- Logical Networks (I don't know what this hardware-design subject is called in English)
- Web Applications (I think I'll have to use JSP)

The first one is fine for me. The second consists of developing an MP3 hardware encoder using a Diopsis 740 board (ARM7 TDMI + mAgic DSP). The last one hasn't been defined yet.

For the second one we have to write the algorithm in C, and we remember almost nothing about C. I'm no longer even able to think in a non-OO way. I wonder if you are in the same situation. If you use .NET (or, say, Java) for three years, are you still capable of non-OO thought? I'm not, and I'm sad about it. When I remember how much more difficult it is to write (working) C code than .NET code, I even feel scared (see the sketch after this post).

.NET is a platform that spoils you. It's too comfortable, too easy and too robust to let you go back to C. You're marked forever by .NET; you even unlearn the basics of good C programming. Maybe I'm wrong.

___________________________________
Tozzi is right: Gaia is getting rid of us. My Blog [ITA]
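For a concrete sense of that gap: even a trivial string join, which in C# is just a + b with the runtime handling allocation and cleanup, takes manual sizing, allocation, error checking and freeing in C. A minimal sketch (the function name is illustrative, not from any real project):

#include <stdio.h>
#include <stdlib.h>
#include <string.h>

/* Join two strings. In C# this would be a single expression (a + b);
   here every step is manual and every step can fail. */
char *concat(const char *a, const char *b)
{
    size_t len = strlen(a) + strlen(b) + 1;   /* +1 for the '\0' terminator */
    char *result = malloc(len);
    if (result == NULL)
        return NULL;                          /* allocation can fail */
    strcpy(result, a);
    strcat(result, b);
    return result;                            /* the caller must free() it */
}

int main(void)
{
    char *s = concat("Hello, ", "world");
    if (s != NULL) {
        puts(s);
        free(s);
    }
    return 0;
}

Forget the free() and you leak; get len wrong by one and you corrupt the heap. Nothing in .NET asks you to care about either.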
-
I learned Java at university, but I program my own stuff in .NET. Recently I was doing some C programming and had no major problems with it; the only thing I really had to think about was (function) pointers (see the sketch below), which feels like programming in the stone age once you're used to .NET and Java. At university I also had to program in VHDL (hardware CPU simulation and that sort of thing), but I got used to it pretty quickly. I think once you know the basics of programming you can adapt to other languages, but my language of choice is definitely .NET (C#). regards
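To make the "stone age" point concrete: here is a minimal function-pointer sketch in C (the names are made up for illustration), doing by hand what a delegate or lambda does in C#:

#include <stdio.h>

/* A comparator type: roughly the C counterpart of a C# delegate. */
typedef int (*comparator)(int, int);

static int ascending(int a, int b)  { return a - b; }
static int descending(int a, int b) { return b - a; }

/* Pick the value that sorts first according to cmp. */
static int pick(int a, int b, comparator cmp)
{
    return cmp(a, b) <= 0 ? a : b;
}

int main(void)
{
    printf("%d\n", pick(3, 7, ascending));   /* prints 3 */
    printf("%d\n", pick(3, 7, descending));  /* prints 7 */
    return 0;
}

In C# the same idea is a one-liner with Func<int, int, int>; in C you declare the pointer type, pass the function by name and call through it yourself.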
-
Dario Solera wrote:
If you use .NET (or, say, Java) for three years, are you still capable of non-OO thought? I'm not, and I'm sad about it.
I would have to say some do and some don't. After reading your bio, however, I wonder whether you may be selling yourself short. You said, "He develops C# .NET Windows Applications, WebServices and Web Applications." I ask: what effort goes into deciding which web services to create? That should be the analysis of the business process, which is not OO. If you are just coding the web services you have been told to create, then your self-assessment may be correct. This should not be a hindrance to you but an opportunity to expand your education. What options do you have to study business processes? If none are readily available, a trip to the library is always an option. I found a book called "Document Engineering" (ISBN: 0262072610) to be worth reading. "Every new day begins with possibilities. It's up to us to fill it with things that move us toward progress and peace." (Ronald Reagan)
-
Dario Solera wrote:
You're marked forever by .NET; you even unlearn the basics of good C programming.
Good programming skills are the same no matter what the language. The basics of good C programming are what my career has been built on as I've progressed from C -> C++ -> C#. You may have to look up the odd bit of syntax here and there, but the structure and foundations are little different. Michael CP Blog [^] Development Blog [^]
-
Dario Solera wrote:
.NET rocks, but it isn't used much for "big" commercial applications.
That's assuming you've been told about them. The London Stock Exchange runs on .NET. The rewrite took place a couple of years ago, I think, and it brought massive cost savings as well as huge scalability improvements, to the point where they can offer customers features that no other exchange will be able to match for a few more years.
-
Dario Solera wrote:
.NET rocks, but it isn't used much for "big" commercial applications.
Only because most big apps predate it and aren't seen as worth rewriting. Masses of COBOL and Fortran code are still in use for the same reason. At the same time, most new apps are being written in some form of managed code.