Python - no arguments, please
-
I've not had experience with large (and long) Python projects yet, but two things really stick out for me when I think about your statement:

1. Comments seem to be very much a second (or third or fourth) consideration. Not even having formal syntax support for multi-line comments strikes me as the mindset that support for comments is a necessary evil, not an integral part of documenting the niggly things that a dev, three years later, will need to refer to.

2. The absolutely terrible variable names you see in so many samples. Having started my career being forced to maintain a massive legacy FORTRAN codebase in a research institution many, many years ago, I'm still scarred by the variables named A, AA, AAA, A2, B2 and so on. And no, this isn't hyperbole: this was the common naming method. I don't see stuff that bad in Python, but naming isn't exactly deeply rooted in the Pythonic culture. It really does not help the cause of maintainability.

cheers Chris Maunder
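For what it's worth, the usual workaround in Python is the docstring: triple-quoted strings are real syntax, but only in the documentation position; everything else falls back to stacked `#` lines. A minimal sketch (function and data are made up for illustration):

```python
def interpolate(values, t):
    """Docstrings are real syntax: help() and IDEs pick them up.

    This is the closest Python gets to a formal multi-line comment.
    """
    # Ordinary comments are strictly line-by-line; there is no /* ... */
    # equivalent, so multi-line explanations are just stacked # lines.
    lo = int(t)
    return values[lo] + (t - lo) * (values[lo + 1] - values[lo])

print(interpolate([0, 10, 20], 1.5))  # → 15.0
```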
-
PIEBALDconsult wrote:
You don't have to build it before deploying -- just copy the code to the destination. Edit: Oh, I had read that as advantages, not disadvantages.
That still works as a disadvantage - with no compilation step, you've lost a basic sanity check on the code you're deploying. :)
"These people looked deep within my soul and assigned me a number based on the order in which I joined." - Homer
I used to develop glue code (mostly monitoring and piping data between applications) for a major electric utility. The systems had to run in real time 24x7. I found scripted code to be an advantage because the on-call person always had easy access to the production code in the event of a failure anywhere in the system (often due to an external problem with a data source). This meant that emergency patches were easily applied. No special development environment was required. Just because the production code was scripted doesn't mean it wasn't thoroughly tested before going live.
-
Gerry Schmitz wrote:
Python is interpreted; easier to get started with (IMO).
That certainly was an essential argument in favor of interpreted languages ... a long time ago. In my student days as a junior, around 1980, our group project - no more than a couple thousand lines - required more than half an hour of compilation time on a VAX. So we made sure to make all known changes/fixes in the source code before starting a recompilation. I haven't timed compilers for a few years. The last time I did, on a complete recompilation of a system with a few hundred modules, the compiler produced on average eight object-code modules per second. If you use an IDE such as VS, which takes responsibility for recompiling only the modified modules, compilation is practically unnoticeable. No compilation delay once was an argument in favor of interpretation. It no longer is.
trønderen wrote:
In my student days as a junior, around 1980, our group project - no more than a couple thousand lines - required more than half an hour compilation time, on a VAX. So we made sure to make all known changes/fixes in the source code before starting a recompilation.
My first year of college we used punch cards -- had to carefully type the deck (make a mistake, throw out the card), bundle it up, and drop it off at the data center. At the beginning of the semester, come back an hour later to pick up the printout and deck. At the end of the semester? The turnaround was 12 hours. That type of hassle makes more careful programmers, as we needed to get things right on the first try, not F5 / fix a line / F5 / fix a line / F5 / repeat until it looks like it works. I hated it, but it was excellent training. Those that screwed around and waited to write & run their programs typically switched majors in their second semester.
-
I am NOT criticizing Python and I implore the reader please do not start nor engage in any “flame war” on Python. I have programmed in a number of different languages over my 40 years as a developer. My current languages that I use the most are C# and T-SQL. I see that Python is popular, or at least appears so. What value-add(s) does Python bring that I cannot get now in C#? What disadvantages are there, if any, to using Python over C#? Is Python a better general purpose language, or is it better at some specific niche(s) in development? I would really like to get a clearer picture of why using Python along with, or in place of, C# would be of value. Thanks in advance for the non-flame responses.
Great questions. I have recently started using Python for machine vision prototyping as well as exploring the machine learning libraries. While I like the language, the lack of structured architecture is a bit unnerving. It reminds me of National Instruments LabVIEW, which is also very easy to use and make a big mess in. I have had conversations with our software department, and there are some very good libraries available in Python for complex math operations that are well documented and have community support. There are similar libraries available in C#; however, they are difficult to implement and lack proper documentation or support. I think this may have more to do with the underlying code that supports Python being C++ rather than C#, so some conversion/wrapping needs to be done to make a library available to C#. I think it comes down to your use case, but I don't think there is anything available in Python that isn't available in C#. A lot of the available Python libraries are wrappers for general-purpose code blobs available in other languages. A case in point is the Kivy library for building UIs. The code base underpinning Kivy is god-level genius. Some of the implementation requirements are a bit clunky, but I have been impressed so far with the library.
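To illustrate the wrapping point: much of the Python scientific stack is a thin layer over C/C++ cores, and the stdlib `ctypes` module shows the mechanism in miniature. This sketch binds one function from the platform's C math library (the library lookup is platform-dependent; the `libm.so.6` fallback assumes glibc):

```python
import ctypes
import ctypes.util

# Load the platform's C math library -- the same basic mechanism many
# "Python" libraries use to hand the heavy lifting to compiled code.
libm = ctypes.CDLL(ctypes.util.find_library("m") or "libm.so.6")

# Declare the C signature so ctypes marshals arguments correctly.
libm.sqrt.argtypes = [ctypes.c_double]
libm.sqrt.restype = ctypes.c_double

print(libm.sqrt(2.0))  # → 1.4142135623730951
```

The same binding work has to be done to expose such a core to C# (P/Invoke), which is the "conversion/wrapping" cost mentioned above.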
-
I used to develop glue code (mostly monitoring and piping data between applications) for a major electric utility. The systems had to run in real time 24x7. I found scripted code to be an advantage because the on-call person always had easy access to the production code in the event of a failure anywhere in the system (often due to an external problem with a data source). This meant that emergency patches were easily applied. No special development environment was required. Just because the production code was scripted doesn't mean it wasn't thoroughly tested before going live.
Absolutely. As long as the change gets into source control. But it's also too easy for a properly-authorized bad-actor to make illegitimate changes to such a system as well.
-
I am NOT criticizing Python and I implore the reader please do not start nor engage in any “flame war” on Python. I have programmed in a number of different languages over my 40 years as a developer. My current languages that I use the most are C# and T-SQL. I see that Python is popular, or at least appears so. What value-add(s) does Python bring that I cannot get now in C#? What disadvantages are there, if any, to using Python over C#? Is Python a better general purpose language, or is it better at some specific niche(s) in development? I would really like to get a clearer picture of why using Python along with, or in place of, C# would be of value. Thanks in advance for the non-flame responses.
MSBassSinger wrote:
What value-add(s) does Python bring that I cannot get now in C#? What disadvantages are there, if any, to using Python over C#?
Not to be flippant, but if you have C#/T-SQL experience, go ahead and learn Python and you will discover the advantages and disadvantages, and sometimes they overlap. I used Python for a large (60+ Beaglebone SBCs [Single Board Computers]) in-house project for a customer. (If any of you know my somewhat colorful past, you'll know what "industry" this was for.) We implemented a simple web server for each Beaglebone, used RabbitMq for messaging, had a small screen with a GTK interface for the graphics, and custom I/O for the various buttons and one-wire iButton readers, all running under Debian. The cool thing was that I could test all the software (GTK runs on Windows as well) and emulate the hardware I/O on my Windows machine, debugging it directly in Visual Studio. And the software included an auto-update process that would automatically update all 60+ Beaglebones (that took some trial and error but eventually worked). Being able to test the app on Windows and deploy it automatically with WinSCP to the test jigs (I had 6 Beaglebones at home for testing) was, frankly, a very pleasant experience. Would I write a professional web server with database requirements in Python? Heck no, but Python definitely has its uses, certainly in the SBC arena.
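A rough sketch of the "test on Windows, deploy to the SBC" trick described above. The class and pin numbers here are hypothetical, not the poster's actual code: the idea is that the production call sites stay the same and only the I/O backend is swapped based on the platform.

```python
import platform

class GpioStub:
    """Emulated I/O for desktop testing: just records writes."""
    def __init__(self):
        self.pins = {}

    def write(self, pin, value):
        self.pins[pin] = value
        print(f"[emulated] pin {pin} <- {value}")

def make_gpio():
    # On the real Beaglebone we would import the vendor GPIO module here;
    # everywhere else (e.g. a Windows dev box) we fall back to the stub.
    if platform.machine().startswith("arm"):
        raise NotImplementedError("real GPIO wiring goes here")
    return GpioStub()

gpio = make_gpio()
gpio.write(60, 1)  # the same call the production code would make
```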
Latest Article:
Create a Digital Ocean Droplet for .NET Core Web API with a real SSL Certificate on a Domain
-
trønderen wrote:
In my student days as a junior, around 1980, our group project - no more than a couple thousand lines - required more than half an hour compilation time, on a VAX. So we made sure to make all known changes/fixes in the source code before starting a recompilation.
My first year of college we used punch cards -- had to carefully type the deck (make a mistake, throw out the card), bundle it up, and drop it off at the data center. At the beginning of the semester, come back an hour later to pick up the printout and deck. At the end of the semester? The turnaround was 12 hours. That type of hassle makes more careful programmers, as we needed to get things right on the first try, not F5 / fix a line / F5 / fix a line / F5 / repeat until it looks like it works. I hated it, but it was excellent training. Those that screwed around and waited to write & run their programs typically switched majors in their second semester.
In 1978, I was in the last freshman class to use punched cards. Two years later, a group of professors and graduate students from our university was on a visit to MIT. Somewhat embarrassed, they revealed that not until the previous year (i.e. 1979) had the introductory programming course been run on interactive terminals. The MIT people were taken aback: interactive terminals in an introductory programming course? At that time at MIT, interactive terminals were reserved for graduate work! In 1978, a 12-hour turnaround was unheard of. Usually, the printout was on the shelves the next day, but in rush periods it could take two days. Be careful to note, though, that out of those 48 hours, maybe five seconds were compilation and running time. The rest of the time, the deck was sitting in the input queue (a physical one!), being handled mechanically, or the printout lay stacked up in the line printer output tray waiting to be carried to the output shelves. If the operators had been given interpreters for interpreting the card decks, rather than compilers and runtime systems, it would not have affected the turnaround time at all. You are most certainly correct: it made us more careful programmers. It was excellent training. An old memory worth recalling: in the compiler construction course, one essential quality metric was the compiler's ability to detect all, or as many as possible, (real, primary) errors in one compilation run. So we all became fans of LALR over recursive descent :-) The first compiler I studied close up was the classic recursive-descent Pascal P4 compiler (open source didn't come with Linux!), and I was impressed by the number of tricks it used to continue compilation even after quite serious syntax errors, while generating as few second-order error messages as possible. Today, who cares at all for such qualities?
Many times have I seen coworkers get a long list of error reports, fix the first ten, and ignore the rest, rebuilding the system from scratch 'so that we are not bothered by second-order error messages from the first ten errors'. To some degree they are right: modern compilers do a much poorer job of recovering from already-reported errors and avoiding second-order error messages.
-
I decided to do a couple of compiles, partly to show how much faster CMake/Ninja is than MSBuild. 240KLOCs of C++, MSVC compiler, VS2022: - MSBuild, x86 Release: 2m 52s - CMake/Ninja, x64 Release: 0m 32s I was forced to switch to CMake/Ninja when targeting for Linux instead of just Windows. I wish I'd gotten around to it sooner.
Robust Services Core | Software Techniques for Lemmings | Articles
The fox knows many things, but the hedgehog knows one big thing.

Greg Utas wrote:
MSBuild, x86 Release: 2m 52s CMake/Ninja, x64 Release: 0m 32s
A factor of 5.4 for (presumably) doing identical jobs would make me very cautious. I would not take that at face value as an indicator of 'typical' performance, but spend some effort on learning what makes the one alternative more than five times faster.

Although x86 and x64 are not quite apples and oranges, it is at least apples and pears, so the two jobs are not identical. Obviously, the compilers are different. Even if they have the same command-line interface, different modules are activated. Were all the options exactly the same? E.g. the same level of optimization, the same amount of runtime checks, etc. Did the two jobs generate the same number of compiler activations, and the same number of object files? With two different target architectures, you cannot expect exactly the same number of object files, but they should be comparable.

Were both jobs clean compiles? This includes e.g. precompilation of header files. For a 'fair' comparison, you could run the job on a newly formatted disk, but if this forces one setup to do heavy one-time preparations that save a lot of time later, maybe it isn't as 'fair' as you first thought. If you are doing an incremental, non-clean compile: were exactly the same changes made in both cases? Are the dependency rules set up identically in the two alternatives? Did both jobs do the same kind of preparatory work, e.g. building the dependencies? If the job 'in principle' is of the same kind, were there significant differences, such as in one case the developer supplying dependencies, while in the other they are detected automatically through analysis of the source code?

Even for a clean compile: are the dependency rules set up 'ideally'? I have seen compile logs from large compilations (typically 30-60 minutes build time) compiling the same source file five times. This happens not once, but often! The developers argue that to maintain their part of the build files efficiently, they need to be independent of what the other guys are doing, and need to maintain their own independent dependencies ... (And they refuse to let a separate team or expert do all build file maintenance, claiming that it is too tightly interwoven with the source code.)

Did both jobs produce the same (/comparable) results? E.g. did they both include complete linking into an executable? Did both deliver auto-generated documentation?
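As a concrete illustration of the "same source file compiled five times" problem: in CMake terms it usually means several targets each list the same .cpp file directly. A minimal sketch (target and file names are made up) of factoring it into one library target so it compiles once:

```cmake
# Before: both executables listed common/util.cpp directly,
# so it was compiled once per target. Compile it once instead:
add_library(common_util STATIC common/util.cpp)

add_executable(server server/main.cpp)
target_link_libraries(server PRIVATE common_util)

add_executable(cli tools/cli.cpp)
target_link_libraries(cli PRIVATE common_util)
```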
-
I am NOT criticizing Python and I implore the reader please do not start nor engage in any “flame war” on Python. I have programmed in a number of different languages over my 40 years as a developer. My current languages that I use the most are C# and T-SQL. I see that Python is popular, or at least appears so. What value-add(s) does Python bring that I cannot get now in C#? What disadvantages are there, if any, to using Python over C#? Is Python a better general purpose language, or is it better at some specific niche(s) in development? I would really like to get a clearer picture of why using Python along with, or in place of, C# would be of value. Thanks in advance for the non-flame responses.
I work out in the real world writing C#, JavaScript, TypeScript, and SQL every day. I also teach one night a week at a community college: Data Structures in the Spring semester and a programming language (currently C) in the Fall semester. My observation is that Python is a scripting language and, like other scripting languages, useful for doing things a little more rapidly but inexactly. My observation of students who want to know why they can't use Python in the Data Structures class (the college requires C++ for transfer reasons) is that the new crop of students really don't understand what or why they are doing things but are monkey-see, monkey-do programmers. Of course that doesn't apply to the 5-10% of my students who really DO understand how a computer works. The net being, a lot of computing is "close enough is good enough" in our world of today. Obviously that doesn't apply to certain financial transactions in banking, real estate, etc., but it DOES apply to a lot of things that are just providing info.
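To make the "what the built-ins hide" point concrete: a Python `list.append()` is one opaque call, while the underlying idea (sketched here as a toy singly linked list) is exactly the kind of thing a C++ data structures course makes students build by hand:

```python
class Node:
    """One cell of a singly linked list: the machinery list.append() hides."""
    def __init__(self, value, nxt=None):
        self.value = value
        self.next = nxt

def push_front(head, value):
    # O(1) insertion at the front; no resizing, no contiguous memory.
    return Node(value, head)

def to_list(head):
    """Walk the chain and collect the values, for display."""
    out = []
    while head is not None:
        out.append(head.value)
        head = head.next
    return out

head = None
for v in (3, 2, 1):
    head = push_front(head, v)
print(to_list(head))  # → [1, 2, 3]
```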
-
Greg Utas wrote:
MSBuild, x86 Release: 2m 52s CMake/Ninja, x64 Release: 0m 32s
A factor of 5.4 for (presumably) doing identical jobs would make me very cautious. I would not take that at face value as an indicator of 'typical' performance, but spend some effort on learning what makes the one alternative more than five times faster. Although x86 and x64 are not quite apples and oranges, it is at least apples and pears, so the two jobs are not identical. Obviously, the compilers are different. Even if they have the same command-line interface, different modules are activated. Were all the options exactly the same? E.g. the same level of optimization, the same amount of runtime checks, etc. Did the two jobs generate the same number of compiler activations, and the same number of object files? Were both jobs clean compiles? This includes e.g. precompilation of header files. If you are doing an incremental, non-clean compile: were exactly the same changes made in both cases? Are the dependency rules set up identically in the two alternatives? Did both jobs do the same kind of preparatory work, e.g. building the dependencies? Even for a clean compile: are the dependency rules set up 'ideally'? I have seen compile logs from large compilations (typically 30-60 minutes build time) compiling the same source file five times. This happens not once, but often!
The developers argue that to maintain their part of the build files efficiently, they need to be independent of what the other guys are doing, and need to maintain their own independent dependencies ... (And they refuse to let a separate team or expert do all build file maintenance, claiming that it is too tightly interwoven with the source code.) Did both jobs produce the same (/comparable) results? E.g. did they both include complete linking into an executable? Did both deliver auto-generated documentation?
Both were clean compiles, and both use the MSVC compiler. It's the front ends that differ. The options for x86 and x64 are basically the same, and so were MSBuild times for x86 and x64 when I used to build both with MSBuild. When I switched to CMake, I got rid of my .vcxproj files. The MSBuild time for x86 is still about the same. I have no idea why CMake/Ninja is that much faster; I'm just happy about it and have no desire to investigate why. True, this has nothing to do with whether code is compiled or interpreted. I just wanted to point out that C++ compiles for a large code base are fairly fast but that, even then, there can be significant differences.
Robust Services Core | Software Techniques for Lemmings | Articles
The fox knows many things, but the hedgehog knows one big thing.
-
I would add:
- it's free
- it works everywhere. And I mean browsers, iPhones, computing cards, workstations, PCs, Macs, [watches](https://www.zerynth.com/blog/programming-hexiwear-wearable-iot-in-python-using-zerynth/)
- there's a huge number of libraries and code examples available.
- it's a very approachable language for teaching. As much as Python makes me swear some days, I'd recommend it as a teaching language over C or C++ for power and simplicity, and also over C# and Java for how ubiquitous it is and how it's not tied down to any platform or vendor. I love C#, very much, but it's still a very Microsoft-centric experience (and mindset), and that's not healthy for someone starting out who needs every option open to them.
- it's the language that anyone dealing with data analysis will use. Engineers, environmental scientists, data analysts, AI folks (obviously). The introduction of Jupyter notebooks was such a boon (and obviously this is no longer Python-only).

I will say, though, that the library system will do your head in. It's wonderful until it's not. BUT: you generally get the code, so last-ditch efforts of debugging and manual patching can work in emergencies. It also has some awkward syntax. Very awkward. And the whole culture is a little weird, and dare I say fanatical at times. And can Guido please stop pasting Monty Python quotes in the Python docs. Dude. Seriously.
cheers Chris Maunder
-
I am NOT criticizing Python and I implore the reader please do not start nor engage in any “flame war” on Python. I have programmed in a number of different languages over my 40 years as a developer. My current languages that I use the most are C# and T-SQL. I see that Python is popular, or at least appears so. What value-add(s) does Python bring that I cannot get now in C#? What disadvantages are there, if any, to using Python over C#? Is Python a better general purpose language, or is it better at some specific niche(s) in development? I would really like to get a clearer picture of why using Python along with, or in place of, C# would be of value. Thanks in advance for the non-flame responses.
I have zero learning hours of Python, and yet I have never had a problem reading Python code. To be real, that was only 3-4 times and the code was short. The first time I saw Python was in the book Foundations of Python Network Programming. Somehow I got this book in my hands and started reading immediately. I was surprised how easily I understood the language. It was so clear that I didn't bother to write the examples in Python, but translated them on the fly to C on the Windows platform. The only gotchas were the ones with the Win32 API.

Next comes what's important about C# and Python, which I think is highly subjective. At the moment I make my living writing C# code. I'm not good at it, just barely good enough for people to put up with me. I feel repulsion toward languages like Java and C#. I was assigned to read data from a server and, luckily for me, I found code examples: https://github.com/flightaware/firehose_examples . I was looking at the C# example and looking and looking... I even had it compiled and working, but still I couldn't grasp it. Then I turned to the Python example and immediately understood what needed to be done. What the essence was. That's what I mean when I say your question is highly subjective. There is no "the right language" and no "the only right thing to do for the common good", because too much right turns into left.

I believe, no matter what others say (although some of those people I have known for decades and value their opinions highly), based on my experience, that Python is a clean and very good language for an introduction to programming, regardless of whether it will potentially introduce bad habits in future professional programmers. You know when you ask how to do something in git and you get a 10-page explanation of the theoretical possibilities and historical background of all version control software VS an answer saying: type these 4 words and it will do? For me, the former is C# and the latter is Python.

cheers
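A generic sketch (not the actual Firehose protocol or API) of the kind of line-oriented Python networking code the post is describing; the whole flow reads top to bottom:

```python
import socket

def read_messages(sock):
    """Yield newline-delimited messages from a connected socket.

    Generic line-at-a-time reading -- the shape of code the Python
    Firehose example has, though this is not that actual protocol.
    """
    with sock.makefile("r", encoding="utf-8") as stream:
        for line in stream:
            yield line.rstrip("\n")

# Demo with a local socket pair instead of a real server.
a, b = socket.socketpair()
a.sendall(b"hello\nworld\n")
a.close()
print(list(read_messages(b)))  # → ['hello', 'world']
```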