Python - no arguments, please
-
Maximilien wrote:
do a lot of data analysis with it
But what language is the data analysis code written in?
Python with NumPy. We deal with large datasets of 3D points and 3D measurements. (see bio for company) We do a lot in the software itself, but sometimes the clients have domain-specific (proprietary) things they want to do.
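A toy example (made-up data, nothing like our real pipelines) of why NumPy carries so much of the load for this kind of work:

    import numpy as np

    # Hypothetical stand-in for a measured point cloud: 100,000 random 3D points.
    points = np.random.default_rng(0).normal(size=(100_000, 3))

    centroid = points.mean(axis=0)                     # mean position of the cloud
    dists = np.linalg.norm(points - centroid, axis=1)  # distance of each point from the centroid
    print(centroid, dists.max())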
CI/CD = Continuous Impediment/Continuous Despair
-
I've backed myself into far tighter corners than Python on 2+ year projects :-D It's all about architecting the solution. And some tech stacks are really well suited to this. And some just aren't...
cheers Chris Maunder
I am not saying that it is useless for large projects. And "architecting" is ofc important, but let's say 2+ years PLUS volatile requirements PLUS a situation where developers come and go. Then I would not recommend Python as the first choice.
"If we don't change direction, we'll end up where we're going"
-
I am not saying that it is useless for large projects. And "architecting" is ofc important, but let's say 2+ years PLUS volatile requirements PLUS a situation where developers come and go. Then I would not recommend Python as the first choice.
"If we don't change direction, we'll end up where we're going"
I've not had experience with large (and long) Python projects yet, but two things really stick out for me when I think about your statement: 1. Comments seem to be very much a second (or third or fourth) consideration. Not even having formal syntax support for multi-line comments strikes me as the mindset that comments are a necessary evil, not an integral part of documenting the niggly things that a dev, 3 years later, will need to refer to. 2. The absolutely terrible variable names you see in so many samples. Having started my career being forced to maintain a massive legacy FORTRAN codebase in a research institution many, many years ago, I'm still scarred by variables named A, AA, AAA, A2, B2 and so on. And no, this isn't hyperbole: this was the common naming method. I don't see anything quite that bad in Python, but naming isn't exactly deeply rooted in the Pythonic culture. It really does not help the cause of maintainability. cheers Chris Maunder
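To make the multi-line comment point concrete, a trivial made-up snippet: the only real comment syntax is # repeated line by line, and the triple-quoted string people reach for as a "block comment" is actually just a string expression.

    # Python's only true comment syntax is '#', one marker per line,
    # so longer explanations mean repeating it on every line.

    def scale(values, factor):
        """A docstring: documentation attached to the function object,
        not a general-purpose multi-line comment."""
        return [v * factor for v in values]

    """
    A bare triple-quoted string like this is sometimes used as a block comment,
    but it is really a string expression that is evaluated and thrown away.
    """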
-
Now I am quite confused and almost wonder if you are commenting on my comment. #1 I said nothing about comments. #2 I said nothing about variable names. But... maybe I just triggered you to think about these things. Funny thing is that neither of these two is what I was thinking of. As for comments, I dunno. You surely remember the hype around the Ada programming language. AFAIK it only had -- for end-of-line comments, and one of the key language design criteria was "WORMS, i.e. Write Once Read Many timeS". And actually Python has docstrings that lazy people could use for multi-line comments, but that would be considered extremely bad style; any formal code review would reject such stuff. As for naming, there are at least strong linting/formatting tools (PEP 8 checkers, Black), which my organisation runs in CI; you cannot check in without passing them. And from what I've seen, the naming culture in the Python community is quite strong. In web snippets? Maybe less so. Yeah, I have done my fair share of FORTRAN too, where naming was darkness... Despite such draconian measures Python code rots rapidly, and IMO the key reason is the lack of typing combined with optional arguments. In long call chains, in the end, you have no idea what the elephant is being passed or returned. Cheerz :java: "If we don't change direction, we'll end up where we're going"
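A contrived sketch (hypothetical names, nothing from our codebase) of the rot I mean, where untyped optional arguments hide what actually flows through a call chain, against a version with hints that at least records the intent:

    # Untyped, optional everything: three calls deep, nobody knows what 'opts' holds.
    def load(path, fmt=None, opts=None):
        return process(path, fmt, opts)

    def process(src, fmt=None, opts=None):
        return export(src, fmt or "csv", opts or {})

    def export(data, fmt, opts):
        return f"{fmt}:{len(opts)}"   # placeholder body

    # With hints the intent is at least written down (even though Python won't enforce it).
    from pathlib import Path
    from typing import Optional

    def load_typed(path: Path, fmt: Optional[str] = None, opts: Optional[dict] = None) -> str:
        return process(path, fmt, opts)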
-
Python is interpreted; easier to get started with (IMO). Popular in schools and universities. Unless one has a specific problem to solve, it's another solution looking for one.
"Before entering on an understanding, I have meditated for a long time, and have foreseen what might happen. It is not genius which reveals to me suddenly, secretly, what I have to say or to do in a circumstance unexpected by other people; it is reflection, it is meditation." - Napoleon I
Gerry Schmitz wrote:
Python is interpreted; easier to get started with (IMO).
That certainly was an essential argument in favor of interpreted languages ... a long time ago. In my student days as a junior, around 1980, our group project - no more than a couple thousand lines - required more than half an hour of compilation time, on a VAX. So we made sure to make all known changes/fixes in the source code before starting a recompilation. I haven't timed compilers for a few years. The last time I did, on a complete recompilation of a system with a few hundred modules, the compiler produced eight object code modules per second on average. If you use an IDE such as VS, which takes responsibility for recompiling only the modified modules, compilation is practically unnoticeable. No compilation delay once was an argument in favor of interpretation. It no longer is.
-
megaadam wrote:
#1 I said nothing about comments. #2 I said nothing about variable names. But...
My mind is merely meandering and thinking about what Python in a big project would be like. I started thinking about the things that I struggle with now when I look at new code, and the rush of comment and varname angst came out. Sorry if that seemed a little random. Pent-up venting... And typing. I completely forgot about typing! They have type hints, but they only get you so far. Yeah, there are tools, and conventions, and reviews, and everything that's available to disciplined teams. And then there's the slippery slope of getting away with whatever you can. Naming was darkness. What a great turn of phrase.
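A toy example I just made up of "only so far": the hints are documentation for tools like mypy, not a contract the runtime checks.

    def total_price(quantity: int, unit_price: float) -> float:
        return quantity * unit_price

    # The interpreter never looks at the hints: string repetition quietly produces "33",
    # a wrong result of the wrong type, and only a separate type checker would flag it.
    print(total_price("3", 2))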
cheers Chris Maunder
-
I decided to do a couple of compiles, partly to show how much faster CMake/Ninja is than MSBuild. 240 KLOC of C++, MSVC compiler, VS2022: MSBuild, x86 Release: 2m 52s; CMake/Ninja, x64 Release: 0m 32s. I was forced to switch to CMake/Ninja when targeting Linux instead of just Windows. I wish I'd gotten around to it sooner.
Robust Services Core | Software Techniques for Lemmings | Articles
The fox knows many things, but the hedgehog knows one big thing.
-
I am NOT criticizing Python, and I implore the reader: please do not start or engage in any “flame war” on Python. I have programmed in a number of different languages over my 40 years as a developer. The languages I currently use the most are C# and T-SQL. I see that Python is popular, or at least appears so. What value-add(s) does Python bring that I cannot get now in C#? What disadvantages are there, if any, to using Python over C#? Is Python a better general-purpose language, or is it better at some specific niche(s) in development? I would really like to get a clearer picture of why using Python along with, or in place of, C# would be of value. Thanks in advance for the non-flame responses.
For us it is the huge range of third-party packages and how easily Python interfaces with other systems.
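One example of what "easily interfaces" means in practice (hypothetical endpoint, and assuming the third-party requests package is installed): pulling JSON out of another system over HTTP is a couple of lines.

    import requests  # third-party: pip install requests

    # Hypothetical endpoint, purely for illustration.
    resp = requests.get("https://example.com/api/measurements", timeout=10)
    resp.raise_for_status()
    data = resp.json()     # parsed straight into Python dicts and lists
    print(len(data))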
-
With Python you just import framework, import that-cool-library-someone-made, and get on with the nitty-gritty. C#/.NET also ships with a huge amount of everyday functionality, and .NET 6 / C# 10 global usings hide even more boilerplate, yay. So my take is that Python got a huge amount of "free" publicity in its early years, and that growth just lingers and compounds: it is popular because it is popular, and repeat. Think back 10 years: .NET 4 was great, but Windows-only, and Visual Studio was a heavyweight install. Python, with simple editors and an interpreted workflow, gave comparatively friendlier errors - comparing the 2012 Python world to the C#/.NET of the same era. Add 10 years of people putting time and effort into Python: they create packages, share them, improve them, write guides and how-tos. Look at the C# changes in just the last 5 years - yearly major versions - compared to the slow pace of 2008-2012. What you start with also has a big influence. I can switch between JavaScript and C#/.NET fairly easily because I don't have to worry about spaces versus tabs or alignment, but if someone starts with Python's way of writing, then semicolons and braces will be a pain. Cross-platform - running on a Raspberry Pi or Linux - is something .NET Core only got in the last 5-ish years, after rewriting (or otherwise opening up) code that had been proprietary. Performance gains from Core 3 to 5 were significant; things like JSON parsing for ASP.NET are comparable to, if not faster than, Node.js. Not sure if any of this makes sense - I'm simply attempting to compare the legacy with what we have today, and why Python might appear more popular.
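To show what the spaces-versus-braces point looks like in practice, a tiny made-up fragment: in Python the indentation is the block structure, so mis-indenting a line either changes the meaning or raises an IndentationError, with no braces to fall back on.

    def total(values):
        result = 0
        for v in values:
            result += v
        return result        # indent this line one level deeper and it silently returns
                             # after the first iteration; indent it inconsistently and
                             # the parser raises IndentationError instead

    print(total([1, 2, 3]))  # prints 6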
-
PIEBALDconsult wrote:
You don't have to build it before deploying -- just copy the code to the destination. Edit: Oh, I had read that as advantages, not disadvantages.
That still works as a disadvantage - with no compilation step, you've lost a basic sanity check on the code you're deploying. :)
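You can claw a thin slice of that check back with the standard library's compileall - a minimal sketch, assuming the code lives in a hypothetical src directory; it only catches syntax errors, nothing like a real build.

    # Pre-deploy sanity check: byte-compile everything and fail the pipeline on syntax errors.
    import compileall
    import sys

    ok = compileall.compile_dir("src", quiet=1)   # false value if any file fails to compile
    sys.exit(0 if ok else 1)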
"These people looked deep within my soul and assigned me a number based on the order in which I joined." - Homer
I used to develop glue code (mostly monitoring and piping data between applications) for a major electric utility. The systems had to run in real time 24x7. I found scripted code to be an advantage because the on-call person always had easy access to the production code in the event of a failure anywhere in the system (often due to an external problem with a data source). This meant that emergency patches were easily applied. No special development environment was required. Just because the production code was scripted doesn't mean it wasn't thoroughly tested before going live.
-
trønderen wrote:
In my student days as a junior, around 1980, our group project - no more than a couple thousand lines - required more than half an hour compilation time, on a VAX. So we made sure to make all known changes/fixes in the source code before starting a recompilation.
My first year of college we used punch cards -- had to carefully type the deck (make a mistake, throw out the card), bundle it up, and drop it off at the data center. At the beginning of the semester, come back an hour later to pick up the printout and deck. At the end of the semester? The turnaround was 12 hours. That type of hassle makes more careful programmers, as we needed to get things right on the first try, not F5 / fix a line / F5 / fix a line / F5 / repeat until it looks like it works. I hated it, but it was excellent training. Those that screwed around and waited to write & run their programs typically switched majors in their second semester.
-
Great questions. I have recently started using Python for machine vision prototyping as well as exploring the machine learning libraries. While I like the language, the lack of structured architecture is a bit unnerving. It reminds me of National Instruments LabVIEW, which is also very easy to use and make a big mess in. I have had conversations with our software department, and there are some very good libraries available in Python for complex math operations that are well documented and have community support. There are similar libraries available in C#; however, they are difficult to implement and lack proper documentation or support. I think this has more to do with the fact that the code underlying these Python libraries is C++ rather than C#, so some conversion/wrapping needs to be done to make a library available to C#. I think it comes down to your use case, but I don't think there is anything available in Python that isn't available in C#. A lot of the available Python libraries are wrappers for general-purpose code available in other languages. A case in point is the Kivy library for building UIs. The code base underpinning Kivy is god-level genius. Some of the implementation requirements are a bit clunky, but I have been impressed so far with the library.
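As a small illustration of the wrapper point (assuming the opencv-python package and a hypothetical image file): the cv2 module is a thin Python layer over OpenCV's C++ core, and what it hands back are ordinary NumPy arrays.

    import cv2          # pip install opencv-python; the heavy lifting happens in C++

    img = cv2.imread("part.png", cv2.IMREAD_GRAYSCALE)   # hypothetical image file
    if img is None:
        raise FileNotFoundError("part.png")

    edges = cv2.Canny(img, 100, 200)   # C++ edge detector, returned as a NumPy uint8 array
    print(type(edges), edges.shape, edges.dtype)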
-
Absolutely. As long as the change gets into source control. But it's also too easy for a properly-authorized bad-actor to make illegitimate changes to such a system as well.
-
MSBassSinger wrote:
What value-add(s) does Python bring that I cannot get now in C#? What disadvantages are there, if any, to using Python over C#?
Not to be flippant, but if you have C#/T-SQL experience, go ahead and learn Python and you will discover the advantages and disadvantages, and sometimes they overlap. I used Python for a large (60+ BeagleBone SBCs [Single Board Computers]) in-house project for a customer. (If any of you know my somewhat colorful past, you'll know what "industry" this was for.) We implemented a simple web server on each BeagleBone, used RabbitMQ for messaging, had a small screen with a GTK interface for the graphics, and custom I/O for the various buttons and 1-Wire iButton readers, all running under Debian. The cool thing was that I could test all the software (GTK runs on Windows as well) and emulate the hardware I/O on my Windows machine, debugging it directly in Visual Studio. And the software included an auto-update process that would automatically update all 60+ BeagleBones (that took some trial and error but eventually worked). Being able to test the app on Windows and deploy it automatically with WinSCP to the test jigs (I had 6 BeagleBones at home for testing) was, frankly, a very pleasant experience. Would I write a professional web server with database requirements in Python? Heck no, but Python definitely has its uses, certainly in the SBC arena.
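The web server part of something like that needs nothing beyond the standard library; a stripped-down sketch of the idea (not the actual project code, and the port is arbitrary):

    from http.server import BaseHTTPRequestHandler, HTTPServer

    class StatusHandler(BaseHTTPRequestHandler):
        def do_GET(self):
            body = b'{"status": "ok"}'
            self.send_response(200)
            self.send_header("Content-Type", "application/json")
            self.send_header("Content-Length", str(len(body)))
            self.end_headers()
            self.wfile.write(body)

    if __name__ == "__main__":
        HTTPServer(("0.0.0.0", 8080), StatusHandler).serve_forever()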
Latest Article:
Create a Digital Ocean Droplet for .NET Core Web API with a real SSL Certificate on a Domain
-
In 1978, I was in the last freshman class to use punched cards. Two years later, a group of professors and graduate students from our university was on a visit to MIT. Somewhat embarrassed, they revealed that not until the previous year (i.e. 1979) was the introductory programming course run on interactive terminals. The MIT people were taken aback: interactive terminals in an introductory programming course? At that time at MIT, interactive terminals were reserved for graduate work! In 1978, a 12 h turnaround was unheard of. Usually, the printout was on the shelves the next day, but in rush periods, it could take two days. Be careful to note, though, that out of those 48 hours, maybe five seconds were compilation and running time. The rest of the time, the deck was sitting in the input queue (a physical one!), being handled mechanically, or the printout lay stacked up in the line printer output tray waiting to be carried to the output shelves. If the operators had been given interpreters for interpreting the card decks, rather than compilers and run time systems, it would not have affected the turnaround time at all. You are most certainly correct: it made us more careful programmers. It was excellent training. An old memory worth recalling: in the compiler construction course, one essential quality metric was the compiler's ability to detect all, or as many as possible, (real, primary) errors in one compilation run. So we all became fans of LALR over recursive descent :-) The first compiler I studied close up was the classic recursive descent Pascal P4 compiler (open source didn't arrive with Linux!), and I was impressed by the number of tricks it did to continue compilation even after quite serious syntax errors, while generating as few second-order error messages as possible. Today, who cares at all for such qualities? Many times have I seen coworkers get a long list of error reports, fix the first ten, and ignore the rest before they rebuild the system from scratch 'so that we are not bothered by second order error messages from the first ten errors'. To some degree they are right: modern compilers do a much poorer job of hiding already-reported errors and avoiding second-order errors.
-
Greg Utas wrote:
MSBuild, x86 Release: 2m 52s; CMake/Ninja, x64 Release: 0m 32s
A factor of 5.4 for (presumably) doing identical jobs would make me very cautious. I would not take that at face value as an indicator of 'typical' performance, but spend some effort on learning what makes the one alternative more than five times faster. Although x86 and x64 are not quite apples and oranges, it is at least apples and pears, so the two jobs are not identical. Obviously, the compilers are different; even if they have the same command line interface, different modules are activated. Were all the options exactly the same? E.g. the same level of optimization, the same amount of runtime checks, etc. Did the two jobs generate the same number of compiler activations, and the same number of object files? With two different target architectures, you cannot expect exactly the same number of object files, but they should be comparable. Were both jobs clean compiles? This includes e.g. precompilation of header files. For a 'fair' comparison, you could run the job on a newly formatted disk, but if this forces one setup to do heavy one-time preparations that save a lot of time later, maybe it isn't as 'fair' as you first thought. If you are doing an incremental, non-clean compile: were exactly the same changes made in both cases? Are the dependency rules set up identically in the two alternatives? Did both jobs do the same kind of preparatory work, e.g. building the dependencies? If the job 'in principle' is of the same kind, were there significant differences, such as the developer supplying dependencies in one case while they are detected automatically through analysis of the source code in the other? Even for a clean compile: are the dependency rules set up 'ideally'? I have seen compile logs from large compilations (typically 30-60 minutes of build time) compiling the same source file five times. This happens not once, but often! The developers argue that to maintain their part of the build files efficiently, they need to be independent of what the other guys are doing, and need to maintain their own independent dependencies ... (And they refuse to let a separate team or expert do all build file maintenance, claiming that it is too tightly interwoven with the source code.) Did both jobs produce the same (or comparable) results? E.g. did they both include complete linking into an executable? Did both deliver auto-generated documentation?
-
I work out in the real world writing C#, JavaScript, TypeScript, and SQL every day. I also teach one night a week at a community college: Data Structures in the Spring semester and a programming language (currently C) in the Fall semester. My observation is that Python is a scripting language and, like other scripting languages, useful for doing things a little more rapidly but inexactly. My observation of students who want to know why they can't use Python in the Data Structures class (the college requires C++ for transfer reasons) is that the new crop of students really don't understand what they are doing or why, but are monkey-see, monkey-do programmers. Of course that doesn't apply to the 5-10% of my students who really DO understand how a computer works. The net being, a lot of computing is "close enough is good enough" in our world of today. Obviously that doesn't apply to certain financial-type transactions in banking, real estate, etc., but it DOES apply to a lot of things that are just providing info.