for(int i=0; i<size; i++)
-
I am sure this was one of the hello-world codes for many of us... But I wonder why the letter "i"... I mean, why on earth? With "a" being the leading character, why "i"? After some time I found out that the Fortran language (which was/is historically used for scientific calculations) uses "i" as a starting character for integer-type variables, and the quickest such variable to write would be "i". Most authors and coders continued to use "i" even in C, and then in C++, and then in C#, Java, etc. Is this interpretation correct?
Maybe you'll also be interested in the story of the unknown 'x' in math: Explanation on TED talk[^] ;)
-
When I first learned about mathematical functions f(x) (a dozen years before my first programming class) I was told that mathematicians used "i" for the first incrementing variable, "j" for the next, and so on. When I started to learn programming languages, a FORTRAN professor (FORTRAN as in "FORmula TRANslation") who (of course) was a member of the Math department said something along the lines of: "This is not the theology department, but using anything other than 'i' for the first incremental variable, etc., is heresy. Those who want to use 'meaningful words' should consider being English or Philosophy majors and take Professor So-and-So's COBOL class." So 'i' as an index did not start with FORTRAN; it started before FORTRAN, but it fit nicely because those who used FORTRAN knew its meaning from math studies.
Yes, it is so: i, j and k are standard notation for mathematical arithmetic progressions and series, from well before FORTRAN. But FORTRAN was designed by mathematicians, so we programmers are carrying over the notation. a, b, c -> constants; x, y, z -> unknown terms (or real coordinates in the Cartesian plane); k, j, i -> INTEGER indexes or vector coordinates. I think Gauss was already using these conventions, more than 100 years ago. Gauss wins. As always.
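(A small illustration of that convention, added here for concreteness rather than taken from the post: the usual way to write a finite series runs the integer index i over the terms, with j taking over for a second, nested index.)
\sum_{i=1}^{n} a_i = a_1 + a_2 + \cdots + a_n, \qquad \sum_{i=1}^{n} \sum_{j=1}^{m} a_{ij}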
-
Fortran probably got it from math. Now math... I don't know.
-
I've always used 'x'. Why? Because a "Programming in C" book I started with used it. So I just got used to it I guess.
-
It was also a feature of some early BASIC implementations. There were 26 variables available; A through H were floating point, I through P (??) were integers; R (??) through Z were also floating point. The only explicit type declarations were to suffix one of the single-letter variables with the $ symbol to indicate string. "I" was commonly used for loop control as being the first integer variable. It's a habit I learned in the 1960s and I still use it (very occasionally). Old habits die hard...
-
while(life!=death) { age++; research++; development++; }
-
Hi, Others have said this, but it deserves to be repeated: this is classical math (handwritten, typed or printed), from a couple of centuries before the advent of computers. It's pretty natural that it was adopted by most mathematically oriented computer languages (and, consequently, by later languages), and it also explains why it feels natural to most programmers (who have read any math - which should be a reasonable expectation) even today.
Indices: i, j, k (I don't remember how it goes on after that).
Quantities: n, m (then p, q if I remember correctly - o can be confusing).
Dimensions: x, y, z, t, then ξ (xi), η (eta), θ (theta).
I don't remember what happens when we run out of letters for indices and quantities, but if I remember correctly these sequences also continue with Greek letters. So:
for (i = 0; i < n; i++)
{
    for (j = 0; j < m; j++)
    {
        for (k = 0; k < p; k++)
        {
        }
    }
}
Pretty straightforward. And Newton would have understood it without thinking... :)
-
We programmers are a self-centered bunch. It's never about the other person, it's always "I I I" this, "I I I" that. jk, jk. lol. Hmmm... I suppose the letters in preference would have to be: ijklo. An expanding counter-clockwise spiral starting with "i"! From this, we can ascertain the correct letters to use for each new level of loop nesting: ijklouhmpygntfbrdvescwaxqz. Any other order is incorrect.
-
AspDotNetDev wrote:
From this, we can ascertain the correct letters to use for each new level of loop nesting: ijklouhmpygntfbrdvescwaxqz.
How deep did you go into the spiral? I usually stop on the 'g'.
Greetings - Jacek
I've never had a need to go past "c".
-
I personally use x (and y and z for nested loops - if you nest more than two levels, you need to redesign), I make it unsigned (unless there's a specific reason to use signed) because that more accurately models the real world, and I use the prefix form of the increment operator because it's potentially more efficient. So my canonical form of that construct would be:
for (unsigned x = 0; x < size; ++x)
And yes, some of my co-workers make fun of me. The ones who spend twice as much time debugging their code as I do debugging mine.
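(An aside, not part of the original post: the prefix-increment argument really only matters for class types such as C++ iterators, where the postfix form has to construct and return a copy of the old value; for a plain int or unsigned the two forms generate the same code. A minimal sketch of the two styles, assuming a std::vector and hypothetical function names:)
#include <vector>

// Prefix increment advances the iterator in place; no temporary copy is made.
// (For a built-in counter like the unsigned x above, ++x and x++ are identical.)
long sum_with_iterator(const std::vector<long>& values)
{
    long total = 0;
    for (std::vector<long>::const_iterator it = values.begin();
         it != values.end(); ++it)
    {
        total += *it;
    }
    return total;
}

// Unsigned index, matching the "models the real world" argument:
// a container size can never be negative.
long sum_with_index(const std::vector<long>& values)
{
    long total = 0;
    for (unsigned x = 0; x < values.size(); ++x)
    {
        total += values[x];
    }
    return total;
}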
-
I use
for (int r = 0; r < size; r++)
because 1) "r" is directly next to "t", so when I type "int_[spaceBar]_r" the typing flow is much nicer; 2) "r" is on my left hand vs. the right hand, so when I type "someArray[r]" I type faster as my fingers type in sync with one another - seriously, try it for a week and you'll see it flows much better; 3) "r" means "repeat" (or "record") in my brain; 4) using "i" reminds me of when I flunked Spanish... example: iFlunked! 5) I got sick of iThisCrap, iThatCrap, iWTF... even though I actually originated using "i[SomeWord]" during my Apple dayz... yes, iCreatedThisMess... r u kidding me? no, i am not!
-
Sorry, I still don't see the humour! (Sound of tumbleweed blowing)
Since i, j, k, l are int by default, a variable named God would, by default, be a 'real' number - i.e., floating point. You'd have to declare the variable God to be an integer in order for it to be taken that way. Yes, this is probably a good candidate for the most delayed response ever.
-
mc42 wrote:
Yes, this is probably a good candidate for the most delayed response ever.
It did take me a while to get the context there. Thanks for explaining. It does tickle me that an Imaginary being defaults to Real, while i is definitely not imaginary. Mathematicians would shudder. (Edited for grammar)
"If you don't fail at least 90 percent of the time, you're not aiming high enough." Alan Kay.