Signed integers considered harmful?
-
Signed integers seem to be a minefield of undefined behaviour lurking around every corner. You can't even add them without potentially blowing up the Death Star, unless you test for potential overflow first. Should they be avoided? How should this be dealt with? How bad is it to pretend it's not a problem?
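To be concrete, "test for potential overflow first" means something like this minimal C sketch (the helper name is made up; note the comparisons themselves cannot overflow):

#include <limits.h>
#include <stdbool.h>

// Hypothetical helper: refuses the addition instead of invoking
// undefined behaviour when a + b would not fit in an int.
bool try_add(int a, int b, int *out)
{
    if (b > 0 ? a > INT_MAX - b : a < INT_MIN - b)
        return false; // would overflow
    *out = a + b;
    return true;
}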
-
know your data. check bounds.
-
harold aptroot wrote:
You can't even add them without potentially blowing up the Death Star, unless you test for potential overflow first.
I'm interested to learn more about this. Can you provide an example of what you mean?
-
And here I've been wrestling with unsigned longs (in C#) for the last few days... :sigh: Consider:
ulong max = 1UL << 63; // 63 is the maximum of a calculated value
for (ulong i = max - 1; i < max; i--)
How many people will freak out when they see the test in the for statement? (It terminates because i is unsigned: decrementing it past zero wraps around to ulong.MaxValue, which is no longer less than max.)
-
Same could be said about unsigned integers.
"the debugger doesn't tell me anything because this code compiles just fine" - random QA comment "Facebook is where you tell lies to your friends. Twitter is where you tell the truth to strangers." - chriselst
or floats, or doubles or any numeric encoding that has to reside on a computer with finite storage. it's all about what you need the numbers to do for you.
-
Exactly.
Chris Losinger wrote:
it's all about what you need the numbers to do for you.
Yes it is.
"the debugger doesn't tell me anything because this code compiles just fine" - random QA comment "Facebook is where you tell lies to your friends. Twitter is where you tell the truth to strangers." - chriselst
-
Same could be said about unsigned integers.
"the debugger doesn't tell me anything because this code compiles just fine" - random QA comment "Facebook is where you tell lies to your friends. Twitter is where you tell the truth to strangers." - chriselst
I was wondering why he singled out signed integers. But now that you point this out, I think I understand what he means.
-
Signed integers lead to singed digits. :-D
-
harold aptroot wrote:
You can't even add them without potentially blowing up the Death Star, unless you test for potential overflow first.
I'm interested to learn more about this. Can you provide an example of what you mean?
Not of Death Stars exploding (sadly?). But there's a well-known (and infamous) GCC optimization: if a signed integer would overflow on some code path, the compiler deduces that the path must be dead. In practice this often means that overflow tests performed after the overflow has already occurred (say, you calculate something and then discard the result if the calculation overflowed) are deleted, so your program looks correct (after all, you tested for overflow, right?) but isn't.
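For instance, a naive after-the-fact test like this sketch (the function name is hypothetical) can legally be thrown away:

#include <limits.h>

// Hypothetical example: the overflow check runs *after* the signed
// addition. Since signed overflow is undefined behaviour, the compiler
// may assume a + b cannot overflow, conclude the test is always false,
// and delete it together with the early return.
int add_or_zero(int a, int b)
{
    int sum = a + b; // undefined behaviour if it overflows
    if ((b > 0 && sum < a) || (b < 0 && sum > a))
        return 0; // "overflow detected": may be optimized out
    return sum;
}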
-
harold aptroot wrote:
Signed integers seem to be a minefield of undefined behaviour
Since their behaviour is completely defined, I find it hard to understand what you mean.
It isn't. See for example 3.4.3 of C99:
An example of undefined behavior is the behavior on integer overflow
This does not apply to unsigned integers, which can't overflow because they wrap. Taking some other standard doesn't help either; as far as I know it's undefined in all versions of C and C++ (but I'd really like to be wrong about that).
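A small sketch of the difference (assuming the usual 32-bit int):

#include <limits.h>
#include <stdio.h>

int main(void)
{
    unsigned int u = UINT_MAX;
    u = u + 1;         // well-defined: wraps to 0, arithmetic is modulo 2^32
    printf("%u\n", u); // prints 0

    int s = INT_MAX;
    // s = s + 1;      // undefined behaviour: not a guaranteed wrap
    (void)s;
    return 0;
}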
-
Lots and lots of things in C++ (and C, for that matter) can lead to undefined behaviour if preconditions are not met; signed integer arithmetic is just one of many. If you're programming in this language, you should be used to dealing with narrow contracts. So no, they shouldn't be avoided. Deal with them depending on the situation; in many cases an assert will suffice. Pretending it's not a problem is fatal. (And no, GCC isn't the only compiler that assumes naive signed overflow checks are always false.)
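For example, a precondition assert rather than an after-the-fact test (a minimal sketch; the helper name is hypothetical):

#include <assert.h>
#include <limits.h>

// Hypothetical helper: make the narrow contract explicit and check it
// before the addition, while the check is still well-defined.
int add_with_contract(int a, int b)
{
    assert(!(b > 0 && a > INT_MAX - b)); // would overflow upward
    assert(!(b < 0 && a < INT_MIN - b)); // would overflow downward
    return a + b;
}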