Isn't that the point of C/C++? "Code should run as fast as the hardware allows" and "We allow you to do crazy things that probably won't be portable, if that's what you want".
I wonder: if somebody created a language with C-like semantics, such as pointer arithmetic, but with a layer in between to ensure consistency and portability, would people want to use it? Or would the semantics be considered less interesting when they don't map directly to hardware?
Corresponding directly to the hardware makes sense, sure. But what modern GCC (especially) does goes far beyond that. The intent of making signed integer overflow undefined was that it would be two's complement on a two's complement machine, not that half my code should fail to run because I had an overflow somewhere.
"Evaluates to an implementation-defined value" would not include that. I'm not sure any of the usual formulations in the standard allows trapping without also allowing arbitrary behaviour.
Well, there's kind of one place: when a value is converted to a signed type and it's out of range for that type, the standard says "either the result is implementation-defined or an implementation-defined signal is raised."
However, integer overflow within an expression is trickier than this to define, because you want to preserve the compiler's ability to reorder sub-expressions that are mathematically equivalent when there's no overflow, without forcing the compiler to prove that there won't be overflow. The same is true of common subexpression elimination (CSE) and of hoisting expressions out of loops.
For example, on a machine that traps on overflow, where under this hypothetical standard your implementation defined signed overflow to trap, you can't simplify (a + 2 - 1) to (a + 1): the first might trap when the second does not, and your implementation has promised that the trap will happen. It also means you can't hoist an expression out of a loop if that expression might overflow, because the trap has to happen at the point where it would happen in the C abstract machine, not later, after other side effects have become visible.
This is why trapping implementations imply "undefined behaviour": defining overflow to trap severely restricts the compiler's freedom under the as-if rule to reorder and simplify expressions.
Sure, but the standard need not say "Evaluates to an implementation-defined value". There are other wordings which use "implementation-defined" and would support trapping on overflow.
I disagree. There's nothing difficult to understand about new phrases with new meanings in general.
Do you have a specific phrase which you can show is too difficult for a group of people to understand, and which you don't think could be rewritten to retain the same meaning while being easier to understand?
If you do, let me know, and I will try to simplify it for you and help you understand it.
If you don't, then there isn't much to say. Your current argument ("phrases could be written in a difficult-to-understand way") is too general to be useful.
I suppose it's less the phrase itself than the concept. What would be a way to say "integer overflow behaviour may reflect the underlying machine, but must be sensible", rigorously enough for the C standard? We've established that "undefined behaviour" is too broad, and "implementation-defined value" is too narrow.
Why "must be sensible"? Why not just say "signed integer overflow is implementation-defined" and in a footnote say "for example, it may wrap, trap, saturate, or do anything else documented by your implementation"?
> Why "must be sensible"? Why not just say "signed integer overflow is implementation-defined" and in a footnote say "for example, it may wrap, trap, saturate, or do anything else documented by your implementation"?
I don't think that actually defines anything. GCC would likely interpret it as nothing more than undefined behaviour.
If that was the intent, then overflow should have been classified as enumerated implementation-defined behavior, just like the bit representation of an integer type which already deals with those variations.