What the hell gcc?
-
Quote:
I hate assuming compiler bugs
No, it is definitely not a compiler bug. It is a defined behaviour; there are lots of documents on the web which explain the background.
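If the surprise here is the strict-aliasing rule (which the bit_cast / reinterpret_cast suggestions below assume), the classic demonstration is something like the following sketch; gcc is allowed to assume the two pointers never refer to the same object:

// At -O2, gcc may assume an unsigned int* and a float* never alias,
// so it can fold the final read to the constant 1 even if the caller
// passes the same address through both parameters.
unsigned int pun(unsigned int* i, float* f) {
    *i = 1;
    *f = 2.0f; // under strict aliasing, this write cannot modify *i
    return *i; // so gcc may compile this to: return 1;
}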
0x01AA wrote:
It is a defined behaviour
That's precisely what I was afraid of. :~
Check out my IoT graphics library here: https://honeythecodewitch.com/gfx And my IoT UI/User Experience library here: https://honeythecodewitch.com/uix
-
In a message above, you mentioned there is no std available. But maybe in your environment some kind of bit_cast is available? If not, I think a similar behaviour (to inform the compiler/optimizer) can be achieved with reinterpret_cast, but at the moment I don't remember the document where I got this from :( Sorry for my strange English ...
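A minimal sketch of the usual substitute when std::bit_cast is missing; the name bit_cast_sketch is hypothetical, and it assumes memcpy is available in your environment (it usually is, even freestanding) and that both types are trivially copyable and the same size:

#include <string.h> // memcpy

// Hypothetical helper, not std::bit_cast: copies the object
// representation of 'from' into a To. gcc recognizes the memcpy
// idiom and compiles it to a plain register move at -O2, while
// staying well-defined under the strict-aliasing rule.
template <typename To, typename From>
To bit_cast_sketch(const From& from) {
    static_assert(sizeof(To) == sizeof(From), "size mismatch");
    To to;
    memcpy(&to, &from, sizeof(To));
    return to;
}

// Usage: unsigned int bits = bit_cast_sketch<unsigned int>(1.0f);
-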
It's possible I could do it with reinterpret_cast? I dunno
Check out my IoT graphics library here: https://honeythecodewitch.com/gfx And my IoT UI/User Experience library here: https://honeythecodewitch.com/uix
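A caveat on the reinterpret_cast route: the cast itself compiles, but pointer punning through it is exactly what the strict-aliasing rule forbids, so it only behaves reliably if you build with gcc's -fno-strict-aliasing; the memcpy sketch above stays well-defined. What it would look like, with a hypothetical helper name:

// Formally undefined: reads a float object through an unsigned int
// lvalue, violating the strict-aliasing rule. In practice it works
// only with -fno-strict-aliasing (or by luck).
unsigned int float_bits_ub(float f) {
    return *reinterpret_cast<unsigned int*>(&f);
}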