Unions question

So when you have a union like this:

union uint4
{
    unsigned int i;
    unsigned char b[4];
};


And you set i to 0xAABBCCDD like this:

uint4 myInt;
myInt.i = 0xAABBCCDD;


Then you will find that myInt.b[0] equals 0xDD, myInt.b[1] equals 0xCC, myInt.b[2] equals 0xBB, and myInt.b[3] equals 0xAA, i.e. little-endian style.
But is it possible that a different computer, maybe with another processor architecture, uses another 'style', like big-endian, and makes myInt.b[0] equal to 0xAA?
My understanding is, the behavior is undefined.
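
For illustration, a complete, compilable version of the snippet above might look like this (just a sketch; the byte order in the comments is what a little-endian machine gives):

#include <cstdio>

union uint4
{
    unsigned int i;      // assumes unsigned int is 4 bytes on this platform
    unsigned char b[4];
};

int main()
{
    uint4 myInt;
    myInt.i = 0xAABBCCDD;

    // Reading b after writing i is exactly the access discussed in this thread.
    // Little-endian machine: prints dd cc bb aa
    // Big-endian machine:    prints aa bb cc dd
    for (int k = 0; k < 4; ++k)
        std::printf("%02x ", static_cast<unsigned>(myInt.b[k]));
    std::printf("\n");
}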

Reading any member other than the one last written results in nonstandard/undefined behavior.
Yes, but in practice no compiler exists for which the data is not overlaid and accessible like that.

So yes, the endianness of the computer makes a difference. Typically you will find:

big endian - where a value is stored most-to-least significant byte in sequence

little endian - where a value is stored least-to-most significant byte in sequence

There is also:

mixed endian - where things get hairy (the old PDP-11's "middle-endian" order, for example).
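
If you only need to know which kind of machine you are running on, one well-defined way to check at run time (a sketch using memcpy rather than a union) is:

#include <cstring>

// True on a little-endian machine: the least significant byte
// of the value is the first byte in memory.
bool is_little_endian()
{
    unsigned int one = 1;
    unsigned char first_byte = 0;
    std::memcpy(&first_byte, &one, 1);  // inspect the first byte of the representation
    return first_byte == 1;
}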
ok, thanks, I guess I'll need to use something other than unions then to separate values into bytes.

Thanks!
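
For what it's worth, shifting and masking splits a value into bytes without relying on how the machine stores it; a sketch (the extract_bytes name is just for illustration):

// Split a 32-bit value into bytes, least significant first,
// regardless of the machine's endianness.
void extract_bytes(unsigned int value, unsigned char b[4])
{
    b[0] = static_cast<unsigned char>(value & 0xFF);
    b[1] = static_cast<unsigned char>((value >> 8) & 0xFF);
    b[2] = static_cast<unsigned char>((value >> 16) & 0xFF);
    b[3] = static_cast<unsigned char>((value >> 24) & 0xFF);
}

// Usage: extract_bytes(0xAABBCCDD, b) gives b[0] == 0xDD ... b[3] == 0xAA on every machine.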