# Endianness
## Review: Bits and bytes
- Byte
    - 8 bits in 1 byte
    - A byte is the smallest addressable unit for a CPU
- Word
    - The register size, e.g. 32 bits on a 32-bit CPU, 64 bits on a 64-bit CPU
- Hex
    - Includes 16 characters: `0123456789ABCDEF`
    - Each character requires $\log_2(16) = 4$ bits to encode
    - A byte, e.g. `0xA5`, consists of two hex characters, and thus requires 8 bits to encode
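To make the character-to-bits correspondence concrete, here is a small C sketch (illustrative, not from the explainer) that prints `0xA5` as its 8 bits and splits it into its two 4-bit nibbles:

```c
#include <stdio.h>

int main(void) {
    unsigned char byte = 0xA5;        /* one byte = two hex characters */
    unsigned char hi   = byte >> 4;   /* high nibble: 0xA (binary 1010) */
    unsigned char lo   = byte & 0x0F; /* low nibble:  0x5 (binary 0101) */

    /* Print the byte as 8 bits, most significant bit first. */
    printf("0x%02X = ", byte);
    for (int i = 7; i >= 0; i--)
        printf("%d", (byte >> i) & 1);
    printf("  (high nibble 0x%X, low nibble 0x%X)\n", hi, lo);
    return 0;
}
```

Output: `0xA5 = 10100101  (high nibble 0xA, low nibble 0x5)`, matching the 4-bits-per-hex-character rule above.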
## Review: Endianness
### [Quick explainer page](https://chortle.ccsu.edu/assemblytutorial/Chapter-15/ass15_3.html)
(Explainer assumes a 32-bit architecture)
**Example**
Say the 32-bit pattern `0x12345678` is stored at address `0x00400000`. The most significant byte is `0x12`; the least significant is `0x78`.
![[Screen Shot 2022-04-14 at 11.03.55 AM.png]]
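As a quick check of which byte is which, this C sketch (an illustrative addition, not part of the explainer) extracts the most and least significant bytes of `0x12345678` with shifts and masks. Shifts operate on the value itself, so the result is the same regardless of how memory orders the bytes:

```c
#include <stdio.h>
#include <stdint.h>

int main(void) {
    uint32_t pattern = 0x12345678;

    /* Pick out individual bytes of the 32-bit value by shifting and masking. */
    uint8_t msb = (pattern >> 24) & 0xFF;   /* most significant byte: 0x12 */
    uint8_t lsb = pattern & 0xFF;           /* least significant byte: 0x78 */

    printf("MSB = 0x%02X, LSB = 0x%02X\n", msb, lsb);
    return 0;
}
```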
**Explanation**
A load word or store word instruction uses only one memory address: the **lowest address** of the four contiguous bytes that make up the word.
How is a 32-bit pattern held in the four bytes of memory? There are 32 bits in the four bytes and 32 bits in the pattern, but a choice has to be made about which byte of memory gets what part of the pattern. There are two ways that computers commonly do this:
- **Big Endian Byte Order:** The **most significant** byte (the "big end") of the data is placed at the byte with the lowest address. The rest of the data is placed in order in the next three bytes in memory.
- **Little Endian Byte Order:** The **least significant** byte (the "little end") of the data is placed at the byte with the lowest address. The rest of the data is placed in order in the next three bytes in memory.
In these definitions, the data, a 32-bit pattern, is regarded as a 32-bit unsigned integer.
- The "most significant" byte is the one for the largest powers of two: $2^{31}$, ..., $2^{24}$.
- The "least significant" byte is the one for the smallest powers of two: $2^7$, ..., $2^0$.
Within a **byte** the order of the bits is the same for all computers (no matter how the bytes themselves are arranged).