Hex and Binary
Binary and Hexadecimal (sometimes abbreviated to hex) are alternative numeral systems. Decimal, the everyday numeral system, is base 10, meaning that each digit position is worth ten times the position to its right.
Overview
Binary is base 2, while hexadecimal is base 16. So "10" represents ten in decimal, but it represents 2 in binary, and 16 in hexadecimal.
Digits | Value in binary | Value in decimal | Value in hexadecimal |
---|---|---|---|
"1" | 1 | 1 | 1 |
"10" | 2 | 10 | 16 |
"100" | 4 | 100 | 256 |
"1000" | 8 | 1000 | 4096 |
"10000" | 16 | 10000 | 65536 |
"100000" | 32 | 100000 | 1048576 |
"1000000" | 64 | 1000000 | 16777216 |
"10000000" | 128 | 10000000 | 268435456 |
"100000000" | 256 | 100000000 | 4294967296 |
"1000000000" | 512 | 1000000000 | 68719476736 |
"10000000000" | 1024 | 10000000000 | 1099511627776 |
Binary
Binary is the fundamental representation for computers, because a transistor can hold either (close to) zero charge or (close to) full charge, giving exactly two distinguishable states.
Each bit is worth twice as much as the bit to its right. To get the value in decimal (or hex), simply add up the weights of all the bits that are set; a worked example follows the list below.
- 1st bit: 1
- 2nd bit: 2
- 3rd bit: 4
- 4th bit: 8
- 5th bit: 16
- 6th bit: 32
- 7th bit: 64
- 8th bit: 128
- and so on
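As a worked example (a sketch of my own, not from the original page): the binary value 01011001 has the 64, 16, 8 and 1 bits set, so it is worth 64 + 16 + 8 + 1 = 89, or 0x59 in hex.

```python
# Sum the weights of the set bits in a binary string (the rightmost bit is worth 1).
bits = "01011001"
value = sum(2 ** i for i, bit in enumerate(reversed(bits)) if bit == "1")

print(value)                  # 89
print(hex(value))             # 0x59
print(value == int(bits, 2))  # True: matches Python's built-in conversion
```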
Hexadecimal
Hexadecimal is used in programming instead of decimal because each hexadecimal digit corresponds to exactly four binary digits, which makes converting between hex and binary trivial; converting between decimal and binary is much more involved (a conversion sketch follows the digit table below).
Hexadecimal values on this wiki are usually either prefixed with "0x" or suffixed with "h". For example, "0x8F" or "8Fh".
The hexadecimal digits are (lowercase or uppercase):
Hex | Decimal | Binary |
---|---|---|
0 | 0 | 0 |
1 | 1 | 1 |
2 | 2 | 10 |
3 | 3 | 11 |
4 | 4 | 100 |
5 | 5 | 101 |
6 | 6 | 110 |
7 | 7 | 111 |
8 | 8 | 1000 |
9 | 9 | 1001 |
A | 10 | 1010 |
B | 11 | 1011 |
C | 12 | 1100 |
D | 13 | 1101 |
E | 14 | 1110 |
F | 15 | 1111 |
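Because each hex digit maps to exactly four bits, converting between hexadecimal and binary is a digit-by-digit lookup in the table above. A short Python sketch (using the value 0x8F from the notation example as my input):

```python
# Expand each hexadecimal digit of 0x8F into its four-bit binary group.
value = 0x8F
groups = [f"{int(digit, 16):04b}" for digit in f"{value:X}"]

print(groups)           # ['1000', '1111']
print("".join(groups))  # 10001111
```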
Binary, Hexadecimal and Decimal (unsigned) equivalents
Binary | Hexadecimal | Decimal |
---|---|---|
00000000 | 0x00 | 0 |
00000001 | 0x01 | 1 |
00000010 | 0x02 | 2 |
00000011 | 0x03 | 3 |
00000100 | 0x04 | 4 |
00000101 | 0x05 | 5 |
00000110 | 0x06 | 6 |
00000111 | 0x07 | 7 |
00001000 | 0x08 | 8 |
00001001 | 0x09 | 9 |
00001010 | 0x0A | 10 |
00001011 | 0x0B | 11 |
00001100 | 0x0C | 12 |
00001101 | 0x0D | 13 |
00001110 | 0x0E | 14 |
00001111 | 0x0F | 15 |
00010000 | 0x10 | 16 |
00011000 | 0x18 | 24 |
00100000 | 0x20 | 32 |
00110000 | 0x30 | 48 |
01000000 | 0x40 | 64 |
01100000 | 0x60 | 96 |
10000000 | 0x80 | 128 |
10100000 | 0xA0 | 160 |
10110000 | 0xB0 | 176 |
11000000 | 0xC0 | 192 |
11010000 | 0xD0 | 208 |
11100000 | 0xE0 | 224 |
11110000 | 0xF0 | 240 |
11111111 | 0xFF | 255 |
Bytes and Beyond
Bytes, Half-Words and Words are made up of a fixed number of bits (a bit is a Boolean value, TRUE or FALSE); a quick way to compute the ranges below is sketched after this list:
- Boolean: 0 or 1 (1 bit)
- Nibble: 0x0 to 0xF (4 bits)
  - unsigned: 0 to +15
  - signed: -8 to +7
- Byte: 0x00 to 0xFF (8 bits)
  - unsigned: 0 to +255
  - signed: -128 to +127
- Half-Word: 0x0000 to 0xFFFF (16 bits)
  - unsigned: 0 to +65535
  - signed: -32768 to +32767
- Word: 0x00000000 to 0xFFFFFFFF (32 bits)
  - unsigned: 0 to +4294967295
  - signed: -2147483648 to +2147483647
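These ranges follow directly from the bit count: an n-bit unsigned value runs from 0 to 2^n - 1, and an n-bit signed (two's complement) value runs from -2^(n-1) to 2^(n-1) - 1. A minimal Python sketch of that rule (my own, assuming two's complement as used throughout this page):

```python
# Print the unsigned and signed (two's complement) ranges for common sizes.
for name, bits in [("Nibble", 4), ("Byte", 8), ("Half-Word", 16), ("Word", 32)]:
    print(f"{name}: unsigned 0 to {2**bits - 1}, "
          f"signed {-(2**(bits - 1))} to {2**(bits - 1) - 1}")
```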
Flags
Flags are values, usually one bit long and part of a byte, that are meant to be interpreted as TRUE or FALSE by the game.
For example, the byte below is composed entirely of flags. They are technically Boolean values, but we call them flags to make them easier to talk about:
0x??
- 0x80: Male
- 0x40: Female
- 0x20: Monster
- 0x10: Join After Event
- 0x08: Load Formation
- 0x04: ??? Stats
- 0x02:
- 0x01: Save Formation
Now let's say this Byte has a value of 0x59, which in binary is 01011001, or 0x40 + 0x10 + 0x08 + 0x01. This means that "Female", "Join After Event", "Load Formation" and "Save Formation" are set.
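In code, you test a flag by ANDing the byte with the flag's mask; a non-zero result means the flag is set. A short Python sketch using the example byte and the flag list above:

```python
# Report which flags are set in the example byte 0x59.
value = 0x59
flags = {0x80: "Male", 0x40: "Female", 0x20: "Monster", 0x10: "Join After Event",
         0x08: "Load Formation", 0x04: "??? Stats", 0x01: "Save Formation"}

for mask, name in flags.items():
    if value & mask:  # bitwise AND is non-zero when the bit is set
        print(name)
# Prints: Female, Join After Event, Load Formation, Save Formation
```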
Byte Order
Now you have to understand that the MIPS R3000 stores multi-byte values with the least significant byte first (little-endian), so the bytes of a Half-Word or Word appear reversed in memory compared to how you write the value.
What does this mean?
If the game wants to fetch a Half-Word, in the game's memory that value might be written as:
78 5A
But because the game reads it as a Half-Word through opcode instructions (see ASM Hacking), it ends up in a register as 0x00005A78. Registers are 32-bit and hold the values the game is working on; they are loaded, altered, compared, and saved back as necessary.
Similarly, if the game wanted to load a Word:
9C B5 06 80
The value loaded would be 0x8006B59C.
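Outside the game, you can reproduce this with any little-endian decoder; here is a sketch using Python's standard struct module ("<" selects little-endian byte order):

```python
import struct

# The bytes exactly as they appear in memory, least significant byte first.
half_word = bytes([0x78, 0x5A])
word = bytes([0x9C, 0xB5, 0x06, 0x80])

print(hex(struct.unpack("<H", half_word)[0]))  # 0x5a78
print(hex(struct.unpack("<I", word)[0]))       # 0x8006b59c
```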
Now if you see these values in the game's memory through a Hex Editor or an emulator's Memory Viewer:
59 00 C8 00 74 01 66 01 12 00 B7 01 32 00
You could simply guess the format and say these all look like Half-Words, and you would likely be right. But be careful: it is the way the game loads and saves a value that determines whether it is a Byte, Half-Word or Word, not what the raw hex looks like to you.
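For instance, if these really are Half-Words, reading the dump above two bytes at a time as little-endian values gives (a sketch, under that assumption):

```python
import struct

dump = bytes([0x59, 0x00, 0xC8, 0x00, 0x74, 0x01, 0x66, 0x01,
              0x12, 0x00, 0xB7, 0x01, 0x32, 0x00])

# "<7H" = seven little-endian unsigned Half-Words.
print(struct.unpack("<7H", dump))  # (89, 200, 372, 358, 18, 439, 50)
```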
Unsigned vs Signed
Again, this is determined by the game's code. Unless you can figure it out through the game's code or through logic, you cannot tell if a value is signed or unsigned:
- 0xC0 can be +192 (unsigned) or -64 (signed)
- 0xFFF8 can be +65528 (unsigned) or -8 (signed)
- Bytes:
  - Lowest unsigned value: 0x00 (0)
  - Highest unsigned value: 0xFF (+255)
  - Lowest signed value: 0x80 (-128)
  - Highest signed value: 0x7F (+127)
- Half-Words:
  - Lowest unsigned value: 0x0000 (0)
  - Highest unsigned value: 0xFFFF (+65535)
  - Lowest signed value: 0x8000 (-32768)
  - Highest signed value: 0x7FFF (+32767)
- Words:
  - Lowest unsigned value: 0x00000000 (0)
  - Highest unsigned value: 0xFFFFFFFF (+4294967295)
  - Lowest signed value: 0x80000000 (-2147483648)
  - Highest signed value: 0x7FFFFFFF (+2147483647)
When dealing with signed values, half of the possible values are negative and the other half are zero or positive.
Signed Values:
- 0x80000000 = -2147483648
- 0xFF000000 = -16777216
- 0xFFFF0000 = -65536
- 0xFFFFFF00 = -256
- 0xFFFFFFF8 = -8
- 0xFFFFFFFF = -1
- 0x00000000 = 0
- 0x00000008 = +8
- 0x000000FF = +255
- 0x0000FFFF = +65535
- 0x00FFFFFF = +16777215
- 0x7FFFFFFF = +2147483647
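To read a raw hex value as signed (two's complement), subtract 2^bits whenever the top bit is set. A minimal Python sketch of that rule (the helper name is my own):

```python
def to_signed(value, bits=32):
    """Interpret an unsigned value as a two's-complement signed number."""
    if value & (1 << (bits - 1)):  # top bit set -> negative
        value -= 1 << bits
    return value

print(to_signed(0xC0, 8))         # -64
print(to_signed(0xFFF8, 16))      # -8
print(to_signed(0xFFFFFFFF, 32))  # -1
print(to_signed(0x7FFFFFFF, 32))  # +2147483647
```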