Intel processor instructions for arithmetic operations expect integers to be represented in a particular way in memory.
The int type is typically represented by a sequence of 32 1's and 0's; that is, 32 bits.
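A quick way to confirm this size on a given machine is to ask the compiler (a minimal sketch, assuming a typical platform where a byte holds 8 bits, as reported by CHAR_BIT):

    #include <stdio.h>
    #include <limits.h>

    int main(void) {
        /* sizeof(int) is the number of bytes in an int;
           multiplying by CHAR_BIT (bits per byte) gives the bit count. */
        printf("int is %zu bytes = %zu bits\n",
               sizeof(int), sizeof(int) * CHAR_BIT);
        return 0;
    }

On most current desktop and server platforms this prints 4 bytes = 32 bits, though the C standard only guarantees a minimum size.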
Representing integers using only 1's and 0's means that the (decimal) value of each position is a power of 2: 1, 2, 4, 8, 16, and so on, instead of 1, 10, 100, 1000, and so on.
    Binary:           1    0    1    1   =  1*8 + 0*4 + 1*2 + 1*1          =   11 (in decimal)
    Position value:   8    4    2    1

    Decimal:          1    0    1    1   =  1*1000 + 0*100 + 1*10 + 1*1    = 1011 (in decimal)
    Position value: 1000  100   10    1
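The same positional calculation can be spelled out in code. The sketch below hard-codes the four digits of 1011 and their position values; the names bits and place are purely illustrative:

    #include <stdio.h>

    int main(void) {
        int bits[] = {1, 0, 1, 1};   /* the binary digits 1 0 1 1        */
        int place  = 8;              /* position values: 8, 4, 2, 1      */
        int value  = 0;

        for (int i = 0; i < 4; i++) {
            value += bits[i] * place;   /* 1*8 + 0*4 + 1*2 + 1*1 */
            place /= 2;                 /* move to the next lower position */
        }
        printf("binary 1011 = %d decimal\n", value);   /* prints 11 */
        return 0;
    }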
With 4 bits, there are 16 (= 2 x 2 x 2 x 2) possible combinations of 0's and 1's across the 4 positions.
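A short loop can list all 16 combinations next to their decimal values (a sketch; the shift-and-mask expression simply reads off each bit, from the 8s place down to the 1s place):

    #include <stdio.h>

    int main(void) {
        /* Print every 4-bit pattern (0000 through 1111) and its decimal value. */
        for (int n = 0; n < 16; n++) {
            for (int pos = 3; pos >= 0; pos--)
                printf("%d", (n >> pos) & 1);   /* extract the bit at this position */
            printf(" = %2d\n", n);
        }
        return 0;
    }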