Notation

Given $k+1$ bits, numbered $b_k b_{k-1} b_{k-2}\ldots b_2 b_1 b_0$, we assume that bit $b_0$ corresponds to the least significant bit. Then the sequence of bits on the left is the binary equivalent of the decimal number given by the summation on the right:

$$b_k b_{k-1} \ldots b_1 b_0 = \sum_{i=0}^{k} b_i \cdot 2^i$$

Since each bit can take values only from the alphabet $\{0,1\}$, a string of $k+1$ bits so numbered can represent exactly $2^{k+1}$ distinct patterns. The type representation adopted for this string of bits determines how the string is interpreted, as described below.
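To make the positional summation concrete, here is a minimal C sketch; the function name, the least-significant-bit-first array layout, and the choice of C are illustrative assumptions, not part of the notes.

    #include <stdio.h>

    /* Illustrative sketch (name hypothetical): computes the decimal value
     * of a (k+1)-bit string b_k ... b_1 b_0 via the summation
     * sum_{i=0}^{k} b_i * 2^i.  bits[i] holds b_i, so bits[0] is the
     * least significant bit. */
    unsigned long bits_to_decimal(const int bits[], int k)
    {
        unsigned long value = 0;
        for (int i = 0; i <= k; i++)
            value += (unsigned long)bits[i] << i;   /* b_i * 2^i */
        return value;
    }

    int main(void)
    {
        /* Worked example: 1011 in binary is
         * 1*2^3 + 0*2^2 + 1*2^1 + 1*2^0 = 11 decimal. */
        int bits[] = { 1, 1, 0, 1 };                /* b_0, b_1, b_2, b_3 */
        printf("%lu\n", bits_to_decimal(bits, 3));  /* prints 11 */
        return 0;
    }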

MM Hugue 2017-08-28