A single bit can store a binary piece of information. We can use it to distinguish between two states.
These two states could be represented by True \(T\) and False \(F\), but by convention we use \(1\) and \(0\).
We can combine bits to store more complex pieces of information. If we have \(n\) bits, we can distinguish between \(2^n\) states.
Eight bits together constitute a byte. This can represent one of \(2^8=256\) states.
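As a small sketch of the counting rule above (the function name is illustrative, not standard):

```python
# With n bits we can distinguish 2**n states.
def num_states(n):
    return 2 ** n

assert num_states(1) == 2    # a single bit: 0 or 1
assert num_states(8) == 256  # a byte
```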
We can perform basic operations on inputs.
A simple operation is the unary NOT operator, which inverts a bit's value: \(1\) becomes \(0\) and \(0\) becomes \(1\).
We also have binary operators, which take two bits and return another bit. Of note are AND, OR and XOR.
These operations are a model of Boolean algebra. A binary function assigns an output bit to each of the \(4\) input combinations, so there are \(2^4=16\) possible binary functions; likewise, a unary function assigns an output bit to each of the \(2\) inputs, giving \(2^2=4\) possible unary operators.
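A minimal Python sketch of these operators acting on single bits (the function names are ours, chosen for readability; Python's `&`, `|`, `^` do the actual work):

```python
def NOT(a):      # unary: 1 -> 0, 0 -> 1
    return 1 - a

def AND(a, b):
    return a & b

def OR(a, b):
    return a | b

def XOR(a, b):
    return a ^ b

# Truth table for XOR: the output is 1 exactly when the inputs differ.
for a in (0, 1):
    for b in (0, 1):
        assert XOR(a, b) == (1 if a != b else 0)
```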
As in logic, we can combine elementary logical gates to create other logic gates.
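As one illustration of such a combination, XOR can itself be built from AND, OR and NOT (this is one of several equivalent constructions):

```python
def NOT(a): return 1 - a
def AND(a, b): return a & b
def OR(a, b): return a | b

# XOR from elementary gates: (a OR b) AND NOT (a AND b)
def XOR(a, b):
    return AND(OR(a, b), NOT(AND(a, b)))

# Check against Python's built-in XOR on all four input pairs.
for a in (0, 1):
    for b in (0, 1):
        assert XOR(a, b) == a ^ b
```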
These operations can also be performed on a series of bits; however, for these operations each bit of the output depends only on the corresponding bit (or bits) of the input, independently of all other positions.
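This position-by-position behaviour is easy to see with Python's bitwise operators on binary literals:

```python
a = 0b1100
b = 0b1010

# Each bit position is computed independently of its neighbours.
assert a & b == 0b1000
assert a | b == 0b1110
assert a ^ b == 0b0110
```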
We can have other operations where bits do interact. An important operator is the bit shift. This takes a series of bits and shifts them all to the right or left by one place, pushing one bit off the end.
The new bit entering at the opposite end is typically \(0\), though some variants (such as an arithmetic right shift) insert a \(1\) by copying the sign bit.
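In Python, `<<` and `>>` perform these shifts; the width mask at the end is our way of imitating a fixed 8-bit register, since Python integers are unbounded:

```python
x = 0b0110  # decimal 6

assert x << 1 == 0b1100  # left shift: a new 0 enters on the right
assert x >> 1 == 0b0011  # right shift: the rightmost bit is pushed off

# In a fixed 8-bit register, a bit shifted past the top is discarded.
assert (0b10000000 << 1) & 0xFF == 0
```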
Logic gates are not needed for bit shifts. Instead, simply wiring each input line to the neighbouring output position achieves the same effect.
Simple operators, such as bitwise operators and bit shifts, can be combined to create more complex operators. These could have a large number of inputs, outputs and logical gates.
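A classic example of such a combination is the half adder, which adds two bits using only an XOR gate (for the sum) and an AND gate (for the carry); chaining adders like this builds up full multi-bit arithmetic:

```python
# A half adder: add two bits, producing a sum bit and a carry bit.
def half_adder(a, b):
    return a ^ b, a & b  # (sum, carry)

assert half_adder(0, 0) == (0, 0)
assert half_adder(1, 0) == (1, 0)
assert half_adder(1, 1) == (0, 1)  # 1 + 1 = binary 10
```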