Series: Rotations

This series starts from 2D rotation matrices and works through the mathematical tools developed to understand and describe rotations: imaginary numbers, intrinsic and extrinsic rotations, Euler angles, quaternions, geometric algebra and Lie theory.
Topics: Rotation matrices, Complex numbers, Euler angles, Quaternions, Reflections, Geometric algebra, Lie algebra, exp(iθ)
0. Establishing fundamental facts about rotations
First, let's show that rotations are linear transformations. Note that:
The sum of two rotated vectors is the same as the rotation of the sum of the vectors. In other words, \( R(\vec{v}) + R(\vec{w}) = R(\vec{v} + \vec{w}) \).
Scaling a rotated vector is the same as rotating the scaled vector. In other words \( R(\alpha \vec{v}) = \alpha R(\vec{v}) \).
These two properties establish that rotations are linear transformations. Thus we can represent any rotation as a matrix. That is, \( R(\vec{v}) = R\vec{v} \), where (with a slight abuse of notation) \( R \) is a matrix and \( \vec{v} \) is the vector to be rotated.
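The two linearity properties above are easy to check numerically. Here is a minimal sketch using NumPy; the sample angle and vectors are arbitrary choices for illustration, not part of the text:

```python
import numpy as np

# 2D rotation by a sample angle (0.7 rad is an arbitrary choice)
theta = 0.7
R = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])

v = np.array([1.0, 2.0])
w = np.array([-3.0, 0.5])
alpha = 2.5

# Additivity: R(v) + R(w) == R(v + w)
assert np.allclose(R @ v + R @ w, R @ (v + w))
# Homogeneity: R(alpha * v) == alpha * R(v)
assert np.allclose(R @ (alpha * v), alpha * (R @ v))
print("linearity checks passed")
```

The same check works for any angle and any pair of vectors, since matrix multiplication is itself linear.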
Also note that a rotation preserves lengths and the angles between vectors. Equivalently, the dot product of two rotated vectors is the same as the dot product of the original vectors:
\( Rv \cdot Rw = v \cdot w \Rightarrow (Rv)^T(Rw) = v^Tw \Rightarrow v^TR^TRw = v^Tw \).
Since this holds for all \( v \) and \( w \), it implies that \( R^TR = I \), where \( I \) is the identity matrix. Thus, the inverse of a rotation matrix is its transpose. It also tells us that the columns of a rotation matrix are orthonormal. Note that \( R^TR = I \) characterizes all orthogonal matrices, which include reflections as well; rotations are the orthogonal matrices with determinant \( +1 \).
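These consequences of \( R^TR = I \) can also be verified numerically. A quick sketch with NumPy, again using an arbitrary sample angle and vectors:

```python
import numpy as np

theta = 0.7  # arbitrary sample angle
R = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])
v = np.array([1.0, 2.0])
w = np.array([-3.0, 0.5])

# R^T R = I: the columns of R are orthonormal
assert np.allclose(R.T @ R, np.eye(2))
# The dot product is preserved under rotation
assert np.allclose((R @ v) @ (R @ w), v @ w)
# The inverse of a rotation matrix is its transpose
assert np.allclose(np.linalg.inv(R), R.T)
# A rotation (as opposed to a reflection) has determinant +1
assert np.isclose(np.linalg.det(R), 1.0)
print("orthogonality checks passed")
```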
Note that all the above arguments are valid for rotations in any dimension and in any orthonormal basis.