Linear Algebra Note#5: Transpose & Permutation
This note is based on MIT 18.06 📒
Content
- #1: Row, Column & Matrix
- #2: Elimination
- #3: Multiplication & Inverse
- #4: LU Factorization
- #5: Transpose & Permutation 👈
- #6: Vector Space
Transpose
The transpose operation turns the columns of a matrix into the rows of its transpose (and vice versa).
Consider a rectangular matrix of size \(3\times 2\):
\[ \begin{bmatrix} 1 & 3 \\\\ 2 & 3 \\\\ 4 & 1 \end{bmatrix}^{T} = \begin{bmatrix} 1 & 2 & 4 \\\\ 3 & 3 & 1 \end{bmatrix} \]
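The same example in a minimal pure-Python sketch (the `transpose` helper here is just for illustration, not part of the note):

```python
# Transpose of a 3x2 matrix stored as a list of rows.
A = [[1, 3],
     [2, 3],
     [4, 1]]

def transpose(M):
    # Row i of M becomes column i of the result: (M^T)[j][i] = M[i][j].
    return [list(col) for col in zip(*M)]

print(transpose(A))  # [[1, 2, 4], [3, 3, 1]]
```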
Permutation
A permutation matrix \(P\) executes row exchanges:
\[ \begin{array}{c} \begin{bmatrix} 0 & 1 & 0 \\\\ 1 & 0 & 0 \\\\ 0 & 0 & 1 \end{bmatrix} \\ P \end{array} \begin{array}{c} \begin{bmatrix} 1 & 2 & 1 \\\\ 3 & 8 & 1 \\\\ 0 & 4 & 1 \end{bmatrix} \\ A \end{array} = \begin{array}{c} \begin{bmatrix} 3 & 8 & 1 \\\\ 1 & 2 & 1 \\\\ 0 & 4 & 1 \end{bmatrix} \\ \ \end{array} \]
For \(n\times n\) matrices there are \(n!\) permutation matrices in total (the identity included), one for each ordering of the \(n\) rows: counting the orderings is a simple combinatorics problem.
- The permutation matrices form a group: the product of two permutation matrices is again a permutation matrix, and so is the inverse of any permutation matrix.
- The inverse of a permutation matrix is just its transpose: \(P^{-1} = P^{T}\).
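The two facts above can be checked on the example \(P\) and \(A\) with a small pure-Python sketch (the `matmul`/`transpose` helpers are hypothetical, added only for this demo):

```python
# P swaps rows 1 and 2; multiplying P*A applies that swap to A,
# and P^T * P = I confirms that P^{-1} = P^T.
P = [[0, 1, 0],
     [1, 0, 0],
     [0, 0, 1]]
A = [[1, 2, 1],
     [3, 8, 1],
     [0, 4, 1]]
I = [[1, 0, 0],
     [0, 1, 0],
     [0, 0, 1]]

def matmul(X, Y):
    # Plain triple-loop matrix multiplication.
    return [[sum(X[i][k] * Y[k][j] for k in range(len(Y)))
             for j in range(len(Y[0]))] for i in range(len(X))]

def transpose(M):
    return [list(col) for col in zip(*M)]

print(matmul(P, A))                   # [[3, 8, 1], [1, 2, 1], [0, 4, 1]]
print(matmul(transpose(P), P) == I)   # True
```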
Symmetric Matrix
A matrix \(A\) is symmetric when: \(A^T = A\)
The product of any matrix \(R\) with its transpose is always symmetric, even when \(R\) is rectangular:
\[ (R^TR)^T = R^TR \]
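Using the rectangular matrix from the transpose example, a quick pure-Python check (helpers again hypothetical):

```python
# R^T R is square and symmetric even though R itself is 3x2.
R = [[1, 3],
     [2, 3],
     [4, 1]]

def transpose(M):
    return [list(col) for col in zip(*M)]

def matmul(X, Y):
    return [[sum(X[i][k] * Y[k][j] for k in range(len(Y)))
             for j in range(len(Y[0]))] for i in range(len(X))]

S = matmul(transpose(R), R)
print(S)                   # [[21, 13], [13, 19]]
print(S == transpose(S))   # True: S is symmetric
```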