Last time, we talked about diagonalization $A = X \Lambda X^{-1}$: $\Lambda$ has the same eigenvalues as $A$, and they are easy to find. Not only are the diagonal entries of $\Lambda$ the eigenvalues, but the columns of $X$ are the eigenvectors!
In particular, the change of basis from $A$ to $\Lambda$ is through eigenvectors. In the perspective of the eigenvector basis $x_1, \dots, x_n$, $A$ is just a diagonal operator (it breaks a vector into individual coordinates and scales along those dimensions).
We can call diagonalization an eigenvalue-eigenvector decomposition.
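As a concrete sanity check, here is a minimal NumPy sketch (the matrix $A$ is my own example, not from lecture) that computes $X$ and $\Lambda$ and reassembles $A = X \Lambda X^{-1}$:

```python
import numpy as np

# A minimal sketch: diagonalize A and reassemble it. The matrix A here is
# my own example, not from the lecture.
A = np.array([[4.0, 1.0],
              [2.0, 3.0]])

lam, X = np.linalg.eig(A)   # eigenvalues, and eigenvectors as columns of X
Lam = np.diag(lam)          # Lambda: eigenvalues on the diagonal

# The eigenvalue-eigenvector decomposition: A = X Lambda X^{-1}.
print(np.allclose(A, X @ Lam @ np.linalg.inv(X)))   # True
```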
Suppose we have a system of differential equations like $\frac{d\mathbf{u}}{dt} = A\mathbf{u}$, which might be tricky to solve if each derivative is in terms of the other functions. But if $A = X \Lambda X^{-1}$ is diagonalizable, it may be easier to solve this system by changing to $\mathbf{v} = X^{-1}\mathbf{u}$, where $\frac{d\mathbf{v}}{dt} = \Lambda \mathbf{v}$ decouples into $\frac{dv_i}{dt} = \lambda_i v_i$, which is much easier to deal with.
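A minimal sketch of this change of variables, with an example matrix and initial condition of my own choosing (SciPy's `expm` is used only to cross-check the answer):

```python
import numpy as np
from scipy.linalg import expm   # only used to cross-check the answer

# Sketch (all example values are mine): solve du/dt = A u, u(0) = u0,
# by changing to eigenvector coordinates v = X^{-1} u. There the system
# decouples into dv_i/dt = lambda_i v_i, so v_i(t) = exp(lambda_i t) v_i(0).
A = np.array([[1.0, 2.0],
              [2.0, 1.0]])
u0 = np.array([1.0, 0.0])
t = 0.5

lam, X = np.linalg.eig(A)
v0 = np.linalg.solve(X, u0)    # v(0) = X^{-1} u(0)
vt = np.exp(lam * t) * v0      # scale each coordinate independently
ut = X @ vt                    # back to the original coordinates

print(np.allclose(ut, expm(A * t) @ u0))   # True: matches the matrix exponential
```

The point is that in the eigenvector basis every coordinate evolves on its own.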
Let $A \in \mathbb{C}^{n \times n}$ have eigenvalues $\lambda_1, \dots, \lambda_n$. For $k = 1, \dots, n$, define
$$e_k = \sum_{1 \le i_1 < i_2 < \cdots < i_k \le n} \lambda_{i_1} \lambda_{i_2} \cdots \lambda_{i_k}.$$
Example
Take all ways to multiply $k$ of the eigenvalues and then add them. For $n = 3$ and $k = 2$: $e_2 = \lambda_1\lambda_2 + \lambda_1\lambda_3 + \lambda_2\lambda_3$.
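In code this is a one-liner over all size-$k$ subsets; a sketch with a hypothetical helper `e_k` and made-up eigenvalues:

```python
import numpy as np
from itertools import combinations

# Sketch (the helper name e_k is mine): the k-th elementary symmetric
# polynomial -- all ways to multiply k of the eigenvalues, then add.
def e_k(lams, k):
    return sum(np.prod(c) for c in combinations(lams, k))

lams = [2.0, 3.0, 5.0]
print(e_k(lams, 1))   # 2 + 3 + 5         = 10
print(e_k(lams, 2))   # 2*3 + 2*5 + 3*5   = 31
print(e_k(lams, 3))   # 2*3*5             = 30
```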
$e_k$ and Principal Submatrices
Submatrices: destroy some columns and rows. We call it a principal submatrix if we get rid of the same rows as columns (for each deleted index $i$, we delete both the $i$th row and the $i$th column).
Proposition
If $A$ has eigenvalues $\lambda_1, \dots, \lambda_n$, then
$$\det(tI - A) = t^n - e_1 t^{n-1} + e_2 t^{n-2} - \cdots + (-1)^n e_n \qquad \text{and} \qquad e_k = E_k(A),$$
where $E_k(A)$ denotes the sum of the determinants of all $k \times k$ principal submatrices of $A$.
It's easy to see why the first equality holds. If we factor the polynomial into $\det(tI - A) = (t - \lambda_1)(t - \lambda_2) \cdots (t - \lambda_n)$ and expand out, we get exactly $(-1)^k e_k$ as the coefficients. (When we select $k$ of the $-\lambda_i$ terms to get a $t^{n-k}$ term, we get exactly $(-1)^k e_k$.)
Exercise
Prove the second equality via induction with the Laplace expansion.
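Not a proof, but both equalities are easy to verify numerically; a sketch with an example matrix of my own:

```python
import numpy as np
from itertools import combinations

# Numerical check (the matrix A is my own example): e_k equals E_k(A),
# the sum of all k x k principal minors, and the characteristic
# polynomial of A has coefficients (-1)^k e_k.
A = np.array([[2.0, 1.0, 0.0],
              [1.0, 3.0, 1.0],
              [0.0, 1.0, 4.0]])
n = A.shape[0]
lams = np.linalg.eigvals(A)

for k in range(1, n + 1):
    ek = sum(np.prod(c) for c in combinations(lams, k))
    # E_k(A): keep the rows and columns indexed by S, delete the rest.
    Ek = sum(np.linalg.det(A[np.ix_(S, S)]) for S in combinations(range(n), k))
    print(k, np.isclose(ek, Ek))   # True for every k

# Coefficients of det(tI - A), highest power first: [1, -e_1, e_2, -e_3].
print(np.poly(A))
```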
Consequences:
$\det A = e_n = \lambda_1 \lambda_2 \cdots \lambda_n$. For a diagonalizable matrix, this is easy to see, but it is useful to know for matrices that are not diagonalizable too. Another perspective: when there is a $0$ eigenvalue, the determinant collapses to zero and the matrix is singular. When there is no $0$ eigenvalue, the matrix is invertible, which we saw last time.
(see determinant)
$\operatorname{tr}(A) = e_1 = \lambda_1 + \lambda_2 + \cdots + \lambda_n$. This implies that the traces of similar matrices are the same! So the trace is a property of the underlying transformation, which is not necessarily obvious.
(see trace)
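Both consequences are easy to check numerically; a short NumPy sketch (the matrices $A$ and $S$ are my own examples):

```python
import numpy as np

# Quick check (example matrices are mine): det A = product of eigenvalues,
# tr A = sum of eigenvalues, and similar matrices share the same trace.
A = np.array([[1.0, 4.0],
              [2.0, 3.0]])
lams = np.linalg.eigvals(A)

print(np.isclose(np.linalg.det(A), np.prod(lams)))   # True: det A = e_n
print(np.isclose(np.trace(A), np.sum(lams)))         # True: tr A = e_1

# Similar matrices B = S A S^{-1} have the same trace.
S = np.array([[1.0, 1.0],
              [0.0, 1.0]])
B = S @ A @ np.linalg.inv(S)
print(np.isclose(np.trace(A), np.trace(B)))          # True
```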
Coming up:
Seeing a relationship between the spectra of $AB$ and $BA$: they have the same nonzero eigenvalues! (even for rectangular matrices $A \in \mathbb{C}^{m \times n}$ and $B \in \mathbb{C}^{n \times m}$)
Commutativity of $A$ and $B$, i.e. $AB = BA$, if and only if $A$ and $B$ are simultaneously diagonalizable (assuming each is diagonalizable on its own).
To get there: some background
Multiplying partitioned matrices
Suppose $A$ is partitioned, not necessarily regularly. Suppose $B$ is partitioned conformally (its rows are partitioned in the same way as the columns of $A$).
The $(i,j)$th block of the product is the submatrix
$$(AB)_{ij} = \sum_k A_{ik} B_{kj},$$
where $A_{ik}$ is $m_i \times n_k$ and $B_{kj}$ is $n_k \times p_j$.
This is suspiciously like multiplying regular matrices with single entries! And each product $A_{ik} B_{kj}$ in the sum is of size $m_i \times p_j$. So why does this work?
Say I want to take a single row of $A$ and a single column of $B$ and compute their inner product, like doing it for a "normal" matrix with single entries. Block multiplication does the same thing, just chunk by chunk.
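A small NumPy check of the block formula (the partition sizes and random matrices are my own choices): $A$'s columns are split as $2 + 3$, so $B$'s rows are split conformally.

```python
import numpy as np

# Sketch: multiply partitioned matrices block by block and compare
# against the ordinary product.
rng = np.random.default_rng(0)
A = rng.standard_normal((4, 5))
B = rng.standard_normal((5, 3))

# Partition A into 2x2 blocks; B is partitioned conformally
# (its row split 2 + 3 matches A's column split).
A11, A12 = A[:2, :2], A[:2, 2:]
A21, A22 = A[2:, :2], A[2:, 2:]
B11, B12 = B[:2, :2], B[:2, 2:]
B21, B22 = B[2:, :2], B[2:, 2:]

# (AB)_{ij} = sum_k A_{ik} B_{kj}, just like scalar matrix multiplication.
AB = np.block([[A11 @ B11 + A12 @ B21, A11 @ B12 + A12 @ B22],
               [A21 @ B11 + A22 @ B21, A21 @ B12 + A22 @ B22]])
print(np.allclose(AB, A @ B))   # True
```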
Permutation Matrices
Permutation Matrix
A permutation matrix is a square matrix with exactly one $1$ in every row, exactly one $1$ in every column, and zeroes everywhere else.
(multiplying $PA$ reorders/rearranges the rows of $A$, and multiplying $AP$ reorders the columns of $A$, according to the pattern of the rows or columns of $P$ respectively)
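A small NumPy demo (the particular permutation is my own example):

```python
import numpy as np

# Row i of P has its single 1 in some column j, so row i of P @ A is
# row j of A; similarly A @ P rearranges the columns of A.
P = np.array([[0, 1, 0],
              [0, 0, 1],
              [1, 0, 0]])
A = np.arange(9).reshape(3, 3)

print(P @ A)   # rows of A, rearranged
print(A @ P)   # columns of A, rearranged
```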