In linear algebra, one associates a polynomial to every square matrix, its characteristic polynomial or secular equation. This polynomial encodes several important properties of the matrix, most notably its eigenvalues, its determinant and its trace.
Given a square matrix A, we want to find a polynomial whose roots are precisely the eigenvalues of A. For a diagonal matrix A, the characteristic polynomial is easy to define: if the diagonal entries are a, b, c and so on, the characteristic polynomial will be
- (t − a)(t − b)(t − c)...
up to a convention about sign (+ or −). This works because the diagonal entries are also the eigenvalues of this matrix.
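This can be checked concretely with a computer algebra system. The following sketch uses SymPy (a library choice made here, not part of the discussion above) with a 3×3 diagonal matrix whose entries a, b, c are symbolic:

```python
from sympy import symbols, diag, eye, factor

t, a, b, c = symbols('t a b c')

# Diagonal matrix with illustrative symbolic diagonal entries a, b, c.
A = diag(a, b, c)

# The characteristic polynomial straight from det(t*I - A).
p = (t * eye(3) - A).det()

print(factor(p))  # equals (t - a)*(t - b)*(t - c)
```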
For a general matrix A, one can proceed as follows. If λ is an eigenvalue of A, then there is an eigenvector v≠0 such that
- A v = λv,
- (λI − A)v = 0
(where I is the identity matrix). Since v is non-zero, the matrix λI − A is singular, which in turn means that its determinant is 0. Conversely, if det(λI − A) = 0, then λI − A is singular, so there is a non-zero vector v with (λI − A)v = 0, and that v is an eigenvector of A with eigenvalue λ. The roots of the function det(t I − A) are therefore exactly the eigenvalues of A, and since this function is a polynomial in t, we are done. Accordingly, the characteristic polynomial of A, denoted pA(t), is defined by
- pA(t) = det(t I − A)
where I denotes the n-by-n identity matrix. This is indeed a polynomial, since determinants are defined in terms of sums of products. (Some authors define the characteristic polynomial to be det(A − t I); the difference is immaterial since the two polynomials differ at most by a sign.)
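As a sketch of this definition in practice, the snippet below (again assuming SymPy, with a matrix chosen purely for illustration) computes det(t I − A) symbolically and compares it with the library's built-in characteristic-polynomial routine:

```python
from sympy import Matrix, symbols, eye, expand

t = symbols('t')

# An arbitrary 3x3 matrix, chosen here only for illustration.
A = Matrix([[1, 2, 0],
            [0, 3, 1],
            [4, 0, 1]])

# Characteristic polynomial directly from the definition p_A(t) = det(t*I - A).
p = expand((t * eye(3) - A).det())
print(p)

# SymPy's built-in routine should give the same polynomial
# (up to the sign convention mentioned above).
print(expand(A.charpoly(t).as_expr()))
```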
Suppose, for example, that we want to compute the characteristic polynomial of the 2×2 matrix A whose first row is (2, 1) and whose second row is (−1, 0). We have to compute the determinant of t I − A, whose first row is (t − 2, −1) and whose second row is (1, t), and this determinant is
- (t − 2)·t − (−1)·1 = t² − 2t + 1.
The latter is the characteristic polynomial of A.
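The computation in this example is easy to verify mechanically; assuming SymPy, the following snippet repeats it for the same matrix:

```python
from sympy import Matrix, symbols, eye, expand, factor

t = symbols('t')

# The matrix from the example above.
A = Matrix([[2, 1],
            [-1, 0]])

p = expand((t * eye(2) - A).det())
print(p)          # t**2 - 2*t + 1
print(factor(p))  # (t - 1)**2
```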
The polynomial pA(t) is monic (its leading coefficient is 1) and its degree is n. The most important fact about the characteristic polynomial was already mentioned in the motivational paragraph: the eigenvalues of A are precisely the roots of pA(t). The constant coefficient pA(0) is equal to (−1)ⁿ times the determinant of A, and the coefficient of tⁿ⁻¹ is equal to the negative of the trace of A.
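These coefficient identities can be spot-checked on a concrete matrix; the sketch below assumes SymPy and uses an illustrative 3×3 matrix, comparing the leading coefficient, the coefficient of tⁿ⁻¹, and the constant term against 1, −tr(A) and (−1)ⁿ det(A):

```python
from sympy import Matrix, symbols

t = symbols('t')

# Illustrative 3x3 matrix, so n = 3.
A = Matrix([[2, 0, 1],
            [3, 1, 0],
            [0, 5, 4]])

p = A.charpoly(t)
coeffs = p.all_coeffs()                    # [1, c_{n-1}, ..., c_0], highest power first

print(coeffs[0] == 1)                      # True: p_A is monic
print(coeffs[1] == -A.trace())             # True: coefficient of t**(n-1) is -tr(A)
print(coeffs[-1] == (-1) ** 3 * A.det())   # True: constant term is (-1)**n * det(A)
```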
Every real polynomial of odd degree has at least one real root, so for odd n every real matrix has at least one real eigenvalue. Many real polynomials of even degree have no real roots, but the fundamental theorem of algebra states that every polynomial of degree n has n complex roots, counted with multiplicities. The non-real roots of a real polynomial, and hence the non-real eigenvalues of a real matrix, come in complex-conjugate pairs.
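A quick numerical illustration of this, assuming NumPy and a randomly generated real 3×3 matrix (both choices made here only for the example), shows that at least one computed eigenvalue is real and that any non-real ones occur as conjugate pairs:

```python
import numpy as np

rng = np.random.default_rng(0)

# A random real matrix of odd size (n = 3), so at least one real eigenvalue.
A = rng.standard_normal((3, 3))
eigs = np.linalg.eigvals(A)

print(eigs)
# Count eigenvalues whose imaginary part is numerically zero:
# for odd n this count is at least 1, and the remaining eigenvalues
# occur in complex-conjugate pairs.
print(np.sum(np.abs(np.imag(eigs)) < 1e-9))
```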
For a 2×2 matrix A, the characteristic polynomial can be expressed neatly in terms of the trace and determinant:
- t² − tr(A) t + det(A)
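This identity can be confirmed symbolically for a generic 2×2 matrix; the sketch below assumes SymPy:

```python
from sympy import Matrix, symbols, expand

t, a, b, c, d = symbols('t a b c d')

# A generic 2x2 matrix with symbolic entries.
A = Matrix([[a, b],
            [c, d]])

p = A.charpoly(t).as_expr()
formula = t**2 - A.trace() * t + A.det()

print(expand(p - formula))   # 0, so the two expressions agree
```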
The Cayley-Hamilton theorem states that replacing t by A in the expression for pA(t) yields the zero matrix: pA(A) = 0. Informally speaking, every matrix satisfies its own characteristic equation. As a consequence, one can show that the minimal polynomial of A divides the characteristic polynomial of A.
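A small sketch of this, assuming SymPy and an arbitrarily chosen 2×2 matrix, evaluates pA at A itself and confirms that the result is the zero matrix:

```python
from sympy import Matrix, eye, zeros

# Illustrative matrix; any square matrix would do.
A = Matrix([[1, 2],
            [3, 4]])

p = A.charpoly()            # here p(t) = t**2 - 5*t - 2
n = A.shape[0]

# Evaluate p at the matrix A itself (Horner's scheme, with A**0 = I).
result = zeros(n, n)
for coeff in p.all_coeffs():
    result = result * A + coeff * eye(n)

print(result)               # the zero matrix, as the theorem predicts
```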
Two similar matrices have the same characteristic polynomial. The converse, however, is not true in general: two matrices with the same characteristic polynomial need not be similar.
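A standard illustration of the failed converse (the particular matrices are chosen here, not taken from the text above) is the 2×2 zero matrix versus a non-zero nilpotent matrix: both have characteristic polynomial t², yet the zero matrix is similar only to itself. In SymPy this can be checked directly:

```python
from sympy import Matrix, symbols

t = symbols('t')

# Two matrices with the same characteristic polynomial t**2:
Z = Matrix([[0, 0],
            [0, 0]])            # the zero matrix
N = Matrix([[0, 1],
            [0, 0]])            # a non-zero nilpotent matrix

print(Z.charpoly(t).as_expr())  # t**2
print(N.charpoly(t).as_expr())  # t**2

# They are not similar: P*Z*P**-1 equals Z for every invertible P,
# so the only matrix similar to the zero matrix is the zero matrix itself.
print(Z == N)                   # False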
The matrix A and its transpose have the same characteristic polynomial. A is similar to a triangular matrix if and only if its characteristic polynomial can be completely factored into linear factors over the field K containing the entries of A; in that case A is in fact similar to a matrix in Jordan normal form.
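If the Jordan form itself is wanted, SymPy can produce it directly; the matrix below is an illustrative one whose characteristic polynomial, (t − 2)³, splits into linear factors over the rationals:

```python
from sympy import Matrix

# Illustrative matrix whose characteristic polynomial is (t - 2)**3,
# i.e. it factors completely into linear factors, so a Jordan form exists.
A = Matrix([[2, 1, 0],
            [0, 2, 0],
            [0, 0, 2]])

P, J = A.jordan_form()
print(J)                        # upper-triangular (Jordan) matrix
print(P * J * P.inv() == A)     # True: A is similar to J
```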