Linear algebra

Let $x = (x_1, \ldots, x_n)$ and $y = (y_1, \ldots, y_n)$. The scalar product of $x$ and $y$ is defined as: $$ x \cdot y = \sum_i x_i y_i $$ Let $A = (a_{i,j})$ be an $n \times m$ matrix and $B = (b_{i,j})$ be an $m \times p$ matrix. The matrix product $A B$ is an $n \times p$ matrix $C = (c_{i,j})$ such that: $$ c_{i,j} = \sum_{k} a_{i,k} b_{k,j} $$ that is, $c_{i,j}$ is the scalar product of the $i$th row of $A$ and the $j$th column of $B$.
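
For concreteness, here is a minimal NumPy sketch (the vectors and matrices are arbitrary, chosen only for illustration) that computes the scalar product and the matrix product entry by entry, exactly as in the definitions above, and checks both against NumPy's built-in operations.

```python
import numpy as np

x = np.array([1.0, 2.0, 3.0])
y = np.array([4.0, 5.0, 6.0])

# Scalar product: sum of elementwise products.
dot = sum(x_i * y_i for x_i, y_i in zip(x, y))
assert np.isclose(dot, x @ y)

A = np.array([[1.0, 2.0], [3.0, 4.0], [5.0, 6.0]])   # 3 x 2
B = np.array([[7.0, 8.0, 9.0], [10.0, 11.0, 12.0]])  # 2 x 3

# Matrix product: c[i, j] is the scalar product of row i of A and column j of B.
n, m = A.shape
p = B.shape[1]
C = np.zeros((n, p))
for i in range(n):
    for j in range(p):
        C[i, j] = A[i, :] @ B[:, j]
assert np.allclose(C, A @ B)
```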

The transpose $A^T$ of a matrix $A$ is the matrix whose rows are the columns of $A$, that is $$ a^{T}_{i,j} = a_{j,i} $$ A square matrix $A$ is symmetric if $$A^T = A,$$ that is, $a_{i,j} = a_{j,i}$: the matrix is mirrored across its main diagonal.
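
As a quick check, a small NumPy sketch (arbitrary matrices, for illustration only): transposition swaps the two indices, and symmetry can be tested by comparing $A$ with $A^T$.

```python
import numpy as np

A = np.array([[1.0, 2.0], [3.0, 4.0], [5.0, 6.0]])

# a^T[i, j] == a[j, i] for every entry.
T = np.array([[A[j, i] for j in range(A.shape[0])]
              for i in range(A.shape[1])])
assert np.allclose(A.T, T)

# A symmetric matrix equals its own transpose.
S = np.array([[2.0, 1.0], [1.0, 3.0]])
assert np.allclose(S, S.T)
```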

The inverse of a square matrix $A$ is a matrix $A^{-1}$ such that $$A A^{-1} =A^{-1} A = I,$$ where $I$ is the identity matrix (the matrix with 1 on the diagonal and 0 elsewhere). Not every square matrix is invertible; a matrix that is not invertible is called singular.
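
A short NumPy sketch (with an arbitrary invertible matrix) verifying the defining property $A A^{-1} = A^{-1} A = I$, and showing that trying to invert a singular matrix raises an error.

```python
import numpy as np

A = np.array([[2.0, 1.0], [1.0, 3.0]])
A_inv = np.linalg.inv(A)

I = np.eye(2)
assert np.allclose(A @ A_inv, I)
assert np.allclose(A_inv @ A, I)

# A singular matrix (here its rows are linearly dependent) has no inverse.
S = np.array([[1.0, 2.0], [2.0, 4.0]])
try:
    np.linalg.inv(S)
except np.linalg.LinAlgError:
    print("S is singular: not invertible")
```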

Matrix inversion is useful for solving linear systems. A linear system has the form $$A x = b,$$ where $A$ is a known matrix, $b$ is a known vector, and $x$ is the vector of unknowns. If $A$ is square and not singular, then $$x = A^{-1} b.$$
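
In practice one rarely forms $A^{-1}$ explicitly, since a dedicated solver is cheaper and more accurate; the minimal sketch below (arbitrary data) compares the two routes.

```python
import numpy as np

A = np.array([[3.0, 1.0], [1.0, 2.0]])
b = np.array([9.0, 8.0])

x_inv = np.linalg.inv(A) @ b      # x = A^{-1} b, as in the formula above
x_solve = np.linalg.solve(A, b)   # preferred: solves A x = b directly

assert np.allclose(x_inv, x_solve)
assert np.allclose(A @ x_solve, b)
```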

A vector $x \neq 0$ is a left-eigenvector of a square matrix $A$ if $$x A = \lambda x,$$ where $\lambda$ is a scalar called the left-eigenvalue of $A$ associated with $x$. A vector $x \neq 0$ is a right-eigenvector of a matrix $A$ if $$A x = \lambda x,$$ where $\lambda$ is a scalar called the right-eigenvalue of $A$ associated with $x$. In fact, left and right eigenvalues are the same. This is not the case, in general, for eigenvectors. The left-eigenvectors of a matrix $A$ correspond to the right-eigenvectors of its transpose $A^T$. If $A$ is symmetric, there is no difference between left and right eigenvectors. Notice that the equation $x A = \lambda x$ is not a linear system, since both $\lambda$ and $x$ are variables.
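
The relation between left and right eigenvectors can be checked numerically; the sketch below (an arbitrary nonsymmetric matrix, for illustration) obtains the left-eigenvectors of $A$ as right-eigenvectors of $A^T$ and verifies that the eigenvalues coincide.

```python
import numpy as np

A = np.array([[2.0, 1.0], [0.0, 3.0]])  # nonsymmetric, for illustration

# Right-eigenvectors: A v = lambda v.
vals_right, V = np.linalg.eig(A)

# Left-eigenvectors of A are right-eigenvectors of A^T,
# since x A = lambda x is equivalent to A^T x^T = lambda x^T.
vals_left, W = np.linalg.eig(A.T)

# Left and right eigenvalues coincide (possibly listed in a different order).
assert np.allclose(np.sort(vals_right), np.sort(vals_left))

# Each column of V satisfies A v = lambda v; each column of W satisfies w A = lambda w.
for lam, v in zip(vals_right, V.T):
    assert np.allclose(A @ v, lam * v)
for lam, w in zip(vals_left, W.T):
    assert np.allclose(w @ A, lam * w)
```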

The eigenvalues of a square matrix $A$ of size $n \times n$ are the roots of the characteristic polynomial $$p(\lambda) = \det(A - \lambda I),$$ where $\det(\cdot)$ is the determinant. The degree of $p(\lambda)$ is $n$, hence $A$ has $n$ eigenvalues (counted with multiplicity), but some may be complex numbers (even for real matrices) and some may be repeated. The algebraic multiplicity of an eigenvalue $\lambda$ is the number of times $\lambda$ appears as a root of the characteristic polynomial. The geometric multiplicity of an eigenvalue $\lambda$ is the number of linearly independent eigenvectors associated with $\lambda$. The geometric multiplicity is always less than or equal to the algebraic multiplicity. The set $\sigma(A)$ of distinct eigenvalues is called the spectrum of $A$, and the spectral radius is the nonnegative number: $$ \rho(A) = \max_{\lambda \in \sigma(A)} |\lambda| $$
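
A small NumPy check (arbitrary matrix, for illustration): the eigenvalues returned by the solver are roots of the characteristic polynomial, and the spectral radius is the largest eigenvalue in absolute value.

```python
import numpy as np

A = np.array([[0.0, 1.0], [-2.0, -3.0]])
eigvals = np.linalg.eigvals(A)

# Each eigenvalue is a root of p(lambda) = det(A - lambda I).
for lam in eigvals:
    assert np.isclose(np.linalg.det(A - lam * np.eye(2)), 0.0)

# Spectral radius: maximum modulus over the spectrum.
rho = max(abs(lam) for lam in eigvals)
print(rho)  # 2.0 for this matrix, whose eigenvalues are -1 and -2
```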

The Perron-Frobenius Theorem states the following. If $A \geq 0$ is irreducible (that is, if the directed graph of $A$ is strongly connected), then:

  1. the spectral radius $r = \rho(A) > 0$ and $r \in \sigma(A)$;
  2. the algebraic multiplicity of $r$ is 1 (hence also its geometric multiplicity is 1);
  3. the unique (up to a multiplicative constant) eigenvector $x$ such that $A x = r x$ is positive.
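
The sketch below (a small nonnegative matrix whose graph is strongly connected, chosen only as an example) illustrates the theorem: the spectral radius is itself an eigenvalue, and the associated eigenvector can be scaled to be strictly positive.

```python
import numpy as np

# Nonnegative irreducible matrix: its graph (edge i -> j when a[i, j] > 0)
# is strongly connected.
A = np.array([[0.0, 1.0, 1.0],
              [1.0, 0.0, 1.0],
              [1.0, 1.0, 0.0]])

eigvals, eigvecs = np.linalg.eig(A)
r = max(abs(lam) for lam in eigvals)

# The spectral radius r is itself an eigenvalue of A (statement 1).
k = int(np.argmin(np.abs(eigvals - r)))
assert np.isclose(eigvals[k].real, r)

# The associated eigenvector can be normalized to be strictly positive (statement 3).
x = eigvecs[:, k].real
x = x / x.sum()            # fix the multiplicative constant
assert np.all(x > 0)
assert np.allclose(A @ x, r * x)
```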