A test for the linear dependence of vectors may be given in terms of determinants.
Theorem. Let $A^1,\cdots,A^n$ be column vectors of dimension $n$. They are linearly dependent if and only if
$$\det(A^1,\cdots,A^n)=0.$$
Corollary. If a system of $n$ linear equations in $n$ unknowns has a matrix of coefficients whose determinant is not 0, then this system has a unique solution.
Proof. A system of $n$ linear equations in $n$ unknowns may be written as
$$x_1A^1+\cdots+x_nA^n=B,$$
where $A^1,\cdots,A^n$ are the column vectors of dimension $n$ of the matrix of coefficients and $B$ is a column vector of dimension $n$. Since $\det(A^1,\cdots,A^n)\ne 0$, the vectors $A^1,\cdots,A^n$ are linearly independent by the theorem, and hence form a basis of the space of column vectors of dimension $n$. Therefore $B$ is written uniquely as a linear combination of them, i.e. there exists a unique solution $x_1,\cdots,x_n$ of the system.
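As a quick numerical illustration (not part of the proof), the corollary can be checked with NumPy. The coefficient matrix below is the $3\times 3$ block that reappears in the first example, and the right-hand side $B$ is an arbitrary choice for the sketch.

```python
import numpy as np

# A minimal sketch of the corollary: the coefficient matrix has nonzero
# determinant, so the system x_1 A^1 + x_2 A^2 + x_3 A^3 = B has a unique
# solution. Both the matrix and the right-hand side are illustrative choices.
A = np.array([[1.0, 2.0, 5.0],
              [2.0, -1.0, 2.0],
              [1.0, 0.0, 1.0]])
B = np.array([3.0, 1.0, 1.0])

print(np.linalg.det(A))       # approximately 4, nonzero: columns independent
x = np.linalg.solve(A, B)     # the unique solution (x_1, x_2, x_3)
print(x)                      # [1. 1. 0.]
print(np.allclose(A @ x, B))  # True: the solution satisfies the system
```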
Since determinants can be used to test linear dependence, they can also be used to determine the rank of a matrix instead of using row operations as seen here.
Example. Let
$$A=\begin{pmatrix}
3 & 1 & 2 & 5\\
1 & 2 & -1 & 2\\
1 & 1 & 0 & 1
\end{pmatrix}.$$
Since $A$ is a $3\times 4$ matrix, its rank is at most 3. If we can find three linearly independent column vectors, the rank is 3. In fact, the subdeterminant from columns 2, 3, and 4 is
$$\left|\begin{array}{ccc}
1 & 2 & 5\\
2 & -1 & 2\\
1 & 0 & 1
\end{array}\right|=4.$$
So, the rank is exactly 3.
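This computation can also be verified numerically. Here is a sketch using NumPy (column indices in the code are 0-based), with `np.linalg.matrix_rank` as a cross-check:

```python
import numpy as np

# A sketch checking the first example: the 3x3 minor from columns 2, 3, 4
# of A equals 4, so A has three independent columns and rank 3.
A = np.array([[3, 1, 2, 5],
              [1, 2, -1, 2],
              [1, 1, 0, 1]], dtype=float)

minor = A[:, [1, 2, 3]]          # columns 2, 3, 4 (0-indexed)
print(np.linalg.det(minor))      # approximately 4.0
print(np.linalg.matrix_rank(A))  # 3
```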
Example. Let
$$B=\begin{pmatrix}
3 & 1 & 2 & 5\\
1 & 2 & -1 & 2\\
4 & 3 & 1 & 7
\end{pmatrix}.$$
Every $3\times 3$ subdeterminant has value 0 (in fact, the third row of $B$ is the sum of the first two), so the rank of $B$ is at most 2. The first two rows are linearly independent since the determinant
$$\left|\begin{array}{cc}
3 & 1\\
1 & 2
\end{array}\right|$$
is not 0 (its value is 5). Hence, the rank of $B$ is exactly 2.
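Again as a numerical sketch, one can confirm with NumPy that all four $3\times 3$ subdeterminants of $B$ vanish while a $2\times 2$ one does not:

```python
from itertools import combinations
import numpy as np

# A sketch for the second example: every 3x3 subdeterminant of B vanishes
# (row 3 is the sum of rows 1 and 2), while the 2x2 minor from the first
# two rows and columns is 5, so rank(B) = 2.
B = np.array([[3, 1, 2, 5],
              [1, 2, -1, 2],
              [4, 3, 1, 7]], dtype=float)

for cols in combinations(range(4), 3):            # all choices of 3 columns
    print(cols, np.linalg.det(B[:, list(cols)]))  # each is (numerically) 0

print(np.linalg.det(B[:2, :2]))                   # 5.0, nonzero
print(np.linalg.matrix_rank(B))                   # 2
```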