== Evaluating linear independence ==

=== The zero vector ===

If one or more vectors from a given sequence of vectors <math>\mathbf{v}_1, \dots, \mathbf{v}_k</math> is the zero vector <math>\mathbf{0}</math> then the vectors <math>\mathbf{v}_1, \dots, \mathbf{v}_k</math> are necessarily linearly dependent (and consequently, they are not linearly independent). To see why, suppose that <math>i</math> is an index (i.e. an element of <math>\{ 1, \ldots, k \}</math>) such that <math>\mathbf{v}_i = \mathbf{0}.</math> Then let <math>a_{i} := 1</math> (any other non-zero scalar also works) and let all other scalars be <math>0</math> (explicitly, for any index <math>j \neq i</math>, let <math>a_{j} := 0</math>, so that <math>a_{j} \mathbf{v}_j = 0 \mathbf{v}_j = \mathbf{0}</math>). Simplifying <math>a_1 \mathbf{v}_1 + \cdots + a_k\mathbf{v}_k</math> gives:
:<math>a_1 \mathbf{v}_1 + \cdots + a_k\mathbf{v}_k = \mathbf{0} + \cdots + \mathbf{0} + a_i \mathbf{v}_i + \mathbf{0} + \cdots + \mathbf{0} = a_i \mathbf{v}_i = a_i \mathbf{0} = \mathbf{0}.</math>
Because not all scalars are zero (in particular, <math>a_{i} \neq 0</math>), this proves that the vectors <math>\mathbf{v}_1, \dots, \mathbf{v}_k</math> are linearly dependent. As a consequence, the zero vector cannot belong to any collection of vectors that is linearly ''in''dependent.

Now consider the special case where the sequence <math>\mathbf{v}_1, \dots, \mathbf{v}_k</math> has length <math>1</math> (i.e. the case where <math>k = 1</math>). A collection of vectors that consists of exactly one vector is linearly dependent if and only if that vector is zero. Explicitly, if <math>\mathbf{v}_1</math> is any vector then the sequence <math>\mathbf{v}_1</math> (which is a sequence of length <math>1</math>) is linearly dependent if and only if {{nowrap|<math>\mathbf{v}_1 = \mathbf{0}</math>;}} equivalently, the collection <math>\mathbf{v}_1</math> is linearly independent if and only if <math>\mathbf{v}_1 \neq \mathbf{0}.</math>

=== Linear dependence and independence of two vectors ===

This example considers the special case where there are exactly two vectors <math>\mathbf{u}</math> and <math>\mathbf{v}</math> from some real or complex vector space. The vectors <math>\mathbf{u}</math> and <math>\mathbf{v}</math> are linearly dependent [[if and only if]] at least one of the following is true:
# <math>\mathbf{u}</math> is a scalar multiple of <math>\mathbf{v}</math> (explicitly, this means that there exists a scalar <math>c</math> such that <math>\mathbf{u} = c \mathbf{v}</math>) or
# <math>\mathbf{v}</math> is a scalar multiple of <math>\mathbf{u}</math> (explicitly, this means that there exists a scalar <math>c</math> such that <math>\mathbf{v} = c \mathbf{u}</math>).

If <math>\mathbf{u} = \mathbf{0}</math> then by setting <math>c := 0</math> we have <math>c \mathbf{v} = 0 \mathbf{v} = \mathbf{0} = \mathbf{u}</math> (this equality holds no matter what the value of <math>\mathbf{v}</math> is), which shows that (1) is true in this particular case. Similarly, if <math>\mathbf{v} = \mathbf{0}</math> then (2) is true because <math>\mathbf{v} = 0 \mathbf{u}.</math> If <math>\mathbf{u} = \mathbf{v}</math> (for instance, if they are both equal to the zero vector <math>\mathbf{0}</math>) then ''both'' (1) and (2) are true (by using <math>c := 1</math> for both).

If <math>\mathbf{u} = c \mathbf{v}</math> then <math>\mathbf{u} \neq \mathbf{0}</math> is only possible if <math>c \neq 0</math> ''and'' <math>\mathbf{v} \neq \mathbf{0}</math>; in this case, it is possible to multiply both sides by <math display="inline">\frac{1}{c}</math> to conclude <math display="inline">\mathbf{v} = \frac{1}{c} \mathbf{u}.</math> This shows that if <math>\mathbf{u} \neq \mathbf{0}</math> and <math>\mathbf{v} \neq \mathbf{0}</math> then (1) is true if and only if (2) is true; that is, in this particular case either both (1) and (2) are true (and the vectors are linearly dependent) or else both (1) and (2) are false (and the vectors are linearly ''in''dependent). If <math>\mathbf{u} = c \mathbf{v}</math> but instead <math>\mathbf{u} = \mathbf{0}</math> then at least one of <math>c</math> and <math>\mathbf{v}</math> must be zero. Moreover, if exactly one of <math>\mathbf{u}</math> and <math>\mathbf{v}</math> is <math>\mathbf{0}</math> (while the other is non-zero) then exactly one of (1) and (2) is true (with the other being false).

In summary, the vectors <math>\mathbf{u}</math> and <math>\mathbf{v}</math> are linearly ''in''dependent if and only if <math>\mathbf{u}</math> is not a scalar multiple of <math>\mathbf{v}</math> ''and'' <math>\mathbf{v}</math> is not a scalar multiple of <math>\mathbf{u}</math>.
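This criterion is easy to check numerically. The following is a minimal illustrative sketch using the NumPy library; the helper name <code>are_dependent_pair</code> and the tolerance <code>1e-12</code> are arbitrary choices, not part of the standard presentation. It tests dependence by computing the rank of the matrix whose columns are the two vectors:

<syntaxhighlight lang="python">
import numpy as np

def are_dependent_pair(u, v, tol=1e-12):
    """Illustrative helper: True if u and v are linearly dependent,
    i.e. one is a scalar multiple of the other."""
    u = np.asarray(u, dtype=float)
    v = np.asarray(v, dtype=float)
    # A pair containing the zero vector is always dependent
    # (condition (1) or (2) above holds with c = 0).
    if np.allclose(u, 0.0) or np.allclose(v, 0.0):
        return True
    # For non-zero u and v, dependence means the 2-column matrix [u v]
    # has rank 1 rather than 2.
    return np.linalg.matrix_rank(np.column_stack([u, v]), tol=tol) < 2

print(are_dependent_pair([1, 1], [-3, 2]))  # False: linearly independent
print(are_dependent_pair([1, 2], [2, 4]))   # True: (2, 4) = 2 * (1, 2)
print(are_dependent_pair([0, 0], [5, 7]))   # True: contains the zero vector
</syntaxhighlight>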
=== Vectors in R<sup>2</sup> ===

'''Three vectors:''' Consider the vectors <math>\mathbf{v}_1 = (1, 1),</math> <math>\mathbf{v}_2 = (-3, 2),</math> and <math>\mathbf{v}_3 = (2, 4).</math> The condition for linear dependence asks for scalars, not all zero, such that
:<math>a_1 \begin{bmatrix} 1\\1\end{bmatrix} + a_2 \begin{bmatrix} -3\\2\end{bmatrix} + a_3 \begin{bmatrix} 2\\4\end{bmatrix} =\begin{bmatrix} 0\\0\end{bmatrix},</math>
or
:<math>\begin{bmatrix} 1 & -3 & 2 \\ 1 & 2 & 4 \end{bmatrix}\begin{bmatrix} a_1\\ a_2 \\ a_3 \end{bmatrix}= \begin{bmatrix} 0\\0\end{bmatrix}.</math>
[[Row reduction|Row reduce]] this matrix equation by subtracting the first row from the second to obtain
:<math>\begin{bmatrix} 1 & -3 & 2 \\ 0 & 5 & 2 \end{bmatrix}\begin{bmatrix} a_1\\ a_2 \\ a_3 \end{bmatrix}= \begin{bmatrix} 0\\0\end{bmatrix}.</math>
Continue the row reduction by (i) dividing the second row by 5, and then (ii) multiplying it by 3 and adding it to the first row, which yields
:<math>\begin{bmatrix} 1 & 0 & 16/5 \\ 0 & 1 & 2/5 \end{bmatrix}\begin{bmatrix} a_1\\ a_2 \\ a_3 \end{bmatrix}= \begin{bmatrix} 0\\0\end{bmatrix}.</math>
Rearranging this equation gives
:<math>\begin{bmatrix} 1 & 0 \\ 0 & 1 \end{bmatrix}\begin{bmatrix} a_1\\ a_2 \end{bmatrix}= \begin{bmatrix} a_1\\ a_2 \end{bmatrix}=-a_3\begin{bmatrix} 16/5\\2/5\end{bmatrix},</math>
which shows that non-zero ''a''<sub>''i''</sub> exist such that <math>\mathbf{v}_3 = (2, 4)</math> can be expressed in terms of <math>\mathbf{v}_1 = (1, 1)</math> and <math>\mathbf{v}_2 = (-3, 2).</math> Thus, the three vectors are linearly dependent.
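This conclusion can be verified numerically. As an illustrative NumPy sketch (the choice <math>a_3 = 5</math> is arbitrary), substituting the integer solution <math>(a_1, a_2, a_3) = (-16, -2, 5)</math> obtained from the relation above confirms that the combination is the zero vector:

<syntaxhighlight lang="python">
import numpy as np

# Columns are v1 = (1, 1), v2 = (-3, 2), v3 = (2, 4).
A = np.array([[1, -3, 2],
              [1,  2, 4]], dtype=float)

# From the row-reduced system, (a1, a2) = -a3 * (16/5, 2/5); choosing
# a3 = 5 gives the integer solution (a1, a2, a3) = (-16, -2, 5).
a = np.array([-16, -2, 5], dtype=float)

print(A @ a)  # [0. 0.]: a non-trivial combination equal to the zero vector
</syntaxhighlight>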
'''Two vectors:''' Now consider the linear dependence of the two vectors <math>\mathbf{v}_1 = (1, 1)</math> and <math>\mathbf{v}_2 = (-3, 2),</math> and check
:<math>a_1 \begin{bmatrix} 1\\1\end{bmatrix} + a_2 \begin{bmatrix} -3\\2\end{bmatrix} =\begin{bmatrix} 0\\0\end{bmatrix},</math>
or
:<math>\begin{bmatrix} 1 & -3 \\ 1 & 2 \end{bmatrix}\begin{bmatrix} a_1\\ a_2 \end{bmatrix}= \begin{bmatrix} 0\\0\end{bmatrix}.</math>
The same row reduction presented above yields
:<math>\begin{bmatrix} 1 & 0 \\ 0 & 1 \end{bmatrix}\begin{bmatrix} a_1\\ a_2 \end{bmatrix}= \begin{bmatrix} 0\\0\end{bmatrix}.</math>
This shows that <math>a_1 = a_2 = 0,</math> which means that the vectors <math>\mathbf{v}_1 = (1, 1)</math> and <math>\mathbf{v}_2 = (-3, 2)</math> are linearly independent.

=== Vectors in R<sup>4</sup> ===

In order to determine whether the three vectors in <math>\mathbb{R}^4,</math>
:<math>\mathbf{v}_1= \begin{bmatrix}1\\4\\2\\-3\end{bmatrix}, \mathbf{v}_2=\begin{bmatrix}7\\10\\-4\\-1\end{bmatrix}, \mathbf{v}_3=\begin{bmatrix}-2\\1\\5\\-4\end{bmatrix},</math>
are linearly dependent, form the matrix equation
:<math>\begin{bmatrix}1&7&-2\\4& 10& 1\\2&-4&5\\-3&-1&-4\end{bmatrix}\begin{bmatrix} a_1\\ a_2 \\ a_3 \end{bmatrix} = \begin{bmatrix}0\\0\\0\\0\end{bmatrix}.</math>
Row reduce this equation to obtain
:<math>\begin{bmatrix} 1& 7 & -2 \\ 0& -18& 9\\ 0 & 0 & 0\\ 0& 0& 0\end{bmatrix} \begin{bmatrix} a_1\\ a_2 \\ a_3 \end{bmatrix} = \begin{bmatrix}0\\0\\0\\0\end{bmatrix}.</math>
Rearrange to solve for <math>a_1</math> and <math>a_2</math> in terms of <math>a_3</math>:
:<math>\begin{bmatrix} 1& 7 \\ 0& -18 \end{bmatrix} \begin{bmatrix} a_1\\ a_2 \end{bmatrix} = -a_3\begin{bmatrix}-2\\9\end{bmatrix}.</math>
This equation is easily solved to give non-zero ''a''<sub>''i''</sub>,
:<math>a_1 = -3 a_3 /2, \quad a_2 = a_3/2,</math>
where <math>a_3</math> can be chosen arbitrarily. Thus, the vectors <math>\mathbf{v}_1, \mathbf{v}_2,</math> and <math>\mathbf{v}_3</math> are linearly dependent.

=== Alternative method using determinants ===

An alternative method relies on the fact that <math>n</math> vectors in <math>\mathbb{R}^n</math> are linearly '''independent''' [[if and only if]] the [[determinant]] of the [[matrix (mathematics)|matrix]] formed by taking the vectors as its columns is non-zero. In this case, the matrix formed by the vectors is
:<math>A = \begin{bmatrix}1&-3\\1&2\end{bmatrix} .</math>
We may write a linear combination of the columns as
:<math>A \Lambda = \begin{bmatrix}1&-3\\1&2\end{bmatrix} \begin{bmatrix}\lambda_1 \\ \lambda_2 \end{bmatrix} .</math>
We are interested in whether {{math|1=''A''Λ = '''0'''}} for some nonzero vector Λ. This depends on the determinant of <math>A</math>, which is
:<math>\det A = 1\cdot2 - 1\cdot(-3) = 5 \ne 0.</math>
Since the [[determinant]] is non-zero, the vectors <math>(1, 1)</math> and <math>(-3, 2)</math> are linearly independent.
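In a numerical setting this determinant test is a single computation. The following illustrative NumPy sketch (the tolerance <code>1e-12</code> is an arbitrary choice) checks the pair above:

<syntaxhighlight lang="python">
import numpy as np

# Columns are the vectors (1, 1) and (-3, 2).
A = np.array([[1, -3],
              [1,  2]], dtype=float)

d = np.linalg.det(A)
print(d)               # 5.0 (up to floating-point rounding)
print(abs(d) > 1e-12)  # True -> the columns are linearly independent
</syntaxhighlight>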
Otherwise, suppose we have <math>m</math> vectors of <math>n</math> coordinates, with <math>m < n.</math> Then ''A'' is an ''n''×''m'' matrix and Λ is a column vector with <math>m</math> entries, and we are again interested in {{math|1=''A''Λ = '''0'''}}. As we saw previously, this is equivalent to a list of <math>n</math> equations. Consider the first <math>m</math> rows of <math>A</math>, the first <math>m</math> equations; any solution of the full list of equations must also be true of the reduced list. In fact, if {{math|⟨''i''<sub>1</sub>,...,''i''<sub>''m''</sub>⟩}} is any list of <math>m</math> rows, then the equation must be true for those rows:
:<math>A_{\lang i_1,\dots,i_m \rang} \Lambda = \mathbf{0} .</math>
Furthermore, the reverse is true. That is, we can test whether the <math>m</math> vectors are linearly dependent by testing whether
:<math>\det A_{\lang i_1,\dots,i_m \rang} = 0</math>
for all possible lists of <math>m</math> rows. (In case <math>m = n</math>, this requires only one determinant, as above. If <math>m > n</math>, then it is a theorem that the vectors must be linearly dependent.) This fact is valuable for theory; in practical calculations more efficient methods are available.

=== More vectors than dimensions ===

If there are more vectors than dimensions, the vectors are linearly dependent. This is illustrated in the example above of three vectors in <math>\R^2.</math>
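This corresponds to the fact that the rank of a matrix is at most its number of rows. As an illustrative NumPy sketch using the three vectors in <math>\R^2</math> from above:

<syntaxhighlight lang="python">
import numpy as np

# The three vectors in R^2 from the earlier example, stored as columns
# of a 2 x 3 matrix.
A = np.array([[1, -3, 2],
              [1,  2, 4]], dtype=float)

rank = np.linalg.matrix_rank(A)
print(rank)               # 2: the rank is at most the number of rows
print(rank < A.shape[1])  # True -> the 3 columns are linearly dependent
</syntaxhighlight>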