Row and column spaces
==Column space==

===Definition===
Let {{mvar|K}} be a [[field (mathematics)|field]] of [[scalar (mathematics)|scalars]]. Let {{mvar|A}} be an {{math|''m'' × ''n''}} matrix, with column vectors {{math|'''v'''<sub>1</sub>, '''v'''<sub>2</sub>, ..., '''v'''<sub>''n''</sub>}}. A [[linear combination]] of these vectors is any vector of the form
:<math>c_1 \mathbf{v}_1 + c_2 \mathbf{v}_2 + \cdots + c_n \mathbf{v}_n,</math>
where {{math|''c''<sub>1</sub>, ''c''<sub>2</sub>, ..., ''c<sub>n</sub>''}} are scalars. The set of all possible linear combinations of {{math|'''v'''<sub>1</sub>, ..., '''v'''<sub>''n''</sub>}} is called the '''column space''' of {{mvar|A}}. That is, the column space of {{mvar|A}} is the [[linear span|span]] of the vectors {{math|'''v'''<sub>1</sub>, ..., '''v'''<sub>''n''</sub>}}.

Any linear combination of the column vectors of a matrix {{mvar|A}} can be written as the product of {{mvar|A}} with a column vector:
:<math>\begin{array}{rcl} A \begin{bmatrix} c_1 \\ \vdots \\ c_n \end{bmatrix} & = & \begin{bmatrix} a_{11} & \cdots & a_{1n} \\ \vdots & \ddots & \vdots \\ a_{m1} & \cdots & a_{mn} \end{bmatrix} \begin{bmatrix} c_1 \\ \vdots \\ c_n \end{bmatrix} = \begin{bmatrix} c_1 a_{11} + \cdots + c_{n} a_{1n} \\ \vdots \\ c_{1} a_{m1} + \cdots + c_{n} a_{mn} \end{bmatrix} = c_1 \begin{bmatrix} a_{11} \\ \vdots \\ a_{m1} \end{bmatrix} + \cdots + c_n \begin{bmatrix} a_{1n} \\ \vdots \\ a_{mn} \end{bmatrix} \\ & = & c_1 \mathbf{v}_1 + \cdots + c_n \mathbf{v}_n \end{array}</math>

Therefore, the column space of {{mvar|A}} consists of all possible products {{math|''A'''''x'''}}, for {{math|'''x''' ∈ ''K''<sup>''n''</sup>}}. This is the same as the [[image (mathematics)|image]] (or [[range of a function|range]]) of the corresponding [[matrix transformation]].
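The identity above, that the product {{math|''A'''''x'''}} equals the corresponding linear combination of the columns of {{mvar|A}}, can be checked numerically. The following is a minimal sketch using NumPy; the matrix and coefficients are arbitrary illustrations, not taken from the article.

```python
import numpy as np

# Arbitrary 3x2 illustration: the product A @ c equals the linear
# combination of the columns of A with coefficients from c.
A = np.array([[1.0, 2.0],
              [3.0, 4.0],
              [5.0, 6.0]])
c = np.array([2.0, -1.0])

product = A @ c
combination = c[0] * A[:, 0] + c[1] * A[:, 1]
print(np.allclose(product, combination))  # True
```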
====Example====
If <math>A = \begin{bmatrix} 1 & 0 \\ 0 & 1 \\ 2 & 0 \end{bmatrix}</math>, then the column vectors are {{math|1='''v'''<sub>1</sub> = [1, 0, 2]<sup>T</sup>}} and {{math|1='''v'''<sub>2</sub> = [0, 1, 0]<sup>T</sup>}}. A linear combination of {{math|'''v'''<sub>1</sub>}} and {{math|'''v'''<sub>2</sub>}} is any vector of the form
<math display="block">c_1 \begin{bmatrix} 1 \\ 0 \\ 2 \end{bmatrix} + c_2 \begin{bmatrix} 0 \\ 1 \\ 0 \end{bmatrix} = \begin{bmatrix} c_1 \\ c_2 \\ 2c_1 \end{bmatrix}.</math>
The set of all such vectors is the column space of {{mvar|A}}. In this case, the column space is precisely the set of vectors {{math|(''x'', ''y'', ''z'') ∈ '''R'''<sup>3</sup>}} satisfying the equation {{math|1=''z'' = 2''x''}} (using [[Cartesian coordinates]], this set is a [[plane (mathematics)|plane]] through the origin in [[three-dimensional space]]).

===Basis===
The columns of {{mvar|A}} span the column space, but they may not form a [[basis (linear algebra)|basis]] if the column vectors are not [[linearly independent]]. Fortunately, [[elementary row operations]] do not affect the dependence relations between the column vectors. This makes it possible to use [[row reduction]] to find a [[basis (linear algebra)|basis]] for the column space.

For example, consider the matrix
:<math>A = \begin{bmatrix} 1 & 3 & 1 & 4 \\ 2 & 7 & 3 & 9 \\ 1 & 5 & 3 & 1 \\ 1 & 2 & 0 & 8 \end{bmatrix}.</math>
The columns of this matrix span the column space, but they may not be [[linearly independent]], in which case some subset of them will form a basis.
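The 3 × 2 example above, where the column space is the plane {{math|1=''z'' = 2''x''}}, can be verified numerically: every product {{math|''A'''''c'''}} should land on that plane. A minimal sketch using NumPy:

```python
import numpy as np

A = np.array([[1.0, 0.0],
              [0.0, 1.0],
              [2.0, 0.0]])

# Sample a few coefficient pairs; every resulting column-space vector
# (x, y, z) should satisfy the plane equation z = 2x.
rng = np.random.default_rng(0)
for _ in range(5):
    x, y, z = A @ rng.standard_normal(2)
    assert np.isclose(z, 2 * x)
print("all sampled column-space vectors satisfy z = 2x")
```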
To find this basis, we reduce {{mvar|A}} to [[reduced row echelon form]]:
:<math>\begin{bmatrix} 1 & 3 & 1 & 4 \\ 2 & 7 & 3 & 9 \\ 1 & 5 & 3 & 1 \\ 1 & 2 & 0 & 8 \end{bmatrix} \sim \begin{bmatrix} 1 & 3 & 1 & 4 \\ 0 & 1 & 1 & 1 \\ 0 & 2 & 2 & -3 \\ 0 & -1 & -1 & 4 \end{bmatrix} \sim \begin{bmatrix} 1 & 0 & -2 & 1 \\ 0 & 1 & 1 & 1 \\ 0 & 0 & 0 & -5 \\ 0 & 0 & 0 & 5 \end{bmatrix} \sim \begin{bmatrix} 1 & 0 & -2 & 0 \\ 0 & 1 & 1 & 0 \\ 0 & 0 & 0 & 1 \\ 0 & 0 & 0 & 0 \end{bmatrix}.</math><ref>This computation uses the [[Gaussian elimination|Gauss–Jordan]] row-reduction algorithm. Each of the shown steps involves multiple elementary row operations.</ref>

At this point, it is clear that the first, second, and fourth columns are linearly independent, while the third column is a linear combination of the first two. (Specifically, {{math|1='''v'''<sub>3</sub> = −2'''v'''<sub>1</sub> + '''v'''<sub>2</sub>}}.) Therefore, the first, second, and fourth columns of the original matrix are a basis for the column space:
:<math>\begin{bmatrix} 1 \\ 2 \\ 1 \\ 1\end{bmatrix},\;\; \begin{bmatrix} 3 \\ 7 \\ 5 \\ 2\end{bmatrix},\;\; \begin{bmatrix} 4 \\ 9 \\ 1 \\ 8\end{bmatrix}.</math>

Note that the independent columns of the reduced row echelon form are precisely the columns with [[Pivot element|pivots]]. This makes it possible to determine which columns are linearly independent by reducing only to [[row echelon form|echelon form]].

The above algorithm can be used in general to find the dependence relations between any set of vectors, and to pick out a basis from any spanning set. Finding a basis for the column space of {{mvar|A}} is also equivalent to finding a basis for the row space of the [[transpose]] matrix {{math|''A''<sup>T</sup>}}. To find a basis in a practical setting (e.g., for large matrices), the [[singular-value decomposition]] is typically used.
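The row-reduction procedure above can be reproduced in exact arithmetic with SymPy, whose `rref` method returns both the reduced row echelon form and the pivot column indices. A sketch, using the same 4 × 4 matrix as the example:

```python
import sympy as sp

A = sp.Matrix([[1, 3, 1, 4],
               [2, 7, 3, 9],
               [1, 5, 3, 1],
               [1, 2, 0, 8]])

rref, pivots = A.rref()   # reduced row echelon form and pivot column indices
print(pivots)             # (0, 1, 3): columns 1, 2, and 4 are the pivot columns

# The pivot columns of the *original* matrix form a basis for the column space,
# and column 3 is the dependent one: v3 = -2*v1 + v2.
basis = [A[:, j] for j in pivots]
assert A[:, 2] == -2 * A[:, 0] + A[:, 1]
```

Taking the basis vectors from the original matrix (not the reduced one) matters: row operations preserve dependence relations among columns but change the column space itself.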
===Dimension===
{{main|Rank (linear algebra)}}
The [[dimension (linear algebra)|dimension]] of the column space is called the '''[[rank (linear algebra)|rank]]''' of the matrix. The rank is equal to the number of pivots in the [[reduced row echelon form]], and is the maximum number of linearly independent columns that can be chosen from the matrix. For example, the 4 × 4 matrix in the example above has rank three.

Because the column space is the [[image (mathematics)|image]] of the corresponding [[matrix transformation]], the rank of a matrix is the same as the dimension of the image. For example, the transformation <math>\R^4 \to \R^4</math> described by the matrix above maps all of <math>\R^4</math> to some three-dimensional [[Euclidean subspace|subspace]].

The '''nullity''' of a matrix is the dimension of the [[kernel (matrix)|null space]], and is equal to the number of columns in the reduced row echelon form that do not have pivots.<ref>Columns without pivots represent free variables in the associated homogeneous [[system of linear equations]].</ref> The rank and nullity of a matrix {{mvar|A}} with {{mvar|n}} columns are related by the equation
:<math>\operatorname{rank}(A) + \operatorname{nullity}(A) = n.</math>
This is known as the [[rank–nullity theorem]].

===Relation to the left null space===
The [[left null space]] of {{mvar|A}} is the set of all vectors {{math|'''x'''}} such that {{math|1='''x'''<sup>T</sup>''A'' = '''0'''<sup>T</sup>}}. It is the same as the [[kernel (matrix)|null space]] of the [[transpose]] of {{mvar|A}}.
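The rank–nullity relation can be illustrated numerically for the 4 × 4 example matrix. A sketch using NumPy's `matrix_rank` (which estimates rank from the singular values, in line with the SVD remark above):

```python
import numpy as np

A = np.array([[1, 3, 1, 4],
              [2, 7, 3, 9],
              [1, 5, 3, 1],
              [1, 2, 0, 8]], dtype=float)

rank = np.linalg.matrix_rank(A)   # dimension of the column space
n = A.shape[1]
nullity = n - rank                # rank-nullity: rank(A) + nullity(A) = n
print(rank, nullity)              # 3 1
```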
The product of the matrix {{math|''A''<sup>T</sup>}} and the vector {{math|'''x'''}} can be written in terms of the [[dot product]] of vectors:
:<math>A^\mathsf{T}\mathbf{x} = \begin{bmatrix} \mathbf{v}_1 \cdot \mathbf{x} \\ \mathbf{v}_2 \cdot \mathbf{x} \\ \vdots \\ \mathbf{v}_n \cdot \mathbf{x} \end{bmatrix},</math>
because the [[row vector]]s of {{math|''A''<sup>T</sup>}} are the transposes of the column vectors {{math|'''v'''<sub>''k''</sub>}} of {{mvar|A}}. Thus {{math|1=''A''<sup>T</sup>'''x''' = '''0'''}} if and only if {{math|'''x'''}} is [[orthogonal]] (perpendicular) to each of the column vectors of {{mvar|A}}.

It follows that the left null space (the null space of {{math|''A''<sup>T</sup>}}) is the [[orthogonal complement]] of the column space of {{mvar|A}}.

For a matrix {{mvar|A}}, the column space, row space, null space, and left null space are sometimes referred to as the ''four fundamental subspaces''.

===For matrices over a ring===
Similarly, the column space (sometimes disambiguated as the ''right'' column space) can be defined for matrices over a [[ring (mathematics)|ring]] {{mvar|K}} as
:<math>\sum_{k=1}^n \mathbf{v}_k c_k</math>
for any {{math|''c''<sub>1</sub>, ..., ''c<sub>n</sub>''}}, replacing the vector {{mvar|m}}-space with a "[[left and right (algebra)|right]] [[free module]]". This changes the order of [[scalar multiplication]] of the vector {{math|'''v'''<sub>''k''</sub>}} by the scalar {{math|''c<sub>k</sub>''}}, so that it is written in the unusual ''vector''–''scalar'' order.<ref>This matters only if {{mvar|K}} is not [[commutative ring|commutative]]. Actually, this form is merely a [[matrix multiplication|product]] {{math|''A'''''c'''}} of the matrix {{mvar|A}} with the column vector {{math|'''c'''}} from {{math|''K''<sup>''n''</sup>}} in which the order of factors is ''preserved'', unlike [[#Definition|the formula above]].</ref>
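The relation between the left null space and the column space described earlier, namely that {{math|1='''x'''<sup>T</sup>''A'' = '''0'''<sup>T</sup>}} exactly when {{math|'''x'''}} is orthogonal to every column of {{mvar|A}}, can be checked in exact arithmetic. A sketch with SymPy, reusing the article's 4 × 4 example matrix:

```python
import sympy as sp

A = sp.Matrix([[1, 3, 1, 4],
               [2, 7, 3, 9],
               [1, 5, 3, 1],
               [1, 2, 0, 8]])

# A basis of the left null space of A is the null space of the transpose.
left_null = A.T.nullspace()
assert len(left_null) == 1   # rank 3, so the left null space is 1-dimensional

# Each left-null vector satisfies x^T A = 0^T, i.e. it is orthogonal
# to every column of A.
for x in left_null:
    assert x.T * A == sp.zeros(1, A.cols)
```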