Row and column spaces
{{Short description|Vector spaces associated to a matrix}} [[File:Matrix Rows.svg|thumb|right|The row vectors of a [[matrix (mathematics)|matrix]]. The row space of this matrix is the vector space spanned by the row vectors.]] [[Image:Matrix Columns.svg|thumb|right|The column vectors of a [[matrix (mathematics)|matrix]]. The column space of this matrix is the vector space spanned by the column vectors.]] In [[linear algebra]], the '''column space''' (also called the '''range''' or [[Image (mathematics)|'''image''']]) of a [[matrix (mathematics)|matrix]] ''A'' is the [[Linear span|span]] (set of all possible [[linear combination]]s) of its [[column vector]]s. The column space of a matrix is the [[image (mathematics)|image]] or [[range of a function|range]] of the corresponding [[matrix transformation]]. Let <math>F</math> be a [[field (mathematics)|field]]. The column space of an {{math|''m'' × ''n''}} matrix with components from <math>F</math> is a [[linear subspace]] of the [[Examples of vector spaces#Coordinate space|''m''-space]] <math>F^m</math>. The [[dimension (linear algebra)|dimension]] of the column space is called the [[rank (linear algebra)|rank]] of the matrix and is at most {{math|min(''m'', ''n'')}}.<ref name="ReferenceA">Linear algebra, as discussed in this article, is a very well established mathematical discipline for which there are many sources. Almost all of the material in this article can be found in Lay 2005, Meyer 2001, and Strang 2005.</ref> A definition for matrices over a [[ring (mathematics)|ring]] <math>R</math> [[#For matrices over a ring|is also possible]]. The '''row space''' is defined similarly. 
The row space and the column space of a matrix {{mvar|A}} are sometimes denoted as {{math|'''''C'''''(''A''<sup>T</sup>)}} and {{math|'''''C'''''(''A'')}} respectively.<ref>{{Cite book|last=Strang|first=Gilbert|url=https://www.worldcat.org/oclc/956503593|title=Introduction to linear algebra|publisher=Wellesley-Cambridge Press|year=2016|isbn=978-0-9802327-7-6|edition=Fifth|location=Wellesley, MA|pages=128,168|oclc=956503593}}</ref> This article considers matrices of [[real number]]s. The row and column spaces are subspaces of the [[real coordinate space|real spaces]] <math>\R^n</math> and <math>\R^m</math> respectively.<ref>{{harvtxt|Anton|1987|p=179}}</ref> ==Overview== Let {{mvar|A}} be an {{mvar|m}}-by-{{mvar|n}} matrix. Then * {{math|1=rank(''A'') = dim(rowsp(''A'')) = dim(colsp(''A''))}},<ref>{{harvtxt|Anton|1987|p=183}}</ref> * {{math|rank(''A'')}} = number of [[Pivot element|pivots]] in any echelon form of {{mvar|A}}, * {{math|rank(''A'')}} = the maximum number of linearly independent rows or columns of {{mvar|A}}.<ref>{{harvtxt|Beauregard|Fraleigh|1973|p=254}}</ref> If the matrix represents a [[linear transformation]], the column space of the matrix equals the [[image (mathematics)|image]] of this linear transformation. The column space of a matrix {{mvar|A}} is the set of all linear combinations of the columns in {{mvar|A}}. If {{math|1=''A'' = ['''a'''<sub>1</sub> ⋯ '''a'''<sub>''n''</sub>]}}, then {{math|1=colsp(''A'') = span({{mset|'''a'''<sub>1</sub>, ..., '''a'''<sub>''n''</sub>}})}}. Given a matrix {{mvar|A}}, the action of the matrix {{mvar|A}} on a vector {{math|'''x'''}} returns a linear combination of the columns of {{mvar|A}} with the coordinates of {{math|'''x'''}} as coefficients; that is, the columns of the matrix generate the column space. 
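The equality of row rank and column rank stated above can be illustrated numerically. The following NumPy sketch (the random integer test matrix is an arbitrary illustration, not an example from the cited sources) compares the rank of a matrix with the rank of its transpose:

```python
import numpy as np

# Row rank equals column rank: rank(A) == rank(A^T) for any matrix A.
rng = np.random.default_rng(0)
A = rng.integers(-5, 5, size=(4, 6)).astype(float)

col_rank = np.linalg.matrix_rank(A)    # dimension of the column space of A
row_rank = np.linalg.matrix_rank(A.T)  # column space of A^T = row space of A
```

Both values agree, and each is at most min(''m'', ''n'') = 4 for this 4 × 6 matrix.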
===Example=== Given a matrix {{mvar|J}}: :<math> J = \begin{bmatrix} 2 & 4 & 1 & 3 & 2\\ -1 & -2 & 1 & 0 & 5\\ 1 & 6 & 2 & 2 & 2\\ 3 & 6 & 2 & 5 & 1 \end{bmatrix} </math> the rows are <math>\mathbf{r}_1 = \begin{bmatrix} 2 & 4 & 1 & 3 & 2 \end{bmatrix}</math>, <math>\mathbf{r}_2 = \begin{bmatrix} -1 & -2 & 1 & 0 & 5 \end{bmatrix}</math>, <math>\mathbf{r}_3 = \begin{bmatrix} 1 & 6 & 2 & 2 & 2 \end{bmatrix}</math>, <math>\mathbf{r}_4 = \begin{bmatrix} 3 & 6 & 2 & 5 & 1 \end{bmatrix}</math>. Consequently, the row space of {{mvar|J}} is the subspace of <math>\R^5</math> [[linear span|spanned]] by {{math|{{mset| '''r'''<sub>1</sub>, '''r'''<sub>2</sub>, '''r'''<sub>3</sub>, '''r'''<sub>4</sub> }}}}. Since these four row vectors are [[Linear independence|linearly independent]], the row space is 4-dimensional. Moreover, in this case it can be seen that they are all [[orthogonality|orthogonal]] to the vector {{math|1='''n''' = [6, −1, 4, −4, 0]}} ({{math|1='''n'''}} is an element of the [[Kernel (linear algebra)|kernel]] of {{mvar|J}} ), so it can be deduced that the row space consists of all vectors in <math>\R^5</math> that are orthogonal to {{math|'''n'''}}. ==Column space== ===Definition=== Let {{mvar|K}} be a [[field (mathematics)|field]] of [[scalar (mathematics)|scalars]]. Let {{math|A}} be an {{math|''m'' × ''n''}} matrix, with column vectors {{math|'''v'''<sub>1</sub>, '''v'''<sub>2</sub>, ..., '''v'''<sub>''n''</sub>}}. A [[linear combination]] of these vectors is any vector of the form :<math>c_1 \mathbf{v}_1 + c_2 \mathbf{v}_2 + \cdots + c_n \mathbf{v}_n,</math> where {{math|''c''<sub>1</sub>, ''c''<sub>2</sub>, ..., ''c<sub>n</sub>''}} are scalars. The set of all possible linear combinations of {{math|'''v'''<sub>1</sub>, ..., '''v'''<sub>''n''</sub>}} is called the '''column space''' of {{mvar|A}}. That is, the column space of {{mvar|A}} is the [[linear span|span]] of the vectors {{math|'''v'''<sub>1</sub>, ..., '''v'''<sub>''n''</sub>}}. 
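The two claims in this example — that {{mvar|J}} has a 4-dimensional row space and that '''n''' lies in its kernel — can be checked with a short NumPy sketch (illustrative, not part of the cited sources):

```python
import numpy as np

# The example matrix J and the kernel vector n from the text.
J = np.array([
    [ 2,  4, 1, 3, 2],
    [-1, -2, 1, 0, 5],
    [ 1,  6, 2, 2, 2],
    [ 3,  6, 2, 5, 1],
])
n = np.array([6, -1, 4, -4, 0])

rank = np.linalg.matrix_rank(J)  # dimension of the row space
residual = J @ n                 # n is in the kernel, so this is the zero vector
```

The rank comes out as 4 and every entry of the residual is 0, confirming that the row space is the orthogonal complement of '''n''' in <math>\R^5</math>.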
Any linear combination of the column vectors of a matrix {{mvar|A}} can be written as the product of {{mvar|A}} with a column vector: :<math>\begin{array} {rcl} A \begin{bmatrix} c_1 \\ \vdots \\ c_n \end{bmatrix} & = & \begin{bmatrix} a_{11} & \cdots & a_{1n} \\ \vdots & \ddots & \vdots \\ a_{m1} & \cdots & a_{mn} \end{bmatrix} \begin{bmatrix} c_1 \\ \vdots \\ c_n \end{bmatrix} = \begin{bmatrix} c_1 a_{11} + \cdots + c_{n} a_{1n} \\ \vdots \\ c_{1} a_{m1} + \cdots + c_{n} a_{mn} \end{bmatrix} = c_1 \begin{bmatrix} a_{11} \\ \vdots \\ a_{m1} \end{bmatrix} + \cdots + c_n \begin{bmatrix} a_{1n} \\ \vdots \\ a_{mn} \end{bmatrix} \\ & = & c_1 \mathbf{v}_1 + \cdots + c_n \mathbf{v}_n \end{array}</math> Therefore, the column space of {{mvar|A}} consists of all possible products {{math|''A'''''x'''}}, for {{math|'''x''' ∈ ''K''<sup>''n''</sup>}}. This is the same as the [[image (mathematics)|image]] (or [[range of a function|range]]) of the corresponding [[matrix transformation]]. ==== Example ==== If <math>A = \begin{bmatrix} 1 & 0 \\ 0 & 1 \\ 2 & 0 \end{bmatrix}</math>, then the column vectors are {{math|1='''v'''<sub>1</sub> = [1, 0, 2]<sup>T</sup>}} and {{math|1='''v'''<sub>2</sub> = [0, 1, 0]<sup>T</sup>}}. A linear combination of '''v'''<sub>1</sub> and '''v'''<sub>2</sub> is any vector of the form <math display="block">c_1 \begin{bmatrix} 1 \\ 0 \\ 2 \end{bmatrix} + c_2 \begin{bmatrix} 0 \\ 1 \\ 0 \end{bmatrix} = \begin{bmatrix} c_1 \\ c_2 \\ 2c_1 \end{bmatrix}</math> The set of all such vectors is the column space of {{mvar|A}}. In this case, the column space is precisely the set of vectors {{math|(''x'', ''y'', ''z'') ∈ '''R'''<sup>3</sup>}} satisfying the equation {{math|1=''z'' = 2''x''}} (using [[Cartesian coordinates]], this set is a [[plane (mathematics)|plane]] through the origin in [[three-dimensional space]]). 
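The identity above — that {{math|''A'''''x'''}} is the linear combination of the columns of {{mvar|A}} with the entries of {{math|'''x'''}} as coefficients — can be seen concretely for the example matrix (a NumPy sketch; the coefficient values are arbitrary):

```python
import numpy as np

A = np.array([[1, 0],
              [0, 1],
              [2, 0]])
x = np.array([3, -1])                     # coefficients c1 = 3, c2 = -1

product = A @ x                           # matrix-vector product
combo = x[0] * A[:, 0] + x[1] * A[:, 1]  # explicit combination of the columns
```

Both computations yield the same vector [3, −1, 6], which indeed satisfies ''z'' = 2''x''.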
===Basis=== The columns of {{mvar|A}} span the column space, but they may not form a [[basis (linear algebra)|basis]] if the column vectors are not [[linearly independent]]. Fortunately, [[elementary row operations]] do not affect the dependence relations between the column vectors. This makes it possible to use [[row reduction]] to find a [[basis (linear algebra)|basis]] for the column space. For example, consider the matrix :<math>A = \begin{bmatrix} 1 & 3 & 1 & 4 \\ 2 & 7 & 3 & 9 \\ 1 & 5 & 3 & 1 \\ 1 & 2 & 0 & 8 \end{bmatrix}.</math> The columns of this matrix span the column space, but they may not be [[linearly independent]], in which case some subset of them will form a basis. To find this basis, we reduce {{mvar|A}} to [[reduced row echelon form]]: :<math>\begin{bmatrix} 1 & 3 & 1 & 4 \\ 2 & 7 & 3 & 9 \\ 1 & 5 & 3 & 1 \\ 1 & 2 & 0 & 8 \end{bmatrix} \sim \begin{bmatrix} 1 & 3 & 1 & 4 \\ 0 & 1 & 1 & 1 \\ 0 & 2 & 2 & -3 \\ 0 & -1 & -1 & 4 \end{bmatrix} \sim \begin{bmatrix} 1 & 0 & -2 & 1 \\ 0 & 1 & 1 & 1 \\ 0 & 0 & 0 & -5 \\ 0 & 0 & 0 & 5 \end{bmatrix} \sim \begin{bmatrix} 1 & 0 & -2 & 0 \\ 0 & 1 & 1 & 0 \\ 0 & 0 & 0 & 1 \\ 0 & 0 & 0 & 0 \end{bmatrix}.</math><ref>This computation uses the [[Gaussian elimination|Gauss–Jordan]] row-reduction algorithm. Each of the shown steps involves multiple elementary row operations.</ref> At this point, it is clear that the first, second, and fourth columns are linearly independent, while the third column is a linear combination of the first two. (Specifically, {{math|1='''v'''<sub>3</sub> = −2'''v'''<sub>1</sub> + '''v'''<sub>2</sub>}}.) Therefore, the first, second, and fourth columns of the original matrix are a basis for the column space: :<math>\begin{bmatrix} 1 \\ 2 \\ 1 \\ 1\end{bmatrix},\;\; \begin{bmatrix} 3 \\ 7 \\ 5 \\ 2\end{bmatrix},\;\; \begin{bmatrix} 4 \\ 9 \\ 1 \\ 8\end{bmatrix}.</math> Note that the independent columns of the reduced row echelon form are precisely the columns with [[Pivot element|pivots]]. 
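The row-reduction procedure above can be reproduced with SymPy, whose `rref` method returns both the reduced row echelon form and the indices of the pivot columns (a sketch for illustration; exact rational arithmetic avoids floating-point rank issues):

```python
from sympy import Matrix

A = Matrix([[1, 3, 1, 4],
            [2, 7, 3, 9],
            [1, 5, 3, 1],
            [1, 2, 0, 8]])

rref_form, pivots = A.rref()          # pivots is a tuple of pivot-column indices
basis = [A.col(j) for j in pivots]    # the corresponding columns of the ORIGINAL A
```

The pivot indices are (0, 1, 3), i.e. the first, second, and fourth columns of the original matrix, matching the basis found above.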
This makes it possible to determine which columns are linearly independent by reducing only to [[row echelon form|echelon form]]. The above algorithm can be used in general to find the dependence relations between any set of vectors, and to pick out a basis from any spanning set. Also finding a basis for the column space of {{mvar|A}} is equivalent to finding a basis for the row space of the [[transpose]] matrix {{math|''A''<sup>T</sup>}}. To find the basis in a practical setting (e.g., for large matrices), the [[singular-value decomposition]] is typically used. ===Dimension=== {{main|Rank (linear algebra)}} The [[dimension (linear algebra)|dimension]] of the column space is called the '''[[rank (linear algebra)|rank]]''' of the matrix. The rank is equal to the number of pivots in the [[reduced row echelon form]], and is the maximum number of linearly independent columns that can be chosen from the matrix. For example, the 4 × 4 matrix in the example above has rank three. Because the column space is the [[image (mathematics)|image]] of the corresponding [[matrix transformation]], the rank of a matrix is the same as the dimension of the image. For example, the transformation <math>\R^4 \to \R^4</math> described by the matrix above maps all of <math>\R^4</math> to some three-dimensional [[Euclidean subspace|subspace]]. The '''nullity''' of a matrix is the dimension of the [[kernel (matrix)|null space]], and is equal to the number of columns in the reduced row echelon form that do not have pivots.<ref>Columns without pivots represent free variables in the associated homogeneous [[system of linear equations]].</ref> The rank and nullity of a matrix {{mvar|A}} with {{mvar|n}} columns are related by the equation: :<math>\operatorname{rank}(A) + \operatorname{nullity}(A) = n.\,</math> This is known as the [[rank–nullity theorem]]. 
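The rank–nullity theorem can be verified for the 4 × 4 example matrix with SymPy (an illustrative sketch; `nullspace()` returns a list of basis vectors for the null space, so its length is the nullity):

```python
from sympy import Matrix

A = Matrix([[1, 3, 1, 4],
            [2, 7, 3, 9],
            [1, 5, 3, 1],
            [1, 2, 0, 8]])

rank = A.rank()                 # number of pivot columns
nullity = len(A.nullspace())    # number of free columns
```

Here rank(''A'') = 3 and nullity(''A'') = 1, and their sum is the number of columns, ''n'' = 4.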
===Relation to the left null space=== The [[left null space]] of {{mvar|A}} is the set of all vectors {{math|'''x'''}} such that {{math|1='''x'''<sup>T</sup>''A'' = '''0'''<sup>T</sup>}}. It is the same as the [[kernel (matrix)|null space]] of the [[transpose]] of {{mvar|A}}. The product of the matrix {{math|''A''<sup>T</sup>}} and the vector {{math|'''x'''}} can be written in terms of the [[dot product]] of vectors: :<math>A^\mathsf{T}\mathbf{x} = \begin{bmatrix} \mathbf{v}_1 \cdot \mathbf{x} \\ \mathbf{v}_2 \cdot \mathbf{x} \\ \vdots \\ \mathbf{v}_n \cdot \mathbf{x} \end{bmatrix},</math> because [[row vector]]s of {{math|''A''<sup>T</sup>}} are transposes of column vectors {{math|'''v'''<sub>''k''</sub>}} of {{mvar|A}}. Thus {{math|1=''A''<sup>T</sup>'''x''' = '''0'''}} if and only if {{math|'''x'''}} is [[orthogonal]] (perpendicular) to each of the column vectors of {{mvar|A}}. It follows that the left null space (the null space of {{math|''A''<sup>T</sup>}}) is the [[orthogonal complement]] to the column space of {{mvar|A}}. For a matrix {{mvar|A}}, the column space, row space, null space, and left null space are sometimes referred to as the ''four fundamental subspaces''. ===For matrices over a ring=== Similarly the column space (sometimes disambiguated as ''right'' column space) can be defined for matrices over a [[ring (mathematics)|ring]] {{mvar|K}} as :<math>\sum\limits_{k=1}^n \mathbf{v}_k c_k</math> for any {{math|''c''<sub>1</sub>, ..., ''c<sub>n</sub>''}}, with replacement of the vector {{mvar|m}}-space with "[[left and right (algebra)|right]] [[free module]]", which changes the order of [[scalar multiplication]] of the vector {{math|'''v'''<sub>''k''</sub>}} to the scalar {{math|''c<sub>k</sub>''}} such that it is written in an unusual order ''vector''–''scalar''.<ref>Important only if {{mvar|K}} is not [[commutative ring|commutative]]. 
Actually, this form is merely a [[matrix multiplication|product]] {{math|''A'''''c'''}} of the matrix {{mvar|A}} to the column vector {{math|'''c'''}} from {{math|''K''<sup>''n''</sup>}} where the order of factors is ''preserved'', unlike [[#Definition|the formula above]].</ref> ==Row space== ===Definition=== Let {{mvar|K}} be a [[field (mathematics)|field]] of [[scalar (mathematics)|scalars]]. Let {{mvar|A}} be an {{math|''m'' × ''n''}} matrix, with row vectors {{math|'''r'''<sub>1</sub>, '''r'''<sub>2</sub>, ..., '''r'''<sub>''m''</sub>}}. A [[linear combination]] of these vectors is any vector of the form :<math>c_1 \mathbf{r}_1 + c_2 \mathbf{r}_2 + \cdots + c_m \mathbf{r}_m,</math> where {{math|''c''<sub>1</sub>, ''c''<sub>2</sub>, ..., ''c<sub>m</sub>''}} are scalars. The set of all possible linear combinations of {{math|'''r'''<sub>1</sub>, ..., '''r'''<sub>''m''</sub>}} is called the '''row space''' of {{mvar|A}}. That is, the row space of {{mvar|A}} is the [[linear span|span]] of the vectors {{math|'''r'''<sub>1</sub>, ..., '''r'''<sub>''m''</sub>}}. For example, if :<math>A = \begin{bmatrix} 1 & 0 & 2 \\ 0 & 1 & 0 \end{bmatrix},</math> then the row vectors are {{math|1='''r'''<sub>1</sub> = [1, 0, 2]}} and {{math|1='''r'''<sub>2</sub> = [0, 1, 0]}}. A linear combination of {{math|'''r'''<sub>1</sub>}} and {{math|'''r'''<sub>2</sub>}} is any vector of the form :<math>c_1 \begin{bmatrix}1 & 0 & 2\end{bmatrix} + c_2 \begin{bmatrix}0 & 1 & 0\end{bmatrix} = \begin{bmatrix}c_1 & c_2 & 2c_1\end{bmatrix}.</math> The set of all such vectors is the row space of {{mvar|A}}. In this case, the row space is precisely the set of vectors {{math|(''x'', ''y'', ''z'') ∈ ''K''<sup>3</sup>}} satisfying the equation {{math|1=''z'' = 2''x''}} (using [[Cartesian coordinates]], this set is a [[plane (mathematics)|plane]] through the origin in [[three-dimensional space]]). 
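As with the column-space example, membership of a row-space vector in the plane ''z'' = 2''x'' can be checked directly (a NumPy sketch with arbitrary coefficients):

```python
import numpy as np

A = np.array([[1, 0, 2],
              [0, 1, 0]])
c = np.array([3, -4])   # arbitrary coefficients c1, c2
v = c @ A               # the row-space vector c1*r1 + c2*r2
```

The result is [3, −4, 6], whose third coordinate is twice its first, as the plane equation requires.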
For a matrix that represents a homogeneous [[system of linear equations]], the row space consists of all linear equations that follow from those in the system. The column space of {{mvar|A}} is equal to the row space of {{math|''A''<sup>T</sup>}}. ===Basis=== The row space is not affected by [[elementary row operations]]. This makes it possible to use [[row reduction]] to find a [[basis (linear algebra)|basis]] for the row space. For example, consider the matrix :<math>A = \begin{bmatrix} 1 & 3 & 2 \\ 2 & 7 & 4 \\ 1 & 5 & 2\end{bmatrix}.</math> The rows of this matrix span the row space, but they may not be [[linearly independent]], in which case the rows will not be a basis. To find a basis, we reduce {{mvar|A}} to [[row echelon form]], where {{math|'''r'''<sub>1</sub>}}, {{math|'''r'''<sub>2</sub>}}, {{math|'''r'''<sub>3</sub>}} denote the rows: :<math> \begin{align} \begin{bmatrix} 1 & 3 & 2 \\ 2 & 7 & 4 \\ 1 & 5 & 2\end{bmatrix} &\xrightarrow{\mathbf{r}_2-2\mathbf{r}_1 \to \mathbf{r}_2} \begin{bmatrix} 1 & 3 & 2 \\ 0 & 1 & 0 \\ 1 & 5 & 2\end{bmatrix} \xrightarrow{\mathbf{r}_3-\,\,\mathbf{r}_1 \to \mathbf{r}_3} \begin{bmatrix} 1 & 3 & 2 \\ 0 & 1 & 0 \\ 0 & 2 & 0\end{bmatrix} \\ &\xrightarrow{\mathbf{r}_3-2\mathbf{r}_2 \to \mathbf{r}_3} \begin{bmatrix} 1 & 3 & 2 \\ 0 & 1 & 0 \\ 0 & 0 & 0\end{bmatrix} \xrightarrow{\mathbf{r}_1-3\mathbf{r}_2 \to \mathbf{r}_1} \begin{bmatrix} 1 & 0 & 2 \\ 0 & 1 & 0 \\ 0 & 0 & 0\end{bmatrix}. \end{align} </math> Once the matrix is in echelon form, the nonzero rows are a basis for the row space. In this case, the basis is {{math|{{mset| [1, 3, 2], [0, 1, 0] }}}}. Another possible basis {{math|{{mset| [1, 0, 2], [0, 1, 0] }}}} comes from a further reduction.<ref name="example">The example is valid over the [[real number]]s, the [[rational number]]s, and other [[number field]]s. 
It is not necessarily correct over fields and rings with non-zero [[characteristic (algebra)|characteristic]].</ref> This algorithm can be used in general to find a basis for the span of a set of vectors. If the matrix is further simplified to [[reduced row echelon form]], then the resulting basis is uniquely determined by the row space. It is sometimes convenient to find a basis for the row space from among the rows of the original matrix instead (for example, this result is useful in giving an elementary proof that the [[Rank (linear algebra)#Alternative definitions|determinantal rank]] of a matrix is equal to its rank). Since row operations can affect linear dependence relations of the row vectors, such a basis is instead found indirectly using the fact that the column space of {{math|''A''<sup>T</sup>}} is equal to the row space of {{mvar|A}}. Using the example matrix {{mvar|A}} above, find {{math|''A''<sup>T</sup>}} and reduce it to row echelon form: :<math> A^{\mathrm{T}} = \begin{bmatrix} 1 & 2 & 1 \\ 3 & 7 & 5 \\ 2 & 4 & 2\end{bmatrix} \sim \begin{bmatrix} 1 & 2 & 1 \\ 0 & 1 & 2 \\ 0 & 0 & 0\end{bmatrix}. </math> The pivots indicate that the first two columns of {{math|''A''<sup>T</sup>}} form a basis of the column space of {{math|''A''<sup>T</sup>}}. Therefore, the first two rows of {{mvar|A}} (before any row reductions) also form a basis of the row space of {{mvar|A}}. ===Dimension=== {{main|Rank (linear algebra)}} The [[dimension (linear algebra)|dimension]] of the row space is called the '''[[rank (linear algebra)|rank]]''' of the matrix. This is the same as the maximum number of linearly independent rows that can be chosen from the matrix, or equivalently the number of pivots. For example, the 3 × 3 matrix in the example above has rank two.<ref name="example"/> The rank of a matrix is also equal to the dimension of the [[column space]]. 
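The transpose technique above — finding a row-space basis from among the original rows via the pivot columns of {{math|''A''<sup>T</sup>}} — can be sketched with SymPy (illustrative, not from the cited sources):

```python
from sympy import Matrix

A = Matrix([[1, 3, 2],
            [2, 7, 4],
            [1, 5, 2]])

_, pivots = A.T.rref()                   # pivot-column indices of A^T
row_basis = [A.row(i) for i in pivots]   # corresponding rows of the ORIGINAL A
```

The pivots are (0, 1), so the first two rows of {{mvar|A}}, [1, 3, 2] and [2, 7, 4], form a basis of the row space, as found above.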
The dimension of the [[null space]] is called the '''nullity''' of the matrix, and is related to the rank by the following equation: :<math>\operatorname{rank}(A) + \operatorname{nullity}(A) = n,</math> where {{mvar|n}} is the number of columns of the matrix {{mvar|A}}. The equation above is known as the [[rank–nullity theorem]]. ===Relation to the null space=== The [[null space]] of matrix {{mvar|A}} is the set of all vectors {{math|'''x'''}} for which {{math|1=''A'''''x''' = '''0'''}}. The product of the matrix {{mvar|A}} and the vector {{math|'''x'''}} can be written in terms of the [[dot product]] of vectors: :<math>A\mathbf{x} = \begin{bmatrix} \mathbf{r}_1 \cdot \mathbf{x} \\ \mathbf{r}_2 \cdot \mathbf{x} \\ \vdots \\ \mathbf{r}_m \cdot \mathbf{x} \end{bmatrix},</math> where {{math|'''r'''<sub>1</sub>, ..., '''r'''<sub>''m''</sub>}} are the row vectors of {{mvar|A}}. Thus {{math|1=''A'''''x''' = '''0'''}} if and only if {{math|'''x'''}} is [[orthogonal]] (perpendicular) to each of the row vectors of {{mvar|A}}. It follows that the null space of {{mvar|A}} is the [[orthogonal complement]] to the row space. For example, if the row space is a plane through the origin in three dimensions, then the null space will be the perpendicular line through the origin. This provides a proof of the [[rank–nullity theorem]] (see [[#Dimension|dimension]] above). The row space and null space are two of the [[four fundamental subspaces]] associated with a matrix {{mvar|A}} (the other two being the [[column space]] and [[left null space]]). ===Relation to coimage=== If {{mvar|V}} and {{mvar|W}} are [[vector spaces]], then the [[kernel (linear algebra)|kernel]] of a [[linear transformation]] {{math|''T'': ''V'' → ''W''}} is the set of vectors {{math|'''v''' ∈ ''V''}} for which {{math|1=''T''('''v''') = '''0'''}}. The kernel of a linear transformation is analogous to the null space of a matrix. 
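The orthogonality between the row space and the null space can be checked for the 3 × 3 example matrix with SymPy (an illustrative sketch; every dot product of a row with a null-space basis vector should vanish):

```python
from sympy import Matrix

A = Matrix([[1, 3, 2],
            [2, 7, 4],
            [1, 5, 2]])

null_vecs = A.nullspace()   # basis of the null space (here a single vector)
# Dot each row of A with each null-space vector; all products should be zero.
checks = [A.row(i).dot(v) for i in range(A.rows) for v in null_vecs]
```

Since the row space is a plane through the origin in <math>\R^3</math>, the null space is the perpendicular line through the origin, spanned by one vector.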
If {{mvar|V}} is an [[inner product space]], then the orthogonal complement to the kernel can be thought of as a generalization of the row space. This is sometimes called the [[coimage]] of {{mvar|T}}. The transformation {{mvar|T}} is one-to-one on its coimage, and the coimage maps [[isomorphism|isomorphically]] onto the [[image (mathematics)|image]] of {{mvar|T}}. When {{mvar|V}} is not an inner product space, the coimage of {{mvar|T}} can be defined as the [[quotient space (linear algebra)|quotient space]] {{math|''V'' / ker(''T'')}}. ==See also== * [[Euclidean subspace]] ==References & Notes== {{reflist}} {{see also|Linear algebra#Further reading}} ==Further reading== * {{Citation | last = Anton | first = Howard | date = 1987 | title = Elementary Linear Algebra | location = New York | publisher = [[John Wiley & Sons|Wiley]] | edition = 5th | isbn = 0-471-84819-0 }} * {{Citation | last = Axler | first = Sheldon Jay | date = 1997 | title = Linear Algebra Done Right | publisher = Springer-Verlag | edition = 2nd | isbn = 0-387-98259-0 }} * {{Citation | last1 = Banerjee | first1 = Sudipto | last2 = Roy | first2 = Anindya | date = June 6, 2014 | title = Linear Algebra and Matrix Analysis for Statistics | publisher = CRC Press | edition = 1st | isbn = 978-1-42-009538-8 }} * {{citation | last1 = Beauregard | first1 = Raymond A. | last2 = Fraleigh | first2 = John B. | title = A First Course In Linear Algebra: with Optional Introduction to Groups, Rings, and Fields | location = Boston | publisher = [[Houghton Mifflin Company]] | year = 1973 | isbn = 0-395-14017-X | url-access = registration | url = https://archive.org/details/firstcourseinlin0000beau }} * {{Citation | last = Lay | first = David C. | date = August 22, 2005 | title = Linear Algebra and Its Applications | publisher = Addison Wesley | edition = 3rd | isbn = 978-0-321-28713-7 }} * {{Citation | last = Leon | first = Steven J. 
| date = 2006 | title = Linear Algebra With Applications | publisher = Pearson Prentice Hall | edition = 7th }} * {{Citation |last = Meyer |first = Carl D. |date = February 15, 2001 |title = Matrix Analysis and Applied Linear Algebra |publisher = Society for Industrial and Applied Mathematics (SIAM) |isbn = 978-0-89871-454-8 |url = http://www.matrixanalysis.com/DownloadChapters.html |url-status = dead |archive-url = https://web.archive.org/web/20010301161440/http://matrixanalysis.com/DownloadChapters.html |archive-date = March 1, 2001 }} * {{Citation | last = Poole | first = David | date = 2006 | title = Linear Algebra: A Modern Introduction | publisher = Brooks/Cole | edition = 2nd | isbn = 0-534-99845-3 }} * {{Citation | last = Strang | first = Gilbert | date = July 19, 2005 | title = Linear Algebra and Its Applications | publisher = Brooks Cole | edition = 4th | isbn = 978-0-03-010567-8 }} ==External links== {{wikibooks|Linear Algebra/Column and Row Spaces}} *{{MathWorld |title=Row Space |urlname=RowSpace}} *{{MathWorld |title=Column Space |urlname=ColumnSpace}} *{{aut|[[Gilbert Strang]]}}, [http://ocw.mit.edu/OcwWeb/Mathematics/18-06Spring-2005/VideoLectures/detail/lecture10.htm MIT Linear Algebra Lecture on the Four Fundamental Subspaces] at Google Video, from [[MIT OpenCourseWare]] *[http://www.khanacademy.org/video/column-space-of-a-matrix?playlist=Linear+Algebra Khan Academy video tutorial] *[http://mfile.akamai.com/7870/rm/mitstorage.download.akamai.com/7870/18/18.06/videolectures/strang-1806-lec06-20sep1999-80k.rm Lecture on column space and nullspace by Gilbert Strang of MIT] *[http://wps.prenhall.com/am_leon_linearalg_7/53/13727/3514210.cw/content/index.html Row Space and Column Space] {{linear algebra}} [[Category:Abstract algebra]] [[Category:Linear algebra]] [[Category:Matrices (mathematics)]]