===Proof using orthogonality===
Let {{mvar|A}} be an {{math|''m'' × ''n''}} matrix with entries in the [[real number]]s whose row rank is {{mvar|r}}. Therefore, the dimension of the row space of {{mvar|A}} is {{mvar|r}}. Let {{math|'''x'''<sub>1</sub>, '''x'''<sub>2</ssub>, …, '''x'''<sub>''r''</sub>}} be a [[basis (linear algebra)|basis]] of the row space of {{mvar|A}}. We claim that the vectors {{math|''A'''''x'''<sub>1</sub>, ''A'''''x'''<sub>2</sub>, …, ''A'''''x'''<sub>''r''</sub>}} are [[linearly independent]]. To see why, consider a homogeneous linear relation involving these vectors with scalar coefficients {{math|''c''<sub>1</sub>, ''c''<sub>2</sub>, …, ''c<sub>r</sub>''}}:
<math display="block">0 = c_1 A\mathbf{x}_1 + c_2 A\mathbf{x}_2 + \cdots + c_r A\mathbf{x}_r = A(c_1 \mathbf{x}_1 + c_2 \mathbf{x}_2 + \cdots + c_r \mathbf{x}_r) = A\mathbf{v},</math>
where {{math|1='''v''' = ''c''<sub>1</sub>'''x'''<sub>1</sub> + ''c''<sub>2</sub>'''x'''<sub>2</sub> + ⋯ + ''c<sub>r</sub>'''''x'''<sub>''r''</sub>}}. We make two observations: (a) {{math|'''v'''}} is a linear combination of vectors in the row space of {{mvar|A}}, which implies that {{math|'''v'''}} belongs to the row space of {{mvar|A}}, and (b) since {{math|1=''A'''''v''' = 0}}, the vector {{math|'''v'''}} is [[orthogonal]] to every row vector of {{mvar|A}} and, hence, is orthogonal to every vector in the row space of {{mvar|A}}. The facts (a) and (b) together imply that {{math|'''v'''}} is orthogonal to itself, so {{math|1='''v''' ⋅ '''v''' = 0}}, which proves that {{math|1='''v''' = 0}} or, by the definition of {{math|'''v'''}},
<math display="block">c_1\mathbf{x}_1 + c_2\mathbf{x}_2 + \cdots + c_r \mathbf{x}_r = 0.</math>
But the {{math|'''x'''<sub>''i''</sub>}} were chosen as a basis of the row space of {{mvar|A}} and so are linearly independent. This implies that {{math|1=''c''<sub>1</sub> = ''c''<sub>2</sub> = ⋯ = ''c<sub>r</sub>'' = 0}}. It follows that {{math|''A'''''x'''<sub>1</sub>, ''A'''''x'''<sub>2</sub>, …, ''A'''''x'''<sub>''r''</sub>}} are linearly independent.

Now, each {{math|''A'''''x'''<sub>''i''</sub>}} is a vector in the column space of {{mvar|A}}. So {{math|''A'''''x'''<sub>1</sub>, ''A'''''x'''<sub>2</sub>, …, ''A'''''x'''<sub>''r''</sub>}} is a set of {{mvar|r}} linearly independent vectors in the column space of {{mvar|A}}, and hence the dimension of the column space of {{mvar|A}} (i.e., the column rank of {{mvar|A}}) is at least {{mvar|r}}. This proves that the row rank of {{mvar|A}} is no larger than the column rank of {{mvar|A}}. Now apply this result to the transpose of {{mvar|A}} to obtain the reverse inequality, and conclude as in the previous proof.
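The argument can be checked numerically. The sketch below is illustrative only and is not part of the proof: it assumes [[NumPy]], and it uses the [[singular value decomposition]] merely as a convenient way to produce a basis {{math|'''x'''<sub>1</sub>, …, '''x'''<sub>''r''</sub>}} of the row space; any basis would serve, since the proof places no orthogonality requirement on the basis itself. For a random real matrix, it verifies that the images {{math|''A'''''x'''<sub>''i''</sub>}} are linearly independent, so the column rank is at least the row rank.

<syntaxhighlight lang="python">
import numpy as np

# Illustrative numerical check of the orthogonality argument
# (a sketch, not part of the proof itself).
rng = np.random.default_rng(0)
m, n = 5, 7
A = rng.standard_normal((m, n))     # a random real m x n matrix

# Build a basis of the row space of A. Here the first r right-singular
# vectors are used for convenience; the proof allows any basis.
U, s, Vt = np.linalg.svd(A)
r = int(np.sum(s > 1e-10))          # row rank = number of nonzero singular values
X = Vt[:r].T                        # columns x_1, ..., x_r: a row-space basis

# Claim of the proof: A x_1, ..., A x_r are linearly independent,
# i.e. the m x r matrix A @ X has full column rank r.
assert np.linalg.matrix_rank(A @ X) == r

# Hence column rank >= row rank; applying the same argument to A.T
# gives the reverse inequality, so the two ranks agree.
assert np.linalg.matrix_rank(A) == np.linalg.matrix_rank(A.T)
print("row rank = column rank =", r)
</syntaxhighlight>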