## A linear combination of vectors

A vector **w** is called a *linear combination* of the vectors **v**_{1}, **v**_{2}, …, **v**_{r} if **w** can be expressed in the form

**w** = *k*_{1}**v**_{1} + *k*_{2}**v**_{2} + … + *k*_{r}**v**_{r}

where *k*_{1}, *k*_{2}, …, *k*_{r} are scalars.

Let **u** = [1, 2, −1]^{T} and **v** = [6, 4, 2]^{T}. Is **w** = [9, 2, 7]^{T} a linear combination of **u** and **v**? What about **w**' = [4, −1, 8]^{T}?

Tackling the question "Is **w** = [9, 2, 7]^{T} a linear combination of **u** and **v**?"

From the definition we know that if **w** is a linear combination of **u** and **v** then there must exist scalars *k*_{1} and *k*_{2} such that

**w** = *k*_{1}**u** + *k*_{2}**v**

Substituting the values of the vectors gives the system of linear equations

*k*_{1} + 6*k*_{2} = 9

2*k*_{1} + 4*k*_{2} = 2

−*k*_{1} + 2*k*_{2} = 7

Adding the first and third equations eliminates *k*_{1}:

*k*_{1} + 6*k*_{2} = 9

−*k*_{1} + 2*k*_{2} = 7

0 + 8*k*_{2} = 16 ⇒ *k*_{2} = 2

Substituting *k*_{2} = 2 into the middle equation returns

2*k*_{1} + 4⋅2 = 2 ⇒ *k*_{1} = −3

Substituting *k*_{1} = −3 and *k*_{2} = 2 into the system returns

*k*_{1} + 6*k*_{2} = −3 + 6⋅2 = 9

2*k*_{1} + 4*k*_{2} = 2(−3) + 4⋅2 = 2

−*k*_{1} + 2*k*_{2} = −(−3) + 2⋅2 = 7

The system is consistent with *k*_{1} = −3 and *k*_{2} = 2. That is, **w** can be expressed in the form

**w** = −3**u** + 2**v**

Therefore **w** is a linear combination of **u** and **v**.
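This arithmetic is easy to spot-check in code. A minimal sketch in plain Python, using only the vectors and scalars of this example:

```python
# Vectors from the example above.
u = [1, 2, -1]
v = [6, 4, 2]
w = [9, 2, 7]

# The scalars found by solving the system.
k1, k2 = -3, 2

# Compute k1*u + k2*v componentwise; it should reproduce w.
combo = [k1 * a + k2 * b for a, b in zip(u, v)]
print(combo)  # [9, 2, 7]
```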

Tackling the question "Is **w**' = [4, −1, 8]^{T} a linear combination of **u** and **v**?"

For **w**' to be a linear combination of **u** and **v** there must exist (some other) scalars *k*_{1} and *k*_{2} such that

**w**' = *k*_{1}**u** + *k*_{2}**v**

Substituting the values of the vectors gives the system of linear equations

*k*_{1} + 6*k*_{2} = 4

2*k*_{1} + 4*k*_{2} = −1

−*k*_{1} + 2*k*_{2} = 8

Adding the first and third equations eliminates *k*_{1}:

*k*_{1} + 6*k*_{2} = 4

−*k*_{1} + 2*k*_{2} = 8

0 + 8*k*_{2} = 12 ⇒ *k*_{2} = 3/2

Substituting *k*_{2} = 3/2 into the middle equation returns

2*k*_{1} + 4⋅3/2 = −1 ⇒ *k*_{1} = −7/2

Substituting *k*_{1} = −7/2 and *k*_{2} = 3/2 into the system returns

*k*_{1} + 6*k*_{2} = −7/2 + 6⋅3/2 = −7/2 + 9 ≠ 4

2*k*_{1} + 4*k*_{2} = 2(−7/2) + 4⋅3/2 = −7 + 6 = −1

−*k*_{1} + 2*k*_{2} = −(−7/2) + 2⋅3/2 = 7/2 + 3 ≠ 8

So *k*_{1} = −7/2 and *k*_{2} = 3/2 fail to satisfy the first and third equations, and no other choice of scalars satisfies all three equations at once. In other words, the system has no solution. Therefore **w**' cannot be expressed in the form

**w**' = *k*_{1}**u** + *k*_{2}**v**

and **w**' is not a linear combination of **u** and **v**.
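The inconsistency can also be confirmed mechanically: solve any two of the component equations exactly, then test the remaining one. A sketch using only the vectors of this example (the helper `solve_2x2` is our own):

```python
from fractions import Fraction

def solve_2x2(a, b, c, d, e, f):
    """Solve a*k1 + b*k2 = e and c*k1 + d*k2 = f by Cramer's rule."""
    det = Fraction(a * d - b * c)
    return (e * d - b * f) / det, (a * f - e * c) / det

u, v, w_prime = [1, 2, -1], [6, 4, 2], [4, -1, 8]

# Solve the first and third component equations for k1 and k2 ...
k1, k2 = solve_2x2(u[0], v[0], u[2], v[2], w_prime[0], w_prime[2])

# ... then check whether the middle equation also holds.
consistent = u[1] * k1 + v[1] * k2 == w_prime[1]
print(k1, k2, consistent)  # -5 3/2 False
```

Any pair of equations may be used; every choice leads to a contradiction with the remaining equation.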

## Span for vector space *V*

Given a vector space *V* containing the vectors **v**_{1}, **v**_{2}, …, **v**_{r}, if every vector in *V* can be expressed as a linear combination of **v**_{1}, **v**_{2}, …, **v**_{r}, then the vectors **v**_{1}, **v**_{2}, …, **v**_{r} are said to *span* *V*.

Given the vector space ℜ^{3} containing the vectors **i** = [1, 0, 0]^{T}, **j** = [0, 1, 0]^{T} and **k** = [0, 0, 1]^{T}, do the vectors **i**, **j** and **k** span ℜ^{3}?

From the definition we know that for the vectors **i**, **j** and **k** to span the vector space, any vector in the vector space should be expressible as their linear combination.

Let us consider an arbitrary vector [*a*, *b*, *c*]^{T} in ℜ^{3}. Multiplying each component of this vector by the corresponding vector in question, giving *a***i**, *b***j** and *c***k**, and summing, we get

*a***i** + *b***j** + *c***k**

which evaluates componentwise to

[*a*, *b*, *c*]^{T} = *a***i** + *b***j** + *c***k**

Therefore **i**, **j** and **k** span ℜ^{3} because any arbitrary vector in the said vector space can be represented as a linear combination of **i**, **j** and **k**.

What if the arbitrary vector is specified to be one of the vectors that spans ℜ^{3}, say [*a*, *b*, *c*]^{T} = [1, 0, 0]^{T} = **i**?

The linear combination 1**i** + 0**j** + 0**k** returns **i**; that is, **i** can be expressed as the linear combination

**i** = 1**i** + 0**j** + 0**k**.

What this tells us is that when the definition states "every vector in the vector space should be expressible as a linear combination" it includes vectors that are said to span the vector space.
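The span argument is easy to mirror in code; a small sketch, with a sample vector of our own choosing:

```python
i, j, k = [1, 0, 0], [0, 1, 0], [0, 0, 1]
a, b, c = 5, -2, 7  # components of an arbitrary sample vector

# a*i + b*j + c*k, computed componentwise, recovers [a, b, c].
combo = [a * x + b * y + c * z for x, y, z in zip(i, j, k)]
print(combo)  # [5, -2, 7]
```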

## Linearly Dependent and Independent Sets (of vectors)

Given a vector space *V* containing the finite set of vectors *S* = { **v**_{1}, **v**_{2}, …, **v**_{r} }, consider the vector equation

*k*_{1}**v**_{1} + *k*_{2}**v**_{2} + … + *k*_{r}**v**_{r} = **0**

which always has at least the trivial solution *k*_{1} = 0, *k*_{2} = 0, …, *k*_{r} = 0.

- If this is the only solution, *S* is called a *linearly independent set.*
- If there are other solutions, *S* is called a *linearly dependent set.*

Given the vector space ℜ^{3} containing the vectors **i** = [1, 0, 0]^{T}, **j** = [0, 1, 0]^{T} and **k** = [0, 0, 1]^{T}, are the vectors **i**, **j** and **k** linearly independent?

From the definition we know that for the vectors **i**, **j** and **k** to be linearly independent, the vector equation

*k*_{1}**i** + *k*_{2}**j** + *k*_{3}**k** = **0**

must have only the trivial solution *k*_{1} = 0, *k*_{2} = 0, *k*_{3} = 0.

Substituting the values for the vectors in the vector equation we get [*k*_{1}, *k*_{2}, *k*_{3}]^{T} = [0, 0, 0]^{T}. Therefore *k*_{1} = 0, *k*_{2} = 0, *k*_{3} = 0 is the only solution to the equation; the set *S* = { **i**, **j**, **k** } is linearly independent.

Given a vector space *V* containing the set *S* = { **v**_{1}, **v**_{2}, **v**_{3} } with vectors **v**_{1} = [2, −1, 0, 3]^{T}, **v**_{2} = [1, 2, 5, −1]^{T} and **v**_{3} = [7, −1, 5, 8]^{T}, are **v**_{1}, **v**_{2} and **v**_{3} linearly dependent?

From the definition we have the vector equation

*k*_{1}**v**_{1} + *k*_{2}**v**_{2} + *k*_{3}**v**_{3} = **0**

which gives the system of linear equations

2*k*_{1} + *k*_{2} + 7*k*_{3} = 0

−*k*_{1} + 2*k*_{2} − *k*_{3} = 0

5*k*_{2} + 5*k*_{3} = 0

3*k*_{1} − *k*_{2} + 8*k*_{3} = 0

From the third equation, *k*_{2} = −*k*_{3}. So if we let *k*_{2} = 1 then *k*_{3} = −1. Substituting these values into any of the remaining three equations yields *k*_{1} = 3. Below is the result of substitution into the first equation.

2*k*_{1} + 1 + 7(−1) = 0 ⇒ 2*k*_{1} − 6 = 0 or *k*_{1} = 6/2 = 3

Substituting *k*_{1} = 3, *k*_{2} = 1 and *k*_{3} = −1 into the system returns

2*k*_{1} + *k*_{2} + 7*k*_{3} = 2⋅3 + 1 + 7(−1) = 7 − 7 = 0

−*k*_{1} + 2*k*_{2} − *k*_{3} = −3 + 2 − (−1) = −3 + 3 = 0

5*k*_{2} + 5*k*_{3} = 5 + 5(−1) = 5 − 5 = 0

3*k*_{1} − *k*_{2} + 8*k*_{3} = 3⋅3 − 1 + 8(−1) = 9 − 9 = 0

So the vector equation has solutions other than the trivial solution *k*_{1} = 0, *k*_{2} = 0, *k*_{3} = 0, for example *k*_{1} = 3, *k*_{2} = 1, *k*_{3} = −1. Hence the set *S* = { **v**_{1}, **v**_{2}, **v**_{3} } is linearly dependent.
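The nontrivial solution can be verified directly; a minimal sketch using the vectors of this example:

```python
v1 = [2, -1, 0, 3]
v2 = [1, 2, 5, -1]
v3 = [7, -1, 5, 8]

# The nontrivial solution found above.
k1, k2, k3 = 3, 1, -1

# k1*v1 + k2*v2 + k3*v3 should be the zero vector.
combo = [k1 * a + k2 * b + k3 * c for a, b, c in zip(v1, v2, v3)]
print(combo)  # [0, 0, 0, 0]
```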

## Linearly Dependent Set and Linear combination of vectors

The term *linearly dependent* suggests that vectors in the set in some way *depend* on each other. This statement can be illustrated with an example.

Let the set *S* = { **v**_{1}, **v**_{2}, …, **v**_{r} } be a linearly dependent set. Then by definition the vector equation

*k*_{1}**v**_{1} + *k*_{2}**v**_{2} + … + *k*_{r}**v**_{r} = **0**

has solutions other than *k*_{1} = *k*_{2} = … = *k*_{r} = 0.
Furthermore, assume *k*_{1} ≠ 0. Multiplying both sides by 1/*k*_{1} and solving for **v _{1}** yields

**v**_{1} = (−*k*_{2}/*k*_{1})**v**_{2} + (−*k*_{3}/*k*_{1})**v**_{3} + … + (−*k*_{r}/*k*_{1})**v**_{r}

That is, **v**_{1} can be expressed as a linear combination of the remaining vectors. In general, for a linearly dependent set *at least one of the vectors* is a linear combination of the remaining vectors.

In particular, two vectors, **v**_{1} and **v**_{2}, form a linearly dependent set if and only if one of them is a scalar multiple of the other.
For a linearly dependent set *S* = { **v**_{1}, **v**_{2} } and the vector expression *k*_{1}**v**_{1} + *k*_{2}**v**_{2} = **0**, if *k*_{1} ≠ 0 then **v**_{1} = (−*k*_{2}/*k*_{1})**v**_{2}. That is, **v**_{1} is a scalar multiple of **v**_{2}.

Conversely, if *S* = { **v**_{1}, **v**_{2} } is linearly independent then geometrically the vectors **v**_{1} and **v**_{2} do not lie along the same line.
If a set in the vector space ℜ^{2} contains more than two vectors, then the set is linearly dependent. More generally:

Let *S* = { **v**_{1}, **v**_{2}, …, **v**_{r} } be a set of vectors in ℜ^{n}. If *r* > *n*, then *S* is linearly dependent.

To prove this, write

**v**_{1} = [*v*_{11}, *v*_{12}, …, *v*_{1n}]

**v**_{2} = [*v*_{21}, *v*_{22}, …, *v*_{2n}]

⋮

**v**_{r} = [*v*_{r1}, *v*_{r2}, …, *v*_{rn}]

and consider the vector equation

*k*_{1}**v**_{1} + *k*_{2}**v**_{2} + … + *k*_{r}**v**_{r} = **0**

Equating components gives

*v*_{11}*k*_{1} + *v*_{21}*k*_{2} + … + *v*_{r1}*k*_{r} = 0

*v*_{12}*k*_{1} + *v*_{22}*k*_{2} + … + *v*_{r2}*k*_{r} = 0

⋮

*v*_{1n}*k*_{1} + *v*_{2n}*k*_{2} + … + *v*_{rn}*k*_{r} = 0

which is a *homogeneous system of n equations with r unknowns* *k*_{1}, *k*_{2}, …, *k*_{r}.

Invoking the fundamental theorem of homogeneous systems of linear equations: since the vector equation for *S* = { **v**_{1}, **v**_{2}, …, **v**_{r} } in the vector space ℜ^{n} is a homogeneous system with *r* > *n* (more unknowns than equations), the system has nontrivial solutions, i.e. infinitely many solutions.

Therefore, for *r* > *n* the set *S* = { **v**_{1}, **v**_{2}, …, **v**_{r} } in ℜ^{n} is a linearly dependent set. ∎
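As a concrete instance of the theorem, here are three vectors in ℜ^{2} (so *r* = 3 > *n* = 2) together with a nontrivial relation; the vectors are our own illustration, not taken from the text above:

```python
# Three vectors in R^2; with r = 3 > n = 2 they must be dependent.
v1, v2, v3 = [1, 0], [0, 1], [1, 1]

# Here v3 = v1 + v2, so 1*v1 + 1*v2 + (-1)*v3 = 0 is a nontrivial solution.
combo = [a + b - c for a, b, c in zip(v1, v2, v3)]
print(combo)  # [0, 0]
```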

## Basis for vector space *V*

Given a vector space *V*, the finite set of vectors in *V*, *S* = { **v**_{1}, **v**_{2}, …, **v**_{r} }, is called a *basis for V* if

- *S* spans *V*;
- *S* is linearly independent.

Given the vector space ℜ^{3} and its set of vectors *S* = { **i**, **j**, **k** } such that **i** = [1, 0, 0]^{T}, **j** = [0, 1, 0]^{T} and **k** = [0, 0, 1]^{T}, is *S* a basis for ℜ^{3}?

Does *S* span ℜ^{3}?

We showed in an example above that for an arbitrary vector [*a*, *b*, *c*]^{T} in ℜ^{3}

[*a*, *b*, *c*]^{T} = *a***i** + *b***j** + *c***k**

Therefore **i**, **j** and **k** span ℜ^{3} because any arbitrary vector in the said vector space can be represented as a linear combination of **i**, **j** and **k**.

Is *S* a linearly independent set?

We showed in another example that for the scalars *k*_{1}, *k*_{2} and *k*_{3} in the vector equation

*k*_{1}**i** + *k*_{2}**j** + *k*_{3}**k** = **0**

*k*_{1} = 0, *k*_{2} = 0, *k*_{3} = 0 is the only solution to the equation; the set *S* = { **i**, **j**, **k** } is linearly independent.

Thus, the set of vectors *S* spans its vector space and is a linearly independent set; therefore *S* is a basis for the vector space ℜ^{3}.

Notice that *any vector* **v** = [*v*_{1}, *v*_{2}, *v*_{3}]^{T} in ℜ^{3} can be written *as a linear combination of all the vectors in the set* *S*.

**v** = *v*_{1}**i** + *v*_{2}**j** + *v*_{3}**k**

Such an *S* is a special kind of basis called the **standard basis** for ℜ^{3}.

In general, if *S* = { **v**_{1}, **v**_{2}, …, **v**_{r} } spans the vector space *V* and any vector **v** = [*v*_{1}, *v*_{2}, …, *v*_{r}] in *V* can be expressed as

**v** = *v*_{1}**v**_{1} + *v*_{2}**v**_{2} + … + *v*_{r}**v**_{r}

then *S* is called the standard basis for *V*.

## Dimension of vector space *V*

Given a vector space *V*, the *number of vectors in a basis* for *V* is called the dimension of *V*. Furthermore, the zero vector space is defined to have dimension zero.

As an example, let us find the dimension of the solution space of the homogeneous system

2*x*_{1} + 2*x*_{2} − *x*_{3} + *x*_{5} = 0

−*x*_{1} − *x*_{2} + 2*x*_{3} − 3*x*_{4} + *x*_{5} = 0

*x*_{1} + *x*_{2} − 2*x*_{3} − *x*_{5} = 0

*x*_{3} + *x*_{4} + *x*_{5} = 0

We must first find a set of vectors that spans the solution space. But to do this we need to know the solutions of the homogeneous system.

Reducing the augmented matrix of the homogeneous system to reduced row-echelon form yields the equivalent system

*x*_{1} + *x*_{2} + *x*_{5} = 0

*x*_{3} + *x*_{5} = 0

*x*_{4} = 0

Solving for the leading variables gives

*x*_{1} = −*x*_{2} − *x*_{5}

*x*_{3} = −*x*_{5}

*x*_{4} = 0

Setting the free variables *x*_{2} and *x*_{5} to some arbitrary constants, say *x*_{2} = *s* and *x*_{5} = *t*, the solution set for the homogeneous system is

*x*_{1} = −*s* − *t*, *x*_{2} = *s*, *x*_{3} = −*t*, *x*_{4} = 0, *x*_{5} = *t*

Written as a vector, [*x*_{1}, *x*_{2}, *x*_{3}, *x*_{4}, *x*_{5}]^{T} = [−*s* − *t*, *s*, −*t*, 0, *t*]^{T}. Factoring the unknowns *s* and *t* out of the vector, and letting **v**_{1} = [−1, 1, 0, 0, 0]^{T} and **v**_{2} = [−1, 0, −1, 0, 1]^{T}, the solution space can be expressed as the solution vector

[*x*_{1}, *x*_{2}, *x*_{3}, *x*_{4}, *x*_{5}]^{T} = *s***v**_{1} + *t***v**_{2}

This is a linear combination of the vectors of the set *S* = { **v**_{1}, **v**_{2} }, where *s* and *t* are scalars. Furthermore, we know that the vector equation is consistent for any arbitrary *s* and *t* values; that is, the vectors of the set span the solution space.

To check whether the set *S* = { **v**_{1}, **v**_{2} } is linearly independent or dependent we need to solve the vector equation

*s***v**_{1} + *t***v**_{2} = **0**

for which *s* = 0 and *t* = 0 is the only solution. Thus, the set is independent, and because it also spans the solution space we can conclude that the set *S* = { **v**_{1}, **v**_{2} } is a basis for the solution space.

The number of vectors in the basis *S* = { **v**_{1}, **v**_{2} } is two, so the solution space is two dimensional.
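The claim that every *s***v**_{1} + *t***v**_{2} solves the homogeneous system can be spot-checked; a sketch with sample values of *s* and *t* chosen by us:

```python
v1 = [-1, 1, 0, 0, 0]
v2 = [-1, 0, -1, 0, 1]

s, t = 4, -3  # arbitrary sample scalars
x1, x2, x3, x4, x5 = (s * a + t * b for a, b in zip(v1, v2))

# Evaluate the left-hand side of each equation of the system.
residuals = [
    2 * x1 + 2 * x2 - x3 + x5,
    -x1 - x2 + 2 * x3 - 3 * x4 + x5,
    x1 + x2 - 2 * x3 - x5,
    x3 + x4 + x5,
]
print(residuals)  # [0, 0, 0, 0]
```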
Finding the dimension of a vector space *V* requires first finding its basis, which by definition requires determining that the set in question spans *V* **and** is linearly independent. But for a vector space *V* known to be *n*-dimensional, showing that some set of *n* vectors is a basis for *V* requires determining only that it is **either** spanning **or** linearly independent (theorems below).

If *S* = { **v**_{1}, **v**_{2}, …, **v**_{n} } is a set of *n* linearly independent vectors in an *n*-dimensional space *V*, then *S* is a basis for *V*.

If *S* = { **v**_{1}, **v**_{2}, …, **v**_{n} } is a set of *n* vectors that spans an *n*-dimensional space *V*, then *S* is a basis for *V*.

If *S* = { **v**_{1}, **v**_{2}, …, **v**_{r} } is a set of *r* linearly independent vectors in an *n*-dimensional space *V* and *r* < *n*, then *S* can be enlarged to a basis for *V*; that is, there are vectors **v**_{r+1}, …, **v**_{n} such that { **v**_{1}, **v**_{2}, …, **v**_{r}, **v**_{r+1}, …, **v**_{n} } is a basis for *V*.

## Row space and column space

For an *m* × *n* matrix *A*, the vectors

**r**_{1} = [*a*_{11}, *a*_{12}, …, *a*_{1n}]

**r**_{2} = [*a*_{21}, *a*_{22}, …, *a*_{2n}]

⋮

**r**_{m} = [*a*_{m1}, *a*_{m2}, …, *a*_{mn}]

formed from the rows of *A* are called **row vectors** of *A*, and the vectors

**c**_{1} = [*a*_{11}, *a*_{21}, …, *a*_{m1}]^{T}

**c**_{2} = [*a*_{12}, *a*_{22}, …, *a*_{m2}]^{T}

⋮

**c**_{n} = [*a*_{1n}, *a*_{2n}, …, *a*_{mn}]^{T}

formed from the columns of *A* are called **column vectors** of *A*.

The subspace of ℜ^{n} spanned by the row vectors of the *m* × *n* matrix *A* is called the **row space** of *A*.

The subspace of ℜ^{m} spanned by the column vectors of the *m* × *n* matrix *A* is called the **column space** of *A*.

Theorem: Elementary row operations do not change the row space of a matrix *A*.

To see this, suppose the row vectors **r**_{1}, **r**_{2}, …, **r**_{m} belong to an *m* × *n* matrix *A*. Also, suppose matrix *B* is the result of performing elementary row operations on *A*.

Then to show *A* and *B* have the same row space we need to show that

- 1. every row vector in *B* is also in the row space of *A*
- 2. every row vector in *A* is in the row space of *B*

- • row interchange

*B* and *A* will have the same row vectors and hence the same row space.

- • scalar multiplication of a row or addition of a multiple of one row to another

The row vectors **r**'_{1}, **r**'_{2}, …, **r**'_{m} of *B* are linear combinations of **r**_{1}, **r**_{2}, …, **r**_{m}. Therefore, the row vectors of *B* lie in the row space of *A*.

What about the linear combinations of **r**'_{1}, **r**'_{2}, …, **r**'_{m}: do they lie in the row space of *A*?

Recall the theorem: if *W* is a set of one or more vectors from a vector space *V*, then *W* is a **subspace** of *V* if and only if *W* is **closed under addition** (if **u** and **v** are any vectors in *W*, then **u** + **v** is in *W*) and *W* is **closed under scalar multiplication** (if *k* is any scalar and **u** is any vector in *W*, then *k***u** is in *W*).

Because the row space of *A* is closed under addition and scalar multiplication, and since the row vectors of *B* lie in the row space of *A* (a consequence of the row vectors of *B* being linear combinations of the row vectors of *A*), all linear combinations of **r**'_{1}, **r**'_{2}, …, **r**'_{m} will also lie in the row space of *A*.

Thus, every vector in the row space of *B* is in the row space of *A*. ∎^{1}

Because *B* was obtained from *A* by performing row operations, *A* can be obtained from *B* by performing the inverse operations. By the same argument as above, the row vectors **r**_{1}, **r**_{2}, …, **r**_{m} of *A* lie in the row space of *B*, and all linear combinations of **r**_{1}, **r**_{2}, …, **r**_{m} lie in the row space of *B*.

Hence, every vector in the row space of *A* is in the row space of *B*. ∎^{2}

Theorem: The nonzero row vectors in a row-echelon form of a matrix *A* form a basis for the row space of *A*.

To see this, suppose the row vectors **r**_{1}, **r**_{2}, …, **r**_{m} belong to an *m* × *n* matrix *A* whose row-echelon form is *B* containing the row vectors **r**'_{1}, **r**'_{2}, …, **r**'_{m}. Label the zero row vectors **r**'_{i} = **0** of *B* as **0**_{1}, **0**_{2}, …, **0**_{p} and the nonzero row vectors **r**'_{i} ≠ **0** of *B* as ~~**0**~~_{1}, ~~**0**~~_{2}, …, ~~**0**~~_{q}. We must show that

- 1. the set { **0**_{1}, **0**_{2}, …, **0**_{p} } is not a basis for the row space of *A*
- 2. the set { ~~**0**~~_{1}, ~~**0**~~_{2}, …, ~~**0**~~_{q} } is a basis for the row space of *A*

By definition a basis *S* of a space *V* must span *V* and *S* must be linearly independent.

Because of the theorem that elementary row operations do not change the row space, the row vectors { **r**'_{1}, **r**'_{2}, …, **r**'_{m} } span the row space of *A*. Since the zero row vectors contribute nothing to any linear combination, the nonzero row vectors { ~~**0**~~_{1}, ~~**0**~~_{2}, …, ~~**0**~~_{q} } alone will span the row space of *A*.

Expressing the set { **0**_{1}, **0**_{2}, …, **0**_{p} } as the vector equation

*k*_{1}**0**_{1} + *k*_{2}**0**_{2} + … + *k*_{p}**0**_{p} = **0**

any choice of the scalars *k*_{1}, *k*_{2}, …, *k*_{p} is a solution. Hence, the set of zero row vectors is linearly dependent. Therefore, the set is not a basis for the row space of *A*. ∎^{1}

For the set { ~~**0**~~_{1}, ~~**0**~~_{2}, …, ~~**0**~~_{q} } with the vector equation

*k*_{1}~~**0**~~_{1} + *k*_{2}~~**0**~~_{2} + … + *k*_{q}~~**0**~~_{q} = **0**

the only solution is *k*_{1} = 0, *k*_{2} = 0, …, *k*_{q} = 0, because each nonzero row of a row-echelon form has its leading 1 in a column where all the rows below it have zeros. So the set { ~~**0**~~_{1}, ~~**0**~~_{2}, …, ~~**0**~~_{q} } is linearly independent. Therefore, the set is a basis for the row space of *A*. ∎^{2}

As an example, given the vectors

**v**_{1} = [1, −2, 0, 0, 3]

**v**_{2} = [2, −5, −3, −2, 6]

**v**_{3} = [0, 5, 15, 10, 0]

**v**_{4} = [2, 6, 18, 8, 6]

let us find a basis for the vector space spanned by *S* = { **v**_{1}, **v**_{2}, **v**_{3}, **v**_{4} }. The set can be viewed as the row space of the matrix whose row vectors are **v**_{1}, **v**_{2}, **v**_{3}, **v**_{4}. Reducing that matrix to row-echelon form leaves the nonzero row vectors

**w**_{1} = [1, −2, 0, 0, 3]

**w**_{2} = [0, 1, 3, 2, 0]

**w**_{3} = [0, 0, 1, 1, 0]

and the nonzero row vectors { **w**_{1}, **w**_{2}, **w**_{3} } span the row space. And since the set of row vectors *S* spans the vector space, the set { **w**_{1}, **w**_{2}, **w**_{3} } spans the same vector space.

Therefore, { **w**_{1}, **w**_{2}, **w**_{3} } is a basis for the vector space.
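The row reduction in this example can be reproduced with a short exact-arithmetic routine; a sketch (the function `echelon` is our own, and a row-echelon form is not unique, but this particular elimination order reproduces **w**_{1}, **w**_{2}, **w**_{3}):

```python
from fractions import Fraction

def echelon(rows):
    """Reduce a matrix (list of rows) to a row-echelon form and
    return its nonzero rows, using exact fractions."""
    m = [[Fraction(x) for x in row] for row in rows]
    pivot_row = 0
    for col in range(len(m[0])):
        # Find a row at or below pivot_row with a nonzero entry in this column.
        pivot = next((r for r in range(pivot_row, len(m)) if m[r][col] != 0), None)
        if pivot is None:
            continue
        m[pivot_row], m[pivot] = m[pivot], m[pivot_row]
        # Scale to make the leading entry 1.
        m[pivot_row] = [x / m[pivot_row][col] for x in m[pivot_row]]
        # Eliminate the entries below the pivot.
        for r in range(pivot_row + 1, len(m)):
            f = m[r][col]
            m[r] = [a - f * b for a, b in zip(m[r], m[pivot_row])]
        pivot_row += 1
    return [row for row in m if any(x != 0 for x in row)]

rows = [[1, -2, 0, 0, 3],
        [2, -5, -3, -2, 6],
        [0, 5, 15, 10, 0],
        [2, 6, 18, 8, 6]]
basis = [[int(x) for x in row] for row in echelon(rows)]
print(basis)  # [[1, -2, 0, 0, 3], [0, 1, 3, 2, 0], [0, 0, 1, 1, 0]]
```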

Theorem: If *A* is any matrix, then the row space and column space of *A* have the same dimension.

To prove this, consider an *m* × *n* matrix *A* with row vectors **r**_{1}, **r**_{2}, …, **r**_{m}, and suppose its row space has dimension *k* with basis *S* = { **b**_{1}, **b**_{2}, …, **b**_{k} }, where **b**_{i} = [*b*_{i1}, *b*_{i2}, …, *b*_{in}].

Since *S* is a basis for the row space, each row vector can be expressed as a linear combination of the basis vectors **b**_{1}, **b**_{2}, …, **b**_{k}:

**r**_{1} = *c*_{11}**b**_{1} + *c*_{12}**b**_{2} + … + *c*_{1k}**b**_{k}

**r**_{2} = *c*_{21}**b**_{1} + *c*_{22}**b**_{2} + … + *c*_{2k}**b**_{k}

⋮

**r**_{m} = *c*_{m1}**b**_{1} + *c*_{m2}**b**_{2} + … + *c*_{mk}**b**_{k}

Since two vectors in ℜ^{n} are equal if and only if corresponding components are equal, equating the *j*^{th} component of **r**_{i} with the corresponding components of the vectors **b**_{1}, **b**_{2}, …, **b**_{k} on the right hand side of the above expressions we get

*a*_{1j} = *c*_{11}*b*_{1j} + *c*_{12}*b*_{2j} + … + *c*_{1k}*b*_{kj}

*a*_{2j} = *c*_{21}*b*_{1j} + *c*_{22}*b*_{2j} + … + *c*_{2k}*b*_{kj}

⋮

*a*_{mj} = *c*_{m1}*b*_{1j} + *c*_{m2}*b*_{2j} + … + *c*_{mk}*b*_{kj}

The left hand sides form the *j*^{th} column vector of *A* for *j* = 1, 2, …, *n*, and the right hand sides show that each column vector of *A* lies in the vector space spanned by the *k* fixed vectors [*c*_{11}, *c*_{21}, …, *c*_{m1}]^{T}, [*c*_{12}, *c*_{22}, …, *c*_{m2}]^{T}, …, [*c*_{1k}, *c*_{2k}, …, *c*_{mk}]^{T}.

Therefore, the set
{ [*c*_{11}, *c*_{21}, …, *c*_{m1}]^{T},
[*c*_{12}, *c*_{22}, …, *c*_{m2}]^{T}, …,
[*c*_{1k}, *c*_{2k}, …, *c*_{mk}]^{T} }
spans the column space of *A*. Hence, the column space has dimension ≤ *k*.

Since we assumed *k* to be the dimension of the row space,

*k* = dim(row space of *A*)

and therefore

dim(column space of *A*) ≤ dim(row space of *A*)

Since *A* is arbitrary, the same conclusion applies to *A*^{T}:

dim(column space of *A*^{T}) ≤ dim(row space of *A*^{T})

Because row space of *A* = column space of *A*^{T} and column space of *A* = row space of *A*^{T}, this says

dim(row space of *A*) ≤ dim(column space of *A*)

Combining the two inequalities,

dim(column space of *A*) = dim(row space of *A*) ∎
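The theorem is easy to test numerically for any particular matrix; a sketch with a small exact Gaussian elimination (the sample matrix is our own choice):

```python
from fractions import Fraction

def rank(rows):
    """Rank of a matrix (list of rows) via Gaussian elimination
    with exact fractions; equals the number of pivot rows."""
    m = [[Fraction(x) for x in row] for row in rows]
    pivot_row = 0
    for col in range(len(m[0])):
        pivot = next((r for r in range(pivot_row, len(m)) if m[r][col] != 0), None)
        if pivot is None:
            continue
        m[pivot_row], m[pivot] = m[pivot], m[pivot_row]
        for r in range(pivot_row + 1, len(m)):
            f = m[r][col] / m[pivot_row][col]
            m[r] = [a - f * b for a, b in zip(m[r], m[pivot_row])]
        pivot_row += 1
    return pivot_row

A = [[1, 2, 0], [0, 1, 2]]           # a sample 2 x 3 matrix
At = [list(col) for col in zip(*A)]  # its transpose

# Row rank of A equals row rank of A^T, i.e. the column rank of A.
print(rank(A), rank(At))  # 2 2
```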

As an illustration, consider a matrix *A* whose row-echelon form has two nonzero row vectors; by the basis theorem above, the row space of *A* is two dimensional.

The transpose of matrix *A*, *A*^{T}, likewise reduces to a row-echelon form with two nonzero row vectors, so the row space of *A*^{T} is two dimensional.

Since the column vectors of *A* are the row vectors of *A*^{T}, the basis for the row space of *A*^{T} is also a basis for the column space of *A*. Thus, the column space of *A* is two dimensional; the row space and column space of *A* are both two dimensional.

Theorem: If *A* is a matrix that is not square, then either the row vectors of *A* or the column vectors of *A* are linearly dependent.

Before proving this, consider the system *A***x** = **b** such that *A* is an *m* × *n* matrix, **x** an *n* × 1 vector and **b** an *m* × 1 column vector. [*A* **b**] is the augmented matrix.

From the theorem^{(ibid. 5)} that *nonzero row vectors in a row-echelon form of a matrix A form a basis for the row space of A*, the number of nonzero row vectors gives the dimension of the row space of *A* and consequently the rank of matrix *A*.

*A***x** = **b** is consistent if and only if **b** is in the column space of *A*. Thus, for a consistent system, since the one column that [*A* **b**] has beyond those of *A*, namely **b**, must be in the column space of *A*, the number of nonzero row vectors following row operations on *A* and on [*A* **b**] will be equal.

Therefore, for a consistent system *A***x** = **b**, matrices *A* and [*A* **b**] will have equal ranks.
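This rank test can be sketched on the two systems from the first example, which were consistent for **w** and inconsistent for **w**':

```python
from fractions import Fraction

def rank(rows):
    """Rank of a matrix (list of rows) via Gaussian elimination
    with exact fractions; equals the number of pivot rows."""
    m = [[Fraction(x) for x in row] for row in rows]
    pivot_row = 0
    for col in range(len(m[0])):
        pivot = next((r for r in range(pivot_row, len(m)) if m[r][col] != 0), None)
        if pivot is None:
            continue
        m[pivot_row], m[pivot] = m[pivot], m[pivot_row]
        for r in range(pivot_row + 1, len(m)):
            f = m[r][col] / m[pivot_row][col]
            m[r] = [a - f * b for a, b in zip(m[r], m[pivot_row])]
        pivot_row += 1
    return pivot_row

A = [[1, 6], [2, 4], [-1, 2]]  # columns are u and v from the first example

aug_w  = [row + [b] for row, b in zip(A, [9, 2, 7])]   # right-hand side w
aug_wp = [row + [b] for row, b in zip(A, [4, -1, 8])]  # right-hand side w'

print(rank(A), rank(aug_w), rank(aug_wp))  # 2 2 3
```

Equal ranks for *A* and [*A* **w**] confirm consistency; the jump in rank for [*A* **w**'] confirms inconsistency.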

## Rank of a matrix

The dimension of the row space (equivalently, the column space) of a matrix *A* is called the **rank** of *A*.

For example, for a matrix *A* whose row-echelon form has two nonzero row vectors, the row space is two dimensional. From the theorem that the row space and column space of *A* have the same dimension, the column space of *A* is also two dimensional. The rank of *A* is 2.

If *A*_{m × n} is a matrix and *m* ≠ *n*, then the *largest possible rank* is

*m* if *m* < *n*

*n* if *m* > *n*

This follows from two earlier results: *the dimension of the row space and column space of any A*_{m × n} *matrix is the same*, and *the nonzero row vectors in a row-echelon form of any A*_{m × n} *matrix form a basis for the row space*. A row-echelon form of *A*_{m × n} has at most *m* nonzero rows, and applying the same argument to *A*^{T} gives at most *n*. Thus if *m* < *n*, then

dimension of row space = dimension of column space ≤ *m*

and if *m* > *n*, then

dimension of row space = dimension of column space ≤ *n*

Hence, if *m* < *n* the largest possible rank is *m*, and if *m* > *n* the largest possible rank is *n*. ∎
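A quick check of the bound with a sample non-square matrix (our own choice):

```python
from fractions import Fraction

def rank(rows):
    """Rank of a matrix (list of rows) via Gaussian elimination
    with exact fractions; equals the number of pivot rows."""
    m = [[Fraction(x) for x in row] for row in rows]
    pivot_row = 0
    for col in range(len(m[0])):
        pivot = next((r for r in range(pivot_row, len(m)) if m[r][col] != 0), None)
        if pivot is None:
            continue
        m[pivot_row], m[pivot] = m[pivot], m[pivot_row]
        for r in range(pivot_row + 1, len(m)):
            f = m[r][col] / m[pivot_row][col]
            m[r] = [a - f * b for a, b in zip(m[r], m[pivot_row])]
        pivot_row += 1
    return pivot_row

wide = [[1, 2, 3, 4], [5, 6, 7, 8]]   # 2 x 4: rank can be at most 2
tall = [list(c) for c in zip(*wide)]  # 4 x 2: rank can be at most 2

print(rank(wide), rank(tall))  # 2 2
```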

Since the definition of rank refers to the number of vectors in a set that forms a basis, and since a basis must span the row (or column) space and be linearly independent, if *A*_{m × n} is a matrix and *m* ≠ *n*, then

if *m* < *n*, the *n* column vectors of *A* lie in a column space of dimension at most *m* < *n*, so the column vectors of *A* are linearly dependent

if *m* > *n*, likewise the row vectors of *A* are linearly dependent ∎

In other words, if *A* is a matrix that is not square, then either the row vectors of *A* or the column vectors of *A* are linearly dependent.