*vector*

- as a directed line segment (an arrow)
- as an arrow in a rectangular coordinate system (shown for 2D, but in a coordinate system in general)
- as the components of the arrow in the coordinate system

Notice that these representations introduce *geometric properties* of a vector. But the representation of vectors as components (coordinate components) gives vectors a mathematical existence in the sense that the vectors can be analyzed numerically, i.e., an *analytical or numerical property*.

By the late 19th century mathematicians concluded that there is "no need to stop at triplets" to define vectors. That is, even though we can no longer picture vectors geometrically beyond 3D, one can analyze vectors as

quadruples of numbers (*a*_{1}, *a*_{2}, *a*_{3}, *a*_{4}) for 4D

quintuples of numbers (*a*_{1}, *a*_{2}, *a*_{3}, *a*_{4}, *a*_{5}) for 5D

⋮

If *n* is a positive integer, then an **ordered *n*-tuple** is a sequence of *n* real numbers (*a*_{1}, *a*_{2}, …, *a*_{n}). The set of all ordered *n*-tuples { (*a*_{1}, *a*_{2}, …, *a*_{n}), (*b*_{1}, *b*_{2}, …, *b*_{n}), (*c*_{1}, *c*_{2}, …, *c*_{n}), … } is called ***n*-space** and is denoted by ℜ^{n}.

- ℜ^{n}: ordered *n*-tuples or *n*-space
- ℜ^{2}: ordered pairs or 2-space
- ℜ^{3}: ordered triples or 3-space

## Analytic properties of a vector provide deeper insight

The symbol (*a*_{1}, *a*_{2}, *a*_{3}) in 3-space has two geometric interpretations

- as a point with coordinates (*a*_{1}, *a*_{2}, *a*_{3})
- as a vector with components (*a*_{1}, *a*_{2}, *a*_{3})

Similarly, the symbol (*a*_{1}, *a*_{2}, …, *a*_{n}) in *n*-space is

- a generalized point with coordinates (*a*_{1}, *a*_{2}, …, *a*_{n})
- a generalized vector with components (*a*_{1}, *a*_{2}, …, *a*_{n})

Whichever interpretation is chosen, the *mathematical distinction is unimportant*.

In algebraic form, a vector can be denoted using either *coordinate* notation or *matrix* notation.

For a vector **u** in ℜ^{n} its

coordinate notation is (*u*_{1}, *u*_{2}, …, *u*_{n})

matrix notation is

[*u*_{1}, *u*_{2}, …, *u*_{n}] as row elements

[*u*_{1}, *u*_{2}, …, *u*_{n}]^{T} as column elements

If **u** = (*u*_{1}, *u*_{2}, …, *u*_{n}) and **v** = (*v*_{1}, *v*_{2}, …, *v*_{n}) are vectors in ℜ^{n} then **u** and **v** are called **equivalent vectors** if and only if

*u*_{1} = *v*_{1}, *u*_{2} = *v*_{2}, …, *u*_{n} = *v*_{n}

in which case we write **u** = **v**.

Because of how vectors are defined, equivalence implies equality. Therefore, **u** and **v** are **equal** vectors.

## Standard operations on ℜ^{n}

If *k* is a scalar, and **u** = (*u*_{1}, *u*_{2}, …, *u*_{n}) and **v** = (*v*_{1}, *v*_{2}, …, *v*_{n}) are vectors in ℜ^{n}, the operation

**u** + **v** = (*u*_{1} + *v*_{1}, *u*_{2} + *v*_{2}, …, *u*_{n} + *v*_{n})

is called the **sum** **u** + **v**, which is a vector in ℜ^{n}, and the operation

*k***u** = (*ku*_{1}, *ku*_{2}, …, *ku*_{n})

is called the **scalar multiple** *k***u**, also a vector in ℜ^{n}.

These operations are called **standard operations** on ℜ^{n}.
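As a concrete sketch, the standard operations can be written out in a few lines of Python (the function names `vec_add` and `scalar_mul` are our own, not from the text):

```python
# Sketch of the standard operations on R^n, representing vectors as tuples.

def vec_add(u, v):
    """Component-wise sum u + v; both vectors must live in the same R^n."""
    assert len(u) == len(v)
    return tuple(ui + vi for ui, vi in zip(u, v))

def scalar_mul(k, u):
    """Scalar multiple k*u, scaling every component by k."""
    return tuple(k * ui for ui in u)

u = (1.0, 2.0, 3.0, 4.0)   # a vector in R^4, no geometric picture needed
v = (4.0, 3.0, 2.0, 1.0)

print(vec_add(u, v))        # (5.0, 5.0, 5.0, 5.0)
print(scalar_mul(2, u))     # (2.0, 4.0, 6.0, 8.0)
```

Note that nothing in the code cares whether *n* is 2, 3, or 400; that is exactly the analytic generalization beyond geometric dimensions.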

The following theorem lets us manipulate vectors in ℜ^{n} without the need of expressing vectors as components.

If *k* and *l* are scalars and **u** = (*u*_{1}, *u*_{2}, …, *u*_{n}), **v** = (*v*_{1}, *v*_{2}, …, *v*_{n}) and **w** = (*w*_{1}, *w*_{2}, …, *w*_{n}) are vectors in ℜ^{n} then

- **u** + **v** = **v** + **u** (commutative law for addition)
- **u** + (**v** + **w**) = (**u** + **v**) + **w** (associative law for addition)
- **u** + **0** = **0** + **u** = **u** (adding zero)
- **u** + (−**u**) = **0** ⇒ **u** − **u** = **0** (negation law)
- *k*(**u** + **v**) = *k***u** + *k***v** (distributive multiplication over addition)
- *k*(*l***u**) = (*kl*)**u** (multiplication by scalar product)
- (*k* + *l*)**u** = *k***u** + *l***u** (multiplication by scalar sum)
- 1**u** = **u** (multiplying one)

The arithmetic of ℜ^{n} given by the above theorem applies to vectors represented in either coordinate notation or matrix notation. However, matrix notation is often preferred because it is easier to manipulate.

Although it will not be discussed here, there exists another operation, the Euclidean inner product **u** ⋅ **v**. Like the arithmetic properties of the standard operations mentioned in the theorem above, the inner product operation is governed by its own set of arithmetic properties.

The *n*-space ℜ^{n} on which both the standard operations and the inner product operation are defined is called *Euclidean* *n*-space.
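As a sketch of the inner product just mentioned, the Euclidean inner product is the sum of products of corresponding components (the function name `dot` is our assumption):

```python
# Euclidean inner product u . v on R^n, as a sum of component-wise products.

def dot(u, v):
    assert len(u) == len(v)
    return sum(ui * vi for ui, vi in zip(u, v))

u = (1, 2, 3)
v = (4, -5, 6)
print(dot(u, v))   # 1*4 + 2*(-5) + 3*6 = 12
```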

## Generalizing Vector Space

The notion of vectors need not be restricted to vectors in *n*-space, ℜ^{n}. One can abstract the most important properties of ℜ^{n} such that each abstracted property is considered an axiom; the collection of abstracted properties forms a set of axioms.

Vectors in ℜ^{n} automatically satisfy the set of axioms. However, other objects may satisfy the axioms as well. The *class of objects that satisfy the set of axioms* are the generalized vectors. This new concept of vectors includes

- old vectors: vectors in ℜ^{n}
- new vectors: any class of objects that satisfy the set of axioms

Suppose *k* and *l* are real scalars and *V* is an arbitrary set of objects, say *V* = { **u**, **v**, **w**, … }, on which two rules are defined

- addition: Given any two objects **u** and **v** in *V*, this rule associates the pair with an object **u** + **v**, the **sum** of **u** and **v**.
- scalar multiplication: Given any scalar *k* and any object **u** in *V*, this rule associates the scalar and the object with an object *k***u**, the **scalar multiple** of **u** by *k*.

If all objects in *V* (say, **u**, **v**, **w**) and all scalars (say, *k*, *l*) satisfy the following ten *axioms*

- ➀ Closure law for addition: If **u**, **v** ∈ *V*, then **u** + **v** ∈ *V*
- ➁ Commutative law for addition: **u** + **v** = **v** + **u**
- ➂ Associative law for addition: **u** + (**v** + **w**) = (**u** + **v**) + **w**
- ➃ Adding zero: There exists **0** ∈ *V* such that for all **u** ∈ *V*, **u** + **0** = **0** + **u** = **u**; **0** is called the **zero vector** for *V*
- ➄ Negation law: For each **u** ∈ *V* there exists −**u** ∈ *V* such that **u** + (−**u**) = **u** − **u** = **0**, and with the commutative law for addition (axiom ➁), **u** + (−**u**) = (−**u**) + **u** = **0**; −**u** is called the **negative** of **u**
- ➅ Closure law for multiplication: If *k* is any real scalar and **u** ∈ *V*, then *k***u** ∈ *V*
- ➆ Distributive multiplication over addition: *k*(**u** + **v**) = *k***u** + *k***v**
- ➇ Multiplication by scalar sum: (*k* + *l*)**u** = *k***u** + *l***u**
- ➈ Multiplication by scalar product: *k*(*l***u**) = (*kl*)**u**
- ➉ Multiplying one: 1**u** = **u**

then *V* is called a **vector space**.

If *k* and *l* are complex scalars, then *V* is called a **complex vector space**.

Notice that in this definition *neither the nature of the vectors nor the operations are specified*. Therefore, *any object that satisfies the ten axioms* is a candidate to be a vector. ❹
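One way to probe whether a candidate set behaves like vectors is to spot-check the axioms on sample elements. A hedged numeric sketch for ℜ^{3} with the standard operations (the helper names `add` and `mul` are ours; passing these checks does not prove the axioms, but failing them would disprove):

```python
# Spot-check several of the ten axioms for R^3 under the standard operations.

def add(u, v):
    return tuple(a + b for a, b in zip(u, v))

def mul(k, u):
    return tuple(k * a for a in u)

u, v, w = (1, 2, 3), (4, 5, 6), (7, 8, 9)
k, l = 2, 3
zero = (0, 0, 0)

assert add(u, v) == add(v, u)                          # axiom 2
assert add(u, add(v, w)) == add(add(u, v), w)          # axiom 3
assert add(u, zero) == u                               # axiom 4
assert add(u, mul(-1, u)) == zero                      # axiom 5
assert mul(k, add(u, v)) == add(mul(k, u), mul(k, v))  # axiom 7
assert mul(k + l, u) == add(mul(k, u), mul(l, u))      # axiom 8
assert mul(k, mul(l, u)) == mul(k * l, u)              # axiom 9
assert mul(1, u) == u                                  # axiom 10
print("all sampled axioms hold")
```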

## Investigating if objects satisfy the set for generalized vectors

The definition of the *general vector space* helps us expand the notion of vectors beyond vectors defined in *n*-space, ℜ^{n}. The set *V* = ℜ^{n} with the standard operations (addition and scalar multiplication) is by definition a vector space.

Axioms ➀ and ➅ reflect the definitions of the standard operations on ℜ^{n}. The rest of the axioms are the arithmetic properties of the standard operations given by the theorem (see ❸) that enables manipulation of elements in ℜ^{n}. Thus, the elements in ℜ^{n} satisfy all ten axioms. Therefore, these elements or points in *V* = ℜ^{n} continue to be vectors in the context of the definition of the general vector space.

One may ask the questions

- Are all the points on a plane in ℜ^{n} vectors? That is, is *V* = the set of points on a plane in ℜ^{n} a vector space?
- Are all the points on a line in ℜ^{n} vectors? That is, is *V* = the set of points on a line in ℜ^{n} a vector space?
- Are all the points in one corner region of ℜ^{n} vectors? That is, is *V* = the set of points in a quadrant of ℜ^{n = 2} a vector space?
- Is a function whose values correspond to points in ℜ^{n} a vector? That is, is *V* = the set of real functions a vector space?
- Can matrices be considered vectors in the general sense? That is, is *V* = the set of matrices a vector space?

Consider a plane *V* that passes through the origin in ℜ^{3} = {(*a*_{1}, *a*_{2}, *a*_{3}) ∣ *a*_{i} ∈ ℜ}. Do points in *V* form a vector space?

We know ℜ^{n = 3} is a vector space. Since the axioms addressing the mechanics of arithmetic operations, ➁, ➂, ➆, ➇, ➈ and ➉, are satisfied by all points in ℜ^{3}, these axioms will hold for all points in the plane *V*.

Do all points in *V* satisfy axioms ➀, ➃, ➄ and ➅?

### Checking for axiom ➀

Since *a*, *b*, *c* and *d* are constants and *a*, *b*, *c* are not all zero, the equation

*ax* + *by* + *cz* + *d* = 0

describes a plane having **n** = (*a*, *b*, *c*) as a normal to the plane.

If *a* ≠ 0 the equation can be rewritten as

*a*(*x* + (*d*/*a*)) + *by* + *cz* = 0

which is the point-normal form of the plane through the point (−*d*/*a*, 0, 0). Thus, for *d* = 0 the point-normal form passing through (0, 0, 0) is

*ax* + *by* + *cz* = 0

That is, every plane *V* through the origin (0, 0, 0) has an equation of the form

*ax* + *by* + *cz* = 0

Let two points on the plane be **u** = (*u*_{1}, *u*_{2}, *u*_{3}) and **v** = (*v*_{1}, *v*_{2}, *v*_{3}). Is the sum **u** + **v** = (*u*_{1} + *v*_{1}, *u*_{2} + *v*_{2}, *u*_{3} + *v*_{3}) a point on the plane *V*?

We know that the plane equation at point **u** is

*au*_{1} + *bu*_{2} + *cu*_{3} = 0

and at point **v** is

*av*_{1} + *bv*_{2} + *cv*_{3} = 0

Adding these two equations, the plane equation at the point **u** + **v** will be

*a*(*u*_{1} + *v*_{1}) + *b*(*u*_{2} + *v*_{2}) + *c*(*u*_{3} + *v*_{3}) = 0

which has the form of the equation of the plane *V* through the origin

*ax* + *by* + *cz* = 0

Thus the point **u** + **v** will lie on the plane passing through the origin; **u** + **v** will lie on *V*. Therefore, the closure law for addition,

**axiom ➀ is satisfied**.

### Checking for axiom ➃

Multiplying through the plane equation at point **u** by 0

*a*(0 ⋅ *u*_{1}) + *b*(0 ⋅ *u*_{2}) + *c*(0 ⋅ *u*_{3}) = 0

or

*a* ⋅ 0 + *b* ⋅ 0 + *c* ⋅ 0 = 0

which is the plane equation at the point **0** = (0, 0, 0). The point will lie on *V* because it satisfies the equation of the plane *V* through the origin

*ax* + *by* + *cz* = 0

Adding the plane equations at **u** and **0**, the plane equation at the point **u** + **0** will be

*au*_{1} + *bu*_{2} + *cu*_{3} = 0

*a* ⋅ 0 + *b* ⋅ 0 + *c* ⋅ 0 = 0

⇒ *au*_{1} + *bu*_{2} + *cu*_{3} = 0

which is the plane equation at the point **u**; that is, **u** + **0** = **u**.

Similarly, it can be shown that the sum **0** + **u** = **u**. Thus,

**u** + **0** = **0** + **u** = **u**

**axiom ➃ is satisfied**.

### Checking for axiom ➄

Multiplying through the plane equation at point **u** by −1

*a*(−1 ⋅ *u*_{1}) + *b*(−1 ⋅ *u*_{2}) + *c*(−1 ⋅ *u*_{3}) = 0

or

*a*(−*u*_{1}) + *b*(−*u*_{2}) + *c*(−*u*_{3}) = 0

or

−*au*_{1} − *bu*_{2} − *cu*_{3} = 0

which is the plane equation at the point −**u** = (−*u*_{1}, −*u*_{2}, −*u*_{3}). The point will lie on *V* because it satisfies the equation of the plane *V* through the origin

*ax* + *by* + *cz* = 0

Adding the plane equations at **u** and −**u**, the plane equation at the point **u** + (−**u**) will be

*au*_{1} + *bu*_{2} + *cu*_{3} = 0

−*au*_{1} − *bu*_{2} − *cu*_{3} = 0

⇒ 0 + 0 + 0 = 0

or

*a* ⋅ 0 + *b* ⋅ 0 + *c* ⋅ 0 = 0

which is the plane equation at the point **0** = (0, 0, 0). From above we know that this point lies on *V*. Thus,

**u** + (−**u**) = **u** − **u** = **0**

**axiom ➄ is satisfied**.

### Checking for axiom ➅

Multiplying through the plane equation at point **u** by *k*

*a*(*ku*_{1}) + *b*(*ku*_{2}) + *c*(*ku*_{3}) = 0

which is the plane equation at the point *k***u** = (*ku*_{1}, *ku*_{2}, *ku*_{3}). The point will lie on *V* because it satisfies the equation of the plane *V* through the origin

*ax* + *by* + *cz* = 0

Therefore, the closure law for multiplication,

**axiom ➅ is satisfied**.

Since the above arguments show that *points on the plane V passing through the origin in* ℜ^{3} *satisfy all ten axioms, we can say that this set of points forms a vector space*. ∎
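The closure arguments above can be spot-checked numerically. A minimal sketch (the plane coefficients, sample points, and the helper `on_plane` are our own assumptions):

```python
# Numeric illustration: points on the plane ax + by + cz = 0 (through the
# origin) stay on the plane under addition and scalar multiplication.

a, b, c = 1.0, -2.0, 3.0            # plane coefficients, arbitrary, not all zero

def on_plane(p):
    x, y, z = p
    return abs(a * x + b * y + c * z) < 1e-9

u = (2.0, 1.0, 0.0)                 # 1*2 - 2*1 + 3*0 = 0, so u lies on the plane
v = (3.0, 3.0, 1.0)                 # 1*3 - 2*3 + 3*1 = 0, so v lies on the plane

s = tuple(ui + vi for ui, vi in zip(u, v))   # u + v
m = tuple(5.0 * ui for ui in u)              # k*u with k = 5

print(on_plane(u), on_plane(v), on_plane(s), on_plane(m))  # True True True True
```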

Now consider a line *V* that passes through the origin in ℜ^{3} = {(*a*_{1}, *a*_{2}, *a*_{3}) ∣ *a*_{i} ∈ ℜ}. Do points in *V* form a vector space?

We know ℜ^{n = 3} is a vector space. Since the axioms addressing the mechanics of arithmetic operations, ➁, ➂, ➆, ➇, ➈ and ➉, are satisfied by all points in ℜ^{3}, these axioms will hold for all points on the line *V*.

Do all points in *V* satisfy axioms ➀, ➃, ➄ and ➅?

### Checking for axiom ➀

Since *a*, *b* and *c* are constants and the parameter *t* ranges over −∞ < *t* < +∞, the equations

*x* = *x*_{0} + *ta*

*y* = *y*_{0} + *tb*

*z* = *z*_{0} + *tc*

are the parametric equations of a line, each point on the line corresponding to a value of *t*.

A line *V* through the origin (*x*_{0} = 0, *y*_{0} = 0, *z*_{0} = 0) has parametric equations of the form

*x* = *ta*

*y* = *tb*

*z* = *tc*

Let two points on the line be **u** = (*u*_{1}, *u*_{2}, *u*_{3}) and **v** = (*v*_{1}, *v*_{2}, *v*_{3}). Is the sum **u** + **v** = (*u*_{1} + *v*_{1}, *u*_{2} + *v*_{2}, *u*_{3} + *v*_{3}) a point on the line *V*?

We know that the parametric equations of the line at point **u** are, for some parameter value *s*,

*u*_{1} = *sa*

*u*_{2} = *sb*

*u*_{3} = *sc*

and at point **v**, for some parameter value *t*,

*v*_{1} = *ta*

*v*_{2} = *tb*

*v*_{3} = *tc*

Adding the two systems, the point **u** + **v** satisfies

*u*_{1} + *v*_{1} = (*s* + *t*)*a*

*u*_{2} + *v*_{2} = (*s* + *t*)*b*

*u*_{3} + *v*_{3} = (*s* + *t*)*c*

which are the parametric equations of the line at parameter value *s* + *t*. Depending on *s* and *t* the position of the point **u** + **v** will vary, but the point will always be on the line. Therefore, the above system is a parametric equation of the point **u** + **v** on the line *V*. Hence, the closure law for addition,

**axiom ➀ is satisfied**.

### Checking for axiom ➃

Multiplying through the parametric equations of the line at point **u** by 0

0 ⋅ *u*_{1} = 0 ⋅ *ta* ⇒ 0 = 0

0 ⋅ *u*_{2} = 0 ⋅ *tb* ⇒ 0 = 0

0 ⋅ *u*_{3} = 0 ⋅ *tc* ⇒ 0 = 0

gives the point **0** = (0, 0, 0), the point on the line at parameter value *t* = 0. The point will lie on *V* because it satisfies the parametric equations of the line *V* through the origin

*x* = *ta*

*y* = *tb*

*z* = *tc*

Adding the parametric equations at **u** and **0**, the point **u** + **0** satisfies

*u*_{1} + 0 = *ta* + 0 ⇒ *u*_{1} = *ta*

*u*_{2} + 0 = *tb* + 0 ⇒ *u*_{2} = *tb*

*u*_{3} + 0 = *tc* + 0 ⇒ *u*_{3} = *tc*

which are the parametric equations at the point **u**; that is, **u** + **0** = **u**.

Similarly, it can be shown that the sum **0** + **u** = **u**. Thus,

**u** + **0** = **0** + **u** = **u**

**axiom ➃ is satisfied**.

### Checking for axiom ➄

Multiplying through the parametric equations of the line at point **u** by −1

−*u*_{1} = −*ta*

−*u*_{2} = −*tb*

−*u*_{3} = −*tc*

gives the point −**u** = (−*u*_{1}, −*u*_{2}, −*u*_{3}), the point on the line at parameter value −*t*. The point will lie on *V* because it satisfies the parametric equations of the line *V* through the origin

*x* = *ta*

*y* = *tb*

*z* = *tc*

Adding the parametric equations at **u** and −**u**, the point **u** + (−**u**) satisfies

*u*_{1} + (−*u*_{1}) = *ta* − *ta* = 0

*u*_{2} + (−*u*_{2}) = *tb* − *tb* = 0

*u*_{3} + (−*u*_{3}) = *tc* − *tc* = 0

which gives the point **0** = (0, 0, 0). From above we know that this point lies on *V*. Thus,

**u** + (−**u**) = **u** − **u** = **0**

**axiom ➄ is satisfied**.

### Checking for axiom ➅

Multiplying through the parametric equations of the line at point **u** by *k*

*ku*_{1} = *kta*

*ku*_{2} = *ktb*

*ku*_{3} = *ktc*

gives the point *k***u** = (*ku*_{1}, *ku*_{2}, *ku*_{3}), the point on the line at parameter value *kt*. The point will lie on *V* because it satisfies the parametric equations of the line *V* through the origin

*x* = *ta*

*y* = *tb*

*z* = *tc*

**axiom ➅ is satisfied**.

Since the above arguments show that *points on the line V passing through the origin in* ℜ^{3} *satisfy all ten axioms, we can say that this set of points forms a vector space*. ∎
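The parameter-value bookkeeping in the argument above can be spot-checked numerically. A small sketch (the direction numbers and the helper `point` are our assumptions):

```python
# Numeric illustration: points on the line x = ta, y = tb, z = tc (through
# the origin) stay on the line under addition and scalar multiplication.

a, b, c = 2.0, -1.0, 4.0            # direction numbers, arbitrary

def point(t):
    """The point on the line at parameter value t."""
    return (t * a, t * b, t * c)

u = point(1.5)                      # parameter value s = 1.5
v = point(-0.5)                     # parameter value t = -0.5

s_plus_t = tuple(ui + vi for ui, vi in zip(u, v))
print(s_plus_t == point(1.0))       # True: u + v sits at parameter 1.5 + (-0.5)

k = 3.0
print(tuple(k * ui for ui in u) == point(k * 1.5))  # True: k*u sits at k*s
```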

Now consider one quadrant of ℜ^{2} as the plane *V*, where ℜ^{2} = {(*a*_{1}, *a*_{2}) ∣ *a*_{i} ∈ ℜ}. Do points in *V* form a vector space?

If *V* is defined to be the set {(*a*_{1}, *a*_{2}) ∣ *a*_{i} ≥ 0}, and **u** = (*u*_{1}, *u*_{2}) is a nonzero point in *V*, then the point −**u** = (−*u*_{1}, −*u*_{2}) is not a point in *V*.

Thus, points in *V* **do not satisfy axiom ➄**, the negation law

**u** + (−**u**) = **u** − **u** = **0**

because −**u** is not a point in *V*.

Since *the points in V do not satisfy all ten axioms, we can say that this set of points does not form a vector space*. ∎
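The counterexample is easy to demonstrate numerically; one escaping point is enough to break the axiom (the helper `in_quadrant` is our assumption):

```python
# Counterexample sketch: the first quadrant of R^2 is not closed under
# negation, so axiom 5 fails and the quadrant is not a vector space.

def in_quadrant(p):
    return p[0] >= 0 and p[1] >= 0

u = (3.0, 2.0)
neg_u = (-u[0], -u[1])

print(in_quadrant(u))      # True
print(in_quadrant(neg_u))  # False: -u escapes the quadrant
```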

Now consider the set *V* of real-valued functions defined on the entire real line ℜ, that is, functions whose graphs are sets of points in ℜ^{2}. Does the set *V* form a vector space?

Suppose *u*, *v* and *w* are functions in *V*

- vector **u** = *u*, whose value at *x* is **u**(*x*) = *u*(*x*)
- vector **v** = *v*, whose value at *x* is **v**(*x*) = *v*(*x*)
- vector **w** = *w*, whose value at *x* is **w**(*x*) = *w*(*x*)

and define the operations

- function addition: (**u** + **v**)(*x*) = *u*(*x*) + *v*(*x*). Adding the value of **u** at *x* = *a* to the value of **v** at *x* = *a* we obtain the value of **u** + **v** at *x* = *a*.
- scalar multiplication of a function: (*k***u**)(*x*) = *ku*(*x*). Multiplying the value of **u** at *x* = *a* by the scalar *k* we get the value of *k***u** at *x* = *a*.

By these definitions, the function **u** + **v** is an object in *V* and the function *k***u** is an object in *V*.

Therefore, **axioms ➀ and ➅** — the axioms that reflect the definitions of the standard operations on ℜ^{n} — **are satisfied by all the functions in** *V*.

### Checking for axioms ➁ and ➂

Since (**u** + **v**)(*a*) = *u*(*a*) + *v*(*a*), and *u*(*a*), *v*(*a*) ∈ ℜ — the points (*a*, *u*(*a*)) and (*a*, *v*(*a*)) are in ℜ^{2} — the commutativity of addition of real numbers gives

*u*(*a*) + *v*(*a*) = *v*(*a*) + *u*(*a*)

or

(**u** + **v**)(*a*) = (**v** + **u**)(*a*)

Similarly, for associativity,

**u**(*a*) + (**v** + **w**)(*a*) = *u*(*a*) + [*v*(*a*) + *w*(*a*)] = [*u*(*a*) + *v*(*a*)] + *w*(*a*)

or

**u**(*a*) + (**v** + **w**)(*a*) = (**u** + **v**)(*a*) + **w**(*a*)

Since these equalities hold at every *a*,

**axioms ➁ and ➂ are satisfied** by all functions in *V*.

### Checking for axioms ➃ and ➄

The zero function is the constant function whose value **0**(*x*) is 0 for all *x* in ℜ; that is, **0** = 0. The negative of **u** is −**u** = −*u*, whose value at *x* is (−**u**)(*x*) = −*u*(*x*).

Adding the values of **0** and **u** at *x* = *a* we get

(**0** + **u**)(*a*) = 0 + *u*(*a*) = *u*(*a*) = **u**(*a*)

and similarly

(**u** + **0**)(*a*) = **u**(*a*)

Thus,

**axiom ➃ is satisfied**.

Now, adding the values of **u** and −**u** at *x* = *a* we get

(**u** + (−**u**))(*a*) = *u*(*a*) − *u*(*a*) = 0 = **0**(*a*)

Thus,

**axiom ➄ is satisfied**.

### Checking for axioms ➆ and ➇

Since functions in *V* satisfy axioms ➀ and ➅, which correspond to the definitions of the standard operations, consider the sum of two functions **u** + **v** multiplied by some scalar *k* to obtain the function *k*(**u** + **v**). Then, for some *x* = *a*

- If *k***u**(*a*) and *k***v**(*a*) are the scalar multiples at *a*, does their sum *ku*(*a*) + *kv*(*a*) equal *k*[**u**(*a*) + **v**(*a*)]?
- If *k***u**(*a*) and *l***u**(*a*) are two scalar multiples at *a*, does their sum *ku*(*a*) + *lu*(*a*) equal (*k* + *l*)**u**(*a*)?

The value of the function *k*(**u** + **v**) at *x* = *a* will be

[*k*(**u** + **v**)](*a*) = *k*(**u** + **v**)(*a*) = *k*[*u*(*a*) + *v*(*a*)]

Since *k*, *u*(*a*) and *v*(*a*) are real numbers, applying the distributive law of real numbers we know

*k*[*u*(*a*) + *v*(*a*)] = *ku*(*a*) + *kv*(*a*)

so

[*k*(**u** + **v**)](*a*) = *k***u**(*a*) + *k***v**(*a*)

Since this holds at every *x*, *k*(**u** + **v**) = *k***u** + *k***v**. Thus,

**axiom ➆ is satisfied**.

If a scalar is the sum *k* + *l* then, from the scalar multiplication of functions we get

(*k* + *l*)**u**(*a*) = (*k* + *l*)*u*(*a*) = *ku*(*a*) + *lu*(*a*) = *k***u**(*a*) + *l***u**(*a*)

Since this holds at every *x*, (*k* + *l*)**u** = *k***u** + *l***u**. Thus,

**axiom ➇ is satisfied**.

### Checking for axioms ➈ and ➉

If *l***u**(*a*) is a scalar multiple at *a* then we obtain the function *l***u**. Multiplying this newly obtained function by the scalar *k* gives the scalar multiple *k*[*l***u**(*a*)] at *a*, which yields

*k*[*l***u**(*a*)] = *k*[*lu*(*a*)]

Since *k*, *l* and *u*(*a*) are real numbers, applying the associative law of multiplication of real numbers we know

*k*[*lu*(*a*)] = (*kl*)*u*(*a*)

so

*k*[*l***u**(*a*)] = (*kl*)**u**(*a*)

Since this holds at every *x*, *k*(*l***u**) = (*kl*)**u**. Thus,

**axiom ➈ is satisfied**.

For the scalar multiple *k***u**(*a*) at *a*, if *k* = 1 we get 1**u**(*a*) as the scalar multiple at *a*; thus,

1**u**(*a*) = 1 ⋅ *u*(*a*) = *u*(*a*) = **u**(*a*)

Since this holds at every *x*, 1**u** = **u**. Thus,

**axiom ➉ is satisfied**.

Since the above arguments show that *functions in V satisfy all ten axioms we can say that this set of functions forms a vector space*. ∎
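The pointwise operations defined above translate directly into code; a sketch with two concrete functions (the helper names `f_add` and `f_scale` are our assumptions):

```python
# Sketch: real-valued functions as vectors under pointwise addition and
# scalar multiplication, as defined above.

import math

def f_add(u, v):
    """(u + v)(x) = u(x) + v(x)"""
    return lambda x: u(x) + v(x)

def f_scale(k, u):
    """(k*u)(x) = k * u(x)"""
    return lambda x: k * u(x)

u = math.sin
v = math.cos

w = f_add(u, v)             # (u + v)(x) = sin(x) + cos(x)
print(w(0.0))               # sin(0) + cos(0) = 1.0

z = f_scale(3.0, u)         # (3u)(x) = 3 sin(x)
print(z(0.0))               # 0.0
```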

Finally, consider a set *V* containing *m* × *n* matrices whose entries ∈ ℜ, on which the operations of matrix addition and scalar multiplication are defined. Then, for matrices *A*, *B*, *C* in *V* and scalars *a*, *b*, the arithmetic properties of matrices hold

- ⓐ *A* + *B* = *B* + *A*
- ⓑ *A* + (*B* + *C*) = (*A* + *B*) + *C*
- ⓒ *A*(*BC*) = (*AB*)*C*
- ⓓ *A*(*B* + *C*) = *AB* + *AC*
- ⓔ (*B* + *C*)*A* = *BA* + *CA*
- ⓕ *A*(*B* − *C*) = *AB* − *AC*
- ⓖ (*B* − *C*)*A* = *BA* − *CA*
- ⓗ *a*(*B* + *C*) = *aB* + *aC*
- ⓘ *a*(*B* − *C*) = *aB* − *aC*
- ⓙ (*a* + *b*)*C* = *aC* + *bC*
- ⓚ (*a* − *b*)*C* = *aC* − *bC*
- ⓛ (*ab*)*C* = *a*(*bC*)
- ⓜ *a*(*BC*) = (*aB*)*C* = *B*(*aC*)

In fact, *given any sum or any product of matrices, pairs of parentheses can be inserted or deleted anywhere within the expression without affecting the end result*.

Letting **u** = *A*, **v** = *B*, **w** = *C* and *k* = *a*, *l* = *b*,

ⓐ becomes **u** + **v** = **v** + **u**, or axiom ➁

ⓑ becomes **u** + (**v** + **w**) = (**u** + **v**) + **w**, or axiom ➂

ⓗ becomes *k*(**v** + **w**) = *k***v** + *k***w**, or axiom ➆

ⓙ becomes (*k* + *l*)**w** = *k***w** + *l***w**, or axiom ➇

ⓛ becomes (*kl*)**w** = *k*(*l***w**), or axiom ➈

Therefore, **axioms ➁, ➂, ➆, ➇ and ➈ are satisfied**.

We know that **0**, **u**, **v** and −**u** are objects in *V* whose matrix elements are in the real number line. We then find that

The sum **u** + **v** will be an object in *V*. Therefore, **axiom ➀ is satisfied**.

For the axiom that deals with the rule of adding zero we can show that **u** + **0** = **u**. Therefore, **axiom ➃ is satisfied**.

For the negation law we can show that **u** + (−**u**) = **0**. Therefore, **axiom ➄ is satisfied**.

For any scalar multiplication operation the scalar *k* was considered to lie on the real line, so the product *k***u** will be an object in *V*. Therefore, **axiom ➅ is satisfied**.

Finally, if *k* = 1, then 1**u** = **u**. Therefore, **axiom ➉ is satisfied**.

Since the above arguments show that *the real matrices in V satisfy all ten axioms we can say that this set of matrices forms a vector space*. ∎
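A small numeric sketch of matrices behaving as vectors, with the entrywise operations written out by hand to stay self-contained (the helper names are ours):

```python
# Sketch: 2x2 real matrices as vectors, under matrix addition and
# scalar multiplication.

def mat_add(A, B):
    return [[A[i][j] + B[i][j] for j in range(len(A[0]))] for i in range(len(A))]

def mat_scale(k, A):
    return [[k * A[i][j] for j in range(len(A[0]))] for i in range(len(A))]

A = [[1, 2], [3, 4]]
B = [[5, 6], [7, 8]]
Z = [[0, 0], [0, 0]]                     # the zero "vector" of this space

print(mat_add(A, B))                     # [[6, 8], [10, 12]]
print(mat_add(A, Z) == A)                # True: adding zero (axiom 4)
print(mat_add(A, mat_scale(-1, A)) == Z) # True: negation (axiom 5)
```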

## Zero vector space

A set *V* consisting of a single object, denoted by **0**, with the operations

**0** + **0** = **0**

*k***0** = **0**

for all scalars *k*, is a vector space; this *V* is called a *zero vector space*.

Let *V* be a vector space, **u** a vector in *V*, and *k* a scalar; then

- 0**u** = **0**
- *k***0** = **0**
- (−1)**u** = −**u**
- If *k***u** = **0**, then *k* = 0 or **u** = **0**

Proof for 0**u** = **0**

Since axiom ➇, multiplication by scalar sum, says

(*k* + *l*)**u** = *k***u** + *l***u**

letting *k* = *l* = 0 gives

(0 + 0)**u** = 0**u** + 0**u**

or

0**u** = 0**u** + 0**u**

Since axiom ➄, the negation law, tells us that the negative of 0**u** is −0**u**, adding this to both sides of the expression we get

0**u** + (−0**u**) = (0**u** + 0**u**) + (−0**u**)

Axiom ➂, the associative law for addition, says **u** + (**v** + **w**) = (**u** + **v**) + **w**, so the right side regroups as

0**u** + (−0**u**) = 0**u** + [0**u** + (−0**u**)]

Axiom ➄, the negation law, says **u** + (−**u**) = **u** − **u** = **0**, so 0**u** + (−0**u**) = **0** and the expression becomes

**0** = 0**u** + **0**

From axiom ➃, adding zero, we know **u** + **0** = **0** + **u** = **u**, so

**0** = 0**u** + **0**

becomes

**0** = 0**u** ∎

Proof for *k***0** = **0**

Since axiom ➆, distributive multiplication over addition, says

*k*(**u** + **v**) = *k***u** + *k***v**

letting **u** = **v** = **0** gives

*k*(**0** + **0**) = *k***0** + *k***0**

From axiom ➃, adding zero, we know **u** + **0** = **0** + **u** = **u** ⇒ **0** + **0** = **0**, so the left side becomes

*k***0** = *k***0** + *k***0**

Adding −*k***0** (the negative guaranteed by axiom ➄) to both sides and regrouping with axiom ➂ gives

*k***0** + (−*k***0**) = *k***0** + [*k***0** + (−*k***0**)]

**0** = *k***0** + **0**

and by axiom ➃,

**0** = *k***0** ∎

Proof for (−1)**u** = −**u**

Proof can be achieved by demonstrating the negation law, axiom ➄; that is, by showing

**u** + (−1)**u** = **0**

which implies (−1)**u** = −**u**.

Since axiom ➉ tells us that 1**u** = **u**,

**u** + (−1)**u** = 1**u** + (−1)**u**

Axiom ➇, multiplication by scalar sum, (*k* + *l*)**u** = *k***u** + *l***u**, lets this become

**u** + (−1)**u** = (1 + (−1))**u** = 0**u**

We proved 0**u** = **0**, thus

**u** + (−1)**u** = **0** ∎

Proof for: if *k***u** = **0**, then *k* = 0 or **u** = **0**

If *k* = 0 there is nothing to prove, so suppose *k* ≠ 0. Multiplying both sides of *k***u** = **0** by the scalar 1/*k*, and using the result *k***0** = **0** proved above on the right side, gives

(1/*k*)(*k***u**) = (1/*k*)**0** = **0**

Axiom ➈, multiplication by scalar product, regroups the left side as

((1/*k*)*k*)**u** = 1**u** = **0**

and by axiom ➉, 1**u** = **u**; thus

**u** = **0**

That is, if *k* ≠ 0 then **u** = **0**; hence *k* = 0 or **u** = **0**. ∎

## Subspace

A subset *W* of a vector space *V* is called a **subspace** of *V* if *W* is itself a vector space under the standard operations (addition and scalar multiplication) defined on *V*.

Given *W* ⊆ *V* with *V* a vector space, to verify that *W* is a subspace of *V* one needs to test only the axioms

- ➀ Closure law for addition: If **u**, **v** ∈ *W*, then **u** + **v** ∈ *W*
- ➃ Adding zero: There exists **0** ∈ *W* such that for all **u** ∈ *W*, **u** + **0** = **0** + **u** = **u**
- ➄ Negation law: For each **u** ∈ *W* there exists −**u** ∈ *W* such that **u** + (−**u**) = **u** − **u** = **0**
- ➅ Closure law for multiplication: If *k* is any real scalar and **u** ∈ *W*, then *k***u** ∈ *W*

The remaining axioms are inherited from *V*.

If *W* is a set of one or more vectors from a vector space *V*, then *W* is a **subspace** of *V* if and only if the following conditions hold.

- *W* is **closed under addition**: If **u** and **v** are any vectors in *W*, then **u** + **v** is in *W*.
- *W* is **closed under scalar multiplication**: If *k* is any scalar and **u** is any vector in *W*, then *k***u** is in *W*.

Let *V* be a vector space such that *W* ⊆ *V* and the two conditions hold

- If **u** and **v** are any vectors in *W*, then **u** + **v** is in *W*.
- If *k* is any scalar and **u** is any vector in *W*, then *k***u** is in *W*.

Then *W* **satisfies axioms ➀ and ➅** because the two conditions reflect those axioms.

Since *V* is a vector space and *W* ⊆ *V*, all objects in *W* automatically **satisfy axioms ➁, ➂, ➆, ➇, ➈ and ➉**.

Therefore, to prove *W* is a vector space and hence a subspace of *V* one only needs to check for axioms ➃ and ➄.

Let **u** be any vector in *W*. The second condition (above) tells us that for every scalar *k*, *k***u** ∈ *W*. Then,

- For *k* = 0, 0**u** = **0** is in *W*
- For *k* = −1, (−1)**u** = −**u** is in *W*

Therefore, **axioms ➃ and ➄ are satisfied**.

Since *all objects in W satisfy all ten axioms and W is a subset of the vector space V, we can say that the set W is a subspace of V*. ∎
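The two-condition subspace test lends itself to numeric spot-checks. A sketch for the candidate subspace *W* = the plane *x* + *y* + *z* = 0 in ℜ^{3} (the membership test `in_W` and the sample points are our assumptions):

```python
# Sketch of the two-condition subspace test for W = {(x, y, z) : x + y + z = 0}
# in R^3, spot-checked on sample points and scalars.

def in_W(p):
    return abs(sum(p)) < 1e-9            # membership: x + y + z = 0

samples = [(1.0, -1.0, 0.0), (2.0, 3.0, -5.0)]
scalars = [-2.0, 0.0, 0.5, 7.0]

closed_add = all(
    in_W(tuple(a + b for a, b in zip(u, v)))
    for u in samples for v in samples)

closed_mul = all(
    in_W(tuple(k * a for a in u))
    for u in samples for k in scalars)

print(closed_add and closed_mul)         # True: W passes the subspace test here
```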

Every vector space *V* has *at least two subspaces*: *V* itself and the set {**0**}. The set {**0**} is called the **zero subspace**.