Vector space

From Conservapedia


A vector space is a set of objects that can be added together and multiplied by elements of another set, while satisfying certain properties. Elements of the first set are called "vectors" while elements of the second set are called "scalars". The idea of a vector space is one of the most fundamental and important concepts in mathematics, physics, and engineering.

In two dimensions, a vector has a "magnitude" (or length) and a "direction" (or angle). Perhaps the simplest vector to visualize is the velocity vector, showing the speed and direction of motion of a particle. Other extremely common vectors are the electric field and magnetic field vectors, though vectors abound in numerous areas of mathematics, physics, mechanical engineering, and aspects of electrical engineering.

The most important operations involving vectors are the vector sum and the vector-scalar product. As an example of the first, if we are in a train traveling with speed given by one vector, and we throw something inside the train with a velocity, relative to the train, of another vector, the velocity of the object relative to a fixed observer is the sum of those two vectors. As an example of the second, if we double the current through an electromagnet, its magnetic field vector will be multiplied by the number 2. That is, its direction will be unchanged and its magnitude will double.

A vector space is a set of vectors that can be added to each other or multiplied by a "scalar". (The term "scalar" is used for treatments of unusual vector spaces—see below. For the straightforward case, think of a scalar as just an ordinary real number.)

Not everything is a vector: some examples of things that are not vectors are air temperature, pressure, and electrostatic voltage. These have no direction; they are scalars. But there is a special type of derivative, the "gradient" of a scalar, that is a vector: it measures how the scalar changes from one point in space to another, pointing in the direction in which the scalar increases fastest.
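
The gradient can be sketched numerically. The temperature field T and the finite-difference scheme below are illustrative assumptions, not something given in the text:

```python
# A hedged sketch: approximating the gradient of a scalar field
# (here a made-up "temperature" function) by central differences.

def T(x, y, z):
    """A sample scalar field: T = x^2 + 2y + z."""
    return x * x + 2 * y + z

def gradient(f, p, h=1e-6):
    """Central-difference approximation of the gradient of f at point p."""
    grads = []
    for i in range(len(p)):
        fwd = list(p); fwd[i] += h
        bwd = list(p); bwd[i] -= h
        grads.append((f(*fwd) - f(*bwd)) / (2 * h))
    return tuple(grads)

# At (1, 0, 0) the exact gradient of T is (2, 2, 1).
g = gradient(T, (1.0, 0.0, 0.0))
```

The scalar field assigns a number to each point; its gradient assigns a vector to each point, which is why the gradient is the standard bridge between scalars and vectors.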

Vector spaces have a "dimension". In the physically simple cases, that dimension is usually just 2 or 3. Vectors drawn as arrows on a piece of paper are two-dimensional vectors. Vectors giving velocity, electric field strength, etc., in real 3-dimensional space are three-dimensional vectors. Given a choice of "coordinate system" or "basis" for representing vectors, any vector can be denoted by 2 or 3 (or whatever the dimension is) scalars. So, for example, a particle's velocity vector can be represented by its x-velocity, y-velocity, and z-velocity. These numbers are called the "components" of the vector, and are generally written with subscripts running from 1 to the dimension of the space. So a vector might be represented as

\vec{v} = (v_1,v_2,v_3)


\vec{w} = (w_1,w_2,w_3)

When represented in this way, the vector sum is very straightforward:

 \vec{v} + \vec{w} = (v_1 + w_1,v_2 + w_2,v_3 + w_3)

and the vector-scalar product is equally straightforward:

 a \cdot \vec{v} = (a v_1,a v_2,a v_3)
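
These componentwise formulas can be sketched in code. The train and magnet numbers below are illustrative values, not from the text:

```python
# A minimal sketch of the two basic vector operations on plain 3-tuples.

def vec_add(v, w):
    """Componentwise sum of two vectors of equal dimension."""
    return tuple(a + b for a, b in zip(v, w))

def vec_scale(c, v):
    """Product of a scalar c with a vector v."""
    return tuple(c * a for a in v)

# Train moving along x at 30 m/s; ball thrown inside at 5 m/s along y.
ground_velocity = vec_add((30.0, 0.0, 0.0), (0.0, 5.0, 0.0))
print(ground_velocity)  # (30.0, 5.0, 0.0)

# Doubling the current doubles the field vector's magnitude,
# leaving its direction unchanged.
doubled_field = vec_scale(2, (1.0, 2.0, 2.0))
print(doubled_field)  # (2.0, 4.0, 4.0)
```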

Because of this representation, each vector can be thought of as a point in a Cartesian coordinate system; or, rather, as an arrow from the origin to the point. For example, \vec{v} = (4,5,6) can be thought of as the arrow from the origin (0,0,0) to the point (4,5,6). In this case, (4,5,6) would be called the head of \vec{v}, and (0,0,0) would be called the tail.

Two vectors can be multiplied in two ways. The dot product of two vectors is a scalar, while the cross product is another vector.
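
Both products can be sketched for 3-dimensional tuples (the cross product, unlike the dot product, is specific to three dimensions):

```python
# Dot product (a scalar) and cross product (a vector) for 3-tuples.

def dot(v, w):
    """Dot product: sum of products of corresponding components."""
    return sum(a * b for a, b in zip(v, w))

def cross(v, w):
    """Cross product of two 3-dimensional vectors: another vector,
    perpendicular to both inputs."""
    return (v[1] * w[2] - v[2] * w[1],
            v[2] * w[0] - v[0] * w[2],
            v[0] * w[1] - v[1] * w[0])

print(dot((1, 0, 0), (0, 1, 0)))    # 0: the vectors are perpendicular
print(cross((1, 0, 0), (0, 1, 0)))  # (0, 0, 1): perpendicular to both
```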

Vector spaces are the fundamental objects of study of linear algebra and, due to their usefulness with gradients, advanced calculus.


Examples
  1. The space \mathbb R^n of n-tuples of real numbers is a vector space, where to add two vectors we simply add the corresponding components. The case n = 2 is exactly the case of vectors in the plane discussed above.
  2. The set \mathbb R[x] of polynomials with real coefficients is a vector space. If we add two polynomials together, we get another polynomial, and similarly, if we multiply a polynomial by a constant, we get another polynomial. Note that although it's also possible to multiply two polynomials and get another one, this is not part of the vector space structure: a vector space with a reasonable notion of multiplication of vectors is called an algebra.
  3. The set of polynomials of degree less than or equal to n (for any n \geq 0) is a vector space, for the same reason.
  4. The set of all continuous functions on the real line is a vector space: the sum of two continuous functions is again continuous, as is the product of a continuous function with a constant.
  5. The set of 2 \times 2 matrices M_{2 \times 2}(\mathbb R) is a vector space.
  6. If V and W are vector spaces, then we can form a new vector space V \oplus W (called the direct sum of V and W) whose elements are ordered pairs (v,w) with v \in V and w \in W; addition and scalar multiplication act on each coordinate separately.
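
Example 2 can be made concrete by storing a polynomial a_0 + a_1 x + a_2 x^2 + \ldots as its list of coefficients, so that vector addition is coefficient-wise addition. This representation is a sketch, not a full polynomial library:

```python
# Polynomials with real coefficients as a vector space:
# adding two polynomials or scaling one by a constant
# always yields another polynomial.
from itertools import zip_longest

def poly_add(p, q):
    """Add polynomials of possibly different degrees, padding with 0."""
    return [a + b for a, b in zip_longest(p, q, fillvalue=0.0)]

def poly_scale(c, p):
    """Multiply every coefficient by the scalar c."""
    return [c * a for a in p]

p = [1.0, 2.0]        # 1 + 2x
q = [0.0, 1.0, 3.0]   # x + 3x^2
print(poly_add(p, q))       # [1.0, 3.0, 3.0], i.e. 1 + 3x + 3x^2
print(poly_scale(2.0, p))   # [2.0, 4.0], i.e. 2 + 4x
```

Note that nothing here multiplies two polynomials together; as the text says, that extra operation belongs to the algebra structure, not the vector space structure.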

Properties
Many familiar properties of vectors in the plane carry over to general vector spaces. For example, just as the plane is 2-dimensional, it makes sense to talk about the dimension of any vector space (though it may be infinite, as in the case of polynomials!). Vectors in the plane can all be written in the form ae1 + be2, where e1 = (1,0) and e2 = (0,1); a set of linearly independent vectors with this same property, that every vector can be written as a sum of scalar multiples of vectors in the set, is called a basis. Having a convenient basis often makes computations easier. It turns out that every finite dimensional vector space has a basis -- in fact, if we're feeling adventurous and assume the Axiom of Choice, even every infinite dimensional vector space has a basis.
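
The standard basis of the plane can be sketched directly; the point of the example is that the coefficients in the basis expansion are exactly the components of the vector:

```python
# Writing a plane vector as a combination of the standard basis
# vectors e1 = (1, 0) and e2 = (0, 1).
e1 = (1.0, 0.0)
e2 = (0.0, 1.0)

def combine(a, b):
    """Return the vector a*e1 + b*e2, computed componentwise."""
    return tuple(a * x + b * y for x, y in zip(e1, e2))

v = combine(4.0, 5.0)
print(v)  # (4.0, 5.0): the coefficients are exactly the components
```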

However, a general vector space has no notion of "distance": given a vector, there's not necessarily a way to define the length | v | of that vector. For example, it's not obvious how we should define the length of a polynomial or a matrix. A vector space endowed with such a notion of length is called a normed vector space.
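
The point that a norm is an extra choice can be illustrated in code. The Frobenius norm below is one common choice for matrices, treating the entries as a flat list of components; it is an assumption of this sketch, not the only possibility:

```python
import math

# The Euclidean norm: the familiar length of a vector in R^n.
def euclidean_norm(v):
    return math.sqrt(sum(a * a for a in v))

# For 2x2 matrices there is no single obvious "length"; the Frobenius
# norm (the Euclidean norm of the four entries) is one common option.
def frobenius_norm(m):
    return math.sqrt(sum(a * a for row in m for a in row))

print(euclidean_norm((3.0, 4.0)))                # 5.0
print(frobenius_norm([[1.0, 2.0], [2.0, 4.0]]))  # 5.0
```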

Formal definition
The above discussion only considers vector spaces in which the scalars are real numbers, but we could just as well talk about the set of polynomials with complex coefficients, where we multiply by complex scalars. More generally, given any field F, a vector space over F is an abelian group (a set with a commutative, associative addition, an additive identity, and additive inverses) together with a scalar multiplication by elements of F, such that multiplying by a product of scalars is the same as multiplying by each in turn, the multiplicative identity of F acts as an identity on vectors, and the two distributive laws hold. In the language of abstract algebra, a vector space is simply a module whose ground ring is a field.

Specifically, let V be a vector space over a field F. Then for all u, v, w ∈ V and a, b ∈ F the following axioms hold:

Vector Addition
1. Commutativity: u + v = v + u.
2. Associativity: (u + v) + w = u + (v + w).
3. Identity: There exists a 0 ∈ V such that v + 0 = 0 + v = v.
4. Inverse: For all v there exists a (-v) such that v + (-v) = (-v) + v = 0.

Scalar Multiplication
5. Associativity: a(bv) = (ab)v.
6. Identity: For 1 ∈ F (i.e., the multiplicative identity of F) it follows that 1v = v.

Distributive Laws
7. Scalar sums: (a + b)v = av + bv.
8. Vector sums: a(v + w) = av + aw.
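
The axioms can be spot-checked numerically for R^3 with componentwise operations; the sample vectors and scalars below are arbitrary (a passing check does not prove the axioms, but a failing one would disprove them):

```python
# Spot-checking the vector space axioms on R^3 tuples.

def add(u, v):
    return tuple(a + b for a, b in zip(u, v))

def scale(a, v):
    return tuple(a * x for x in v)

u, v, w = (1.0, 2.0, 3.0), (4.0, 5.0, 6.0), (7.0, 8.0, 9.0)
a, b = 2.0, 3.0
zero = (0.0, 0.0, 0.0)

assert add(u, v) == add(v, u)                            # 1. commutativity
assert add(add(u, v), w) == add(u, add(v, w))            # 2. associativity
assert add(v, zero) == v                                 # 3. identity
assert add(v, scale(-1.0, v)) == zero                    # 4. inverse
assert scale(a, scale(b, v)) == scale(a * b, v)          # 5. associativity
assert scale(1.0, v) == v                                # 6. identity
assert add(scale(a, v), scale(b, v)) == scale(a + b, v)  # 7. scalar sums
assert add(scale(a, v), scale(a, w)) == scale(a, add(v, w))  # 8. vector sums
print("all axioms hold for these samples")
```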


