
L E C T U R E 7

Transformation of the coordinates at transition to a new basis

In a linear space a basis can be chosen in more than one way, and therefore the rule for changing the coordinates of an element of the space at transition from one basis to another is of practical interest.

Let there be two bases in an n-dimensional linear space V: e1, e2, …, en (old) and e1', e2', …, en' (new), and let the dependences expressing each vector of the new basis through the vectors of the old basis be the following:

e1' = a11·e1 + a21·e2 + … + an1·en
e2' = a12·e1 + a22·e2 + … + an2·en     (*)
…………………………….
en' = a1n·e1 + a2n·e2 + … + ann·en

The matrix A = (aij), whose j-th column consists of the coordinates of the vector ej' in the old basis, is called the transition matrix from the old basis to the new one.

Theorem. Every transition matrix A is regular, i.e. det A ≠ 0.

Proof: Prove the theorem for n = 2. Let e1, e2 and e1', e2' be the "old" and "new" bases of a linear space V. Let A = (aij) be the transition matrix from e1, e2 to e1', e2', i.e.

e1' = a11·e1 + a21·e2
e2' = a12·e1 + a22·e2     (**)

Assume the contrary: det A = 0, i.e. a11·a22 − a12·a21 = 0. Suppose that a21 ≠ 0 or a22 ≠ 0 (if both a21 and a22 are equal to zero, then e1' = a11·e1 and e2' = a12·e1, and obviously e1' and e2' are linearly dependent). Multiply both parts of the first equation of (**) by a22, multiply both parts of the second equation of (**) by −a21, and add the obtained expressions. We have

a22·e1' − a21·e2' = (a11·a22 − a12·a21)·e1 + (a21·a22 − a22·a21)·e2 = 0,

and this linear combination is non-trivial since a21 ≠ 0 or a22 ≠ 0, i.e. e1' and e2' are linearly dependent, contradicting the hypothesis that they form a basis.

Take an arbitrary vector x. Let x1, x2, …, xn be the coordinates of this vector in the old basis, and x1', x2', …, xn' be its coordinates in the new basis.

Theorem. The coordinates x1, …, xn and x1', …, xn' are connected by the following formulas:

x1 = a11·x1' + a12·x2' + … + a1n·xn'
x2 = a21·x1' + a22·x2' + … + a2n·xn'
…………………………….
xn = an1·x1' + an2·x2' + … + ann·xn'

which are called formulas of transformation of coordinates.

It is easy to see that the columns of the matrix A are the coefficients in the formulas (*) of transition from the old basis to the new one, while the rows of this matrix are the coefficients in the formulas expressing the old coordinates through the new ones.

Proof. By (*) we have

x = x1'·e1' + x2'·e2' + … + xn'·en'
  = x1'·(a11·e1 + a21·e2 + … + an1·en) + … + xn'·(a1n·e1 + a2n·e2 + … + ann·en)
  = (a11·x1' + … + a1n·xn')·e1 + … + (an1·x1' + … + ann·xn')·en.

On the other hand, x = x1·e1 + x2·e2 + … + xn·en. Consequently,

(x1 − a11·x1' − … − a1n·xn')·e1 + … + (xn − an1·x1' − … − ann·xn')·en = 0.

We have a linear combination of linearly independent elements that is equal to zero. It must be trivial. Then we obtain:

xi = ai1·x1' + ai2·x2' + … + ain·xn',  i = 1, 2, …, n.

In a matrix form: if X = (x1, x2, …, xn)ᵀ and X' = (x1', x2', …, xn')ᵀ, then X = A·X'.

Example. Let a vector x be given by its coordinates x1, x2 in the old basis e1, e2. Express this vector through the new basis e1', e2'.

Solution:

I way. Write the transition matrix A from the old basis to the new one: its columns are the coordinates of e1' and e2' in the old basis.

The rows of the matrix A are the coefficients in the formulas of transformation of coordinates:

x1 = a11·x1' + a12·x2'
x2 = a21·x1' + a22·x2'

Since x1 and x2 are known, by solving this system of equations we find x1' and x2'.

II way. Express e1 and e2 through e1' and e2', and substitute these expressions into the decomposition x = x1·e1 + x2·e2.

Consequently, x = x1'·e1' + x2'·e2'.
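The first method amounts to solving the linear system X = A·X'. A minimal numerical sketch with made-up data (the vectors e1' = e1 + e2, e2' = e1 − e2 and x = 3·e1 + e2 are assumptions for illustration, not the data of the original example):

```python
import numpy as np

# Columns of A are the coordinates of e1', e2' in the old basis {e1, e2}:
# e1' = e1 + e2,  e2' = e1 - e2  (hypothetical data).
A = np.array([[1.0,  1.0],
              [1.0, -1.0]])

# Coordinates of x in the old basis: x = 3*e1 + 1*e2.
X = np.array([3.0, 1.0])

# X = A @ X'  =>  solve for the new coordinates X' (A is regular).
X_new = np.linalg.solve(A, X)
print(X_new)       # -> [2. 1.], i.e. x = 2*e1' + 1*e2'

# Check: reconstructing x from the new coordinates gives the old ones back.
print(A @ X_new)   # -> [3. 1.]
```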

Example. The system of coordinates xOy has been rotated about the origin through an angle α. Express the coordinates of a vector a = x·i + y·j in the new system through its coordinates in the old system.

Solution: Decompose the vectors i' and j' on the orts i and j:

i' = cos α·i + sin α·j
j' = −sin α·i + cos α·j

Since the columns of the transition matrix consist of the coordinates of i' and j' in the old basis {i, j}, the matrix of transition from the old basis {i, j} to the new basis {i', j'} is the following:

A = ( cos α   −sin α
      sin α    cos α ).

Then we obtain X = A·X', i.e.

x = x'·cos α − y'·sin α
y = x'·sin α + y'·cos α.

Solving this system for x' and y', we get

x' = x·cos α + y·sin α
y' = −x·sin α + y·cos α.
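These rotation formulas are easy to check numerically; a sketch (the angle and test vectors are arbitrary values chosen for illustration):

```python
import math

def to_rotated_frame(x: float, y: float, alpha: float) -> tuple[float, float]:
    """Coordinates of the vector (x, y) in axes rotated by angle alpha (radians)."""
    x_new = x * math.cos(alpha) + y * math.sin(alpha)
    y_new = -x * math.sin(alpha) + y * math.cos(alpha)
    return x_new, y_new

# Rotate the axes by 90 degrees: the old unit vector i = (1, 0)
# points along the negative y'-axis in the new frame.
print(to_rotated_frame(1.0, 0.0, math.pi / 2))   # ≈ (0.0, -1.0)

# The length of a vector does not change under rotation of the axes.
xn, yn = to_rotated_frame(3.0, 4.0, 0.7)
print(math.hypot(xn, yn))                        # ≈ 5.0
```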

Subspace of a linear space

A non-empty set L of elements of a linear space V is called a subspace of the linear space V if for all x, y ∈ L and every number α both x + y ∈ L and α·x ∈ L.

Remark. Obviously, the set L is a linear space itself, since all the axioms of a linear space hold in it.

Example. The set of all vectors that are parallel to the same plane is a subspace of the space of all geometric vectors.

Example. Determine whether the set L of square matrices of the n-th order with zero first row is a linear subspace of the linear space Mn of all square matrices of the n-th order, and if yes, then find its dimension.

Solution: Indeed, take two matrices A and B with zero first row. Then the first row of A + B is zero, so A + B ∈ L, and the first row of α·A is zero for every number α, so α·A ∈ L.

Thus, L is a subspace of Mn. Obviously, dim L = n² − n = n·(n − 1), since all the entries outside the first row can be chosen arbitrarily.
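Both the closure properties and the dimension count can be illustrated in code (a sketch for n = 3; the random matrices are arbitrary test data):

```python
import numpy as np

def has_zero_first_row(M: np.ndarray) -> bool:
    return bool(np.all(M[0] == 0))

n = 3
rng = np.random.default_rng(0)
# Two arbitrary matrices from L: generate randomly, then zero the first row.
A = rng.integers(-5, 5, (n, n)).astype(float); A[0] = 0
B = rng.integers(-5, 5, (n, n)).astype(float); B[0] = 0

# Closure under addition and under multiplication by a number.
print(has_zero_first_row(A + B), has_zero_first_row(2.5 * A))  # True True

# dim L = n*(n-1): one basis matrix per free entry outside the first row.
basis = [np.outer(np.eye(n)[i], np.eye(n)[j])
         for i in range(1, n) for j in range(n)]
print(len(basis))  # 6 for n = 3
```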

Example. Determine whether the set L of singular square matrices of the 2-nd order (i.e. having zero determinant) is a linear subspace of the linear space M2 of all square matrices of the 2-nd order.

Solution: Let A = ( 1 0; 0 0 ) and B = ( 0 0; 0 1 ). Obviously, A ∈ L and B ∈ L since det A = 0 and det B = 0.

But A + B = ( 1 0; 0 1 ) and det(A + B) = 1 ≠ 0, so A + B ∉ L, i.e. L is not a subspace of M2.
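The counterexample is straightforward to verify numerically:

```python
import numpy as np

A = np.array([[1.0, 0.0],
              [0.0, 0.0]])
B = np.array([[0.0, 0.0],
              [0.0, 1.0]])

# Both matrices are singular ...
print(np.linalg.det(A), np.linalg.det(B))   # 0.0 0.0
# ... but their sum is the identity matrix, which is not:
print(np.linalg.det(A + B))                 # 1.0
```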

If x, y, z, …, u are vectors of a linear space V, then all vectors α·x + β·y + γ·z + … + λ·u, where α, β, γ, …, λ are all possible real numbers, form a subspace of the space V. This set of all linear combinations of the vectors is called a linear hull of the vectors x, y, z, …, u and is denoted by L(x, y, z, …, u).

Example. Find the dimension and a basis of the linear hull L = L(a1, a2, a3) of vectors a1, a2, a3 in the space R³.

Solution: One of the vectors, say a3, is a linear combination of a1 and a2, i.e. these vectors are linearly dependent. Since a1 and a2 are not proportional, the vectors a1 and a2 are linearly independent. Then a1 and a2 form a basis of L, and dim L = 2.
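The dimension of a linear hull is the rank of the matrix whose columns are the given vectors. A sketch on made-up data (a1, a2, a3 with a3 = a1 + a2 are assumptions for illustration):

```python
import numpy as np

# Hypothetical vectors in R^3; a3 = a1 + a2, so the three are dependent.
a1 = np.array([1.0, 0.0, 2.0])
a2 = np.array([0.0, 1.0, 1.0])
a3 = a1 + a2

M = np.column_stack([a1, a2, a3])
print(np.linalg.matrix_rank(M))    # 2, i.e. dim L(a1, a2, a3) = 2

# a1 and a2 alone already give rank 2, so they form a basis of the hull.
print(np.linalg.matrix_rank(np.column_stack([a1, a2])))   # 2
```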

If L is a subspace of a linear space V, then dim L ≤ dim V.

Let L1 and L2 be subspaces of a linear space V.

The union of L1 and L2 is the set of all elements x ∈ V such that x ∈ L1 or x ∈ L2. The union of L1 and L2 is denoted by L1 ∪ L2.

The intersection of L1 and L2 is the set of all elements simultaneously belonging to L1 and L2. The intersection of L1 and L2 is denoted by L1 ∩ L2.

The sum of L1 and L2 is the set of all elements of kind x + y where x ∈ L1 and y ∈ L2. The sum of L1 and L2 is denoted by L1 + L2.

The direct sum of L1 and L2 is the set of all elements of kind x + y where x ∈ L1, y ∈ L2, and L1 ∩ L2 = {0}. The direct sum of L1 and L2 is denoted by L1 ⊕ L2.

Theorem. Both the intersection L1 ∩ L2 and the sum L1 + L2 of subspaces L1 and L2 are subspaces of V.

Theorem. The dimension of the sum of subspaces L1 and L2 is equal to

dim(L1 + L2) = dim L1 + dim L2 − dim(L1 ∩ L2).

Proof. Let the subspace L1 ∩ L2 have a basis g1, …, gs, i.e. dim(L1 ∩ L2) = s. Complete this basis by elements f1, …, fp−s to a basis in L1 (where dim L1 = p) and by elements h1, …, hq−s to a basis in L2 (where dim L2 = q). Then every element of L1 + L2 can be decomposed on the system of elements

g1, …, gs, f1, …, fp−s, h1, …, hq−s.

Show that this system is linearly independent in L1 + L2. Consider some linear combination of these elements that is equal to zero:

α1·g1 + … + αs·gs + β1·f1 + … + βp−s·fp−s + γ1·h1 + … + γq−s·hq−s = 0.   (**)

Observe that by our construction

z = γ1·h1 + … + γq−s·hq−s = −(α1·g1 + … + αs·gs + β1·f1 + … + βp−s·fp−s) ∈ L1.

But on the other hand z ∈ L2. This means that z ∈ L1 ∩ L2, i.e. z = δ1·g1 + … + δs·gs for some numbers δ1, …, δs. Then δ1·g1 + … + δs·gs − γ1·h1 − … − γq−s·hq−s = 0 is a linear combination of the basis g1, …, gs, h1, …, hq−s of L2, so all δi and all γj are zero. Consequently all γj = 0 in (**), and (**) becomes a linear combination of the basis g1, …, gs, f1, …, fp−s of L1; hence all αi and βj are zero too, and the linear combination (**) is trivial. Consequently, g1, …, gs, f1, …, fp−s, h1, …, hq−s is a linearly independent system of elements. Then it is a basis in L1 + L2.

Thus, dim(L1 + L2) = s + (p − s) + (q − s) = p + q − s = dim L1 + dim L2 − dim(L1 ∩ L2).
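The dimension formula can be illustrated numerically; a sketch with made-up subspaces of R³ chosen for the example, L1 = span{e1, e2} and L2 = span{e2, e3}:

```python
import numpy as np

# Generators (as columns) of two subspaces of R^3.
B1 = np.array([[1.0, 0.0],
               [0.0, 1.0],
               [0.0, 0.0]])   # L1 = span{e1, e2}
B2 = np.array([[0.0, 0.0],
               [1.0, 0.0],
               [0.0, 1.0]])   # L2 = span{e2, e3}

dim_L1 = np.linalg.matrix_rank(B1)
dim_L2 = np.linalg.matrix_rank(B2)

# L1 + L2 is spanned by all the generators taken together.
dim_sum = np.linalg.matrix_rank(np.hstack([B1, B2]))

# Since the columns of B1 and of B2 are independent, dim(L1 ∩ L2) equals
# the nullity of [B1 | -B2]: solutions of B1·u = B2·v describe the common vectors.
dim_cap = (B1.shape[1] + B2.shape[1]
           - np.linalg.matrix_rank(np.hstack([B1, -B2])))

print(dim_L1, dim_L2, dim_sum, dim_cap)        # 2 2 3 1
print(dim_sum == dim_L1 + dim_L2 - dim_cap)    # True
```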
