- •1. Matrices. Classification of matrices. Operations on matrices: addition of matrices, multiplication of a matrix by a number.
- •2. Matrices. Multiplication of matrices.
- •3. Determinants. Calculating determinants of the second and third order.
- •5. Properties of determinants. Expanding a determinant of the fourth order along a row (or column). Notion of a determinant of the n-th order.
- •7. Inverse matrix. Finding the inverse matrix.
- •Compute the determinant of the matrix A (if it equals zero then there is no inverse matrix).
- •6. Systems of linear equations (Cramer's rule).
- •8. Matrix representation of a system of linear equations. Finding solutions of a system of linear equations by the inverse matrix method.
- •9. Rank of a matrix. Finding the rank of a matrix by two methods.
- •10. Criterion for compatibility of a system of linear equations (Kronecker–Capelli theorem). Example of determining the compatibility of a system by this theorem.
- •13. Vectors (in the geometric space). Changing the coordinates of a vector under replacement of the basis and the origin of coordinates.
- •14. Transitions between orthonormal coordinate systems in the plane. Right (left) oriented pair of vectors in the plane.
- •15. Linear dependence of vectors in the geometric space (plane and line). Theorems on properties of linearly dependent vectors.
- •16. Basis in the geometric space (plane and line). The coordinates of a vector in a basis.
- •17. Cartesian system of coordinates. Radius-vector of a point. Finding the coordinates of a point dividing a segment in some ratio.
- •18. Complex numbers. Operations on complex numbers. Algebraic and trigonometric forms of a complex number.
- •19. Linear space. Theorems on properties of a linear space. Linearly independent vectors in a linear space.
- •20. Dimension and a basis of a linear space. Isomorphism of linear spaces.
- •21. Transformation of coordinates at transition to a new basis in a linear space. Theorems on transition matrix and formulas of transformation of coordinates.
- •22. Subspaces of a linear space. Linear hull of vectors. Intersection, union, sum and direct sum of subspaces.
- •23. Fundamental system of solutions of a homogeneous system of equations. Subspaces formed by solutions of a homogeneous linear system of equations.
- •24. Linear transformations. Examples of linear transformations. Operations on linear transformations.
- •28. The image and kernel of a linear operator.
- •29. Linear mappings. Injective and surjective linear mappings. Matrix of a linear mapping.
- •30. Linear functionals. The components of a linear functional. Dual space of linear functionals.
Linearly independent vectors. Let x, y, z, …, u be vectors of a linear space R.
A vector v = αx + βy + γz + … + λu, where α, β, γ, …, λ are real numbers, also belongs to R. It is called a linear combination of the vectors x, y, z, …, u.
Let a linear combination of the vectors x, y, z, …, u be the zero vector, i.e.
αx + βy + γz + … + λu = 0 (1)
The vectors x, y, z, …, u are called linearly independent if the equality (1) holds only for α = β = γ = … = λ = 0. If (1) can also hold when not all of the numbers α, β, γ, …, λ are equal to zero, then the vectors x, y, z, …, u are called linearly dependent.
Theorem. If every vector of a linear space R can be represented as a linear combination of linearly independent vectors e1, e2, …, en, then d(R) = n (and consequently the vectors e1, e2, …, en form a basis in the space R).
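The definition above can be checked numerically: vectors are linearly independent exactly when the matrix having them as columns has rank equal to the number of vectors. A minimal sketch (not part of the notes; assumes NumPy, and the vectors e1, e2 are illustrative):

```python
import numpy as np

def linearly_independent(*vectors):
    """True iff the given vectors of the same space are linearly independent."""
    m = np.column_stack(vectors)
    # Full column rank <=> only the trivial combination gives the zero vector.
    return np.linalg.matrix_rank(m) == len(vectors)

e1 = np.array([1.0, 0.0, 0.0])
e2 = np.array([0.0, 1.0, 0.0])
print(linearly_independent(e1, e2))           # True
print(linearly_independent(e1, e2, e1 + e2))  # False: e1 + e2 is a combination
```

The rank test is the numeric counterpart of asking whether equality (1) has only the zero solution.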
21. Transformation of coordinates at transition to a new basis in a linear space. Theorems on transition matrix and formulas of transformation of coordinates.
Let there be two bases: e1, e2, …, en (old) and e'1, e'2, …, e'n (new) in an n-dimensional linear space Rn, and let
e'1 = a11e1 + a21e2 + … + an1en
e'2 = a12e1 + a22e2 + … + an2en
…………………………………..
e'n = a1ne1 + a2ne2 + … + annen
The matrix
A = ( a11 a12 … a1n
      a21 a22 … a2n
      …………………
      an1 an2 … ann )
is the transition matrix from the old basis to the new basis (its j-th column consists of the coordinates of e'j in the old basis).
Theorem: Every transition matrix A is regular, i.e. det A ≠ 0.
Proof: We prove the theorem for n = 2.
Let {e1, e2} and {e'1, e'2} be the "old" and "new" bases, and let
A = ( a11 a12
      a21 a22 )
be the transition matrix, i.e.
e'1 = a11e1 + a21e2 (*)
e'2 = a12e1 + a22e2
Assume the contrary: det A = 0, i.e. a11a22 – a12a21 = 0.
At least one of a21, a22 is nonzero (otherwise e'1, e'2 would both be multiples of e1 and hence dependent). Multiply both sides of the 1st equation of (*) by a22, multiply both sides of the 2nd equation of (*) by (–a21), and add the obtained expressions:
a22e'1 – a21e'2 = (a11a22 – a12a21)e1 = 0 ⇒ e'1, e'2 are linearly dependent, contradicting the hypothesis that they form a basis.
Theorem: The coordinates ξ1, …, ξn (in the old basis) and ξ'1, …, ξ'n (in the new basis) of the same vector are connected by
ξ1 = a11ξ'1 + a12ξ'2 + … + a1nξ'n
ξ2 = a21ξ'1 + a22ξ'2 + … + a2nξ'n
………………………………………………..
ξn = an1ξ'1 + an2ξ'2 + … + annξ'n
which are called the formulas of transformation of coordinates.
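A small numeric sketch of the two theorems for n = 2 (the basis vectors and coordinates below are chosen purely for illustration): the columns of A hold the new basis vectors' coordinates in the old basis, and old coordinates are recovered as ξ = Aξ'.

```python
import numpy as np

e1 = np.array([1.0, 0.0])
e2 = np.array([0.0, 1.0])
e1_new = 2 * e1 + 1 * e2        # a11 = 2, a21 = 1
e2_new = 1 * e1 + 3 * e2        # a12 = 1, a22 = 3

A = np.column_stack([e1_new, e2_new])   # transition matrix (old -> new)
assert abs(np.linalg.det(A)) > 1e-12    # regular, as the theorem states

xi_new = np.array([4.0, -1.0])          # coordinates xi'_1, xi'_2 in the new basis
xi_old = A @ xi_new                     # formulas of transformation of coordinates

# The same geometric vector, computed directly in the new basis:
v = xi_new[0] * e1_new + xi_new[1] * e2_new
assert np.allclose(v, xi_old)           # both give (7, 1)
```

Since the old basis here is the standard one, ξ (old coordinates) coincides with the vector itself, which makes the check transparent.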
22. Subspaces of a linear space. Linear hull of vectors. Intersection, union, sum and direct sum of subspaces.
A non-empty set L formed of elements of a linear space R is called a subspace of the linear space R if for all x, y ∈ L and every number λ: x + y ∈ L and λx ∈ L.
The union of L1 and L2 is the set of elements x such that x ∈ L1 or x ∈ L2. The union of L1 and L2 is denoted by L1 ∪ L2.
The intersection of L1 and L2 is the set of all elements simultaneously belonging to L1 and L2. The intersection of L1 and L2 is denoted by L1 ∩ L2.
The sum of L1 and L2 is the set of all elements of kind x + y, where x ∈ L1 and y ∈ L2. The sum of L1 and L2 is denoted by L1 + L2.
The direct sum of L1 and L2 is the set of all elements of kind x + y, where x ∈ L1, y ∈ L2 and L1 ∩ L2 = {0}. The direct sum of L1 and L2 is denoted by L1 ⊕ L2.
Theorem. Both the intersection L1 ∩ L2 and the sum L1 + L2 of subspaces L1 and L2 are subspaces of R.
Theorem. The dimension of the sum of subspaces L1 and L2 is equal to
d(L1 + L2) = d(L1) + d(L2) – d(L1 ∩ L2).
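The dimension formula can be verified on concrete subspaces; the example below (not from the notes) takes L1 = the xy-plane and L2 = the yz-plane in R3, computing each dimension as a matrix rank:

```python
import numpy as np

rank = np.linalg.matrix_rank

# Rows are spanning vectors of each subspace.
B1 = np.array([[1.0, 0.0, 0.0],
               [0.0, 1.0, 0.0]])   # L1 = xy-plane
B2 = np.array([[0.0, 1.0, 0.0],
               [0.0, 0.0, 1.0]])   # L2 = yz-plane

dim_L1 = rank(B1)                   # 2
dim_L2 = rank(B2)                   # 2
dim_sum = rank(np.vstack([B1, B2])) # L1 + L2 is spanned by all four rows -> 3

# Rearranging the theorem: d(L1 ∩ L2) = d(L1) + d(L2) - d(L1 + L2)
dim_intersection = dim_L1 + dim_L2 - dim_sum
print(dim_intersection)  # 1: the intersection is the y-axis
```

Here the formula correctly detects that the two planes share a one-dimensional subspace even though neither contains the other.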
If x, y, z, …, u are vectors of a linear space R, then all vectors αx + βy + γz + … + λu, where α, β, γ, …, λ are all possible real numbers, form a subspace of the space R. The set of all linear combinations of the vectors x, y, z, …, u is called a linear hull of the vectors x, y, z, …, u and denoted by L(x, y, z, …, u).
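Membership in a linear hull can also be tested by ranks: v belongs to L(x, y, …) exactly when appending v to the generators does not increase the rank. A sketch under that observation (vectors chosen for illustration, NumPy assumed):

```python
import numpy as np

def in_hull(v, *generators):
    """True iff v lies in the linear hull L(generators)."""
    g = np.column_stack(generators)
    extended = np.column_stack([g, v])
    return np.linalg.matrix_rank(extended) == np.linalg.matrix_rank(g)

x = np.array([1.0, 2.0, 0.0])
y = np.array([0.0, 1.0, 1.0])

print(in_hull(3 * x - y, x, y))                   # True: a linear combination
print(in_hull(np.array([0.0, 0.0, 1.0]), x, y))   # False: rank grows to 3
```

This is the same rank criterion used for linear independence, applied to the question "does v add a new direction to the subspace?".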
