Part IA — Vectors and Matrices
Based on lectures by N. Peake
These notes are not endorsed by the lecturers, and I have modified them (often
significantly) after lectures. They are nowhere near accurate representations of what
was actually lectured, and in particular, all errors are almost surely mine.
Complex numbers
Review of complex numbers, including complex conjugate, inverse, modulus, argument
and Argand diagram. Informal treatment of complex logarithm, n-th roots and complex
powers. de Moivre’s theorem. [2]
Vectors
Review of elementary algebra of vectors in R3 , including scalar product. Brief discussion
of vectors in Rn and Cn ; scalar product and the Cauchy-Schwarz inequality. Concepts
of linear span, linear independence, subspaces, basis and dimension.
Suffix notation: including summation convention, δij and εijk . Vector product and
triple product: definition and geometrical interpretation. Solution of linear vector
equations. Applications of vectors to geometry, including equations of lines, planes and
spheres. [5]
Matrices
Elementary algebra of 3 × 3 matrices, including determinants. Extension to n × n
complex matrices. Trace, determinant, non-singular matrices and inverses. Matrices as
linear transformations; examples of geometrical actions including rotations, reflections,
dilations, shears; kernel and image. [4]
Simultaneous linear equations: matrix formulation; existence and uniqueness of solutions,
geometric interpretation; Gaussian elimination. [3]
Symmetric, anti-symmetric, orthogonal, hermitian and unitary matrices. Decomposition
of a general matrix into isotropic, symmetric trace-free and antisymmetric parts. [1]
Eigenvalues and Eigenvectors
Eigenvalues and eigenvectors; geometric significance. [2]
Proof that eigenvalues of a hermitian matrix are real, and that distinct eigenvalues give
an orthogonal basis of eigenvectors. The effect of a general change of basis (similarity
transformations). Diagonalization of general matrices: sufficient conditions; examples
of matrices that cannot be diagonalized. Canonical forms for 2 × 2 matrices. [5]
Discussion of quadratic forms, including change of basis. Classification of conics,
cartesian and polar forms. [1]
Rotation matrices and Lorentz transformations as transformation groups. [1]
Contents

0 Introduction

1 Complex numbers
  1.1 Basic properties
  1.2 Complex exponential function
  1.3 Roots of unity
  1.4 Complex logarithm and power
  1.5 De Moivre's theorem
  1.6 Lines and circles in C

2 Vectors
  2.1 Definition and basic properties
  2.2 Scalar product
    2.2.1 Geometric picture (R2 and R3 only)
    2.2.2 General algebraic definition
  2.3 Cauchy-Schwarz inequality
  2.4 Vector product
  2.5 Scalar triple product
  2.6 Spanning sets and bases
    2.6.1 2D space
    2.6.2 3D space
    2.6.3 Rn space
    2.6.4 Cn space
  2.7 Vector subspaces
  2.8 Suffix notation
  2.9 Geometry
    2.9.1 Lines
    2.9.2 Plane
  2.10 Vector equations

3 Linear maps
  3.1 Examples
    3.1.1 Rotation in R3
    3.1.2 Reflection in R3
  3.2 Linear Maps
  3.3 Rank and nullity
  3.4 Matrices
    3.4.1 Examples
    3.4.2 Matrix Algebra
    3.4.3 Decomposition of an n × n matrix
    3.4.4 Matrix inverse
  3.5 Determinants
    3.5.1 Permutations
    3.5.2 Properties of determinants
    3.5.3 Minors and Cofactors

4 Matrices and linear equations
  4.1 Simple example, 2 × 2
  4.2 Inverse of an n × n matrix
  4.3 Homogeneous and inhomogeneous equations
    4.3.1 Gaussian elimination
  4.4 Matrix rank
  4.5 Homogeneous problem Ax = 0
    4.5.1 Geometrical interpretation
    4.5.2 Linear mapping view of Ax = 0
  4.6 General solution of Ax = d

5 Eigenvalues and eigenvectors
  5.1 Preliminaries and definitions
  5.2 Linearly independent eigenvectors
  5.3 Transformation matrices
    5.3.1 Transformation law for vectors
    5.3.2 Transformation law for matrix
  5.4 Similar matrices
  5.5 Diagonalizable matrices
  5.6 Canonical (Jordan normal) form
  5.7 Cayley-Hamilton Theorem
  5.8 Eigenvalues and eigenvectors of a Hermitian matrix
    5.8.1 Eigenvalues and eigenvectors
    5.8.2 Gram-Schmidt orthogonalization (non-examinable)
    5.8.3 Unitary transformation
    5.8.4 Diagonalization of n × n Hermitian matrices
    5.8.5 Normal matrices

6 Quadratic forms and conics
  6.1 Quadrics and conics
    6.1.1 Quadrics
    6.1.2 Conic sections (n = 2)
  6.2 Focus-directrix property

7 Transformation groups
  7.1 Groups of orthogonal matrices
  7.2 Length preserving matrices
  7.3 Lorentz transformations
0 Introduction
Vectors and matrices are the language in which a lot of mathematics is written.
In physics, many quantities such as position and momentum are expressed as
vectors. Heisenberg also formulated quantum mechanics in terms of vectors and
matrices. In statistics, one might pack all the results of all experiments into a
single vector, and work with a large vector instead of many small quantities. In
group theory, matrices are used to represent the symmetries of space (as well as
many other groups).
So what is a vector? Vectors are very general objects, and can in theory
represent very complicated things. However, in this course, our focus is on vectors
in Rn or Cn. We can think of each of these as an array of n real or complex
numbers. For example, (1, 6, 4) is a vector in R3 . These vectors are added in the
obvious way. For example, (1, 6, 4) + (3, 5, 2) = (4, 11, 6). We can also multiply
vectors by numbers, say 2(1, 6, 4) = (2, 12, 8). Often, these vectors represent
points in an n-dimensional space.
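Spelled out, the "obvious way" means componentwise operations: for vectors in
Rn (or Cn) and a scalar λ,
\[
(x_1, \dots, x_n) + (y_1, \dots, y_n) = (x_1 + y_1, \dots, x_n + y_n), \qquad
\lambda(x_1, \dots, x_n) = (\lambda x_1, \dots, \lambda x_n).
\]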
Matrices, on the other hand, represent functions between vectors, i.e. functions
that take in a vector and output another vector. These are not arbitrary functions,
however: matrices represent linear functions. These are functions that satisfy
f(λx + µy) = λf(x) + µf(y) for arbitrary numbers λ, µ and vectors x, y. It is
important to note that the function x ↦ x + c for some non-zero constant vector c
is not linear according to this definition, even though it might look linear.
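To see why, here is a one-line check using the definition above with λ = 2 and
µ = 0: writing f(x) = x + c,
\[
f(2x) = 2x + c \neq 2x + 2c = 2f(x)
\]
whenever c ≠ 0, so f fails the defining property of linearity. Such maps are
translations, not linear maps.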
It turns out that each linear function from Rn to Rm can be represented uniquely
by an m × n array of numbers, which is what we call the matrix of the function.
Expressing a linear function as a matrix allows us to conveniently study many of
its properties, which is why we usually talk about matrices instead of the
function itself.
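As a small illustration (anticipating the chapter on linear maps), the columns of
this array are the images of the standard basis vectors. For instance, rotation of
the plane by an angle θ sends (1, 0) to (cos θ, sin θ) and (0, 1) to (−sin θ, cos θ),
so it is represented by
\[
R(\theta) = \begin{pmatrix} \cos\theta & -\sin\theta \\ \sin\theta & \cos\theta \end{pmatrix},
\qquad
R(\theta)\begin{pmatrix} x \\ y \end{pmatrix}
= \begin{pmatrix} x\cos\theta - y\sin\theta \\ x\sin\theta + y\cos\theta \end{pmatrix}.
\]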