Spectral Graph Theory
Gabriel Coutinho
January 25, 2022
These are the course notes of an (under)grad course offered at UFMG in 2021.2.
Contents

1 Symmetric matrices and adjacency of a graph
  1.1 Symmetric matrices
  1.2 The adjacency matrix of a graph
  1.3 Perron-Frobenius (a special case)
  1.4 Eigenvalues of some classes of graphs
  1.5 Bounds to the largest and smallest eigenvalues
  1.6 A result for regular graphs
  1.7 Graphs with largest eigenvalue at most 2
  1.8 References

2 Graph isomorphism
  2.1 The problem statement and a naive approach
  2.2 A canonical ordering of the vertices
  2.3 How to find the canonical star basis?
  2.4 Star bases and larger multiplicities
  2.5 How to order star bases
  2.6 Star partitions
  2.7 Combinatorial connections to star partitions
  2.8 References

3 Graph polynomials
  3.1 Reconstruction — an interlude
  3.2 Walks
  3.3 Spectral decomposition
  3.4 Reconstructing
  3.5 The matching polynomial of a graph
  3.6 Real roots
  3.7 Number of matchings
  3.8 Average
  3.9 References

4 Covers and Interlacing families
  4.1 Sensitivity
  4.2 Cauchy Interlacing — a first contact
  4.3 The Sensitivity Theorem
  4.4 Covers
  4.5 Expander graphs
  4.6 Ramanujan graphs
  4.7 Interlacing families
  4.8 Real roots

5 Eigenvalues and the structure of graphs
  5.1 Rayleigh quotients and Interlacing
  5.2 Partitions - cliques, cocliques, colourings
  5.3 Other eigenvalues
  5.4 References

6 Optimization for cliques and colourings
  6.1 Geometry of inequalities
  6.2 Linear programs
  6.3 Fractional chromatic number
  6.4 Positive semidefinite matrices
  6.5 Kronecker and Schur
  6.6 Cones
  6.7 Duality
  6.8 Three SDP formulations

7 Theta
  7.1 Theta and its dual
  7.2 Largest eigenvalue
  7.3 Conic optimization for cocliques and colourings
  7.4 Conic program for the independence number
  7.5 Conic programming for the fractional chromatic number
  7.6 Vector colourings and theta variants

8 The Laplacian matrix
  8.1 Basics
  8.2 Trees
  8.3 Representation, springs and energy
  8.4 Electrical currents
  8.5 Connectivity and interlacing
  8.6 Partitioning and cuts
  8.7 Normalized Laplacian
  8.8 Random Walks
  8.9 References
1 Symmetric matrices and adjacency of a graph
In this section, we shall introduce the basic theory of symmetric matrices, including a result
generally overlooked in a first or second linear algebra course. We shall define the adjacency
matrix of a graph, and then make connections between the algebraic properties of this matrix
and the combinatorial properties of the graph.
1.1 Symmetric matrices
We shall work over the vector space $\mathbb{R}^n$. If $u, v \in \mathbb{R}^n$, then $\langle v, u \rangle = v^T u$ is an inner product (meaning, it is a positive-definite commutative bilinear form). A linear operator $M : \mathbb{R}^n \to \mathbb{R}^n$ is self-adjoint if $\langle Mv, u \rangle = \langle v, Mu \rangle$ for all $u$ and $v$, and, because $M$ can (and will) be seen as a square matrix, it follows that $M$ is a self-adjoint operator if and only if $M = M^T$, that is, $M$ is a symmetric matrix. Symmetric matrices enjoy two key properties: they are diagonalizable by orthogonal eigenvectors, and all of their eigenvalues are real. We start by proving both properties.
Lemma 1.1. The eigenvalues of a real symmetric matrix are real numbers.
Proof. Let $Mu = \lambda u$, with $u \neq 0$. Some of these things could be complex numbers, so we can take the conjugate on both sides, recovering
$$M\overline{u} = \overline{\lambda}\,\overline{u}.$$
Thus $\overline{u}$ is an eigenvector with eigenvalue $\overline{\lambda}$. Thus
$$\overline{\lambda}\,\overline{u}^{\,T} u = (M\overline{u})^{T} u = \overline{u}^{\,T}(Mu) = \lambda\,\overline{u}^{\,T} u.$$
Because $\overline{u}^{\,T} u \neq 0$ if $u \neq 0$, then $\lambda = \overline{\lambda}$.
From now on, simply assume that whenever we are dealing with a symmetric matrix, its eigenvalues are real, and any eigenvector can be assumed to be real.
Lemma 1.2. Let $M$ be a real symmetric matrix, and assume $u$ and $v$ are eigenvectors associated to different eigenvalues. Then $v^T u = 0$, that is, they are orthogonal.
Proof. Say $Mu = \lambda u$ and $Mv = \mu v$, with $\lambda \neq \mu$. It follows that
$$\lambda(v^T u) = v^T M u = (v^T M u)^T = u^T M^T v = u^T M v = \mu(u^T v) = \mu(v^T u).$$
As $\lambda \neq \mu$, it must be that $v^T u = 0$.
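As a quick numerical illustration (an addition to these notes, not the author's), the orthogonality in Lemma 1.2 can be checked with numpy's symmetric eigensolver:

```python
import numpy as np

# A symmetric tridiagonal matrix; its eigenvalues are pairwise distinct.
M = np.array([[2.0, 1.0, 0.0],
              [1.0, 3.0, 1.0],
              [0.0, 1.0, 4.0]])

# eigh is the symmetric eigensolver: eigenvalues come back in ascending
# order, eigenvectors as the orthonormal columns of V.
eigenvalues, V = np.linalg.eigh(M)

# Lemma 1.2: eigenvectors for distinct eigenvalues are orthogonal,
# so the Gram matrix V^T V is the identity.
assert np.allclose(V.T @ V, np.eye(3))
```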
The lemma above already implies that if $M$ is diagonalizable, then it is diagonalizable with orthogonal eigenvectors — as, in fact, eigenvectors corresponding to distinct eigenvalues are orthogonal, and inside each eigenspace we can always find an orthogonal basis. We move forward.
A subspace $U$ of $\mathbb{R}^n$ is said to be $M$-invariant if, for all $u \in U$, $Mu \in U$. This is a key concept in linear algebra, and several results are proven by noting that certain subspaces are invariant for certain operators.
Lemma 1.3. Let $M$ be a real symmetric matrix. If $U$ is $M$-invariant, then $U^{\perp}$ is also $M$-invariant.
Proof. Note that $v \in U^{\perp}$, by definition, if $v^T u = 0$ for all $u \in U$. For all $u \in U$ and $v \in U^{\perp}$, note that
$$(Mv)^T u = v^T M u = v^T (Mu) = 0,$$
because $u \in U$, $U$ is $M$-invariant, and so $Mu \in U$, and $v \in U^{\perp}$. Thus $Mv \in U^{\perp}$, as we wanted.
Let $\lambda$ be such that $\det(\lambda I - M) = 0$. Then $\lambda I - M$ is singular, and therefore contains at least one non-zero vector in its kernel. This says that every square matrix $M$ has at least one eigenvector for each root of $\phi_M(x) = \det(xI - M)$. As $M$ is symmetric, we now know that all possible roots of $\phi_M$ are real.
Lemma 1.4. Let $U$ be an $M$-invariant subspace with dimension $\geq 1$. Then there is an eigenvector of $M$ in $U$.
Proof. Let $P$ be a matrix whose columns form an orthonormal basis for $U$. As $U$ is $M$-invariant, it follows that there is a matrix $N$ so that
$$MP = PN.$$
(Stop now and think carefully why this equality is true.) In particular, $N = P^T M P$, so $N$ is symmetric. Let $u$ be an eigenvector of $N$ with eigenvalue $\lambda$. Then
$$MPu = PNu = \lambda Pu,$$
and, moreover, $Pu \neq 0$, as the columns of $P$ are linearly independent. Thus $Pu$ is an eigenvector for $M$ in $U$.
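The equality $MP = PN$ in the proof can be observed concretely. The sketch below (an illustrative addition, assuming numpy) builds an $M$-invariant subspace spanned by two eigenvectors and lifts an eigenvector of $N$ back to one of $M$:

```python
import numpy as np

M = np.array([[2.0, 1.0, 0.0],
              [1.0, 3.0, 1.0],
              [0.0, 1.0, 4.0]])

# The span of any set of eigenvectors of M is M-invariant; take two.
eigenvalues, V = np.linalg.eigh(M)
P = V[:, :2]              # orthonormal basis of an M-invariant subspace U

# Since U is M-invariant, MP = PN with N = P^T M P, and N is symmetric.
N = P.T @ M @ P
assert np.allclose(M @ P, P @ N)
assert np.allclose(N, N.T)

# An eigenvector u of N lifts to an eigenvector Pu of M inside U.
mu, u = np.linalg.eigh(N)
lifted = P @ u[:, 0]
assert np.allclose(M @ lifted, mu[0] * lifted)
```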
These four lemmas above are all you need to prove the following result by induction as
an exercise.
Theorem 1.5. Let $M$ be a real symmetric matrix. Then $M$ is diagonalizable by a set of orthogonal eigenvectors, all of them corresponding to real eigenvalues.
Exercise 1.6. Write the proof of this theorem as an exercise.
Corollary 1.7. Let $v_1, \ldots, v_n$ be an orthonormal basis of eigenvectors for $M$, each corresponding to an eigenvalue $\lambda_1, \ldots, \lambda_n$ (these are not necessarily distinct). Let $P$ be the matrix whose $i$th column is $v_i$, and $\Lambda$ the diagonal matrix whose $i$th diagonal element is $\lambda_i$. Then
$$P^T M P = \Lambda,$$
and
$$M = \lambda_1 (v_1 v_1^T) + \cdots + \lambda_n (v_n v_n^T).$$
Proof. A linear operator is defined and determined by its action on a basis. The first equality follows from the fact that both sides act equally on the canonical basis of $\mathbb{R}^n$. The second follows from
$$M = P \Lambda P^T,$$
and, by the definition of matrix product, $M = v_1(\lambda_1 v_1^T) + \cdots + v_n(\lambda_n v_n^T)$.
You should recall right now that, because $v_i$ is normalized, $P_i = v_i v_i^T$ is the matrix that represents the orthogonal projection onto the line spanned by $v_i$; that is, $P_i$ is a projection as $P_i^2 = P_i$, and it is an orthogonal projection as $P_i$ is symmetric. Note that $P_i P_j = 0$ whenever $i \neq j$, and so any sum of the $P_i$s for distinct indices will correspond to the orthogonal projection onto the space spanned by the $v_i$s of the same indices. In particular
$$\sum_{i=1}^{n} P_i = I.$$
Exercise 1.8. Assume the $P_i$s are orthogonal projections. Show that $P_1 + P_2$ is an orthogonal projection if and only if $P_1 P_2 = 0$.
Show now that $P_1 + \cdots + P_k$ is an orthogonal projection if and only if $P_i P_j = 0$ for $i \neq j$.
Say $M$ is an $n \times n$ symmetric matrix with distinct eigenvalues $\theta_0, \ldots, \theta_d$. When we write the second equation from the statement of Corollary 1.7, we can collect the terms corresponding to equal eigenvalues, and have
$$M = \sum_{r=0}^{d} \theta_r E_r, \qquad (1)$$
where, according to the discussion above, each $E_r$ corresponds to the orthogonal projection onto the $\theta_r$-eigenspace. Equation (1) is usually referred to as the spectral decomposition of the matrix $M$.
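To make the grouping of equal eigenvalues concrete, the following sketch (an addition to the notes) computes the spectral decomposition of the adjacency matrix of the triangle $K_3$, whose distinct eigenvalues are $2$ and $-1$:

```python
import numpy as np

# Adjacency matrix of K_3: eigenvalue 2 once and -1 twice, so d = 1
# and the spectral decomposition has two eigenprojections E_0, E_1.
M = np.array([[0.0, 1.0, 1.0],
              [1.0, 0.0, 1.0],
              [1.0, 1.0, 0.0]])
eigenvalues, V = np.linalg.eigh(M)

# Group eigenvectors by (numerically) equal eigenvalues and sum the
# rank-one projections in each group to obtain E_r.
thetas = [-1.0, 2.0]
E = [sum(np.outer(v, v) for lam, v in zip(eigenvalues, V.T)
         if np.isclose(lam, theta))
     for theta in thetas]

# Equation (1): M = sum_r theta_r E_r, and the E_r resolve the identity.
assert np.allclose(M, sum(t * Er for t, Er in zip(thetas, E)))
assert np.allclose(sum(E), np.eye(3))
```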
Exercise 1.9. Find the spectral decomposition of
$$M = \begin{pmatrix} 1+\sqrt{2} & 0 & 1-\sqrt{2} & 0 \\ 0 & 1+\sqrt{2} & 0 & 1-\sqrt{2} \\ 1-\sqrt{2} & 0 & 1+\sqrt{2} & 0 \\ 0 & 1-\sqrt{2} & 0 & 1+\sqrt{2} \end{pmatrix}.$$
Hint: do not try to compute the characteristic polynomial. It is easier to simply try to look and guess which are the eigenvectors and eigenvalues.
Note that the $E_r$ are symmetric matrices satisfying $E_r E_s = \delta_{rs} E_r$, and $\sum_{r=0}^{d} E_r = I$.
Exercise 1.10. Prove (or at least convince yourself) that for any polynomial $p(x)$, it follows that
$$p(M) = \sum_{r=0}^{d} p(\theta_r) E_r.$$
Exercise 1.11. Let $M$ be a symmetric matrix, with spectral decomposition as in (1). (A) What is the minimal polynomial of $M$? (B) Prove that for each $E_r$, there is a polynomial $p_r$ of degree $d$ so that $p_r(M) = E_r$. Describe this polynomial as explicitly as you can.