While graphs are intuitively and naturally represented by vertices and edges, such representations are limited in terms of their analysis, both theoretically and practically (e.g., when implementing graph algorithms). A more powerful approach represents them via appropriate matrices (e.g., adjacency matrices, diffusion kernels, or graph Laplacians) that capture intrinsic relations between vertices over the "geometry" encoded by the graph structure. Spectral graph theory leverages such matrices, and in particular their spectra and eigendecompositions, to study the properties of graphs and their underlying intrinsic structure. This study leads to surprising and elegant results, not only from a mathematical standpoint, but also in practice, with tractable implementations used, e.g., in clustering, visualization, dimensionality reduction, manifold learning, and geometric deep learning. Finally, since nearly any modern dataset can be modelled as a graph, either naturally (e.g., social networks) or via appropriate affinity measures, the notions and tools studied in this course provide a powerful framework for capturing and understanding data geometry in general.
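To make the matrix viewpoint concrete, here is a minimal illustrative sketch (not part of the course materials, and using numpy only for convenience): we build the adjacency matrix of a small cycle graph, form the unnormalized graph Laplacian L = D - A, and inspect its spectrum.

```python
import numpy as np

# Adjacency matrix of the 6-cycle: vertex i is connected to its two neighbours.
n = 6
A = np.zeros((n, n))
for i in range(n):
    A[i, (i + 1) % n] = 1
    A[(i + 1) % n, i] = 1

D = np.diag(A.sum(axis=1))  # degree matrix
L = D - A                   # unnormalized graph Laplacian

# L is symmetric, so eigh returns real eigenvalues in ascending order.
eigvals, eigvecs = np.linalg.eigh(L)

# For a connected graph the smallest eigenvalue is 0 (constant eigenvector);
# the second-smallest, the "algebraic connectivity", measures how well
# connected the graph is -- a quantity studied throughout the course.
print(np.round(eigvals, 4))  # -> [0. 1. 1. 3. 3. 4.]
```

For the cycle graph C_n the eigenvalues have the closed form 2 - 2cos(2πk/n), which matches the printed output for n = 6; this kind of explicit characterization for common graph families is one of the topics listed below.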

This is a graduate-level 4-credit course at UdeM, also available via the ISM. It is suitable for CS, statistics, and applied math students interested in data science and machine learning.

- Foundations & Preliminaries:
- Eigendecomposition, eigenvalues, and eigenvectors
- Relevant results on optimization, test vectors, and eigenvalue interlacing
- Adjacency matrices and Graph Laplacians

- Characterization of common graph types:
- Complete Graphs
- Product Graphs
- Star Graphs
- Cycle Graphs
- Path Graphs & Weighted Path Graphs
- Expander Graphs

- Notable results:
- Perron-Frobenius Theory
- Fiedler's Nodal Domain Theorem
- Cheeger's Inequality

- Properties & Operations on Graphs:
- Graph Connectivity
- Graph Conductance
- Drawing Graphs
- Comparing Graphs
- Graph Partitioning & Clustering
- Graph Sparsification via Random Sampling

- Advanced topics:
- Graph Signal Processing
- Random Walks on Graphs
- Graph Convolutional Networks

- 40% -- homework
- 30% -- literature review (recorded presentation)
- 30% -- project (written report)

- Each student is expected to perform a literature review on a topic of interest relevant to spectral graph theory.
- The literature review should include 3-4 papers on the chosen topic.
- Selected papers are expected to be at least 8-10 pages long, not including references.
- Shorter papers (4-5 pages) may be acceptable, but then you should cover more of them (e.g., 4-5 papers).
- If you cover significantly longer papers (e.g., 20+ pages), you can choose 2-3 papers.

- Selected papers should be published within the past 15 years in a reputable peer-reviewed venue (e.g., well-known conferences or journals).
- You may include at most one unpublished preprint, as long as it is available on arXiv or bioRxiv. The rest of the papers should be published.
- You may include at most one older paper (published more than 10 years ago), as long as you can justify it as a seminal work required to understand the other selected papers.

- The review will consist of a short recorded presentation (10-20 minutes) covering the papers and general conclusions drawn from them.

- An assignment for submitting the literature review presentation will be posted on MS Teams.

- Each student is expected to work on a project relating to the topics of the course.
- Collaboration is allowed, but has to be declared, and each student is expected to submit their own report.
- Formatting and expected page range will be posted on MS Teams.

- An assignment for submitting the project report will be posted on MS Teams.