Seminar (Master): Transformers for Graphs

Machine learning on graphs has seen a surge in interest due to the wide availability of graph data across a broad spectrum of disciplines, from the life sciences to the social and engineering sciences. Recently, transformer architectures for graphs have emerged as an alternative to established techniques for machine learning with graphs, such as graph neural networks. So far, they have shown promising empirical results, e.g., on molecular prediction datasets. This seminar will discuss recent progress in graph transformers, focusing on their theoretical understanding.

Requirements for Passing

To pass the seminar, you need to fulfill the following:
  1. Give a 30-minute talk about your assigned paper.
  2. Write a detailed 12- to 15-page report (excluding the title page) about your assigned paper.
  3. Peer-review your fellow students' reports.
  4. Attend all meetings and actively participate; see below for dates.

Talks

At the end of the semester, each student will give a 30-minute talk about their assigned paper. You should provide an overview of your chosen/assigned paper and highlight the most important concepts and ideas. Ideally, your presentation should give the audience (i.e., your fellow students) a good understanding of your assigned paper.

Reports

The report gives a detailed overview of the chosen/assigned paper. The required length is 12 to 15 pages, using the provided LaTeX template. After your paper is assigned, you write your report and submit it for peer review by your fellow students. You will receive constructive feedback to improve the report; afterward, you will receive additional feedback from the seminar organizers. You can then submit an updated, final version, which will be graded. Note that this also means you will have to write short reviews of your fellow students' reports.

Organization

  1. More details will be given during the mandatory kick-off meeting.
  2. Papers will be assigned after the kick-off meeting.
  3. The talks will be presented in a day-long block seminar.
  4. All meetings (kick-off, peer-review, and final talks) will take place in Room 228, Theaterstraße 35 - 39.

Dates

Date                Event
11.10.2024, 12:00   Kick-off meeting (in person).
04.11.2024, 24:00 Submission of report drafts.
02.12.2024, 24:00 Submission of reports for peer review.
11.12.2024, 24:00 Submission of peer reviews.
13.12.2024, 12:00 Discussion of peer reviews (in person).
09.01.2025, 24:00 Submission of reports.
19.01.2025, 24:00 Feedback by the organizers.
31.01.2025, 24:00 Submission of presentation slides.
04.02.2025, 12:00 Peer review of presentation slides (in person).
21.02.2025, 10:00 Talks (in person).

Papers

The papers can be chosen from the following list.
  1. On the Theoretical Expressive Power and the Design Space of Higher-Order Graph Transformers
  2. Comparing Graph Transformers via Positional Encodings
  3. Distinguished In Uniform: Self Attention Vs. Virtual Nodes
  4. On the Connection Between MPNN and Graph Transformer
  5. On the Stability of Expressive Positional Encodings for Graphs
  6. On the Expressive Power of Spectral Invariant Graph Neural Networks