Graph Neural Networks
This is a short course in the Mathematics for Data Science study plan.
Bio
Veronica Lachi is a researcher in the Mobile and Social Computing Lab at the Bruno Kessler Foundation in Trento. Her research focuses on the theoretical properties of Graph Neural Networks, particularly their expressiveness, hierarchical pooling, and temporal GNNs. She obtained a PhD in Artificial Intelligence from the University of Siena under the supervision of Prof. Monica Bianchini. During her PhD, she was a visiting researcher at UiT The Arctic University of Norway in Tromsø and in the GAIN Group at the University of Kassel.
Course Description
Many real-world systems, from social networks and recommendation platforms to biological pathways and protein structures, can be naturally modeled as graphs, where nodes represent entities and edges represent their relationships. Traditional machine learning methods often struggle with such structured data; Graph Neural Networks (GNNs), by contrast, provide a powerful framework to learn directly from graph data. They compute representations of nodes, links, and entire graphs in a continuous space, typically \(\mathbb{R}^n\), which can then be used for downstream predictive tasks such as node classification or link prediction. The course covers the basics of GNNs: it starts from the message passing paradigm, the core mechanism by which GNNs propagate and aggregate information across nodes; moves on to their theoretical properties, such as expressive power and its connection to the graph isomorphism problem; and concludes with key applications across science and industry.
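As a concrete illustration of the message passing idea mentioned above (not part of the official course material), the sketch below implements a single mean-aggregation message passing layer in NumPy; the function name, weight matrices, and toy triangle graph are illustrative assumptions, not the specific formulation used in the course.

```python
import numpy as np

def message_passing_layer(A, X, W_self, W_neigh):
    """One mean-aggregation message passing step.

    A       : (n, n) adjacency matrix of the graph
    X       : (n, d) node feature matrix
    W_self  : (d, k) weights applied to each node's own features
    W_neigh : (d, k) weights applied to the aggregated neighbour features
    Returns the updated (n, k) node representations.
    """
    deg = A.sum(axis=1, keepdims=True)      # node degrees
    deg[deg == 0] = 1.0                     # guard isolated nodes against division by zero
    neigh_mean = (A @ X) / deg              # aggregate: mean of the neighbours' features
    H = X @ W_self + neigh_mean @ W_neigh   # update: combine self and neighbour messages
    return np.maximum(H, 0.0)               # ReLU non-linearity

# Toy example: a triangle graph with 2-dimensional node features.
rng = np.random.default_rng(0)
A = np.array([[0, 1, 1],
              [1, 0, 1],
              [1, 1, 0]], dtype=float)
X = rng.normal(size=(3, 2))
W_self = rng.normal(size=(2, 4))
W_neigh = rng.normal(size=(2, 4))
H = message_passing_layer(A, X, W_self, W_neigh)
print(H.shape)  # (3, 4): a new 4-dimensional representation for each node
```

Stacking several such layers lets information flow across multi-hop neighbourhoods, which is the basic building block behind the GNN architectures discussed in the course.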
Detailed content
Supervised and unsupervised learning tasks on graphs, message passing, types of message passing layers, pooling layers, spectral graph neural networks, hyperbolic graph neural networks, expressive power of graph neural networks, homophily and heterophily, oversmoothing, oversquashing, temporal graph neural networks.
References
- Hamilton, William L. Graph representation learning. Morgan & Claypool Publishers, 2020.
- Bacciu, Davide, et al. “A gentle introduction to deep learning for graphs.” Neural Networks 129 (2020): 203-221.
- Corso, Gabriele, et al. “Graph neural networks.” Nature Reviews Methods Primers 4.1 (2024): 17.
- Xu, Keyulu, et al. “How Powerful are Graph Neural Networks?” International Conference on Learning Representations (2019).
- Arnaiz-Rodriguez, Adrian, and Federico Errica. “Oversmoothing, ‘Oversquashing’, Heterophily, Long-Range, and more: Demystifying Common Beliefs in Graph Machine Learning.” arXiv preprint arXiv:2505.15547 (2025).
Schedule
- Thursday, 13 November 2025, 12:00-13:30, room A213 @ Povo1
- Friday, 14 November 2025, 12:30-14:30, room A212 @ Povo1
- Thursday, 20 November 2025, 12:00-13:30, room A213 @ Povo1
- Friday, 21 November 2025, 12:30-14:30, room A212 @ Povo1
Details
- Language: English
- For further information, please contact Prof. Claudio Agostinelli