AI4Physics Learning Workshop

Date
16–17 April 2026
Type
Workshop
Organiser
AI4Physics
Contact person
Magdalena Larfors

In this workshop, you will get an introduction to Graph Neural Networks and Geometric Deep Learning, AI methods that have proven useful in many areas of physics and related sciences.

We will start with introductory material, only requiring some general familiarity with AI/ML (for example obtained in last year's AI4Physics workshop). Lectures will be mixed with practical tutorials, and there will be a poster session for internal communication about AI-related research at our department.

Schedule

Thursday

Colloquium lunch (separate event, ends 13.00)

13.15 Welcome (room 80121)
Paul Häusner - Lecture: Introduction to graph neural networks
Daniel Murnane - Lecture 1: Geometric Deep Learning for Science: Methods

15.30 Coffee

16.00 Paul Häusner - Tutorial (room 80121): Molecular energy prediction using invariant neural networks

17.30 Poster session and reception (House 9, floor 2, TP kitchen area)

Friday

9.00 Daniel Murnane - Lecture 2 (Heinz-Otto Kreiss): Geometric Deep Learning for Science: Methods

10.00 Coffee

10.30 Daniel Murnane - Tutorial (Evelyn Sokolowsky A): Geometric Deep Learning: Hands-on

12.00 Lunch (Evelyn Sokolowsky A)

13.00 Daniel Persson - Lecture (Evelyn Sokolowsky B): Geometric Deep Learning: From equivariance to weather predictions, and beyond
Maria Bånkestad - Lecture: Graph neural networks for scientific machine learning
Conclusion and info about AI4TekNat

15.30 Coffee

Titles and abstracts for talks:

Maria Bånkestad (RISE)
Lecture: Graph neural networks for scientific machine learning

Abstract: Graph neural networks (GNNs) have become a key tool for modeling structured scientific data, from particle interactions and molecular systems to lattice models and physical simulations. In this talk, I will build up the core ideas behind GNNs from scratch: what graphs are, how message passing works, and how to go from raw scientific data to predictions. I will show how physical knowledge, such as symmetries, can be encoded directly into model architectures to improve accuracy and data efficiency, with examples from my own research and from other areas of physics. The talk is aimed at researchers interested in applying these methods in their own work.
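The message-passing idea mentioned in the abstract can be sketched in a few lines of plain numpy. The graph, features, and update rule below are purely illustrative (not taken from the talk): each node sums its neighbours' features and mixes the result with its own.

```python
import numpy as np

# Toy graph: 4 nodes on a ring, edges as (source, target) pairs.
edges = [(0, 1), (1, 2), (2, 3), (3, 0)]
# Duplicate each edge in reverse so messages flow both ways.
edges = edges + [(j, i) for (i, j) in edges]

# One scalar feature per node.
h = np.array([1.0, 2.0, 3.0, 4.0])

def message_passing_step(h, edges):
    """One round of sum-aggregation message passing: every node
    collects the sum of its neighbours' features, then combines
    it with its own feature via a simple fixed update rule."""
    agg = np.zeros_like(h)
    for src, dst in edges:
        agg[dst] += h[src]          # message sent from src to dst
    return 0.5 * h + 0.5 * agg      # combine own state and aggregate

h1 = message_passing_step(h, edges)
print(h1)  # each node's new feature mixes in its ring neighbours
```

In a trained GNN the fixed averaging above is replaced by learned message and update functions (small neural networks), and several such rounds are stacked so information propagates beyond immediate neighbours.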

Paul Häusner (UU/IT)
Lecture: Introduction to graph neural networks

Abstract: In this lecture I will introduce the basics of message-passing neural networks that allow efficient learning on graph-structured data. Starting from classical graph theory, I will motivate why standard deep learning architectures fall short on such data, and show how message-passing networks address this by aggregating local neighborhood information. In the tutorial we apply this framework to the problem of energy prediction of molecules, with a particular focus on feature representations that encode physical inductive biases into the learning problem.

This lecture is accompanied by a tutorial, "Molecular energy prediction using invariant neural networks".
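The physical inductive bias behind invariant networks can be illustrated with a minimal example (assumed for illustration, not the tutorial's actual code): a molecule's energy should not change if the molecule is rotated or translated, so pairwise interatomic distances, which are unchanged under such transformations, make natural input features.

```python
import numpy as np

def pairwise_distances(coords):
    """Invariant featurisation: the matrix of pairwise atomic
    distances is unchanged by any rotation or translation
    applied to the whole molecule."""
    diff = coords[:, None, :] - coords[None, :, :]
    return np.linalg.norm(diff, axis=-1)

# Three atoms at illustrative positions (not real molecular data).
coords = np.array([[0.0, 0.0, 0.0],
                   [1.0, 0.0, 0.0],
                   [0.0, 1.0, 0.0]])
d = pairwise_distances(coords)

# Rotate 90 degrees about the z-axis, then translate.
R = np.array([[0.0, -1.0, 0.0],
              [1.0,  0.0, 0.0],
              [0.0,  0.0, 1.0]])
coords_moved = coords @ R.T + np.array([5.0, -2.0, 1.0])
d_moved = pairwise_distances(coords_moved)

print(np.allclose(d, d_moved))  # distance features are identical
```

A model that sees only such invariant features cannot accidentally learn orientation-dependent artefacts, which typically improves both accuracy and data efficiency.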

Daniel Murnane (NBI)
Lecture: Geometric Deep Learning for Science: Methods and Hands-on

Daniel Persson (Chalmers)
Lecture: Geometric Deep Learning: From equivariance to weather predictions, and beyond

Link to registration: https://doit.medfarm.uu.se/bin/kurt3/kurt/8903148
