Slide-Label Aware Multi-Task Pretraining Using Adaptive Gradient Surgery – Marco Acerbis
- Date
- 26 January 2026, 14:15–15:00
- Location
- Theatrum Visuale, room 100155, building 10, Ångström Laboratory
- Type
- Seminar
- Lecturer
- Marco Acerbis
- Organiser
- Centre for Image Analysis
- Contact person
- Natasa Sladoje
In this seminar, I will present our latest work on multi-task pretraining for exploiting weak patient-level labels in computational cytology. This field faces two major challenges: (i) instance-level annotations are unreliable and prohibitively expensive to obtain, and (ii) witness rates—the proportion of truly abnormal instances in positive samples—are extremely low.

To address these challenges, we propose SLAM-AGS, a Slide-Label-Aware Multitask pretraining framework. SLAM-AGS jointly optimizes two complementary objectives: a weakly supervised similarity objective applied to patches from slide-negative samples, and a self-supervised contrastive objective applied to patches from slide-positive samples. This design enables the model to leverage slide-level supervision while remaining robust to sparse and noisy instance evidence. To stabilize multi-task learning, we introduce Adaptive Gradient Surgery, which mitigates conflicting gradients between tasks and prevents training instability and model collapse. The pretrained encoder is then integrated into an attention-based Multiple Instance Learning framework for bag-level prediction and attention-guided retrieval of the most abnormal instances within each bag.

We evaluate our approach on a publicly available bone-marrow cytology dataset under simulated witness rates ranging from 10% down to 0.5%. SLAM-AGS consistently improves bag-level F1-score and Top-400 positive cell retrieval compared to existing pretraining methods, with the largest gains observed at the lowest witness rates. These results demonstrate that resolving gradient interference during pretraining leads to more stable learning and significantly improved downstream performance.
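The abstract describes Adaptive Gradient Surgery only at a high level. As background, a minimal NumPy sketch of the classic gradient-surgery idea (PCGrad-style projection, Yu et al.) is shown below: when the two task gradients conflict (negative inner product), each is projected onto the normal plane of the other before they are summed. The function names and the symmetric-combination scheme here are illustrative assumptions; the adaptive weighting that SLAM-AGS adds is not specified in the abstract and is not modeled here.

```python
import numpy as np

def project_if_conflicting(g_a, g_b):
    # PCGrad-style step (illustrative, not the SLAM-AGS variant):
    # if g_a conflicts with g_b (negative dot product), remove from g_a
    # its component along g_b, making the result orthogonal to g_b.
    dot = np.dot(g_a, g_b)
    if dot < 0:
        g_a = g_a - (dot / np.dot(g_b, g_b)) * g_b
    return g_a

def combine_task_gradients(g_weak, g_ssl):
    # Apply the projection symmetrically to both task gradients
    # (weakly supervised similarity vs. self-supervised contrastive),
    # then sum the adjusted gradients for the parameter update.
    return project_if_conflicting(g_weak, g_ssl) + project_if_conflicting(g_ssl, g_weak)

# Toy example: two conflicting task gradients.
g_weak = np.array([1.0, 0.0])
g_ssl = np.array([-1.0, 1.0])
g_update = combine_task_gradients(g_weak, g_ssl)
```

In this toy case the projected weak-supervision gradient becomes orthogonal to the self-supervised one, so the summed update no longer pulls the encoder in opposing directions, which is the instability the seminar attributes to naive multi-task pretraining.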