Staats- und Universitätsbibliothek Hamburg Carl von Ossietzky
Year of publication:
2025
Media type:
Text
Keywords:
004: Computer science
ddc:004
Description:
Continual learning addresses the challenge of training deep neural networks on a sequence of tasks without catastrophic forgetting. This thesis targets downstream continual fine-tuning of foundation models, introducing and evaluating four strategies: dual-memory replay, multi-layer prototyping, selective specialization, and noise-augmented feature replay. Each strategy delivers strong performance across diverse unimodal classification and multimodal reasoning problems.
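For readers unfamiliar with replay-based continual learning, the following is a minimal, illustrative sketch of noise-augmented feature replay on top of a frozen foundation-model backbone. All class names, shapes, the buffer policy, and the Gaussian noise model are assumptions made for illustration; they do not reproduce the thesis's actual methods.

# Illustrative sketch only: feature replay with additive Gaussian noise.
# Buffer policy, noise model, and all names are assumptions, not the thesis's method.
import torch
import torch.nn as nn
import torch.nn.functional as F


class FeatureReplayBuffer:
    """Stores backbone features and labels from earlier tasks for rehearsal."""

    def __init__(self, capacity=2000):
        self.capacity = capacity
        self.features = []
        self.labels = []

    def add(self, feats, labels):
        # Keep features on CPU; stop filling once capacity is reached.
        for f, y in zip(feats, labels):
            if len(self.features) < self.capacity:
                self.features.append(f.detach().cpu())
                self.labels.append(y.detach().cpu())

    def sample(self, batch_size, noise_std=0.1):
        """Replay stored features, perturbed with additive Gaussian noise."""
        idx = torch.randint(0, len(self.features), (batch_size,))
        feats = torch.stack([self.features[i] for i in idx])
        labels = torch.stack([self.labels[i] for i in idx])
        noisy = feats + noise_std * torch.randn_like(feats)  # noise augmentation
        return noisy, labels


def train_step(head, backbone, buffer, x, y, opt):
    """One continual fine-tuning step: current-task loss plus replayed-feature loss."""
    with torch.no_grad():  # the foundation-model backbone stays frozen
        feats = backbone(x)
    loss = F.cross_entropy(head(feats), y)
    if len(buffer.features) > 0:  # rehearse earlier tasks in feature space
        replay_feats, replay_y = buffer.sample(batch_size=x.size(0))
        loss = loss + F.cross_entropy(head(replay_feats), replay_y)
    opt.zero_grad()
    loss.backward()
    opt.step()
    buffer.add(feats, y)
    return loss.item()


# Toy usage with made-up dimensions:
backbone = nn.Sequential(nn.Flatten(), nn.Linear(784, 128)).eval()
head = nn.Linear(128, 10)
opt = torch.optim.SGD(head.parameters(), lr=0.01)
buffer = FeatureReplayBuffer()
x, y = torch.randn(32, 1, 28, 28), torch.randint(0, 10, (32,))
train_step(head, backbone, buffer, x, y, opt)

Storing features rather than raw inputs keeps the rehearsal buffer small when the backbone is frozen, and the added noise is one simple way to diversify the replayed samples; the thesis's own variants of replay and prototyping may differ substantially from this sketch.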