Methods for Downstream Continual Learning with Foundation Models

Link:
Author:
Contributing person:
  • Wermter, Stefan
Publisher/Corporate body:
Staats- und Universitätsbibliothek Hamburg Carl von Ossietzky
Year of publication:
2025
Media type:
Text
Keywords:
  • 004: Computer science
  • ddc:004
Description:
  • Continual learning addresses the challenge of training deep neural networks on a sequence of tasks without catastrophic forgetting. This thesis bridges the gap in downstream continual fine-tuning of foundation models by introducing and evaluating four strategies: dual-memory replay, multi-layer prototyping, selective specialization, and noise-augmented feature replay. Each strategy delivers strong performance across diverse unimodal classification and multimodal reasoning problems (an illustrative replay sketch follows this record).
Licenses:
  • http://purl.org/coar/access_right/c_abf2
  • info:eu-repo/semantics/openAccess
  • https://creativecommons.org/licenses/by-sa/4.0/
Source system:
E-Dissertationen der UHH

Internal metadata
Source record
oai:ediss.sub.uni-hamburg.de:ediss/11833
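
Illustrative sketch: the description above names dual-memory replay among the thesis's strategies but does not detail how it works. The Python below is a generic rehearsal-style illustration only, assuming the common setup of a small FIFO buffer for recent examples paired with a reservoir-sampled buffer spanning all tasks seen so far; the names (DualMemoryBuffer, short_capacity, long_capacity) are hypothetical and not taken from the thesis.

import random


class DualMemoryBuffer:
    """Hypothetical dual-memory replay buffer: a FIFO short-term memory of
    recent examples plus a reservoir-sampled long-term memory that holds a
    uniform sample over all examples seen so far, across task boundaries."""

    def __init__(self, short_capacity=256, long_capacity=1024):
        self.short = []          # most recent examples (FIFO)
        self.long = []           # uniform sample over all tasks (reservoir)
        self.short_capacity = short_capacity
        self.long_capacity = long_capacity
        self.seen = 0            # total examples offered to the reservoir

    def add(self, example):
        # Short-term memory: keep only the newest short_capacity examples.
        self.short.append(example)
        if len(self.short) > self.short_capacity:
            self.short.pop(0)
        # Long-term memory: reservoir sampling retains each seen example
        # with equal probability long_capacity / seen.
        self.seen += 1
        if len(self.long) < self.long_capacity:
            self.long.append(example)
        else:
            j = random.randrange(self.seen)
            if j < self.long_capacity:
                self.long[j] = example

    def sample(self, batch_size, short_fraction=0.5):
        # Mix recent and old examples in each replay batch so fine-tuning
        # on the current task also rehearses earlier tasks.
        n_short = min(int(batch_size * short_fraction), len(self.short))
        n_long = min(batch_size - n_short, len(self.long))
        return random.sample(self.short, n_short) + random.sample(self.long, n_long)


# Usage: feed the stream of (input, label) pairs task by task, then draw
# mixed replay batches while fine-tuning on the current task.
buffer = DualMemoryBuffer()
for task_id in range(3):
    for step in range(1000):
        buffer.add((task_id, step))  # stand-in for a real (input, label) pair
replay_batch = buffer.sample(32)

The two memories serve different roles in this sketch: the FIFO tracks the current task's distribution, while the reservoir guarantees that earlier tasks remain represented without storing the full stream.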