Ruprecht-Karls-Universität Heidelberg
Institut für Computerlinguistik


Diachronic Language Models

Course Description

Degree program       Module code           Credits
BA-2010              AS-CL, AS-FL          8 LP
BA-2010[100%|75%]    CS-CL                 6 LP
BA-2010[50%]         BS-CL                 6 LP
BA-2010[25%]         BS-AC, BS-FL          4 LP
Master               SS-CL-TAC, SS-SC-FAL  8 LP
Lecturers            Wei Zhao, Yi Fan
Course type          Proseminar/Hauptseminar
Language             English
First session        15.10.2024
Time and location    Tuesdays, 15:15-16:45, INF 329 / SR 26
Commitment deadline  tbd.

Prerequisites

  • Introduction to Computational Linguistics or similar introductory courses
  • Introduction to Neural Networks and Sequence-to-Sequence Learning (or equivalent)
  • Completion of Programming I

Assessment

  • Active Participation
  • Presentation
  • Term paper

Content

Despite huge progress in Large Language Models (LLMs), which have revolutionized the way we consume text-based information through conversational interfaces, their ability to understand knowledge and language at different points in time, from the past to the present, remains unclear. This capability may seem niche at first glance, but it has the potential to profoundly impact people's lives. For instance, approximately 30% of search engine queries worldwide (over 2 billion Google queries every day) are time-sensitive, making it crucial that LLM responses be temporally relevant. This capability is also essential for automatically inducing previously non-existent low-resource and bilingual dictionaries that reflect language change over time; temporally grounded LLMs would provide a means of dictionary induction to support the EU's multilingualism policy and globalization efforts beyond the EU. In this course, we will explore the following topics:

  • Evaluating the temporal grounding of LLMs
  • Demystifying model delusionality and rationality over time
  • Manipulating LLMs to force model responses to be time-relevant
  • Applications of LLMs in automatic dictionary induction

This course builds on the first edition of Diachronic Language Models in WS23/24, which covered diachronic modeling and analyses of language change over time. In the second edition, the focus shifts to group projects on the same subject, with an emphasis on temporal LLMs. Group project tasks include literature reading, implementation, experiments, results, and a final term paper. Having attended last year's edition is helpful but not required.
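To illustrate the first topic above, evaluating temporal grounding can be framed as probing a model with time-sensitive questions whose gold answer depends on the queried year, then measuring how often the model's answer matches the fact valid at that time. The sketch below is a hypothetical illustration, not the course's actual evaluation setup; the stand-in "model" simply returns the most recent fact, mimicking an LLM with no temporal grounding.

```python
# Toy evaluation of temporal grounding: a time-sensitive question has
# different gold answers depending on the queried year.

# Gold facts: (start year inclusive, end year exclusive, answer).
GOLD = {
    "Who is the chancellor of Germany?": [
        (2005, 2021, "Angela Merkel"),
        (2021, 2025, "Olaf Scholz"),
    ],
}

def gold_answer(question, year):
    """Return the answer that was valid in the given year."""
    for start, end, answer in GOLD[question]:
        if start <= year < end:
            return answer
    return None

def stub_model(question, year):
    """Stand-in for an LLM: always returns the most recent answer,
    i.e. a model that ignores the queried point in time."""
    return GOLD[question][-1][2]

def temporal_accuracy(model, questions, years):
    """Fraction of (question, year) probes answered correctly."""
    probes = [(q, y) for q in questions for y in years]
    correct = sum(model(q, y) == gold_answer(q, y) for q, y in probes)
    return correct / len(probes)

acc = temporal_accuracy(stub_model, list(GOLD), years=range(2006, 2025))
```

Because the stub only ever gives the latest answer, it is correct only for the years 2021 onward, so its temporal accuracy stays well below 1.0; a temporally grounded model would condition its answer on the queried year.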
Course resources are available here.

