Bridging the Gap - Language Models and Structured Knowledge in AI

Organizers

Dr. Gerrit Großmann
Cennet Oguz, M.Sc.
Dr. Simon Ostermann
Prof. Dr. Verena Wolf

For any issues regarding the seminar, please e-mail Gerrit Großmann and have [LanguageModelsSeminar2024] (including the brackets) in the subject line.

Organization

  • This block seminar is open to CoLi and CS students (and related courses).
  • If you are a CS student, please use the seminar assignment system to register.
  • If you are a CoLi student, please write us an email to register.
  • Please register only if you are available at the time slot of the seminar.
  • If you want to take the seminar but were not selected by the assignment system, please apply for the waiting list by emailing us.
  • The seminar takes place on September 19 (Thursday) and 20 (Friday), in person (DFKI, D3.2, room Reuse, next to the main entrance).
  • The seminar language is English.
  • The seminar earns you 7 ECTS (CS students) or 3 ECTS (CoLi students).
  • Depending on the study regulations, you need to register in HISPOS/LSF.

Grading

To pass the seminar, you have to attend all sessions and:

  • give a presentation (CoLi and CS students, worth 3 ECTS);
  • write reports in which you critically examine the topics in the seminar (only CS students, worth 4 ECTS);
  • participate in discussions;

… with a passing grade.

For CS students, the final grade is calculated as the weighted average of the graded components. Both CoLi and CS students have the opportunity to earn a bonus for submitting an optional practical project. A good project can improve your final grade by 0.3 points (e.g., from 1.7 to 1.3), and an excellent project can boost your grade by 0.7 points (e.g., from 2.3 to 1.7).


Topic Overview

This is a block seminar in September 2024, which will be held in cooperation between the Department of Multilingual Technologies (MLT) and Neuro-Mechanistic Modeling (NMM).

Large language models (LLMs) have swiftly become a cornerstone of AI research, capturing the attention of the public as the most accessible gateway to artificial intelligence. Despite their groundbreaking impact, LLMs are not without imperfections. Notably, hallucinations and limited reasoning capabilities, particularly in specialized domains, remain significant challenges.

This seminar begins by investigating the theoretical foundations of language representations, tracing the evolution of transformers and their progression towards the cutting-edge LLMs we see today. Building on this foundation, the seminar will then explore promising future directions. Special emphasis will be placed on the integration of LLMs with neuro-symbolic reasoning and the enrichment of these models through knowledge graphs and other forms of structured data.

Requirements

No prior knowledge of language modeling is expected. The seminar is open to CS (including related majors) and CoLi students.


Presentation

TBA

Reports

TBA

Practical Project (Bonus)

TBA


Topics

| Topic | Student | Paper |
| —— | —— | —— |

TBA