Funded Projects

Our primary focus is linguistically informed Neural Natural Language Processing (NNLP). To that end, we are working on a number of funded projects, described below.

ERC Advanced Grant NonSequeToR

Text understanding can fundamentally be viewed as a process of composition: the meaning of smaller units is composed to compute the meaning of larger units and, eventually, of sentences and documents. Our hypothesis is that optimal generalization in deep learning requires that more regular processes of composition be learned as composition functions, whereas units that are the output of less regular processes be learned as static embeddings. We investigate novel representation learning algorithms and architectures to test this hypothesis. The envisioned goal of the project is a new, robust and powerful text representation that captures all aspects of form and meaning that NLP needs to process text successfully.

(Project Description)
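
As a purely illustrative sketch of the hypothesis (not the project's actual architecture), the toy PyTorch module below composes the representations of units assumed to be compositional with a learned composition function (here a GRU, our own placeholder choice) and falls back to static embeddings for units assumed to be non-compositional:

    import torch
    import torch.nn as nn

    class HybridEncoder(nn.Module):
        def __init__(self, vocab_size, dim):
            super().__init__()
            # static embeddings for units treated as memorized, non-compositional
            self.static = nn.Embedding(vocab_size, dim)
            # learned composition function for units treated as compositional
            self.compose = nn.GRU(dim, dim, batch_first=True)

        def forward(self, unit_ids, is_compositional):
            # unit_ids: (batch, seq) integer ids; is_compositional: (batch, seq) bool mask
            emb = self.static(unit_ids)
            composed, _ = self.compose(emb)
            # keep the composed representation only where composition is assumed to apply
            return torch.where(is_compositional.unsqueeze(-1), composed, emb)

    # toy usage with random inputs
    enc = HybridEncoder(vocab_size=100, dim=16)
    ids = torch.randint(0, 100, (2, 5))
    mask = torch.rand(2, 5) > 0.5
    print(enc(ids, mask).shape)  # torch.Size([2, 5, 16])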

High quality subword vocabulary induction (Project within the Munich Center for Machine Learning)

A common approach to representing text as input for deep learning models is to use heuristically induced word pieces. Such a representation yields shorter sequences than character-level input while avoiding the high cost of a large word vocabulary. In this project, we will investigate alternatives to currently used heuristics that represent the semantics of text more naturally.

(Project Description)
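
For context, the following toy implementation shows one widely used heuristic for word-piece induction, byte-pair encoding, which greedily merges the most frequent adjacent symbol pair. It is meant only to illustrate the kind of heuristic the project seeks alternatives to, not the project's proposed method:

    from collections import Counter

    def bpe_merges(words, num_merges):
        # words: dict mapping a word (a tuple of symbols) to its corpus frequency
        vocab = dict(words)
        merges = []
        for _ in range(num_merges):
            # count adjacent symbol pairs, weighted by word frequency
            pairs = Counter()
            for word, freq in vocab.items():
                for a, b in zip(word, word[1:]):
                    pairs[(a, b)] += freq
            if not pairs:
                break
            best = max(pairs, key=pairs.get)
            merges.append(best)
            # apply the chosen merge to every word in the vocabulary
            merged = {}
            for word, freq in vocab.items():
                out, i = [], 0
                while i < len(word):
                    if i + 1 < len(word) and (word[i], word[i + 1]) == best:
                        out.append(word[i] + word[i + 1])
                        i += 2
                    else:
                        out.append(word[i])
                        i += 1
                merged[tuple(out)] = freq
            vocab = merged
        return merges

    corpus = {("l", "o", "w"): 5, ("l", "o", "w", "e", "r"): 2,
              ("n", "e", "w"): 6, ("n", "e", "w", "e", "r"): 3}
    print(bpe_merges(corpus, 4))  # [('n', 'e'), ('ne', 'w'), ('l', 'o'), ('lo', 'w')]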

ReMLAV: Relational Machine Learning for Argument Validation (DFG project)

Argument validation is the task of classifying a given argument as valid or invalid based on its linguistic form, the larger document context, world knowledge and other factors. This project aims to combine representation learning (both static and contextualized embeddings) and relational machine learning to solve this task.

(Project Description)
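
As a minimal, hypothetical sketch of the task framing (not the project's model), the classifier below scores an argument as valid or invalid from a vector combining a text embedding of the argument with relational features derived from its document context; all names and dimensions are placeholders:

    import torch
    import torch.nn as nn

    class ArgumentValidator(nn.Module):
        def __init__(self, text_dim, rel_dim, hidden=64):
            super().__init__()
            self.mlp = nn.Sequential(
                nn.Linear(text_dim + rel_dim, hidden),
                nn.ReLU(),
                nn.Linear(hidden, 1),  # logit for "valid"
            )

        def forward(self, text_emb, rel_feats):
            # concatenate the argument embedding with relational features and score it
            return self.mlp(torch.cat([text_emb, rel_feats], dim=-1)).squeeze(-1)

    # toy usage with random stand-ins for real embeddings and relational features
    model = ArgumentValidator(text_dim=768, rel_dim=32)
    logits = model(torch.randn(4, 768), torch.randn(4, 32))
    print(torch.sigmoid(logits))  # per-argument probability of validity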

Munich Doctoral Program (IDK): Premodern Cultures, Global Perspectives and the Foundations of a New Philology

Based on an examination of 4000 years of literary history, this program's goal is to synthesize the theory and practice of European traditions with those of the East Asian and South Asian cultural spheres as well as the Jewish and Arab worlds. A particular focus will be on digital humanities methods.

Robotics Projects

As large language models (LLMs) show impressive emergent abilities across many fields, we are exploring several directions for integrating LLMs into robotics. We are currently most interested in the following topics:

Long-Horizon Language-Conditioned Robotic Manipulation

Given a high-level human instruction, the robot must understand the instruction and perform long-horizon memorization and complex reasoning to complete the designated task.
We have open-sourced LoHoRavens, a challenging benchmark for this task, together with two baselines; see the LoHoRavens page for more details.
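
As a minimal, hypothetical sketch of the kind of planning loop such systems typically use (not the LoHoRavens baselines themselves), an LLM is repeatedly prompted with the instruction and the sub-tasks completed so far, and each proposed sub-task is handed to a low-level policy; call_llm and execute are stubs standing in for a real language model and a real robot policy:

    def call_llm(prompt: str) -> str:
        # stub standing in for a real language-model call
        return "DONE" if "place" in prompt else "place the blue block on the red block"

    def execute(subtask: str) -> str:
        # stub standing in for a low-level language-conditioned manipulation policy
        return f"done: {subtask}"

    def run_episode(instruction: str, max_steps: int = 10) -> list[str]:
        history: list[str] = []
        for _ in range(max_steps):
            # the completed sub-tasks are fed back so the planner can handle
            # long horizons that require remembering earlier steps
            prompt = (f"Instruction: {instruction}\n"
                      f"Completed so far: {history}\n"
                      "Next sub-task (or DONE):")
            subtask = call_llm(prompt).strip()
            if subtask == "DONE":
                break
            history.append(execute(subtask))
        return history

    print(run_episode("stack all blocks of the same color"))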