If you are interested in writing your thesis on Natural Language Processing or on using large pretrained models, feel free to contact me! We'll discuss which open topics might be the best fit for you, or you can bring your own ideas.
Research Interests
My research is centered on adapting pretrained large language models (LLMs) to new tasks and domains. In particular, I focus on cross-lingual transfer of LLMs: how can we best adapt a model trained exclusively on one language, such as English, for use in another language? Previously, I have also worked on art generation with Generative Adversarial Networks (StyleGAN2).
My general topics are:
> Large Pretrained Language Models
> Transfer Learning
> Zero-shot Cross-lingual Transfer
> Natural Language Processing
> Generative Adversarial Networks
Publications
- K. Dobler, F. Hübscher, J. Westphal, A. Sierra-Múnera, G. de Melo, and R. Krestel, "Art Creation with Multi-Conditional StyleGANs," in Proceedings of the Thirty-First International Joint Conference on Artificial Intelligence, 2022.