Scaling contrastive training of autoencoders for NLP and low-resource settings

Stephen Mander

Postgraduate Researcher, School of Computing and Communications

Abstract

Contrastive training still underlies many technologies in machine learning, and has shown considerable promise in multimodal applications and logical reasoning. However, replication remains an ongoing challenge for academic and low-resource communities. This talk explores how different data shapes can be used to train models with multiple input streams. There are myriad applications in supervised training, low-resource languages, cross-modal training, and machine translation tasks where annotations are almost nonexistent.
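To make the underlying idea concrete, below is a minimal sketch of a symmetric contrastive (InfoNCE-style) objective over two input streams, where matching pairs in a batch are pulled together and mismatched pairs pushed apart. All names, shapes, and the temperature value are illustrative assumptions, not the speaker's implementation.

```python
# Minimal sketch of a symmetric contrastive (InfoNCE-style) loss for two
# input streams. Hypothetical example code; not the speaker's implementation.
import numpy as np

def contrastive_loss(z_a, z_b, temperature=0.07):
    """Symmetric contrastive loss over a batch of paired embeddings.

    z_a, z_b: (batch, dim) embeddings from the two input streams;
    row i of z_a and row i of z_b are treated as a positive pair,
    all other rows in the batch act as negatives.
    """
    # L2-normalise so dot products are cosine similarities
    z_a = z_a / np.linalg.norm(z_a, axis=1, keepdims=True)
    z_b = z_b / np.linalg.norm(z_b, axis=1, keepdims=True)
    logits = z_a @ z_b.T / temperature        # (batch, batch) similarity matrix
    idx = np.arange(len(z_a))                 # diagonal entries are positives

    def xent(l):
        # cross-entropy of each row against its diagonal (positive) entry
        l = l - l.max(axis=1, keepdims=True)  # numerical stability
        logp = l - np.log(np.exp(l).sum(axis=1, keepdims=True))
        return -logp[idx, idx].mean()

    # average the two directions (stream A -> B and B -> A)
    return 0.5 * (xent(logits) + xent(logits.T))

rng = np.random.default_rng(0)
loss = contrastive_loss(rng.normal(size=(8, 32)), rng.normal(size=(8, 32)))
print(loss)
```

Because the batch itself supplies the negatives, no explicit annotations are needed beyond the pairing of the two streams, which is what makes this style of training attractive in low-resource and cross-modal settings.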

Week 20 2022/2023

Thursday 13th April 2023
1:00-2:00pm

Microsoft Teams - request a link via email

For queries, contact Ignatius Ezeani (i.ezeani@lancaster.ac.uk)