I am a final-year Ph.D. student in the Computer Science department at the University of Sheffield (SheffieldNLP), working on natural language processing & machine learning. My two amazing supervisors are Nikos Aletras (main) and Loïc Barrault, and my work is funded by an Amazon Alexa Fellowship. My research focuses on active learning, evaluation & benchmarking, robustness, and language modeling, but I'm fascinated by other topics as well!
Currently, I am a Research Scientist Intern at Meta AI (FAIR Labs) in London. Earlier this year, I did an internship as an Applied Scientist at Amazon Web Services (AWS) in NYC, working with the AI human language technology group.
Last year, I visited the CoAStaL group at the University of Copenhagen, where I had the pleasure of working with Anders Søgaard and the rest of the team on exciting projects on learning from disagreement, fairness, and cross-cultural NLP.
Prior to starting my Ph.D., I worked as a Machine Learning Engineer at the awesome Greek startup DeepSea Technologies. In my undergrad, I studied Electrical & Computer Engineering at the National Technical University of Athens (NTUA).
| Date | News |
| --- | --- |
| Jan 27, 2023 | 2 papers accepted at EACL 2023 (main conf.)! |
| Sep 1, 2022 | Very excited to start my internship at Meta AI in London! |
| Jun 7, 2022 | Invited talk at Bloomberg's AI Group / in-person in NYC. |
| Apr 26, 2022 | Happy to start my internship at AWS in NYC, working with Miguel Ballesteros, Shuai Wang, Yogarshi Vyas and the rest of the Amazon Comprehend team! |
| Apr 22, 2022 | Invited talk at NLPhD Speaker Series @ Saarland University / remote. (slides) |
| Feb 24, 2022 | 2 papers accepted at ACL 2022 (main conf.)! |
- **EACL** — *Dynamic Benchmarking of Masked Language Models on Temporal Concept Drift with Multiple Views*. In Proceedings of the Conference of the European Chapter of the Association for Computational Linguistics (EACL), 2023.
- **EACL** — *Investigating Multi-source Active Learning for Natural Language Inference*. In Proceedings of the Conference of the European Chapter of the Association for Computational Linguistics (EACL), 2023.
- **ACL** — *On the Importance of Effectively Adapting Pretrained Language Models for Active Learning*. In Proceedings of the Annual Meeting of the Association for Computational Linguistics (ACL), 2022.
- **EMNLP** ✨ Oral ✨ — *Active Learning by Acquiring Contrastive Examples*. In Proceedings of the Conference on Empirical Methods in Natural Language Processing (EMNLP), 2021.