Daniel Simig
Understanding In-Context Learning via Supportive Pretraining Data
In-context learning (ICL) improves language models' performance on a variety of NLP tasks by simply demonstrating a handful of examples at inference time. It is not well understood why ICL ability emerges, as the model has never been speci…
Xiaochuang Han, Daniel Simig, Todor Mihaylov, Yulia Tsvetkov, Asli Celikyilmaz, Tianlu Wang. Proceedings of the 61st Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers). 2023.
Analysing Off-The-Shelf Options for Question Answering with Portuguese FAQs
Following the current interest in developing automatic question answering systems, we analyse alternative approaches for finding suitable answers to questions from a list of Frequently Asked Questions (FAQs), in Portuguese. These rely on different tech…
Few-shot Learning with Multilingual Generative Language Models
Xi Victoria Lin, Todor Mihaylov, Mikel Artetxe, Tianlu Wang, Shuohui Chen, Daniel Simig, Myle Ott, Naman Goyal, Shruti Bhosale, Jingfei Du, Ramakanth Pasunuru, Sam Shleifer, Punit Singh Koura, Vishrav Chaudhary, Brian O’Horo, Jeff Wang, Lu…
Few-shot Learning with Multilingual Language Models
Large-scale generative language models such as GPT-3 are competitive few-shot learners. While these models are known to be able to jointly represent many different languages, their training data is dominated by English, potentially limitin…