Please use this identifier to cite or link to this item: http://elar.urfu.ru/handle/10995/130299
Title: The Impact of Cross-Lingual Adjustment of Contextual Word Representations on Zero-Shot Transfer
Authors: Efimov, P.
Boytsov, L.
Arslanova, E.
Braslavski, P.
Publication date: 2023
Publisher: Springer Science and Business Media Deutschland GmbH
Bibliographic citation: Efimov, P., Boytsov, L., Arslanova, E., & Braslavski, P. (2023). The Impact of Cross-Lingual Adjustment of Contextual Word Representations on Zero-Shot Transfer. In J. Kamps & L. Goeuriot (Eds.), Advances in Information Retrieval: 45th European Conference on Information Retrieval (pp. 51-67). Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics), Vol. 13982. Springer Cham. https://doi.org/10.1007/978-3-031-28241-6_4
Abstract: Large multilingual language models such as mBERT or XLM-R enable zero-shot cross-lingual transfer in various IR and NLP tasks. Cao et al. [8] proposed a data- and compute-efficient method for cross-lingual adjustment of mBERT that uses a small parallel corpus to make embeddings of related words across languages similar to each other. They showed it to be effective in NLI for five European languages. In contrast, we experiment with a typologically diverse set of languages (Spanish, Russian, Vietnamese, and Hindi) and extend their original implementation to new tasks (XSR, NER, and QA) and an additional training regime (continual learning). Our study reproduced gains in NLI for four languages and showed improved NER, XSR, and cross-lingual QA results in three languages (though some cross-lingual QA gains were not statistically significant), while mono-lingual QA performance never improved and sometimes degraded. Analysis of distances between contextualized embeddings of related and unrelated words (across languages) showed that fine-tuning leads to “forgetting” some of the cross-lingual alignment information. Based on this observation, we further improved NLI performance using continual learning. Our software is publicly available at https://github.com/pefimov/cross-lingual-adjustment. © 2023, The Author(s), under exclusive license to Springer Nature Switzerland AG.
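The adjustment objective described in the abstract pulls contextual embeddings of word pairs that are aligned in a small parallel corpus toward each other. A minimal sketch of that alignment loss (a mean squared distance over aligned pairs; the function name and toy data are illustrative, not the authors' code, which lives at the GitHub link above):

```python
import numpy as np

def alignment_loss(src_emb: np.ndarray, tgt_emb: np.ndarray) -> float:
    """Mean squared Euclidean distance between contextual embeddings of
    aligned source/target word pairs. Row i of each matrix is the
    embedding of one side of aligned pair i."""
    diff = src_emb - tgt_emb
    return float(np.mean(np.sum(diff * diff, axis=1)))

# Toy example: 3 aligned word pairs with 4-dimensional embeddings.
rng = np.random.default_rng(0)
src = rng.normal(size=(3, 4))
tgt = src + 0.1 * rng.normal(size=(3, 4))  # nearly aligned target side

print(alignment_loss(src, src))  # identical embeddings give zero loss
print(alignment_loss(src, tgt))  # small positive loss for near-aligned pairs
```

In the actual method this quantity would be minimized by gradient descent over the model's parameters, driving cross-lingual embeddings of related words together.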
Keywords: CROSS-LINGUAL TRANSFER
MULTILINGUAL EMBEDDINGS
NATURAL LANGUAGE PROCESSING SYSTEMS
ZERO-SHOT LEARNING
CONTEXTUAL WORDS
CONTINUAL LEARNING
CROSS-LINGUAL
EMBEDDINGS
LANGUAGE MODEL
MULTILINGUAL EMBEDDING
PARALLEL CORPORA
PERFORMANCE
WORD REPRESENTATIONS
URI: http://elar.urfu.ru/handle/10995/130299
Access rights: info:eu-repo/semantics/openAccess
Conference/seminar: 45th European Conference on Information Retrieval, ECIR 2023
Conference/seminar date: 2 April 2023 through 6 April 2023
SCOPUS ID: 85151051828
WOS ID: 000995495200004
PURE ID: 37140299
ISSN: 0302-9743
ISBN: 9783031282409
DOI: 10.1007/978-3-031-28241-6_4
Funding information: Russian Science Foundation, RSF: 20-11-20166
Acknowledgment. This research was supported in part through computational resources of HPC facilities at HSE University [27]. PE is grateful to Yandex Cloud for their grant toward computing resources of Yandex DataSphere. PB acknowledges support by the Russian Science Foundation, grant No 20-11-20166.
RSF project card: 20-11-20166
Appears in collections: UrFU scholarly publications indexed in SCOPUS and WoS CC

Files in this item:
File | Description | Size | Format
2-s2.0-85151051828.pdf | | 323 kB | Adobe PDF | View/Open


All items in the electronic archive are protected by copyright, all rights reserved.