Please use this identifier to cite or link to this resource: http://elar.urfu.ru/handle/10995/130299
Full metadata record
DC Field | Value | Language
dc.contributor.author | Efimov, P. | en
dc.contributor.author | Boytsov, L. | en
dc.contributor.author | Arslanova, E. | en
dc.contributor.author | Braslavski, P. | en
dc.date.accessioned | 2024-04-05T16:18:01Z | -
dc.date.available | 2024-04-05T16:18:01Z | -
dc.date.issued | 2023 | -
dc.identifier.citation | Efimov, P, Boytsov, L, Arslanova, E & Braslavski, P 2023, The Impact of Cross-Lingual Adjustment of Contextual Word Representations on Zero-Shot Transfer: book chapter. in J Kamps & L Goeuriot (eds), Advances in Information Retrieval: 45th European Conference on Information Retrieval: book. Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics), vol. 13982, Springer Cham, pp. 51-67. https://doi.org/10.1007/978-3-031-28241-6_4 | harvard_pure
dc.identifier.citation | Efimov, P., Boytsov, L., Arslanova, E., & Braslavski, P. (2023). The Impact of Cross-Lingual Adjustment of Contextual Word Representations on Zero-Shot Transfer: book chapter. In J. Kamps, & L. Goeuriot (Eds.), Advances in Information Retrieval: 45th European Conference on Information Retrieval: book (pp. 51-67). (Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics); Vol. 13982). Springer Cham. https://doi.org/10.1007/978-3-031-28241-6_4 | apa_pure
dc.identifier.isbn | 9783031282409 | -
dc.identifier.issn | 0302-9743 | -
dc.identifier.other | Final | 2
dc.identifier.other | All Open Access, Green | 3
dc.identifier.other | https://www.scopus.com/inward/record.uri?eid=2-s2.0-85151051828&doi=10.1007%2f978-3-031-28241-6_4&partnerID=40&md5=6cf730ccd4a5490b5051ddd822241e89 | 1
dc.identifier.other | https://arxiv.org/pdf/2204.06457 | pdf
dc.identifier.uri | http://elar.urfu.ru/handle/10995/130299 | -
dc.description.abstract | Large multilingual language models such as mBERT or XLM-R enable zero-shot cross-lingual transfer in various IR and NLP tasks. Cao et al. [8] proposed a data- and compute-efficient method for cross-lingual adjustment of mBERT that uses a small parallel corpus to make embeddings of related words across languages similar to each other. They showed it to be effective in NLI for five European languages. In contrast, we experiment with a typologically diverse set of languages (Spanish, Russian, Vietnamese, and Hindi) and extend their original implementations to new tasks (XSR, NER, and QA) and an additional training regime (continual learning). Our study reproduced gains in NLI for four languages, showed improved NER, XSR, and cross-lingual QA results in three languages (though some cross-lingual QA gains were not statistically significant), while monolingual QA performance never improved and sometimes degraded. Analysis of distances between contextualized embeddings of related and unrelated words (across languages) showed that fine-tuning leads to “forgetting” some of the cross-lingual alignment information. Based on this observation, we further improved NLI performance using continual learning. Our software is publicly available at https://github.com/pefimov/cross-lingual-adjustment. © 2023, The Author(s), under exclusive license to Springer Nature Switzerland AG. | en
dc.description.sponsorship | Russian Science Foundation, RSF: 20-11-20166 | en
dc.description.sponsorship | Acknowledgment. This research was supported in part through computational resources of HPC facilities at HSE University [27]. PE is grateful to Yandex Cloud for their grant toward computing resources of Yandex DataSphere. PB acknowledges support by the Russian Science Foundation, grant No 20-11-20166. | en
dc.format.mimetype | application/pdf | en
dc.language.iso | en | en
dc.publisher | Springer Science and Business Media Deutschland GmbH | en
dc.relation | info:eu-repo/grantAgreement/RSF//20-11-20166 | en
dc.rights | info:eu-repo/semantics/openAccess | en
dc.source | Advances in Information Retrieval | 2
dc.source | Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics) | en
dc.subject | CROSS-LINGUAL TRANSFER | en
dc.subject | MULTILINGUAL EMBEDDINGS | en
dc.subject | NATURAL LANGUAGE PROCESSING SYSTEMS | en
dc.subject | ZERO-SHOT LEARNING | en
dc.subject | CONTEXTUAL WORDS | en
dc.subject | CONTINUAL LEARNING | en
dc.subject | CROSS-LINGUAL | en
dc.subject | CROSS-LINGUAL TRANSFER | en
dc.subject | EMBEDDINGS | en
dc.subject | LANGUAGE MODEL | en
dc.subject | MULTILINGUAL EMBEDDING | en
dc.subject | PARALLEL CORPORA | en
dc.subject | PERFORMANCE | en
dc.subject | WORD REPRESENTATIONS | en
dc.subject | EMBEDDINGS | en
dc.title | The Impact of Cross-Lingual Adjustment of Contextual Word Representations on Zero-Shot Transfer | en
dc.type | Conference Paper | en
dc.type | info:eu-repo/semantics/conferenceObject | en
dc.type | info:eu-repo/semantics/submittedVersion | en
dc.conference.name | 45th European Conference on Information Retrieval, ECIR 2023 | en
dc.conference.date | 2 April 2023 through 6 April 2023 | -
dc.identifier.doi | 10.1007/978-3-031-28241-6_4 | -
dc.identifier.scopus | 85151051828 | -
local.contributor.employee | Efimov, P., ITMO University, Saint Petersburg, Russian Federation | en
local.contributor.employee | Boytsov, L., Bosch Center for Artificial Intelligence, Pittsburgh, United States | en
local.contributor.employee | Arslanova, E., Ural Federal University, Yekaterinburg, Russian Federation | en
local.contributor.employee | Braslavski, P., Ural Federal University, Yekaterinburg, Russian Federation, HSE University, Moscow, Russian Federation | en
local.description.firstpage | 51 | -
local.description.lastpage | 67 | -
local.volume | 13982 LNCS | -
dc.identifier.wos | 000995495200004 | -
local.contributor.department | ITMO University, Saint Petersburg, Russian Federation | en
local.contributor.department | Bosch Center for Artificial Intelligence, Pittsburgh, United States | en
local.contributor.department | Ural Federal University, Yekaterinburg, Russian Federation | en
local.contributor.department | HSE University, Moscow, Russian Federation | en
local.identifier.pure | 37140299 | -
local.identifier.eid | 2-s2.0-85151051828 | -
local.fund.rsf | 20-11-20166 | -
local.identifier.wos | WOS:000995495200004 | -
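
The abstract above describes the cross-lingual adjustment procedure only at a high level. Purely as an illustrative sketch, the Python snippet below shows the underlying idea of making contextual embeddings of words aligned in a small parallel corpus similar to each other. It is not the authors' implementation (their code is in the GitHub repository linked in the abstract); the encoder name, the alignment format, and the toy sentence pair are assumptions made for this example.

# Minimal sketch of cross-lingual adjustment: pull contextual embeddings of
# aligned words in parallel sentences closer together. Illustrative only;
# not the authors' implementation.
import torch
from transformers import AutoModel, AutoTokenizer

MODEL_NAME = "bert-base-multilingual-cased"  # assumed multilingual encoder (mBERT)
tokenizer = AutoTokenizer.from_pretrained(MODEL_NAME)
model = AutoModel.from_pretrained(MODEL_NAME)
optimizer = torch.optim.AdamW(model.parameters(), lr=5e-5)


def token_embeddings(sentence: str) -> torch.Tensor:
    """Contextual embeddings of one sentence, shape (seq_len, hidden_size)."""
    inputs = tokenizer(sentence, return_tensors="pt")
    return model(**inputs).last_hidden_state[0]


def alignment_loss(src_sent: str, tgt_sent: str, word_pairs) -> torch.Tensor:
    """Mean squared distance between embeddings of aligned token positions.

    `word_pairs` is a list of (src_token_index, tgt_token_index) tuples,
    assumed to come from an external word aligner (hypothetical input format).
    """
    src_emb = token_embeddings(src_sent)
    tgt_emb = token_embeddings(tgt_sent)
    src_idx = torch.tensor([i for i, _ in word_pairs])
    tgt_idx = torch.tensor([j for _, j in word_pairs])
    return ((src_emb[src_idx] - tgt_emb[tgt_idx]) ** 2).sum(dim=-1).mean()


# One adjustment step on a single toy sentence pair (indices account for [CLS]).
loss = alignment_loss("The cat sleeps.", "El gato duerme.", [(1, 1), (2, 2)])
optimizer.zero_grad()
loss.backward()
optimizer.step()

In the method of Cao et al. that the paper builds on, such a loss is computed over batches of aligned word pairs from a small parallel corpus, typically together with a regularization term that keeps the adjusted model close to the pretrained one; the sketch omits those details.
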
Appears in collections: Scholarly publications of UrFU researchers indexed in SCOPUS and WoS CC

Files in this item:
File | Description | Size | Format
2-s2.0-85151051828.pdf | - | 323 kB | Adobe PDF

