Please use this identifier to cite or link to this item: http://elar.urfu.ru/handle/10995/111587
Full metadata record
DC Field | Value | Language
dc.contributor.author | Mokrii, I. | en
dc.contributor.author | Boytsov, L. | en
dc.contributor.author | Braslavski, P. | en
dc.date.accessioned | 2022-05-12T08:19:25Z | -
dc.date.available | 2022-05-12T08:19:25Z | -
dc.date.issued | 2021 | -
dc.identifier.citation | Mokrii I. A Systematic Evaluation of Transfer Learning and Pseudo-labeling with BERT-based Ranking Models / I. Mokrii, L. Boytsov, P. Braslavski // SIGIR 2021 - Proceedings of the 44th International ACM SIGIR Conference on Research and Development in Information Retrieval. — 2021. — Vol. — P. 2081-2085. — 3463093. | en
dc.identifier.isbn | 9781450380379 | -
dc.identifier.other | All Open Access, Green | -
dc.identifier.uri | http://elar.urfu.ru/handle/10995/111587 | -
dc.description.abstract | Due to high annotation costs, making the best use of existing human-created training data is an important research direction. We, therefore, carry out a systematic evaluation of the transferability of BERT-based neural ranking models across five English datasets. Previous studies focused primarily on zero-shot and few-shot transfer from a large dataset to a dataset with a small number of queries. In contrast, each of our collections has a substantial number of queries, which enables a full-shot evaluation mode and improves the reliability of our results. Furthermore, since source datasets' licences often prohibit commercial use, we compare transfer learning to training on pseudo-labels generated by a BM25 scorer. We find that training on pseudo-labels -- possibly with subsequent fine-tuning using a modest number of annotated queries -- can produce a competitive or better model compared to transfer learning. Yet, it is necessary to improve the stability and/or effectiveness of few-shot training, which can sometimes degrade the performance of a pretrained model. © 2021 ACM. | en
dc.description.sponsorship | Pavel Braslavski thanks the Ministry of Science and Higher Education of the Russian Federation (“Ural Mathematical Center” project). | en
dc.format.mimetype | application/pdf | en
dc.language.iso | en | en
dc.publisher | Association for Computing Machinery, Inc | en
dc.publisher | ACM | en
dc.rights | info:eu-repo/semantics/openAccess | en
dc.source | SIGIR - Proc. Int. ACM SIGIR Conf. Res. Dev. Inf. Retr. | -
dc.source | SIGIR 2021 - Proceedings of the 44th International ACM SIGIR Conference on Research and Development in Information Retrieval | en
dc.subject | NEURAL INFORMATION RETRIEVAL | en
dc.subject | PSEUDO-LABELING | en
dc.subject | TRANSFER LEARNING | en
dc.subject | INFORMATION RETRIEVAL | en
dc.subject | LARGE DATASET | en
dc.subject | LEARNING SYSTEMS | en
dc.subject | TRANSFER LEARNING | en
dc.subject | EVALUATION MODES | en
dc.subject | FINE TUNING | en
dc.subject | RANKING MODEL | en
dc.subject | SYSTEMATIC EVALUATION | en
dc.subject | TRAINING DATA | en
dc.subject | LEARNING TO RANK | en
dc.title | A Systematic Evaluation of Transfer Learning and Pseudo-labeling with BERT-based Ranking Models | en
dc.type | Conference Paper | en
dc.type | info:eu-repo/semantics/conferenceObject | en
dc.type | info:eu-repo/semantics/submittedVersion | en
dc.conference.name | 44th International ACM SIGIR Conference on Research and Development in Information Retrieval, SIGIR 2021 | en
dc.conference.date | 11 July 2021 through 15 July 2021 | -
dc.identifier.doi | 10.1145/3404835.3463093 | -
dc.identifier.scopus | 85111638459 | -
local.contributor.employee | Mokrii, I., HSE University, Moscow, Russian Federation; Boytsov, L., Bosch Center for Artificial Intelligence, Pittsburgh, PA, United States; Braslavski, P., HSE University, Moscow, Russian Federation, Ural Federal University and HSE University, Yekaterinburg, Russian Federation | en
local.description.firstpage | 2081 | -
local.description.lastpage | 2085 | -
dc.identifier.wos | 000719807900251 | -
local.contributor.department | HSE University, Moscow, Russian Federation; Bosch Center for Artificial Intelligence, Pittsburgh, PA, United States; Ural Federal University and HSE University, Yekaterinburg, Russian Federation | en
local.identifier.pure | 22990153 | -
local.description.order | 3463093 | -
local.identifier.eid | 2-s2.0-85111638459 | -
local.identifier.wos | WOS:000719807900251 | -
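
The abstract above describes generating training data for a BERT-based ranker from pseudo-labels produced by a BM25 scorer, optionally followed by few-shot fine-tuning on a modest number of annotated queries. The Python sketch below illustrates only the pseudo-labeling step, and only under stated assumptions: the rank_bm25 package, the top-1-positive heuristic, and the negative-sampling depth are illustrative choices, not the paper's actual pipeline, which is not reproduced in this record.

import random

from rank_bm25 import BM25Okapi  # pip install rank-bm25


def build_pseudo_labels(queries, docs, n_negatives=4, seed=0):
    """For each query, label the top BM25 hit as a pseudo-positive and
    sample lower-ranked documents as pseudo-negatives."""
    rng = random.Random(seed)
    tokenized_docs = [d.lower().split() for d in docs]
    bm25 = BM25Okapi(tokenized_docs)
    triples = []  # (query, positive_doc, negative_doc) training triples
    for q in queries:
        scores = bm25.get_scores(q.lower().split())
        ranked = sorted(range(len(docs)), key=lambda i: scores[i], reverse=True)
        pos = ranked[0]                    # pseudo-positive: top-ranked document
        tail = ranked[10:] or ranked[1:]   # negatives drawn from lower BM25 ranks
        for neg in rng.sample(tail, min(n_negatives, len(tail))):
            triples.append((q, docs[pos], docs[neg]))
    return triples


if __name__ == "__main__":
    docs = [
        "bert based neural ranking models",
        "bm25 is a lexical retrieval scorer",
        "transfer learning across retrieval datasets",
        "pseudo labels from an unsupervised ranker",
    ]
    queries = ["neural ranking with bert", "lexical retrieval baseline"]
    for triple in build_pseudo_labels(queries, docs, n_negatives=2):
        print(triple)

The resulting (query, positive, negative) triples would then feed a standard pairwise fine-tuning loop for a BERT cross-encoder; with a small budget of human-annotated queries, the same loop could be rerun on those labels for the few-shot stage discussed in the abstract.
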
Appears in collections: Scientific publications of UrFU scholars indexed in SCOPUS and WoS CC

Files in this item:
File | Description | Size | Format
2-s2.0-85111638459.pdf | - | 579,34 kB | Adobe PDF


All items in the electronic archive are protected by copyright, all rights reserved.