000 01075nam a2200301 4500
008 231025t2022 ||||| |||| 00| 0 eng d
041 _aeng
080 _a004(043)
100 _aThillainathan, Sarubi
245 _aPre-training and fine-tuning multilingual sequence-to-sequence models for domain-specific low-resource neural machine translation
260 _c[2022]
300 _aix, 57 p. : diagrams, tables
_ePDF file
502 _cCS
_dFAC_ENG
_gUniversity of Moratuwa
_o22
650 _aPRE-TRAINING
650 _aFINE-TUNING
_926095
650 _aLOW-RESOURCE LANGUAGES
650 _aMBART
_926096
650 _aPRE-TRAINED LANGUAGE MODELS
650 _aNEURAL MACHINE TRANSLATION
650 _aINFORMATION TECHNOLOGY - Dissertation
650 _aCOMPUTER SCIENCE - Dissertation
650 _aCOMPUTER SCIENCE & ENGINEERING - Dissertation
658 _aMSc (Major Component Research)
700 _aProf. Sanath Jayasena
_esup
700 _aDr. Surangika Ranathunga
_esup
856 _uhttp://dl.lib.uom.lk/handle/123/21664
942 _cTH
999 _c181937
_d181937