Show simple item record

dc.contributor.advisor: Jiang, Hui
dc.contributor.author: Watcharawittayakul, Sedtawut
dc.date.accessioned: 2020-05-11T12:53:27Z
dc.date.available: 2020-05-11T12:53:27Z
dc.date.issued: 2020-05-11
dc.identifier.uri: https://yorkspace.library.yorku.ca/xmlui/handle/10315/37464
dc.description.abstract: In this thesis, we propose a new approach, called dual-FOFE, to employing fixed-size ordinally-forgetting encoding (FOFE) for Natural Language Processing (NLP) tasks. The main idea behind dual-FOFE is to perform the encoding with two different forgetting factors; this resolves the original FOFE's dilemma of choosing between the benefits offered by either small or large values of its single forgetting factor. For this research, we conducted experiments on two prominent NLP tasks, namely language modelling and machine reading comprehension. Our experimental results show that dual-FOFE provides a definite improvement over the original FOFE: approximately 11% in perplexity (PPL) on the language modelling task and 8% in Exact Match (EM) score on the machine reading comprehension task.
dc.language: en
dc.rights: Author owns copyright, except where explicitly noted. Please contact the author directly with licensing requests.
dc.subject: Computer science
dc.title: Dual Fixed-Size Ordinally Forgetting Encoding (FOFE) For Natural Language Processing
dc.type: Electronic Thesis or Dissertation
dc.degree.discipline: Computer Science
dc.degree.name: MSc - Master of Science
dc.degree.level: Master's
dc.date.updated: 2020-05-11T12:53:27Z
dc.subject.keywords: Natural language processing
dc.subject.keywords: Deep learning
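
For reference, FOFE encodes a variable-length token sequence into a single fixed-size vector via the recursion z_t = alpha * z_{t-1} + e_t, where e_t is the one-hot vector of the t-th token and alpha in (0, 1) is the forgetting factor: a small alpha emphasizes recent tokens, while a large alpha preserves long-range context. The Python sketch below illustrates one way to realize the dual-FOFE idea from the abstract, assuming the two codes are simply concatenated; the function names and the example factors 0.5 and 0.9 are illustrative choices, not values taken from the thesis.

import numpy as np

def fofe(token_ids, vocab_size, alpha):
    """FOFE code of a token sequence: z_t = alpha * z_{t-1} + e_t."""
    z = np.zeros(vocab_size)
    for t in token_ids:
        z = alpha * z    # decay the contributions of earlier tokens
        z[t] += 1.0      # add the one-hot vector of the current token
    return z

def dual_fofe(token_ids, vocab_size, alpha_small=0.5, alpha_large=0.9):
    """Assumed dual-FOFE: concatenate two FOFE codes computed with
    different forgetting factors, so the model need not trade recency
    detail for long-range context."""
    return np.concatenate([
        fofe(token_ids, vocab_size, alpha_small),
        fofe(token_ids, vocab_size, alpha_large),
    ])

# Example: the sequence [2, 0, 3] over a 5-word vocabulary yields a
# code of length 2 * 5 = 10, regardless of the sequence length.
print(dual_fofe([2, 0, 3], vocab_size=5))

Because the code length is fixed at 2 * vocab_size independent of sequence length, the encoding can feed an ordinary feed-forward network, which is what makes FOFE-based models applicable to tasks such as those studied in the thesis.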



