Contributors: Hoque Prince, Enamul; Huang, Xiangji "Jimmy"; Laskar, Md Tahmid Rahman
Date issued: 2020-11
Date made available: 2021-03-08
URI: http://hdl.handle.net/10315/38195

Abstract: The Question Answering (QA) task aims at building systems that can automatically answer a question or query about the given document(s). In this thesis, we utilize the transformer, a state-of-the-art neural architecture, to study two QA problems: answer sentence selection and answer summary generation. For answer sentence selection, we present two new approaches that rank a list of candidate answers for a given question by utilizing different contextualized embeddings with the encoder of the transformer. For answer summary generation, we study the query-focused abstractive text summarization task to generate a summary in natural language from the source document(s) for a given query. For this task, we utilize the transformer to address the lack of large training datasets in the single-document scenario and the absence of labeled training data in the multi-document scenario. Based on extensive experiments, we observe that our proposed approaches obtain impressive results across several benchmark QA datasets.

Rights: Author owns copyright, except where explicitly noted. Please contact the author directly with licensing requests.
Subject: Information technology
Title: Utilizing the Transformer Architecture for Question Answering
Type: Electronic Thesis or Dissertation
Keywords: Question Answering; Text Summarization; Query Focused Abstractive Summarization; Deep Learning; Machine Learning; Transformer; BERT
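The answer sentence selection setup described in the abstract (ranking candidate answers against a question via contextualized embeddings) can be sketched as a similarity-based ranker. This is a minimal illustration only: the `embed` function below is a bag-of-words stand-in, whereas the thesis uses contextualized embeddings from a transformer encoder such as BERT; the function and variable names are hypothetical.

```python
import math
from collections import Counter

# Toy embedding: bag-of-words term counts. A placeholder for the
# contextualized transformer-encoder embeddings used in the thesis.
def embed(text):
    return Counter(text.lower().split())

def cosine(a, b):
    # Cosine similarity between two sparse count vectors.
    dot = sum(a[t] * b[t] for t in a if t in b)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

def rank_candidates(question, candidates):
    """Return candidate answers sorted by similarity to the question, best first."""
    q = embed(question)
    scored = [(cosine(q, embed(c)), c) for c in candidates]
    return [c for _, c in sorted(scored, key=lambda x: -x[0])]

question = "when was the transformer architecture introduced"
candidates = [
    "The cat sat on the mat.",
    "The transformer architecture was introduced in 2017.",
    "Summarization condenses a document into a short text.",
]
print(rank_candidates(question, candidates)[0])
```

In the thesis's setting, the same ranking step would be driven by scores from a transformer encoder applied to each question-candidate pair rather than by lexical overlap.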