Automatic Question Generation using Sequence to Sequence RNN Model
Alina K. Jayarajan1, Ajumol P. A.2, Ani Sunny3

1Alina K. Jayarajan*, Department of Computer Science and Engineering, Mar Athanasius College of Engineering, Kothamangalam, Kerala, India.
2Ajumol P. A., Department of Computer Science and Engineering, Mar Athanasius College of Engineering, Kothamangalam, Kerala, India.
3Ani Sunny, Department of Computer Science and Engineering, Mar Athanasius College of Engineering, Kothamangalam, Kerala, India.
Manuscript received on February 10, 2020. | Revised Manuscript received on February 24, 2020. | Manuscript published on March 10, 2020. | PP: 1799-1803 | Volume-9 Issue-5, March 2020. | Retrieval Number: E2675039520/2020©BEIESP | DOI: 10.35940/ijitee.E2675.039520
© The Authors. Blue Eyes Intelligence Engineering and Sciences Publication (BEIESP). This is an open access article under the CC BY-NC-ND license (http://creativecommons.org/licenses/by-nc-nd/4.0/)

Abstract: Automatic Question Generation (AQG) has recently received growing attention in natural language processing (NLP). AQG attempts to create questions from a text passage such that certain sub-spans of the passage answer the generated questions. Traditional methods predominantly use rigid heuristic rules to turn a sentence into related questions. In this research, we propose a neural encoder-decoder model to produce meaningful and complex questions from natural-language sentences. We apply an attention-based sequence-to-sequence learning paradigm to the task and analyze the impact of encoding at the sentence level versus the paragraph level. Information retrieval and NLP are the core components of AQG. It incorporates production rules, recurrent neural network (RNN) based encoder-decoder sequence-to-sequence (seq2seq) models, and other intelligent techniques. RNNs are used in their long short-term memory (LSTM) variant for their ability to capture long-range dependencies. The proposed system focuses on generating factual WH-type questions.
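As a rough illustration of the attention-based seq2seq encoder-decoder architecture the abstract describes, the following is a minimal sketch in PyTorch. The vocabulary size, dimensions, dot-product attention variant, and all identifiers are illustrative assumptions, not the authors' published implementation.

import torch
import torch.nn as nn

class Seq2SeqQG(nn.Module):
    """Attention-based LSTM encoder-decoder (illustrative sketch)."""
    def __init__(self, vocab_size=10000, emb_dim=256, hid_dim=512):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, emb_dim)
        self.encoder = nn.LSTM(emb_dim, hid_dim, batch_first=True)
        self.decoder = nn.LSTM(emb_dim, hid_dim, batch_first=True)
        # Project [decoder state; attention context] to vocabulary logits.
        self.out = nn.Linear(hid_dim * 2, vocab_size)

    def forward(self, passage_ids, question_ids):
        # Encode the source passage into per-token hidden states.
        enc_out, state = self.encoder(self.embed(passage_ids))
        # Decode teacher-forced question tokens, seeded with encoder state.
        dec_out, _ = self.decoder(self.embed(question_ids), state)
        # Dot-product attention: each decoder step attends over the passage.
        weights = torch.softmax(
            torch.bmm(dec_out, enc_out.transpose(1, 2)), dim=-1)
        context = torch.bmm(weights, enc_out)
        return self.out(torch.cat([dec_out, context], dim=-1))

# Toy usage: a batch of 2 passages (20 tokens) and questions (8 tokens).
model = Seq2SeqQG()
passage = torch.randint(0, 10000, (2, 20))
question = torch.randint(0, 10000, (2, 8))
logits = model(passage, question)  # shape: (2, 8, 10000)
# Training would minimize cross-entropy against the question shifted one
# step, and inference would generate WH-questions token by token.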
Keywords: Long-Short Term Memory, Natural Language Processing, Recurrent Neural Networks
Scope of the Article: Neural Information Processing