
Analysis of Chatbot Data Generation using LSTM
Deepanshu Sharma1, Ayaan Samad2, Deepali Dev3

1Deepanshu Sharma, Student, ABES Engineering College, Ghaziabad, India.

2Ayaan Samad, Student, ABES Engineering College, Ghaziabad, India.

3Deepali Dev, Assistant Professor, ABES Engineering College, Ghaziabad, India.

Manuscript received on 08 April 2019 | Revised Manuscript received on 15 April 2019 | Manuscript Published on 26 April 2019 | PP: 601-603 | Volume-8 Issue-6S April 2019 | Retrieval Number: F61380486S19/19©BEIESP

© The Authors. Blue Eyes Intelligence Engineering and Sciences Publication (BEIESP). This is an open-access article under the CC-BY-NC-ND license (http://creativecommons.org/licenses/by-nc-nd/4.0/)

Abstract: Deep Neural Networks (DNNs) have produced admirable results on difficult learning tasks because of their powerful modeling capacity. Whenever large data sets are available for preparing the training set, DNNs work very well, as they can be used to process sequences (questions) and generate sequences (answers). In this paper, we work on an end-to-end approach built around a regular sequence-learning model that makes minimal presumptions about the sequence structure and makes use of our processed data set. Our method employs a multi-layered Long Short-Term Memory (LSTM) network together with an attention mechanism. Our experimental results are based on a particular set of chat implementations drawn from a Twitter data set and the Cornell Movie-Dialogs Corpus. Because the available amount of meaningful data is confined, the response time is limited; however, the LSTM had no difficulty in handling long sentences.
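The abstract describes a sequence-to-sequence chatbot built from stacked LSTM layers with attention. The following sketch is not the authors' code; it is a minimal, hedged illustration of that architecture in Keras, with assumed vocabulary size, embedding dimension, and hidden size, and with teacher forcing on the decoder input.

    # Minimal sketch (not the published implementation) of a multi-layered
    # LSTM encoder-decoder with attention, as outlined in the abstract.
    # VOCAB_SIZE, EMB_DIM, and HIDDEN are illustrative assumptions.
    import tensorflow as tf
    from tensorflow.keras import layers, Model

    VOCAB_SIZE = 8000   # assumed vocabulary built from the chat corpus
    EMB_DIM = 256       # assumed embedding size
    HIDDEN = 512        # assumed LSTM hidden size

    # Encoder: embeds the question tokens and runs two stacked LSTMs.
    enc_in = layers.Input(shape=(None,), name="question_tokens")
    enc_emb = layers.Embedding(VOCAB_SIZE, EMB_DIM)(enc_in)
    enc_seq = layers.LSTM(HIDDEN, return_sequences=True)(enc_emb)
    enc_seq, state_h, state_c = layers.LSTM(
        HIDDEN, return_sequences=True, return_state=True)(enc_seq)

    # Decoder: embeds the answer tokens (teacher forcing) and is
    # initialized from the final encoder state.
    dec_in = layers.Input(shape=(None,), name="answer_tokens")
    dec_emb = layers.Embedding(VOCAB_SIZE, EMB_DIM)(dec_in)
    dec_seq = layers.LSTM(HIDDEN, return_sequences=True)(
        dec_emb, initial_state=[state_h, state_c])

    # Dot-product attention: decoder states query the encoder states,
    # and the resulting context is concatenated before the output layer.
    context = layers.Attention()([dec_seq, enc_seq])
    dec_concat = layers.Concatenate()([dec_seq, context])
    probs = layers.Dense(VOCAB_SIZE, activation="softmax")(dec_concat)

    model = Model([enc_in, dec_in], probs)
    model.compile(optimizer="adam", loss="sparse_categorical_crossentropy")
    model.summary()

In a setup like this, question and answer pairs from the conversational corpus would be tokenized to integer sequences, and the decoder target would be the answer sequence shifted by one position; the exact preprocessing used by the authors is not specified here.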

Keywords: Multi-Layered Long Short-Term Memory (LSTM), Attention Mechanism.
Scope of the Article: Computational Techniques in Civil Engineering