
Conversational Chatbot with Attention Model
Purushothaman Srikanth1, E.Ushitaasree2, G. Paavai Anand3

1Purushothaman Srikanth is currently studying B.Tech Computer Science and Engineering at SRM Institute of Science and Technology, Ramapuram Part - Vadapalani Campus, Chennai.
2E. Ushitaasree is currently studying B.Tech Computer Science and Engineering at SRM Institute of Science and Technology, Ramapuram Part - Vadapalani Campus, Chennai.
3Dr. G. Paavai Anand is an Assistant Professor at SRM Institute of Science and Technology, Ramapuram, Vadapalani Campus, Chennai.

Manuscript received on November 18, 2019. | Revised Manuscript received on November 25, 2019. | Manuscript published on December 10, 2019. | PP: 3537-3540 | Volume-9 Issue-2, December 2019. | Retrieval Number: B6316129219/2019©BEIESP | DOI: 10.35940/ijitee.B6316.129219
© The Authors. Blue Eyes Intelligence Engineering and Sciences Publication (BEIESP). This is an open access article under the CC-BY-NC-ND license (http://creativecommons.org/licenses/by-nc-nd/4.0/)

Abstract: A Chatbot is Artificial Intelligence (AI) software that can simulate a conversation between two humans. This Chatbot is based on the state-of-the-art Transformer architecture, which is built on the attention mechanism. The Transformer is a highly efficient sequence-to-sequence model. Machine translation is, at its core, simply a task of mapping one sentence to another; since sentences consist of words, this amounts to mapping one sequence to a different sequence. Beam search and byte-pair encoding are the algorithms used in our model for heuristic searching in the decoder units. A combination of unsupervised prediction tasks is carried out by fine-tuning with a multi-task objective every time the user starts a conversation. The model takes on a new persona, chosen at random, for every new session and communicates as that persona. Improving on perplexity through its ability to understand and generate natural language, the model achieves a Hits@1 score as high as 80.9 percent.
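For illustration only, the sketch below shows scaled dot-product attention, the core operation of the Transformer's attention mechanism mentioned in the abstract. It is a minimal NumPy example written for this summary, not the authors' implementation; the function name, toy shapes, and self-attention usage are illustrative assumptions.

# Minimal sketch (not the authors' code): scaled dot-product attention.
import numpy as np

def scaled_dot_product_attention(Q, K, V):
    """Q, K, V: arrays of shape (seq_len, d_k); returns attention-weighted values."""
    d_k = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)                      # query-key similarity, scaled
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)       # softmax over the keys
    return weights @ V                                   # weighted sum of the values

# Toy usage: self-attention over 3 tokens with 4-dimensional representations.
x = np.random.rand(3, 4)
out = scaled_dot_product_attention(x, x, x)
print(out.shape)                                         # (3, 4)

In a full Transformer decoder, such attention layers are stacked and combined with beam search over the output vocabulary (built with byte-pair encoding) to generate the chatbot's replies.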
Keywords: Attention Mechanism, Transformer, Conversation Chatbot, Beam Search, Byte-pair Encoding
Scope of the Article: Aggregation, Integration, and Transformation