AI-Chatbot

A chatbot built with NLP for question answering, designed to act as a 24x7 support assistant for websites.
A pretrained BERT model from the Transformers library is used for question answering.
The project is built end to end with Flask and provides both a chatbot and a voicebot, depending on the user's needs. An additional admin section is included to manage and customize the chatbot's data.

Modules and libraries used:

Transformers - BertForQuestionAnswering and BertTokenizer (see the sketch after this list)
PyTorch
Flask
Pickle
Bootstrap (frontend development)
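
For reference, below is a minimal sketch of the question-answering core built from these libraries. It assumes the bert-large-uncased-whole-word-masking-finetuned-squad checkpoint; the exact checkpoint and wiring in main.py may differ.

```python
# A minimal sketch of extractive QA with BERT; the checkpoint name is an
# assumption, not necessarily the one used in main.py.
import torch
from transformers import BertForQuestionAnswering, BertTokenizer

MODEL_NAME = "bert-large-uncased-whole-word-masking-finetuned-squad"
tokenizer = BertTokenizer.from_pretrained(MODEL_NAME)
model = BertForQuestionAnswering.from_pretrained(MODEL_NAME)

def answer(question: str, context: str) -> str:
    # Encode the question and the supporting context as one sequence pair.
    inputs = tokenizer(question, context, return_tensors="pt", truncation=True)
    with torch.no_grad():
        outputs = model(**inputs)
    # Most likely start/end token positions of the answer span in the context.
    start = torch.argmax(outputs.start_logits)
    end = torch.argmax(outputs.end_logits) + 1
    return tokenizer.decode(inputs["input_ids"][0][start:end], skip_special_tokens=True)

print(answer("When is support available?",
             "Our support team is available 24x7 through the website chatbot."))
```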

How to Use the Chatbot

  • To add the chatbot to your website, pull main.py; it contains the model logic of the project.
  • Import the model into your backend (Flask in our case).
  • Adapt the HTML to the theme of your website, along with the dashboard page for managing and customizing the chatbot's data.
  • Choose the appropriate bot from the HTML files, since the model has two modes of use: Chatbot and Voicebot.
  • Add the chosen HTML file to your code and pick what you need from backend.py, where Flask is used.
  • In main.py, uncomment the pickle-creation code (and comment out the rest) to serialize the model and tokenizer to a pickle file (approx. 1.4 GB); a sketch follows this list.
  • The config.json file contains the parameters of the project.
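
As a rough guide, the sketch below shows the pickle-creation step and a minimal Flask endpoint that answers questions from the pickled model. The file names (model.pkl, tokenizer.pkl), the /chat route, and the checkpoint are assumptions for illustration; the actual names live in main.py and backend.py.

```python
# A minimal sketch, not the repository's exact code. File names, the /chat
# route, and the checkpoint are assumptions for illustration.
import pickle

import torch
from flask import Flask, jsonify, request
from transformers import BertForQuestionAnswering, BertTokenizer

MODEL_NAME = "bert-large-uncased-whole-word-masking-finetuned-squad"  # assumed checkpoint

def create_pickles():
    # One-time step (the code you uncomment in main.py): download the model
    # and tokenizer, then serialize them to disk (~1.4 GB total).
    with open("model.pkl", "wb") as f:
        pickle.dump(BertForQuestionAnswering.from_pretrained(MODEL_NAME), f)
    with open("tokenizer.pkl", "wb") as f:
        pickle.dump(BertTokenizer.from_pretrained(MODEL_NAME), f)

# Backend side (backend.py): load the pickles once at startup and serve /chat.
app = Flask(__name__)
with open("model.pkl", "rb") as f:
    model = pickle.load(f)
with open("tokenizer.pkl", "rb") as f:
    tokenizer = pickle.load(f)

@app.route("/chat", methods=["POST"])
def chat():
    question = request.json["question"]
    context = request.json["context"]  # e.g. site FAQ text managed via the admin dashboard
    inputs = tokenizer(question, context, return_tensors="pt", truncation=True)
    with torch.no_grad():
        outputs = model(**inputs)
    start = torch.argmax(outputs.start_logits)
    end = torch.argmax(outputs.end_logits) + 1
    answer = tokenizer.decode(inputs["input_ids"][0][start:end], skip_special_tokens=True)
    return jsonify({"answer": answer})

if __name__ == "__main__":
    app.run(debug=True)
```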

Additional features / Conclusion

  • An additional feature that could be added is letting the admin choose the chatbot's layout directly from the dashboard.
  • A database connection is pre-wired for users who want to attach a database to the project.
  • The RoBERTa model from the Transformers library could be used to further improve the results.
  • Open to suggestions. Feel free to pull the repository for your needs.