deepmipt/DeepPavlov
An open source library for deep learning end-to-end dialog systems and chatbots.
repo name | deepmipt/DeepPavlov |
repo link | https://github.com/deepmipt/DeepPavlov |
homepage | https://deeppavlov.ai |
language | Python |
size (curr.) | 35292 kB |
stars (curr.) | 4037 |
created | 2017-11-17 |
license | Apache License 2.0 |
DeepPavlov is an open-source conversational AI library built on TensorFlow and Keras.
DeepPavlov is designed for
- development of production-ready chat-bots and complex conversational systems,
- research in the area of NLP and, particularly, of dialog systems.
Quick Links
- Demo demo.deeppavlov.ai
- Documentation docs.deeppavlov.ai
- Model List docs:features/
- Contribution Guide docs:contribution_guide/
- Issues github/issues/
- Forum forum.deeppavlov.ai
- Blogs medium.com/deeppavlov
- Tutorials examples/ and extended colab tutorials
- Docker Hub hub.docker.com/u/deeppavlov/
- Docker Images Documentation docs:docker-images/
Please leave us your feedback on how we can improve the DeepPavlov framework.
Models
Named Entity Recognition | Slot filling
Intent/Sentence Classification | Question Answering over Text (SQuAD)
Sentence Similarity/Ranking | TF-IDF Ranking
Morphological tagging | Automatic Spelling Correction
Skills
Goal(Task)-oriented Bot | Seq2seq Goal-Oriented bot
Open Domain Question Answering | eCommerce Bot
Frequently Asked Questions Answering | Pattern Matching
Embeddings
BERT embeddings for the Russian, Polish, Bulgarian, Czech, and informal English
ELMo embeddings for the Russian language
FastText embeddings for the Russian language
Auto ML
Tuning Models with Evolutionary Algorithm
Integrations
REST API | Socket API | Yandex Alice
Telegram | Microsoft Bot Framework
Installation
- We support Linux and Windows platforms, Python 3.6 and Python 3.7.
  - Python 3.5 is not supported!
  - Installation for Windows requires Git (for example, git) and Visual Studio 2015/2017 with C++ build tools installed!
- Create and activate a virtual environment:

  Linux

  python -m venv env
  source ./env/bin/activate

  Windows

  python -m venv env
  .\env\Scripts\activate.bat
- Install the package inside the environment:

  pip install deeppavlov
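A quick, optional sanity check that the package imports inside the activated environment (a minimal sketch; it only verifies the import, nothing model-specific):

import deeppavlov
print(deeppavlov.__path__)  # prints the location of the installed package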
QuickStart
DeepPavlov ships a number of great pre-trained NLP models. Each model is determined by its config file.
The list of models is available on the doc page or in deeppavlov.configs (Python):

from deeppavlov import configs
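For example, a minimal sketch of resolving one of these attributes (ner_ontonotes_bert_mult is the config used later in this README; the set of available names depends on your DeepPavlov version):

from deeppavlov import configs

# each attribute of configs resolves to the path of a bundled JSON config file
print(configs.ner.ner_ontonotes_bert_mult)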
Once you have decided on a model (and its config file), there are two ways to train, evaluate, and infer it:
- via Command line interface (CLI) and
- via Python.
GPU requirements
To run supported DeepPavlov models on GPU you should have CUDA 10.0 installed on your host machine and TensorFlow with GPU support (tensorflow-gpu) installed in your Python environment. The currently supported TensorFlow version is 1.14.0. Run

pip install tensorflow-gpu==1.14.0

before installing a model's package requirements to install the supported tensorflow-gpu version.
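To verify the GPU setup from inside the environment, a quick check using the TensorFlow 1.x test API (a minimal sketch, assuming tensorflow-gpu==1.14.0 is already installed):

import tensorflow as tf

# should print True when CUDA 10.0 and tensorflow-gpu are configured correctly
print(tf.test.is_gpu_available())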
Before choosing an interface, install the model's package requirements (CLI):
python -m deeppavlov install <config_path>
where <config_path> is the path to the chosen model's config file (e.g. deeppavlov/configs/ner/slotfill_dstc2.json) or just its name without the .json extension (e.g. slotfill_dstc2).
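For example, to install the requirements of the slot-filling config mentioned above:

python -m deeppavlov install slotfill_dstc2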
Command line interface (CLI)
To get predictions from a model interactively through CLI, run
python -m deeppavlov interact <config_path> [-d]
-d downloads required data: pretrained model files and embeddings (optional).
You can train it in the same simple way:
python -m deeppavlov train <config_path> [-d]
The dataset will be downloaded regardless of whether the -d flag was passed.
To train on your own data, you need to modify the dataset reader path in the train config (see docs). The data format is specified in the corresponding model's doc page.
There are even more actions you can perform with configs:
python -m deeppavlov <action> <config_path> [-d]
<action> can be
- download to download the model's data (same as -d),
- train to train the model on the data specified in the config file,
- evaluate to calculate metrics on the same dataset,
- interact to interact via CLI,
- riseapi to run a REST API server (see doc),
- telegram to run as a Telegram bot (see doc),
- msbot to run a Microsoft Bot Framework server (see doc),
- predict to get predictions for samples from stdin or from <file_path> if -f <file_path> is specified.
<config_path> specifies the path (or name) of the model's config file, and -d downloads the required data.
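For example, combining the actions above with the slotfill_dstc2 config named earlier:

python -m deeppavlov evaluate slotfill_dstc2 -d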
Python
To get predictions from a model interactively through Python, run
from deeppavlov import build_model
model = build_model(<config_path>, download=True)
# get predictions for 'input_text1', 'input_text2'
model(['input_text1', 'input_text2'])
where download=True downloads required data from the web (pretrained model files and embeddings; optional), and <config_path> is the path to the chosen model's config file (e.g. "deeppavlov/configs/ner/ner_ontonotes_bert_mult.json") or a deeppavlov.configs attribute (e.g. deeppavlov.configs.ner.ner_ontonotes_bert_mult without quotation marks).
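For instance, a minimal sketch with the multilingual NER config named above (the example sentence and the unpacking into tokens and tags are assumptions reflecting the typical output of NER configs, not something this README guarantees):

from deeppavlov import build_model, configs

# the first run with download=True fetches pretrained model files and may take a while
ner = build_model(configs.ner.ner_ontonotes_bert_mult, download=True)

# NER configs typically return a batch of token lists and a batch of tag lists
tokens, tags = ner(['Bob Ross lived in Florida'])
print(list(zip(tokens[0], tags[0])))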
You can train it in the same simple way:
from deeppavlov import train_model
model = train_model(<config_path>, download=True)
download=True downloads the pretrained model, so the pretrained model will first be loaded and then trained further (optional). The dataset will be downloaded regardless of the download flag.
To train on your own data, you need to modify the dataset reader path in the train config (see docs). The data format is specified in the corresponding model's doc page.
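A hedged sketch of how this can look in Python: load a config as a dict, point its dataset reader at your own data, then train. The data_path field and the ./my_data directory are placeholders; the exact fields depend on the chosen config.

from deeppavlov import train_model
from deeppavlov.core.common.file import read_json

config = read_json('deeppavlov/configs/ner/slotfill_dstc2.json')
config['dataset_reader']['data_path'] = './my_data'  # placeholder path to your own dataset
model = train_model(config)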
You can also calculate metrics on the dataset specified in your config file:
from deeppavlov import evaluate_model
model = evaluate_model(<config_path>, download=True)
Integrations with various messengers are also available; see the Telegram Bot doc page and others in the Integrations section for more info.
Breaking Changes
Breaking changes in version 0.7.0
- In the dialog logger config file dialog_logger_config.json, the agent_name parameter was renamed to logger_name and its default value was changed
- Agent, Skill, eCommerce Bot and Pattern Matching classes were moved to deeppavlov.deprecated
- AIML Skill, RASA Skill, Yandex Alice, Amazon Alexa, Microsoft Bot Framework and Telegram integration interfaces were changed
- /start and /help Telegram messages were moved from models_info.json to server_config.json
- risesocket request and response format was changed
- riseapi and risesocket model-specific properties parametrization was changed
Breaking changes in version 0.6.0
- REST API:
  - all models' default endpoints were renamed to /model
  - by default, model argument names are taken from the chainer.in configuration parameter instead of pre-set names from a settings file
  - the swagger api endpoint moved from /apidocs to /docs
- when using "max_proba": true in a proba2labels component for classification, it will return a single label for every batch element instead of a list. One can set "top_n": 1 to get batches of single-item lists as before (see the sketch below)
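For illustration, a hedged sketch of a proba2labels component entry that keeps the previous list-per-sample output via top_n; the in/out names here are placeholders, not taken from a specific config:

{
  "class_name": "proba2labels",
  "top_n": 1,
  "in": ["y_pred_probas"],
  "out": ["y_pred_labels"]
}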
Breaking changes in version 0.5.0
- dependencies have to be reinstalled for most pipeline configurations
- models depending on tensorflow require CUDA 10.0 to run on GPU instead of CUDA 9.0
- scikit-learn models have to be redownloaded or retrained
Breaking changes in version 0.4.0!
- the default target variable name for neural evolution was changed from MODELS_PATH to MODEL_PATH.
Breaking changes in version 0.3.0!
- the component option fit_on_batch in configuration files was removed and replaced with adaptive usage of the fit_on parameter.
Breaking changes in version 0.2.0!
- the utils module was moved from the repository root into the deeppavlov module
- the ms_bot_framework_utils, server_utils and telegram_utils modules were renamed to ms_bot_framework, server and telegram correspondingly
- the metric functions exact_match and squad_f1 were renamed to squad_v2_em and squad_v2_f1
- dashes in config names were replaced with underscores
Breaking changes in version 0.1.0!
- As of version 0.1.0, all models, embeddings and other downloaded data for the provided configurations are by default downloaded to the .deeppavlov directory in the current user's home directory. This can be changed on a per-model basis by modifying the ROOT_PATH variable or related fields one by one in the model's configuration file.
- In configuration files, for all features/models, dataset readers and iterators, the "name" and "class" fields are combined into the "class_name" field.
- deeppavlov.core.commands.infer.build_model_from_config() was renamed to build_model and can be imported from the deeppavlov module directly.
- The way arguments are passed to metrics functions during training and evaluation was changed and documented.
License
DeepPavlov is Apache 2.0-licensed.
The Team
DeepPavlov is built and maintained by the Neural Networks and Deep Learning Lab at MIPT within the iPavlov project.