ourownstory/neural_prophet
NeuralProphet - A simple forecasting model based on Neural Networks in PyTorch
repo name | ourownstory/neural_prophet |
repo link | https://github.com/ourownstory/neural_prophet |
homepage | https://ourownstory.github.io/neural_prophet/ |
language | Jupyter Notebook |
size (curr.) | 46975 kB |
stars (curr.) | 715 |
created | 2020-05-04 |
license | MIT License |
Please note that the project is still in its beta phase. Please report any issues you encounter or suggestions you have; we will do our best to address them quickly. Contributions are also highly welcome!
NeuralProphet
A Neural Network based Time-Series model, inspired by Facebook Prophet and AR-Net, built on PyTorch.
For a visual introduction to NeuralProphet, view the presentation given at the 40th International Symposium on Forecasting.
Documentation
We are working on a documentation page. Contributions welcome!
Use
Install
You can now install neuralprophet directly with pip:
pip install neuralprophet
If you plan to use the package in a Jupyter notebook, we recommend installing the ‘live’ version:
pip install neuralprophet[live]
This will allow you to enable plot_live_loss in the fit function to get a live plot of the training (and validation) loss.
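For example, a minimal sketch (assuming a pandas dataframe df with ds and y columns, and that the live extra is installed):
from neuralprophet import NeuralProphet

m = NeuralProphet()
# plot_live_loss=True draws a live plot of the training loss during fitting
metrics = m.fit(df, freq="D", plot_live_loss=True)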
If you would like the most up-to-date version, you can instead install directly from GitHub:
git clone <copied link from github>
cd neural_prophet
pip install .
Please note that NeuralProphet requires Python >= 3.7 due to its use of @dataclass.
Basic example
After importing the package, you can use NeuralProphet in your code:
from neuralprophet import NeuralProphet
m = NeuralProphet()
metrics = m.fit(df, freq="D")
future = m.make_future_dataframe(df, periods=30)
forecast = m.predict(future)
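Here, df is expected to be a pandas DataFrame with a ds column (timestamps) and a y column (the values to forecast). As a minimal, illustrative sketch:
import pandas as pd

# a toy dataframe in the expected ds/y format (illustration only)
df = pd.DataFrame({
    "ds": pd.date_range("2020-01-01", periods=100, freq="D"),
    "y": [float(i % 7) for i in range(100)],
})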
You can visualize your results with the inbuilt plotting functions:
fig_forecast = m.plot(forecast)
fig_components = m.plot_components(forecast)
fig_model = m.plot_parameters()
Model Features
- Autocorrelation modelling through AR-Net
- Piecewise linear trend
- Fourier-term seasonality at different periods such as yearly, weekly, daily, and hourly
- Lagged regressors
- Future regressors
- Holidays & special events
- Sparsity of coefficients through regularization
- Plotting of forecast components, model coefficients, and final forecasts (see the configuration sketch below)
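Several of these features are enabled through the model constructor and add_* methods. A rough, non-exhaustive sketch (the parameter and method names below are assumptions based on the current API and may change while the project is in beta):
from neuralprophet import NeuralProphet

# df is assumed to hold ds/y plus any regressor columns referenced below
m = NeuralProphet(
    n_lags=14,                # autoregression over the last 14 observations (AR-Net)
    n_forecasts=7,            # predict 7 steps ahead
    yearly_seasonality=True,  # Fourier-term seasonality
    weekly_seasonality=True,
)
m.add_lagged_regressor(names="temperature")   # lagged regressor (column in df)
m.add_future_regressor(name="price")          # future-known regressor
m.add_country_holidays(country_name="US")     # holidays & special events
metrics = m.fit(df, freq="D")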
Contribute
Dev Install
Before starting, it’s a good idea to first create and activate a new virtual environment:
python3 -m venv <path-to-new-env>
source <path-to-new-env>/bin/activate
Now you can install neuralprophet:
git clone <copied link from github>
cd neural_prophet
pip install -e .[dev]
neuralprophet_dev_setup
Notes:
- The last command runs the dev-setup script, which installs the appropriate git hooks for Black (pre-commit) and unit tests (pre-push); see below for running these checks manually.
- Including the optional -e flag will install neuralprophet in “editable” mode, meaning that instead of copying the files into your virtual environment, a symlink will be created to the files where they are.
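If you want to run the same checks manually from the repo root, something along these lines should work (a sketch; the test layout and runner are assumptions):
black .
python -m pytest tests/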
Style
We use Black, the uncompromising code formatter, so there is no need to worry about style. Beyond that, where reasonable (for example, for docstrings), we follow the Google Python Style Guide.
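For instance, a Google-style docstring on a hypothetical helper (not taken from the codebase) would look like:
def make_lags(df, n_lags):
    """Add lagged copies of the target column to a dataframe.

    Args:
        df (pd.DataFrame): Input data containing a 'y' column.
        n_lags (int): Number of lagged columns to create.

    Returns:
        pd.DataFrame: Copy of df with added 'y_lag_1' ... 'y_lag_<n_lags>' columns.
    """
    df = df.copy()
    for i in range(1, n_lags + 1):
        df[f"y_lag_{i}"] = df["y"].shift(i)
    return df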
As for Git practices, please follow the steps described at Swiss Cheese for how to git-rebase-squash when working on a forked repo.
Changelogs
Coming up Next
For details, please view the Development Timeline.
The next versions of NeuralProphet are expected to cover a set of new exciting features:
- More robust training across different datasets.
- Logistic growth for trend component.
- Uncertainty estimation of individual forecast components as well as the final forecasts.
- Support for panel data by building global forecasting models.
- Incorporate time series featurization for improved forecast accuracy.
0.2.7 (next release)
- soft-start regularization
- confidence interval for forecast (as quantiles via pinball loss)
0.2.6 (current release)
- Auto-set batch_size and epochs
- random-seed util
- continued removal of AttrDict
- fix to index issue in make_future_dataframe
0.2.5
- documentation pages added
- 1cycle policy
- learning rate range test
- tutorial notebooks: trend, events
- fixes to plotting, changepoints
Authors
The project effort is led by Oskar Triebe (Stanford University), advised by Nikolay Laptev (Facebook, Inc.) and Ram Rajagopal (Stanford University), and has been partially funded by Total S.A. The project has been developed in close collaboration with Hansika Hewamalage, who is advised by Christoph Bergmeir (Monash University).
Contributors
This is a list of NeuralProphet’s significant contributors; it does not necessarily include everyone who has contributed code. For the full list of contributors, see the revision history in source control.
- Oskar Triebe
- Hansika Hewamalage
- Nikolay Laptev
- Riley Dehaan
- Gonzague Henri
- Ram Rajagopal
- Christoph Bergmeir
- Italo Lima
- Caner Komurlu
- Rodrigo Riveraca
If you are interested in joining the project, please feel free to reach out to me (Oskar) - you can find my email in the AR-Net paper.