December 12, 2020


ZHKKKe/MODNet

A Trimap-Free Solution for Portrait Matting in Real Time under Changing Scenes

repo name: ZHKKKe/MODNet
repo link: https://github.com/ZHKKKe/MODNet
homepage:
language: Python
size (curr.): 38488 kB
stars (curr.): 895
created: 2020-11-23
license:

News

WebCam Matting Demo

We provide two real-time portrait video matting demos based on a WebCam. When using the demos, you can move the WebCam around freely. If you are on Ubuntu, we recommend trying the offline demo for a higher fps. Otherwise, you can access the online Colab demo.
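
To give a feel for what such a demo does, here is a minimal, unofficial sketch of a WebCam matting loop. It assumes a hypothetical TorchScript export `modnet.pt` of a pretrained MODNet that maps a normalized RGB tensor to an alpha matte; the file name, input size, and output shape are assumptions, not the repo's actual interface.

```python
# Illustrative sketch only, not the official demo.
# Assumes `modnet.pt` is a hypothetical TorchScript export of a pretrained
# MODNet that maps a normalized RGB tensor (1, 3, 512, 512) in [-1, 1]
# to an alpha matte tensor (1, 1, 512, 512) in [0, 1].
import cv2
import numpy as np
import torch

modnet = torch.jit.load('modnet.pt').eval()
cap = cv2.VideoCapture(0)  # default WebCam

with torch.no_grad():
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        # Preprocess: BGR -> RGB, resize, scale to [-1, 1], NCHW layout.
        rgb = cv2.cvtColor(frame, cv2.COLOR_BGR2RGB)
        small = cv2.resize(rgb, (512, 512))
        x = torch.from_numpy(small).float().permute(2, 0, 1) / 255.0
        x = (x - 0.5) / 0.5
        # Predict the alpha matte and resize it back to the frame size.
        alpha = modnet(x.unsqueeze(0))[0, 0].numpy()
        alpha = cv2.resize(alpha, (frame.shape[1], frame.shape[0]))[..., None]
        # Composite the person onto a green background for display.
        green = np.zeros_like(frame, dtype=np.float32)
        green[:, :] = (0, 255, 0)
        comp = alpha * frame + (1.0 - alpha) * green
        cv2.imshow('MODNet WebCam matting (sketch)', comp.astype(np.uint8))
        if cv2.waitKey(1) & 0xFF == ord('q'):
            break

cap.release()
cv2.destroyAllWindows()
```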

Image Matting Demo

We provide an online Colab demo for portrait image matting.
It lets you upload portrait images and predict, visualize, and download the corresponding alpha mattes.
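
As a rough illustration of how a downloaded alpha matte can be used, the snippet below composites the portrait onto a plain background with the standard compositing equation; the file names are placeholders, not outputs of the demo.

```python
# Placeholder file names: `portrait.jpg` is the uploaded image and
# `alpha.png` is the matte downloaded from the demo.
import cv2
import numpy as np

fg = cv2.imread('portrait.jpg').astype(np.float32)
alpha = cv2.imread('alpha.png', cv2.IMREAD_GRAYSCALE).astype(np.float32) / 255.0
alpha = alpha[..., None]  # (H, W, 1) so it broadcasts over the color channels

bg = np.full(fg.shape, 255.0, dtype=np.float32)  # plain white background
comp = alpha * fg + (1.0 - alpha) * bg           # composite = a*fg + (1-a)*bg
cv2.imwrite('composite.png', comp.astype(np.uint8))
```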

You can also use this WebGUI (hosted on Gradio) to run portrait image matting directly from your browser, without writing any code! The source code of this demo is coming soon.
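
Until the official source is released, a WebGUI of this kind can be sketched in a few lines of Gradio; `predict_alpha` below is a hypothetical placeholder for a real MODNet forward pass, not code from this repository.

```python
# Hypothetical sketch, not the official WebGUI source.
import gradio as gr
import numpy as np

def predict_alpha(image: np.ndarray) -> np.ndarray:
    """Placeholder for a real MODNet forward pass; returns a dummy matte."""
    return np.zeros(image.shape[:2], dtype=np.uint8)

gr.Interface(fn=predict_alpha,
             inputs="image",
             outputs="image",
             title="MODNet Portrait Matting").launch()
```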

TO DO

  • Release training code (scheduled in Jan. 2021)
  • Release PPM-100 validation benchmark (scheduled in Feb. 2021)

License

This project is released under the Creative Commons Attribution-NonCommercial-ShareAlike 4.0 (CC BY-NC-SA 4.0) license.

Acknowledgement

We thank the City University of Hong Kong and SenseTime for their support of this project.
We thank the Gradio team for their contributions to building the demos.

Citation

If this work helps your research, please consider citing:

@article{MODNet,
  author  = {Zhanghan Ke and Kaican Li and Yurou Zhou and Qiuhua Wu and Xiangyu Mao and Qiong Yan and Rynson W.H. Lau},
  title   = {Is a Green Screen Really Necessary for Real-Time Portrait Matting?},
  journal = {ArXiv},
  volume  = {abs/2011.11961},
  year    = {2020},
}

Contact

This project is currently maintained by Zhanghan Ke (@ZHKKKe).
If you have any questions, please feel free to contact kezhanghan@outlook.com.
