June 19, 2019

hibayesian/awesome-automl-papers

A curated list of automated machine learning papers, articles, tutorials, slides and projects

repo name hibayesian/awesome-automl-papers
repo link https://github.com/hibayesian/awesome-automl-papers
size (curr.) 17561 kB
stars (curr.) 2299
created 2017-11-20
license Apache License 2.0

Awesome-AutoML-Papers

Awesome-AutoML-Papers is a curated list of automated machine learning papers, articles, tutorials, slides and projects. Star this repository to keep abreast of the latest developments in this booming research field. Thanks to everyone who has contributed to this project, and you are welcome to join us as a contributor.

What is AutoML?

Automated Machine Learning (AutoML) provides methods and processes to make Machine Learning available for non-Machine Learning experts, to improve efficiency of Machine Learning and to accelerate research on Machine Learning.

Machine Learning (ML) has achieved considerable successes in recent years and an ever-growing number of disciplines rely on it. However, this success crucially relies on human machine learning experts to perform the following tasks:

  • Preprocess the data,
  • Select appropriate features,
  • Select an appropriate model family,
  • Optimize model hyperparameters,
  • Postprocess machine learning models,
  • Critically analyze the results obtained.

As the complexity of these tasks is often beyond non-ML-experts, the rapid growth of machine learning applications has created a demand for off-the-shelf machine learning methods that can be used easily and without expert knowledge. We call the resulting research area that targets progressive automation of machine learning AutoML. As a new sub-area of machine learning, AutoML has attracted growing attention not only in machine learning itself but also in computer vision, natural language processing and graph computing.

There is no formal definition of AutoML. Judging from the descriptions in most papers, the basic procedure of AutoML is an iterative search loop: define a search space of candidate pipelines or configurations, let an optimizer propose candidates, evaluate each candidate on the data, and feed the results back to the optimizer until a budget is exhausted.
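
In code, this loop can be as simple as random search over a small configuration space. The sketch below is only an illustration and assumes scikit-learn is installed; the candidate models, hyperparameter ranges and evaluation budget are arbitrary choices rather than part of any particular AutoML system.

```python
# Minimal sketch of the basic AutoML loop: define a search space, let an
# optimizer propose candidate configurations, evaluate them, keep the best.
# Plain random search stands in for the optimizer.
import random

from sklearn.datasets import load_breast_cancer
from sklearn.ensemble import RandomForestClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

# 1. Search space: model family plus hyperparameters (illustrative choices).
SEARCH_SPACE = [
    lambda: LogisticRegression(C=10 ** random.uniform(-3, 3), max_iter=5000),
    lambda: RandomForestClassifier(
        n_estimators=random.randint(10, 300),
        max_depth=random.choice([None, 3, 5, 10]),
    ),
]

X, y = load_breast_cancer(return_X_y=True)

best_score, best_model = -1.0, None
for _ in range(20):                                    # 2. Propose candidates.
    model = random.choice(SEARCH_SPACE)()
    score = cross_val_score(model, X, y, cv=3).mean()  # 3. Evaluate.
    if score > best_score:                             # 4. Feed back / keep best.
        best_score, best_model = score, model

print(f"best CV accuracy: {best_score:.3f}")
print(f"best configuration: {best_model}")
```

Production AutoML systems keep this same propose-evaluate-update skeleton but replace random sampling with the smarter search strategies collected below, such as Bayesian optimization, evolutionary algorithms and reinforcement learning.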

AutoML approaches are already mature enough to rival and sometimes even outperform human machine learning experts. Put simply, AutoML can lead to improved performance while saving substantial amounts of time and money, as machine learning experts are both hard to find and expensive. As a result, commercial interest in AutoML has grown dramatically in recent years, and several major tech companies and start-ups are now developing their own AutoML systems. A comparison of some of them is summarized in the following table.

Company AutoFE HPO NAS
4paradigm ×
Alibaba × ×
Baidu × ×
Google
H2O.ai ×
Microsoft ×
RapidMiner ×
Tencent × ×
Transwarp

Awesome-AutoML-Papers includes up-to-date overviews of the bread-and-butter techniques used in AutoML:

  • Automated Data Cleaning (Auto Clean)
  • Automated Feature Engineering (Auto FE)
  • Hyperparameter Optimization (HPO)
  • Meta-Learning
  • Neural Architecture Search (NAS)

Table of Contents

  • Papers
  • Tutorials
  • Blog
  • Books
  • Projects
  • Slides
  • Acknowledgement
  • Contact & Feedback
  • Licenses

Papers

Surveys

  • 2019 | AutoML: A Survey of the State-of-the-Art | Xin He, et al. | arXiv | PDF
  • 2019 | Survey on Automated Machine Learning | Marc Zoeller, Marco F. Huber | arXiv | PDF
  • 2019 | Automated Machine Learning: State-of-The-Art and Open Challenges | Radwa Elshawi, et al. | arXiv | PDF
  • 2018 | Taking Human out of Learning Applications: A Survey on Automated Machine Learning | Quanming Yao, et al. | arXiv | PDF

Automated Feature Engineering

  • Expand Reduce

    • 2017 | AutoLearn — Automated Feature Generation and Selection | Ambika Kaul, et al. | ICDM | PDF
    • 2017 | One button machine for automating feature engineering in relational databases | Hoang Thanh Lam, et al. | arXiv | PDF
    • 2016 | Automating Feature Engineering | Udayan Khurana, et al. | NIPS | PDF
    • 2016 | ExploreKit: Automatic Feature Generation and Selection | Gilad Katz, et al. | ICDM | PDF
    • 2015 | Deep Feature Synthesis: Towards Automating Data Science Endeavors | James Max Kanter, Kalyan Veeramachaneni | DSAA | PDF
  • Hierarchical Organization of Transformations

    • 2016 | Cognito: Automated Feature Engineering for Supervised Learning | Udayan Khurana, et al. | ICDMW | PDF
  • Meta Learning

    • 2017 | Learning Feature Engineering for Classification | Fatemeh Nargesian, et al. | IJCAI | PDF
  • Reinforcement Learning

    • 2017 | Feature Engineering for Predictive Modeling using Reinforcement Learning | Udayan Khurana, et al. | arXiv | PDF
    • 2010 | Feature Selection as a One-Player Game | Romaric Gaudel, Michele Sebag | ICML | PDF
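
The expand-reduce papers above share a simple recipe: expand the feature set with candidate transformations of the raw columns, then reduce it by keeping only the transformations that improve a validation score. Below is a minimal sketch of that recipe; the transformations, model and dataset are illustrative stand-ins, not taken from any of the cited papers.

```python
# Expand-reduce automated feature engineering in miniature:
#   expand - generate candidate transformations of each raw column,
#   reduce - keep a transformation only if it improves the cross-validated score.
import numpy as np
from sklearn.datasets import load_diabetes
from sklearn.linear_model import Ridge
from sklearn.model_selection import cross_val_score

X, y = load_diabetes(return_X_y=True)

CANDIDATE_TRANSFORMS = {
    "square": lambda col: col ** 2,
    "log1p_abs": lambda col: np.log1p(np.abs(col)),
    "sqrt_abs": lambda col: np.sqrt(np.abs(col)),
}

def score(features):
    """Cross-validated R^2 of a simple ridge model on the candidate feature set."""
    return cross_val_score(Ridge(), features, y, cv=5).mean()

features = X.copy()
best = score(features)

for col_idx in range(X.shape[1]):                  # expand every raw column ...
    for transform in CANDIDATE_TRANSFORMS.values():
        candidate = np.column_stack([features, transform(X[:, col_idx])])
        candidate_score = score(candidate)
        if candidate_score > best:                 # ... reduce: keep improvements only
            features, best = candidate, candidate_score

print(f"kept {features.shape[1] - X.shape[1]} engineered features, CV R^2 = {best:.3f}")
```

The papers above elaborate this idea with much richer transformation sets, learned rankers to prune the candidate pool, and relational data.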

Architecture Search

  • Evolutionary Algorithms

    • 2019 | Evolutionary Neural AutoML for Deep Learning | Jason Liang, et al. | GECCO | PDF
    • 2017 | Large-Scale Evolution of Image Classifiers | Esteban Real, et al. | PMLR | PDF
    • 2017 | Simple and Efficient Architecture Search for Convolutional Neural Networks | Thomas Elsken, et al. | ICLR | PDF
    • 2002 | Evolving Neural Networks through Augmenting Topologies | Kenneth O. Stanley, Risto Miikkulainen | Evolutionary Computation | PDF
  • Meta Learning

    • 2016 | Learning to Optimize | Ke Li, Jitendra Malik | arXiv | PDF
  • Reinforcement Learning

    • 2018 | AMC: AutoML for Model Compression and Acceleration on Mobile Devices | Yihui He, et al. | ECCV | PDF
    • 2018 | Efficient Neural Architecture Search via Parameter Sharing | Hieu Pham, et al. | arXiv | PDF
    • 2017 | Neural Architecture Search with Reinforcement Learning | Barret Zoph, Quoc V. Le | ICLR | PDF
  • Transfer Learning

    • 2017 | Learning Transferable Architectures for Scalable Image Recognition | Barret Zoph, et al. | arXiv | PDF
  • Network Morphism

    • 2018 | Efficient Neural Architecture Search with Network Morphism | Haifeng Jin, et al. | arXiv | PDF
  • Continuous Optimization

    • 2019 | DARTS: Differentiable Architecture Search | Hanxiao Liu, et al. | ICLR | PDF
    • 2018 | Neural Architecture Optimization | Renqian Luo, et al. | arXiv | PDF
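
The architecture-search papers above differ mainly in how candidate networks are proposed (evolution, reinforcement learning, network morphism, or a continuous relaxation as in DARTS); around that sits the same propose-train-evaluate loop. As a baseline illustration only, the sketch below does nothing smarter than random search over tiny multi-layer perceptrons, with scikit-learn's MLPClassifier standing in for a real deep-learning stack.

```python
# Naive neural architecture search: randomly sample small MLP architectures,
# train each one briefly, and keep the best by validation accuracy.
import random

from sklearn.datasets import load_digits
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPClassifier

X, y = load_digits(return_X_y=True)
X_tr, X_val, y_tr, y_val = train_test_split(X, y, test_size=0.3, random_state=0)

def sample_architecture():
    """Illustrative search space: depth (1-3) and width of the hidden layers."""
    depth = random.randint(1, 3)
    return tuple(random.choice([32, 64, 128]) for _ in range(depth))

best_acc, best_arch = 0.0, None
for _ in range(10):                      # evaluate a handful of random candidates
    arch = sample_architecture()
    net = MLPClassifier(hidden_layer_sizes=arch, max_iter=300, random_state=0)
    net.fit(X_tr, y_tr)
    acc = net.score(X_val, y_val)
    if acc > best_acc:
        best_acc, best_arch = acc, arch

print(f"best architecture {best_arch}, validation accuracy {best_acc:.3f}")
```

Weight sharing (ENAS), network morphism and differentiable relaxations (DARTS) exist largely to avoid paying the full training cost for every candidate, which is the bottleneck of this naive loop.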

Frameworks

  • 2019 | Auptimizer – an Extensible, Open-Source Framework for Hyperparameter Tuning | Jiayi Liu, et al. | IEEE Big Data | PDF
  • 2019 | Towards modular and programmable architecture search | Renato Negrinho, et al. | NeurIPS | PDF
  • 2019 | Evolutionary Neural AutoML for Deep Learning | Jason Liang, et al. | arXiv | PDF
  • 2017 | ATM: A Distributed, Collaborative, Scalable System for Automated Machine Learning | T. Swearingen, et al. | IEEE | PDF
  • 2017 | Google Vizier: A Service for Black-Box Optimization | Daniel Golovin, et al. | KDD | PDF
  • 2015 | AutoCompete: A Framework for Machine Learning Competitions | Abhishek Thakur, et al. | ICML | PDF

Hyperparameter Optimization

  • Bayesian Optimization

    • 2019 | Bayesian Optimization with Unknown Search Space | NeurIPS | PDF
    • 2019 | Constrained Bayesian optimization with noisy experiments | PDF
    • 2019 | Learning search spaces for Bayesian optimization: Another view of hyperparameter transfer learning | NeurIPS | PDF
    • 2019 | Practical Two-Step Lookahead Bayesian Optimization | NeurIPS | PDF
    • 2019 | Predictive entropy search for multi-objective bayesian optimization with constraints | PDF
    • 2018 | BOCK: Bayesian optimization with cylindrical kernels | ICML | PDF
    • 2018 | Efficient High Dimensional Bayesian Optimization with Additivity and Quadrature Fourier Features | Mojmír Mutný, et al. | NeurIPS | PDF
    • 2018 | High-Dimensional Bayesian Optimization via Additive Models with Overlapping Groups | PMLR | PDF
    • 2018 | Maximizing acquisition functions for Bayesian optimization | NeurIPS | PDF
    • 2018 | Scalable hyperparameter transfer learning | NeurIPS | PDF
    • 2016 | Bayesian Optimization with Robust Bayesian Neural Networks | Jost Tobias Springenberg, et al. | NIPS | PDF
    • 2016 | Scalable Hyperparameter Optimization with Products of Gaussian Process Experts | Nicolas Schilling, et al. | PKDD | PDF
    • 2016 | Taking the Human Out of the Loop: A Review of Bayesian Optimization | Bobak Shahriari, et al. | IEEE | PDF
    • 2016 | Towards Automatically-Tuned Neural Networks | Hector Mendoza, et al. | JMLR | PDF
    • 2016 | Two-Stage Transfer Surrogate Model for Automatic Hyperparameter Optimization | Martin Wistuba, et al. | PKDD | PDF
    • 2015 | Efficient and Robust Automated Machine Learning | Matthias Feurer, et al. | NIPS | PDF
    • 2015 | Hyperparameter Optimization with Factorized Multilayer Perceptrons | Nicolas Schilling, et al. | PKDD | PDF
    • 2015 | Hyperparameter Search Space Pruning - A New Component for Sequential Model-Based Hyperparameter Optimization | Martin Wistuba, et al. | PDF
    • 2015 | Joint Model Choice and Hyperparameter Optimization with Factorized Multilayer Perceptrons | Nicolas Schilling, et al. | ICTAI | PDF
    • 2015 | Learning Hyperparameter Optimization Initializations | Martin Wistuba, et al. | DSAA | PDF
    • 2015 | Scalable Bayesian optimization using deep neural networks | Jasper Snoek, et al. | ACM | PDF
    • 2015 | Sequential Model-free Hyperparameter Tuning | Martin Wistuba, et al. | ICDM | PDF
    • 2013 | Auto-WEKA: Combined Selection and Hyperparameter Optimization of Classification Algorithms | Chris Thornton, et al. | KDD | PDF
    • 2013 | Making a Science of Model Search: Hyperparameter Optimization in Hundreds of Dimensions for Vision Architectures | J. Bergstra | JMLR | PDF
    • 2012 | Practical Bayesian Optimization of Machine Learning Algorithms | Jasper Snoek, et al. | NIPS | PDF
    • 2011 | Sequential Model-Based Optimization for General Algorithm Configuration (extended version) | Frank Hutter, et al. | PDF
  • Evolutionary Algorithms

    • 2018 | Autostacker: A Compositional Evolutionary Learning System | Boyuan Chen, et al. | arXiv | PDF
    • 2017 | Large-Scale Evolution of Image Classifiers | Esteban Real, et al. | PMLR | PDF
    • 2016 | Automating biomedical data science through tree-based pipeline optimization | Randal S. Olson, et al. | ECAL | PDF
    • 2016 | Evaluation of a tree-based pipeline optimization tool for automating data science | Randal S. Olson, et al. | GECCO | PDF
  • Lipschitz Functions

    • 2017 | Global Optimization of Lipschitz functions | Cédric Malherbe, Nicolas Vayatis | arXiv | PDF
  • Local Search

    • 2009 | ParamILS: An Automatic Algorithm Configuration Framework | Frank Hutter, et al. | JAIR | PDF
  • Meta Learning

    • 2019 | SMARTML: A Meta Learning-Based Framework for Automated Selection and Hyperparameter Tuning for Machine Learning Algorithms | PDF
    • 2008 | Cross-Disciplinary Perspectives on Meta-Learning for Algorithm Selection | PDF
  • Particle Swarm Optimization

    • 2017 | Particle Swarm Optimization for Hyper-parameter Selection in Deep Neural Networks | Pablo Ribalta Lorenzo, et al. | GECCO | PDF
    • 2008 | Particle Swarm Optimization for Parameter Determination and Feature Selection of Support Vector Machines | Shih-Wei Lin, et al. | Expert Systems with Applications | PDF
  • Random Search

    • 2016 | Hyperband: A Novel Bandit-Based Approach to Hyperparameter Optimization | Lisha Li, et al. | arXiv | PDF
    • 2012 | Random Search for Hyper-Parameter Optimization | James Bergstra, Yoshua Bengio | JMLR | PDF
    • 2011 | Algorithms for Hyper-parameter Optimization | James Bergstra, et al. | NIPS | PDF
  • Transfer Learning

    • 2016 | Efficient Transfer Learning Method for Automatic Hyperparameter Tuning | Dani Yogatama, Gideon Mann | JMLR | PDF
    • 2016 | Flexible Transfer Learning Framework for Bayesian Optimisation | Tinu Theckel Joy, et al. | PAKDD | PDF
    • 2016 | Hyperparameter Optimization Machines | Martin Wistuba, et al. | DSAA | PDF
    • 2013 | Collaborative Hyperparameter Tuning | Rémi Bardenet, et al. | ICML | PDF
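
Most of the Bayesian-optimization papers above build on the same sequential model-based optimization loop: fit a probabilistic surrogate (commonly a Gaussian process) to the configurations evaluated so far, maximize an acquisition function such as expected improvement to pick the next configuration, evaluate it, and repeat. The NumPy/SciPy sketch below shows that loop on a toy one-dimensional problem; the kernel, jitter and candidate grid are illustrative assumptions, not settings from any of the cited papers.

```python
# Sequential model-based optimization with a Gaussian-process surrogate and the
# expected-improvement acquisition function, on a toy 1-D minimization problem.
import numpy as np
from scipy.stats import norm

def objective(x):
    """Toy 'validation error' to minimize; stands in for a real training run."""
    return np.sin(3 * x) + 0.1 * (x - 1.0) ** 2

def rbf_kernel(a, b, length_scale=0.3):
    d = a[:, None] - b[None, :]
    return np.exp(-0.5 * (d / length_scale) ** 2)

def gp_posterior(x_obs, y_obs, x_new, jitter=1e-4):
    """Posterior mean and standard deviation of a zero-mean GP at x_new."""
    K = rbf_kernel(x_obs, x_obs) + jitter * np.eye(len(x_obs))
    K_s = rbf_kernel(x_obs, x_new)
    K_inv = np.linalg.inv(K)
    mu = K_s.T @ K_inv @ y_obs
    var = 1.0 - np.sum((K_s.T @ K_inv) * K_s.T, axis=1)
    return mu, np.sqrt(np.maximum(var, 1e-12))

def expected_improvement(mu, sigma, best_y):
    """How much we expect a candidate to improve on the best value found so far."""
    z = (best_y - mu) / sigma
    return (best_y - mu) * norm.cdf(z) + sigma * norm.pdf(z)

rng = np.random.default_rng(0)
candidates = np.linspace(-2.0, 3.0, 500)      # the (discretized) search space
x_obs = rng.uniform(-2.0, 3.0, size=3)        # a few random initial evaluations
y_obs = objective(x_obs)

for _ in range(15):                           # SMBO iterations
    mu, sigma = gp_posterior(x_obs, y_obs, candidates)
    ei = expected_improvement(mu, sigma, y_obs.min())
    x_next = candidates[np.argmax(ei)]        # propose the most promising point
    x_obs = np.append(x_obs, x_next)
    y_obs = np.append(y_obs, objective(x_next))   # evaluate and feed back

print(f"best x: {x_obs[y_obs.argmin()]:.3f}, best value: {y_obs.min():.3f}")
```

Real systems differ in the surrogate model (Gaussian processes, the random forests of SMAC, Bayesian neural networks), the acquisition function, and how they cope with high dimensions, constraints, noise and transfer across tasks, which is exactly what most of the papers above address.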

Miscellaneous

  • 2018 | Accelerating Neural Architecture Search using Performance Prediction | Bowen Baker, et al. | ICLR | PDF
  • 2017 | Automatic Frankensteining: Creating Complex Ensembles Autonomously | Martin Wistuba, et al. | SIAM | PDF

Tutorials

Bayesian Optimization

  • 2018 | A Tutorial on Bayesian Optimization | Peter I. Frazier | arXiv | PDF
  • 2010 | A Tutorial on Bayesian Optimization of Expensive Cost Functions, with Application to Active User Modeling and Hierarchical Reinforcement Learning | Eric Brochu, et al. | arXiv | PDF

Meta Learning

  • 2008 | Metalearning - A Tutorial | PDF

Blog

Type | Blog Title | Link
HPO | Bayesian Optimization for Hyperparameter Tuning | Link
Meta-Learning | Learning to learn | Link
Meta-Learning | Why Meta-learning is Crucial for Further Advances of Artificial Intelligence? | Link

Books

Year of Publication | Type | Book Title | Authors | Publisher | Link
2009 | Meta-Learning | Metalearning - Applications to Data Mining | Brazdil, P., Giraud-Carrier, C., Soares, C., Vilalta, R. | Springer | Download
2019 | HPO, Meta-Learning, NAS | AutoML: Methods, Systems, Challenges | Frank Hutter, Lars Kotthoff, Joaquin Vanschoren | Springer | Download

Projects

Project | Type | Language | License | Link
AdaNet | NAS | Python | Apache-2.0 | Github
Advisor | HPO | Python | Apache-2.0 | Github
AMLA | HPO, NAS | Python | Apache-2.0 | Github
ATM | HPO | Python | MIT | Github
Auger | HPO | Python | Commercial | Homepage
auptimizer | HPO, NAS | Python (support R script) | GPL-3.0 | Github
Auto-Keras | NAS | Python | License | Github
AutoML Vision | NAS | Python | Commercial | Homepage
AutoML Video Intelligence | NAS | Python | Commercial | Homepage
AutoML Natural Language | NAS | Python | Commercial | Homepage
AutoML Translation | NAS | Python | Commercial | Homepage
AutoML Tables | AutoFE, HPO | Python | Commercial | Homepage
auto-sklearn | HPO | Python | License | Github
auto_ml | HPO | Python | MIT | Github
BayesianOptimization | HPO | Python | MIT | Github
BayesOpt | HPO | C++ | AGPL-3.0 | Github
comet | HPO | Python | Commercial | Homepage
DataRobot | HPO | Python | Commercial | Homepage
DEvol | NAS | Python | MIT | Github
DeepArchitect | NAS | Python | MIT | Github
Driverless AI | AutoFE | Python | Commercial | Homepage
FAR-HO | HPO | Python | MIT | Github
H2O AutoML | HPO | Python, R, Java, Scala | Apache-2.0 | Github
HpBandSter | HPO | Python | BSD-3-Clause | Github
HyperBand | HPO | Python | License | Github
Hyperopt | HPO | Python | License | Github
Hyperopt-sklearn | HPO | Python | License | Github
Hyperparameter Hunter | HPO | Python | MIT | Github
Katib | HPO | Python | Apache-2.0 | Github
MateLabs | HPO | Python | Commercial | Github
Milano | HPO | Python | Apache-2.0 | Github
MLJAR | HPO | Python | Commercial | Homepage
nasbot | NAS | Python | MIT | Github
neptune | HPO | Python | Commercial | Homepage
NNI | HPO, NAS | Python | MIT | Github
Oboe | HPO | Python | BSD-3-Clause | Github
Optunity | HPO | Python | License | Github
R2.ai | HPO | | Commercial | Homepage
RBFOpt | HPO | Python | License | Github
RoBO | HPO | Python | BSD-3-Clause | Github
Scikit-Optimize | HPO | Python | License | Github
SigOpt | HPO | Python | Commercial | Homepage
SMAC3 | HPO | Python | License | Github
TPOT | AutoFE, HPO | Python | LGPL-3.0 | Github
TransmogrifAI | HPO | Scala | BSD-3-Clause | Github
Tune | HPO | Python | Apache-2.0 | Github
Xcessiv | HPO | Python | Apache-2.0 | Github
SmartML | HPO | R | GPL-3.0 | Github
MLBox | AutoFE, HPO | Python | BSD-3 License | Github
AutoAI Watson | AutoFE, HPO | | Commercial | Homepage

Slides

Type | Slide Title | Authors | Link
AutoFE | Automated Feature Engineering for Predictive Modeling | Udayan Khurana, et al. | Download
HPO | A Tutorial on Bayesian Optimization for Machine Learning | Ryan P. Adams | Download
HPO | Bayesian Optimisation | Gilles Louppe | Download

Acknowledgement

Special thanks to everyone who contributed to this project.

Name | Bio
Alexander Robles | PhD Student @UNICAMP-Brazil
derekflint |
Eric |
Erin LeDell | Chief Machine Learning Scientist @H2O.ai
fwcore |
Gaurav Mittal |
koala | Senior Researcher @Tencent
Lilian Besson | PhD Student @CentraleSupélec
罗磊 |
Marc |
Mohamed Maher |
Richard Liaw | PhD Student @UC Berkeley
Randy Olson | Lead Data Scientist @LifeEGX
Slava Kurilyak | Founder, CEO @Produvia
Saket Maheshwary | AI Researcher
shaido987 |
sophia-wright-blue |
tengben0905 |
xuehui | @Microsoft
Yihui He | Grad Student @CMU

Contact & Feedback

If you have any suggestions (missing papers, new papers, key researchers or typos), feel free to open a pull request or contact the maintainer by email.

Licenses

Awesome-AutoML-Papers is available under the Apache License 2.0.
