AutoML
https://www.automl.org/automl/
What is AutoML?
Automated Machine Learning provides methods and processes to make machine learning available to non-machine-learning experts, to improve the efficiency of machine learning, and to accelerate research on machine learning.
Machine learning (ML) has achieved considerable successes in recent years and an ever-growing number of disciplines rely on it. However, this success crucially relies on human machine learning experts to perform the following tasks:
- Preprocess and clean the data.
- Select and construct appropriate features.
- Select an appropriate model family.
- Optimize model hyperparameters.
- Postprocess machine learning models.
- Critically analyze the results obtained.
As the complexity of these tasks is often beyond non-ML-experts, the rapid growth of machine learning applications has created a demand for off-the-shelf machine learning methods that can be used easily and without expert knowledge. We call the resulting research area that targets progressive automation of machine learning AutoML.
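To make these tasks concrete, the sketch below shows what the manual workflow might look like in scikit-learn for a generic tabular dataset: an expert hand-picks preprocessing, feature selection and a model family, and then tunes hyperparameters by grid search. This is only an illustrative sketch (the dataset, pipeline steps and parameter grid are placeholders we chose for the example); AutoML systems aim to automate exactly these decisions.

```python
# Illustrative sketch of the manual expert workflow that AutoML aims to automate.
# Assumes scikit-learn; the dataset, pipeline steps and grid are placeholders.
from sklearn.datasets import load_breast_cancer
from sklearn.pipeline import Pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.feature_selection import SelectKBest
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import GridSearchCV

X, y = load_breast_cancer(return_X_y=True)

# The expert hand-picks preprocessing, features and a model family ...
pipeline = Pipeline([
    ("scale", StandardScaler()),           # preprocess / clean the data
    ("select", SelectKBest()),             # select / construct features
    ("model", RandomForestClassifier()),   # choose a model family
])

# ... and then tunes its hyperparameters, e.g. by grid search.
param_grid = {
    "select__k": [5, 10, 20],
    "model__n_estimators": [100, 300],
    "model__max_depth": [None, 10],
}
search = GridSearchCV(pipeline, param_grid, cv=5)
search.fit(X, y)
print(search.best_params_, search.best_score_)
```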
Examples of AutoML
Research in Automated Machine Learning is very diverse and has produced packages and methods targeted at both researchers and end users.
AutoML systems
Throughout recent years, several off-the-shelf packages that provide automated machine learning have been developed. While there are more packages than the ones listed below, we restrict ourselves to a subset of the most well-known ones:
- AutoWEKA is an approach for the simultaneous selection of a machine learning algorithm and its hyperparameters; combined with the WEKA package it automatically yields good models for a wide variety of data sets.
- Auto-sklearn extends the AutoWEKA approach to the Python library scikit-learn and acts as a drop-in replacement for regular scikit-learn classifiers and regressors (a usage sketch follows this list).
- TPOT is a data-science assistant which optimizes machine learning pipelines using genetic programming.
- H2O AutoML provides automated model selection and ensembling for the H2O machine learning and data analytics platform.
- TransmogrifAI is an AutoML library running on top of Spark.
- MLBoX is an AutoML library with three components: preprocessing, optimization and prediction.
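To make the "drop-in replacement" point concrete, the sketch below shows roughly how Auto-sklearn is used in place of a standard scikit-learn classifier. It is based on the auto-sklearn documentation; exact constructor arguments such as time_left_for_this_task may differ across versions.

```python
# Rough usage sketch of Auto-sklearn as a drop-in scikit-learn estimator.
# Based on the auto-sklearn docs; constructor arguments may differ by version.
import autosklearn.classification
from sklearn.datasets import load_digits
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score

X, y = load_digits(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=1)

# The AutoSklearnClassifier searches over preprocessors, models and hyperparameters
# within the given time budget, then behaves like any fitted scikit-learn classifier.
automl = autosklearn.classification.AutoSklearnClassifier(
    time_left_for_this_task=300,   # total budget in seconds
    per_run_time_limit=30,         # budget per candidate pipeline
)
automl.fit(X_train, y_train)
print(accuracy_score(y_test, automl.predict(X_test)))
```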
AutoML to advance and improve research
Making a science of model search argues that the performance of a given technique depends on both the fundamental quality of the algorithm and the details of its tuning, and that it is sometimes difficult to know whether a given technique is genuinely better or simply better tuned. To improve the situation, Bergstra et al. propose reporting results obtained by tuning all algorithms with the same hyperparameter optimization toolkit. Sculley et al.’s recent ICLR workshop paper Winner’s Curse argues in the same direction and gives recent examples in which correct hyperparameter optimization of baselines improved over the latest state-of-the-art results and newly proposed methods.
Hyperparameter optimization and algorithm configuration provide methods to automate the tedious, time-consuming and error-prone process of tuning hyperparameters for new tasks at hand, and several software packages implement the suggestion from Bergstra et al.’s Making a science of model search; a minimal Hyperopt sketch follows the list below. These include:
- Hyperopt, including the TPE algorithm
- Sequential Model-based Algorithm Configuration (SMAC)
- Spearmint
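As an example of such a toolkit in use, the sketch below minimizes a toy objective with Hyperopt's TPE algorithm. In practice the objective function would train a model and return a validation loss, so that all algorithms under comparison are tuned with the same optimizer, as Bergstra et al. advocate. The toy objective here is our own placeholder.

```python
# Minimal Hyperopt/TPE sketch: minimize a toy objective over a search space.
# In practice the objective would train a model and return a validation loss.
from hyperopt import fmin, tpe, hp, Trials

def objective(params):
    x = params["x"]
    return (x - 3.0) ** 2  # stand-in for a validation loss

space = {"x": hp.uniform("x", -10.0, 10.0)}

trials = Trials()
best = fmin(
    fn=objective,
    space=space,
    algo=tpe.suggest,   # Tree-structured Parzen Estimator
    max_evals=100,
    trials=trials,
)
print(best)  # best hyperparameter setting found, e.g. x close to 3
```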
We also provide packages for hyperparameter optimization:
- BOHB: Bayesian Optimization combined with HyperBand
- RoBO – Robust Bayesian Optimization framework
- SMAC3 – a Python re-implementation of the SMAC algorithm
Architecture Search
The field of architecture search addresses the problem of finding a well-performing architecture of a deep neural network. This includes, for example, the number of layers, the number of neurons, the type of activation functions and many more design decisions. Automated architecture search can substantially speed up the development of new deep learning applications, as developers do not need to painstakingly evaluate different architectures by hand.
For an overview on architecture search, we refer the interested reader to our literature overview on neural architecture search.
Packages for architecture search and hyperparameter optimization for deep learning include (an AutoKeras sketch follows the list):
- Auto-PyTorch
- AutoKeras
- DEvol
- HyperAS: a combination of Keras and Hyperopt
- talos: Hyperparameter Scanning and Optimization for Keras
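To give an impression of what such packages look like in use, the sketch below shows a rough AutoKeras workflow: the library searches over network architectures and their training hyperparameters and returns a trained Keras model. The API shown is that of AutoKeras 1.x; class and argument names may differ in other releases, and the trial and epoch counts are arbitrary choices for the example.

```python
# Rough AutoKeras sketch: let the library search over network architectures.
# API as of AutoKeras 1.x; names and arguments may differ in other releases.
import autokeras as ak
from tensorflow.keras.datasets import mnist

(x_train, y_train), (x_test, y_test) = mnist.load_data()

# ImageClassifier searches over architectures (layers, widths, activations, ...)
# and their training hyperparameters for the given number of trials.
clf = ak.ImageClassifier(max_trials=3, overwrite=True)
clf.fit(x_train, y_train, epochs=5)

print(clf.evaluate(x_test, y_test))
model = clf.export_model()  # a regular tf.keras model for further use
```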
Resources
- NeurIPS 2018 tutorial on AutoML (recording); we’ve posted the slides for this and many other tutorials on our webpage on invited talks and tutorials.
- Rich Caruana: Open Research Problems in AutoML
- Wikipedia
- KDNuggets on the current state of AutoML