AutoGluon: AutoML for Text, Image, and Tabular Data
Under Apache License 2.0
By awslabs
AutoGluon automates machine learning tasks enabling you to easily achieve strong predictive performance in your applications. With just a few lines of code, you can train and deploy high-accuracy machine learning and deep learning models on text, image, and tabular data.
```python
from autogluon.tabular import TabularDataset, TabularPredictor
train_data = TabularDataset('https://autogluon.s3.amazonaws.com/datasets/Inc/train.csv')
test_data = TabularDataset('https://autogluon.s3.amazonaws.com/datasets/Inc/test.csv')
predictor = TabularPredictor(label='class').fit(train_data, time_limit=120) # Fit models for 120s
leaderboard = predictor.leaderboard(test_data)  # Compare all trained models on the test data
```
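Once fit, the same predictor can make predictions on new data. A minimal sketch continuing the quickstart above (it assumes `test_data` still contains the ground-truth `class` column, which `predict` ignores and `evaluate` uses for scoring):

```python
# Predict labels for new rows; a label column, if present, is ignored at predict time.
y_pred = predictor.predict(test_data)

# Score the predictor against the ground-truth labels in test_data.
scores = predictor.evaluate(test_data)
print(scores)
```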
AutoGluon provides a dedicated predictor for each task (see the AutoGluon Website for the corresponding Quickstart tutorials and API references):
- TabularPredictor
- TextPredictor
- ImagePredictor
- ObjectDetector
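The other predictors follow the same fit/predict pattern as `TabularPredictor`. A minimal sketch of `TextPredictor`, assuming the `autogluon.text` namespace package is installed; `train_df` and `test_df` are hypothetical pandas DataFrames with a text column and a `sentiment` label column:

```python
from autogluon.text import TextPredictor

# train_df / test_df: hypothetical DataFrames with a text column and a 'sentiment' label.
predictor = TextPredictor(label='sentiment')
predictor.fit(train_df, time_limit=600)   # train for up to 10 minutes
predictions = predictor.predict(test_df)  # predict sentiment for held-out rows
```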
Announcement for previous users: The AutoGluon codebase has been modularized into namespace packages, which means you now only need the dependencies relevant to your prediction task of interest! For example, you can now work with tabular data without installing the dependencies required for AutoGluon's computer vision tasks (and vice versa). Unfortunately, this improvement required a minor API change (e.g., instead of `from autogluon import TabularPrediction`, you should now do `from autogluon.tabular import TabularPredictor`) for all versions newer than v0.0.15. Documentation/tutorials under the old API may still be viewed for version 0.0.15, the last release under the old API.
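A minimal sketch of the import change (the old API is shown as comments, since it only works on v0.0.15 and earlier; the exact old call shown is an assumption based on the legacy `TabularPrediction` interface):

```python
# Old API (v0.0.15 and earlier):
# from autogluon import TabularPrediction as task
# predictor = task.fit(train_data=train_data, label='class')

# New API (versions newer than v0.0.15):
from autogluon.tabular import TabularDataset, TabularPredictor

train_data = TabularDataset('https://autogluon.s3.amazonaws.com/datasets/Inc/train.csv')
predictor = TabularPredictor(label='class').fit(train_data)
```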
See the AutoGluon Website for documentation and instructions on:
- Installing AutoGluon
- Learning with tabular data
- Tips to maximize accuracy (if benchmarking, make sure to run `fit()` with the argument `presets='best_quality'`, as sketched below).
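For example, a minimal sketch of a benchmarking-style run (the dataset URL is the one from the quickstart above; a longer `time_limit` generally gives `best_quality` more room to help):

```python
from autogluon.tabular import TabularDataset, TabularPredictor

train_data = TabularDataset('https://autogluon.s3.amazonaws.com/datasets/Inc/train.csv')

# presets='best_quality' trades extra training time for higher predictive accuracy.
predictor = TabularPredictor(label='class').fit(
    train_data,
    presets='best_quality',
    time_limit=3600,  # seconds; increase for harder problems
)
```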
If you use AutoGluon in a scientific publication, please cite the following paper:
Erickson, Nick, et al. "AutoGluon-Tabular: Robust and Accurate AutoML for Structured Data." arXiv preprint arXiv:2003.06505 (2020).
BibTeX entry:
```bibtex
@article{agtabular,
  title={AutoGluon-Tabular: Robust and Accurate AutoML for Structured Data},
  author={Erickson, Nick and Mueller, Jonas and Shirkov, Alexander and Zhang, Hang and Larroy, Pedro and Li, Mu and Smola, Alexander},
  journal={arXiv preprint arXiv:2003.06505},
  year={2020}
}
```
If you are using AutoGluon Tabular's model distillation functionality, please cite the following paper:
Fakoor, Rasool, et al. "Fast, Accurate, and Simple Models for Tabular Data via Augmented Distillation." Advances in Neural Information Processing Systems 33 (2020).
BibTeX entry:
```bibtex
@article{agtabulardistill,
  title={Fast, Accurate, and Simple Models for Tabular Data via Augmented Distillation},
  author={Fakoor, Rasool and Mueller, Jonas W and Erickson, Nick and Chaudhari, Pratik and Smola, Alexander J},
  journal={Advances in Neural Information Processing Systems},
  volume={33},
  year={2020}
}
```
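A minimal sketch of the distillation workflow, assuming the `TabularPredictor.distill()` method (keyword arguments may vary across AutoGluon releases):

```python
from autogluon.tabular import TabularDataset, TabularPredictor

train_data = TabularDataset('https://autogluon.s3.amazonaws.com/datasets/Inc/train.csv')
predictor = TabularPredictor(label='class').fit(train_data, time_limit=120)

# Distill the fitted ensemble into smaller, faster student models.
distilled_model_names = predictor.distill(time_limit=60)
print(distilled_model_names)
```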
If you use AutoGluon's multimodal text+tabular functionality in a scientific publication, please cite the following paper:
Shi, Xingjian, et al. "Multimodal AutoML on Structured Tables with Text Fields." 8th ICML Workshop on Automated Machine Learning (AutoML). 2021.
BibTeX entry:
```bibtex
@inproceedings{agmultimodaltext,
  title={Multimodal AutoML on Structured Tables with Text Fields},
  author={Shi, Xingjian and Mueller, Jonas and Erickson, Nick and Li, Mu and Smola, Alex},
  booktitle={8th ICML Workshop on Automated Machine Learning (AutoML)},
  year={2021}
}
```
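A minimal sketch of the multimodal text+tabular workflow, assuming the `hyperparameters='multimodal'` configuration of `TabularPredictor`; `train_df` is a hypothetical pandas DataFrame mixing free-text columns with ordinary tabular features:

```python
from autogluon.tabular import TabularPredictor

# train_df: hypothetical DataFrame with text columns, numeric/categorical
# features, and a 'label' column.
predictor = TabularPredictor(label='label').fit(
    train_df,
    hyperparameters='multimodal',  # assumption: enables the text+tabular model config
    time_limit=600,
)
```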
AutoGluon also provides state-of-the-art tools for neural hyperparameter and architecture search, such as ASHA, Hyperband, Bayesian optimization, and BOHB. To get started, check out the tutorials on the AutoGluon Website.
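A minimal sketch of asynchronous Hyperband with a model-based searcher, assuming the `autogluon.core` scheduler/searcher API (`ag.args`, `ag.space`, `HyperbandScheduler`); the training function here is a placeholder rather than a real model:

```python
import autogluon.core as ag

# Hypothetical training function; its decorated arguments define the search space.
@ag.args(
    lr=ag.space.Real(1e-4, 1e-1, log=True),
    wd=ag.space.Real(1e-5, 1e-2, log=True),
    epochs=10,
)
def train_fn(args, reporter):
    for epoch in range(1, args.epochs + 1):
        # Placeholder "accuracy"; a real task would train and evaluate a model here.
        accuracy = 1.0 - (args.lr - 0.01) ** 2 - args.wd
        reporter(epoch=epoch, accuracy=accuracy)

# Asynchronous Hyperband (ASHA-style) promotion with a Bayesian-optimization searcher.
scheduler = ag.scheduler.HyperbandScheduler(
    train_fn,
    resource={'num_cpus': 2, 'num_gpus': 0},
    searcher='bayesopt',
    num_trials=10,
    time_attr='epoch',
    reward_attr='accuracy',
    max_t=10,
    grace_period=1,
)
scheduler.run()
scheduler.join_jobs()
print(scheduler.get_best_config(), scheduler.get_best_reward())
```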
Also have a look at our paper "Model-based Asynchronous Hyperparameter and Neural Architecture Search," arXiv preprint arXiv:2003.10865 (2020).
BibTeX entry:
```bibtex
@article{abohb,
  title={Model-based Asynchronous Hyperparameter and Neural Architecture Search},
  author={Klein, Aaron and Tiao, Louis and Lienart, Thibaut and Archambeau, Cedric and Seeger, Matthias},
  journal={arXiv preprint arXiv:2003.10865},
  year={2020}
}
```
AutoGluon includes an algorithm for constrained hyperparameter optimization. Check out our paper applying it to optimize model performance under fairness constraints: "Fair Bayesian Optimization," AIES (2021).
BibTeX entry:
```bibtex
@article{fairbo,
  title={Fair Bayesian Optimization},
  author={Perrone, Valerio and Donini, Michele and Zafar, Bilal Muhammad and Schmucker, Robin and Kenthapadi, Krishnaram and Archambeau, Cédric},
  journal={AIES},
  year={2021}
}
```
This library is licensed under the Apache 2.0 License.
We are actively accepting code contributions to the AutoGluon project. If you are interested in contributing to AutoGluon, please read the Contributing Guide to get started.