FLAML - Fast and Lightweight AutoML


FLAML is a lightweight Python library that finds accurate machine learning models automatically, efficiently, and economically. It frees users from selecting learners and hyperparameters for each learner, and its simple, lightweight design makes it easy to extend, for example with customized learners or metrics.

FLAML is powered by a new, cost-effective hyperparameter optimization and learner selection method invented by Microsoft Research. It leverages the structure of the search space to choose a search order optimized for both cost and error: early in the search it tends to propose cheap configurations, moving to configurations with higher model complexity and larger sample size when needed later on; similarly, it favors cheap learners early but penalizes them if their error improves too slowly. This cost-bounded search and cost-based prioritization make a big difference in search efficiency under budget constraints.

FLAML is easy to use:

  • With three lines of code, you can start using this economical and fast AutoML engine as a scikit-learn style estimator.
from flaml import AutoML
automl = AutoML()
automl.fit(X_train, y_train, task="classification")
  • You can restrict the learners and use FLAML as a fast hyperparameter tuning tool for XGBoost, LightGBM, Random Forest etc. or a customized learner.
automl.fit(X_train, y_train, task="classification", estimator_list=["lgbm"])
  • You can also run generic, Ray Tune-style hyperparameter tuning for a custom function; a minimal sketch of such a function appears after this snippet.
from flaml import tune
tune.run(train_with_config, config={}, low_cost_partial_config={}, time_budget_s=3600)
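
The train_with_config argument above is a user-defined function. Here is a minimal sketch of one, assuming flaml.tune follows the Ray Tune convention of reporting metrics via tune.report; the function name, search space, and metric below are illustrative, not part of FLAML's API:

from flaml import tune

def train_with_config(config):
    # Train with the sampled hyperparameters and report the
    # metric to the tuner.
    x = config["x"]
    loss = (x - 2) ** 2  # toy objective standing in for a real training run
    tune.report(loss=loss)

tune.run(
    train_with_config,
    config={"x": tune.uniform(-10, 10)},
    low_cost_partial_config={"x": 0},
    metric="loss",
    mode="min",
    time_budget_s=60,
)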

Installation

FLAML requires Python >= 3.6 and can be installed via pip:

pip install flaml

To run the notebook example, install flaml with the [notebook] option:

pip install flaml[notebook]

Examples

A basic classification example.

from flaml import AutoML
from sklearn.datasets import load_iris
# Initialize an AutoML instance
automl = AutoML()
# Specify automl goal and constraint
automl_settings = {
    "time_budget": 10,  # in seconds
    "metric": 'accuracy',
    "task": 'classification',
    "log_file_name": "test/iris.log",
}
X_train, y_train = load_iris(return_X_y=True)
# Train with labeled input data
automl.fit(X_train=X_train, y_train=y_train,
           **automl_settings)
# Predict
print(automl.predict_proba(X_train))
# Export the best model
print(automl.model)
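
The fitted AutoML object can also be saved for later use. A minimal sketch using the standard library's pickle module (it assumes the fitted object is picklable, as scikit-learn style estimators generally are; the file name is illustrative):

import pickle

# Save the fitted AutoML object
with open("automl.pkl", "wb") as f:
    pickle.dump(automl, f)
# Reload it later and predict as usual
with open("automl.pkl", "rb") as f:
    automl = pickle.load(f)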

A basic regression example.

from flaml import AutoML
from sklearn.datasets import load_boston
# Initialize an AutoML instance
automl = AutoML()
# Specify automl goal and constraint
automl_settings = {
    "time_budget": 10,  # in seconds
    "metric": 'r2',
    "task": 'regression',
    "log_file_name": "test/boston.log",
}
X_train, y_train = load_boston(return_X_y=True)
# Train with labeled input data
automl.fit(X_train=X_train, y_train=y_train,
           **automl_settings)
# Predict
print(automl.predict(X_train))
# Export the best model
print(automl.model)
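
After fit returns, the search results can be inspected on the AutoML object. A minimal sketch, assuming FLAML's best_estimator, best_config, and best_loss properties:

# Which learner won, with which hyperparameters, and at what loss
print(automl.best_estimator)
print(automl.best_config)
print(automl.best_loss)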

More examples can be found in the notebook folder.

Documentation

The API documentation is here.

Read more about the hyperparameter optimization methods in FLAML here. They can be used beyond the AutoML context, including in distributed HPO frameworks such as Ray Tune or NNI.
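
For example, FLAML's CFO search algorithm can be plugged into Ray Tune as a search_alg. A minimal sketch, assuming Ray Tune is installed; evaluate_config and the search space below are illustrative, not part of FLAML:

from flaml import CFO
from ray import tune

def evaluate_config(config):
    # Toy objective; report the metric Ray Tune should optimize
    tune.report(score=(config["x"] - 3) ** 2)

analysis = tune.run(
    evaluate_config,
    config={"x": tune.uniform(0, 10)},
    metric="score",
    mode="min",
    search_alg=CFO(low_cost_partial_config={"x": 0}),
    num_samples=8,
)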

For more technical details, please check our papers.

@inproceedings{wang2021flaml,
    title={FLAML: A Fast and Lightweight AutoML Library},
    author={Chi Wang and Qingyun Wu and Markus Weimer and Erkang Zhu},
    year={2021},
    booktitle={MLSys},
}

Contributing

This project welcomes contributions and suggestions. Most contributions require you to agree to a Contributor License Agreement (CLA) declaring that you have the right to, and actually do, grant us the rights to use your contribution. For details, visit https://cla.opensource.microsoft.com.

If you are new to GitHub, here is a detailed help source on getting involved with development on GitHub.

When you submit a pull request, a CLA bot will automatically determine whether you need to provide a CLA and decorate the PR appropriately (e.g., status check, comment). Simply follow the instructions provided by the bot. You will only need to do this once across all repos using our CLA.

This project has adopted the Microsoft Open Source Code of Conduct. For more information see the Code of Conduct FAQ or contact opencode@microsoft.com with any additional questions or comments.

Developing

Setup:

git clone https://github.com/microsoft/FLAML.git
cd FLAML
pip install -e .[test,notebook]

Coverage

Any code you commit should not significantly reduce test coverage. To run all unit tests:

coverage run -m pytest test
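
To view the collected coverage afterwards (a standard coverage.py command):

coverage report -m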

If all the tests pass, please also run notebook/flaml_automl to make sure your commit does not break the notebook example.

Authors

  • Chi Wang
  • Qingyun Wu

Contributors (alphabetical order): Sebastien Bubeck, Surajit Chaudhuri, Nadiia Chepurko, Ofer Dekel, Alex Deng, Anshuman Dutt, Nicolo Fusi, Jianfeng Gao, Johannes Gehrke, Silu Huang, Dongwoo Kim, Christian Konig, John Langford, Amin Saied, Neil Tenenholtz, Markus Weimer, Haozhe Zhang, Erkang Zhu.

License

MIT License