Chi Wang 2021-03-22 13:33:30 -07:00 committed by GitHub
parent b058e0e041
commit 8bcdb2a0c2
1 changed file with 11 additions and 11 deletions


# Economical Hyperparameter Optimization
`flaml.tune` is a module for economical hyperparameter tuning. It frees users from manually tuning many hyperparameters for software, such as machine learning training procedures.
The API is compatible with ray tune.
It can be used standalone, or together with ray tune or nni.
* Example for sequential tuning (recommended when compute resource is limited and each trial can consume all the resources):
```python
# require: pip install flaml[blendsearch]
from flaml import tune
import time

def evaluate_config(config):
    '''evaluate a hyperparameter configuration (toy objective for illustration)'''
    metric = (round(config['x']) - 85000) ** 2 - config['x'] / config['y']
    # simulate an evaluation cost that grows with x
    time.sleep(config['x'] / 100000)
    tune.report(metric=metric)  # report the metric to the tuner

analysis = tune.run(
    evaluate_config,  # the function to evaluate a config
    config={
        'x': tune.qloguniform(lower=1, upper=100000, q=1),
        'y': tune.randint(lower=1, upper=100000),
    },  # the search space
    low_cost_partial_config={'x': 1},  # an initial (partial) config with low cost
    metric='metric',  # the name of the metric to optimize
    mode='min',  # the optimization mode, 'min' or 'max'
    num_samples=-1,  # the maximal number of configs to try; -1 means infinite
    time_budget_s=60,  # the time budget in seconds
    local_dir='logs/',  # the local directory to store logs
)
print(analysis.best_trial.last_result)  # the best trial's result
print(analysis.best_config)  # the best config
```
* Example for using ray tune's API:
```python
# require: pip install flaml[blendsearch] ray[tune]
from ray import tune as raytune
from flaml import CFO  # BlendSearch can be imported the same way
import time

def evaluate_config(config):
    '''evaluate a hyperparameter configuration (toy objective for illustration)'''
    metric = (round(config['x']) - 85000) ** 2 - config['x'] / config['y']
    # simulate an evaluation cost that grows with x
    time.sleep(config['x'] / 100000)
    raytune.report(metric=metric)  # report the metric to the tuner

analysis = raytune.run(
    evaluate_config,  # the function to evaluate a config
    config={
        'x': raytune.qloguniform(lower=1, upper=100000, q=1),
        'y': raytune.randint(lower=1, upper=100000),
    },  # the search space
    metric='metric',  # the name of the metric to optimize
    mode='min',  # the optimization mode, 'min' or 'max'
    num_samples=-1,  # the maximal number of configs to try; -1 means infinite
    time_budget_s=60,  # the time budget in seconds
    local_dir='logs/',  # the local directory to store logs
    search_alg=CFO(points_to_evaluate=[{'x': 1}]),  # or BlendSearch
    # other algo example: raytune.create_searcher('optuna'),
)
print(analysis.best_trial.last_result)  # the best trial's result
print(analysis.best_config)  # the best config
```
* Example for using NNI: An example of using BlendSearch with NNI can be found in [test](https://github.com/microsoft/FLAML/tree/main/test/nni); CFO can be used in a similar manner. To run the example, first make sure you have [NNI](https://nni.readthedocs.io/en/stable/) installed, then run:
```shell
nnictl create --config ./config.yml
```
* For more examples, please check out
[notebooks](https://github.com/microsoft/FLAML/tree/main/notebook/).
Recommended scenario: cost-related hyperparameters exist, a low-cost
initial point is known, and the search space is complex such that local search
is prone to be stuck at local optima.
For more technical details, please check our papers.
* [Frugal Optimization for Cost-related Hyperparameters](https://arxiv.org/abs/2005.01571). Qingyun Wu, Chi Wang, Silu Huang. AAAI 2021.