mirror of https://github.com/microsoft/autogen.git
remove redundant doc and add tutorial (#1004)
* remove redundant doc and add tutorial
* add demos for pydata2023
* Update pydata23 docs
* remove redundant notebooks
* Move tutorial notebooks to notebook folder
* update readme and notebook links
* update notebook links
* update links
* update readme

---------

Co-authored-by: Li Jiang <lijiang1@microsoft.com>
Co-authored-by: Li Jiang <bnujli@gmail.com>
parent: b90e9ee283
commit: 3e6e834bbb
@@ -1,20 +0,0 @@ (deleted: Sphinx `Makefile`)

```makefile
# Minimal makefile for Sphinx documentation
#
# You can set these variables from the command line, and also
# from the environment for the first two.
SPHINXOPTS  ?=
SPHINXBUILD ?= sphinx-build
SOURCEDIR   = .
BUILDDIR    = _build

# Put it first so that "make" without argument is like "make help".
help:
	@$(SPHINXBUILD) -M help "$(SOURCEDIR)" "$(BUILDDIR)" $(SPHINXOPTS) $(O)

.PHONY: help Makefile

# Catch-all target: route all unknown targets to Sphinx using the new
# "make mode" option. $(O) is meant as a shortcut for $(SPHINXOPTS).
%: Makefile
	@$(SPHINXBUILD) -M $@ "$(SOURCEDIR)" "$(BUILDDIR)" $(SPHINXOPTS) $(O)
```
docs/conf.py (60 lines deleted)

@@ -1,60 +0,0 @@

```python
# Configuration file for the Sphinx documentation builder.
#
# This file only contains a selection of the most common options. For a full
# list see the documentation:
# https://www.sphinx-doc.org/en/master/usage/configuration.html

# -- Path setup --------------------------------------------------------------

# If extensions (or modules to document with autodoc) are in another directory,
# add these directories to sys.path here. If the directory is relative to the
# documentation root, use os.path.abspath to make it absolute, like shown here.
#
# import os
# import sys
# sys.path.insert(0, os.path.abspath('.'))


# -- Project information -----------------------------------------------------

project = "FLAML"
copyright = "2020-2021, FLAML Team"
author = "FLAML Team"


# -- General configuration ---------------------------------------------------

# Add any Sphinx extension module names here, as strings. They can be
# extensions coming with Sphinx (named 'sphinx.ext.*') or your custom
# ones.
extensions = [
    "sphinx.ext.autodoc",
    "sphinx.ext.napoleon",
    "sphinx.ext.doctest",
    "sphinx.ext.coverage",
    "sphinx.ext.mathjax",
    "sphinx.ext.viewcode",
    "sphinx.ext.githubpages",
    "sphinx_rtd_theme",
]

# Add any paths that contain templates here, relative to this directory.
templates_path = ["_templates"]

# List of patterns, relative to source directory, that match files and
# directories to ignore when looking for source files.
# This pattern also affects html_static_path and html_extra_path.
exclude_patterns = ["_build", "Thumbs.db", ".DS_Store"]


# -- Options for HTML output -------------------------------------------------

# The theme to use for HTML and HTML Help pages. See the documentation for
# a list of builtin themes.
#
html_theme = "sphinx_rtd_theme"

# Add any paths that contain custom static files (such as style sheets) here,
# relative to this directory. They are copied after the builtin static files,
# so a file named "default.css" will overwrite the builtin "default.css".
html_static_path = ["_static"]
```
@@ -1,45 +0,0 @@ (deleted: Sphinx documentation master file, reStructuredText)

```rst
.. FLAML documentation master file, created by
   sphinx-quickstart on Mon Dec 14 23:33:24 2020.
   You can adapt this file completely to your liking, but it should at least
   contain the root `toctree` directive.

.. Welcome to FLAML's documentation!
.. =================================

.. .. toctree::
..    :maxdepth: 2
..    :caption: Contents:


FLAML API Documentation
=======================

AutoML
------

.. autoclass:: flaml.AutoML
    :members:


Tune
----

.. autofunction:: flaml.tune.run

.. autofunction:: flaml.tune.report

.. autoclass:: flaml.BlendSearch
    :members:

.. autoclass:: flaml.CFO
    :members:

.. autoclass:: flaml.FLOW2
    :members:


Online AutoML
-------------

.. autoclass:: flaml.AutoVW
    :members:
```
@@ -1,35 +0,0 @@ (deleted: Sphinx Windows command file)

```bat
@ECHO OFF

pushd %~dp0

REM Command file for Sphinx documentation

if "%SPHINXBUILD%" == "" (
	set SPHINXBUILD=sphinx-build
)
set SOURCEDIR=.
set BUILDDIR=_build

if "%1" == "" goto help

%SPHINXBUILD% >NUL 2>NUL
if errorlevel 9009 (
	echo.
	echo.The 'sphinx-build' command was not found. Make sure you have Sphinx
	echo.installed, then set the SPHINXBUILD environment variable to point
	echo.to the full path of the 'sphinx-build' executable. Alternatively you
	echo.may add the Sphinx directory to PATH.
	echo.
	echo.If you don't have Sphinx installed, grab it from
	echo.http://sphinx-doc.org/
	exit /b 1
)

%SPHINXBUILD% -M %1 %SOURCEDIR% %BUILDDIR% %SPHINXOPTS% %O%
goto end

:help
%SPHINXBUILD% -M help %SOURCEDIR% %BUILDDIR% %SPHINXOPTS% %O%

:end
popd
```
(Two file diffs suppressed because they are too large; one suppressed because one or more lines are too long.)
@@ -0,0 +1,4 @@ (new file: FLAML tutorials index)

Please find tutorials on FLAML below:

- [PyData Seattle 2023](flaml-tutorial-pydata-23.md)
- [A hands-on tutorial on FLAML presented at KDD 2022](flaml-tutorial-kdd-22.md)
- [A lab forum on FLAML at AAAI 2023](flaml-tutorial-aaai-23.md)
@@ -0,0 +1,67 @@ (new file: AAAI'23 tutorial page)

# AAAI 2023 Lab Forum - LSHP2: Automated Machine Learning & Tuning with FLAML

## Session Information

**Date and Time**: February 8, 2023, 2-6pm ET

Location: Walter E. Washington Convention Center, Washington, DC, USA

Duration: 4 hours (3.5 hours + 0.5-hour break)

For the most up-to-date information, see the [AAAI'23 Program Agenda](https://aaai.org/Conferences/AAAI-23/aaai23tutorials/).

## [Lab Forum Slides](https://1drv.ms/b/s!Ao3suATqM7n7iokCQbF7jUUYwOqGqQ?e=cMnilV)

## What Will You Learn?

- What FLAML is and how to use FLAML to
  - find accurate ML models with low computational resources for common ML tasks
  - tune hyperparameters generically
- How to leverage the flexible and rich customization choices to
  - finish the last mile for deployment
  - create new applications
- Code examples, demos, and use cases
- Research & development opportunities

## Session Agenda

### **Part 1. Overview of FLAML**

- Overview of AutoML and FLAML
- Basic usages of FLAML
  - Task-oriented AutoML
    - [Documentation](https://microsoft.github.io/FLAML/docs/Use-Cases/Task-Oriented-AutoML)
    - [Notebook: A classification task with AutoML](https://github.com/microsoft/FLAML/blob/tutorial-aaai23/notebook/automl_classification.ipynb); [Open In Colab](https://colab.research.google.com/github/microsoft/FLAML/blob/tutorial-aaai23/notebook/automl_classification.ipynb)
  - Tune user-defined functions with FLAML
    - [Documentation](https://microsoft.github.io/FLAML/docs/Use-Cases/Tune-User-Defined-Function)
    - [Notebook: Tune user-defined function](https://github.com/microsoft/FLAML/blob/tutorial-aaai23/notebook/tune_demo.ipynb); [Open In Colab](https://colab.research.google.com/github/microsoft/FLAML/blob/tutorial-aaai23/notebook/tune_demo.ipynb)
  - Zero-shot AutoML
    - [Documentation](https://microsoft.github.io/FLAML/docs/Use-Cases/Zero-Shot-AutoML)
    - [Notebook: Zeroshot AutoML](https://github.com/microsoft/FLAML/blob/tutorial-aaai23/notebook/zeroshot_lightgbm.ipynb); [Open In Colab](https://colab.research.google.com/github/microsoft/FLAML/blob/tutorial-aaai23/notebook/zeroshot_lightgbm.ipynb)
  - [ML.NET demo](https://learn.microsoft.com/dotnet/machine-learning/tutorials/predict-prices-with-model-builder)

Break (15 min)

### **Part 2. Deep Dive into FLAML**

- The science behind FLAML's success
  - [Economical hyperparameter optimization methods in FLAML](https://microsoft.github.io/FLAML/docs/Use-Cases/Tune-User-Defined-Function/#hyperparameter-optimization-algorithm)
  - [Other research in FLAML](https://microsoft.github.io/FLAML/docs/Research)
- Maximize the power of FLAML through customization and advanced functionalities
  - [Notebook: Customize your AutoML with FLAML](https://github.com/microsoft/FLAML/blob/tutorial-aaai23/notebook/customize_your_automl_with_flaml.ipynb); [Open In Colab](https://colab.research.google.com/github/microsoft/FLAML/blob/tutorial-aaai23/notebook/customize_your_automl_with_flaml.ipynb)
  - [Notebook: Further acceleration of AutoML with FLAML](https://github.com/microsoft/FLAML/blob/tutorial-aaai23/notebook/further_acceleration_of_automl_with_flaml.ipynb); [Open In Colab](https://colab.research.google.com/github/microsoft/FLAML/blob/tutorial-aaai23/notebook/further_acceleration_of_automl_with_flaml.ipynb)
  - [Notebook: Neural network model tuning with FLAML](https://github.com/microsoft/FLAML/blob/tutorial-aaai23/notebook/tune_pytorch.ipynb); [Open In Colab](https://colab.research.google.com/github/microsoft/FLAML/blob/tutorial-aaai23/notebook/tune_pytorch.ipynb)

### **Part 3. New Features in FLAML**

- Natural language processing
  - [Notebook: AutoML for NLP tasks](https://github.com/microsoft/FLAML/blob/tutorial-aaai23/notebook/automl_nlp.ipynb); [Open In Colab](https://colab.research.google.com/github/microsoft/FLAML/blob/tutorial-aaai23/notebook/automl_nlp.ipynb)
- Time series forecasting
  - [Notebook: AutoML for Time Series Forecast tasks](https://github.com/microsoft/FLAML/blob/tutorial-aaai23/notebook/automl_time_series_forecast.ipynb); [Open In Colab](https://colab.research.google.com/github/microsoft/FLAML/blob/tutorial-aaai23/notebook/automl_time_series_forecast.ipynb)
- Targeted hyperparameter optimization with lexicographic objectives
  - [Documentation](https://microsoft.github.io/FLAML/docs/Use-Cases/Tune-User-Defined-Function/#lexicographic-objectives)
  - [Notebook: Find accurate and fast neural networks with lexicographic objectives](https://github.com/microsoft/FLAML/blob/tutorial-aaai23/notebook/tune_lexicographic.ipynb); [Open In Colab](https://colab.research.google.com/github/microsoft/FLAML/blob/tutorial-aaai23/notebook/tune_lexicographic.ipynb)
- Online AutoML
  - [Notebook: Online AutoML with Vowpal Wabbit](https://github.com/microsoft/FLAML/blob/tutorial-aaai23/notebook/autovw.ipynb); [Open In Colab](https://colab.research.google.com/github/microsoft/FLAML/blob/tutorial-aaai23/notebook/autovw.ipynb)
- Fair AutoML

### Challenges and Open Problems
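The "lexicographic objectives" feature listed above optimizes a primary metric first and uses secondary metrics only to break near-ties. FLAML's actual implementation is described in the linked documentation; the snippet below is only a minimal pure-Python sketch of the selection rule, with made-up candidate results and an invented `tolerance` parameter:

```python
# Sketch of lexicographic selection: prefer lower error; among configs whose
# error is within a tolerance of the best error, prefer lower latency.
# All names and numbers are illustrative, not FLAML's API.

def lexicographic_best(results, tolerance=0.01):
    """results: list of dicts with 'error' and 'latency' keys."""
    best_error = min(r["error"] for r in results)
    # Keep only candidates whose primary objective is near-optimal ...
    near_optimal = [r for r in results if r["error"] <= best_error + tolerance]
    # ... then break ties on the secondary objective.
    return min(near_optimal, key=lambda r: r["latency"])

results = [
    {"config": "a", "error": 0.100, "latency": 5.0},
    {"config": "b", "error": 0.105, "latency": 1.0},  # near-tie on error, much faster
    {"config": "c", "error": 0.200, "latency": 0.5},
]
print(lexicographic_best(results)["config"])  # "b": within tolerance of best error, lowest latency
```

With `tolerance=0.0` the rule degenerates to plain single-objective selection on `error`, which is why the tolerance is the interesting knob.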
@@ -0,0 +1,48 @@ (new file: KDD'22 tutorial page)

# KDD 2022 Hands-on Tutorial - Automated Machine Learning & Tuning with FLAML

## Session Information

Date: August 16, 2022

Time: 9:30 AM ET

Location: 101

Duration: 3 hours

For the most up-to-date information, see the [SIGKDD'22 Program Agenda](https://kdd.org/kdd2022/handsOnTutorial.html).

## [Tutorial Slides](https://1drv.ms/b/s!Ao3suATqM7n7ioQF8xT8BbRdyIf_Ww?e=qQysIf)

## What Will You Learn?

- What FLAML is and how to use it to find accurate ML models with low computational resources for common machine learning tasks
- How to leverage the flexible and rich customization choices to:
  - Finish the last mile for deployment
  - Create new applications
- Code examples, demos, and use cases
- Research & development opportunities

## Session Agenda

### Part 1

- Overview of AutoML and FLAML
- Task-oriented AutoML with FLAML
  - [Notebook: A classification task with AutoML](https://github.com/microsoft/FLAML/blob/tutorial/notebook/automl_classification.ipynb); [Open In Colab](https://colab.research.google.com/github/microsoft/FLAML/blob/tutorial/notebook/automl_classification.ipynb)
  - [Notebook: A regression task with AutoML using LightGBM as the learner](https://github.com/microsoft/FLAML/blob/tutorial/notebook/automl_lightgbm.ipynb); [Open In Colab](https://colab.research.google.com/github/microsoft/FLAML/blob/tutorial/notebook/automl_lightgbm.ipynb)
  - [ML.NET demo](https://docs.microsoft.com/dotnet/machine-learning/tutorials/predict-prices-with-model-builder)
- Tune user-defined functions with FLAML
  - [Notebook: Basic tuning procedures and advanced tuning options](https://github.com/microsoft/FLAML/blob/tutorial/notebook/tune_demo.ipynb); [Open In Colab](https://colab.research.google.com/github/microsoft/FLAML/blob/tutorial/notebook/tune_demo.ipynb)
  - [Notebook: Tune PyTorch](https://github.com/microsoft/FLAML/blob/tutorial/notebook/tune_pytorch.ipynb); [Open In Colab](https://colab.research.google.com/github/microsoft/FLAML/blob/tutorial/notebook/tune_pytorch.ipynb)
- Q & A

### Part 2

- Zero-shot AutoML
  - [Notebook: Zeroshot AutoML](https://github.com/microsoft/FLAML/blob/tutorial/notebook/zeroshot_lightgbm.ipynb); [Open In Colab](https://colab.research.google.com/github/microsoft/FLAML/blob/tutorial/notebook/zeroshot_lightgbm.ipynb)
- Time series forecasting
  - [Notebook: AutoML for Time Series Forecast tasks](https://github.com/microsoft/FLAML/blob/tutorial/notebook/automl_time_series_forecast.ipynb); [Open In Colab](https://colab.research.google.com/github/microsoft/FLAML/blob/tutorial/notebook/automl_time_series_forecast.ipynb)
- Natural language processing
  - [Notebook: AutoML for NLP tasks](https://github.com/microsoft/FLAML/blob/tutorial/notebook/automl_nlp.ipynb); [Open In Colab](https://colab.research.google.com/github/microsoft/FLAML/blob/tutorial/notebook/automl_nlp.ipynb)
- Online AutoML
  - [Notebook: Online AutoML with Vowpal Wabbit](https://github.com/microsoft/FLAML/blob/tutorial/notebook/autovw.ipynb); [Open In Colab](https://colab.research.google.com/github/microsoft/FLAML/blob/tutorial/notebook/autovw.ipynb)
- Fair AutoML
- Challenges and open problems
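The "tune user-defined functions" sessions revolve around one idea: the user supplies an evaluation function mapping a config to a metric, and the tuner searches the config space for the best value. FLAML uses economical search strategies (CFO, BlendSearch); the sketch below substitutes plain random search and invented names purely to show the shape of the workflow, not FLAML's API:

```python
import random

# A user-defined evaluation function: takes a config dict, returns a loss to
# minimize. (Illustrative stand-in for a real training/validation run.)
def evaluate(config):
    x, y = config["x"], config["y"]
    return (x - 3) ** 2 + (y + 1) ** 2  # optimum at x=3, y=-1

# A trivial tuner: sample configs uniformly from the search space, keep the
# best. Real tuners are far more sample-efficient, but the contract is the
# same: function in, best (config, loss) out.
def random_search(evaluate, space, num_samples=200, seed=0):
    rng = random.Random(seed)
    best_config, best_loss = None, float("inf")
    for _ in range(num_samples):
        config = {k: rng.uniform(lo, hi) for k, (lo, hi) in space.items()}
        loss = evaluate(config)
        if loss < best_loss:
            best_config, best_loss = config, loss
    return best_config, best_loss

space = {"x": (-10.0, 10.0), "y": (-10.0, 10.0)}
best_config, best_loss = random_search(evaluate, space)
print(best_config, best_loss)  # best sampled config and its loss
```

Swapping the sampler for a smarter search strategy changes only the tuner's internals; the user-defined function stays untouched, which is what makes this interface generic.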
@@ -0,0 +1,40 @@ (new file: PyData Seattle 2023 tutorial page)

# PyData Seattle 2023 - Automated Machine Learning & Tuning with FLAML

## Session Information

**Date and Time**: April 26, 09:00–10:30 PT

Location: Microsoft Conference Center, Seattle, WA

Duration: 1.5 hours

For the most up-to-date information, see the [PyData Seattle 2023 Agenda](https://seattle2023.pydata.org/cfp/talk/BYRA8H/).

## [Lab Forum Slides](https://drive.google.com/file/d/14uG0N7jnf18-wizeWWfmXcBUARTQn61w/view?usp=share_link)

## What Will You Learn?

In this session, we will provide an in-depth, hands-on tutorial on Automated Machine Learning & Tuning with a fast Python library named FLAML. We will start with an overview of the AutoML problem and the FLAML library. We will then introduce the hyperparameter optimization methods empowering FLAML's strong performance. We will also demonstrate how to make the best use of FLAML to perform automated machine learning and hyperparameter tuning in various applications, with the help of the rich customization choices and advanced functionalities FLAML provides. Finally, we will share several new features of the library based on our latest research and development work around FLAML, and close the tutorial with open problems and challenges learned from AutoML practice.

## Tutorial Outline

### **Part 1. Overview**

- Overview of AutoML & Hyperparameter Tuning

### **Part 2. Introduction to FLAML**

- Introduction to FLAML
- AutoML and Hyperparameter Tuning with FLAML
  - [Notebook: AutoML with FLAML Library](https://github.com/microsoft/FLAML/blob/d047c79352a2b5d32b72f4323dadfa2be0db8a45/notebook/automl_flight_delays.ipynb)
  - [Notebook: Hyperparameter Tuning with FLAML](https://github.com/microsoft/FLAML/blob/d047c79352a2b5d32b72f4323dadfa2be0db8a45/notebook/tune_synapseml.ipynb)

### **Part 3. Deep Dive into FLAML**

- Advanced Functionalities
- Parallelization with Apache Spark
  - [Notebook: FLAML AutoML on Apache Spark](https://github.com/microsoft/FLAML/blob/d047c79352a2b5d32b72f4323dadfa2be0db8a45/notebook/automl_bankrupt_synapseml.ipynb)

### **Part 4. New Features in FLAML**

- Targeted Hyperparameter Optimization With Lexicographic Objectives
  - [Notebook: Tune models with lexicographic preference across objectives](https://github.com/microsoft/FLAML/blob/7ae410c8eb967e2084b2e7dbe7d5fa2145a44b79/notebook/tune_lexicographic.ipynb)
- OpenAI GPT-3, GPT-4 and ChatGPT tuning
  - [Notebook: Use FLAML to Tune OpenAI Models](https://github.com/microsoft/FLAML/blob/a0b318b12ee8288db54b674904655307f9e201c2/notebook/autogen_openai_completion.ipynb)
  - [Notebook: Use FLAML to Tune ChatGPT](https://github.com/microsoft/FLAML/blob/a0b318b12ee8288db54b674904655307f9e201c2/notebook/autogen_chatgpt_gpt4.ipynb)
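The abstract above frames AutoML as searching over learners and their hyperparameters under a resource budget. As a rough pure-Python sketch of that loop (learner names, scores, and the round-robin schedule are all invented for illustration; FLAML's real `AutoML.fit` does much more):

```python
import time

# Illustrative stand-in for "train this learner for one more trial and return
# its validation error"; later trials pretend to find better configurations.
def train_and_score(learner, trial):
    base = {"lgbm": 0.12, "xgboost": 0.14, "rf": 0.18}[learner]
    return base + 0.1 / (trial + 1)

def automl_fit(learners, time_budget_s, max_trials=30):
    """Round-robin over learners until the time budget (or trial cap) runs out;
    keep the (learner, error) pair with the lowest error seen so far."""
    deadline = time.monotonic() + time_budget_s
    best_learner, best_error = None, float("inf")
    trial = 0
    while time.monotonic() < deadline and trial < max_trials:
        learner = learners[trial % len(learners)]
        error = train_and_score(learner, trial // len(learners))
        if error < best_error:
            best_learner, best_error = learner, error
        trial += 1
    return best_learner, best_error

best_learner, best_error = automl_fit(["lgbm", "xgboost", "rf"], time_budget_s=0.1)
print(best_learner, round(best_error, 3))
```

The budget is the key design point: the search is anytime, so a larger `time_budget_s` simply buys more trials and a (weakly) better result, never a different interface.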