document about docker (#119)

* document about docker

* clarify

* dev container
Chi Wang 2023-10-05 12:48:24 -07:00 committed by GitHub
parent 5c2a268d95
commit 20d77a1039
6 changed files with 21 additions and 9 deletions

View File

@@ -47,16 +47,18 @@ pip install pyautogen
```
Minimal dependencies are installed without extra options. You can install extra options based on the feature you need.
<!-- For example, use the following to install the dependencies needed by the [`blendsearch`](https://microsoft.github.io/FLAML/docs/Use-Cases/Tune-User-Defined-Function#blendsearch-economical-hyperparameter-optimization-with-blended-search-strategy) option.
```bash
pip install "pyautogen[blendsearch]"
``` -->
Find more options in [Installation](https://microsoft.github.io/autogen/docs/Installation).
<!-- Each of the [`notebook examples`](https://github.com/microsoft/autogen/tree/main/notebook) may require a specific option to be installed. -->
For [code execution](https://microsoft.github.io/autogen/FAQ#code-execution), we strongly recommend installing the `docker` Python package and using docker.
For LLM inference configurations, check the [FAQ](https://microsoft.github.io/autogen/docs/FAQ#set-your-api-endpoints).
## Quickstart
## Multi-Agent Conversation Framework

View File

@@ -87,9 +87,9 @@ class ConversableAgent(Agent):
If a list or a str of image name(s) is provided, the code will be executed in a docker container
with the first image successfully pulled.
If None, False or empty, the code will be executed in the current environment.
Default is True when the docker python package is installed.
When set to True, a default list will be used.
We strongly recommend using docker for code execution.
- timeout (Optional, int): The maximum execution time in seconds.
- last_n_messages (Experimental, Optional, int): The number of messages to look back for code execution. Defaults to 1.
llm_config (dict or False): llm inference configuration.
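To make the `use_docker` semantics in the docstring above concrete, here is a minimal sketch of passing a `code_execution_config` to a `UserProxyAgent`. It is not part of this commit; the agent name, image name, working directory, and timeout values are illustrative assumptions.

```python
from autogen import UserProxyAgent

# Sketch only (values are assumptions, not from this commit):
# execute generated code in a docker container.
user_proxy = UserProxyAgent(
    name="user_proxy",
    human_input_mode="NEVER",
    code_execution_config={
        "work_dir": "coding",           # where generated code files are written
        "use_docker": ["python:3.11"],  # first image that pulls successfully is used
        "timeout": 60,                  # maximum execution time in seconds
        "last_n_messages": 1,           # how many messages to look back for code
    },
)
```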

View File

@@ -75,7 +75,7 @@ docker run -it autogen-dev
### Develop in Remote Container
If you use vscode, you can open the autogen folder in a [Container](https://code.visualstudio.com/docs/remote/containers).
We have provided the configuration in [devcontainer](https://github.com/microsoft/autogen/blob/main/.devcontainer). It can be used in GitHub Codespaces too. Developing AutoGen in dev containers is recommended.
### Pre-commit

View File

@@ -134,3 +134,10 @@ in the system message. This line is in the default system message of the `Assist
If the `# filename` still doesn't appear in the suggested code, consider adding explicit instructions such as "save the code to disk" in the initial user message in `initiate_chat`.
The `AssistantAgent` doesn't save all the code by default, because there are cases in which one would just like to finish a task without saving the code.
## Code execution
We strongly recommend using docker to execute code. There are two ways to use docker:
1. Run AutoGen in a docker container. For example, when developing in GitHub Codespaces, AutoGen runs in a docker container.
2. Run AutoGen outside of docker, while performing code execution with a docker container. For this option, make sure the Python package `docker` is installed. When it is not installed and `use_docker` is omitted in `code_execution_config`, the code will be executed locally (this behavior is subject to change in the future).
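As a rough illustration of option 2, the sketch below (not part of this commit; the agent name and working directory are assumptions) checks that the `docker` package is available and requests docker-based execution explicitly, so the code never silently falls back to running locally:

```python
from autogen import UserProxyAgent

# Option 2 sketch: AutoGen runs on the host, generated code runs in a container.
try:
    import docker  # the python docker SDK must be installed for this option
except ImportError:
    raise RuntimeError("Run `pip install docker` to enable docker-based code execution.")

executor = UserProxyAgent(
    name="executor",
    human_input_mode="NEVER",
    # Passing use_docker explicitly avoids the local-execution fallback that
    # happens when the docker package is missing and use_docker is omitted.
    code_execution_config={"work_dir": "coding", "use_docker": True},
)
```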

View File

@@ -1,5 +1,3 @@
# Getting Started
<!-- ### Welcome to AutoGen, a library for enabling Next-Gen LLM Applications via Multi-Agent Conversation Framework! -->
@@ -21,7 +19,7 @@ AutoGen is powered by collaborative [research studies](/docs/Research) from Micr
### Quickstart
Install from pip: `pip install pyautogen`. Find more options in [Installation](/docs/Installation).
For [code execution](https://microsoft.github.io/autogen/FAQ#code-execution), we strongly recommend installing the `docker` Python package and using docker.
#### Multi-Agent Conversation Framework
AutoGen enables next-gen LLM applications with a generic multi-agent conversation framework. It offers customizable and conversable agents which integrate LLMs, tools, and humans.
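Since the quickstart above stops at installation, here is a minimal two-agent sketch of the conversation framework it describes. It is not content of this commit; the model name, API key placeholder, and task message are assumptions for illustration.

```python
from autogen import AssistantAgent, UserProxyAgent

# Illustrative llm_config; in practice this comes from your own OAI config list.
llm_config = {"config_list": [{"model": "gpt-4", "api_key": "YOUR_API_KEY"}]}

assistant = AssistantAgent("assistant", llm_config=llm_config)
user_proxy = UserProxyAgent(
    "user_proxy",
    human_input_mode="NEVER",
    code_execution_config={"work_dir": "coding", "use_docker": True},
)

# The user proxy sends a task to the assistant and executes any code it suggests.
user_proxy.initiate_chat(assistant, message="Plot a sine wave and save it to sine.png.")
```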

View File

@@ -14,6 +14,11 @@ conda install pyautogen -c conda-forge
``` -->
### Optional Dependencies
* docker
We strongly recommend using docker for code execution or running AutoGen in a docker container (e.g., when developing in GitHub Codespaces, AutoGen runs in a docker container). To use docker for code execution, you also need to install the Python package `docker`:
```bash
pip install docker
```
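After installing, a quick sanity check (an assumed workflow, not part of this commit) is to confirm the `docker` Python package can reach a running docker daemon:

```python
import docker

# Illustrative check: if this raises, docker-based code execution will not work either.
client = docker.from_env()
print(client.ping())  # True when the docker daemon is reachable
```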
* blendsearch
```bash