diff --git a/README.md b/README.md
index 13a49993c..4f315690a 100644
--- a/README.md
+++ b/README.md
@@ -47,16 +47,18 @@ pip install pyautogen
 ```
 
 Minimal dependencies are installed without extra options. You can install extra options based on the feature you need.
-For example, use the following to install the dependencies needed by the [`blendsearch`](https://microsoft.github.io/FLAML/docs/Use-Cases/Tune-User-Defined-Function#blendsearch-economical-hyperparameter-optimization-with-blended-search-strategy) option.
+
 Find more options in [Installation](https://microsoft.github.io/autogen/docs/Installation).
+For [code execution](https://microsoft.github.io/autogen/FAQ#code-execution), we strongly recommend installing the python `docker` package and using docker.
 
 For LLM inference configurations, check the [FAQ](https://microsoft.github.io/autogen/docs/FAQ#set-your-api-endpoints).
+
 ## Quickstart
 
 ## Multi-Agent Conversation Framework
diff --git a/autogen/agentchat/conversable_agent.py b/autogen/agentchat/conversable_agent.py
index 7ab7cd349..1c875da84 100644
--- a/autogen/agentchat/conversable_agent.py
+++ b/autogen/agentchat/conversable_agent.py
@@ -87,9 +87,9 @@ class ConversableAgent(Agent):
                     If a list or a str of image name(s) is provided, the code will be executed in a docker container
                     with the first image successfully pulled.
                     If None, False or empty, the code will be executed in the current environment.
-                    Default is True, which will be converted into a list.
-                    If the code is executed in the current environment,
-                    the code must be trusted.
+                    Default is True when the docker python package is installed.
+                    When set to True, a default list will be used.
+                    We strongly recommend using docker for code execution.
                 - timeout (Optional, int): The maximum execution time in seconds.
                 - last_n_messages (Experimental, Optional, int): The number of messages to look back for code execution. Default to 1.
             llm_config (dict or False): llm inference configuration.
diff --git a/website/docs/Contribute.md b/website/docs/Contribute.md
index 55f20694c..49d44ef53 100644
--- a/website/docs/Contribute.md
+++ b/website/docs/Contribute.md
@@ -75,7 +75,7 @@ docker run -it autogen-dev
 ### Develop in Remote Container
 
 If you use vscode, you can open the autogen folder in a [Container](https://code.visualstudio.com/docs/remote/containers).
-We have provided the configuration in [devcontainer](https://github.com/microsoft/autogen/blob/main/.devcontainer).
+We have provided the configuration in [devcontainer](https://github.com/microsoft/autogen/blob/main/.devcontainer). It can be used in GitHub Codespaces too. Developing AutoGen in dev containers is recommended.
 
 ### Pre-commit
diff --git a/website/docs/FAQ.md b/website/docs/FAQ.md
index 8cac002b3..57eb30103 100644
--- a/website/docs/FAQ.md
+++ b/website/docs/FAQ.md
@@ -134,3 +134,10 @@ in the system message. This line is in the default system message of the `Assist
 
 If the `# filename` doesn't appear in the suggested code still, consider adding explicit instructions such as "save the code to disk" in the initial user message in `initiate_chat`.
 The `AssistantAgent` doesn't save all the code by default, because there are cases in which one would just like to finish a task without saving the code.
+
+## Code execution
+
+We strongly recommend using docker to execute code. There are two ways to use docker:
+
+1. Run AutoGen in a docker container. For example, when developing in GitHub Codespaces, AutoGen runs in a docker container.
+2. Run AutoGen outside of docker, while performing code execution with a docker container (an example is sketched below). For this option, make sure the python package `docker` is installed. When it is not installed and `use_docker` is omitted in `code_execution_config`, the code will be executed locally (this behavior is subject to change in the future).
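+
+For option 2, a minimal sketch (assuming an `OAI_CONFIG_LIST` file provides the LLM configuration and the `python:3` image can be pulled) might look like:
+
+```python
+import autogen
+
+# Load LLM configurations (assumes an OAI_CONFIG_LIST file or environment variable is available).
+config_list = autogen.config_list_from_json("OAI_CONFIG_LIST")
+
+assistant = autogen.AssistantAgent("assistant", llm_config={"config_list": config_list})
+
+# The user proxy agent executes the code suggested by the assistant inside a docker container.
+user_proxy = autogen.UserProxyAgent(
+    "user_proxy",
+    human_input_mode="NEVER",
+    code_execution_config={
+        "work_dir": "coding",      # local directory where generated code is saved
+        "use_docker": "python:3",  # docker image used to run the code; requires `pip install docker`
+    },
+)
+
+user_proxy.initiate_chat(assistant, message="Compute the 10th Fibonacci number with Python.")
+```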
diff --git a/website/docs/Getting-Started.md b/website/docs/Getting-Started.md
index 05ad8825d..c09296e38 100644
--- a/website/docs/Getting-Started.md
+++ b/website/docs/Getting-Started.md
@@ -1,5 +1,3 @@
-
-
 # Getting Started
 
@@ -21,7 +19,7 @@ AutoGen is powered by collaborative [research studies](/docs/Research) from Micr
 ### Quickstart
 
 Install from pip: `pip install pyautogen`. Find more options in [Installation](/docs/Installation).
-
+For [code execution](https://microsoft.github.io/autogen/FAQ#code-execution), we strongly recommend installing the python `docker` package and using docker.
 #### Multi-Agent Conversation Framework
 
 Autogen enables the next-gen LLM applications with a generic multi-agent conversation framework. It offers customizable and conversable agents which integrate LLMs, tools and human.
diff --git a/website/docs/Installation.md b/website/docs/Installation.md
index 1a97f8bf6..f1c70507a 100644
--- a/website/docs/Installation.md
+++ b/website/docs/Installation.md
@@ -14,6 +14,11 @@ conda install pyautogen -c conda-forge
 ```
 -->
 
 ### Optional Dependencies
+* docker
+We strongly recommend using docker for code execution or running AutoGen in a docker container (e.g., when developing in GitHub Codespaces, AutoGen runs in a docker container). To use docker for code execution, you also need to install the python package `docker`:
+```bash
+pip install docker
+```
 * blendsearch
 ```bash