Improve readability for agentchat_function_call.ipynb (#347)

Shruti Patel 2023-10-21 20:26:34 -04:00 committed by GitHub
parent 15ff52a52c
commit ec423ddf07
1 changed file with 7 additions and 7 deletions

@@ -17,9 +17,9 @@
"source": [
"# Auto Generated Agent Chat: Task Solving with Provided Tools as Functions\n",
"\n",
"AutoGen offers conversable agents powered by LLM, tool or human, which can be used to perform tasks collectively via automated chat. This framework allows tool use and human participation through multi-agent conversation. Please find documentation about this feature [here](https://microsoft.github.io/autogen/docs/Use-Cases/agent_chat).\n",
"AutoGen offers conversable agents powered by LLM, tool, or human, which can be used to perform tasks collectively via automated chat. This framework allows tool use and human participation through multi-agent conversation. Please find documentation about this feature [here](https://microsoft.github.io/autogen/docs/Use-Cases/agent_chat).\n",
"\n",
"In this notebook, we demonstrate how to use `AssistantAgent` and `UserProxyAgent` to make function calls with the new feature of OpenAI models (in model version 0613). A specified prompt and function configs need to be passed to `AssistantAgent` to initialize the agent. The corresponding functions need to be passed to `UserProxyAgent`, which will be responsible for executing any function calls made by `AssistantAgent`. Besides this requirement of matching descriptions with functions, we recommend checking the system message in the `AssistantAgent` to make sure the instructions align with the function call descriptions.\n",
"In this notebook, we demonstrate how to use `AssistantAgent` and `UserProxyAgent` to make function calls with the new feature of OpenAI models (in model version 0613). A specified prompt and function configs must be passed to `AssistantAgent` to initialize the agent. The corresponding functions must be passed to `UserProxyAgent`, which will execute any function calls made by `AssistantAgent`. Besides this requirement of matching descriptions with functions, we recommend checking the system message in the `AssistantAgent` to ensure the instructions align with the function call descriptions.\n",
"\n",
"## Requirements\n",
"\n",
@@ -54,7 +54,7 @@
"- Azure OpenAI API base: os.environ[\"AZURE_OPENAI_API_BASE\"] or `aoai_api_base_file=\"base_aoai.txt\"`. Multiple bases can be stored, one per line.\n",
"\n",
"It's OK to have only the OpenAI API key, or only the Azure OpenAI API key + base.\n",
"If you open this notebook in google colab, you can upload your files by click the file icon on the left panel and then choose \"upload file\" icon.\n",
"If you open this notebook in google colab, you can upload your files by clicking the file icon on the left panel and then choosing \"upload file\" icon.\n",
"\n",
"The following code excludes Azure OpenAI endpoints from the config list because some endpoints don't support functions yet. Remove the `exclude` argument if they do."
]
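One way to build such a config list, shown as a hedged sketch (not necessarily the notebook's exact call), is `autogen.config_list_openai_aoai`, which reads the environment variables or key/base files described above and accepts the `exclude` argument mentioned in the last sentence:

```python
import autogen

config_list = autogen.config_list_openai_aoai(
    key_file_path=".",                       # directory containing the key/base files
    openai_api_key_file="key_openai.txt",
    aoai_api_key_file="key_aoai.txt",
    aoai_api_base_file="base_aoai.txt",
    exclude="aoai",  # drop Azure OpenAI endpoints; remove if yours support functions
)
```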
@@ -270,7 +270,7 @@
" code_execution_config={\"work_dir\": \"coding\"},\n",
")\n",
"\n",
"# define functions according to the function desription\n",
"# define functions according to the function description\n",
"from IPython import get_ipython\n",
"\n",
"def exec_python(cell):\n",
@@ -374,7 +374,7 @@
"import os\n",
"from autogen.agentchat.contrib.math_user_proxy_agent import MathUserProxyAgent\n",
"\n",
"# you need to provide a wolfram alpha appid to run this example\n",
"# you need to provide a wolfram alpha app id to run this example\n",
"if not os.environ.get(\"WOLFRAM_ALPHA_APPID\"):\n",
" os.environ[\"WOLFRAM_ALPHA_APPID\"] = open(\"wolfram.txt\").read().strip()\n",
"\n",
@@ -383,7 +383,7 @@
" \"functions\": [\n",
" {\n",
" \"name\": \"query_wolfram\",\n",
" \"description\": \"Return the API query result from the Wolfram Alpha. the ruturn is a tuple of (result, is_success).\",\n",
" \"description\": \"Return the API query result from the Wolfram Alpha. the return is a tuple of (result, is_success).\",\n",
" \"parameters\": {\n",
" \"type\": \"object\",\n",
" \"properties\": {\n",
@@ -400,7 +400,7 @@
"}\n",
"chatbot = autogen.AssistantAgent(\n",
" name=\"chatbot\",\n",
" system_message=\"Only use the functions you have been provided with. Do not ask user to perform other actions than executing the functions. Reply TERMINATE when the task is done.\",\n",
" system_message=\"Only use the functions you have been provided with. Do not ask the user to perform other actions than executing the functions. Reply TERMINATE when the task is done.\",\n",
" llm_config=llm_config,\n",
")\n",
"\n",