{
 "cells": [
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "## RAG OpenAI Assistants in AutoGen\n",
    "\n",
    "This notebook shows an example of the [`GPTAssistantAgent`](https://github.com/microsoft/autogen/blob/main/autogen/agentchat/contrib/gpt_assistant_agent.py#L16C43-L16C43) with retrieval-augmented generation. `GPTAssistantAgent` is an experimental AutoGen agent class that leverages the [OpenAI Assistant API](https://platform.openai.com/docs/assistants/overview) for conversational capabilities, working with\n",
    "`UserProxyAgent` in AutoGen."
   ]
  },
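  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "Before running the example, `config_list_from_json(\"OAI_CONFIG_LIST\")` needs either an `OAI_CONFIG_LIST` environment variable or a file with that name. The next cell is a minimal sketch of such a file, assuming a `gpt-4-turbo-preview` model and an `OPENAI_API_KEY` environment variable; adjust both to your own setup."
   ]
  },
  {
   "cell_type": "code",
   "execution_count": null,
   "metadata": {},
   "outputs": [],
   "source": [
    "import json\n",
    "import os\n",
    "\n",
    "# A minimal sketch of the OAI_CONFIG_LIST file expected by config_list_from_json:\n",
    "# a JSON list of model configurations. The model name and key source are assumptions.\n",
    "sample_config_list = [\n",
    "    {\n",
    "        \"model\": \"gpt-4-turbo-preview\",\n",
    "        \"api_key\": os.environ.get(\"OPENAI_API_KEY\", \"<your OpenAI API key>\"),\n",
    "    }\n",
    "]\n",
    "\n",
    "# Write it next to the notebook so config_list_from_json(\"OAI_CONFIG_LIST\") can find it.\n",
    "with open(\"OAI_CONFIG_LIST\", \"w\") as f:\n",
    "    json.dump(sample_config_list, f)"
   ]
  },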
  {
   "cell_type": "code",
   "execution_count": 1,
   "metadata": {},
   "outputs": [
    {
     "name": "stderr",
     "output_type": "stream",
     "text": [
      "OpenAI client config of GPTAssistantAgent(assistant) - model: gpt-4-turbo-preview\n",
      "GPT Assistant only supports one OpenAI client. Using the first client in the list.\n",
      "Matching assistant found, using the first matching assistant: {'id': 'asst_sKUCUXkaXyTidtlyovbqppH3', 'created_at': 1710320924, 'description': None, 'file_ids': ['file-AcnBk5PCwAjJMCVO0zLSbzKP'], 'instructions': 'You are adapt at question answering', 'metadata': {}, 'model': 'gpt-4-turbo-preview', 'name': 'assistant', 'object': 'assistant', 'tools': [ToolRetrieval(type='retrieval')]}\n"
     ]
    },
    {
     "name": "stdout",
     "output_type": "stream",
     "text": [
      "\u001b[33muser_proxy\u001b[0m (to assistant):\n",
      "\n",
      "Please explain the code in this file!\n",
      "\n",
      "--------------------------------------------------------------------------------\n",
      "\u001b[33massistant\u001b[0m (to user_proxy):\n",
      "\n",
      "The code in the file appears to define tests for various functionalities related to a GPT-based assistant agent. Here is a summary of the main components and functionalities described in the visible portion of the code:\n",
      "\n",
      "1. **Imports and Setup**: The script imports necessary libraries and modules, such as `pytest` for testing, `uuid` for generating unique identifiers, and `openai` along with custom modules like `autogen` and `OpenAIWrapper`. It sets up the system path to include specific directories for importing test configurations and dependencies.\n",
      "\n",
      "2. **Conditional Test Skipping**: The script includes logic to conditionally skip tests based on the execution environment or missing dependencies. This is done through the `@pytest.mark.skipif` decorator, which conditionally skips test functions based on the `skip` flag, determined by whether certain imports are successful or other conditions are met.\n",
      "\n",
      "3. **Test Configurations**: The code loads configurations for interacting with the OpenAI API and a hypothetical Azure API (as indicated by the placeholder `azure`), filtering these configurations by certain criteria such as API type and version.\n",
      "\n",
      "4. **Test Cases**:\n",
      "   - **Configuration List Test (`test_config_list`)**: Ensures that the configurations for both OpenAI and the hypothetical Azure API are loaded correctly by asserting the presence of at least one configuration for each.\n",
      "   - **GPT Assistant Chat Test (`test_gpt_assistant_chat`)**: Tests the functionality of a GPT Assistant Agent by simulating a chat interaction. It uses a mock function to simulate an external API call and checks if the GPT Assistant properly processes the chat input, invokes the external function, and provides an expected response.\n",
      "   - **Assistant Instructions Test (`test_get_assistant_instructions` and related functions)**: These tests verify that instructions can be set and retrieved correctly for a GPTAssistantAgent. It covers creating an agent, setting instructions, and ensuring the set instructions can be retrieved as expected.\n",
      "   - **Instructions Overwrite Test (`test_gpt_assistant_instructions_overwrite`)**: Examines whether the instructions for a GPTAssistantAgent can be successfully overwritten by creating a new agent with the same ID but different instructions, with an explicit indication to overwrite the previous instructions.\n",
      "\n",
      "Each test case aims to cover different aspects of the GPTAssistantAgent's functionality, such as configuration loading, interactive chat behavior, function registration and invocation, and instruction management. The mocks and assertions within the tests are designed to ensure that each component of the GPTAssistantAgent behaves as expected under controlled conditions.\n",
      "\n",
      "\n",
      "--------------------------------------------------------------------------------\n",
      "\u001b[33muser_proxy\u001b[0m (to assistant):\n",
      "\n",
      "TERMINATE\n",
      "\n",
      "--------------------------------------------------------------------------------\n"
     ]
    },
    {
     "name": "stderr",
     "output_type": "stream",
     "text": [
      "Permanently deleting assistant...\n"
     ]
    }
   ],
   "source": [
    "import logging\n",
    "import os\n",
    "\n",
    "from autogen import UserProxyAgent, config_list_from_json\n",
    "from autogen.agentchat.contrib.gpt_assistant_agent import GPTAssistantAgent\n",
    "\n",
    "logger = logging.getLogger(__name__)\n",
    "logger.setLevel(logging.WARNING)\n",
    "\n",
    "assistant_id = os.environ.get(\"ASSISTANT_ID\", None)\n",
    "\n",
    "config_list = config_list_from_json(\"OAI_CONFIG_LIST\")\n",
    "llm_config = {\n",
    "    \"config_list\": config_list,\n",
    "}\n",
    "assistant_config = {\n",
    "    \"assistant_id\": assistant_id,\n",
    "    \"tools\": [{\"type\": \"retrieval\"}],\n",
    "    \"file_ids\": [\"file-AcnBk5PCwAjJMCVO0zLSbzKP\"],\n",
    "    # add the id of an existing file from your OpenAI account;\n",
    "    # in this case, the implementation of conversable_agent.py was added\n",
    "}\n",
    "\n",
    "gpt_assistant = GPTAssistantAgent(\n",
    "    name=\"assistant\",\n",
    "    instructions=\"You are adept at question answering\",\n",
    "    llm_config=llm_config,\n",
    "    assistant_config=assistant_config,\n",
    ")\n",
    "\n",
    "user_proxy = UserProxyAgent(\n",
    "    name=\"user_proxy\",\n",
    "    code_execution_config=False,\n",
    "    is_termination_msg=lambda msg: \"TERMINATE\" in msg[\"content\"],\n",
    "    human_input_mode=\"ALWAYS\",\n",
    ")\n",
    "user_proxy.initiate_chat(gpt_assistant, message=\"Please explain the code in this file!\")\n",
    "\n",
    "gpt_assistant.delete_assistant()"
   ]
  },
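  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "The `file_ids` entry above references a file that already exists in the OpenAI account. The cell below is a minimal sketch of one way to upload a local file and obtain such an id, assuming the `openai` Python client (v1+) and an illustrative path `conversable_agent.py`; it is not part of the recorded run above."
   ]
  },
  {
   "cell_type": "code",
   "execution_count": null,
   "metadata": {},
   "outputs": [],
   "source": [
    "from openai import OpenAI\n",
    "\n",
    "client = OpenAI()  # reads OPENAI_API_KEY from the environment\n",
    "\n",
    "# Upload a local file so the Assistants retrieval tool can read it.\n",
    "# The path below is only an illustration; point it at the file you want the assistant to answer questions about.\n",
    "with open(\"conversable_agent.py\", \"rb\") as f:\n",
    "    uploaded = client.files.create(file=f, purpose=\"assistants\")\n",
    "\n",
    "# Use this id in assistant_config[\"file_ids\"].\n",
    "print(uploaded.id)"
   ]
  }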
 ],
 "metadata": {
  "kernelspec": {
   "display_name": "Python 3",
   "language": "python",
   "name": "python3"
  },
  "language_info": {
   "codemirror_mode": {
    "name": "ipython",
    "version": 3
   },
   "file_extension": ".py",
   "mimetype": "text/x-python",
   "name": "python",
   "nbconvert_exporter": "python",
   "pygments_lexer": "ipython3",
   "version": "3.10.12"
  }
 },
 "nbformat": 4,
 "nbformat_minor": 2
}