Bring Dotnet AutoGen (#924)

* update readme

* update

* update

* update

* update

* update

* update

* add sample project

* revert notebook change back

* update

* update interactive version

* add nuget package

* refactor Message

* update example

* add azure nightly build pipeline

* Set up CI with Azure Pipelines

[skip ci]

* Update nightly-build.yml for Azure Pipelines

* add dotnet interactive package

* add dotnet interactive package

* update pipeline

* add nuget feed back

* remove dotnet-tool feed

* remove dotnet-tool feed comment

* update pipeline

* update build name

* Update nightly-build.yml

* Delete .github/workflows/dotnet-ci.yml

* update

* add working_dir to use step

* add initiateChat api

* update oai package

* Update dotnet-build.yml

* Update dotnet-run-openai-test-and-notebooks.yml

* update build workflow

* update build workflow

* update nuget feed

* update nuget feed

* update aoai and sk version

* Update InteractiveService.cs

* add support for GPT 4V

* add DalleAndGPT4V example

* update example

* add user proxy agent

* add readme

* bump version

* update example

* add dotnet interactive hook

* update

* update tests

* add website

* update index.md

* add docs

* update doc

* move sk dependency out of core package

* update doc

* Update Use-function-call.md

* add type safe function call document

* update doc

* update doc

* add doc

* Update Use-function-call.md

* add GenerateReplyOptions

* remove IChatLLM

* update version

* update doc

* update website

* add sample

* fix link

* add middleware agent

* clean up doc

* bump version

* update doc

* update

* add Other Language

* remove warnings

* add sign.props

* add sign step

* fix pipeline

* auth

* real sign

* disable PR trigger

* update

* disable PR trigger

* use microbuild machine

* update build pipeline to add publish to internal feed

* add internal feed

* fix build pipeline

* add dotnet prefix

* update ci

* add build number

* update run number

* update source

* update token

* update

* remove adding source

* add publish to github package

* try again

* try again

* ask for write package

* disable package when branch is not main

* update

* implement streaming agent

* add test for streaming function call

* update

* fix #1588

* enable PR check for dotnet branch

* add website readme

* only publish to dotnet feed when pushing to dotnet branch

* remove openai-test-and-notebooks workflow

* update readme

* update readme

* update workflow

* update getting-start

* upgrade test and sample projects to use .NET 8

* fix global.json format && make loadFromConfig API internal only before implementing

* update

* add support for LM studio

* add doc

* Update README.md

* add push and workflow_dispatch trigger

* disable PR for main

* add dotnet env

* Update Installation.md

* add nuget

* refer to newtonsoft 13

* update branch to dotnet in docfx

* Update Installation.md

* pull out HumanInputMiddleware and FunctionCallMiddleware

* fix tests

* add link to sample folder

* refactor message

* refactor over IMessage

* add more tests

* add more test

* fix build error

* rename header

* add semantic kernel project

* update sk example

* update dotnet version

* add LMStudio function call example

* rename LLaMAFunction

* remove dotnet run openai test and notebook workflow

* add FunctionContract and test

* update doc

* add documents

* add workflow

* update

* update sample

* fix warning in test

* result length can be less than maximumOutputToKeep (#1804)

* merge with main

* add option to retrieve inner agent and middlewares from MiddlewareAgent

* update doc

* adjust namespace

* update readme

* fix test

* use IMessage

* more updates

* update

* fix test

* add comments

* use FunctionContract to replace FunctionDefinition

* move AutoGen contract to AutoGen.Core

* update installation

* refactor streamingAgent by adding StreamingMessage type

* update sample

* update samples

* update

* update

* add test

* fix test

* bump version

* add openaichat test

* update

* Update Example03_Agent_FunctionCall.cs

* [.Net] improve docs (#1862)

* add doc

* add doc

* add doc

* add doc

* add doc

* add doc

* update

* fix test error

* fix some error

* fix test

* fix test

* add more tests

* edits

---------

Co-authored-by: ekzhu <ekzhu@users.noreply.github.com>

* [.Net] Add fill form example (#1911)

* add form filler example

* update

* fix ci error

* [.Net] Add using AutoGen.Core in source generator (#1983)

* fix using namespace bug in source generator

* remove using in sourcegenerator test

* disable PR test

* Add .idea to .gitignore (#1988)

* [.Net] publish to nuget.org feed (#1987)

* publish to nuget

* update ci

* update dotnet-release

* update release pipeline

* add source

* remove empty symbol package

* update pipeline

* remove tag

* update installation guide

* [.Net] Rename some classes && APIs based on doc review (#1980)

* rename sequential group chat to round robin group chat

* rename to sendInstruction

* rename workflow to graph

* rename some api

* bump version

* move Graph to GroupChat folder

* rename fill application example

* [.Net] Improve package description (#2161)

* add discord link and update package description

* Update getting-start.md

* [.Net] Fix document comment from the most recent AutoGen.Net engineer sync (#2231)

* update

* rename RegisterPrintMessageHook to RegisterPrintMessage

* update website

* update update.md

* fix link error

* [.Net] Enable JsonMode and deterministic output in AutoGen.OpenAI OpenAIChatAgent (#2347)

* update openai version && add sample for json output

* add example in web

* update update.md

* update image url

* [.Net] Add AutoGen.Mistral package (#2330)

* add mistral client

* enable streaming support

* add mistralClientAgent

* add test for function call

* add extension

* add support for toolcall and toolcall result message

* add support for aggregate message

* implement streaming function call

* track (#2471)

* [.Net] add mistral example (#2482)

* update existing examples to use MessageConnector

* add overview

* add function call document

* add example 14

* add mistral token count usage example

* update version

* Update dotnet-release.yml (#2488)

* update

* revert gitattributes

---------

Co-authored-by: mhensen <mh@webvize.nl>
Co-authored-by: ekzhu <ekzhu@users.noreply.github.com>
Co-authored-by: Krzysztof Kasprowicz <60486987+Krzysztof318@users.noreply.github.com>
Xiaoyun Zhang 2024-04-26 09:21:46 -07:00 committed by GitHub
parent fbcc56c90e
commit 600bd3f2fe
No known key found for this signature in database
GPG Key ID: B5690EEEBB952194
226 changed files with 16125 additions and 22 deletions


@@ -142,5 +142,4 @@ jobs:
       ls -R ./output/nightly
       dotnet nuget push --api-key ${{ secrets.MYGET_TOKEN }} --source "https://www.myget.org/F/agentchat/api/v3/index.json" ./output/nightly/*.nupkg --skip-duplicate
     env:
       MYGET_TOKEN: ${{ secrets.MYGET_TOKEN }}


@@ -7,6 +7,7 @@ on:
 workflow_dispatch:
 push:
   branches:
+    - dotnet/release/**
     - dotnet/release
 concurrency:
@@ -59,13 +60,6 @@ jobs:
       echo "Publish package to Nuget"
       echo "ls output directory"
       ls -R ./output/release
-      dotnet nuget push --api-key AzureArtifacts ./output/release/*.nupkg --skip-duplicate --api-key ${{ secrets.AUTOGEN_NUGET_API_KEY }}
-  - name: Tag commit
-    run: |
-      Write-Host "Tag commit"
-      # version = eng/MetaInfo.props.Project.PropertyGroup.VersionPrefix
-      $metaInfoContent = cat ./eng/MetaInfo.props
-      $version = $metaInfoContent | Select-String -Pattern "<VersionPrefix>(.*)</VersionPrefix>" | ForEach-Object { $_.Matches.Groups[1].Value }
-      git tag -a "$version" -m "AutoGen.Net release $version"
-      git push origin --tags
-    shell: pwsh
+      # remove AutoGen.SourceGenerator.snupkg because it's an empty package
+      rm ./output/release/AutoGen.SourceGenerator.*.snupkg
+      dotnet nuget push --api-key ${{ secrets.AUTOGEN_NUGET_API_KEY }} --source https://api.nuget.org/v3/index.json ./output/release/*.nupkg --skip-duplicate


@@ -1,12 +1,18 @@
 {
   "version": 1,
   "isRoot": true,
   "tools": {
     "dotnet-repl": {
       "version": "0.1.205",
       "commands": [
         "dotnet-repl"
       ]
-    }
+    },
+    "docfx": {
+      "version": "2.67.5",
+      "commands": [
+        "docfx"
+      ]
+    }
   }
 }

dotnet/.editorconfig (new file, 178 lines)

@@ -0,0 +1,178 @@
# EditorConfig is awesome: http://EditorConfig.org
# top-most EditorConfig file
root = true
# Don't use tabs for indentation.
[*]
indent_style = space
# (Please don't specify an indent_size here; that has too many unintended consequences.)
# Code files
[*.{cs,csx,vb,vbx}]
indent_size = 4
insert_final_newline = true
charset = utf-8-bom
[*.xaml]
indent_size = 4
[*.ps1]
indent_size = 2
# Xml project files
[*.{csproj,vbproj,vcxproj,vcxproj.filters,proj,projitems,shproj}]
indent_size = 2
# Xml config files
[*.{props,targets,ruleset,config,nuspec,resx,vsixmanifest,vsct}]
indent_size = 2
# JSON files
[*.json]
indent_size = 2
[*.groovy]
indent_size = 2
# Dotnet code style settings:
[*.{cs,vb}]
# Sort using and Import directives with System.* appearing first
dotnet_sort_system_directives_first = true
dotnet_style_require_accessibility_modifiers = always:warning
# No blank line between System.* and Microsoft.*
dotnet_separate_import_directive_groups = false
# Suggest more modern language features when available
dotnet_style_object_initializer = true:suggestion
dotnet_style_collection_initializer = true:suggestion
dotnet_style_coalesce_expression = true:error
dotnet_style_null_propagation = true:error
dotnet_style_explicit_tuple_names = true:suggestion
dotnet_style_prefer_inferred_tuple_names = true:suggestion
dotnet_style_prefer_inferred_anonymous_type_member_names = true:suggestion
dotnet_style_prefer_is_null_check_over_reference_equality_method = true:suggestion
dotnet_style_prefer_conditional_expression_over_return = false
dotnet_style_prefer_conditional_expression_over_assignment = false
dotnet_style_prefer_auto_properties = false
# Use language keywords instead of framework type names for type references
dotnet_style_predefined_type_for_locals_parameters_members = true:error
dotnet_style_predefined_type_for_member_access = true:error
# Prefer read-only on fields
dotnet_style_readonly_field = false
# CSharp code style settings:
[*.cs]
# Prefer "var" only when the type is apparent
csharp_style_var_for_built_in_types = false:suggestion
csharp_style_var_when_type_is_apparent = true:suggestion
csharp_style_var_elsewhere = false:suggestion
# Prefer method-like constructs to have a block body
csharp_style_expression_bodied_methods = false:none
csharp_style_expression_bodied_constructors = false:none
csharp_style_expression_bodied_operators = false:none
# Prefer property-like constructs to have an expression-body
csharp_style_expression_bodied_properties = true:none
csharp_style_expression_bodied_indexers = true:none
csharp_style_expression_bodied_accessors = true:none
# Use block body for local functions
csharp_style_expression_bodied_local_functions = when_on_single_line:silent
# Suggest more modern language features when available
csharp_style_pattern_matching_over_is_with_cast_check = true:error
csharp_style_pattern_matching_over_as_with_null_check = true:error
csharp_style_inlined_variable_declaration = true:error
csharp_style_throw_expression = true:suggestion
csharp_style_conditional_delegate_call = true:suggestion
csharp_style_deconstructed_variable_declaration = true:suggestion
# Newline settings
csharp_new_line_before_open_brace = all
csharp_new_line_before_else = true
csharp_new_line_before_catch = true
csharp_new_line_before_finally = true
csharp_new_line_before_members_in_object_initializers = true
csharp_new_line_before_members_in_anonymous_types = true
csharp_new_line_between_query_expression_clauses = true
# Indentation options
csharp_indent_case_contents = true
csharp_indent_case_contents_when_block = true
csharp_indent_switch_labels = true
csharp_indent_labels = no_change
csharp_indent_block_contents = true
csharp_indent_braces = false
# Spacing options
csharp_space_after_cast = false
csharp_space_after_keywords_in_control_flow_statements = true
csharp_space_between_method_call_empty_parameter_list_parentheses = false
csharp_space_between_method_call_parameter_list_parentheses = false
csharp_space_between_method_call_name_and_opening_parenthesis = false
csharp_space_between_method_declaration_parameter_list_parentheses = false
csharp_space_between_method_declaration_empty_parameter_list_parentheses = false
csharp_space_between_method_declaration_parameter_list_parentheses = false
csharp_space_between_method_declaration_name_and_open_parenthesis = false
csharp_space_between_parentheses = false
csharp_space_between_square_brackets = false
csharp_space_between_empty_square_brackets = false
csharp_space_before_open_square_brackets = false
csharp_space_around_declaration_statements = false
csharp_space_around_binary_operators = before_and_after
csharp_space_after_cast = false
csharp_space_before_semicolon_in_for_statement = false
csharp_space_before_dot = false
csharp_space_after_dot = false
csharp_space_before_comma = false
csharp_space_after_comma = true
csharp_space_before_colon_in_inheritance_clause = true
csharp_space_after_colon_in_inheritance_clause = true
csharp_space_after_semicolon_in_for_statement = true
# Wrapping
csharp_preserve_single_line_statements = true
csharp_preserve_single_line_blocks = true
# Code block
csharp_prefer_braces = false:none
# Using statements
csharp_using_directive_placement = outside_namespace:error
# Modifier settings
csharp_prefer_static_local_function = true:warning
csharp_preferred_modifier_order = public,private,protected,internal,static,extern,new,virtual,abstract,sealed,override,readonly,unsafe,volatile,async:warning
# Header template
file_header_template = Copyright (c) Microsoft Corporation. All rights reserved.\n{fileName}
dotnet_diagnostic.IDE0073.severity = error
# enable format error
dotnet_diagnostic.IDE0055.severity = error
# IDE0035: Remove unreachable code
dotnet_diagnostic.IDE0035.severity = error
# IDE0005: Remove unnecessary usings
dotnet_diagnostic.CS8019.severity = error
dotnet_diagnostic.IDE0005.severity = error
# IDE0069: Remove unused local variable
dotnet_diagnostic.IDE0069.severity = error
# disable CS1573: Parameter has no matching param tag in the XML comment for
dotnet_diagnostic.CS1573.severity = none
# disable CS1570: XML comment has badly formed XML
dotnet_diagnostic.CS1570.severity = none
# disable check for generated code
[*.generated.cs]
generated_code = true

dotnet/.gitignore (new file, vendored, 30 lines)

@@ -0,0 +1,30 @@
# gitignore file for C#/VS
# Build results
[Dd]ebug/
[Dd]ebugPublic/
[Rr]elease/
[Rr]eleases/
x64/
x86/
build/
bld/
[Bb]in/
[Oo]bj/
# vs cache
.vs/
# vs code cache
.vscode/
# Properties
Properties/
artifacts/
output/
*.binlog
# JetBrains Rider
.idea/

dotnet/AutoGen.sln (new file, 111 lines)

@@ -0,0 +1,111 @@

Microsoft Visual Studio Solution File, Format Version 12.00
# Visual Studio Version 17
VisualStudioVersion = 17.8.34322.80
MinimumVisualStudioVersion = 10.0.40219.1
Project("{9A19103F-16F7-4668-BE54-9A1E7A4F7556}") = "AutoGen", "src\AutoGen\AutoGen.csproj", "{B2B27ACB-AA50-4FED-A06C-3AD6B4218188}"
EndProject
Project("{2150E333-8FDC-42A3-9474-1A3956D46DE8}") = "src", "src", "{18BF8DD7-0585-48BF-8F97-AD333080CE06}"
EndProject
Project("{2150E333-8FDC-42A3-9474-1A3956D46DE8}") = "test", "test", "{F823671B-3ECA-4AE6-86DA-25E920D3FE64}"
EndProject
Project("{9A19103F-16F7-4668-BE54-9A1E7A4F7556}") = "AutoGen.Tests", "test\AutoGen.Tests\AutoGen.Tests.csproj", "{FDD99AEC-4C57-4020-B23F-650612856102}"
EndProject
Project("{9A19103F-16F7-4668-BE54-9A1E7A4F7556}") = "AutoGen.SourceGenerator", "src\AutoGen.SourceGenerator\AutoGen.SourceGenerator.csproj", "{3FFD14E3-D6BC-4EA7-97A2-D21733060FD6}"
EndProject
Project("{9A19103F-16F7-4668-BE54-9A1E7A4F7556}") = "AutoGen.SourceGenerator.Tests", "test\AutoGen.SourceGenerator.Tests\AutoGen.SourceGenerator.Tests.csproj", "{05A2FAD8-03B0-4B2F-82AF-2F6BF0F050E5}"
EndProject
Project("{9A19103F-16F7-4668-BE54-9A1E7A4F7556}") = "AutoGen.BasicSample", "sample\AutoGen.BasicSamples\AutoGen.BasicSample.csproj", "{7EBF916A-A7B1-4B74-AF10-D705B7A18F58}"
EndProject
Project("{2150E333-8FDC-42A3-9474-1A3956D46DE8}") = "sample", "sample", "{FBFEAD1F-29EB-4D99-A672-0CD8473E10B9}"
EndProject
Project("{9A19103F-16F7-4668-BE54-9A1E7A4F7556}") = "AutoGen.DotnetInteractive", "src\AutoGen.DotnetInteractive\AutoGen.DotnetInteractive.csproj", "{B61D8008-7FB7-4C0E-8044-3A74AA63A596}"
EndProject
Project("{9A19103F-16F7-4668-BE54-9A1E7A4F7556}") = "AutoGen.LMStudio", "src\AutoGen.LMStudio\AutoGen.LMStudio.csproj", "{F98BDA9B-8657-4BA8-9B03-BAEA454CAE60}"
EndProject
Project("{9A19103F-16F7-4668-BE54-9A1E7A4F7556}") = "AutoGen.SemanticKernel", "src\AutoGen.SemanticKernel\AutoGen.SemanticKernel.csproj", "{45D6FC80-36F3-4967-9663-E20B63824621}"
EndProject
Project("{9A19103F-16F7-4668-BE54-9A1E7A4F7556}") = "AutoGen.Core", "src\AutoGen.Core\AutoGen.Core.csproj", "{D58D43D1-0617-4A3D-9932-C773E6398535}"
EndProject
Project("{9A19103F-16F7-4668-BE54-9A1E7A4F7556}") = "AutoGen.OpenAI", "src\AutoGen.OpenAI\AutoGen.OpenAI.csproj", "{63445BB7-DBB9-4AEF-9D6F-98BBE75EE1EC}"
EndProject
Project("{FAE04EC0-301F-11D3-BF4B-00C04F79EFBC}") = "AutoGen.Mistral", "src\AutoGen.Mistral\AutoGen.Mistral.csproj", "{6585D1A4-3D97-4D76-A688-1933B61AEB19}"
EndProject
Project("{FAE04EC0-301F-11D3-BF4B-00C04F79EFBC}") = "AutoGen.Mistral.Tests", "test\AutoGen.Mistral.Tests\AutoGen.Mistral.Tests.csproj", "{15441693-3659-4868-B6C1-B106F52FF3BA}"
EndProject
Global
GlobalSection(SolutionConfigurationPlatforms) = preSolution
Debug|Any CPU = Debug|Any CPU
Release|Any CPU = Release|Any CPU
EndGlobalSection
GlobalSection(ProjectConfigurationPlatforms) = postSolution
{B2B27ACB-AA50-4FED-A06C-3AD6B4218188}.Debug|Any CPU.ActiveCfg = Debug|Any CPU
{B2B27ACB-AA50-4FED-A06C-3AD6B4218188}.Debug|Any CPU.Build.0 = Debug|Any CPU
{B2B27ACB-AA50-4FED-A06C-3AD6B4218188}.Release|Any CPU.ActiveCfg = Release|Any CPU
{B2B27ACB-AA50-4FED-A06C-3AD6B4218188}.Release|Any CPU.Build.0 = Release|Any CPU
{FDD99AEC-4C57-4020-B23F-650612856102}.Debug|Any CPU.ActiveCfg = Debug|Any CPU
{FDD99AEC-4C57-4020-B23F-650612856102}.Debug|Any CPU.Build.0 = Debug|Any CPU
{FDD99AEC-4C57-4020-B23F-650612856102}.Release|Any CPU.ActiveCfg = Release|Any CPU
{FDD99AEC-4C57-4020-B23F-650612856102}.Release|Any CPU.Build.0 = Release|Any CPU
{3FFD14E3-D6BC-4EA7-97A2-D21733060FD6}.Debug|Any CPU.ActiveCfg = Debug|Any CPU
{3FFD14E3-D6BC-4EA7-97A2-D21733060FD6}.Debug|Any CPU.Build.0 = Debug|Any CPU
{3FFD14E3-D6BC-4EA7-97A2-D21733060FD6}.Release|Any CPU.ActiveCfg = Release|Any CPU
{3FFD14E3-D6BC-4EA7-97A2-D21733060FD6}.Release|Any CPU.Build.0 = Release|Any CPU
{05A2FAD8-03B0-4B2F-82AF-2F6BF0F050E5}.Debug|Any CPU.ActiveCfg = Debug|Any CPU
{05A2FAD8-03B0-4B2F-82AF-2F6BF0F050E5}.Debug|Any CPU.Build.0 = Debug|Any CPU
{05A2FAD8-03B0-4B2F-82AF-2F6BF0F050E5}.Release|Any CPU.ActiveCfg = Release|Any CPU
{05A2FAD8-03B0-4B2F-82AF-2F6BF0F050E5}.Release|Any CPU.Build.0 = Release|Any CPU
{7EBF916A-A7B1-4B74-AF10-D705B7A18F58}.Debug|Any CPU.ActiveCfg = Debug|Any CPU
{7EBF916A-A7B1-4B74-AF10-D705B7A18F58}.Debug|Any CPU.Build.0 = Debug|Any CPU
{7EBF916A-A7B1-4B74-AF10-D705B7A18F58}.Release|Any CPU.ActiveCfg = Release|Any CPU
{7EBF916A-A7B1-4B74-AF10-D705B7A18F58}.Release|Any CPU.Build.0 = Release|Any CPU
{B61D8008-7FB7-4C0E-8044-3A74AA63A596}.Debug|Any CPU.ActiveCfg = Debug|Any CPU
{B61D8008-7FB7-4C0E-8044-3A74AA63A596}.Debug|Any CPU.Build.0 = Debug|Any CPU
{B61D8008-7FB7-4C0E-8044-3A74AA63A596}.Release|Any CPU.ActiveCfg = Release|Any CPU
{B61D8008-7FB7-4C0E-8044-3A74AA63A596}.Release|Any CPU.Build.0 = Release|Any CPU
{F98BDA9B-8657-4BA8-9B03-BAEA454CAE60}.Debug|Any CPU.ActiveCfg = Debug|Any CPU
{F98BDA9B-8657-4BA8-9B03-BAEA454CAE60}.Debug|Any CPU.Build.0 = Debug|Any CPU
{F98BDA9B-8657-4BA8-9B03-BAEA454CAE60}.Release|Any CPU.ActiveCfg = Release|Any CPU
{F98BDA9B-8657-4BA8-9B03-BAEA454CAE60}.Release|Any CPU.Build.0 = Release|Any CPU
{45D6FC80-36F3-4967-9663-E20B63824621}.Debug|Any CPU.ActiveCfg = Debug|Any CPU
{45D6FC80-36F3-4967-9663-E20B63824621}.Debug|Any CPU.Build.0 = Debug|Any CPU
{45D6FC80-36F3-4967-9663-E20B63824621}.Release|Any CPU.ActiveCfg = Release|Any CPU
{45D6FC80-36F3-4967-9663-E20B63824621}.Release|Any CPU.Build.0 = Release|Any CPU
{D58D43D1-0617-4A3D-9932-C773E6398535}.Debug|Any CPU.ActiveCfg = Debug|Any CPU
{D58D43D1-0617-4A3D-9932-C773E6398535}.Debug|Any CPU.Build.0 = Debug|Any CPU
{D58D43D1-0617-4A3D-9932-C773E6398535}.Release|Any CPU.ActiveCfg = Release|Any CPU
{D58D43D1-0617-4A3D-9932-C773E6398535}.Release|Any CPU.Build.0 = Release|Any CPU
{63445BB7-DBB9-4AEF-9D6F-98BBE75EE1EC}.Debug|Any CPU.ActiveCfg = Debug|Any CPU
{63445BB7-DBB9-4AEF-9D6F-98BBE75EE1EC}.Debug|Any CPU.Build.0 = Debug|Any CPU
{63445BB7-DBB9-4AEF-9D6F-98BBE75EE1EC}.Release|Any CPU.ActiveCfg = Release|Any CPU
{63445BB7-DBB9-4AEF-9D6F-98BBE75EE1EC}.Release|Any CPU.Build.0 = Release|Any CPU
{6585D1A4-3D97-4D76-A688-1933B61AEB19}.Debug|Any CPU.ActiveCfg = Debug|Any CPU
{6585D1A4-3D97-4D76-A688-1933B61AEB19}.Debug|Any CPU.Build.0 = Debug|Any CPU
{6585D1A4-3D97-4D76-A688-1933B61AEB19}.Release|Any CPU.ActiveCfg = Release|Any CPU
{6585D1A4-3D97-4D76-A688-1933B61AEB19}.Release|Any CPU.Build.0 = Release|Any CPU
{15441693-3659-4868-B6C1-B106F52FF3BA}.Debug|Any CPU.ActiveCfg = Debug|Any CPU
{15441693-3659-4868-B6C1-B106F52FF3BA}.Debug|Any CPU.Build.0 = Debug|Any CPU
{15441693-3659-4868-B6C1-B106F52FF3BA}.Release|Any CPU.ActiveCfg = Release|Any CPU
{15441693-3659-4868-B6C1-B106F52FF3BA}.Release|Any CPU.Build.0 = Release|Any CPU
EndGlobalSection
GlobalSection(SolutionProperties) = preSolution
HideSolutionNode = FALSE
EndGlobalSection
GlobalSection(NestedProjects) = preSolution
{B2B27ACB-AA50-4FED-A06C-3AD6B4218188} = {18BF8DD7-0585-48BF-8F97-AD333080CE06}
{FDD99AEC-4C57-4020-B23F-650612856102} = {F823671B-3ECA-4AE6-86DA-25E920D3FE64}
{3FFD14E3-D6BC-4EA7-97A2-D21733060FD6} = {18BF8DD7-0585-48BF-8F97-AD333080CE06}
{05A2FAD8-03B0-4B2F-82AF-2F6BF0F050E5} = {F823671B-3ECA-4AE6-86DA-25E920D3FE64}
{7EBF916A-A7B1-4B74-AF10-D705B7A18F58} = {FBFEAD1F-29EB-4D99-A672-0CD8473E10B9}
{B61D8008-7FB7-4C0E-8044-3A74AA63A596} = {18BF8DD7-0585-48BF-8F97-AD333080CE06}
{F98BDA9B-8657-4BA8-9B03-BAEA454CAE60} = {18BF8DD7-0585-48BF-8F97-AD333080CE06}
{45D6FC80-36F3-4967-9663-E20B63824621} = {18BF8DD7-0585-48BF-8F97-AD333080CE06}
{D58D43D1-0617-4A3D-9932-C773E6398535} = {18BF8DD7-0585-48BF-8F97-AD333080CE06}
{63445BB7-DBB9-4AEF-9D6F-98BBE75EE1EC} = {18BF8DD7-0585-48BF-8F97-AD333080CE06}
{6585D1A4-3D97-4D76-A688-1933B61AEB19} = {18BF8DD7-0585-48BF-8F97-AD333080CE06}
{15441693-3659-4868-B6C1-B106F52FF3BA} = {F823671B-3ECA-4AE6-86DA-25E920D3FE64}
EndGlobalSection
GlobalSection(ExtensibilityGlobals) = postSolution
SolutionGuid = {93384647-528D-46C8-922C-8DB36A382F0B}
EndGlobalSection
EndGlobal


@@ -0,0 +1,23 @@
<?xml version="1.0" encoding="utf-8"?>
<Project ToolsVersion="4.0" xmlns="http://schemas.microsoft.com/developer/msbuild/2003">
<Import Project="./eng/Version.props" />
<Import Project="./eng/MetaInfo.props" />
<Import Project="./eng/Sign.props" />
<PropertyGroup>
<TestTargetFramework>net8.0</TestTargetFramework>
<LangVersion>preview</LangVersion>
<Nullable>enable</Nullable>
<SignAssembly>True</SignAssembly>
<AssemblyOriginatorKeyFile>$(MSBuildThisFileDirectory)eng/opensource.snk</AssemblyOriginatorKeyFile>
<CSNoWarn>CS1998;CS1591</CSNoWarn>
<NoWarn>$(NoWarn);$(CSNoWarn);NU5104</NoWarn>
<TreatWarningsAsErrors>true</TreatWarningsAsErrors>
<IsPackable>false</IsPackable>
<EnableNetAnalyzers>true</EnableNetAnalyzers>
<EnforceCodeStyleInBuild>true</EnforceCodeStyleInBuild>
</PropertyGroup>
<PropertyGroup>
<RepoRoot>$(MSBuildThisFileDirectory)../</RepoRoot>
</PropertyGroup>
</Project>

dotnet/NuGet.config (new file, 10 lines)

@@ -0,0 +1,10 @@
<?xml version="1.0" encoding="utf-8"?>
<configuration>
<packageSources>
<clear />
<add key="dotnet-public" value="https://pkgs.dev.azure.com/dnceng/public/_packaging/dotnet-public/nuget/v3/index.json" />
<add key="dotnet-tools" value="https://pkgs.dev.azure.com/dnceng/public/_packaging/dotnet-tools/nuget/v3/index.json" />
<add key="nuget" value="https://api.nuget.org/v3/index.json" />
</packageSources>
<disabledPackageSources />
</configuration>

dotnet/README.md (new file, 103 lines)

@@ -0,0 +1,103 @@
### AutoGen for .NET
[![dotnet-ci](https://github.com/microsoft/autogen/actions/workflows/dotnet-build.yml/badge.svg)](https://github.com/microsoft/autogen/actions/workflows/dotnet-build.yml)
[![NuGet version](https://badge.fury.io/nu/AutoGen.Core.svg)](https://badge.fury.io/nu/AutoGen.Core)
> [!NOTE]
> Nightly build is available at:
> - ![Static Badge](https://img.shields.io/badge/public-blue?style=flat) ![Static Badge](https://img.shields.io/badge/nightly-yellow?style=flat) ![Static Badge](https://img.shields.io/badge/github-grey?style=flat): https://nuget.pkg.github.com/microsoft/index.json
> - ![Static Badge](https://img.shields.io/badge/public-blue?style=flat) ![Static Badge](https://img.shields.io/badge/nightly-yellow?style=flat) ![Static Badge](https://img.shields.io/badge/myget-grey?style=flat): https://www.myget.org/F/agentchat/api/v3/index.json
> - ![Static Badge](https://img.shields.io/badge/internal-blue?style=flat) ![Static Badge](https://img.shields.io/badge/nightly-yellow?style=flat) ![Static Badge](https://img.shields.io/badge/azure_devops-grey?style=flat) : https://devdiv.pkgs.visualstudio.com/DevDiv/_packaging/AutoGen/nuget/v3/index.json
First, follow the [installation guide](./website/articles/Installation.md) to install the AutoGen packages.
Then you can start with the following code snippet to create a conversable agent and chat with it.
```csharp
using AutoGen;
using AutoGen.OpenAI;
var openAIKey = Environment.GetEnvironmentVariable("OPENAI_API_KEY") ?? throw new Exception("Please set OPENAI_API_KEY environment variable.");
var gpt35Config = new OpenAIConfig(openAIKey, "gpt-3.5-turbo");
var assistantAgent = new AssistantAgent(
name: "assistant",
systemMessage: "You are an assistant that help user to do some tasks.",
llmConfig: new ConversableAgentConfig
{
Temperature = 0,
ConfigList = [gpt35Config],
})
.RegisterPrintMessage(); // register a hook to print message nicely to console
// set human input mode to ALWAYS so that the user always provides input
var userProxyAgent = new UserProxyAgent(
name: "user",
humanInputMode: ConversableAgent.HumanInputMode.ALWAYS)
.RegisterPrintMessage();
// start the conversation
await userProxyAgent.InitiateChatAsync(
receiver: assistantAgent,
message: "Hey assistant, please do me a favor.",
maxRound: 10);
```
#### Samples
You can find more examples under the [sample project](https://github.com/microsoft/autogen/tree/dotnet/dotnet/sample/AutoGen.BasicSamples).
#### Functionality
- ConversableAgent
- [x] function call
- [x] code execution (dotnet only, powered by [`dotnet-interactive`](https://github.com/dotnet/interactive))
- Agent communication
- [x] Two-agent chat
- [x] Group chat
- [ ] Enhanced LLM Inferences
- Exclusive for dotnet
- [x] Source generator for type-safe function definition generation
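
As an illustration of the source-generator feature above, here is a minimal sketch. The class and member names (`WeatherFunctions`, `GetWeather`) are illustrative assumptions, not taken from this PR:

```csharp
using System.Threading.Tasks;
using AutoGen.Core;

// Declaring the class `partial` lets AutoGen.SourceGenerator emit the
// type-safe function contract and invocation wrapper alongside it.
public partial class WeatherFunctions
{
    /// <summary>
    /// Get the current weather for a city.
    /// </summary>
    /// <param name="city">The city name.</param>
    [Function]
    public Task<string> GetWeather(string city)
    {
        // Hypothetical implementation for illustration only.
        return Task.FromResult($"The weather in {city} is sunny.");
    }
}
```

The generator then produces a function contract and a wrapper that deserializes the JSON arguments, which can be handed to an agent as part of its function map.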
#### Update log
##### Update on 0.0.11 (2024-03-26)
- Add link to Discord channel in nuget's readme.md
- Document improvements
##### Update on 0.0.10 (2024-03-12)
- Rename `Workflow` to `Graph`
- Rename `AddInitializeMessage` to `SendIntroduction`
- Rename `SequentialGroupChat` to `RoundRobinGroupChat`
##### Update on 0.0.9 (2024-03-02)
- Refactor over @AutoGen.Message and introduce `TextMessage`, `ImageMessage`, `MultiModalMessage` and so on. PR [#1676](https://github.com/microsoft/autogen/pull/1676)
- Add `AutoGen.SemanticKernel` to support seamless integration with Semantic Kernel
- Move the agent contract abstraction to `AutoGen.Core` package. The `AutoGen.Core` package provides the abstraction for message type, agent and group chat and doesn't contain dependencies over `Azure.AI.OpenAI` or `Semantic Kernel`. This is useful when you want to leverage AutoGen's abstraction only and want to avoid introducing any other dependencies.
- Move `GPTAgent`, `OpenAIChatAgent` and all openai-dependencies to `AutoGen.OpenAI`
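
To illustrate the 0.0.9 message refactor, a short sketch of constructing the new message types (constructor shapes are assumptions based on the `AutoGen.Core` API at this time):

```csharp
using System;
using AutoGen.Core;

// The message abstraction lives in AutoGen.Core and carries no
// dependency on Azure.AI.OpenAI or Semantic Kernel.
var text = new TextMessage(Role.User, "What is shown in this image?");
var image = new ImageMessage(Role.User, new Uri("https://example.com/cat.png"));

// A multi-modal message aggregates several messages under one role.
var multiModal = new MultiModalMessage(Role.User, new IMessage[] { text, image });
```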
##### Update on 0.0.8 (2024-02-28)
- Fix [#1804](https://github.com/microsoft/autogen/pull/1804)
- Streaming support for IAgent [#1656](https://github.com/microsoft/autogen/pull/1656)
- Streaming support for middleware via `MiddlewareStreamingAgent` [#1656](https://github.com/microsoft/autogen/pull/1656)
- Graph chat support with conditional transition workflow [#1761](https://github.com/microsoft/autogen/pull/1761)
- AutoGen.SourceGenerator: Generate `FunctionContract` from `FunctionAttribute` [#1736](https://github.com/microsoft/autogen/pull/1736)
##### Update on 0.0.7 (2024-02-11)
- Add `AutoGen.LMStudio` to support consuming an OpenAI-like API from the LMStudio local server
##### Update on 0.0.6 (2024-01-23)
- Add `MiddlewareAgent`
- Use `MiddlewareAgent` to implement existing agent hooks (RegisterPreProcess, RegisterPostProcess, RegisterReply)
- Remove `AutoReplyAgent`, `PreProcessAgent`, `PostProcessAgent` because they are replaced by `MiddlewareAgent`
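
A minimal sketch of the `MiddlewareAgent` pattern described above (the delegate signature is an assumption based on the `AutoGen.Core` API):

```csharp
using System;
using System.Linq;
using AutoGen.Core;

// Wrap any IAgent; each middleware sees the messages before the inner
// agent does and can short-circuit or post-process the reply.
IAgent WithLogging(IAgent innerAgent)
{
    var middlewareAgent = new MiddlewareAgent(innerAgent);
    middlewareAgent.Use(async (messages, options, agent, cancellationToken) =>
    {
        Console.WriteLine($"[{agent.Name}] received {messages.Count()} message(s)");
        return await agent.GenerateReplyAsync(messages, options, cancellationToken);
    });
    return middlewareAgent;
}
```

This is the mechanism that replaces the removed `AutoReplyAgent`, `PreProcessAgent`, and `PostProcessAgent` hooks.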
##### Update on 0.0.5
- Simplify `IAgent` interface by removing `ChatLLM` Property
- Add `GenerateReplyOptions` to `IAgent.GenerateReplyAsync`, which allows users to specify or override options when generating a reply
##### Update on 0.0.4
- Move out dependency of Semantic Kernel
- Add type `IChatLLM` as connector to LLM
##### Update on 0.0.3
- In AutoGen.SourceGenerator, rename FunctionAttribution to FunctionAttribute
- In AutoGen, refactor over ConversationAgent, UserProxyAgent, and AssistantAgent
##### Update on 0.0.2
- update Azure.AI.OpenAI to 1.0.0-beta.12
- update Semantic Kernel to 1.0.1

dotnet/eng/MetaInfo.props (new file, 12 lines)

@@ -0,0 +1,12 @@
<?xml version="1.0" encoding="utf-8"?>
<Project ToolsVersion="4.0" xmlns="http://schemas.microsoft.com/developer/msbuild/2003">
<PropertyGroup>
<VersionPrefix>0.0.12</VersionPrefix>
<Authors>AutoGen</Authors>
<PackageProjectUrl>https://microsoft.github.io/autogen-for-net/</PackageProjectUrl>
<RepositoryUrl>https://github.com/microsoft/autogen</RepositoryUrl>
<RepositoryType>git</RepositoryType>
<PackageLicenseExpression>MIT</PackageLicenseExpression>
<PackageRequireLicenseAcceptance>false</PackageRequireLicenseAcceptance>
</PropertyGroup>
</Project>

dotnet/eng/Sign.props (new file, 22 lines)

@@ -0,0 +1,22 @@
<?xml version="1.0" encoding="utf-8"?>
<Project ToolsVersion="4.0" xmlns="http://schemas.microsoft.com/developer/msbuild/2003">
<PropertyGroup>
<SignType></SignType>
</PropertyGroup>
<ItemGroup Condition="'$(SignType)' == 'Test' OR '$(SignType)' == 'REAL'">
<PackageReference Include="Microsoft.VisualStudioEng.MicroBuild.Core" Version="1.0.0">
<PrivateAssets>all</PrivateAssets>
<IncludeAssets>runtime; build; native; contentfiles; analyzers</IncludeAssets>
</PackageReference>
<FilesToSign Include="$(OutDir)\AutoGen*.dll">
<Authenticode>Microsoft400</Authenticode>
</FilesToSign>
<!-- nuget package -->
<FilesToSign Include="$(OutDir)\AutoGen*.nupkg">
<Authenticode>NuGet</Authenticode>
</FilesToSign>
</ItemGroup>
</Project>

dotnet/eng/Version.props (new file, 17 lines)

@@ -0,0 +1,17 @@
<?xml version="1.0" encoding="utf-8"?>
<Project ToolsVersion="4.0" xmlns="http://schemas.microsoft.com/developer/msbuild/2003">
<PropertyGroup>
<AzureOpenAIVersion>1.0.0-beta.15</AzureOpenAIVersion>
<SemanticKernelVersion>1.7.1</SemanticKernelVersion>
<SemanticKernelExperimentalVersion>1.7.1-alpha</SemanticKernelExperimentalVersion>
<SystemCodeDomVersion>5.0.0</SystemCodeDomVersion>
<MicrosoftCodeAnalysisVersion>4.3.0</MicrosoftCodeAnalysisVersion>
<ApprovalTestVersion>6.0.0</ApprovalTestVersion>
<FluentAssertionVersion>6.8.0</FluentAssertionVersion>
<XUnitVersion>2.4.2</XUnitVersion>
<MicrosoftNETTestSdkVersion>17.7.0</MicrosoftNETTestSdkVersion>
<MicrosoftDotnetInteractive>1.0.0-beta.23523.2</MicrosoftDotnetInteractive>
<MicrosoftSourceLinkGitHubVersion>8.0.0</MicrosoftSourceLinkGitHubVersion>
<JsonSchemaVersion>4.0.0</JsonSchemaVersion>
</PropertyGroup>
</Project>

dotnet/eng/opensource.snk (new binary file, not shown)

dotnet/global.json (new file)

@@ -0,0 +1,6 @@
{
"sdk": {
"version": "8.0.101",
"rollForward": "latestMinor"
}
}

dotnet/nuget/NUGET.md (new file)

@@ -0,0 +1,8 @@
### About AutoGen for .NET
`AutoGen for .NET` is the official .NET SDK for [AutoGen](https://github.com/microsoft/autogen). It enables you to create LLM agents and construct multi-agent workflows with ease. It also provides integration with popular platforms like OpenAI, Semantic Kernel, and LM Studio.
### Getting started
- Find documents and examples on our [document site](https://microsoft.github.io/autogen-for-net/)
- Join our [Discord channel](https://discord.gg/pAbnFJrkgZ) to get help and discuss with the community
- Report a bug or request a feature by creating a new issue in our [github repo](https://github.com/microsoft/autogen)
- Consume the nightly build package from one of the [nightly build feeds](https://microsoft.github.io/autogen-for-net/articles/Installation.html#nighly-build)
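To give a flavor of the API, here is a minimal sketch of creating and chatting with an agent, adapted from the getting-started example in this repository (the model name and configuration are illustrative):

```csharp
using AutoGen;
using AutoGen.Core;
using AutoGen.OpenAI;

// Create an assistant agent backed by gpt-3.5-turbo and print its replies.
var openAIKey = Environment.GetEnvironmentVariable("OPENAI_API_KEY")
    ?? throw new Exception("Please set OPENAI_API_KEY environment variable.");
var gpt35Config = new OpenAIConfig(openAIKey, "gpt-3.5-turbo");
var assistant = new AssistantAgent(
    name: "assistant",
    systemMessage: "You are a helpful assistant.",
    llmConfig: new ConversableAgentConfig { ConfigList = [gpt35Config] })
    .RegisterPrintMessage(); // print each message nicely to the console
var reply = await assistant.SendAsync("Hello");
```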

dotnet/nuget/icon.png (new file, Git LFS pointer)

@@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:02dbf31fea0b92714c80fdc90888da7e96374a1f52c621a939835fd3c876ddcc
size 426084

@@ -0,0 +1,54 @@
<Project>
<PropertyGroup>
<IsPackable>true</IsPackable>
<!-- Default description and tags. Packages can override. -->
<Authors>AutoGen</Authors>
<Company>Microsoft</Company>
<Product>AutoGen</Product>
<Description>A programming framework for agentic AI</Description>
<PackageTags>AI, Artificial Intelligence, SDK</PackageTags>
<PackageId>$(AssemblyName)</PackageId>
<!-- Required license, copyright, and repo information. Packages can override. -->
<PackageLicenseExpression>MIT</PackageLicenseExpression>
<Copyright>© Microsoft Corporation. All rights reserved.</Copyright>
<PackageProjectUrl>https://microsoft.github.io/autogen-for-net</PackageProjectUrl>
<RepositoryUrl>https://github.com/microsoft/autogen</RepositoryUrl>
<!-- Use icon and NUGET readme from dotnet/nuget folder -->
<PackageIcon>icon.png</PackageIcon>
<PackageIconUrl>icon.png</PackageIconUrl>
<PackageReadmeFile>NUGET.md</PackageReadmeFile>
<!-- Build symbol package (.snupkg) to distribute the PDB containing Source Link -->
<IncludeSymbols>true</IncludeSymbols>
<SymbolPackageFormat>snupkg</SymbolPackageFormat>
<!-- Optional: Publish the repository URL in the built .nupkg (in the NuSpec <Repository> element) -->
<PublishRepositoryUrl>true</PublishRepositoryUrl>
<!-- Optional: Embed source files that are not tracked by the source control manager in the PDB -->
<EmbedUntrackedSources>true</EmbedUntrackedSources>
<!-- Include the XML documentation file in the NuGet package. -->
<DocumentationFile>bin\$(Configuration)\$(TargetFramework)\$(AssemblyName).xml</DocumentationFile>
</PropertyGroup>
<ItemGroup>
<!-- SourceLink allows step-through debugging for source hosted on GitHub. -->
<!-- https://github.com/dotnet/sourcelink -->
<PackageReference Include="Microsoft.SourceLink.GitHub" PrivateAssets="All" Version="$(MicrosoftSourceLinkGitHubVersion)" />
</ItemGroup>
<ItemGroup>
<!-- Include icon.png and NUGET.md in the project. -->
<None Include="$(RepoRoot)/dotnet/nuget/icon.png" Link="icon.png" Pack="true" PackagePath="." />
<None Include="$(RepoRoot)/dotnet/nuget/NUGET.md" Link="NUGET.md" Pack="true" PackagePath="." />
</ItemGroup>
<PropertyGroup Condition=" '$(Configuration)' == 'Release' ">
<GeneratePackageOnBuild>true</GeneratePackageOnBuild>
</PropertyGroup>
</Project>

@@ -0,0 +1,19 @@
<Project Sdk="Microsoft.NET.Sdk">
<PropertyGroup>
<OutputType>Exe</OutputType>
<TargetFramework>$(TestTargetFramework)</TargetFramework>
<ImplicitUsings>enable</ImplicitUsings>
<Nullable>enable</Nullable>
<GenerateDocumentationFile>True</GenerateDocumentationFile>
<NoWarn>$(NoWarn);CS8981;CS8600;CS8602;CS8604;CS8618;CS0219;SKEXP0054;SKEXP0050</NoWarn>
</PropertyGroup>
<ItemGroup>
<ProjectReference Include="..\..\src\AutoGen.DotnetInteractive\AutoGen.DotnetInteractive.csproj" />
<ProjectReference Include="..\..\src\AutoGen.SourceGenerator\AutoGen.SourceGenerator.csproj" OutputItemType="Analyzer" ReferenceOutputAssembly="false" />
<ProjectReference Include="..\..\src\AutoGen\AutoGen.csproj" />
<PackageReference Include="FluentAssertions" Version="$(FluentAssertionVersion)" />
<PackageReference Include="Microsoft.SemanticKernel.Plugins.Web" Version="$(SemanticKernelExperimentalVersion)" />
</ItemGroup>
</Project>

@@ -0,0 +1,31 @@
// Copyright (c) Microsoft Corporation. All rights reserved.
// AgentCodeSnippet.cs
using AutoGen.Core;
namespace AutoGen.BasicSample.CodeSnippet;
internal class AgentCodeSnippet
{
public async Task ChatWithAnAgent(IStreamingAgent agent)
{
#region ChatWithAnAgent_GenerateReplyAsync
var message = new TextMessage(Role.User, "Hello");
IMessage reply = await agent.GenerateReplyAsync([message]);
#endregion ChatWithAnAgent_GenerateReplyAsync
#region ChatWithAnAgent_SendAsync
reply = await agent.SendAsync("Hello");
#endregion ChatWithAnAgent_SendAsync
#region ChatWithAnAgent_GenerateStreamingReplyAsync
var textMessage = new TextMessage(Role.User, "Hello");
await foreach (var streamingReply in await agent.GenerateStreamingReplyAsync([textMessage]))
{
if (streamingReply is TextMessageUpdate update)
{
Console.Write(update.Content);
}
}
#endregion ChatWithAnAgent_GenerateStreamingReplyAsync
}
}

@@ -0,0 +1,42 @@
// Copyright (c) Microsoft Corporation. All rights reserved.
// BuildInMessageCodeSnippet.cs
using AutoGen.Core;
namespace AutoGen.BasicSample.CodeSnippet;
internal class BuildInMessageCodeSnippet
{
public async Task StreamingCallCodeSnippetAsync()
{
IStreamingAgent agent = default;
#region StreamingCallCodeSnippet
var helloTextMessage = new TextMessage(Role.User, "Hello");
var reply = await agent.GenerateStreamingReplyAsync([helloTextMessage]);
var finalTextMessage = new TextMessage(Role.Assistant, string.Empty, from: agent.Name);
await foreach (var message in reply)
{
if (message is TextMessageUpdate textMessage)
{
Console.Write(textMessage.Content);
finalTextMessage.Update(textMessage);
}
}
#endregion StreamingCallCodeSnippet
#region StreamingCallWithFinalMessage
reply = await agent.GenerateStreamingReplyAsync([helloTextMessage]);
TextMessage? finalMessage = null;
await foreach (var message in reply)
{
if (message is TextMessageUpdate textMessage)
{
Console.Write(textMessage.Content);
}
else if (message is TextMessage txtMessage)
{
finalMessage = txtMessage;
}
}
#endregion StreamingCallWithFinalMessage
}
}

@@ -0,0 +1,142 @@
// Copyright (c) Microsoft Corporation. All rights reserved.
// CreateAnAgent.cs
using AutoGen;
using AutoGen.Core;
using AutoGen.OpenAI;
using FluentAssertions;
public partial class AssistantCodeSnippet
{
public void CodeSnippet1()
{
#region code_snippet_1
// get OpenAI Key and create config
var openAIKey = Environment.GetEnvironmentVariable("OPENAI_API_KEY") ?? throw new Exception("Please set OPENAI_API_KEY environment variable.");
var llmConfig = new OpenAIConfig(openAIKey, "gpt-3.5-turbo");
// create assistant agent
var assistantAgent = new AssistantAgent(
name: "assistant",
systemMessage: "You are an assistant that help user to do some tasks.",
llmConfig: new ConversableAgentConfig
{
Temperature = 0,
ConfigList = new[] { llmConfig },
});
#endregion code_snippet_1
}
public void CodeSnippet2()
{
#region code_snippet_2
// get OpenAI Key and create config
var apiKey = Environment.GetEnvironmentVariable("AZURE_OPENAI_API_KEY");
string endPoint = Environment.GetEnvironmentVariable("AZURE_OPENAI_ENDPOINT"); // change to your endpoint
var llmConfig = new AzureOpenAIConfig(
endpoint: endPoint,
deploymentName: "gpt-3.5-turbo-16k", // change to your deployment name
apiKey: apiKey);
// create assistant agent
var assistantAgent = new AssistantAgent(
name: "assistant",
systemMessage: "You are an assistant that help user to do some tasks.",
llmConfig: new ConversableAgentConfig
{
Temperature = 0,
ConfigList = new[] { llmConfig },
});
#endregion code_snippet_2
}
#region code_snippet_3
/// <summary>
/// convert input to upper case
/// </summary>
/// <param name="input">input</param>
[Function]
public async Task<string> UpperCase(string input)
{
var result = input.ToUpper();
return result;
}
#endregion code_snippet_3
public async Task CodeSnippet4()
{
// get OpenAI Key and create config
var apiKey = Environment.GetEnvironmentVariable("AZURE_OPENAI_API_KEY");
string endPoint = Environment.GetEnvironmentVariable("AZURE_OPENAI_ENDPOINT"); // change to your endpoint
var llmConfig = new AzureOpenAIConfig(
endpoint: endPoint,
deploymentName: "gpt-3.5-turbo-16k", // change to your deployment name
apiKey: apiKey);
#region code_snippet_4
var assistantAgent = new AssistantAgent(
name: "assistant",
systemMessage: "You are an assistant that convert user input to upper case.",
llmConfig: new ConversableAgentConfig
{
Temperature = 0,
ConfigList = new[]
{
llmConfig
},
FunctionContracts = new[]
{
this.UpperCaseFunctionContract, // The FunctionContract for the UpperCase function
},
});
var response = await assistantAgent.SendAsync("hello");
response.Should().BeOfType<ToolCallMessage>();
var toolCallMessage = (ToolCallMessage)response;
toolCallMessage.ToolCalls.Count().Should().Be(1);
toolCallMessage.ToolCalls.First().FunctionName.Should().Be("UpperCase");
#endregion code_snippet_4
}
public async Task CodeSnippet5()
{
// get OpenAI Key and create config
var apiKey = Environment.GetEnvironmentVariable("AZURE_OPENAI_API_KEY");
string endPoint = Environment.GetEnvironmentVariable("AZURE_OPENAI_ENDPOINT"); // change to your endpoint
var llmConfig = new AzureOpenAIConfig(
endpoint: endPoint,
deploymentName: "gpt-3.5-turbo-16k", // change to your deployment name
apiKey: apiKey);
#region code_snippet_5
var assistantAgent = new AssistantAgent(
name: "assistant",
systemMessage: "You are an assistant that convert user input to upper case.",
llmConfig: new ConversableAgentConfig
{
Temperature = 0,
ConfigList = new[]
{
llmConfig
},
FunctionContracts = new[]
{
this.UpperCaseFunctionContract, // The FunctionContract for the UpperCase function
},
},
functionMap: new Dictionary<string, Func<string, Task<string>>>
{
{ this.UpperCaseFunction.Name, this.UpperCaseWrapper }, // The wrapper function for the UpperCase function
});
var response = await assistantAgent.SendAsync("hello");
response.Should().BeOfType<TextMessage>();
response.From.Should().Be("assistant");
var textMessage = (TextMessage)response;
textMessage.Content.Should().Be("HELLO");
#endregion code_snippet_5
}
}

@@ -0,0 +1,149 @@
// Copyright (c) Microsoft Corporation. All rights reserved.
// FunctionCallCodeSnippet.cs
using AutoGen;
using AutoGen.Core;
using AutoGen.OpenAI;
using FluentAssertions;
public partial class FunctionCallCodeSnippet
{
public async Task CodeSnippet4()
{
// get OpenAI Key and create config
var apiKey = Environment.GetEnvironmentVariable("AZURE_OPENAI_API_KEY");
string endPoint = Environment.GetEnvironmentVariable("AZURE_OPENAI_ENDPOINT"); // change to your endpoint
var llmConfig = new AzureOpenAIConfig(
endpoint: endPoint,
deploymentName: "gpt-3.5-turbo-16k", // change to your deployment name
apiKey: apiKey);
#region code_snippet_4
var function = new TypeSafeFunctionCall();
var assistantAgent = new AssistantAgent(
name: "assistant",
systemMessage: "You are an assistant that convert user input to upper case.",
llmConfig: new ConversableAgentConfig
{
Temperature = 0,
ConfigList = new[]
{
llmConfig
},
FunctionContracts = new[]
{
function.WeatherReportFunctionContract,
},
});
var response = await assistantAgent.SendAsync("hello What's the weather in Seattle today? today is 2024-01-01");
response.Should().BeOfType<ToolCallMessage>();
var toolCallMessage = (ToolCallMessage)response;
toolCallMessage.ToolCalls.Count().Should().Be(1);
toolCallMessage.ToolCalls[0].FunctionName.Should().Be("WeatherReport");
toolCallMessage.ToolCalls[0].FunctionArguments.Should().Be(@"{""location"":""Seattle"",""date"":""2024-01-01""}");
#endregion code_snippet_4
}
public async Task CodeSnippet6()
{
// get OpenAI Key and create config
var apiKey = Environment.GetEnvironmentVariable("AZURE_OPENAI_API_KEY");
string endPoint = Environment.GetEnvironmentVariable("AZURE_OPENAI_ENDPOINT"); // change to your endpoint
var llmConfig = new AzureOpenAIConfig(
endpoint: endPoint,
deploymentName: "gpt-3.5-turbo-16k", // change to your deployment name
apiKey: apiKey);
#region code_snippet_6
var function = new TypeSafeFunctionCall();
var assistantAgent = new AssistantAgent(
name: "assistant",
llmConfig: new ConversableAgentConfig
{
Temperature = 0,
ConfigList = new[]
{
llmConfig
},
FunctionContracts = new[]
{
function.WeatherReportFunctionContract,
},
},
functionMap: new Dictionary<string, Func<string, Task<string>>>
{
{ function.WeatherReportFunctionContract.Name, function.WeatherReportWrapper }, // The function wrapper for the weather report function
});
#endregion code_snippet_6
#region code_snippet_6_1
var response = await assistantAgent.SendAsync("What's the weather in Seattle today? today is 2024-01-01");
response.Should().BeOfType<TextMessage>();
var textMessage = (TextMessage)response;
textMessage.Content.Should().Be("Weather report for Seattle on 2024-01-01 is sunny");
#endregion code_snippet_6_1
}
public async Task OverrideFunctionContractAsync()
{
IAgent agent = default;
IEnumerable<IMessage> messages = new List<IMessage>();
#region overrider_function_contract
var function = new TypeSafeFunctionCall();
var reply = await agent.GenerateReplyAsync(messages, new GenerateReplyOptions
{
Functions = new[] { function.WeatherReportFunctionContract },
});
#endregion overrider_function_contract
}
public async Task RegisterFunctionCallMiddlewareAsync()
{
IAgent agent = default;
#region register_function_call_middleware
var function = new TypeSafeFunctionCall();
var functionCallMiddleware = new FunctionCallMiddleware(
functions: new[] { function.WeatherReportFunctionContract },
functionMap: new Dictionary<string, Func<string, Task<string>>>
{
{ function.WeatherReportFunctionContract.Name, function.WeatherReportWrapper },
});
agent = agent!.RegisterMiddleware(functionCallMiddleware);
var reply = await agent.SendAsync("What's the weather in Seattle today? today is 2024-01-01");
#endregion register_function_call_middleware
}
public async Task TwoAgentWeatherChatTestAsync()
{
var key = Environment.GetEnvironmentVariable("AZURE_OPENAI_API_KEY") ?? throw new ArgumentException("AZURE_OPENAI_API_KEY is not set");
var endpoint = Environment.GetEnvironmentVariable("AZURE_OPENAI_ENDPOINT") ?? throw new ArgumentException("AZURE_OPENAI_ENDPOINT is not set");
var deploymentName = "gpt-35-turbo-16k";
var config = new AzureOpenAIConfig(endpoint, deploymentName, key);
#region two_agent_weather_chat
var function = new TypeSafeFunctionCall();
var assistant = new AssistantAgent(
"assistant",
llmConfig: new ConversableAgentConfig
{
ConfigList = new[] { config },
FunctionContracts = new[]
{
function.WeatherReportFunctionContract,
},
});
var user = new UserProxyAgent(
name: "user",
functionMap: new Dictionary<string, Func<string, Task<string>>>
{
{ function.WeatherReportFunctionContract.Name, function.WeatherReportWrapper },
});
await user.InitiateChatAsync(assistant, "what's weather in Seattle today, today is 2024-01-01", 10);
#endregion two_agent_weather_chat
}
}

@@ -0,0 +1,41 @@
// Copyright (c) Microsoft Corporation. All rights reserved.
// GetStartCodeSnippet.cs
#region snippet_GetStartCodeSnippet
using AutoGen;
using AutoGen.Core;
using AutoGen.OpenAI;
#endregion snippet_GetStartCodeSnippet
public class GetStartCodeSnippet
{
public async Task CodeSnippet1()
{
#region code_snippet_1
var openAIKey = Environment.GetEnvironmentVariable("OPENAI_API_KEY") ?? throw new Exception("Please set OPENAI_API_KEY environment variable.");
var gpt35Config = new OpenAIConfig(openAIKey, "gpt-3.5-turbo");
var assistantAgent = new AssistantAgent(
name: "assistant",
systemMessage: "You are an assistant that help user to do some tasks.",
llmConfig: new ConversableAgentConfig
{
Temperature = 0,
ConfigList = [gpt35Config],
})
.RegisterPrintMessage(); // register a hook to print message nicely to console
// set human input mode to ALWAYS so that user always provide input
var userProxyAgent = new UserProxyAgent(
name: "user",
humanInputMode: HumanInputMode.ALWAYS)
.RegisterPrintMessage();
// start the conversation
await userProxyAgent.InitiateChatAsync(
receiver: assistantAgent,
message: "Hey assistant, please do me a favor.",
maxRound: 10);
#endregion code_snippet_1
}
}

@@ -0,0 +1,169 @@
// Copyright (c) Microsoft Corporation. All rights reserved.
// MiddlewareAgentCodeSnippet.cs
using AutoGen.Core;
using System.Text.Json;
using AutoGen.OpenAI;
using FluentAssertions;
namespace AutoGen.BasicSample.CodeSnippet;
public class MiddlewareAgentCodeSnippet
{
public async Task CreateMiddlewareAgentAsync()
{
#region create_middleware_agent_with_original_agent
// Create an agent that always replies "Hello World"
IAgent agent = new DefaultReplyAgent(name: "assistant", defaultReply: "Hello World");
// Create a middleware agent on top of default reply agent
var middlewareAgent = new MiddlewareAgent(innerAgent: agent);
middlewareAgent.Use(async (messages, options, agent, ct) =>
{
var lastMessage = messages.Last() as TextMessage;
lastMessage.Content = $"[middleware 0] {lastMessage.Content}";
return await agent.GenerateReplyAsync(messages, options, ct);
});
var reply = await middlewareAgent.SendAsync("Hello World");
reply.GetContent().Should().Be("[middleware 0] Hello World");
#endregion create_middleware_agent_with_original_agent
#region register_middleware_agent
middlewareAgent = agent.RegisterMiddleware(async (messages, options, agent, ct) =>
{
var lastMessage = messages.Last() as TextMessage;
lastMessage.Content = $"[middleware 0] {lastMessage.Content}";
return await agent.GenerateReplyAsync(messages, options, ct);
});
#endregion register_middleware_agent
#region short_circuit_middleware_agent
// This middleware will short circuit the agent and return the last message directly.
middlewareAgent.Use(async (messages, options, agent, ct) =>
{
var lastMessage = messages.Last() as TextMessage;
lastMessage.Content = $"[middleware shortcut]";
return lastMessage;
});
#endregion short_circuit_middleware_agent
}
public async Task RegisterStreamingMiddlewareAsync()
{
IStreamingAgent streamingAgent = default;
#region register_streaming_middleware
var connector = new OpenAIChatRequestMessageConnector();
var agent = streamingAgent!
.RegisterStreamingMiddleware(connector);
#endregion register_streaming_middleware
}
public async Task CodeSnippet1()
{
#region code_snippet_1
// Create an agent that always replies "Hello World"
IAgent agent = new DefaultReplyAgent(name: "assistant", defaultReply: "Hello World");
// Create a middleware agent on top of default reply agent
var middlewareAgent = new MiddlewareAgent(innerAgent: agent);
// Since no middleware is added, middlewareAgent will simply proxy into the inner agent to generate reply.
var reply = await middlewareAgent.SendAsync("Hello World");
reply.From.Should().Be("assistant");
reply.GetContent().Should().Be("Hello World");
#endregion code_snippet_1
#region code_snippet_2
middlewareAgent.Use(async (messages, options, agent, ct) =>
{
var lastMessage = messages.Last() as TextMessage;
lastMessage.Content = $"[middleware 0] {lastMessage.Content}";
return await agent.GenerateReplyAsync(messages, options, ct);
});
reply = await middlewareAgent.SendAsync("Hello World");
reply.Should().BeOfType<TextMessage>();
var textReply = (TextMessage)reply;
textReply.Content.Should().Be("[middleware 0] Hello World");
#endregion code_snippet_2
#region code_snippet_2_1
middlewareAgent = agent.RegisterMiddleware(async (messages, options, agent, ct) =>
{
var lastMessage = messages.Last() as TextMessage;
lastMessage.Content = $"[middleware 0] {lastMessage.Content}";
return await agent.GenerateReplyAsync(messages, options, ct);
});
reply = await middlewareAgent.SendAsync("Hello World");
reply.GetContent().Should().Be("[middleware 0] Hello World");
#endregion code_snippet_2_1
#region code_snippet_3
middlewareAgent.Use(async (messages, options, agent, ct) =>
{
var lastMessage = messages.Last() as TextMessage;
lastMessage.Content = $"[middleware 1] {lastMessage.Content}";
return await agent.GenerateReplyAsync(messages, options, ct);
});
reply = await middlewareAgent.SendAsync("Hello World");
reply.GetContent().Should().Be("[middleware 0] [middleware 1] Hello World");
#endregion code_snippet_3
#region code_snippet_4
middlewareAgent.Use(async (messages, options, next, ct) =>
{
var lastMessage = messages.Last() as TextMessage;
lastMessage.Content = $"[middleware shortcut]";
return lastMessage;
});
reply = await middlewareAgent.SendAsync("Hello World");
reply.GetContent().Should().Be("[middleware shortcut]");
#endregion code_snippet_4
#region retrieve_inner_agent
var innerAgent = middlewareAgent.Agent;
#endregion retrieve_inner_agent
#region code_snippet_logging_to_console
var agentWithLogging = middlewareAgent.RegisterMiddleware(async (messages, options, agent, ct) =>
{
var reply = await agent.GenerateReplyAsync(messages, options, ct);
var formattedMessage = reply.FormatMessage();
Console.WriteLine(formattedMessage);
return reply;
});
#endregion code_snippet_logging_to_console
#region code_snippet_response_format_forcement
var jsonAgent = middlewareAgent.RegisterMiddleware(async (messages, options, agent, ct) =>
{
var maxAttempt = 5;
var reply = await agent.GenerateReplyAsync(messages, options, ct);
while (maxAttempt-- > 0)
{
// JsonSerializer.Deserialize throws JsonException on invalid json,
// so succeed only when the reply parses as a json object.
try
{
JsonSerializer.Deserialize<Dictionary<string, object>>(reply.GetContent());
return reply;
}
catch (JsonException)
{
await Task.Delay(1000);
var reviewPrompt = $@"The format is not json, please modify your response to json format
-- ORIGINAL MESSAGE --
{reply.GetContent()}
-- END OF ORIGINAL MESSAGE --
Reply again with json format.";
reply = await agent.SendAsync(reviewPrompt, messages, ct);
}
}
throw new Exception("agent fails to generate json response");
});
#endregion code_snippet_response_format_forcement
}
}

@@ -0,0 +1,86 @@
// Copyright (c) Microsoft Corporation. All rights reserved.
// MistralAICodeSnippet.cs
#region using_statement
using AutoGen.Mistral;
using AutoGen.Core;
using AutoGen.Mistral.Extension;
using FluentAssertions;
#endregion using_statement
namespace AutoGen.BasicSample.CodeSnippet;
#region weather_function
public partial class MistralAgentFunction
{
[Function]
public async Task<string> GetWeather(string location)
{
return "The weather in " + location + " is sunny.";
}
}
#endregion weather_function
internal class MistralAICodeSnippet
{
public async Task CreateMistralAIClientAsync()
{
#region create_mistral_agent
var apiKey = Environment.GetEnvironmentVariable("MISTRAL_API_KEY") ?? throw new Exception("Missing MISTRAL_API_KEY environment variable");
var client = new MistralClient(apiKey: apiKey);
var agent = new MistralClientAgent(
client: client,
name: "MistralAI",
model: MistralAIModelID.OPEN_MISTRAL_7B)
.RegisterMessageConnector(); // support more AutoGen built-in message types.
await agent.SendAsync("Hello, how are you?");
#endregion create_mistral_agent
#region streaming_chat
var reply = await agent.GenerateStreamingReplyAsync(
messages: [new TextMessage(Role.User, "Hello, how are you?")]
);
await foreach (var message in reply)
{
if (message is TextMessageUpdate textMessageUpdate && textMessageUpdate.Content is string content)
{
Console.WriteLine(content);
}
}
#endregion streaming_chat
}
public async Task MistralAIChatAgentGetWeatherToolUsageAsync()
{
#region create_mistral_function_call_agent
var apiKey = Environment.GetEnvironmentVariable("MISTRAL_API_KEY") ?? throw new Exception("Missing MISTRAL_API_KEY environment variable");
var client = new MistralClient(apiKey: apiKey);
var agent = new MistralClientAgent(
client: client,
name: "MistralAI",
model: MistralAIModelID.MISTRAL_SMALL_LATEST)
.RegisterMessageConnector(); // support more AutoGen built-in message types like ToolCallMessage and ToolCallResultMessage
#endregion create_mistral_function_call_agent
#region create_get_weather_function_call_middleware
var mistralFunctions = new MistralAgentFunction();
var functionCallMiddleware = new FunctionCallMiddleware(
functions: [mistralFunctions.GetWeatherFunctionContract],
functionMap: new Dictionary<string, Func<string, Task<string>>> // with functionMap, the function will be automatically triggered if the tool name matches one of the keys.
{
{ mistralFunctions.GetWeatherFunctionContract.Name, mistralFunctions.GetWeather }
});
#endregion create_get_weather_function_call_middleware
#region register_function_call_middleware
agent = agent.RegisterMiddleware(functionCallMiddleware);
#endregion register_function_call_middleware
#region send_message_with_function_call
var reply = await agent.SendAsync("What is the weather in Seattle?");
reply.GetContent().Should().Be("The weather in Seattle is sunny.");
#endregion send_message_with_function_call
}
}

@@ -0,0 +1,136 @@
// Copyright (c) Microsoft Corporation. All rights reserved.
// OpenAICodeSnippet.cs
#region using_statement
using AutoGen.Core;
using AutoGen.OpenAI;
using AutoGen.OpenAI.Extension;
using Azure.AI.OpenAI;
#endregion using_statement
using FluentAssertions;
namespace AutoGen.BasicSample.CodeSnippet;
#region weather_function
public partial class Functions
{
[Function]
public async Task<string> GetWeather(string location)
{
return "The weather in " + location + " is sunny.";
}
}
#endregion weather_function
public partial class OpenAICodeSnippet
{
[Function]
public async Task<string> GetWeather(string location)
{
return "The weather in " + location + " is sunny.";
}
public async Task CreateOpenAIChatAgentAsync()
{
#region create_openai_chat_agent
var openAIKey = Environment.GetEnvironmentVariable("OPENAI_API_KEY") ?? throw new Exception("Please set OPENAI_API_KEY environment variable.");
var modelId = "gpt-3.5-turbo";
var openAIClient = new OpenAIClient(openAIKey);
// create an open ai chat agent
var openAIChatAgent = new OpenAIChatAgent(
openAIClient: openAIClient,
name: "assistant",
modelName: modelId,
systemMessage: "You are an assistant that help user to do some tasks.");
// OpenAIChatAgent supports the following message types:
// - IMessage<ChatRequestMessage> where ChatRequestMessage is from Azure.AI.OpenAI
var helloMessage = new ChatRequestUserMessage("Hello");
// Use MessageEnvelope.Create to create an IMessage<ChatRequestMessage>
var chatMessageContent = MessageEnvelope.Create(helloMessage);
var reply = await openAIChatAgent.SendAsync(chatMessageContent);
// The type of reply is MessageEnvelope<ChatResponseMessage> where ChatResponseMessage is from Azure.AI.OpenAI
reply.Should().BeOfType<MessageEnvelope<ChatResponseMessage>>();
// You can un-envelop the reply to get the ChatResponseMessage
ChatResponseMessage response = reply.As<MessageEnvelope<ChatResponseMessage>>().Content;
response.Role.Should().Be(ChatRole.Assistant);
#endregion create_openai_chat_agent
#region create_openai_chat_agent_streaming
var streamingReply = await openAIChatAgent.GenerateStreamingReplyAsync(new[] { chatMessageContent });
await foreach (var streamingMessage in streamingReply)
{
streamingMessage.Should().BeOfType<MessageEnvelope<StreamingChatCompletionsUpdate>>();
streamingMessage.As<MessageEnvelope<StreamingChatCompletionsUpdate>>().Content.Role.Should().Be(ChatRole.Assistant);
}
#endregion create_openai_chat_agent_streaming
#region register_openai_chat_message_connector
// register message connector to support more message types
var agentWithConnector = openAIChatAgent
.RegisterMessageConnector();
// now the agentWithConnector supports more message types
var messages = new IMessage[]
{
MessageEnvelope.Create(new ChatRequestUserMessage("Hello")),
new TextMessage(Role.Assistant, "Hello", from: "user"),
new MultiModalMessage(Role.Assistant,
[
new TextMessage(Role.Assistant, "Hello", from: "user"),
],
from: "user"),
new Message(Role.Assistant, "Hello", from: "user"), // Message type is going to be deprecated, please use TextMessage instead
};
foreach (var message in messages)
{
reply = await agentWithConnector.SendAsync(message);
reply.Should().BeOfType<TextMessage>();
reply.As<TextMessage>().From.Should().Be("assistant");
}
#endregion register_openai_chat_message_connector
}
public async Task OpenAIChatAgentGetWeatherFunctionCallAsync()
{
#region openai_chat_agent_get_weather_function_call
var openAIKey = Environment.GetEnvironmentVariable("OPENAI_API_KEY") ?? throw new Exception("Please set OPENAI_API_KEY environment variable.");
var modelId = "gpt-3.5-turbo";
var openAIClient = new OpenAIClient(openAIKey);
// create an open ai chat agent
var openAIChatAgent = new OpenAIChatAgent(
openAIClient: openAIClient,
name: "assistant",
modelName: modelId,
systemMessage: "You are an assistant that help user to do some tasks.")
.RegisterMessageConnector();
#endregion openai_chat_agent_get_weather_function_call
#region create_function_call_middleware
var functions = new Functions();
var functionCallMiddleware = new FunctionCallMiddleware(
functions: [functions.GetWeatherFunctionContract], // GetWeatherFunctionContract is auto-generated from the GetWeather function
functionMap: new Dictionary<string, Func<string, Task<string>>>
{
{ functions.GetWeatherFunctionContract.Name, functions.GetWeatherWrapper } // GetWeatherWrapper is a wrapper function for GetWeather, which is also auto-generated
});
openAIChatAgent = openAIChatAgent.RegisterMiddleware(functionCallMiddleware);
#endregion create_function_call_middleware
#region chat_agent_send_function_call
var reply = await openAIChatAgent.SendAsync("what is the weather in Seattle?");
reply.GetContent().Should().Be("The weather in Seattle is sunny.");
reply.GetToolCalls().Count.Should().Be(1);
reply.GetToolCalls().First().FunctionName.Should().Be(functions.GetWeatherFunctionContract.Name);
#endregion chat_agent_send_function_call
}
}

@@ -0,0 +1,44 @@
// Copyright (c) Microsoft Corporation. All rights reserved.
// PrintMessageMiddlewareCodeSnippet.cs
using AutoGen.Core;
using AutoGen.OpenAI;
using AutoGen.OpenAI.Extension;
using Azure;
using Azure.AI.OpenAI;
namespace AutoGen.BasicSample.CodeSnippet;
internal class PrintMessageMiddlewareCodeSnippet
{
public async Task PrintMessageMiddlewareAsync()
{
var config = LLMConfiguration.GetAzureOpenAIGPT3_5_Turbo();
var endpoint = new Uri(config.Endpoint);
var openaiClient = new OpenAIClient(endpoint, new AzureKeyCredential(config.ApiKey));
var agent = new OpenAIChatAgent(openaiClient, "assistant", config.DeploymentName)
.RegisterMessageConnector();
#region PrintMessageMiddleware
var agentWithPrintMessageMiddleware = agent
.RegisterPrintMessage();
await agentWithPrintMessageMiddleware.SendAsync("write a long poem");
#endregion PrintMessageMiddleware
}
public async Task PrintMessageStreamingMiddlewareAsync()
{
var config = LLMConfiguration.GetAzureOpenAIGPT3_5_Turbo();
var endpoint = new Uri(config.Endpoint);
var openaiClient = new OpenAIClient(endpoint, new AzureKeyCredential(config.ApiKey));
#region print_message_streaming
var streamingAgent = new OpenAIChatAgent(openaiClient, "assistant", config.DeploymentName)
.RegisterMessageConnector()
.RegisterPrintMessage();
await streamingAgent.SendAsync("write a long poem");
#endregion print_message_streaming
}
}

@@ -0,0 +1,48 @@
// Copyright (c) Microsoft Corporation. All rights reserved.
// RunCodeSnippetCodeSnippet.cs
#region code_snippet_0_1
using AutoGen.Core;
using AutoGen.DotnetInteractive;
#endregion code_snippet_0_1
namespace AutoGen.BasicSample.CodeSnippet;
public class RunCodeSnippetCodeSnippet
{
public async Task CodeSnippet1()
{
IAgent agent = default;
#region code_snippet_1_1
var workingDirectory = Path.Combine(Path.GetTempPath(), Path.GetRandomFileName());
Directory.CreateDirectory(workingDirectory);
var interactiveService = new InteractiveService(installingDirectory: workingDirectory);
await interactiveService.StartAsync(workingDirectory: workingDirectory);
#endregion code_snippet_1_1
#region code_snippet_1_2
// register dotnet code block execution hook to an arbitrary agent
var dotnetCodeAgent = agent.RegisterDotnetCodeBlockExectionHook(interactiveService: interactiveService);
var codeSnippet = @"
```csharp
Console.WriteLine(""Hello World"");
```";
await dotnetCodeAgent.SendAsync(codeSnippet);
// output: Hello World
#endregion code_snippet_1_2
#region code_snippet_1_3
var content = @"
```csharp
// This is csharp code snippet
```
```python
// This is python code snippet
```
";
#endregion code_snippet_1_3
}
}

@@ -0,0 +1,102 @@
// Copyright (c) Microsoft Corporation. All rights reserved.
// SemanticKernelCodeSnippet.cs
using AutoGen.Core;
using AutoGen.SemanticKernel;
using AutoGen.SemanticKernel.Extension;
using FluentAssertions;
using Microsoft.SemanticKernel;
using Microsoft.SemanticKernel.ChatCompletion;
namespace AutoGen.BasicSample.CodeSnippet;
public class SemanticKernelCodeSnippet
{
public async Task<string> GetWeather(string location)
{
return "The weather in " + location + " is sunny.";
}
public async Task CreateSemanticKernelAgentAsync()
{
#region create_semantic_kernel_agent
var openAIKey = Environment.GetEnvironmentVariable("OPENAI_API_KEY") ?? throw new Exception("Please set OPENAI_API_KEY environment variable.");
var modelId = "gpt-3.5-turbo";
var builder = Kernel.CreateBuilder()
.AddOpenAIChatCompletion(modelId: modelId, apiKey: openAIKey);
var kernel = builder.Build();
// create a semantic kernel agent
var semanticKernelAgent = new SemanticKernelAgent(
kernel: kernel,
name: "assistant",
systemMessage: "You are an assistant that helps the user to do some tasks.");
// SemanticKernelAgent supports the following message types:
// - IMessage<ChatMessageContent> where ChatMessageContent is from Microsoft.SemanticKernel
var helloMessage = new ChatMessageContent(AuthorRole.User, "Hello");
// Use MessageEnvelope.Create to create an IMessage<ChatMessageContent>
var chatMessageContent = MessageEnvelope.Create(helloMessage);
var reply = await semanticKernelAgent.SendAsync(chatMessageContent);
// The type of reply is MessageEnvelope<ChatMessageContent> where ChatMessageContent is from Microsoft.SemanticKernel
reply.Should().BeOfType<MessageEnvelope<ChatMessageContent>>();
// You can unwrap the reply to get the ChatMessageContent
ChatMessageContent response = reply.As<MessageEnvelope<ChatMessageContent>>().Content;
response.Role.Should().Be(AuthorRole.Assistant);
#endregion create_semantic_kernel_agent
#region create_semantic_kernel_agent_streaming
var streamingReply = await semanticKernelAgent.GenerateStreamingReplyAsync(new[] { chatMessageContent });
await foreach (var streamingMessage in streamingReply)
{
streamingMessage.Should().BeOfType<MessageEnvelope<StreamingChatMessageContent>>();
streamingMessage.As<MessageEnvelope<StreamingChatMessageContent>>().From.Should().Be("assistant");
}
#endregion create_semantic_kernel_agent_streaming
}
public async Task SemanticKernelChatMessageContentConnector()
{
#region register_semantic_kernel_chat_message_content_connector
var openAIKey = Environment.GetEnvironmentVariable("OPENAI_API_KEY") ?? throw new Exception("Please set OPENAI_API_KEY environment variable.");
var modelId = "gpt-3.5-turbo";
var builder = Kernel.CreateBuilder()
.AddOpenAIChatCompletion(modelId: modelId, apiKey: openAIKey);
var kernel = builder.Build();
// create a semantic kernel agent
var semanticKernelAgent = new SemanticKernelAgent(
kernel: kernel,
name: "assistant",
systemMessage: "You are an assistant that helps the user to do some tasks.");
// Register the connector middleware to the kernel agent
var semanticKernelAgentWithConnector = semanticKernelAgent
.RegisterMessageConnector();
// now semanticKernelAgentWithConnector supports more message types
IMessage[] messages = [
MessageEnvelope.Create(new ChatMessageContent(AuthorRole.User, "Hello")),
new TextMessage(Role.Assistant, "Hello", from: "user"),
new MultiModalMessage(Role.Assistant,
[
new TextMessage(Role.Assistant, "Hello", from: "user"),
],
from: "user"),
];
foreach (var message in messages)
{
var reply = await semanticKernelAgentWithConnector.SendAsync(message);
// SemanticKernelChatMessageContentConnector will convert the reply message to TextMessage
reply.Should().BeOfType<TextMessage>();
}
#endregion register_semantic_kernel_chat_message_content_connector
}
}

@@ -0,0 +1,121 @@
// Copyright (c) Microsoft Corporation. All rights reserved.
// TypeSafeFunctionCallCodeSnippet.cs
using System.Text.Json;
using AutoGen.OpenAI.Extension;
using Azure.AI.OpenAI;
#region weather_report_using_statement
using AutoGen.Core;
#endregion weather_report_using_statement
#region weather_report
public partial class TypeSafeFunctionCall
{
/// <summary>
/// Get weather report
/// </summary>
/// <param name="city">city</param>
/// <param name="date">date</param>
[Function]
public async Task<string> WeatherReport(string city, string date)
{
return $"Weather report for {city} on {date} is sunny";
}
}
#endregion weather_report
public partial class TypeSafeFunctionCall
{
public async Task Consume()
{
#region weather_report_consume
var functionInstance = new TypeSafeFunctionCall();
// Get the generated function definition
FunctionDefinition functionDefiniton = functionInstance.WeatherReportFunctionContract.ToOpenAIFunctionDefinition();
// Get the generated function wrapper
Func<string, Task<string>> functionWrapper = functionInstance.WeatherReportWrapper;
// ...
#endregion weather_report_consume
}
}
#region code_snippet_3
// file: FunctionCall.cs
public partial class TypeSafeFunctionCall
{
/// <summary>
/// convert input to upper case
/// </summary>
/// <param name="input">input</param>
[Function]
public async Task<string> UpperCase(string input)
{
var result = input.ToUpper();
return result;
}
}
#endregion code_snippet_3
public class TypeSafeFunctionCallCodeSnippet
{
public async Task<string> UpperCase(string input)
{
var result = input.ToUpper();
return result;
}
#region code_snippet_1
// file: FunctionDefinition.generated.cs
public FunctionDefinition UpperCaseFunction
{
get => new FunctionDefinition
{
Name = @"UpperCase",
Description = "convert input to upper case",
Parameters = BinaryData.FromObjectAsJson(new
{
Type = "object",
Properties = new
{
input = new
{
Type = @"string",
Description = @"input",
},
},
Required = new[]
{
"input",
},
},
new JsonSerializerOptions
{
PropertyNamingPolicy = JsonNamingPolicy.CamelCase,
})
};
}
#endregion code_snippet_1
#region code_snippet_2
// file: FunctionDefinition.generated.cs
private class UpperCaseSchema
{
public string input { get; set; }
}
public Task<string> UpperCaseWrapper(string arguments)
{
var schema = JsonSerializer.Deserialize<UpperCaseSchema>(
arguments,
new JsonSerializerOptions
{
PropertyNamingPolicy = JsonNamingPolicy.CamelCase,
});
return UpperCase(schema.input);
}
#endregion code_snippet_2
}
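The generated-wrapper pattern shown above can be exercised on its own. Below is a minimal, self-contained sketch (standard .NET only, no AutoGen dependency; the class and method names are illustrative, not the generator's actual output) of how a JSON argument payload from a tool call flows through a wrapper into the target method:

```csharp
using System;
using System.Text.Json;

// A stand-alone illustration of the wrapper pattern: the model emits a JSON
// argument payload; the wrapper deserializes it into a schema class and
// forwards the value to the real method.
public class UpperCaseSchema
{
    public string input { get; set; }
}

public static class WrapperDemo
{
    public static string UpperCase(string input) => input.ToUpper();

    // Mirrors the shape of the generated UpperCaseWrapper: JSON string in, result out.
    public static string UpperCaseWrapper(string arguments)
    {
        var schema = JsonSerializer.Deserialize<UpperCaseSchema>(
            arguments,
            new JsonSerializerOptions { PropertyNamingPolicy = JsonNamingPolicy.CamelCase });
        return UpperCase(schema.input);
    }

    public static void Main()
    {
        // A tool call from the model would carry arguments like this:
        Console.WriteLine(UpperCaseWrapper("{\"input\": \"hello world\"}")); // HELLO WORLD
    }
}
```

This is why the generated wrapper always takes a single `string arguments` parameter: the LLM's tool-call arguments arrive as one JSON document, regardless of how many parameters the underlying function has.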

@@ -0,0 +1,20 @@
// Copyright (c) Microsoft Corporation. All rights reserved.
// UserProxyAgentCodeSnippet.cs
using AutoGen.Core;
namespace AutoGen.BasicSample.CodeSnippet;
public class UserProxyAgentCodeSnippet
{
public async Task CodeSnippet1()
{
#region code_snippet_1
// create a user proxy agent which always asks the user for input
var agent = new UserProxyAgent(
name: "user",
humanInputMode: HumanInputMode.ALWAYS);
await agent.SendAsync("hello");
#endregion code_snippet_1
}
}

@@ -0,0 +1,46 @@
// Copyright (c) Microsoft Corporation. All rights reserved.
// Example01_AssistantAgent.cs
using AutoGen.Core;
using AutoGen;
using AutoGen.BasicSample;
using FluentAssertions;
/// <summary>
/// This example shows the basic usage of <see cref="ConversableAgent"/> class.
/// </summary>
public static class Example01_AssistantAgent
{
public static async Task RunAsync()
{
var gpt35 = LLMConfiguration.GetAzureOpenAIGPT3_5_Turbo();
var config = new ConversableAgentConfig
{
Temperature = 0,
ConfigList = [gpt35],
};
// create assistant agent
var assistantAgent = new AssistantAgent(
name: "assistant",
systemMessage: "You convert what user said to all uppercase.",
llmConfig: config)
.RegisterPrintMessage();
// talk to the assistant agent
var reply = await assistantAgent.SendAsync("hello world");
reply.Should().BeOfType<TextMessage>();
reply.GetContent().Should().Be("HELLO WORLD");
// to carry on the conversation, pass the previous conversation history to the next call
var conversationHistory = new List<IMessage>
{
new TextMessage(Role.User, "hello world"), // first message
reply, // reply from assistant agent
};
reply = await assistantAgent.SendAsync("hello world again", conversationHistory);
reply.Should().BeOfType<TextMessage>();
reply.GetContent().Should().Be("HELLO WORLD AGAIN");
}
}

@@ -0,0 +1,79 @@
// Copyright (c) Microsoft Corporation. All rights reserved.
// Example02_TwoAgent_MathChat.cs
using AutoGen.Core;
using AutoGen;
using AutoGen.BasicSample;
using FluentAssertions;
public static class Example02_TwoAgent_MathChat
{
public static async Task RunAsync()
{
#region code_snippet_1
// get gpt-3.5-turbo config
var gpt35 = LLMConfiguration.GetAzureOpenAIGPT3_5_Turbo();
// create teacher agent
// teacher agent will create math questions
var teacher = new AssistantAgent(
name: "teacher",
systemMessage: @"You are a teacher that creates pre-school math questions for the student and checks the answers.
If the answer is correct, you terminate the conversation by saying [TERMINATE].
If the answer is wrong, you ask the student to fix it.",
llmConfig: new ConversableAgentConfig
{
Temperature = 0,
ConfigList = [gpt35],
})
.RegisterPostProcess(async (_, reply, _) =>
{
if (reply.GetContent()?.ToLower().Contains("terminate") is true)
{
return new TextMessage(Role.Assistant, GroupChatExtension.TERMINATE, from: reply.From);
}
return reply;
})
.RegisterPrintMessage();
// create student agent
// student agent will answer the math questions
var student = new AssistantAgent(
name: "student",
systemMessage: "You are a student that answers questions from the teacher",
llmConfig: new ConversableAgentConfig
{
Temperature = 0,
ConfigList = [gpt35],
})
.RegisterPrintMessage();
// start the conversation
var conversation = await student.InitiateChatAsync(
receiver: teacher,
message: "Hey teacher, please create math question for me.",
maxRound: 10);
// output
// Message from teacher
// --------------------
// content: Of course! Here's a math question for you:
//
// What is 2 + 3 ?
// --------------------
//
// Message from student
// --------------------
// content: The sum of 2 and 3 is 5.
// --------------------
//
// Message from teacher
// --------------------
// content: [GROUPCHAT_TERMINATE]
// --------------------
#endregion code_snippet_1
conversation.Count().Should().BeLessThan(10);
conversation.Last().IsGroupChatTerminateMessage().Should().BeTrue();
}
}

@@ -0,0 +1,96 @@
// Copyright (c) Microsoft Corporation. All rights reserved.
// Example03_Agent_FunctionCall.cs
using AutoGen;
using AutoGen.Core;
using AutoGen.BasicSample;
using FluentAssertions;
/// <summary>
/// This example shows how to add type-safe function call to an agent.
/// </summary>
public partial class Example03_Agent_FunctionCall
{
/// <summary>
/// upper case the message when asked.
/// </summary>
/// <param name="message"></param>
[Function]
public async Task<string> UpperCase(string message)
{
return message.ToUpper();
}
/// <summary>
/// Concatenate strings.
/// </summary>
/// <param name="strings">strings to concatenate</param>
[Function]
public async Task<string> ConcatString(string[] strings)
{
return string.Join(" ", strings);
}
/// <summary>
/// calculate tax
/// </summary>
/// <param name="price">price, should be an integer</param>
/// <param name="taxRate">tax rate, should be in range (0, 1)</param>
[FunctionAttribute]
public async Task<string> CalculateTax(int price, float taxRate)
{
return $"tax is {price * taxRate}";
}
public static async Task RunAsync()
{
var instance = new Example03_Agent_FunctionCall();
var gpt35 = LLMConfiguration.GetAzureOpenAIGPT3_5_Turbo();
// AutoGen uses AutoGen.SourceGenerator to automatically generate the FunctionDefinition and function-call wrapper for you.
// The FunctionDefinition is created from the function signature and XML documentation.
// The return type of a type-safe function needs to be Task<string>. For best performance, use only primitive types and arrays of primitive types as parameters.
var config = new ConversableAgentConfig
{
Temperature = 0,
ConfigList = [gpt35],
FunctionContracts = new[]
{
instance.ConcatStringFunctionContract,
instance.UpperCaseFunctionContract,
instance.CalculateTaxFunctionContract,
},
};
var agent = new AssistantAgent(
name: "agent",
systemMessage: "You are a helpful AI assistant",
llmConfig: config,
functionMap: new Dictionary<string, Func<string, Task<string>>>
{
{ nameof(ConcatString), instance.ConcatStringWrapper },
{ nameof(UpperCase), instance.UpperCaseWrapper },
{ nameof(CalculateTax), instance.CalculateTaxWrapper },
})
.RegisterPrintMessage();
// talk to the assistant agent
var upperCase = await agent.SendAsync("convert to upper case: hello world");
upperCase.GetContent()?.Should().Be("HELLO WORLD");
upperCase.Should().BeOfType<AggregateMessage<ToolCallMessage, ToolCallResultMessage>>();
upperCase.GetToolCalls().Should().HaveCount(1);
upperCase.GetToolCalls().First().FunctionName.Should().Be(nameof(UpperCase));
var concatString = await agent.SendAsync("concatenate strings: a, b, c, d, e");
concatString.GetContent()?.Should().Be("a b c d e");
concatString.Should().BeOfType<AggregateMessage<ToolCallMessage, ToolCallResultMessage>>();
concatString.GetToolCalls().Should().HaveCount(1);
concatString.GetToolCalls().First().FunctionName.Should().Be(nameof(ConcatString));
var calculateTax = await agent.SendAsync("calculate tax: 100, 0.1");
calculateTax.GetContent().Should().Be("tax is 10");
calculateTax.Should().BeOfType<AggregateMessage<ToolCallMessage, ToolCallResultMessage>>();
calculateTax.GetToolCalls().Should().HaveCount(1);
calculateTax.GetToolCalls().First().FunctionName.Should().Be(nameof(CalculateTax));
}
}

@@ -0,0 +1,263 @@
// Copyright (c) Microsoft Corporation. All rights reserved.
// Example04_Dynamic_GroupChat_Coding_Task.cs
using AutoGen;
using AutoGen.Core;
using AutoGen.BasicSample;
using AutoGen.DotnetInteractive;
using AutoGen.OpenAI;
using FluentAssertions;
public partial class Example04_Dynamic_GroupChat_Coding_Task
{
public static async Task RunAsync()
{
var instance = new Example04_Dynamic_GroupChat_Coding_Task();
// setup dotnet interactive
var workDir = Path.Combine(Path.GetTempPath(), "InteractiveService");
if (!Directory.Exists(workDir))
Directory.CreateDirectory(workDir);
using var service = new InteractiveService(workDir);
var dotnetInteractiveFunctions = new DotnetInteractiveFunction(service);
var result = Path.Combine(workDir, "result.txt");
if (File.Exists(result))
File.Delete(result);
await service.StartAsync(workDir, default);
var gptConfig = LLMConfiguration.GetAzureOpenAIGPT3_5_Turbo();
var helperAgent = new GPTAgent(
name: "helper",
systemMessage: "You are a helpful AI assistant",
temperature: 0f,
config: gptConfig);
var groupAdmin = new GPTAgent(
name: "groupAdmin",
systemMessage: "You are the admin of the group chat",
temperature: 0f,
config: gptConfig);
var userProxy = new UserProxyAgent(name: "user", defaultReply: GroupChatExtension.TERMINATE, humanInputMode: HumanInputMode.NEVER)
.RegisterPrintMessage();
// Create admin agent
var admin = new AssistantAgent(
name: "admin",
systemMessage: """
You are a manager who takes a coding problem from the user and resolves it by splitting it into small tasks and assigning each task to the most appropriate agent.
Here are the agents you can assign tasks to:
- coder: write dotnet code to resolve task
- runner: run dotnet code from coder
The workflow is as follows:
- You take the coding problem from the user
- You break the problem into small tasks. For each task, you first ask coder to write code to resolve the task. Once the code is written, you ask runner to run the code.
- Once a small task is resolved, you summarize the completed steps and create the next step.
- You repeat the above steps until the coding problem is resolved.
You can use the following json format to assign task to agents:
```task
{
"to": "{agent_name}",
"task": "{a short description of the task}",
"context": "{previous context from scratchpad}"
}
```
If you need to ask user for extra information, you can use the following format:
```ask
{
"question": "{question}"
}
```
Once the coding problem is resolved, summarize each step and its result, then send the summary to the user using the following format:
```summary
{
"problem": "{coding problem}",
"steps": [
{
"step": "{step}",
"result": "{result}"
}
]
}
```
Your reply must contain one of [task|ask|summary] to indicate the type of your message.
""",
llmConfig: new ConversableAgentConfig
{
Temperature = 0,
ConfigList = [gptConfig],
})
.RegisterPrintMessage();
// create coder agent
// The coder agent writes dotnet code to resolve the task
// and asks the runner to execute it. A separate reviewer agent reviews the code block in the coder's reply.
var coderAgent = new GPTAgent(
name: "coder",
systemMessage: @"You act as a dotnet coder; you write dotnet code to resolve tasks. Once you finish writing code, ask runner to run the code for you.
Here are some rules to follow when writing dotnet code:
- put code between ```csharp and ```
- When creating an http client, use `var httpClient = new HttpClient()`. Don't use `using var httpClient = new HttpClient()` because it will cause an error when running the code.
- Try to use `var` instead of explicit types.
- Avoid using external libraries; use the .NET Core library instead.
- Use top-level statements to write code.
- Always print the result to the console. Don't write code that doesn't print anything.
If you need to install nuget packages, put nuget packages in the following format:
```nuget
nuget_package_name
```
If your code is incorrect, fix the error and send the code again.
Here's some external information:
- The link to the mlnet repo is: https://github.com/dotnet/machinelearning. You don't need a token to use the GitHub PR API. Make sure to include a User-Agent header, otherwise GitHub will reject it.
",
config: gptConfig,
temperature: 0.4f)
.RegisterPrintMessage();
// The code reviewer agent reviews whether the code block in coder's reply satisfies the following conditions:
// - There's only one code block
// - The code block is a csharp code block
// - The code block uses top-level statements
// - The code block does not use a `using` declaration
var codeReviewAgent = new GPTAgent(
name: "reviewer",
systemMessage: """
You are a code reviewer who reviews code from coder. You need to check whether the code satisfies the following conditions:
- The reply from coder contains at least one code block, e.g. ```csharp and ```
- There's only one code block and it's a csharp code block
- The code block is not inside a main function, i.e. it uses top-level statements
- The code block does not use a `using` declaration when creating an http client
You don't check the code style; only check whether the code satisfies the above conditions.
Put your comment between ```review and ```. If the code satisfies all conditions, put APPROVED in the review.result field. Otherwise, put REJECTED along with comments. Make sure your comment is clear and easy to understand.
## Example 1 ##
```review
comment: The code satisfies all conditions.
result: APPROVED
```
## Example 2 ##
```review
comment: The code is inside main function. Please rewrite the code in top level statement.
result: REJECTED
```
""",
config: gptConfig,
temperature: 0f)
.RegisterPrintMessage();
// create runner agent
// The runner agent runs the code block from coder's reply
// using the dotnet interactive service hook.
// It also truncates the output if the output is too long.
var runner = new AssistantAgent(
name: "runner",
defaultReply: "No code available, coder, write code please")
.RegisterDotnetCodeBlockExectionHook(interactiveService: service)
.RegisterMiddleware(async (msgs, option, agent, ct) =>
{
var mostRecentCoderMessage = msgs.LastOrDefault(x => x.From == "coder") ?? throw new Exception("No coder message found");
return await agent.GenerateReplyAsync(new[] { mostRecentCoderMessage }, option, ct);
})
.RegisterPrintMessage();
var adminToCoderTransition = Transition.Create(admin, coderAgent, async (from, to, messages) =>
{
// the last message should be from admin
var lastMessage = messages.Last();
if (lastMessage.From != admin.Name)
{
return false;
}
return true;
});
var coderToReviewerTransition = Transition.Create(coderAgent, codeReviewAgent);
var adminToRunnerTransition = Transition.Create(admin, runner, async (from, to, messages) =>
{
// the last message should be from admin
var lastMessage = messages.Last();
if (lastMessage.From != admin.Name)
{
return false;
}
// the previous messages should contain a message from coder
var coderMessage = messages.FirstOrDefault(x => x.From == coderAgent.Name);
if (coderMessage is null)
{
return false;
}
return true;
});
var runnerToAdminTransition = Transition.Create(runner, admin);
var reviewerToAdminTransition = Transition.Create(codeReviewAgent, admin);
var adminToUserTransition = Transition.Create(admin, userProxy, async (from, to, messages) =>
{
// the last message should be from admin
var lastMessage = messages.Last();
if (lastMessage.From != admin.Name)
{
return false;
}
return true;
});
var userToAdminTransition = Transition.Create(userProxy, admin);
var workflow = new Graph(
[
adminToCoderTransition,
coderToReviewerTransition,
reviewerToAdminTransition,
adminToRunnerTransition,
runnerToAdminTransition,
adminToUserTransition,
userToAdminTransition,
]);
// create group chat
var groupChat = new GroupChat(
admin: groupAdmin,
members: [admin, coderAgent, runner, codeReviewAgent, userProxy],
workflow: workflow);
// task 1: retrieve the most recent pr from mlnet and save it in result.txt
var groupChatManager = new GroupChatManager(groupChat);
await userProxy.SendAsync(groupChatManager, "Retrieve the most recent pr from mlnet and save it in result.txt", maxRound: 30);
File.Exists(result).Should().BeTrue();
// task 2: calculate the 39th fibonacci number
var answer = 63245986;
// clear the result file
File.Delete(result);
var conversationHistory = await userProxy.InitiateChatAsync(groupChatManager, "What's the 39th of fibonacci number? Save the result in result.txt", maxRound: 10);
File.Exists(result).Should().BeTrue();
var resultContent = File.ReadAllText(result);
resultContent.Should().Contain(answer.ToString());
}
}
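The transition graph built above can be understood independently of AutoGen. Here is a hypothetical, dependency-free sketch of the same idea (the `Edge` type and agent names are illustrative, not AutoGen's API): each edge carries a predicate over the speaker history, and the next speaker must be reachable from the current one via an edge whose predicate passes.

```csharp
using System;
using System.Collections.Generic;
using System.Linq;

// A minimal model of a speaker-transition graph: the edge (From, To) is
// taken only if its predicate accepts the history of speakers so far.
public record Edge(string From, string To, Func<IReadOnlyList<string>, bool> CanTransition);

public static class TransitionDemo
{
    // Returns every speaker reachable from `current` under the given history.
    public static IEnumerable<string> NextSpeakers(
        IReadOnlyList<Edge> edges, string current, IReadOnlyList<string> history) =>
        edges.Where(e => e.From == current && e.CanTransition(history))
             .Select(e => e.To);

    public static void Main()
    {
        var edges = new List<Edge>
        {
            // admin may hand off to coder unconditionally
            new("admin", "coder", _ => true),
            // admin may hand off to runner only after coder has spoken
            new("admin", "runner", h => h.Contains("coder")),
            new("coder", "reviewer", _ => true),
        };

        var history = new List<string> { "user", "admin" }; // speakers so far
        Console.WriteLine(string.Join(",", NextSpeakers(edges, "admin", history))); // coder
    }
}
```

This is the same constraint the `Graph` in the example enforces: the group admin still picks the next speaker, but only among candidates the workflow's edges (and their predicates) allow.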

@@ -0,0 +1,152 @@
// Copyright (c) Microsoft Corporation. All rights reserved.
// Example05_Dalle_And_GPT4V.cs
using AutoGen;
using AutoGen.Core;
using Azure.AI.OpenAI;
using FluentAssertions;
using autogen = AutoGen.LLMConfigAPI;
public partial class Example05_Dalle_And_GPT4V
{
private readonly OpenAIClient openAIClient;
public Example05_Dalle_And_GPT4V(OpenAIClient openAIClient)
{
this.openAIClient = openAIClient;
}
/// <summary>
/// Generate image from prompt using DALL-E.
/// </summary>
/// <param name="prompt">prompt with feedback</param>
/// <returns></returns>
[Function]
public async Task<string> GenerateImage(string prompt)
{
// generate an image from the prompt using DALL-E and return the url
var option = new ImageGenerationOptions
{
Size = ImageSize.Size1024x1024,
Style = ImageGenerationStyle.Vivid,
ImageCount = 1,
Prompt = prompt,
Quality = ImageGenerationQuality.Standard,
DeploymentName = "dall-e-3",
};
var imageResponse = await openAIClient.GetImageGenerationsAsync(option);
var imageUrl = imageResponse.Value.Data.First().Url.OriginalString;
return $@"// ignore this line [IMAGE_GENERATION]
The image is generated from prompt {prompt}
{imageUrl}";
}
public static async Task RunAsync()
{
// This example shows how to use DALL-E and GPT-4V to generate an image from a prompt and feedback.
// The DALL-E agent generates an image from the prompt.
// The GPT-4V agent provides feedback to the DALL-E agent to help it generate a better image.
// The conversation terminates when the image satisfies the condition.
// The image is saved to image.jpg in the current directory.
// get OpenAI Key and create config
var openAIKey = Environment.GetEnvironmentVariable("OPENAI_API_KEY") ?? throw new Exception("Please set OPENAI_API_KEY environment variable.");
var gpt35Config = autogen.GetOpenAIConfigList(openAIKey, new[] { "gpt-3.5-turbo" });
var gpt4vConfig = autogen.GetOpenAIConfigList(openAIKey, new[] { "gpt-4-vision-preview" });
var openAIClient = new OpenAIClient(openAIKey);
var instance = new Example05_Dalle_And_GPT4V(openAIClient);
var imagePath = Path.Combine(Environment.CurrentDirectory, "image.jpg");
if (File.Exists(imagePath))
{
File.Delete(imagePath);
}
var dalleAgent = new AssistantAgent(
name: "dalle",
systemMessage: "You are a DALL-E agent that generates images from prompts. When the conversation is terminated, return the most recent image url",
llmConfig: new ConversableAgentConfig
{
Temperature = 0,
ConfigList = gpt35Config,
FunctionContracts = new[]
{
instance.GenerateImageFunctionContract,
},
},
functionMap: new Dictionary<string, Func<string, Task<string>>>
{
{ nameof(GenerateImage), instance.GenerateImageWrapper },
})
.RegisterMiddleware(async (msgs, option, agent, ct) =>
{
// if last message contains [TERMINATE], then find the last image url and terminate the conversation
if (msgs.Last().GetContent()?.Contains("TERMINATE") is true)
{
var lastMessageWithImage = msgs.Last(msg => msg is ImageMessage) as ImageMessage;
var lastImageUrl = lastMessageWithImage.Url;
Console.WriteLine($"download image from {lastImageUrl} to {imagePath}");
var httpClient = new HttpClient();
var imageBytes = await httpClient.GetByteArrayAsync(lastImageUrl);
File.WriteAllBytes(imagePath, imageBytes);
var messageContent = $@"{GroupChatExtension.TERMINATE}
{lastImageUrl}";
return new TextMessage(Role.Assistant, messageContent)
{
From = "dalle",
};
}
var reply = await agent.GenerateReplyAsync(msgs, option, ct);
if (reply.GetContent() is string content && content.Contains("IMAGE_GENERATION"))
{
var imageUrl = content.Split("\n").Last();
var imageMessage = new ImageMessage(Role.Assistant, imageUrl, from: reply.From);
return imageMessage;
}
else
{
return reply;
}
})
.RegisterPrintMessage();
var gpt4VAgent = new AssistantAgent(
name: "gpt4v",
systemMessage: @"You are a critic who provides feedback to the DALL-E agent.
Carefully check the image generated by the DALL-E agent and provide feedback.
If the image satisfies the condition, terminate the conversation by saying [TERMINATE].
Otherwise, provide detailed feedback to the DALL-E agent so it can generate a better image.
The image should satisfy the following conditions:
- There should be a cat and a mouse in the image
- The cat should be chasing after the mouse
",
llmConfig: new ConversableAgentConfig
{
Temperature = 0,
ConfigList = gpt4vConfig,
})
.RegisterPrintMessage();
var maxRound = 20;
await gpt4VAgent.InitiateChatAsync(
receiver: dalleAgent,
message: "Hey dalle, please generate image from prompt: English short hair blue cat chase after a mouse",
maxRound: maxRound);
File.Exists(imagePath).Should().BeTrue();
}
}

@@ -0,0 +1,32 @@
// Copyright (c) Microsoft Corporation. All rights reserved.
// Example06_UserProxyAgent.cs
using AutoGen.Core;
using AutoGen.OpenAI;
namespace AutoGen.BasicSample;
public static class Example06_UserProxyAgent
{
public static async Task RunAsync()
{
var gpt35 = LLMConfiguration.GetOpenAIGPT3_5_Turbo();
var assistantAgent = new GPTAgent(
name: "assistant",
systemMessage: "You are an assistant that helps the user to do some tasks.",
config: gpt35)
.RegisterPrintMessage();
// set human input mode to ALWAYS so that the user always provides input
var userProxyAgent = new UserProxyAgent(
name: "user",
humanInputMode: HumanInputMode.ALWAYS)
.RegisterPrintMessage();
// start the conversation
await userProxyAgent.InitiateChatAsync(
receiver: assistantAgent,
message: "Hey assistant, please help me to do some tasks.",
maxRound: 10);
}
}

@@ -0,0 +1,377 @@
// Copyright (c) Microsoft Corporation. All rights reserved.
// Example07_Dynamic_GroupChat_Calculate_Fibonacci.cs
using System.Text;
using System.Text.Json;
using AutoGen;
using AutoGen.BasicSample;
using AutoGen.DotnetInteractive;
using AutoGen.Core;
using AutoGen.OpenAI;
using FluentAssertions;
public partial class Example07_Dynamic_GroupChat_Calculate_Fibonacci
{
#region reviewer_function
public struct CodeReviewResult
{
public bool HasMultipleCodeBlocks { get; set; }
public bool IsTopLevelStatement { get; set; }
public bool IsDotnetCodeBlock { get; set; }
public bool IsPrintResultToConsole { get; set; }
}
/// <summary>
/// review code block
/// </summary>
/// <param name="hasMultipleCodeBlocks">true if there are multiple csharp code blocks</param>
/// <param name="isTopLevelStatement">true if the code is in top level statement</param>
/// <param name="isDotnetCodeBlock">true if the code block is csharp code block</param>
/// <param name="isPrintResultToConsole">true if the code block print out result to console</param>
[Function]
public async Task<string> ReviewCodeBlock(
bool hasMultipleCodeBlocks,
bool isTopLevelStatement,
bool isDotnetCodeBlock,
bool isPrintResultToConsole)
{
var obj = new CodeReviewResult
{
HasMultipleCodeBlocks = hasMultipleCodeBlocks,
IsTopLevelStatement = isTopLevelStatement,
IsDotnetCodeBlock = isDotnetCodeBlock,
IsPrintResultToConsole = isPrintResultToConsole,
};
return JsonSerializer.Serialize(obj);
}
#endregion reviewer_function
#region create_coder
public static async Task<IAgent> CreateCoderAgentAsync()
{
var gpt3Config = LLMConfiguration.GetAzureOpenAIGPT3_5_Turbo();
var coder = new GPTAgent(
name: "coder",
systemMessage: @"You act as a dotnet coder; you write dotnet code to resolve tasks. Once you finish writing code, ask runner to run the code for you.
Here are some rules to follow when writing dotnet code:
- put code between ```csharp and ```
- Avoid adding the `using` keyword when creating a disposable object, e.g. `var httpClient = new HttpClient()`
- Try to use `var` instead of explicit types.
- Avoid using external libraries; use the .NET Core library instead.
- Use top-level statements to write code.
- Always print the result to the console. Don't write code that doesn't print anything.
If you need to install nuget packages, put nuget packages in the following format:
```nuget
nuget_package_name
```
If your code is incorrect, runner will tell you the error message. Fix the error and send the code again.",
config: gpt3Config,
temperature: 0.4f)
.RegisterPrintMessage();
return coder;
}
#endregion create_coder
#region create_runner
public static async Task<IAgent> CreateRunnerAgentAsync(InteractiveService service)
{
var runner = new AssistantAgent(
name: "runner",
systemMessage: "You run dotnet code",
defaultReply: "No code available.")
.RegisterDotnetCodeBlockExectionHook(interactiveService: service)
.RegisterReply(async (msgs, _) =>
{
if (!msgs.Any())
{
return new TextMessage(Role.Assistant, "No code available. Coder, please write code.");
}
return null;
})
.RegisterPreProcess(async (msgs, _) =>
{
// retrieve the most recent message from coder
var coderMsg = msgs.LastOrDefault(msg => msg.From == "coder");
if (coderMsg is null)
{
return Enumerable.Empty<IMessage>();
}
else
{
return new[] { coderMsg };
}
})
.RegisterPrintMessage();
return runner;
}
#endregion create_runner
#region create_admin
public static async Task<IAgent> CreateAdminAsync()
{
var gpt3Config = LLMConfiguration.GetAzureOpenAIGPT3_5_Turbo();
var admin = new GPTAgent(
name: "admin",
systemMessage: "You are the group admin. Terminate the group chat once the task is completed by saying [TERMINATE] plus the final answer.",
temperature: 0,
config: gpt3Config)
.RegisterPostProcess(async (_, reply, _) =>
{
if (reply is TextMessage textMessage && textMessage.Content.Contains("TERMINATE"))
{
var content = $"{textMessage.Content}\n\n {GroupChatExtension.TERMINATE}";
return new TextMessage(Role.Assistant, content, from: reply.From);
}
return reply;
});
return admin;
}
#endregion create_admin
#region create_reviewer
public static async Task<IAgent> CreateReviewerAgentAsync()
{
var gpt3Config = LLMConfiguration.GetAzureOpenAIGPT3_5_Turbo();
var functions = new Example07_Dynamic_GroupChat_Calculate_Fibonacci();
var reviewer = new GPTAgent(
name: "code_reviewer",
systemMessage: @"You review code blocks from the coder",
config: gpt3Config,
functions: [functions.ReviewCodeBlockFunction],
functionMap: new Dictionary<string, Func<string, Task<string>>>()
{
{ nameof(ReviewCodeBlock), functions.ReviewCodeBlockWrapper },
})
.RegisterMiddleware(async (msgs, option, innerAgent, ct) =>
{
var maxRetry = 3;
var reply = await innerAgent.GenerateReplyAsync(msgs, option, ct);
while (maxRetry-- > 0)
{
if (reply.GetToolCalls() is var toolCalls && toolCalls.Count() == 1 && toolCalls[0].FunctionName == nameof(ReviewCodeBlock))
{
var toolCallResult = reply.GetContent();
var reviewResultObj = JsonSerializer.Deserialize<CodeReviewResult>(toolCallResult);
var reviews = new List<string>();
if (reviewResultObj.HasMultipleCodeBlocks)
{
var fixCodeBlockPrompt = @"There are multiple code blocks, please combine them into one code block";
reviews.Add(fixCodeBlockPrompt);
}
if (reviewResultObj.IsDotnetCodeBlock is false)
{
var fixCodeBlockPrompt = @"The code block is not a csharp code block, please write dotnet code only";
reviews.Add(fixCodeBlockPrompt);
}
if (reviewResultObj.IsTopLevelStatement is false)
{
var fixCodeBlockPrompt = @"The code doesn't use top-level statements, please rewrite your dotnet code using top-level statements";
reviews.Add(fixCodeBlockPrompt);
}
if (reviewResultObj.IsPrintResultToConsole is false)
{
var fixCodeBlockPrompt = @"The code doesn't print the result to the console, please print the result to the console";
reviews.Add(fixCodeBlockPrompt);
}
if (reviews.Count > 0)
{
var sb = new StringBuilder();
sb.AppendLine("There're some comments from code reviewer, please fix these comments");
foreach (var review in reviews)
{
sb.AppendLine($"- {review}");
}
return new TextMessage(Role.Assistant, sb.ToString(), from: "code_reviewer");
}
else
{
var msg = new TextMessage(Role.Assistant, "The code looks good, please ask runner to run the code for you.")
{
From = "code_reviewer",
};
return msg;
}
}
else
{
var originalContent = reply.GetContent();
var prompt = $@"Please convert the content to ReviewCodeBlock function arguments.
## Original Content
{originalContent}";
reply = await innerAgent.SendAsync(prompt, msgs, ct);
}
}
throw new Exception("Failed to review code block");
})
.RegisterPrintMessage();
return reviewer;
}
#endregion create_reviewer
public static async Task RunWorkflowAsync()
{
long the39thFibonacciNumber = 63245986;
var workDir = Path.Combine(Path.GetTempPath(), "InteractiveService");
if (!Directory.Exists(workDir))
Directory.CreateDirectory(workDir);
using var service = new InteractiveService(workDir);
var dotnetInteractiveFunctions = new DotnetInteractiveFunction(service);
await service.StartAsync(workDir, default);
#region create_workflow
var reviewer = await CreateReviewerAgentAsync();
var coder = await CreateCoderAgentAsync();
var runner = await CreateRunnerAgentAsync(service);
var admin = await CreateAdminAsync();
var admin2CoderTransition = Transition.Create(admin, coder);
var coder2ReviewerTransition = Transition.Create(coder, reviewer);
var reviewer2RunnerTransition = Transition.Create(
from: reviewer,
to: runner,
canTransitionAsync: async (from, to, messages) =>
{
var lastMessage = messages.Last();
if (lastMessage is TextMessage textMessage && textMessage.Content.ToLower().Contains("the code looks good, please ask runner to run the code for you."))
{
// ask runner to run the code
return true;
}
return false;
});
var reviewer2CoderTransition = Transition.Create(
from: reviewer,
to: coder,
canTransitionAsync: async (from, to, messages) =>
{
var lastMessage = messages.Last();
if (lastMessage is TextMessage textMessage && textMessage.Content.ToLower().Contains("there're some comments from code reviewer, please fix these comments"))
{
// ask coder to fix the code based on reviewer's comments
return true;
}
return false;
});
var runner2CoderTransition = Transition.Create(
from: runner,
to: coder,
canTransitionAsync: async (from, to, messages) =>
{
var lastMessage = messages.Last();
if (lastMessage is TextMessage textMessage && textMessage.Content.ToLower().Contains("error"))
{
// ask coder to fix the error
return true;
}
return false;
});
var runner2AdminTransition = Transition.Create(runner, admin);
var workflow = new Graph(
[
admin2CoderTransition,
coder2ReviewerTransition,
reviewer2RunnerTransition,
reviewer2CoderTransition,
runner2CoderTransition,
runner2AdminTransition,
]);
#endregion create_workflow
#region create_group_chat_with_workflow
var groupChat = new GroupChat(
admin: admin,
workflow: workflow,
members:
[
admin,
coder,
runner,
reviewer,
]);
admin.SendIntroduction("Welcome to my group, work together to resolve my task", groupChat);
coder.SendIntroduction("I will write dotnet code to resolve task", groupChat);
reviewer.SendIntroduction("I will review dotnet code", groupChat);
runner.SendIntroduction("I will run dotnet code once the review is done", groupChat);
var groupChatManager = new GroupChatManager(groupChat);
var conversationHistory = await admin.InitiateChatAsync(groupChatManager, "What's the 39th Fibonacci number?", maxRound: 10);
#endregion create_group_chat_with_workflow
// the last message is from admin, which is the termination message
var lastMessage = conversationHistory.Last();
lastMessage.From.Should().Be("admin");
lastMessage.IsGroupChatTerminateMessage().Should().BeTrue();
lastMessage.Should().BeOfType<TextMessage>();
lastMessage.GetContent().Should().Contain(the39thFibonacciNumber.ToString());
}
public static async Task RunAsync()
{
long the39thFibonacciNumber = 63245986;
var workDir = Path.Combine(Path.GetTempPath(), "InteractiveService");
if (!Directory.Exists(workDir))
Directory.CreateDirectory(workDir);
using var service = new InteractiveService(workDir);
var dotnetInteractiveFunctions = new DotnetInteractiveFunction(service);
await service.StartAsync(workDir, default);
#region create_group_chat
var reviewer = await CreateReviewerAgentAsync();
var coder = await CreateCoderAgentAsync();
var runner = await CreateRunnerAgentAsync(service);
var admin = await CreateAdminAsync();
var groupChat = new GroupChat(
admin: admin,
members:
[
admin,
coder,
runner,
reviewer,
]);
admin.SendIntroduction("Welcome to my group, work together to resolve my task", groupChat);
coder.SendIntroduction("I will write dotnet code to resolve task", groupChat);
reviewer.SendIntroduction("I will review dotnet code", groupChat);
runner.SendIntroduction("I will run dotnet code once the review is done", groupChat);
var groupChatManager = new GroupChatManager(groupChat);
var conversationHistory = await admin.InitiateChatAsync(groupChatManager, "What's the 39th Fibonacci number?", maxRound: 10);
// the last message is from admin, which is the termination message
var lastMessage = conversationHistory.Last();
lastMessage.From.Should().Be("admin");
lastMessage.IsGroupChatTerminateMessage().Should().BeTrue();
lastMessage.Should().BeOfType<TextMessage>();
lastMessage.GetContent().Should().Contain(the39thFibonacciNumber.ToString());
#endregion create_group_chat
}
}

// Copyright (c) Microsoft Corporation. All rights reserved.
// Example08_LMStudio.cs
#region lmstudio_using_statements
using AutoGen.Core;
using AutoGen.LMStudio;
#endregion lmstudio_using_statements
namespace AutoGen.BasicSample;
public class Example08_LMStudio
{
public static async Task RunAsync()
{
#region lmstudio_example_1
var config = new LMStudioConfig("localhost", 1234);
var lmAgent = new LMStudioAgent("assistant", config: config)
.RegisterPrintMessage();
await lmAgent.SendAsync("Can you write a piece of C# code to calculate the 100th Fibonacci number?");
// output from assistant (the output below is generated using llama-2-chat-7b, the output may vary depending on the model used)
//
// Of course! To calculate the 100th number in the Fibonacci sequence using C#, you can use the following code:```
// using System;
// class FibonacciSequence {
// static int Fibonacci(int n) {
// if (n <= 1) {
// return 1;
// } else {
// return Fibonacci(n - 1) + Fibonacci(n - 2);
// }
// }
// static void Main() {
// Console.WriteLine("The 100th number in the Fibonacci sequence is: " + Fibonacci(100));
// }
// }
// ```
// In this code, we define a function `Fibonacci` that takes an integer `n` as input and returns the `n`-th number in the Fibonacci sequence. The function uses a recursive approach to calculate the value of the sequence.
// The `Main` method simply calls the `Fibonacci` function with the argument `100`, and prints the result to the console.
// Note that this code will only work for positive integers `n`. If you want to calculate the Fibonacci sequence for other types of numbers, such as real or complex numbers, you will need to modify the code accordingly.
#endregion lmstudio_example_1
}
}

// Copyright (c) Microsoft Corporation. All rights reserved.
// Example09_LMStudio_FunctionCall.cs
using System.Text.Json;
using System.Text.Json.Serialization;
using AutoGen.Core;
using AutoGen.LMStudio;
using Azure.AI.OpenAI;
namespace AutoGen.BasicSample;
public class LLaMAFunctionCall
{
[JsonPropertyName("name")]
public string Name { get; set; }
[JsonPropertyName("arguments")]
public JsonElement Arguments { get; set; }
}
public partial class Example09_LMStudio_FunctionCall
{
/// <summary>
/// Get weather from location.
/// </summary>
/// <param name="location">location</param>
/// <param name="date">date. type is string</param>
[Function]
public async Task<string> GetWeather(string location, string date)
{
return $"[Function] The weather on {date} in {location} is sunny.";
}
/// <summary>
/// Search query on Google and return the results.
/// </summary>
/// <param name="query">search query</param>
[Function]
public async Task<string> GoogleSearch(string query)
{
return $"[Function] Here are the search results for {query}.";
}
private static object SerializeFunctionDefinition(FunctionDefinition functionDefinition)
{
return new
{
type = "function",
function = new
{
name = functionDefinition.Name,
description = functionDefinition.Description,
parameters = functionDefinition.Parameters.ToObjectFromJson<object>(),
}
};
}
public static async Task RunAsync()
{
#region lmstudio_function_call_example
// This example has been verified to work with Trelis-Llama-2-7b-chat-hf-function-calling-v3
var instance = new Example09_LMStudio_FunctionCall();
var config = new LMStudioConfig("localhost", 1234);
var systemMessage = @$"You are a helpful AI assistant.";
// Because the LM Studio server doesn't support OpenAI function calls yet,
// we simulate them by putting the function definitions in the system message
// and asking the agent to respond in function-call object format using a few-shot example
object[] functionList =
[
SerializeFunctionDefinition(instance.GetWeatherFunction),
SerializeFunctionDefinition(instance.GoogleSearchFunction)
];
var functionListString = JsonSerializer.Serialize(functionList, new JsonSerializerOptions { WriteIndented = true });
var lmAgent = new LMStudioAgent(
name: "assistant",
systemMessage: @$"
You are a helpful AI assistant
You have access to the following functions. Use them if required:
{functionListString}",
config: config)
.RegisterMiddleware(async (msgs, option, innerAgent, ct) =>
{
// inject a few-shot example into the messages
var exampleGetWeather = new TextMessage(Role.User, "Get weather in London");
var exampleAnswer = new TextMessage(Role.Assistant, "{\n \"name\": \"GetWeather\",\n \"arguments\": {\n \"city\": \"London\"\n }\n}", from: innerAgent.Name);
msgs = new[] { exampleGetWeather, exampleAnswer }.Concat(msgs).ToArray();
var reply = await innerAgent.GenerateReplyAsync(msgs, option, ct);
// if reply is a function call, invoke function
var content = reply.GetContent();
try
{
if (JsonSerializer.Deserialize<LLaMAFunctionCall>(content) is { } functionCall)
{
var arguments = JsonSerializer.Serialize(functionCall.Arguments);
// invoke function wrapper
if (functionCall.Name == instance.GetWeatherFunction.Name)
{
var result = await instance.GetWeatherWrapper(arguments);
return new TextMessage(Role.Assistant, result);
}
else if (functionCall.Name == instance.GoogleSearchFunction.Name)
{
var result = await instance.GoogleSearchWrapper(arguments);
return new TextMessage(Role.Assistant, result);
}
else
{
throw new Exception($"Unknown function call: {functionCall.Name}");
}
}
}
catch (JsonException)
{
// ignore
}
return reply;
})
.RegisterPrintMessage();
var userProxyAgent = new UserProxyAgent(
name: "user",
humanInputMode: HumanInputMode.ALWAYS);
await userProxyAgent.SendAsync(
receiver: lmAgent,
"Search the names of the five largest stocks in the US by market cap ");
#endregion lmstudio_function_call_example
}
}

// Copyright (c) Microsoft Corporation. All rights reserved.
// Example10_SemanticKernel.cs
using System.ComponentModel;
using AutoGen.Core;
using AutoGen.SemanticKernel.Extension;
using FluentAssertions;
using Microsoft.SemanticKernel;
using Microsoft.SemanticKernel.ChatCompletion;
using Microsoft.SemanticKernel.Connectors.OpenAI;
namespace AutoGen.BasicSample;
public class LightPlugin
{
public bool IsOn { get; set; } = false;
[KernelFunction]
[Description("Gets the state of the light.")]
public string GetState() => this.IsOn ? "on" : "off";
[KernelFunction]
[Description("Changes the state of the light.")]
public string ChangeState(bool newState)
{
this.IsOn = newState;
var state = this.GetState();
// Print the state to the console
Console.ForegroundColor = ConsoleColor.DarkBlue;
Console.WriteLine($"[Light is now {state}]");
Console.ResetColor();
return state;
}
}
public class Example10_SemanticKernel
{
public static async Task RunAsync()
{
var openAIKey = Environment.GetEnvironmentVariable("OPENAI_API_KEY") ?? throw new Exception("Please set OPENAI_API_KEY environment variable.");
var modelId = "gpt-3.5-turbo";
var builder = Kernel.CreateBuilder()
.AddOpenAIChatCompletion(modelId: modelId, apiKey: openAIKey);
var kernel = builder.Build();
var settings = new OpenAIPromptExecutionSettings
{
ToolCallBehavior = ToolCallBehavior.AutoInvokeKernelFunctions,
};
kernel.Plugins.AddFromObject(new LightPlugin());
var skAgent = kernel
.ToSemanticKernelAgent(name: "assistant", systemMessage: "You control the light", settings);
// Send a message to the skAgent. The skAgent supports the following message types:
// - IMessage<ChatMessageContent>
// - (streaming) IMessage<StreamingChatMessageContent>
// You can create an IMessage<ChatMessageContent> using MessageEnvelope.Create
var chatMessageContent = MessageEnvelope.Create(new ChatMessageContent(AuthorRole.User, "Toggle the light"));
var reply = await skAgent.SendAsync(chatMessageContent);
reply.Should().BeOfType<MessageEnvelope<ChatMessageContent>>();
Console.WriteLine((reply as IMessage<ChatMessageContent>).Content.Items[0].As<TextContent>().Text);
var skAgentWithMiddleware = skAgent
.RegisterMessageConnector()
.RegisterPrintMessage();
// Now the skAgentWithMiddleware supports more IMessage types like TextMessage, ImageMessage or MultiModalMessage
// It also registers a print-message hook to print messages in a human-readable format to the console
await skAgent.SendAsync(chatMessageContent);
await skAgentWithMiddleware.SendAsync(new TextMessage(Role.User, "Toggle the light"));
// The more message types an agent supports, the more flexible it is to use in different scenarios
// For example, since TextMessage is supported, the skAgentWithMiddleware can be used with a user proxy.
var userProxy = new UserProxyAgent("user");
await skAgentWithMiddleware.InitiateChatAsync(userProxy, "how can I help you today");
}
}

// Copyright (c) Microsoft Corporation. All rights reserved.
// Example11_Sequential_GroupChat_Example.cs
#region using_statement
using AutoGen.Core;
using AutoGen.OpenAI;
using AutoGen.OpenAI.Extension;
using AutoGen.SemanticKernel;
using AutoGen.SemanticKernel.Extension;
using Azure.AI.OpenAI;
using Microsoft.SemanticKernel;
using Microsoft.SemanticKernel.Plugins.Web;
using Microsoft.SemanticKernel.Plugins.Web.Bing;
#endregion using_statement
namespace AutoGen.BasicSample;
public partial class Sequential_GroupChat_Example
{
public static async Task<IAgent> CreateBingSearchAgentAsync()
{
#region CreateBingSearchAgent
var config = LLMConfiguration.GetAzureOpenAIGPT3_5_Turbo();
var apiKey = config.ApiKey;
var kernelBuilder = Kernel.CreateBuilder()
.AddAzureOpenAIChatCompletion(config.DeploymentName, config.Endpoint, apiKey);
var bingApiKey = Environment.GetEnvironmentVariable("BING_API_KEY") ?? throw new Exception("BING_API_KEY environment variable is not set");
var bingSearch = new BingConnector(bingApiKey);
var webSearchPlugin = new WebSearchEnginePlugin(bingSearch);
kernelBuilder.Plugins.AddFromObject(webSearchPlugin);
var kernel = kernelBuilder.Build();
var kernelAgent = new SemanticKernelAgent(
kernel: kernel,
name: "bing-search",
systemMessage: """
You search Bing and return the results as-is.
You put the original search result between ```bing and ```
e.g.
```bing
xxx
```
""")
.RegisterMessageConnector()
.RegisterPrintMessage(); // pretty print the message
return kernelAgent;
#endregion CreateBingSearchAgent
}
public static async Task<IAgent> CreateSummarizerAgentAsync()
{
#region CreateSummarizerAgent
var config = LLMConfiguration.GetAzureOpenAIGPT3_5_Turbo();
var apiKey = config.ApiKey;
var endPoint = new Uri(config.Endpoint);
var openAIClient = new OpenAIClient(endPoint, new Azure.AzureKeyCredential(apiKey));
var openAIClientAgent = new OpenAIChatAgent(
openAIClient: openAIClient,
name: "summarizer",
modelName: config.DeploymentName,
systemMessage: "You summarize search results from Bing in a short and concise manner");
return openAIClientAgent
.RegisterMessageConnector()
.RegisterPrintMessage(); // pretty print the message
#endregion CreateSummarizerAgent
}
public static async Task RunAsync()
{
#region Sequential_GroupChat_Example
var userProxyAgent = new UserProxyAgent(
name: "user",
humanInputMode: HumanInputMode.ALWAYS)
.RegisterPrintMessage();
var bingSearchAgent = await CreateBingSearchAgentAsync();
var summarizerAgent = await CreateSummarizerAgentAsync();
var groupChat = new RoundRobinGroupChat(
agents: [userProxyAgent, bingSearchAgent, summarizerAgent]);
var groupChatAgent = new GroupChatManager(groupChat);
var history = await userProxyAgent.InitiateChatAsync(
receiver: groupChatAgent,
message: "How to deploy an openai resource on azure",
maxRound: 10);
#endregion Sequential_GroupChat_Example
}
}

// Copyright (c) Microsoft Corporation. All rights reserved.
// Example11_TwoAgent_Fill_Application.cs
using System.Text;
using AutoGen.OpenAI;
using AutoGen.Core;
using Azure.AI.OpenAI;
using AutoGen.OpenAI.Extension;
namespace AutoGen.BasicSample;
public partial class TwoAgent_Fill_Application
{
private string? name = null;
private string? email = null;
private string? phone = null;
private string? address = null;
private bool? receiveUpdates = null;
[Function]
public async Task<string> SaveProgress(
string name,
string email,
string phone,
string address,
bool? receiveUpdates)
{
this.name = !string.IsNullOrEmpty(name) ? name : this.name;
this.email = !string.IsNullOrEmpty(email) ? email : this.email;
this.phone = !string.IsNullOrEmpty(phone) ? phone : this.phone;
this.address = !string.IsNullOrEmpty(address) ? address : this.address;
this.receiveUpdates = receiveUpdates ?? this.receiveUpdates;
var missingInformationStringBuilder = new StringBuilder();
if (string.IsNullOrEmpty(this.name))
{
missingInformationStringBuilder.AppendLine("Name is missing.");
}
if (string.IsNullOrEmpty(this.email))
{
missingInformationStringBuilder.AppendLine("Email is missing.");
}
if (string.IsNullOrEmpty(this.phone))
{
missingInformationStringBuilder.AppendLine("Phone is missing.");
}
if (string.IsNullOrEmpty(this.address))
{
missingInformationStringBuilder.AppendLine("Address is missing.");
}
if (this.receiveUpdates == null)
{
missingInformationStringBuilder.AppendLine("ReceiveUpdates is missing.");
}
if (missingInformationStringBuilder.Length > 0)
{
return missingInformationStringBuilder.ToString();
}
else
{
return "Application information is saved to database.";
}
}
public static async Task<IAgent> CreateSaveProgressAgent()
{
var gpt3Config = LLMConfiguration.GetAzureOpenAIGPT3_5_Turbo();
var endPoint = gpt3Config.Endpoint ?? throw new Exception("Please set AZURE_OPENAI_ENDPOINT environment variable.");
var apiKey = gpt3Config.ApiKey ?? throw new Exception("Please set AZURE_OPENAI_API_KEY environment variable.");
var openaiClient = new OpenAIClient(new Uri(endPoint), new Azure.AzureKeyCredential(apiKey));
var instance = new TwoAgent_Fill_Application();
var functionCallConnector = new FunctionCallMiddleware(
functions: [instance.SaveProgressFunctionContract],
functionMap: new Dictionary<string, Func<string, Task<string>>>
{
{ instance.SaveProgressFunctionContract.Name, instance.SaveProgressWrapper },
});
var chatAgent = new OpenAIChatAgent(
openAIClient: openaiClient,
name: "application",
modelName: gpt3Config.DeploymentName,
systemMessage: """You are a helpful application form assistant who saves progress while the user fills out the application.""")
.RegisterMessageConnector()
.RegisterMiddleware(functionCallConnector)
.RegisterMiddleware(async (msgs, option, agent, ct) =>
{
var lastUserMessage = msgs.Last() ?? throw new Exception("No user message found.");
var prompt = $"""
Save progress according to the most recent information provided by user.
```user
{lastUserMessage.GetContent()}
```
""";
// send the prompt, which wraps the user's latest message, to the inner agent
return await agent.GenerateReplyAsync([new TextMessage(Role.User, prompt)], option, ct);
});
return chatAgent;
}
public static async Task<IAgent> CreateAssistantAgent()
{
var gpt3Config = LLMConfiguration.GetAzureOpenAIGPT3_5_Turbo();
var endPoint = gpt3Config.Endpoint ?? throw new Exception("Please set AZURE_OPENAI_ENDPOINT environment variable.");
var apiKey = gpt3Config.ApiKey ?? throw new Exception("Please set AZURE_OPENAI_API_KEY environment variable.");
var openaiClient = new OpenAIClient(new Uri(endPoint), new Azure.AzureKeyCredential(apiKey));
var chatAgent = new OpenAIChatAgent(
openAIClient: openaiClient,
name: "assistant",
modelName: gpt3Config.DeploymentName,
systemMessage: """You create a polite prompt to ask the user to provide missing information""")
.RegisterMessageConnector()
.RegisterPrintMessage()
.RegisterMiddleware(async (msgs, option, agent, ct) =>
{
var lastReply = msgs.Last() ?? throw new Exception("No reply found.");
var reply = await agent.GenerateReplyAsync(msgs, option, ct);
// if application is complete, exit conversation by sending termination message
if (lastReply.GetContent().Contains("Application information is saved to database."))
{
return new TextMessage(Role.Assistant, GroupChatExtension.TERMINATE, from: agent.Name);
}
else
{
return reply;
}
});
return chatAgent;
}
public static async Task<IAgent> CreateUserAgent()
{
var gpt3Config = LLMConfiguration.GetAzureOpenAIGPT3_5_Turbo();
var endPoint = gpt3Config.Endpoint ?? throw new Exception("Please set AZURE_OPENAI_ENDPOINT environment variable.");
var apiKey = gpt3Config.ApiKey ?? throw new Exception("Please set AZURE_OPENAI_API_KEY environment variable.");
var openaiClient = new OpenAIClient(new Uri(endPoint), new Azure.AzureKeyCredential(apiKey));
var chatAgent = new OpenAIChatAgent(
openAIClient: openaiClient,
name: "user",
modelName: gpt3Config.DeploymentName,
systemMessage: """
You are a user who is filling out an application form. Simply provide the information as requested and answer the questions; don't do anything else.
Here's some personal information about you:
- name: John Doe
- email: 1234567@gmail.com
- phone: 123-456-7890
- address: 1234 Main St, Redmond, WA 98052
- want to receive updates? true
""")
.RegisterMessageConnector()
.RegisterPrintMessage();
return chatAgent;
}
public static async Task RunAsync()
{
var applicationAgent = await CreateSaveProgressAgent();
var assistantAgent = await CreateAssistantAgent();
var userAgent = await CreateUserAgent();
var userToApplicationTransition = Transition.Create(userAgent, applicationAgent);
var applicationToAssistantTransition = Transition.Create(applicationAgent, assistantAgent);
var assistantToUserTransition = Transition.Create(assistantAgent, userAgent);
var workflow = new Graph(
[
userToApplicationTransition,
applicationToAssistantTransition,
assistantToUserTransition,
]);
var groupChat = new GroupChat(
members: [userAgent, applicationAgent, assistantAgent],
workflow: workflow);
var groupChatManager = new GroupChatManager(groupChat);
var initialMessage = await assistantAgent.SendAsync("Generate a greeting message for the user and start the conversation by asking for their name.");
var chatHistory = await userAgent.SendAsync(groupChatManager, [initialMessage], maxRound: 30);
var lastMessage = chatHistory.Last();
Console.WriteLine(lastMessage.GetContent());
}
}

// Copyright (c) Microsoft Corporation. All rights reserved.
// Example13_OpenAIAgent_JsonMode.cs
using System.Text.Json;
using System.Text.Json.Serialization;
using AutoGen.Core;
using AutoGen.OpenAI;
using AutoGen.OpenAI.Extension;
using Azure.AI.OpenAI;
using FluentAssertions;
namespace AutoGen.BasicSample;
public class Example13_OpenAIAgent_JsonMode
{
public static async Task RunAsync()
{
#region create_agent
var config = LLMConfiguration.GetAzureOpenAIGPT3_5_Turbo(deployName: "gpt-35-turbo-0125"); // JSON mode only works with model version 0125 and later.
var apiKey = config.ApiKey;
var endPoint = new Uri(config.Endpoint);
var openAIClient = new OpenAIClient(endPoint, new Azure.AzureKeyCredential(apiKey));
var openAIClientAgent = new OpenAIChatAgent(
openAIClient: openAIClient,
name: "assistant",
modelName: config.DeploymentName,
systemMessage: "You are a helpful assistant designed to output JSON.",
seed: 0, // explicitly set a seed to enable deterministic output
responseFormat: ChatCompletionsResponseFormat.JsonObject) // set response format to JSON object to enable JSON mode
.RegisterMessageConnector();
#endregion create_agent
#region chat_with_agent
var reply = await openAIClientAgent.SendAsync("My name is John, I am 25 years old, and I live in Seattle.");
var person = JsonSerializer.Deserialize<Person>(reply.GetContent());
Console.WriteLine($"Name: {person.Name}");
Console.WriteLine($"Age: {person.Age}");
if (!string.IsNullOrEmpty(person.Address))
{
Console.WriteLine($"Address: {person.Address}");
}
Console.WriteLine("Done.");
#endregion chat_with_agent
person.Name.Should().Be("John");
person.Age.Should().Be(25);
person.Address.Should().BeNullOrEmpty();
}
}
#region person_class
public class Person
{
[JsonPropertyName("name")]
public string Name { get; set; }
[JsonPropertyName("age")]
public int Age { get; set; }
[JsonPropertyName("address")]
public string Address { get; set; }
}
#endregion person_class

// Copyright (c) Microsoft Corporation. All rights reserved.
// Example14_MistralClientAgent_TokenCount.cs
#region using_statements
using AutoGen.Core;
using AutoGen.Mistral;
#endregion using_statements
using FluentAssertions;
namespace AutoGen.BasicSample;
public class Example14_MistralClientAgent_TokenCount
{
#region token_counter_middleware
public class MistralAITokenCounterMiddleware : IMiddleware
{
private readonly List<ChatCompletionResponse> responses = new List<ChatCompletionResponse>();
public string? Name => nameof(MistralAITokenCounterMiddleware);
public async Task<IMessage> InvokeAsync(MiddlewareContext context, IAgent agent, CancellationToken cancellationToken = default)
{
var reply = await agent.GenerateReplyAsync(context.Messages, context.Options, cancellationToken);
if (reply is IMessage<ChatCompletionResponse> message)
{
responses.Add(message.Content);
}
return reply;
}
public int GetCompletionTokenCount()
{
return responses.Sum(r => r.Usage.CompletionTokens);
}
}
#endregion token_counter_middleware
public static async Task RunAsync()
{
#region create_mistral_client_agent
var apiKey = Environment.GetEnvironmentVariable("MISTRAL_API_KEY") ?? throw new Exception("Missing MISTRAL_API_KEY environment variable.");
var mistralClient = new MistralClient(apiKey);
var agent = new MistralClientAgent(
client: mistralClient,
name: "assistant",
model: MistralAIModelID.OPEN_MISTRAL_7B);
#endregion create_mistral_client_agent
#region register_middleware
var tokenCounterMiddleware = new MistralAITokenCounterMiddleware();
var mistralMessageConnector = new MistralChatMessageConnector();
var agentWithTokenCounter = agent
.RegisterMiddleware(tokenCounterMiddleware)
.RegisterMiddleware(mistralMessageConnector)
.RegisterPrintMessage();
#endregion register_middleware
#region chat_with_agent
await agentWithTokenCounter.SendAsync("write a long, tedious story");
Console.WriteLine($"Completion token count: {tokenCounterMiddleware.GetCompletionTokenCount()}");
tokenCounterMiddleware.GetCompletionTokenCount().Should().BeGreaterThan(0);
#endregion chat_with_agent
}
}

// Copyright (c) Microsoft Corporation. All rights reserved.
// GlobalUsing.cs

// Copyright (c) Microsoft Corporation. All rights reserved.
// LLMConfiguration.cs

using AutoGen.OpenAI;

namespace AutoGen.BasicSample;

internal static class LLMConfiguration
{
    public static OpenAIConfig GetOpenAIGPT3_5_Turbo()
    {
        var openAIKey = Environment.GetEnvironmentVariable("OPENAI_API_KEY") ?? throw new Exception("Please set OPENAI_API_KEY environment variable.");
        var modelId = "gpt-3.5-turbo";

        return new OpenAIConfig(openAIKey, modelId);
    }

    public static OpenAIConfig GetOpenAIGPT4()
    {
        var openAIKey = Environment.GetEnvironmentVariable("OPENAI_API_KEY") ?? throw new Exception("Please set OPENAI_API_KEY environment variable.");
        var modelId = "gpt-4";

        return new OpenAIConfig(openAIKey, modelId);
    }

    public static AzureOpenAIConfig GetAzureOpenAIGPT3_5_Turbo(string deployName = "gpt-35-turbo-16k")
    {
        var azureOpenAIKey = Environment.GetEnvironmentVariable("AZURE_OPENAI_API_KEY") ?? throw new Exception("Please set AZURE_OPENAI_API_KEY environment variable.");
        var endpoint = Environment.GetEnvironmentVariable("AZURE_OPENAI_ENDPOINT") ?? throw new Exception("Please set AZURE_OPENAI_ENDPOINT environment variable.");

        return new AzureOpenAIConfig(endpoint, deployName, azureOpenAIKey);
    }

    public static AzureOpenAIConfig GetAzureOpenAIGPT4(string deployName = "gpt-4")
    {
        var azureOpenAIKey = Environment.GetEnvironmentVariable("AZURE_OPENAI_API_KEY") ?? throw new Exception("Please set AZURE_OPENAI_API_KEY environment variable.");
        var endpoint = Environment.GetEnvironmentVariable("AZURE_OPENAI_ENDPOINT") ?? throw new Exception("Please set AZURE_OPENAI_ENDPOINT environment variable.");

        return new AzureOpenAIConfig(endpoint, deployName, azureOpenAIKey);
    }
}


@@ -0,0 +1,5 @@
// Copyright (c) Microsoft Corporation. All rights reserved.
// Program.cs

using AutoGen.BasicSample;

await Example14_MistralClientAgent_TokenCount.RunAsync();


@@ -0,0 +1,31 @@
// Copyright (c) Microsoft Corporation. All rights reserved.
// DefaultReplyAgent.cs

using System.Collections.Generic;
using System.Threading;
using System.Threading.Tasks;

namespace AutoGen.Core;

public class DefaultReplyAgent : IAgent
{
    public DefaultReplyAgent(
        string name,
        string? defaultReply)
    {
        Name = name;
        DefaultReply = defaultReply ?? string.Empty;
    }

    public string Name { get; }

    public string DefaultReply { get; } = string.Empty;

    public async Task<IMessage> GenerateReplyAsync(
        IEnumerable<IMessage> _,
        GenerateReplyOptions? __ = null,
        CancellationToken ___ = default)
    {
        return new TextMessage(Role.Assistant, DefaultReply, from: this.Name);
    }
}
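A minimal usage sketch (assumes the AutoGen.Core package and top-level statements; the agent name and reply text are arbitrary): `DefaultReplyAgent` always answers with the same canned text, which makes it useful as a fallback participant.

```csharp
// Sketch only: DefaultReplyAgent ignores the incoming messages and always
// returns its configured default reply.
using AutoGen.Core;

var fallback = new DefaultReplyAgent(name: "fallback", defaultReply: "TERMINATE");
var reply = await fallback.SendAsync("anything at all");
Console.WriteLine(reply.GetContent()); // "TERMINATE"
```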


@@ -0,0 +1,34 @@
// Copyright (c) Microsoft Corporation. All rights reserved.
// GroupChatManager.cs

using System;
using System.Collections.Generic;
using System.Linq;
using System.Threading;
using System.Threading.Tasks;

namespace AutoGen.Core;

public class GroupChatManager : IAgent
{
    public GroupChatManager(IGroupChat groupChat)
    {
        GroupChat = groupChat;
    }

    public string Name => throw new ArgumentException("GroupChatManager does not have a name");

    public IEnumerable<IMessage>? Messages { get; private set; }

    public IGroupChat GroupChat { get; }

    public async Task<IMessage> GenerateReplyAsync(
        IEnumerable<IMessage> messages,
        GenerateReplyOptions? options,
        CancellationToken cancellationToken = default)
    {
        var response = await GroupChat.CallAsync(messages, ct: cancellationToken);
        Messages = response;

        return response.Last();
    }
}


@@ -0,0 +1,50 @@
// Copyright (c) Microsoft Corporation. All rights reserved.
// IAgent.cs

using System.Collections.Generic;
using System.Linq;
using System.Threading;
using System.Threading.Tasks;

namespace AutoGen.Core;

public interface IAgent
{
    public string Name { get; }

    /// <summary>
    /// Generate reply
    /// </summary>
    /// <param name="messages">conversation history</param>
    /// <param name="options">completion options. If provided, they override the agent's existing options.</param>
    public Task<IMessage> GenerateReplyAsync(
        IEnumerable<IMessage> messages,
        GenerateReplyOptions? options = null,
        CancellationToken cancellationToken = default);
}

public class GenerateReplyOptions
{
    public GenerateReplyOptions()
    {
    }

    /// <summary>
    /// Copy constructor
    /// </summary>
    /// <param name="other">other options to copy from</param>
    public GenerateReplyOptions(GenerateReplyOptions other)
    {
        this.Temperature = other.Temperature;
        this.MaxToken = other.MaxToken;
        this.StopSequence = other.StopSequence?.ToArray();
        this.Functions = other.Functions?.ToArray();
    }

    public float? Temperature { get; set; }

    public int? MaxToken { get; set; }

    public string[]? StopSequence { get; set; }

    public FunctionContract[]? Functions { get; set; }
}
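A minimal sketch of implementing `IAgent` (assumes the AutoGen.Core package; `EchoAgent` is a hypothetical name, not part of the library): the only required members are `Name` and `GenerateReplyAsync`.

```csharp
using System.Collections.Generic;
using System.Linq;
using System.Threading;
using System.Threading.Tasks;
using AutoGen.Core;

// Hypothetical example agent that echoes the last message back to the sender.
public class EchoAgent : IAgent
{
    public EchoAgent(string name) => Name = name;

    public string Name { get; }

    public Task<IMessage> GenerateReplyAsync(
        IEnumerable<IMessage> messages,
        GenerateReplyOptions? options = null,
        CancellationToken cancellationToken = default)
    {
        // Echo the content of the last message, or an empty string if there is none.
        var lastContent = messages.LastOrDefault()?.GetContent() ?? string.Empty;
        return Task.FromResult<IMessage>(new TextMessage(Role.Assistant, lastContent, from: Name));
    }
}
```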


@@ -0,0 +1,50 @@
// Copyright (c) Microsoft Corporation. All rights reserved.
// IMiddlewareAgent.cs

using System.Collections.Generic;

namespace AutoGen.Core;

public interface IMiddlewareAgent : IAgent
{
    /// <summary>
    /// Get the inner agent.
    /// </summary>
    IAgent Agent { get; }

    /// <summary>
    /// Get the middlewares.
    /// </summary>
    IEnumerable<IMiddleware> Middlewares { get; }

    /// <summary>
    /// Use middleware.
    /// </summary>
    void Use(IMiddleware middleware);
}

public interface IMiddlewareStreamAgent : IMiddlewareAgent, IStreamingAgent
{
    /// <summary>
    /// Get the inner streaming agent.
    /// </summary>
    IStreamingAgent StreamingAgent { get; }

    /// <summary>
    /// Get the streaming middlewares.
    /// </summary>
    IEnumerable<IStreamingMiddleware> StreamingMiddlewares { get; }

    /// <summary>
    /// Use streaming middleware.
    /// </summary>
    void UseStreaming(IStreamingMiddleware middleware);
}

public interface IMiddlewareAgent<out T> : IMiddlewareAgent
    where T : IAgent
{
    /// <summary>
    /// Get the typed inner agent.
    /// </summary>
    T TAgent { get; }
}

public interface IMiddlewareStreamAgent<out T> : IMiddlewareStreamAgent, IMiddlewareAgent<T>
    where T : IStreamingAgent
{
}


@@ -0,0 +1,19 @@
// Copyright (c) Microsoft Corporation. All rights reserved.
// IStreamingAgent.cs

using System.Collections.Generic;
using System.Threading;
using System.Threading.Tasks;

namespace AutoGen.Core;

/// <summary>
/// An agent that supports streaming replies.
/// </summary>
public interface IStreamingAgent : IAgent
{
    public Task<IAsyncEnumerable<IStreamingMessage>> GenerateStreamingReplyAsync(
        IEnumerable<IMessage> messages,
        GenerateReplyOptions? options = null,
        CancellationToken cancellationToken = default);
}


@@ -0,0 +1,136 @@
// Copyright (c) Microsoft Corporation. All rights reserved.
// MiddlewareAgent.cs

using System;
using System.Collections.Generic;
using System.Linq;
using System.Threading;
using System.Threading.Tasks;

namespace AutoGen.Core;

/// <summary>
/// An agent that allows you to add middleware and modify the behavior of an existing agent.
/// </summary>
public class MiddlewareAgent : IMiddlewareAgent
{
    private readonly IAgent _agent;
    private readonly List<IMiddleware> middlewares = new();

    /// <summary>
    /// Create a new instance of <see cref="MiddlewareAgent"/>
    /// </summary>
    /// <param name="innerAgent">the inner agent where middleware will be added.</param>
    /// <param name="name">the name of the agent. If not provided, the name of <paramref name="innerAgent"/> will be used.</param>
    public MiddlewareAgent(IAgent innerAgent, string? name = null)
    {
        this.Name = name ?? innerAgent.Name;
        this._agent = innerAgent;
    }

    /// <summary>
    /// Create a new instance of <see cref="MiddlewareAgent"/> by copying the middlewares from another <see cref="MiddlewareAgent"/>.
    /// </summary>
    public MiddlewareAgent(MiddlewareAgent other)
    {
        this.Name = other.Name;
        this._agent = other._agent;
        this.middlewares.AddRange(other.middlewares);
    }

    public string Name { get; }

    /// <summary>
    /// Get the inner agent.
    /// </summary>
    public IAgent Agent => this._agent;

    /// <summary>
    /// Get the middlewares.
    /// </summary>
    public IEnumerable<IMiddleware> Middlewares => this.middlewares;

    public Task<IMessage> GenerateReplyAsync(
        IEnumerable<IMessage> messages,
        GenerateReplyOptions? options = null,
        CancellationToken cancellationToken = default)
    {
        IAgent agent = this._agent;
        foreach (var middleware in this.middlewares)
        {
            agent = new DelegateAgent(middleware, agent);
        }

        return agent.GenerateReplyAsync(messages, options, cancellationToken);
    }

    /// <summary>
    /// Add a middleware to the agent. If multiple middlewares are added, they will be executed in LIFO order.
    /// Call into the next agent to continue executing the remaining middlewares.
    /// Short-circuit middleware execution by not calling into the next agent.
    /// </summary>
    public void Use(Func<IEnumerable<IMessage>, GenerateReplyOptions?, IAgent, CancellationToken, Task<IMessage>> func, string? middlewareName = null)
    {
        this.middlewares.Add(new DelegateMiddleware(middlewareName, async (context, agent, cancellationToken) =>
        {
            return await func(context.Messages, context.Options, agent, cancellationToken);
        }));
    }

    public void Use(IMiddleware middleware)
    {
        this.middlewares.Add(middleware);
    }

    public override string ToString()
    {
        var names = this.Middlewares.Select(m => m.Name ?? "[Unknown middleware]");
        var namesPlusAgentName = names.Append(this.Name);

        return namesPlusAgentName.Aggregate((a, b) => $"{a} -> {b}");
    }

    private class DelegateAgent : IAgent
    {
        private readonly IAgent innerAgent;
        private readonly IMiddleware middleware;

        public DelegateAgent(IMiddleware middleware, IAgent innerAgent)
        {
            this.middleware = middleware;
            this.innerAgent = innerAgent;
        }

        public string Name { get => this.innerAgent.Name; }

        public Task<IMessage> GenerateReplyAsync(
            IEnumerable<IMessage> messages,
            GenerateReplyOptions? options = null,
            CancellationToken cancellationToken = default)
        {
            var context = new MiddlewareContext(messages, options);
            return this.middleware.InvokeAsync(context, this.innerAgent, cancellationToken);
        }
    }
}

public sealed class MiddlewareAgent<T> : MiddlewareAgent, IMiddlewareAgent<T>
    where T : IAgent
{
    public MiddlewareAgent(T innerAgent, string? name = null)
        : base(innerAgent, name)
    {
        this.TAgent = innerAgent;
    }

    public MiddlewareAgent(MiddlewareAgent<T> other)
        : base(other)
    {
        this.TAgent = other.TAgent;
    }

    /// <summary>
    /// Get the inner agent of type <typeparamref name="T"/>.
    /// </summary>
    public T TAgent { get; }
}
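A sketch of the LIFO ordering described above (assumes the AutoGen.Core package and an existing `IAgent` instance named `innerAgent`, which is a placeholder): because `GenerateReplyAsync` wraps each registered middleware around the previous one, the middleware added last ends up outermost and runs first.

```csharp
using AutoGen.Core;

// Sketch only: demonstrate LIFO middleware ordering on MiddlewareAgent.
var agent = new MiddlewareAgent(innerAgent);

agent.Use(async (messages, options, next, ct) =>
{
    Console.WriteLine("added first, runs second");
    return await next.GenerateReplyAsync(messages, options, ct);
});

agent.Use(async (messages, options, next, ct) =>
{
    Console.WriteLine("added last, runs first");
    // Not calling next.GenerateReplyAsync here would short-circuit the chain.
    return await next.GenerateReplyAsync(messages, options, ct);
});

var reply = await agent.SendAsync("hello");
```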


@@ -0,0 +1,124 @@
// Copyright (c) Microsoft Corporation. All rights reserved.
// MiddlewareStreamingAgent.cs

using System.Collections.Generic;
using System.Threading;
using System.Threading.Tasks;

namespace AutoGen.Core;

public class MiddlewareStreamingAgent : MiddlewareAgent, IMiddlewareStreamAgent
{
    private readonly IStreamingAgent _agent;
    private readonly List<IStreamingMiddleware> _streamingMiddlewares = new();
    private readonly List<IMiddleware> _middlewares = new();

    public MiddlewareStreamingAgent(
        IStreamingAgent agent,
        string? name = null,
        IEnumerable<IStreamingMiddleware>? streamingMiddlewares = null,
        IEnumerable<IMiddleware>? middlewares = null)
        : base(agent, name)
    {
        _agent = agent;
        if (streamingMiddlewares != null)
        {
            _streamingMiddlewares.AddRange(streamingMiddlewares);
        }

        if (middlewares != null)
        {
            _middlewares.AddRange(middlewares);
        }
    }

    /// <summary>
    /// Get the inner agent.
    /// </summary>
    public IStreamingAgent StreamingAgent => _agent;

    /// <summary>
    /// Get the streaming middlewares.
    /// </summary>
    public IEnumerable<IStreamingMiddleware> StreamingMiddlewares => _streamingMiddlewares;

    public Task<IAsyncEnumerable<IStreamingMessage>> GenerateStreamingReplyAsync(IEnumerable<IMessage> messages, GenerateReplyOptions? options = null, CancellationToken cancellationToken = default)
    {
        var agent = _agent;
        foreach (var middleware in _streamingMiddlewares)
        {
            agent = new DelegateStreamingAgent(middleware, agent);
        }

        return agent.GenerateStreamingReplyAsync(messages, options, cancellationToken);
    }

    public void UseStreaming(IStreamingMiddleware middleware)
    {
        _streamingMiddlewares.Add(middleware);
    }

    private class DelegateStreamingAgent : IStreamingAgent
    {
        private IStreamingMiddleware? streamingMiddleware;
        private IMiddleware? middleware;
        private IStreamingAgent innerAgent;

        public string Name => innerAgent.Name;

        public DelegateStreamingAgent(IStreamingMiddleware middleware, IStreamingAgent next)
        {
            this.streamingMiddleware = middleware;
            this.innerAgent = next;
        }

        public DelegateStreamingAgent(IMiddleware middleware, IStreamingAgent next)
        {
            this.middleware = middleware;
            this.innerAgent = next;
        }

        public async Task<IMessage> GenerateReplyAsync(IEnumerable<IMessage> messages, GenerateReplyOptions? options = null, CancellationToken cancellationToken = default)
        {
            if (middleware is null)
            {
                return await innerAgent.GenerateReplyAsync(messages, options, cancellationToken);
            }

            var context = new MiddlewareContext(messages, options);
            return await middleware.InvokeAsync(context, innerAgent, cancellationToken);
        }

        public Task<IAsyncEnumerable<IStreamingMessage>> GenerateStreamingReplyAsync(IEnumerable<IMessage> messages, GenerateReplyOptions? options = null, CancellationToken cancellationToken = default)
        {
            if (streamingMiddleware is null)
            {
                return innerAgent.GenerateStreamingReplyAsync(messages, options, cancellationToken);
            }

            var context = new MiddlewareContext(messages, options);
            return streamingMiddleware.InvokeAsync(context, innerAgent, cancellationToken);
        }
    }
}

public sealed class MiddlewareStreamingAgent<T> : MiddlewareStreamingAgent, IMiddlewareStreamAgent<T>
    where T : IStreamingAgent
{
    public MiddlewareStreamingAgent(T innerAgent, string? name = null)
        : base(innerAgent, name)
    {
        TAgent = innerAgent;
    }

    public MiddlewareStreamingAgent(MiddlewareStreamingAgent<T> other)
        : base(other)
    {
        TAgent = other.TAgent;
    }

    /// <summary>
    /// Get the typed inner agent.
    /// </summary>
    public T TAgent { get; }
}


@@ -0,0 +1,21 @@
<Project Sdk="Microsoft.NET.Sdk">

  <PropertyGroup>
    <TargetFramework>netstandard2.0</TargetFramework>
    <RootNamespace>AutoGen.Core</RootNamespace>
  </PropertyGroup>

  <Import Project="$(RepoRoot)/dotnet/nuget/nuget-package.props" />

  <PropertyGroup>
    <!-- NuGet Package Settings -->
    <Title>AutoGen.Core</Title>
    <Description>
      Core library for AutoGen. This package provides contracts and core functionalities for AutoGen.
    </Description>
  </PropertyGroup>

  <ItemGroup>
    <PackageReference Include="JsonSchema.Net.Generation" Version="$(JsonSchemaVersion)" />
  </ItemGroup>

</Project>


@@ -0,0 +1,174 @@
// Copyright (c) Microsoft Corporation. All rights reserved.
// AgentExtension.cs

using System.Collections.Generic;
using System.Linq;
using System.Threading;
using System.Threading.Tasks;

namespace AutoGen.Core;

public static class AgentExtension
{
    /// <summary>
    /// Send message to an agent.
    /// </summary>
    /// <param name="agent">sender agent.</param>
    /// <param name="message">message to send. It will be appended to <paramref name="chatHistory"/> if provided.</param>
    /// <param name="chatHistory">chat history.</param>
    /// <returns>the generated reply</returns>
    public static async Task<IMessage> SendAsync(
        this IAgent agent,
        IMessage? message = null,
        IEnumerable<IMessage>? chatHistory = null,
        CancellationToken ct = default)
    {
        var messages = new List<IMessage>();
        if (chatHistory != null)
        {
            messages.AddRange(chatHistory);
        }

        if (message != null)
        {
            messages.Add(message);
        }

        var result = await agent.GenerateReplyAsync(messages, cancellationToken: ct);

        return result;
    }

    /// <summary>
    /// Send message to an agent.
    /// </summary>
    /// <param name="agent">sender agent.</param>
    /// <param name="message">message to send. It will be appended to <paramref name="chatHistory"/> if provided.</param>
    /// <param name="chatHistory">chat history.</param>
    /// <returns>the generated reply</returns>
    public static async Task<IMessage> SendAsync(
        this IAgent agent,
        string message,
        IEnumerable<IMessage>? chatHistory = null,
        CancellationToken ct = default)
    {
        var msg = new TextMessage(Role.User, message);

        return await agent.SendAsync(msg, chatHistory, ct);
    }

    /// <summary>
    /// Send message to another agent.
    /// </summary>
    /// <param name="agent">sender agent.</param>
    /// <param name="receiver">receiver agent.</param>
    /// <param name="chatHistory">chat history.</param>
    /// <param name="maxRound">max conversation round.</param>
    /// <returns>conversation history</returns>
    public static async Task<IEnumerable<IMessage>> SendAsync(
        this IAgent agent,
        IAgent receiver,
        IEnumerable<IMessage> chatHistory,
        int maxRound = 10,
        CancellationToken ct = default)
    {
        if (receiver is GroupChatManager manager)
        {
            var gc = manager.GroupChat;

            return await agent.SendMessageToGroupAsync(gc, chatHistory, maxRound, ct);
        }

        var groupChat = new RoundRobinGroupChat(
            agents: new[]
            {
                agent,
                receiver,
            });

        return await groupChat.CallAsync(chatHistory, maxRound, ct: ct);
    }

    /// <summary>
    /// Send message to another agent.
    /// </summary>
    /// <param name="agent">sender agent.</param>
    /// <param name="receiver">receiver agent.</param>
    /// <param name="message">message to send. It will be appended to <paramref name="chatHistory"/> if provided.</param>
    /// <param name="chatHistory">chat history.</param>
    /// <param name="maxRound">max conversation round.</param>
    /// <returns>conversation history</returns>
    public static async Task<IEnumerable<IMessage>> SendAsync(
        this IAgent agent,
        IAgent receiver,
        string message,
        IEnumerable<IMessage>? chatHistory = null,
        int maxRound = 10,
        CancellationToken ct = default)
    {
        var msg = new TextMessage(Role.User, message)
        {
            From = agent.Name,
        };

        chatHistory = chatHistory ?? new List<IMessage>();
        chatHistory = chatHistory.Append(msg);

        return await agent.SendAsync(receiver, chatHistory, maxRound, ct);
    }

    /// <summary>
    /// Shortcut API to send message to another agent.
    /// </summary>
    /// <param name="agent">sender agent</param>
    /// <param name="receiver">receiver agent</param>
    /// <param name="message">message to send</param>
    /// <param name="maxRound">max round</param>
    public static async Task<IEnumerable<IMessage>> InitiateChatAsync(
        this IAgent agent,
        IAgent receiver,
        string? message = null,
        int maxRound = 10,
        CancellationToken ct = default)
    {
        var chatHistory = new List<IMessage>();
        if (message != null)
        {
            var msg = new TextMessage(Role.User, message)
            {
                From = agent.Name,
            };

            chatHistory.Add(msg);
        }

        return await agent.SendAsync(receiver, chatHistory, maxRound, ct);
    }

    public static async Task<IEnumerable<IMessage>> SendMessageToGroupAsync(
        this IAgent agent,
        IGroupChat groupChat,
        string msg,
        IEnumerable<IMessage>? chatHistory = null,
        int maxRound = 10,
        CancellationToken ct = default)
    {
        var chatMessage = new TextMessage(Role.Assistant, msg, from: agent.Name);
        chatHistory = chatHistory ?? Enumerable.Empty<IMessage>();
        chatHistory = chatHistory.Append(chatMessage);

        return await agent.SendMessageToGroupAsync(groupChat, chatHistory, maxRound, ct);
    }

    public static async Task<IEnumerable<IMessage>> SendMessageToGroupAsync(
        this IAgent _,
        IGroupChat groupChat,
        IEnumerable<IMessage>? chatHistory = null,
        int maxRound = 10,
        CancellationToken ct = default)
    {
        return await groupChat.CallAsync(chatHistory, maxRound, ct);
    }
}
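A usage sketch for the shortcut API above (assumes the AutoGen.Core package; `teacher` and `student` are placeholder names for two existing `IAgent` instances): `InitiateChatAsync` seeds the conversation with one user message and returns the full history.

```csharp
using AutoGen.Core;

// Sketch only: start a two-agent conversation capped at 10 rounds and
// print the resulting history.
var conversation = await teacher.InitiateChatAsync(
    receiver: student,
    message: "Explain middleware in one sentence.",
    maxRound: 10);

foreach (var msg in conversation)
{
    Console.WriteLine(msg.FormatMessage());
}
```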


@@ -0,0 +1,109 @@
// Copyright (c) Microsoft Corporation. All rights reserved.
// GroupChatExtension.cs

using System;
using System.Collections.Generic;
using System.Linq;

namespace AutoGen.Core;

public static class GroupChatExtension
{
    public const string TERMINATE = "[GROUPCHAT_TERMINATE]";
    public const string CLEAR_MESSAGES = "[GROUPCHAT_CLEAR_MESSAGES]";

    [Obsolete("please use SendIntroduction")]
    public static void AddInitializeMessage(this IAgent agent, string message, IGroupChat groupChat)
    {
        var msg = new TextMessage(Role.User, message)
        {
            From = agent.Name
        };

        groupChat.SendIntroduction(msg);
    }

    /// <summary>
    /// Send an introduction message to the group chat.
    /// </summary>
    public static void SendIntroduction(this IAgent agent, string message, IGroupChat groupChat)
    {
        var msg = new TextMessage(Role.User, message)
        {
            From = agent.Name
        };

        groupChat.SendIntroduction(msg);
    }

    public static IEnumerable<IMessage> MessageToKeep(
        this IGroupChat _,
        IEnumerable<IMessage> messages)
    {
        var lastCLRMessageIndex = messages.ToList()
            .FindLastIndex(x => x.IsGroupChatClearMessage());

        // if there are multiple clear messages, e.g. [msg, clr, msg, clr, msg, clr, msg],
        // only keep the messages after the second-to-last clear message.
        if (messages.Count(m => m.IsGroupChatClearMessage()) > 1)
        {
            lastCLRMessageIndex = messages.ToList()
                .FindLastIndex(lastCLRMessageIndex - 1, lastCLRMessageIndex - 1, x => x.IsGroupChatClearMessage());
            messages = messages.Skip(lastCLRMessageIndex);
        }

        lastCLRMessageIndex = messages.ToList()
            .FindLastIndex(x => x.IsGroupChatClearMessage());

        if (lastCLRMessageIndex != -1 && messages.Count() - lastCLRMessageIndex >= 2)
        {
            messages = messages.Skip(lastCLRMessageIndex);
        }

        return messages;
    }

    /// <summary>
    /// Return true if <see cref="IMessage"/> contains <see cref="TERMINATE"/>, otherwise false.
    /// </summary>
    /// <param name="message"></param>
    /// <returns></returns>
    public static bool IsGroupChatTerminateMessage(this IMessage message)
    {
        return message.GetContent()?.Contains(TERMINATE) ?? false;
    }

    public static bool IsGroupChatClearMessage(this IMessage message)
    {
        return message.GetContent()?.Contains(CLEAR_MESSAGES) ?? false;
    }

    public static IEnumerable<IMessage> ProcessConversationForAgent(
        this IGroupChat groupChat,
        IEnumerable<IMessage> initialMessages,
        IEnumerable<IMessage> messages)
    {
        messages = groupChat.MessageToKeep(messages);
        return initialMessages.Concat(messages);
    }

    internal static IEnumerable<IMessage> ProcessConversationsForRolePlay(
        this IGroupChat groupChat,
        IEnumerable<IMessage> initialMessages,
        IEnumerable<IMessage> messages)
    {
        messages = groupChat.MessageToKeep(messages);
        var messagesToKeep = initialMessages.Concat(messages);

        return messagesToKeep.Select((x, i) =>
        {
            var msg = @$"From {x.From}:
{x.GetContent()}
<eof_msg>
round #
{i}";

            return new TextMessage(Role.User, content: msg);
        });
    }
}
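A small sketch of the termination convention above (assumes the AutoGen.Core package): an agent ends a group chat by including the `TERMINATE` marker anywhere in its reply, and callers detect it with `IsGroupChatTerminateMessage`.

```csharp
using AutoGen.Core;

// Sketch only: a reply containing the TERMINATE marker is treated as a
// request to end the group chat.
var reply = new TextMessage(Role.Assistant, $"All done. {GroupChatExtension.TERMINATE}");
if (reply.IsGroupChatTerminateMessage())
{
    Console.WriteLine("conversation terminated");
}
```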


@@ -0,0 +1,213 @@
// Copyright (c) Microsoft Corporation. All rights reserved.
// MessageExtension.cs

using System.Collections.Generic;
using System.Linq;
using System.Text;

namespace AutoGen.Core;

public static class MessageExtension
{
    private static string separator = new string('-', 20);

    public static string FormatMessage(this IMessage message)
    {
        return message switch
        {
            Message msg => msg.FormatMessage(),
            TextMessage textMessage => textMessage.FormatMessage(),
            ImageMessage imageMessage => imageMessage.FormatMessage(),
            ToolCallMessage toolCallMessage => toolCallMessage.FormatMessage(),
            ToolCallResultMessage toolCallResultMessage => toolCallResultMessage.FormatMessage(),
            AggregateMessage<ToolCallMessage, ToolCallResultMessage> aggregateMessage => aggregateMessage.FormatMessage(),
            _ => message.ToString(),
        };
    }

    public static string FormatMessage(this TextMessage message)
    {
        var sb = new StringBuilder();
        // write from
        sb.AppendLine($"TextMessage from {message.From}");
        // write a separator
        sb.AppendLine(separator);
        sb.AppendLine(message.Content);
        // write a separator
        sb.AppendLine(separator);

        return sb.ToString();
    }

    public static string FormatMessage(this ImageMessage message)
    {
        var sb = new StringBuilder();
        // write from
        sb.AppendLine($"ImageMessage from {message.From}");
        // write a separator
        sb.AppendLine(separator);
        sb.AppendLine($"Image: {message.Url}");
        // write a separator
        sb.AppendLine(separator);

        return sb.ToString();
    }

    public static string FormatMessage(this ToolCallMessage message)
    {
        var sb = new StringBuilder();
        // write from
        sb.AppendLine($"ToolCallMessage from {message.From}");

        // write a separator
        sb.AppendLine(separator);

        foreach (var toolCall in message.ToolCalls)
        {
            sb.AppendLine($"- {toolCall.FunctionName}: {toolCall.FunctionArguments}");
        }

        sb.AppendLine(separator);

        return sb.ToString();
    }

    public static string FormatMessage(this ToolCallResultMessage message)
    {
        var sb = new StringBuilder();
        // write from
        sb.AppendLine($"ToolCallResultMessage from {message.From}");

        // write a separator
        sb.AppendLine(separator);

        foreach (var toolCall in message.ToolCalls)
        {
            sb.AppendLine($"- {toolCall.FunctionName}: {toolCall.Result}");
        }

        sb.AppendLine(separator);

        return sb.ToString();
    }

    public static string FormatMessage(this AggregateMessage<ToolCallMessage, ToolCallResultMessage> message)
    {
        var sb = new StringBuilder();
        // write from
        sb.AppendLine($"AggregateMessage from {message.From}");

        // write a separator
        sb.AppendLine(separator);

        sb.AppendLine("ToolCallMessage:");
        sb.AppendLine(message.Message1.FormatMessage());

        sb.AppendLine("ToolCallResultMessage:");
        sb.AppendLine(message.Message2.FormatMessage());

        sb.AppendLine(separator);

        return sb.ToString();
    }

    public static string FormatMessage(this Message message)
    {
        var sb = new StringBuilder();
        // write from
        sb.AppendLine($"Message from {message.From}");
        // write a separator
        sb.AppendLine(separator);

        // write content
        sb.AppendLine($"content: {message.Content}");

        // write function name if it exists
        if (!string.IsNullOrEmpty(message.FunctionName))
        {
            sb.AppendLine($"function name: {message.FunctionName}");
            sb.AppendLine($"function arguments: {message.FunctionArguments}");
        }

        // write metadata
        if (message.Metadata is { Count: > 0 })
        {
            sb.AppendLine("metadata:");
            foreach (var item in message.Metadata)
            {
                sb.AppendLine($"{item.Key}: {item.Value}");
            }
        }

        // write a separator
        sb.AppendLine(separator);

        return sb.ToString();
    }

    public static bool IsSystemMessage(this IMessage message)
    {
        return message switch
        {
            TextMessage textMessage => textMessage.Role == Role.System,
            Message msg => msg.Role == Role.System,
            _ => false,
        };
    }

    /// <summary>
    /// Get the content from the message
    /// <para>if the message is a <see cref="Message"/> or <see cref="TextMessage"/>, return the content</para>
    /// <para>if the message is a <see cref="ToolCallResultMessage"/> and only contains one function call, return the result of that function call</para>
    /// <para>if the message is a <see cref="AggregateMessage{ToolCallMessage, ToolCallResultMessage}"/> where TMessage1 is <see cref="ToolCallMessage"/> and TMessage2 is <see cref="ToolCallResultMessage"/> and the second message only contains one function call, return the result of that function call</para>
    /// <para>for all other situations, return null.</para>
    /// </summary>
    /// <param name="message"></param>
    public static string? GetContent(this IMessage message)
    {
        return message switch
        {
            TextMessage textMessage => textMessage.Content,
            Message msg => msg.Content,
            ToolCallResultMessage toolCallResultMessage => toolCallResultMessage.ToolCalls.Count == 1 ? toolCallResultMessage.ToolCalls.First().Result : null,
            AggregateMessage<ToolCallMessage, ToolCallResultMessage> aggregateMessage => aggregateMessage.Message2.ToolCalls.Count == 1 ? aggregateMessage.Message2.ToolCalls.First().Result : null,
            _ => null,
        };
    }

    /// <summary>
    /// Get the role from the message if it's available.
    /// </summary>
    public static Role? GetRole(this IMessage message)
    {
        return message switch
        {
            TextMessage textMessage => textMessage.Role,
            Message msg => msg.Role,
            ImageMessage img => img.Role,
            MultiModalMessage multiModal => multiModal.Role,
            _ => null,
        };
    }

    /// <summary>
    /// Return the tool calls from the message if they're available.
    /// <para>if the message is a <see cref="ToolCallMessage"/>, return its tool calls</para>
    /// <para>if the message is a <see cref="Message"/> and the function name and function arguments are available, return a list with a single tool call</para>
    /// <para>if the message is a <see cref="AggregateMessage{ToolCallMessage, ToolCallResultMessage}"/> where TMessage1 is <see cref="ToolCallMessage"/> and TMessage2 is <see cref="ToolCallResultMessage"/>, return the tool calls from the first message</para>
    /// </summary>
    /// <param name="message"></param>
    /// <returns></returns>
    public static IList<ToolCall>? GetToolCalls(this IMessage message)
    {
        return message switch
        {
            ToolCallMessage toolCallMessage => toolCallMessage.ToolCalls,
            Message msg => msg.FunctionName is not null && msg.FunctionArguments is not null
                ? msg.Content is not null ? new List<ToolCall> { new ToolCall(msg.FunctionName, msg.FunctionArguments, result: msg.Content) }
                : new List<ToolCall> { new ToolCall(msg.FunctionName, msg.FunctionArguments) }
                : null,
            AggregateMessage<ToolCallMessage, ToolCallResultMessage> aggregateMessage => aggregateMessage.Message1.ToolCalls,
            _ => null,
        };
    }
}
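A brief sketch of the accessors above (assumes the AutoGen.Core package): `GetContent` and `GetRole` pattern-match on the concrete message type and return `null` when the information is not available.

```csharp
using AutoGen.Core;

// Sketch only: text messages expose both content and role through the
// MessageExtension helpers.
var text = new TextMessage(Role.User, "hi", from: "user");
Console.WriteLine(text.GetContent()); // "hi"
Console.WriteLine(text.GetRole());    // Role.User

// A message type without a role (e.g. a ToolCallMessage) yields null.
```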


@@ -0,0 +1,138 @@
// Copyright (c) Microsoft Corporation. All rights reserved.
// MiddlewareExtension.cs

using System;
using System.Collections.Generic;
using System.Threading;
using System.Threading.Tasks;

namespace AutoGen.Core;

public static class MiddlewareExtension
{
    /// <summary>
    /// Register an auto-reply hook to an agent. The hook will be called before the agent generates a reply.
    /// If the hook returns a non-null reply, that reply is returned directly without calling the agent;
    /// otherwise, the agent generates the reply as usual.
    /// This is useful when you want to override the agent's reply in some cases.
    /// </summary>
    /// <param name="agent"></param>
    /// <param name="replyFunc"></param>
    /// <returns></returns>
    /// <exception cref="Exception">thrown when agent name is null.</exception>
    public static MiddlewareAgent<TAgent> RegisterReply<TAgent>(
        this TAgent agent,
        Func<IEnumerable<IMessage>, CancellationToken, Task<IMessage?>> replyFunc)
        where TAgent : IAgent
    {
        return agent.RegisterMiddleware(async (messages, options, agent, ct) =>
        {
            var reply = await replyFunc(messages, ct);

            if (reply != null)
            {
                return reply;
            }

            return await agent.GenerateReplyAsync(messages, options, ct);
        });
    }

    /// <summary>
    /// Register a post-process hook to an agent. The hook will be called after the agent generates a reply and before the reply is returned.
    /// This is useful when you want to customize arbitrary behavior before the agent returns the reply.
    ///
    /// One example is <see cref="PrintMessageMiddlewareExtension.RegisterPrintMessage{TAgent}(TAgent)" />, which prints the formatted message to the console before the agent returns the reply.
    /// </summary>
    /// <exception cref="Exception">thrown when agent name is null.</exception>
    public static MiddlewareAgent<TAgent> RegisterPostProcess<TAgent>(
        this TAgent agent,
        Func<IEnumerable<IMessage>, IMessage, CancellationToken, Task<IMessage>> postprocessFunc)
        where TAgent : IAgent
    {
        return agent.RegisterMiddleware(async (messages, options, agent, ct) =>
        {
            var reply = await agent.GenerateReplyAsync(messages, options, ct);

            return await postprocessFunc(messages, reply, ct);
        });
    }

    /// <summary>
    /// Register a pre-process hook to an agent. The hook will be called before the agent generates a reply. This is useful when you want to modify the conversation history before the agent generates the reply.
    /// </summary>
    /// <exception cref="Exception">thrown when agent name is null.</exception>
    public static MiddlewareAgent<TAgent> RegisterPreProcess<TAgent>(
        this TAgent agent,
        Func<IEnumerable<IMessage>, CancellationToken, Task<IEnumerable<IMessage>>> preprocessFunc)
        where TAgent : IAgent
    {
        return agent.RegisterMiddleware(async (messages, options, agent, ct) =>
        {
            var newMessages = await preprocessFunc(messages, ct);

            return await agent.GenerateReplyAsync(newMessages, options, ct);
        });
    }

    /// <summary>
    /// Register a middleware to an existing agent and return a new agent with the middleware.
    /// </summary>
    public static MiddlewareAgent<TAgent> RegisterMiddleware<TAgent>(
        this TAgent agent,
        Func<IEnumerable<IMessage>, GenerateReplyOptions?, IAgent, CancellationToken, Task<IMessage>> func,
        string? middlewareName = null)
        where TAgent : IAgent
    {
        var middleware = new DelegateMiddleware(middlewareName, async (context, agent, cancellationToken) =>
        {
            return await func(context.Messages, context.Options, agent, cancellationToken);
        });

        return agent.RegisterMiddleware(middleware);
    }

    /// <summary>
    /// Register a middleware to an existing agent and return a new agent with the middleware.
    /// </summary>
    public static MiddlewareAgent<TAgent> RegisterMiddleware<TAgent>(
        this TAgent agent,
        IMiddleware middleware)
        where TAgent : IAgent
    {
        var middlewareAgent = new MiddlewareAgent<TAgent>(agent);

        return middlewareAgent.RegisterMiddleware(middleware);
    }

    /// <summary>
    /// Register a middleware to an existing agent and return a new agent with the middleware.
    /// </summary>
    public static MiddlewareAgent<TAgent> RegisterMiddleware<TAgent>(
        this MiddlewareAgent<TAgent> agent,
        Func<IEnumerable<IMessage>, GenerateReplyOptions?, IAgent, CancellationToken, Task<IMessage>> func,
        string? middlewareName = null)
        where TAgent : IAgent
    {
        var delegateMiddleware = new DelegateMiddleware(middlewareName, async (context, agent, cancellationToken) =>
        {
            return await func(context.Messages, context.Options, agent, cancellationToken);
        });

        return agent.RegisterMiddleware(delegateMiddleware);
    }

    /// <summary>
    /// Register a middleware to an existing agent and return a new agent with the middleware.
    /// </summary>
    public static MiddlewareAgent<TAgent> RegisterMiddleware<TAgent>(
        this MiddlewareAgent<TAgent> agent,
        IMiddleware middleware)
        where TAgent : IAgent
    {
        var copyAgent = new MiddlewareAgent<TAgent>(agent);
        copyAgent.Use(middleware);

        return copyAgent;
    }
}
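A usage sketch for the auto-reply hook above (assumes the AutoGen.Core package; `assistant` is a placeholder for an existing agent): returning a non-null message short-circuits the inner agent, while returning `null` falls through to it.

```csharp
using System.Linq;
using AutoGen.Core;

// Sketch only: answer "ping" messages directly with a canned reply and
// delegate everything else to the inner agent.
var guarded = assistant.RegisterReply(async (messages, ct) =>
{
    if (messages.LastOrDefault()?.GetContent()?.Contains("ping") == true)
    {
        return (IMessage?)new TextMessage(Role.Assistant, "pong");
    }

    return null; // fall through to the inner agent
});
```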


@@ -0,0 +1,69 @@
// Copyright (c) Microsoft Corporation. All rights reserved.
// PrintMessageMiddlewareExtension.cs
using System;
namespace AutoGen.Core;
public static class PrintMessageMiddlewareExtension
{
[Obsolete("This API will be removed in v0.1.0, Use RegisterPrintMessage instead.")]
public static MiddlewareAgent<TAgent> RegisterPrintFormatMessageHook<TAgent>(this TAgent agent)
where TAgent : IAgent
{
return RegisterPrintMessage(agent);
}
[Obsolete("This API will be removed in v0.1.0, Use RegisterPrintMessage instead.")]
public static MiddlewareAgent<TAgent> RegisterPrintFormatMessageHook<TAgent>(this MiddlewareAgent<TAgent> agent)
where TAgent : IAgent
{
return RegisterPrintMessage(agent);
}
[Obsolete("This API will be removed in v0.1.0, Use RegisterPrintMessage instead.")]
public static MiddlewareStreamingAgent<TAgent> RegisterPrintFormatMessageHook<TAgent>(this MiddlewareStreamingAgent<TAgent> agent)
where TAgent : IStreamingAgent
{
return RegisterPrintMessage(agent);
}
/// <summary>
/// Register a <see cref="PrintMessageMiddleware"/> to <paramref name="agent"/> which prints formatted messages to the console.
/// </summary>
public static MiddlewareAgent<TAgent> RegisterPrintMessage<TAgent>(this TAgent agent)
where TAgent : IAgent
{
var middleware = new PrintMessageMiddleware();
var middlewareAgent = new MiddlewareAgent<TAgent>(agent);
middlewareAgent.Use(middleware);
return middlewareAgent;
}
/// <summary>
/// Register a <see cref="PrintMessageMiddleware"/> to <paramref name="agent"/> which prints formatted messages to the console.
/// </summary>
public static MiddlewareAgent<TAgent> RegisterPrintMessage<TAgent>(this MiddlewareAgent<TAgent> agent)
where TAgent : IAgent
{
var middleware = new PrintMessageMiddleware();
var middlewareAgent = new MiddlewareAgent<TAgent>(agent);
middlewareAgent.Use(middleware);
return middlewareAgent;
}
/// <summary>
/// Register a <see cref="PrintMessageMiddleware"/> to <paramref name="agent"/> which prints formatted messages to the console.
/// </summary>
public static MiddlewareStreamingAgent<TAgent> RegisterPrintMessage<TAgent>(this MiddlewareStreamingAgent<TAgent> agent)
where TAgent : IStreamingAgent
{
var middleware = new PrintMessageMiddleware();
var middlewareAgent = new MiddlewareStreamingAgent<TAgent>(agent);
middlewareAgent.Use(middleware);
return middlewareAgent;
}
}
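A minimal usage sketch of `RegisterPrintMessage`, assuming `assistantAgent` is some existing `IAgent` implementation:

```csharp
// wrap an existing agent so that every formatted reply is
// printed to the console as it is generated.
var verboseAgent = assistantAgent.RegisterPrintMessage();
var reply = await verboseAgent.GenerateReplyAsync(
    new IMessage[] { new TextMessage(Role.User, "Hello") });
```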

// Copyright (c) Microsoft Corporation. All rights reserved.
// StreamingMiddlewareExtension.cs
using System;
using System.Collections.Generic;
using System.Threading;
using System.Threading.Tasks;
namespace AutoGen.Core;
public static class StreamingMiddlewareExtension
{
/// <summary>
/// Register a middleware to an existing agent and return a new agent with the middleware.
/// </summary>
public static MiddlewareStreamingAgent<TStreamingAgent> RegisterStreamingMiddleware<TStreamingAgent>(
this TStreamingAgent agent,
IStreamingMiddleware middleware)
where TStreamingAgent : IStreamingAgent
{
var middlewareAgent = new MiddlewareStreamingAgent<TStreamingAgent>(agent);
middlewareAgent.UseStreaming(middleware);
if (middleware is IMiddleware middlewareBase)
{
middlewareAgent.Use(middlewareBase);
}
return middlewareAgent;
}
/// <summary>
/// Register a middleware to an existing agent and return a new agent with the middleware.
/// </summary>
public static MiddlewareStreamingAgent<TAgent> RegisterStreamingMiddleware<TAgent>(
this MiddlewareStreamingAgent<TAgent> agent,
IStreamingMiddleware middleware)
where TAgent : IStreamingAgent
{
var copyAgent = new MiddlewareStreamingAgent<TAgent>(agent);
copyAgent.UseStreaming(middleware);
if (middleware is IMiddleware middlewareBase)
{
copyAgent.Use(middlewareBase);
}
return copyAgent;
}
/// <summary>
/// Register a middleware to an existing agent and return a new agent with the middleware.
/// </summary>
public static MiddlewareStreamingAgent<TAgent> RegisterStreamingMiddleware<TAgent>(
this TAgent agent,
Func<MiddlewareContext, IStreamingAgent, CancellationToken, Task<IAsyncEnumerable<IStreamingMessage>>> func,
string? middlewareName = null)
where TAgent : IStreamingAgent
{
var middleware = new DelegateStreamingMiddleware(middlewareName, new DelegateStreamingMiddleware.MiddlewareDelegate(func));
return agent.RegisterStreamingMiddleware(middleware);
}
/// <summary>
/// Register a streaming middleware to an existing agent and return a new agent with the middleware.
/// </summary>
public static MiddlewareStreamingAgent<TAgent> RegisterStreamingMiddleware<TAgent>(
this MiddlewareStreamingAgent<TAgent> agent,
Func<MiddlewareContext, IStreamingAgent, CancellationToken, Task<IAsyncEnumerable<IStreamingMessage>>> func,
string? middlewareName = null)
where TAgent : IStreamingAgent
{
var middleware = new DelegateStreamingMiddleware(middlewareName, new DelegateStreamingMiddleware.MiddlewareDelegate(func));
return agent.RegisterStreamingMiddleware(middleware);
}
/// <summary>
/// Register a middleware to an existing streaming agent and return a new agent with the middleware.
/// </summary>
public static MiddlewareStreamingAgent<TStreamingAgent> RegisterMiddleware<TStreamingAgent>(
this MiddlewareStreamingAgent<TStreamingAgent> streamingAgent,
Func<IEnumerable<IMessage>, GenerateReplyOptions?, IAgent, CancellationToken, Task<IMessage>> func,
string? middlewareName = null)
where TStreamingAgent : IStreamingAgent
{
var middleware = new DelegateMiddleware(middlewareName, async (context, agent, cancellationToken) =>
{
return await func(context.Messages, context.Options, agent, cancellationToken);
});
return streamingAgent.RegisterMiddleware(middleware);
}
/// <summary>
/// Register a middleware to an existing streaming agent and return a new agent with the middleware.
/// </summary>
public static MiddlewareStreamingAgent<TStreamingAgent> RegisterMiddleware<TStreamingAgent>(
this MiddlewareStreamingAgent<TStreamingAgent> streamingAgent,
IMiddleware middleware)
where TStreamingAgent : IStreamingAgent
{
var copyAgent = new MiddlewareStreamingAgent<TStreamingAgent>(streamingAgent);
copyAgent.Use(middleware);
if (middleware is IStreamingMiddleware streamingMiddleware)
{
copyAgent.UseStreaming(streamingMiddleware);
}
return copyAgent;
}
}
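A pass-through streaming middleware can be sketched as below, assuming `streamingAgent` is an existing `IStreamingAgent` and that `GenerateStreamingReplyAsync` returns `Task<IAsyncEnumerable<IStreamingMessage>>` as the delegate signature above requires:

```csharp
// inspect (or rewrite) the incoming messages, then stream the
// inner agent's reply through unchanged.
var wrapped = streamingAgent.RegisterStreamingMiddleware(async (context, agent, ct) =>
{
    // context.Messages / context.Options are available here for rewriting
    return await agent.GenerateStreamingReplyAsync(context.Messages, context.Options, ct);
}, middlewareName: "pass-through");
```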

// Copyright (c) Microsoft Corporation. All rights reserved.
// FunctionAttribute.cs
using System;
using System.Collections.Generic;
namespace AutoGen.Core;
[AttributeUsage(AttributeTargets.Method, Inherited = false, AllowMultiple = false)]
public class FunctionAttribute : Attribute
{
public string? FunctionName { get; }
public string? Description { get; }
public FunctionAttribute(string? functionName = null, string? description = null)
{
FunctionName = functionName;
Description = description;
}
}
public class FunctionContract
{
/// <summary>
/// The namespace of the function.
/// </summary>
public string? Namespace { get; set; }
/// <summary>
/// The class name of the function.
/// </summary>
public string? ClassName { get; set; }
/// <summary>
/// The name of the function.
/// </summary>
public string? Name { get; set; }
/// <summary>
/// The description of the function.
/// If a structured comment is available, the description will be extracted from the summary section.
/// Otherwise, the description will be null.
/// </summary>
public string? Description { get; set; }
/// <summary>
/// The parameters of the function.
/// </summary>
public IEnumerable<FunctionParameterContract>? Parameters { get; set; }
/// <summary>
/// The return type of the function.
/// </summary>
public Type? ReturnType { get; set; }
/// <summary>
/// The description of the return section.
/// If a structured comment is available, the description will be extracted from the return section.
/// Otherwise, the description will be null.
/// </summary>
public string? ReturnDescription { get; set; }
}
public class FunctionParameterContract
{
/// <summary>
/// The name of the parameter.
/// </summary>
public string? Name { get; set; }
/// <summary>
/// The description of the parameter.
/// This will be extracted from the param section of the structured comment if available.
/// Otherwise, the description will be null.
/// </summary>
public string? Description { get; set; }
/// <summary>
/// The type of the parameter.
/// </summary>
public Type? ParameterType { get; set; }
/// <summary>
/// If the parameter is a required parameter.
/// </summary>
public bool IsRequired { get; set; }
/// <summary>
/// The default value of the parameter.
/// </summary>
public object? DefaultValue { get; set; }
}
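A contract can also be written by hand. The sketch below describes a hypothetical `Add` function; in practice the AutoGen source generator produces these from methods marked with `[Function]`.

```csharp
// hand-written contract for a hypothetical Add(int a, int b) function.
var addContract = new FunctionContract
{
    Name = "Add",
    Description = "Add two integers.",
    ReturnType = typeof(int),
    Parameters = new[]
    {
        new FunctionParameterContract { Name = "a", ParameterType = typeof(int), IsRequired = true },
        new FunctionParameterContract { Name = "b", ParameterType = typeof(int), IsRequired = true },
    },
};
```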

// Copyright (c) Microsoft Corporation. All rights reserved.
// Workflow.cs
using System;
using System.Collections.Generic;
using System.Linq;
using System.Threading.Tasks;
namespace AutoGen.Core;
/// <summary>
/// Obsolete: please use <see cref="Graph"/>
/// </summary>
[Obsolete("please use Graph")]
public class Workflow : Graph
{
[Obsolete("please use Graph")]
public Workflow(IEnumerable<Transition> transitions)
: base(transitions)
{
}
}
public class Graph
{
private readonly List<Transition> transitions = new List<Transition>();
public Graph(IEnumerable<Transition> transitions)
{
this.transitions.AddRange(transitions);
}
public void AddTransition(Transition transition)
{
transitions.Add(transition);
}
/// <summary>
/// Get the transitions of the workflow.
/// </summary>
public IEnumerable<Transition> Transitions => transitions;
/// <summary>
/// Get the next available agents that the messages can transition to.
/// </summary>
/// <param name="fromAgent">the from agent</param>
/// <param name="messages">messages</param>
/// <returns>A list of agents that the messages can transition to</returns>
public async Task<IEnumerable<IAgent>> TransitToNextAvailableAgentsAsync(IAgent fromAgent, IEnumerable<IMessage> messages)
{
var nextAgents = new List<IAgent>();
var availableTransitions = transitions.FindAll(t => t.From == fromAgent) ?? Enumerable.Empty<Transition>();
foreach (var transition in availableTransitions)
{
if (await transition.CanTransitionAsync(messages))
{
nextAgents.Add(transition.To);
}
}
return nextAgents;
}
}
/// <summary>
/// Represents a transition between two agents.
/// </summary>
public class Transition
{
private readonly IAgent _from;
private readonly IAgent _to;
private readonly Func<IAgent, IAgent, IEnumerable<IMessage>, Task<bool>>? _canTransition;
/// <summary>
/// Create a new instance of <see cref="Transition"/>.
/// This constructor is used for testing purposes only.
/// To create a new instance of <see cref="Transition"/>, use <see cref="Transition.Create{TFromAgent, TToAgent}(TFromAgent, TToAgent, Func{TFromAgent, TToAgent, IEnumerable{IMessage}, Task{bool}}?)"/>.
/// </summary>
/// <param name="from">from agent</param>
/// <param name="to">to agent</param>
/// <param name="canTransitionAsync">determines whether the transition is allowed; defaults to always allowing the transition</param>
internal Transition(IAgent from, IAgent to, Func<IAgent, IAgent, IEnumerable<IMessage>, Task<bool>>? canTransitionAsync = null)
{
_from = from;
_to = to;
_canTransition = canTransitionAsync;
}
/// <summary>
/// Create a new instance of <see cref="Transition"/>.
/// </summary>
/// <returns><see cref="Transition"/></returns>
public static Transition Create<TFromAgent, TToAgent>(TFromAgent from, TToAgent to, Func<TFromAgent, TToAgent, IEnumerable<IMessage>, Task<bool>>? canTransitionAsync = null)
where TFromAgent : IAgent
where TToAgent : IAgent
{
return new Transition(from, to, (fromAgent, toAgent, messages) => canTransitionAsync?.Invoke((TFromAgent)fromAgent, (TToAgent)toAgent, messages) ?? Task.FromResult(true));
}
public IAgent From => _from;
public IAgent To => _to;
/// <summary>
/// Check if the transition is allowed.
/// </summary>
/// <param name="messages">messages</param>
public Task<bool> CanTransitionAsync(IEnumerable<IMessage> messages)
{
if (_canTransition == null)
{
return Task.FromResult(true);
}
return _canTransition(this.From, this.To, messages);
}
}
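Putting `Graph` and `Transition` together, a small workflow might look like the following sketch, where `coder` and `reviewer` are hypothetical agents:

```csharp
// coder -> reviewer is always allowed; reviewer -> coder only when
// the last message asks for a revision.
var coderToReviewer = Transition.Create(coder, reviewer);
var reviewerToCoder = Transition.Create(reviewer, coder,
    async (from, to, messages) => messages.Last().GetContent()?.Contains("REVISE") == true);
var graph = new Graph(new[] { coderToReviewer, reviewerToCoder });
```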

// Copyright (c) Microsoft Corporation. All rights reserved.
// GroupChat.cs
using System;
using System.Collections.Generic;
using System.Linq;
using System.Threading;
using System.Threading.Tasks;
namespace AutoGen.Core;
public class GroupChat : IGroupChat
{
private IAgent? admin;
private List<IAgent> agents = new List<IAgent>();
private IEnumerable<IMessage> initializeMessages = new List<IMessage>();
private Graph? workflow = null;
public IEnumerable<IMessage>? Messages { get; private set; }
/// <summary>
/// Create a group chat. The next speaker will be decided by the combined effort of the admin and the workflow.
/// </summary>
/// <param name="admin">admin agent. If provided, the admin will be invoked to decide the next speaker.</param>
/// <param name="workflow">workflow of the group chat. If provided, the next speaker will be decided by the workflow.</param>
/// <param name="members">group members.</param>
/// <param name="initializeMessages"></param>
public GroupChat(
IEnumerable<IAgent> members,
IAgent? admin = null,
IEnumerable<IMessage>? initializeMessages = null,
Graph? workflow = null)
{
this.admin = admin;
this.agents = members.ToList();
this.initializeMessages = initializeMessages ?? new List<IMessage>();
this.workflow = workflow;
this.Validation();
}
private void Validation()
{
// check that all agents have a name
if (this.agents.Any(x => string.IsNullOrEmpty(x.Name)))
{
throw new Exception("All agents must have a name.");
}
// check that no two agents share the same name
var names = this.agents.Select(x => x.Name).ToList();
if (names.Distinct().Count() != names.Count)
{
throw new Exception("All agents must have a unique name.");
}
// if there's a workflow
// check if the agents in that workflow are in the group chat
if (this.workflow != null)
{
var agentNamesInWorkflow = this.workflow.Transitions.Select(x => x.From.Name!).Concat(this.workflow.Transitions.Select(x => x.To.Name!)).Distinct();
if (agentNamesInWorkflow.Any(x => !this.agents.Select(a => a.Name).Contains(x)))
{
throw new Exception("All agents in the workflow must be in the group chat.");
}
}
// must provide one of admin or workflow
if (this.admin == null && this.workflow == null)
{
throw new Exception("Must provide one of admin or workflow.");
}
}
/// <summary>
/// Select the next speaker based on the conversation history.
/// The next speaker is decided by the combined effort of the admin and the workflow.
/// First, the workflow selects a group of candidates. If there is exactly one candidate, that candidate becomes the next speaker.
/// Otherwise, the admin is invoked to decide the next speaker using a role-play prompt.
/// </summary>
/// <param name="currentSpeaker">current speaker</param>
/// <param name="conversationHistory">conversation history</param>
/// <returns>next speaker.</returns>
public async Task<IAgent> SelectNextSpeakerAsync(IAgent currentSpeaker, IEnumerable<IMessage> conversationHistory)
{
var agentNames = this.agents.Select(x => x.Name).ToList();
if (this.workflow != null)
{
var nextAvailableAgents = await this.workflow.TransitToNextAvailableAgentsAsync(currentSpeaker, conversationHistory);
agentNames = nextAvailableAgents.Select(x => x.Name).ToList();
if (agentNames.Count == 0)
{
throw new Exception("No next available agents found in the current workflow");
}
if (agentNames.Count == 1)
{
return this.agents.First(x => x.Name == agentNames.First());
}
}
if (this.admin == null)
{
throw new Exception("No admin is provided.");
}
var systemMessage = new TextMessage(Role.System,
content: $@"You are in a role play game. Carefully read the conversation history and carry on the conversation.
The available roles are:
{string.Join(",", agentNames)}
Each message will start with 'From name:', e.g:
From admin:
//your message//.");
var conv = this.ProcessConversationsForRolePlay(this.initializeMessages, conversationHistory);
var messages = new IMessage[] { systemMessage }.Concat(conv);
var response = await this.admin.GenerateReplyAsync(
messages: messages,
options: new GenerateReplyOptions
{
Temperature = 0,
MaxToken = 128,
StopSequence = [":"],
Functions = [],
});
var name = response?.GetContent() ?? throw new Exception("No name is returned.");
// strip the leading "From " (5 characters) from the role-play response
name = name.Substring(5);
return this.agents.First(x => x.Name!.ToLower() == name.ToLower());
}
/// <inheritdoc />
public void AddInitializeMessage(IMessage message)
{
this.SendIntroduction(message);
}
public async Task<IEnumerable<IMessage>> CallAsync(
IEnumerable<IMessage>? conversationWithName = null,
int maxRound = 10,
CancellationToken ct = default)
{
var conversationHistory = new List<IMessage>();
if (conversationWithName != null)
{
conversationHistory.AddRange(conversationWithName);
}
var lastSpeaker = conversationHistory.LastOrDefault()?.From switch
{
null => this.agents.First(),
_ => this.agents.FirstOrDefault(x => x.Name == conversationHistory.Last().From) ?? throw new Exception("The agent is not in the group chat"),
};
var round = 0;
while (round < maxRound)
{
var currentSpeaker = await this.SelectNextSpeakerAsync(lastSpeaker, conversationHistory);
var processedConversation = this.ProcessConversationForAgent(this.initializeMessages, conversationHistory);
var result = await currentSpeaker.GenerateReplyAsync(processedConversation) ?? throw new Exception("No result is returned.");
conversationHistory.Add(result);
// if the reply is a terminate message, end the conversation
if (result?.IsGroupChatTerminateMessage() ?? false)
{
break;
}
lastSpeaker = currentSpeaker;
round++;
}
return conversationHistory;
}
public void SendIntroduction(IMessage message)
{
this.initializeMessages = this.initializeMessages.Append(message);
}
}
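A minimal `GroupChat` usage sketch, assuming `admin`, `coder`, and `reviewer` are existing agents and `graph` is a `Graph` built from `Transition`s between them:

```csharp
// admin-driven group chat; the optional graph constrains which
// agents are candidates for the next turn.
var groupChat = new GroupChat(
    members: new IAgent[] { admin, coder, reviewer },
    admin: admin,
    workflow: graph);
var history = await groupChat.CallAsync(maxRound: 10);
```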

// Copyright (c) Microsoft Corporation. All rights reserved.
// RoundRobinGroupChat.cs
using System;
using System.Collections.Generic;
using System.Linq;
using System.Threading;
using System.Threading.Tasks;
namespace AutoGen.Core;
/// <summary>
/// Obsolete: please use <see cref="RoundRobinGroupChat"/>
/// </summary>
[Obsolete("please use RoundRobinGroupChat")]
public class SequentialGroupChat : RoundRobinGroupChat
{
[Obsolete("please use RoundRobinGroupChat")]
public SequentialGroupChat(IEnumerable<IAgent> agents, List<IMessage>? initializeMessages = null)
: base(agents, initializeMessages)
{
}
}
/// <summary>
/// A group chat that allows agents to talk in a round-robin manner.
/// </summary>
public class RoundRobinGroupChat : IGroupChat
{
private readonly List<IAgent> agents = new List<IAgent>();
private readonly List<IMessage> initializeMessages = new List<IMessage>();
public RoundRobinGroupChat(
IEnumerable<IAgent> agents,
List<IMessage>? initializeMessages = null)
{
this.agents.AddRange(agents);
this.initializeMessages = initializeMessages ?? new List<IMessage>();
}
/// <inheritdoc />
public void AddInitializeMessage(IMessage message)
{
this.SendIntroduction(message);
}
public async Task<IEnumerable<IMessage>> CallAsync(
IEnumerable<IMessage>? conversationWithName = null,
int maxRound = 10,
CancellationToken ct = default)
{
var conversationHistory = new List<IMessage>();
if (conversationWithName != null)
{
conversationHistory.AddRange(conversationWithName);
}
var lastSpeaker = conversationHistory.LastOrDefault()?.From switch
{
null => this.agents.First(),
_ => this.agents.FirstOrDefault(x => x.Name == conversationHistory.Last().From) ?? throw new Exception("The agent is not in the group chat"),
};
var round = 0;
while (round < maxRound)
{
var currentSpeaker = this.SelectNextSpeaker(lastSpeaker);
var processedConversation = this.ProcessConversationForAgent(this.initializeMessages, conversationHistory);
var result = await currentSpeaker.GenerateReplyAsync(processedConversation) ?? throw new Exception("No result is returned.");
conversationHistory.Add(result);
// if the reply is a terminate message, end the conversation
if (result?.IsGroupChatTerminateMessage() ?? false)
{
break;
}
lastSpeaker = currentSpeaker;
round++;
}
return conversationHistory;
}
public void SendIntroduction(IMessage message)
{
this.initializeMessages.Add(message);
}
private IAgent SelectNextSpeaker(IAgent currentSpeaker)
{
var index = this.agents.IndexOf(currentSpeaker);
if (index == -1)
{
throw new ArgumentException("The agent is not in the group chat", nameof(currentSpeaker));
}
var nextIndex = (index + 1) % this.agents.Count;
return this.agents[nextIndex];
}
}
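`RoundRobinGroupChat` needs no admin; speakers simply take turns in declaration order. A sketch with two hypothetical agents:

```csharp
// agents speak in a fixed cycle: alice -> bob -> alice -> ...
var roundRobin = new RoundRobinGroupChat(new IAgent[] { alice, bob });
var history = await roundRobin.CallAsync(maxRound: 4);
```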

// Copyright (c) Microsoft Corporation. All rights reserved.
// IGroupChat.cs
using System;
using System.Collections.Generic;
using System.Threading;
using System.Threading.Tasks;
namespace AutoGen.Core;
public interface IGroupChat
{
/// <summary>
/// Send an introduction message to the group chat.
/// </summary>
void SendIntroduction(IMessage message);
[Obsolete("please use SendIntroduction")]
void AddInitializeMessage(IMessage message);
Task<IEnumerable<IMessage>> CallAsync(IEnumerable<IMessage>? conversation = null, int maxRound = 10, CancellationToken ct = default);
}

// Copyright (c) Microsoft Corporation. All rights reserved.
// ILLMConfig.cs
namespace AutoGen.Core;
public interface ILLMConfig
{
}

// Copyright (c) Microsoft Corporation. All rights reserved.
// AggregateMessage.cs
using System;
using System.Collections.Generic;
namespace AutoGen.Core;
public class AggregateMessage<TMessage1, TMessage2> : IMessage
where TMessage1 : IMessage
where TMessage2 : IMessage
{
public AggregateMessage(TMessage1 message1, TMessage2 message2, string? from = null)
{
this.From = from;
this.Message1 = message1;
this.Message2 = message2;
this.Validate();
}
public TMessage1 Message1 { get; }
public TMessage2 Message2 { get; }
public string? From { get; set; }
private void Validate()
{
var messages = new List<IMessage> { this.Message1, this.Message2 };
// the From property of every message must be the same as the From property of the aggregate message
foreach (var message in messages)
{
if (message.From != this.From)
{
throw new ArgumentException($"The from property of the message {message} is different from the from property of the aggregate message {this}");
}
}
}
public override string ToString()
{
var stringBuilder = new System.Text.StringBuilder();
var messages = new List<IMessage> { this.Message1, this.Message2 };
stringBuilder.Append($"AggregateMessage({this.From})");
foreach (var message in messages)
{
stringBuilder.Append($"\n\t{message}");
}
return stringBuilder.ToString();
}
}
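One common use of `AggregateMessage` (as noted in the `IMessage` documentation) is pairing a tool-call request with its result. A sketch using the constructors defined elsewhere in this PR:

```csharp
// pair a tool-call request with its result; all From values must
// match or Validate() throws.
var call = new ToolCallMessage("GetWeather", "{\"city\":\"Seattle\"}", from: "assistant");
var result = new ToolCallResultMessage("72F", "GetWeather", "{\"city\":\"Seattle\"}", from: "assistant");
var combined = new AggregateMessage<ToolCallMessage, ToolCallResultMessage>(call, result, from: "assistant");
```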

// Copyright (c) Microsoft Corporation. All rights reserved.
// IMessage.cs
namespace AutoGen.Core;
/// <summary>
/// The universal message interface for all message types in AutoGen.
/// <para>Related PR: https://github.com/microsoft/autogen/pull/1676</para>
/// <para>Built-in message types</para>
/// <list type="bullet">
/// <item>
/// <see cref="TextMessage"/>: plain text message.
/// </item>
/// <item>
/// <see cref="ImageMessage"/>: image message.
/// </item>
/// <item>
/// <see cref="MultiModalMessage"/>: message type for multimodal message. The current support message items are <see cref="TextMessage"/> and <see cref="ImageMessage"/>.
/// </item>
/// <item>
/// <see cref="ToolCallMessage"/>: message type for tool call. This message supports both single and parallel tool call.
/// </item>
/// <item>
/// <see cref="ToolCallResultMessage"/>: message type for tool call result.
/// </item>
/// <item>
/// <see cref="Message"/>: the message type used by previous versions of AutoGen; retained for backward compatibility.
/// </item>
/// <item>
/// <see cref="AggregateMessage{TMessage1, TMessage2}"/>: an aggregate message type that contains two message types.
/// This type is useful when you want to combine two message types into one unique message type. One example is when invoking a tool call and you want to return both <see cref="ToolCallMessage"/> and <see cref="ToolCallResultMessage"/>.
/// One example of how this type is used in AutoGen is <see cref="FunctionCallMiddleware"/>.
/// </item>
/// </list>
/// </summary>
public interface IMessage : IStreamingMessage
{
}
public interface IMessage<out T> : IMessage, IStreamingMessage<T>
{
}
public interface IStreamingMessage
{
string? From { get; set; }
}
public interface IStreamingMessage<out T> : IStreamingMessage
{
T Content { get; }
}

// Copyright (c) Microsoft Corporation. All rights reserved.
// ImageMessage.cs
using System;
namespace AutoGen.Core;
public class ImageMessage : IMessage
{
public ImageMessage(Role role, string url, string? from = null)
{
this.Role = role;
this.From = from;
this.Url = url;
}
public ImageMessage(Role role, Uri uri, string? from = null)
{
this.Role = role;
this.From = from;
this.Url = uri.ToString();
}
public Role Role { get; set; }
public string Url { get; set; }
public string? From { get; set; }
public override string ToString()
{
return $"ImageMessage({this.Role}, {this.Url}, {this.From})";
}
}

// Copyright (c) Microsoft Corporation. All rights reserved.
// Message.cs
using System.Collections.Generic;
namespace AutoGen.Core;
public class Message : IMessage
{
public Message(
Role role,
string? content,
string? from = null,
ToolCall? toolCall = null)
{
this.Role = role;
this.Content = content;
this.From = from;
this.FunctionName = toolCall?.FunctionName;
this.FunctionArguments = toolCall?.FunctionArguments;
}
public Message(Message other)
: this(other.Role, other.Content, other.From)
{
this.FunctionName = other.FunctionName;
this.FunctionArguments = other.FunctionArguments;
this.Value = other.Value;
this.Metadata = other.Metadata;
}
public Role Role { get; set; }
public string? Content { get; set; }
public string? From { get; set; }
public string? FunctionName { get; set; }
public string? FunctionArguments { get; set; }
/// <summary>
/// raw message
/// </summary>
public object? Value { get; set; }
public IList<KeyValuePair<string, object>> Metadata { get; set; } = new List<KeyValuePair<string, object>>();
public override string ToString()
{
return $"Message({this.Role}, {this.Content}, {this.From}, {this.FunctionName}, {this.FunctionArguments})";
}
}

// Copyright (c) Microsoft Corporation. All rights reserved.
// MessageEnvelope.cs
using System.Collections.Generic;
namespace AutoGen.Core;
public abstract class MessageEnvelope : IMessage, IStreamingMessage
{
public MessageEnvelope(string? from = null, IDictionary<string, object>? metadata = null)
{
this.From = from;
this.Metadata = metadata ?? new Dictionary<string, object>();
}
public static MessageEnvelope<TContent> Create<TContent>(TContent content, string? from = null, IDictionary<string, object>? metadata = null)
{
return new MessageEnvelope<TContent>(content, from, metadata);
}
public string? From { get; set; }
public IDictionary<string, object> Metadata { get; set; }
}
public class MessageEnvelope<T> : MessageEnvelope, IMessage<T>, IStreamingMessage<T>
{
public MessageEnvelope(T content, string? from = null, IDictionary<string, object>? metadata = null)
: base(from, metadata)
{
this.Content = content;
this.From = from;
this.Metadata = metadata ?? new Dictionary<string, object>();
}
public T Content { get; }
}

// Copyright (c) Microsoft Corporation. All rights reserved.
// MultiModalMessage.cs
using System;
using System.Collections.Generic;
namespace AutoGen.Core;
public class MultiModalMessage : IMessage
{
public MultiModalMessage(Role role, IEnumerable<IMessage> content, string? from = null)
{
this.Role = role;
this.Content = content;
this.From = from;
this.Validate();
}
public Role Role { get; set; }
public IEnumerable<IMessage> Content { get; set; }
public string? From { get; set; }
private void Validate()
{
foreach (var message in this.Content)
{
if (message.From != this.From)
{
var reason = $"The from property of the message {message} is different from the from property of the aggregate message {this}";
throw new ArgumentException($"Invalid aggregate message {reason}");
}
}
// every message must be either a text or an image message
foreach (var message in this.Content)
{
if (message is not TextMessage && message is not ImageMessage)
{
var reason = $"The message {message} is not a text or image message";
throw new ArgumentException($"Invalid aggregate message {reason}");
}
}
}
public override string ToString()
{
var stringBuilder = new System.Text.StringBuilder();
stringBuilder.Append($"MultiModalMessage({this.Role}, {this.From})");
foreach (var message in this.Content)
{
stringBuilder.Append($"\n\t{message}");
}
return stringBuilder.ToString();
}
}

// Copyright (c) Microsoft Corporation. All rights reserved.
// Role.cs
using System;
namespace AutoGen.Core;
public readonly struct Role : IEquatable<Role>
{
private readonly string label;
internal Role(string name)
{
label = name;
}
public static Role User { get; } = new Role("user");
public static Role Assistant { get; } = new Role("assistant");
public static Role System { get; } = new Role("system");
public static Role Function { get; } = new Role("function");
public bool Equals(Role other)
{
return label.Equals(other.label, StringComparison.OrdinalIgnoreCase);
}
public override string ToString()
{
return label;
}
public override bool Equals(object? obj)
{
return obj is Role other && Equals(other);
}
public override int GetHashCode()
{
// must be consistent with the case-insensitive Equals above
return StringComparer.OrdinalIgnoreCase.GetHashCode(label);
}
public static bool operator ==(Role left, Role right)
{
return left.Equals(right);
}
public static bool operator !=(Role left, Role right)
{
return !(left == right);
}
}

// Copyright (c) Microsoft Corporation. All rights reserved.
// TextMessage.cs
namespace AutoGen.Core;
public class TextMessage : IMessage, IStreamingMessage
{
public TextMessage(Role role, string content, string? from = null)
{
this.Content = content;
this.Role = role;
this.From = from;
}
public TextMessage(TextMessageUpdate update)
{
this.Content = update.Content ?? string.Empty;
this.Role = update.Role;
this.From = update.From;
}
public void Update(TextMessageUpdate update)
{
if (update.Role != this.Role)
{
throw new System.ArgumentException("Role mismatch", nameof(update));
}
if (update.From != this.From)
{
throw new System.ArgumentException("From mismatch", nameof(update));
}
this.Content += update.Content ?? string.Empty;
}
public Role Role { get; set; }
public string Content { get; set; }
public string? From { get; set; }
public override string ToString()
{
return $"TextMessage({this.Role}, {this.Content}, {this.From})";
}
}
public class TextMessageUpdate : IStreamingMessage
{
public TextMessageUpdate(Role role, string? content, string? from = null)
{
this.Content = content;
this.From = from;
this.Role = role;
}
public string? Content { get; set; }
public string? From { get; set; }
public Role Role { get; set; }
}
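The `TextMessage(TextMessageUpdate)` constructor and `Update` method are designed for accumulating streaming chunks. A sketch, where `stream` is a hypothetical `IAsyncEnumerable<IStreamingMessage>` produced by a streaming agent:

```csharp
// fold streaming text chunks into a single TextMessage.
TextMessage? reply = null;
await foreach (var chunk in stream)
{
    if (chunk is TextMessageUpdate update)
    {
        if (reply is null)
        {
            reply = new TextMessage(update);   // first chunk seeds the message
        }
        else
        {
            reply.Update(update);              // later chunks append content
        }
    }
}
```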

// Copyright (c) Microsoft Corporation. All rights reserved.
// ToolCallMessage.cs
using System.Collections.Generic;
using System.Linq;
using System.Text;
namespace AutoGen.Core;
public class ToolCall
{
public ToolCall(string functionName, string functionArgs)
{
this.FunctionName = functionName;
this.FunctionArguments = functionArgs;
}
public ToolCall(string functionName, string functionArgs, string result)
{
this.FunctionName = functionName;
this.FunctionArguments = functionArgs;
this.Result = result;
}
public string FunctionName { get; set; }
public string FunctionArguments { get; set; }
public string? Result { get; set; }
public override string ToString()
{
return $"ToolCall({this.FunctionName}, {this.FunctionArguments}, {this.Result})";
}
}
public class ToolCallMessage : IMessage
{
public ToolCallMessage(IEnumerable<ToolCall> toolCalls, string? from = null)
{
this.From = from;
this.ToolCalls = toolCalls.ToList();
}
public ToolCallMessage(string functionName, string functionArgs, string? from = null)
{
this.From = from;
this.ToolCalls = new List<ToolCall> { new ToolCall(functionName, functionArgs) };
}
public ToolCallMessage(ToolCallMessageUpdate update)
{
this.From = update.From;
this.ToolCalls = new List<ToolCall> { new ToolCall(update.FunctionName, update.FunctionArgumentUpdate) };
}
public void Update(ToolCallMessageUpdate update)
{
// first, validate that the update comes from the same agent
if (update.From != this.From)
{
throw new System.ArgumentException("From mismatch", nameof(update));
}
// if update.FunctionName exists in the tool calls, update the function arguments
var toolCall = this.ToolCalls.FirstOrDefault(tc => tc.FunctionName == update.FunctionName);
if (toolCall is not null)
{
toolCall.FunctionArguments += update.FunctionArgumentUpdate;
}
else
{
this.ToolCalls.Add(new ToolCall(update.FunctionName, update.FunctionArgumentUpdate));
}
}
public IList<ToolCall> ToolCalls { get; set; }
public string? From { get; set; }
public override string ToString()
{
var sb = new StringBuilder();
sb.Append($"ToolCallMessage({this.From})");
foreach (var toolCall in this.ToolCalls)
{
sb.Append($"\n\t{toolCall}");
}
return sb.ToString();
}
}
public class ToolCallMessageUpdate : IStreamingMessage
{
public ToolCallMessageUpdate(string functionName, string functionArgumentUpdate, string? from = null)
{
this.From = from;
this.FunctionName = functionName;
this.FunctionArgumentUpdate = functionArgumentUpdate;
}
public string? From { get; set; }
public string FunctionName { get; set; }
public string FunctionArgumentUpdate { get; set; }
}
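
Taken together, the types above let a streamed tool call be assembled incrementally. A minimal sketch (the function name, JSON fragments, and agent name are illustrative):

```csharp
// Sketch: assembling a streamed tool call from partial argument updates.
var message = new ToolCallMessage(
    new ToolCallMessageUpdate("GetWeather", "{\"city\":", from: "assistant"));
message.Update(
    new ToolCallMessageUpdate("GetWeather", "\"Seattle\"}", from: "assistant"));
// message.ToolCalls now holds a single GetWeather call whose
// FunctionArguments is the concatenation {"city":"Seattle"}
```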

@ -0,0 +1,56 @@
// Copyright (c) Microsoft Corporation. All rights reserved.
// ToolCallResultMessage.cs
using System.Collections.Generic;
using System.Linq;
using System.Text;
namespace AutoGen.Core;
public class ToolCallResultMessage : IMessage
{
public ToolCallResultMessage(IEnumerable<ToolCall> toolCalls, string? from = null)
{
this.From = from;
this.ToolCalls = toolCalls.ToList();
}
public ToolCallResultMessage(string result, string functionName, string functionArgs, string? from = null)
{
this.From = from;
var toolCall = new ToolCall(functionName, functionArgs);
toolCall.Result = result;
this.ToolCalls = [toolCall];
}
/// <summary>
/// The tool calls and their results
/// </summary>
public IList<ToolCall> ToolCalls { get; set; }
public string? From { get; set; }
public override string ToString()
{
var sb = new StringBuilder();
sb.Append($"ToolCallResultMessage({this.From})");
foreach (var toolCall in this.ToolCalls)
{
sb.Append($"\n\t{toolCall}");
}
return sb.ToString();
}
private void Validate()
{
// each tool call must have a result
foreach (var toolCall in this.ToolCalls)
{
if (string.IsNullOrEmpty(toolCall.Result))
{
throw new System.ArgumentException($"The tool call {toolCall} does not have a result");
}
}
}
}

@ -0,0 +1,45 @@
// Copyright (c) Microsoft Corporation. All rights reserved.
// DelegateMiddleware.cs
using System;
using System.Threading;
using System.Threading.Tasks;
namespace AutoGen.Core;
internal class DelegateMiddleware : IMiddleware
{
/// <summary>
/// Middleware delegate. Invoke the inner agent to continue execution, or return a reply directly to short-circuit the middleware chain.
/// </summary>
/// <param name="context">middleware context</param>
/// <param name="agent">inner agent</param>
/// <param name="cancellationToken">cancellation token</param>
public delegate Task<IMessage> MiddlewareDelegate(
MiddlewareContext context,
IAgent agent,
CancellationToken cancellationToken);
private readonly MiddlewareDelegate middlewareDelegate;
public DelegateMiddleware(string? name, Func<MiddlewareContext, IAgent, CancellationToken, Task<IMessage>> middlewareDelegate)
{
this.Name = name;
this.middlewareDelegate = async (context, agent, cancellationToken) =>
{
return await middlewareDelegate(context, agent, cancellationToken);
};
}
public string? Name { get; }
public Task<IMessage> InvokeAsync(
MiddlewareContext context,
IAgent agent,
CancellationToken cancellationToken = default)
{
var messages = context.Messages;
var options = context.Options;
return this.middlewareDelegate(context, agent, cancellationToken);
}
}
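
Because `DelegateMiddleware` is internal, it is normally reached through registration helpers elsewhere in the package; within `AutoGen.Core` it can be constructed directly. A sketch with an illustrative termination rule (the name, keyword, and reply text are assumptions):

```csharp
// Sketch: a delegate middleware that short-circuits the inner agent when the
// most recent message asks to terminate; otherwise it forwards to the agent.
var middleware = new DelegateMiddleware("terminate-guard", async (context, agent, ct) =>
{
    if (context.Messages.LastOrDefault() is TextMessage text
        && text.Content.Contains("TERMINATE"))
    {
        // short-circuit: never call the inner agent
        return new TextMessage(Role.Assistant, "conversation ended", from: agent.Name);
    }
    // continue: forward to the inner agent
    return await agent.GenerateReplyAsync(context.Messages, context.Options, ct);
});
```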

@ -0,0 +1,38 @@
// Copyright (c) Microsoft Corporation. All rights reserved.
// DelegateStreamingMiddleware.cs
using System.Collections.Generic;
using System.Threading;
using System.Threading.Tasks;
namespace AutoGen.Core;
internal class DelegateStreamingMiddleware : IStreamingMiddleware
{
public delegate Task<IAsyncEnumerable<IStreamingMessage>> MiddlewareDelegate(
MiddlewareContext context,
IStreamingAgent agent,
CancellationToken cancellationToken);
private readonly MiddlewareDelegate middlewareDelegate;
public DelegateStreamingMiddleware(string? name, MiddlewareDelegate middlewareDelegate)
{
this.Name = name;
this.middlewareDelegate = middlewareDelegate;
}
public string? Name { get; }
public Task<IAsyncEnumerable<IStreamingMessage>> InvokeAsync(
MiddlewareContext context,
IStreamingAgent agent,
CancellationToken cancellationToken = default)
{
var messages = context.Messages;
var options = context.Options;
return this.middlewareDelegate(context, agent, cancellationToken);
}
}

@ -0,0 +1,178 @@
// Copyright (c) Microsoft Corporation. All rights reserved.
// FunctionCallMiddleware.cs
using System;
using System.Collections.Generic;
using System.Linq;
using System.Runtime.CompilerServices;
using System.Threading;
using System.Threading.Tasks;
namespace AutoGen.Core;
/// <summary>
/// The middleware that processes function call messages, both those sent to an agent and those returned as a reply from an agent.
/// <para>If the last message is a <see cref="ToolCallMessage"/> and its tool calls are available in this middleware's function map,
/// the tools from the last message will be invoked and a <see cref="ToolCallResultMessage"/> will be returned. In this situation,
/// the inner agent will be short-cut and won't be invoked.</para>
/// <para>Otherwise, the message will be sent to the inner agent. In this situation,
/// if the reply from the inner agent is a <see cref="ToolCallMessage"/>
/// and its tool calls are available in this middleware's function map, the tools from the reply will be invoked,
/// and an <see cref="AggregateMessage{TMessage1, TMessage2}"/> where TMessage1 is <see cref="ToolCallMessage"/> and TMessage2 is <see cref="ToolCallResultMessage"/>
/// will be returned.
/// </para>
/// <para>If the reply from the inner agent is a <see cref="ToolCallMessage"/> but its tool calls are not available in this middleware's function map,
/// or the reply from the inner agent is not a <see cref="ToolCallMessage"/>, the original reply from the inner agent will be returned.</para>
/// <para>
/// When used as a streaming middleware, if the streaming reply from the inner agent is a <see cref="ToolCallMessageUpdate"/> or <see cref="TextMessageUpdate"/>,
/// this middleware will update the message accordingly and invoke the function if the tool call is available in this middleware's function map.
/// If the streaming reply from the inner agent is any other type of message, the most recent message will be used to invoke the function.
/// </para>
/// </summary>
public class FunctionCallMiddleware : IMiddleware, IStreamingMiddleware
{
private readonly IEnumerable<FunctionContract>? functions;
private readonly IDictionary<string, Func<string, Task<string>>>? functionMap;
public FunctionCallMiddleware(
IEnumerable<FunctionContract>? functions = null,
IDictionary<string, Func<string, Task<string>>>? functionMap = null,
string? name = null)
{
this.Name = name ?? nameof(FunctionCallMiddleware);
this.functions = functions;
this.functionMap = functionMap;
}
public string? Name { get; }
public async Task<IMessage> InvokeAsync(MiddlewareContext context, IAgent agent, CancellationToken cancellationToken = default)
{
var lastMessage = context.Messages.Last();
if (lastMessage is ToolCallMessage toolCallMessage)
{
return await this.InvokeToolCallMessagesBeforeInvokingAgentAsync(toolCallMessage, agent);
}
// combine functions
var options = new GenerateReplyOptions(context.Options ?? new GenerateReplyOptions());
var combinedFunctions = this.functions?.Concat(options.Functions ?? []) ?? options.Functions;
options.Functions = combinedFunctions?.ToArray();
var reply = await agent.GenerateReplyAsync(context.Messages, options, cancellationToken);
// if the reply is a tool call message and the function name is available in the function map, invoke the function and return the result together with the original tool calls.
if (reply is ToolCallMessage toolCallMsg)
{
return await this.InvokeToolCallMessagesAfterInvokingAgentAsync(toolCallMsg, agent);
}
// for all other messages, just return the reply from the agent.
return reply;
}
public Task<IAsyncEnumerable<IStreamingMessage>> InvokeAsync(MiddlewareContext context, IStreamingAgent agent, CancellationToken cancellationToken = default)
{
return Task.FromResult(this.StreamingInvokeAsync(context, agent, cancellationToken));
}
private async IAsyncEnumerable<IStreamingMessage> StreamingInvokeAsync(
MiddlewareContext context,
IStreamingAgent agent,
[EnumeratorCancellation] CancellationToken cancellationToken)
{
var lastMessage = context.Messages.Last();
if (lastMessage is ToolCallMessage toolCallMessage)
{
yield return await this.InvokeToolCallMessagesBeforeInvokingAgentAsync(toolCallMessage, agent);
// short-cut the inner agent, mirroring the non-streaming path
yield break;
}
// combine functions
var options = new GenerateReplyOptions(context.Options ?? new GenerateReplyOptions());
var combinedFunctions = this.functions?.Concat(options.Functions ?? []) ?? options.Functions;
options.Functions = combinedFunctions?.ToArray();
IStreamingMessage? initMessage = default;
await foreach (var message in await agent.GenerateStreamingReplyAsync(context.Messages, options, cancellationToken))
{
if (message is ToolCallMessageUpdate toolCallMessageUpdate && this.functionMap != null)
{
if (initMessage is null)
{
initMessage = new ToolCallMessage(toolCallMessageUpdate);
}
else if (initMessage is ToolCallMessage toolCall)
{
toolCall.Update(toolCallMessageUpdate);
}
else
{
throw new InvalidOperationException("The streamed reply is not a ToolCallMessage, but a ToolCallMessageUpdate was received");
}
}
else
{
yield return message;
}
}
if (initMessage is ToolCallMessage toolCallMsg)
{
yield return await this.InvokeToolCallMessagesAfterInvokingAgentAsync(toolCallMsg, agent);
}
}
private async Task<ToolCallResultMessage> InvokeToolCallMessagesBeforeInvokingAgentAsync(ToolCallMessage toolCallMessage, IAgent agent)
{
var toolCallResult = new List<ToolCall>();
var toolCalls = toolCallMessage.ToolCalls;
foreach (var toolCall in toolCalls)
{
var functionName = toolCall.FunctionName;
var functionArguments = toolCall.FunctionArguments;
if (this.functionMap?.TryGetValue(functionName, out var func) is true)
{
var result = await func(functionArguments);
toolCallResult.Add(new ToolCall(functionName, functionArguments, result));
}
else if (this.functionMap is not null)
{
var errorMessage = $"Function {functionName} is not available. Available functions are: {string.Join(", ", this.functionMap.Select(f => f.Key))}";
toolCallResult.Add(new ToolCall(functionName, functionArguments, errorMessage));
}
else
{
throw new InvalidOperationException("FunctionMap is not available");
}
}
return new ToolCallResultMessage(toolCallResult, from: agent.Name);
}
private async Task<IMessage> InvokeToolCallMessagesAfterInvokingAgentAsync(ToolCallMessage toolCallMsg, IAgent agent)
{
var toolCallsReply = toolCallMsg.ToolCalls;
var toolCallResult = new List<ToolCall>();
foreach (var toolCall in toolCallsReply)
{
var fName = toolCall.FunctionName;
var fArgs = toolCall.FunctionArguments;
if (this.functionMap?.TryGetValue(fName, out var func) is true)
{
var result = await func(fArgs);
toolCallResult.Add(new ToolCall(fName, fArgs, result));
}
}
if (toolCallResult.Count > 0)
{
var toolCallResultMessage = new ToolCallResultMessage(toolCallResult, from: agent.Name);
return new AggregateMessage<ToolCallMessage, ToolCallResultMessage>(toolCallMsg, toolCallResultMessage, from: agent.Name);
}
else
{
return toolCallMsg;
}
}
}
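
As a usage sketch, the middleware can be exercised directly with a `MiddlewareContext`; the `GetWeather` delegate, its arguments, and the `agent` variable (any `IAgent`) are illustrative assumptions:

```csharp
// Sketch: short-cutting an agent with FunctionCallMiddleware.
var functionMap = new Dictionary<string, Func<string, Task<string>>>
{
    ["GetWeather"] = args => Task.FromResult("sunny"),
};
var middleware = new FunctionCallMiddleware(functionMap: functionMap);

// The last message is a ToolCallMessage whose function is in the map,
// so the middleware invokes the function and never calls the inner agent.
var context = new MiddlewareContext(
    new IMessage[] { new ToolCallMessage("GetWeather", "{\"city\":\"Seattle\"}", from: "user") },
    options: null);
var reply = await middleware.InvokeAsync(context, agent); // a ToolCallResultMessage
```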

@ -0,0 +1,26 @@
// Copyright (c) Microsoft Corporation. All rights reserved.
// IMiddleware.cs
using System.Threading;
using System.Threading.Tasks;
namespace AutoGen.Core;
/// <summary>
/// The middleware interface
/// </summary>
public interface IMiddleware
{
/// <summary>
/// The name of the middleware
/// </summary>
public string? Name { get; }
/// <summary>
/// The method to invoke the middleware
/// </summary>
public Task<IMessage> InvokeAsync(
MiddlewareContext context,
IAgent agent,
CancellationToken cancellationToken = default);
}

@ -0,0 +1,21 @@
// Copyright (c) Microsoft Corporation. All rights reserved.
// IStreamingMiddleware.cs
using System.Collections.Generic;
using System.Threading;
using System.Threading.Tasks;
namespace AutoGen.Core;
/// <summary>
/// The streaming middleware interface
/// </summary>
public interface IStreamingMiddleware
{
public string? Name { get; }
public Task<IAsyncEnumerable<IStreamingMessage>> InvokeAsync(
MiddlewareContext context,
IStreamingAgent agent,
CancellationToken cancellationToken = default);
}

@ -0,0 +1,27 @@
// Copyright (c) Microsoft Corporation. All rights reserved.
// MiddlewareContext.cs
using System.Collections.Generic;
namespace AutoGen.Core;
public class MiddlewareContext
{
public MiddlewareContext(
IEnumerable<IMessage> messages,
GenerateReplyOptions? options)
{
this.Messages = messages;
this.Options = options;
}
/// <summary>
/// Messages to send to the agent
/// </summary>
public IEnumerable<IMessage> Messages { get; }
/// <summary>
/// Options to generate the reply
/// </summary>
public GenerateReplyOptions? Options { get; }
}

@ -0,0 +1,87 @@
// Copyright (c) Microsoft Corporation. All rights reserved.
// PrintMessageMiddleware.cs
using System;
using System.Threading;
using System.Threading.Tasks;
namespace AutoGen.Core;
/// <summary>
/// The middleware that prints the reply from the agent to the console.
/// </summary>
public class PrintMessageMiddleware : IMiddleware
{
public string? Name => nameof(PrintMessageMiddleware);
public async Task<IMessage> InvokeAsync(MiddlewareContext context, IAgent agent, CancellationToken cancellationToken = default)
{
if (agent is IStreamingAgent streamingAgent)
{
IMessage? recentUpdate = null;
await foreach (var message in await streamingAgent.GenerateStreamingReplyAsync(context.Messages, context.Options, cancellationToken))
{
if (message is TextMessageUpdate textMessageUpdate)
{
if (recentUpdate is null)
{
// Print from: xxx
Console.WriteLine($"from: {textMessageUpdate.From}");
recentUpdate = new TextMessage(textMessageUpdate);
Console.Write(textMessageUpdate.Content);
}
else if (recentUpdate is TextMessage recentTextMessage)
{
// Print the content of the message
Console.Write(textMessageUpdate.Content);
recentTextMessage.Update(textMessageUpdate);
}
else
{
throw new InvalidOperationException("The recent update is not a TextMessage");
}
}
else if (message is ToolCallMessageUpdate toolCallUpdate)
{
if (recentUpdate is null)
{
recentUpdate = new ToolCallMessage(toolCallUpdate);
}
else if (recentUpdate is ToolCallMessage recentToolCallMessage)
{
recentToolCallMessage.Update(toolCallUpdate);
}
else
{
throw new InvalidOperationException("The recent update is not a ToolCallMessage");
}
}
else if (message is IMessage imessage)
{
recentUpdate = imessage;
}
else
{
throw new InvalidOperationException("The message is not a valid message");
}
}
Console.WriteLine();
if (recentUpdate is not null && recentUpdate is not TextMessage)
{
Console.WriteLine(recentUpdate.FormatMessage());
}
return recentUpdate ?? throw new InvalidOperationException("No message was received from the streaming agent");
}
else
{
var reply = await agent.GenerateReplyAsync(context.Messages, context.Options, cancellationToken);
var formattedMessages = reply.FormatMessage();
Console.WriteLine(formattedMessages);
return reply;
}
}
}

@ -0,0 +1,40 @@
<Project Sdk="Microsoft.NET.Sdk">
<PropertyGroup>
<TargetFramework>netstandard2.0</TargetFramework>
<ImplicitUsings>enable</ImplicitUsings>
<Nullable>enable</Nullable>
<RootNamespace>AutoGen.DotnetInteractive</RootNamespace>
<TreatWarningsAsErrors>true</TreatWarningsAsErrors>
</PropertyGroup>
<Import Project="$(RepoRoot)/dotnet/nuget/nuget-package.props" />
<PropertyGroup>
<!-- NuGet Package Settings -->
<Title>AutoGen.DotnetInteractive</Title>
<Description>
Dotnet interactive integration for AutoGen agents
</Description>
</PropertyGroup>
<ItemGroup>
<PackageReference Include="Microsoft.DotNet.Interactive.VisualStudio" Version="$(MicrosoftDotnetInteractive)" />
</ItemGroup>
<ItemGroup>
<EmbeddedResource Include="dotnet-tools.json" />
<EmbeddedResource Include="RestoreInteractive.config" />
</ItemGroup>
<ItemGroup>
<PackageReference Include="Azure.AI.OpenAI" Version="$(AzureOpenAIVersion)" />
</ItemGroup>
<ItemGroup>
<ProjectReference Include="..\AutoGen\AutoGen.csproj" />
</ItemGroup>
</Project>

@ -0,0 +1,278 @@
// Copyright (c) Microsoft Corporation. All rights reserved.
// DotnetInteractiveFunction.cs
using System.Text;
using System.Text.Json;
using Azure.AI.OpenAI;
using Microsoft.DotNet.Interactive.Documents;
using Microsoft.DotNet.Interactive.Documents.Jupyter;
namespace AutoGen.DotnetInteractive;
public class DotnetInteractiveFunction : IDisposable
{
private readonly InteractiveService? _interactiveService = null;
private string? _notebookPath;
private readonly KernelInfoCollection _kernelInfoCollection = new KernelInfoCollection();
public DotnetInteractiveFunction(InteractiveService interactiveService, string? notebookPath = null, bool continueFromExistingNotebook = false)
{
this._interactiveService = interactiveService;
this._notebookPath = notebookPath;
this._kernelInfoCollection.Add(new KernelInfo("csharp"));
this._kernelInfoCollection.Add(new KernelInfo("markdown"));
if (this._notebookPath != null)
{
if (continueFromExistingNotebook == false)
{
// remove existing notebook
if (File.Exists(this._notebookPath))
{
File.Delete(this._notebookPath);
}
var document = new InteractiveDocument();
using var stream = File.OpenWrite(_notebookPath);
Notebook.Write(document, stream, this._kernelInfoCollection);
stream.Flush();
stream.Dispose();
}
else if (continueFromExistingNotebook == true && File.Exists(this._notebookPath))
{
// load existing notebook
using var readStream = File.OpenRead(this._notebookPath);
var document = Notebook.Read(readStream, this._kernelInfoCollection);
foreach (var cell in document.Elements)
{
if (cell.KernelName == "csharp")
{
var code = cell.Contents;
this._interactiveService.SubmitCSharpCodeAsync(code, default).Wait();
}
}
}
else
{
// create an empty notebook
var document = new InteractiveDocument();
using var stream = File.OpenWrite(_notebookPath);
Notebook.Write(document, stream, this._kernelInfoCollection);
stream.Flush();
stream.Dispose();
}
}
}
/// <summary>
/// Run existing dotnet code from message. Don't modify the code, run it as is.
/// </summary>
/// <param name="code">code.</param>
public async Task<string> RunCode(string code)
{
if (this._interactiveService == null)
{
throw new Exception("InteractiveService is not initialized.");
}
var result = await this._interactiveService.SubmitCSharpCodeAsync(code, default);
if (result != null)
{
// if result contains Error, return entire message
if (result.StartsWith("Error:"))
{
return result;
}
// add cell if _notebookPath is not null
if (this._notebookPath != null)
{
await AddCellAsync(code, "csharp");
}
// if result is over 100 characters, only return the first 100 characters.
if (result.Length > 100)
{
result = result.Substring(0, 100) + " (...too long to present)";
return result;
}
return result;
}
// add cell if _notebookPath is not null
if (this._notebookPath != null)
{
await AddCellAsync(code, "csharp");
}
return "Code run successfully. no output is available.";
}
/// <summary>
/// Install nuget packages.
/// </summary>
/// <param name="nugetPackages">nuget package to install.</param>
public async Task<string> InstallNugetPackages(string[] nugetPackages)
{
if (this._interactiveService == null)
{
throw new Exception("InteractiveService is not initialized.");
}
var codeSB = new StringBuilder();
foreach (var nuget in nugetPackages ?? Array.Empty<string>())
{
var nugetInstallCommand = $"#r \"nuget:{nuget}\"";
codeSB.AppendLine(nugetInstallCommand);
await this._interactiveService.SubmitCSharpCodeAsync(nugetInstallCommand, default);
}
var code = codeSB.ToString();
if (this._notebookPath != null)
{
await AddCellAsync(code, "csharp");
}
var sb = new StringBuilder();
sb.AppendLine("Installed nuget packages:");
foreach (var nuget in nugetPackages ?? Array.Empty<string>())
{
sb.AppendLine($"- {nuget}");
}
return sb.ToString();
}
private async Task AddCellAsync(string cellContent, string kernelName)
{
if (!File.Exists(this._notebookPath))
{
using var stream = File.OpenWrite(this._notebookPath);
Notebook.Write(new InteractiveDocument(), stream, this._kernelInfoCollection);
stream.Dispose();
}
using var readStream = File.OpenRead(this._notebookPath);
var document = Notebook.Read(readStream, this._kernelInfoCollection);
readStream.Dispose();
var cell = new InteractiveDocumentElement(cellContent, kernelName);
document.Add(cell);
using var writeStream = File.OpenWrite(this._notebookPath);
Notebook.Write(document, writeStream, this._kernelInfoCollection);
// sleep 3 seconds
await Task.Delay(3000);
writeStream.Flush();
writeStream.Dispose();
}
private class RunCodeSchema
{
public string code { get; set; } = string.Empty;
}
public Task<string> RunCodeWrapper(string arguments)
{
var schema = JsonSerializer.Deserialize<RunCodeSchema>(
arguments,
new JsonSerializerOptions
{
PropertyNamingPolicy = JsonNamingPolicy.CamelCase,
});
return RunCode(schema!.code);
}
public FunctionDefinition RunCodeFunction
{
get => new FunctionDefinition
{
Name = @"RunCode",
Description = """
Run existing dotnet code from message. Don't modify the code, run it as is.
""",
Parameters = BinaryData.FromObjectAsJson(new
{
Type = "object",
Properties = new
{
code = new
{
Type = @"string",
Description = @"code.",
},
},
Required = new[]
{
"code",
},
},
new JsonSerializerOptions
{
PropertyNamingPolicy = JsonNamingPolicy.CamelCase,
})
};
}
private class InstallNugetPackagesSchema
{
public string[] nugetPackages { get; set; } = Array.Empty<string>();
}
public Task<string> InstallNugetPackagesWrapper(string arguments)
{
var schema = JsonSerializer.Deserialize<InstallNugetPackagesSchema>(
arguments,
new JsonSerializerOptions
{
PropertyNamingPolicy = JsonNamingPolicy.CamelCase,
});
return InstallNugetPackages(schema!.nugetPackages);
}
public FunctionDefinition InstallNugetPackagesFunction
{
get => new FunctionDefinition
{
Name = @"InstallNugetPackages",
Description = """
Install nuget packages.
""",
Parameters = BinaryData.FromObjectAsJson(new
{
Type = "object",
Properties = new
{
nugetPackages = new
{
Type = @"array",
Items = new
{
Type = @"string",
},
Description = @"nuget package to install.",
},
},
Required = new[]
{
"nugetPackages",
},
},
new JsonSerializerOptions
{
PropertyNamingPolicy = JsonNamingPolicy.CamelCase,
})
};
}
public void Dispose()
{
this._interactiveService?.Dispose();
}
}
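
A sketch of how these wrappers and `FunctionDefinition`s are typically paired into a function map that `FunctionCallMiddleware` can consume (the `service` variable is assumed to be a started `InteractiveService`, and the notebook path is illustrative):

```csharp
// Sketch: pairing the wrappers with their FunctionDefinitions.
var function = new DotnetInteractiveFunction(service, notebookPath: "./notebook.ipynb");
var functionMap = new Dictionary<string, Func<string, Task<string>>>
{
    [function.RunCodeFunction.Name] = function.RunCodeWrapper,
    [function.InstallNugetPackagesFunction.Name] = function.InstallNugetPackagesWrapper,
};
```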

@ -0,0 +1,83 @@
// Copyright (c) Microsoft Corporation. All rights reserved.
// AgentExtension.cs
using System.Text;
namespace AutoGen.DotnetInteractive;
public static class AgentExtension
{
/// <summary>
/// Register an auto-reply hook that runs dotnet code blocks from a message.
/// This hook first detects whether there is any dotnet code block (e.g. between ```csharp and ```) in the most recent message.
/// If there is any, it runs the code block and sends the result back as the reply.
/// </summary>
/// <param name="agent">agent</param>
/// <param name="interactiveService">interactive service</param>
/// <param name="codeBlockPrefix">code block prefix</param>
/// <param name="codeBlockSuffix">code block suffix</param>
/// <param name="maximumOutputToKeep">maximum output to keep</param>
/// <example>
/// <![CDATA[
/// [!code-csharp[Example04_Dynamic_GroupChat_Coding_Task](~/../sample/AutoGen.BasicSamples/Example04_Dynamic_GroupChat_Coding_Task.cs)]
/// ]]>
/// </example>
public static IAgent RegisterDotnetCodeBlockExectionHook(
this IAgent agent,
InteractiveService interactiveService,
string codeBlockPrefix = "```csharp",
string codeBlockSuffix = "```",
int maximumOutputToKeep = 500)
{
return agent.RegisterReply(async (msgs, ct) =>
{
var lastMessage = msgs.LastOrDefault();
if (lastMessage == null || lastMessage.GetContent() is null)
{
return null;
}
// retrieve all code blocks from last message
var codeBlocks = lastMessage.GetContent()!.Split(new[] { codeBlockPrefix }, StringSplitOptions.RemoveEmptyEntries);
if (codeBlocks.Length <= 0)
{
return null;
}
// run code blocks
var result = new StringBuilder();
var i = 0;
result.AppendLine(@$"// [DOTNET_CODE_BLOCK_EXECUTION]");
foreach (var codeBlock in codeBlocks)
{
var codeBlockIndex = codeBlock.IndexOf(codeBlockSuffix);
if (codeBlockIndex == -1)
{
continue;
}
// remove code block suffix
var code = codeBlock.Substring(0, codeBlockIndex).Trim();
if (code.Length == 0)
{
continue;
}
var codeResult = await interactiveService.SubmitCSharpCodeAsync(code, ct);
if (codeResult != null)
{
result.AppendLine(@$"### Executing result for code block {i++}");
result.AppendLine(codeResult);
result.AppendLine("### End of executing result ###");
}
}
if (result.Length <= maximumOutputToKeep)
{
maximumOutputToKeep = result.Length;
}
return new TextMessage(Role.Assistant, result.ToString().Substring(0, maximumOutputToKeep), from: agent.Name);
});
}
}
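
A usage sketch for the hook above (`workDir` and the `assistantAgent` variable are illustrative assumptions):

```csharp
// Sketch: attaching the dotnet code block execution hook to an agent.
var service = new InteractiveService(installingDirectory: workDir);
await service.StartAsync(workingDirectory: workDir);
var runner = assistantAgent.RegisterDotnetCodeBlockExectionHook(interactiveService: service);
// any ```csharp ...``` block in the most recent message is now executed
// and its output is returned as the reply
```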

@ -0,0 +1,4 @@
// Copyright (c) Microsoft Corporation. All rights reserved.
// GlobalUsing.cs
global using AutoGen.Core;

@ -0,0 +1,261 @@
// Copyright (c) Microsoft Corporation. All rights reserved.
// InteractiveService.cs
using System.Diagnostics;
using System.Reactive.Linq;
using System.Reflection;
using Microsoft.DotNet.Interactive;
using Microsoft.DotNet.Interactive.App.Connection;
using Microsoft.DotNet.Interactive.Commands;
using Microsoft.DotNet.Interactive.Connection;
using Microsoft.DotNet.Interactive.Events;
using Microsoft.DotNet.Interactive.Utility;
namespace AutoGen.DotnetInteractive;
public class InteractiveService : IDisposable
{
private Kernel? kernel = null;
private Process? process = null;
private bool disposedValue;
private const string DotnetInteractiveToolNotInstallMessage = "Cannot find a tool in the manifest file that has a command named 'dotnet-interactive'.";
//private readonly ProcessJobTracker jobTracker = new ProcessJobTracker();
private string installingDirectory;
public event EventHandler<DisplayEvent>? DisplayEvent;
public event EventHandler<string>? Output;
public event EventHandler<CommandFailed>? CommandFailed;
public event EventHandler<HoverTextProduced>? HoverTextProduced;
/// <summary>
/// Create an instance of InteractiveService
/// </summary>
/// <param name="installingDirectory">dotnet interactive installing directory</param>
public InteractiveService(string installingDirectory)
{
this.installingDirectory = installingDirectory;
}
public async Task<bool> StartAsync(string workingDirectory, CancellationToken ct = default)
{
this.kernel = await this.CreateKernelAsync(workingDirectory, ct);
return true;
}
public async Task<string?> SubmitCommandAsync(KernelCommand cmd, CancellationToken ct)
{
if (this.kernel == null)
{
throw new Exception("Kernel is not running");
}
try
{
var res = await this.kernel.SendAndThrowOnCommandFailedAsync(cmd, ct);
var events = res.Events;
var displayValues = events.Where(x => x is StandardErrorValueProduced || x is StandardOutputValueProduced || x is ReturnValueProduced)
.SelectMany(x => (x as DisplayEvent)!.FormattedValues);
if (displayValues is null || !displayValues.Any())
{
return null;
}
return string.Join("\n", displayValues.Select(x => x.Value));
}
catch (Exception ex)
{
return $"Error: {ex.Message}";
}
}
public async Task<string?> SubmitPowershellCodeAsync(string code, CancellationToken ct)
{
var command = new SubmitCode(code, targetKernelName: "pwsh");
return await this.SubmitCommandAsync(command, ct);
}
public async Task<string?> SubmitCSharpCodeAsync(string code, CancellationToken ct)
{
var command = new SubmitCode(code, targetKernelName: "csharp");
return await this.SubmitCommandAsync(command, ct);
}
private async Task<Kernel> CreateKernelAsync(string workingDirectory, CancellationToken ct = default)
{
try
{
var url = KernelHost.CreateHostUriForCurrentProcessId();
var compositeKernel = new CompositeKernel("cbcomposite");
var cmd = new string[]
{
"dotnet",
"tool",
"run",
"dotnet-interactive",
$"[cb-{Process.GetCurrentProcess().Id}]",
"stdio",
//"--default-kernel",
//"csharp",
"--working-dir",
$@"""{workingDirectory}""",
};
var connector = new StdIoKernelConnector(
cmd,
"root-proxy",
url,
new DirectoryInfo(workingDirectory));
// Start the dotnet-interactive tool and get a proxy for the root composite kernel therein.
using var rootProxyKernel = await connector.CreateRootProxyKernelAsync().ConfigureAwait(false);
// Get proxies for each subkernel present inside the dotnet-interactive tool.
var requestKernelInfoCommand = new RequestKernelInfo(rootProxyKernel.KernelInfo.RemoteUri);
var result =
await rootProxyKernel.SendAsync(
requestKernelInfoCommand,
ct).ConfigureAwait(false);
var subKernels = result.Events.OfType<KernelInfoProduced>();
foreach (var kernelInfoProduced in subKernels)
{
var kernelInfo = kernelInfoProduced.KernelInfo;
if (kernelInfo is not null && !kernelInfo.IsProxy && !kernelInfo.IsComposite)
{
var proxyKernel = await connector.CreateProxyKernelAsync(kernelInfo).ConfigureAwait(false);
proxyKernel.SetUpValueSharingIfSupported();
compositeKernel.Add(proxyKernel);
}
}
//compositeKernel.DefaultKernelName = "csharp";
compositeKernel.Add(rootProxyKernel);
compositeKernel.KernelEvents.Subscribe(this.OnKernelDiagnosticEventReceived);
return compositeKernel;
}
catch (CommandLineInvocationException ex) when (ex.Message.Contains("Cannot find a tool in the manifest file that has a command named 'dotnet-interactive'"))
{
var success = this.RestoreDotnetInteractive();
if (success)
{
return await this.CreateKernelAsync(workingDirectory, ct);
}
throw;
}
}
private void OnKernelDiagnosticEventReceived(KernelEvent ke)
{
this.WriteLine("Receive data from kernel");
this.WriteLine(KernelEventEnvelope.Serialize(ke));
switch (ke)
{
case DisplayEvent de:
this.DisplayEvent?.Invoke(this, de);
break;
case CommandFailed cf:
this.CommandFailed?.Invoke(this, cf);
break;
case HoverTextProduced htp:
this.HoverTextProduced?.Invoke(this, htp);
break;
}
}
private void WriteLine(string data)
{
this.Output?.Invoke(this, data);
}
private bool RestoreDotnetInteractive()
{
this.WriteLine("Restore dotnet interactive tool");
// write RestoreInteractive.config from embedded resource to this.installingDirectory
var assembly = Assembly.GetAssembly(typeof(InteractiveService))!;
var resourceName = "AutoGen.DotnetInteractive.RestoreInteractive.config";
using (var stream = assembly.GetManifestResourceStream(resourceName)!)
using (var fileStream = File.Create(Path.Combine(this.installingDirectory, "RestoreInteractive.config")))
{
stream.CopyTo(fileStream);
}
// write dotnet-tools.json from embedded resource to this.installingDirectory
resourceName = "AutoGen.DotnetInteractive.dotnet-tools.json";
using (var stream2 = assembly.GetManifestResourceStream(resourceName)!)
using (var fileStream2 = File.Create(Path.Combine(this.installingDirectory, "dotnet-tools.json")))
{
stream2.CopyTo(fileStream2);
}
var psi = new ProcessStartInfo
{
FileName = "dotnet",
Arguments = "tool restore --configfile RestoreInteractive.config",
WorkingDirectory = this.installingDirectory,
RedirectStandardInput = true,
RedirectStandardOutput = true,
RedirectStandardError = true,
UseShellExecute = false,
CreateNoWindow = true,
};
using var process = new Process { StartInfo = psi };
process.OutputDataReceived += this.PrintProcessOutput;
process.ErrorDataReceived += this.PrintProcessOutput;
process.Start();
process.BeginErrorReadLine();
process.BeginOutputReadLine();
process.WaitForExit();
return process.ExitCode == 0;
}
    private void PrintProcessOutput(object sender, DataReceivedEventArgs e)
    {
        if (!string.IsNullOrEmpty(e.Data))
        {
            this.WriteLine(e.Data);
        }
    }

    public bool IsRunning()
    {
        return this.kernel != null;
    }

    protected virtual void Dispose(bool disposing)
    {
        if (!disposedValue)
        {
            if (disposing)
            {
                this.kernel?.Dispose();

                if (this.process != null)
                {
                    this.process.Kill();
                    this.process.Dispose();
                }
            }

            disposedValue = true;
        }
    }

    public void Dispose()
    {
        // Do not change this code. Put cleanup code in the 'Dispose(bool disposing)' method.
        Dispose(disposing: true);
        GC.SuppressFinalize(this);
    }
}

<?xml version="1.0" encoding="utf-8"?>
<configuration>
  <packageSources>
    <clear />
    <add key="nuget.org" value="https://api.nuget.org/v3/index.json" />
  </packageSources>
  <disabledPackageSources />
</configuration>

// Copyright (c) Microsoft Corporation. All rights reserved.
// Utils.cs

using System.Collections;
using System.Collections.Immutable;
using Microsoft.DotNet.Interactive;
using Microsoft.DotNet.Interactive.Commands;
using Microsoft.DotNet.Interactive.Connection;
using Microsoft.DotNet.Interactive.Events;

public static class ObservableExtensions
{
    public static SubscribedList<T> ToSubscribedList<T>(this IObservable<T> source)
    {
        return new SubscribedList<T>(source);
    }
}

public static class KernelExtensions
{
    internal static void SetUpValueSharingIfSupported(this ProxyKernel proxyKernel)
    {
        var supportedCommands = proxyKernel.KernelInfo.SupportedKernelCommands;
        if (supportedCommands.Any(d => d.Name == nameof(RequestValue)) &&
            supportedCommands.Any(d => d.Name == nameof(SendValue)))
        {
            proxyKernel.UseValueSharing();
        }
    }

    internal static async Task<KernelCommandResult> SendAndThrowOnCommandFailedAsync(
        this Kernel kernel,
        KernelCommand command,
        CancellationToken cancellationToken)
    {
        var result = await kernel.SendAsync(command, cancellationToken);
        result.ThrowOnCommandFailed();
        return result;
    }

    private static void ThrowOnCommandFailed(this KernelCommandResult result)
    {
        var failedEvents = result.Events.OfType<CommandFailed>();
        if (!failedEvents.Any())
        {
            return;
        }

        if (failedEvents.Skip(1).Any())
        {
            var innerExceptions = failedEvents.Select(f => f.GetException());
            throw new AggregateException(innerExceptions);
        }
        else
        {
            throw failedEvents.Single().GetException();
        }
    }

    private static Exception GetException(this CommandFailed commandFailedEvent)
        => new Exception(commandFailedEvent.Message);
}

public class SubscribedList<T> : IReadOnlyList<T>, IDisposable
{
    private ImmutableArray<T> _list = ImmutableArray<T>.Empty;
    private readonly IDisposable _subscription;

    public SubscribedList(IObservable<T> source)
    {
        _subscription = source.Subscribe(x => _list = _list.Add(x));
    }

    public IEnumerator<T> GetEnumerator()
    {
        return ((IEnumerable<T>)_list).GetEnumerator();
    }

    IEnumerator IEnumerable.GetEnumerator() => GetEnumerator();

    public int Count => _list.Length;

    public T this[int index] => _list[index];

    public void Dispose() => _subscription.Dispose();
}

{
  "version": 1,
  "isRoot": true,
  "tools": {
    "Microsoft.dotnet-interactive": {
      "version": "1.0.431302",
      "commands": [
        "dotnet-interactive"
      ]
    }
  }
}

<Project Sdk="Microsoft.NET.Sdk">

  <PropertyGroup>
    <TargetFramework>netstandard2.0</TargetFramework>
    <RootNamespace>AutoGen.LMStudio</RootNamespace>
  </PropertyGroup>

  <Import Project="$(RepoRoot)/dotnet/nuget/nuget-package.props" />

  <PropertyGroup>
    <!-- NuGet Package Settings -->
    <Title>AutoGen.LMStudio</Title>
    <Description>
      Provides support for consuming the LM Studio OpenAI-like API service in AutoGen.
    </Description>
  </PropertyGroup>

  <ItemGroup>
    <ProjectReference Include="..\AutoGen.Core\AutoGen.Core.csproj" />
    <ProjectReference Include="..\AutoGen.OpenAI\AutoGen.OpenAI.csproj" />
  </ItemGroup>

</Project>

// Copyright (c) Microsoft Corporation. All rights reserved.
// GlobalUsing.cs
global using AutoGen.Core;

// Copyright (c) Microsoft Corporation. All rights reserved.
// LMStudioAgent.cs

using System;
using System.Collections.Generic;
using System.Threading.Tasks;
using AutoGen.OpenAI;
using Azure.AI.OpenAI;
using Azure.Core;
using Azure.Core.Pipeline;

namespace AutoGen.LMStudio;

/// <summary>
/// An agent that consumes the local server from LM Studio.
/// </summary>
/// <example>
/// [!code-csharp[LMStudioAgent](../../sample/AutoGen.BasicSamples/Example08_LMStudio.cs?name=lmstudio_example_1)]
/// </example>
public class LMStudioAgent : IAgent
{
    private readonly GPTAgent innerAgent;

    public LMStudioAgent(
        string name,
        LMStudioConfig config,
        string systemMessage = "You are a helpful AI assistant",
        float temperature = 0.7f,
        int maxTokens = 1024,
        IEnumerable<FunctionDefinition>? functions = null,
        IDictionary<string, Func<string, Task<string>>>? functionMap = null)
    {
        var client = ConfigOpenAIClientForLMStudio(config);
        innerAgent = new GPTAgent(
            name: name,
            systemMessage: systemMessage,
            openAIClient: client,
            modelName: "llm", // the model name doesn't matter for LM Studio
            temperature: temperature,
            maxTokens: maxTokens,
            functions: functions,
            functionMap: functionMap);
    }

    public string Name => innerAgent.Name;

    public Task<IMessage> GenerateReplyAsync(
        IEnumerable<IMessage> messages,
        GenerateReplyOptions? options = null,
        System.Threading.CancellationToken cancellationToken = default)
    {
        return innerAgent.GenerateReplyAsync(messages, options, cancellationToken);
    }

    private OpenAIClient ConfigOpenAIClientForLMStudio(LMStudioConfig config)
    {
        // create the uri from host and port
        var uri = config.Uri;
        var accessToken = new AccessToken(string.Empty, DateTimeOffset.Now.AddDays(180));
        var tokenCredential = DelegatedTokenCredential.Create((_, _) => accessToken);
        var openAIClient = new OpenAIClient(uri, tokenCredential);

        // build a pipeline with no policies to remove the authentication header
        var pipeline = HttpPipelineBuilder.Build(
            new OpenAIClientOptions(OpenAIClientOptions.ServiceVersion.V2022_12_01),
            Array.Empty<HttpPipelinePolicy>(),
            Array.Empty<HttpPipelinePolicy>(),
            new ResponseClassifier());

        // use reflection to override the private _pipeline field
        var field = typeof(OpenAIClient).GetField("_pipeline", System.Reflection.BindingFlags.NonPublic | System.Reflection.BindingFlags.Instance);
        field!.SetValue(openAIClient, pipeline);

        // use reflection to set the private _isConfiguredForAzureOpenAI field to false
        var isConfiguredForAzureOpenAIField = typeof(OpenAIClient).GetField("_isConfiguredForAzureOpenAI", System.Reflection.BindingFlags.NonPublic | System.Reflection.BindingFlags.Instance);
        isConfiguredForAzureOpenAIField!.SetValue(openAIClient, false);

        return openAIClient;
    }
}

// Copyright (c) Microsoft Corporation. All rights reserved.
// LMStudioConfig.cs

using System;

/// <summary>
/// Adds support for consuming the OpenAI-like API from LM Studio.
/// </summary>
public class LMStudioConfig : ILLMConfig
{
    public LMStudioConfig(string host, int port, int version = 1)
    {
        this.Host = host;
        this.Port = port;
        this.Version = version;
    }

    public string Host { get; }

    public int Port { get; }

    public int Version { get; }

    public Uri Uri => new Uri($"http://{Host}:{Port}/v{Version}");
}

## AutoGen.LMStudio
This package provides support for consuming the OpenAI-like API exposed by an LM Studio local server.

## Installation

To use `AutoGen.LMStudio`, add the following package reference to your `.csproj` file:
```xml
<ItemGroup>
<PackageReference Include="AutoGen.LMStudio" Version="AUTOGEN_VERSION" />
</ItemGroup>
```
## Usage
```csharp
using AutoGen.LMStudio;
var localServerEndpoint = "localhost";
var port = 5000;
var lmStudioConfig = new LMStudioConfig(localServerEndpoint, port);
var agent = new LMStudioAgent(
    name: "agent",
    systemMessage: "You are an agent that helps users complete tasks.",
    config: lmStudioConfig)
    .RegisterPrintMessage(); // register a hook to print messages nicely to the console

await agent.SendAsync("Can you write a piece of C# code to calculate the 100th Fibonacci number?");
```
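By default, `LMStudioConfig` builds the endpoint as `http://{host}:{port}/v{version}` with `version` defaulting to `1`. A minimal sketch of pointing the config at a non-default host and port (port `1234` here is only an illustrative value; use whatever port your LM Studio server is listening on):

```csharp
using AutoGen.LMStudio;

// The endpoint is composed from host, port, and API version:
// http://{host}:{port}/v{version}, so this yields http://localhost:1234/v1.
var config = new LMStudioConfig(host: "localhost", port: 1234, version: 1);
Console.WriteLine(config.Uri); // http://localhost:1234/v1
```

The same `config` instance can then be passed to the `LMStudioAgent` constructor shown above.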
## Update history

### Update on 0.0.7 (2024-02-11)
- Add `LMStudioAgent` to support consuming the OpenAI-like API from an LM Studio local server.

// Copyright (c) Microsoft Corporation. All rights reserved.
// MistralClientAgent.cs

using System;
using System.Collections.Generic;
using System.Linq;
using System.Threading;
using System.Threading.Tasks;
using AutoGen.Core;
using AutoGen.Mistral.Extension;

namespace AutoGen.Mistral;

/// <summary>
/// Mistral client agent.
///
/// <para>This agent supports the following input message types:</para>
/// <list type="bullet">
/// <para><see cref="MessageEnvelope{T}"/> where T is <see cref="ChatMessage"/></para>
/// </list>
///
/// <para>This agent returns the following message types:</para>
/// <list type="bullet">
/// <para><see cref="MessageEnvelope{T}"/> where T is <see cref="ChatCompletionResponse"/></para>
/// </list>
///
/// You can register this agent with <see cref="MistralAgentExtension.RegisterMessageConnector(AutoGen.Mistral.MistralClientAgent, AutoGen.Mistral.MistralChatMessageConnector?)"/>
/// to support more AutoGen message types.
/// </summary>
public class MistralClientAgent : IStreamingAgent
{
    private readonly MistralClient _client;
    private readonly string _systemMessage;
    private readonly string _model;
    private readonly int? _randomSeed;
    private readonly bool _jsonOutput = false;
    private readonly ToolChoiceEnum? _toolChoice;

    /// <summary>
    /// Create a new instance of <see cref="MistralClientAgent"/>.
    /// </summary>
    /// <param name="client"><see cref="MistralClient"/></param>
    /// <param name="name">the name of this agent</param>
    /// <param name="model">the mistral model id.</param>
    /// <param name="systemMessage">system message.</param>
    /// <param name="randomSeed">the seed used to generate output.</param>
    /// <param name="toolChoice">tool choice strategy.</param>
    /// <param name="jsonOutput">use json output.</param>
    public MistralClientAgent(
        MistralClient client,
        string name,
        string model,
        string systemMessage = "You are a helpful AI assistant",
        int? randomSeed = null,
        ToolChoiceEnum? toolChoice = null,
        bool jsonOutput = false)
    {
        _client = client;
        Name = name;
        _systemMessage = systemMessage;
        _model = model;
        _randomSeed = randomSeed;
        _jsonOutput = jsonOutput;
        _toolChoice = toolChoice;
    }

    public string Name { get; }

    public async Task<IMessage> GenerateReplyAsync(
        IEnumerable<IMessage> messages,
        GenerateReplyOptions? options = null,
        CancellationToken cancellationToken = default)
    {
        var request = BuildChatRequest(messages, options);
        var response = await _client.CreateChatCompletionsAsync(request);

        return new MessageEnvelope<ChatCompletionResponse>(response, from: this.Name);
    }

    public async Task<IAsyncEnumerable<IStreamingMessage>> GenerateStreamingReplyAsync(
        IEnumerable<IMessage> messages,
        GenerateReplyOptions? options = null,
        CancellationToken cancellationToken = default)
    {
        var request = BuildChatRequest(messages, options);
        var response = _client.StreamingChatCompletionsAsync(request);

        return ProcessMessage(response);
    }

    private async IAsyncEnumerable<IMessage> ProcessMessage(IAsyncEnumerable<ChatCompletionResponse> response)
    {
        await foreach (var content in response)
        {
            yield return new MessageEnvelope<ChatCompletionResponse>(content, from: this.Name);
        }
    }

    private ChatCompletionRequest BuildChatRequest(IEnumerable<IMessage> messages, GenerateReplyOptions? options)
    {
        var chatHistory = BuildChatHistory(messages);
        var chatRequest = new ChatCompletionRequest(model: _model, messages: chatHistory.ToList(), temperature: options?.Temperature, randomSeed: _randomSeed)
        {
            MaxTokens = options?.MaxToken,
            ResponseFormat = _jsonOutput ? new ResponseFormat() { ResponseFormatType = "json_object" } : null,
        };

        if (options?.Functions != null)
        {
            chatRequest.Tools = options.Functions.Select(f => new FunctionTool(f.ToMistralFunctionDefinition())).ToList();
            chatRequest.ToolChoice = _toolChoice ?? ToolChoiceEnum.Auto;
        }

        return chatRequest;
    }

    private IEnumerable<ChatMessage> BuildChatHistory(IEnumerable<IMessage> messages)
    {
        var history = messages.Select(m => m switch
        {
            IMessage<ChatMessage> chatMessage => chatMessage.Content,
            _ => throw new ArgumentException("Invalid message type"),
        });

        // if there's no system message in the history, add one to the beginning
        if (!history.Any(m => m.Role == ChatMessage.RoleEnum.System))
        {
            history = new[] { new ChatMessage(ChatMessage.RoleEnum.System, _systemMessage) }.Concat(history);
        }

        return history;
    }
}

<Project Sdk="Microsoft.NET.Sdk">

  <PropertyGroup>
    <TargetFramework>netstandard2.0</TargetFramework>
    <RootNamespace>AutoGen.Mistral</RootNamespace>
  </PropertyGroup>

  <Import Project="$(RepoRoot)/dotnet/nuget/nuget-package.props" />

  <PropertyGroup>
    <!-- NuGet Package Settings -->
    <Title>AutoGen.Mistral</Title>
    <Description>
      Provides support for consuming Mistral models in AutoGen.
    </Description>
  </PropertyGroup>

  <ItemGroup>
    <PackageReference Include="System.Memory.Data" Version="8.0.0" />
  </ItemGroup>

  <ItemGroup>
    <ProjectReference Include="..\AutoGen.Core\AutoGen.Core.csproj" />
  </ItemGroup>

</Project>
