AutoGen for .NET


[!NOTE] Nightly builds are also available.

First, follow the installation guide to install the AutoGen packages.

Then you can start with the following code snippet to create a conversable agent and chat with it.

using AutoGen;
using AutoGen.OpenAI;

var openAIKey = Environment.GetEnvironmentVariable("OPENAI_API_KEY") ?? throw new Exception("Please set OPENAI_API_KEY environment variable.");
var gpt35Config = new OpenAIConfig(openAIKey, "gpt-3.5-turbo");

var assistantAgent = new AssistantAgent(
    name: "assistant",
    systemMessage: "You are an assistant that helps the user with their tasks.",
    llmConfig: new ConversableAgentConfig
    {
        Temperature = 0,
        ConfigList = [gpt35Config],
    })
    .RegisterPrintMessage(); // register a hook to print messages nicely to the console

// set human input mode to ALWAYS so that the user always provides input
var userProxyAgent = new UserProxyAgent(
    name: "user",
    humanInputMode: ConversableAgent.HumanInputMode.ALWAYS)
    .RegisterPrintMessage();

// start the conversation
await userProxyAgent.InitiateChatAsync(
    receiver: assistantAgent,
    message: "Hey assistant, please do me a favor.",
    maxRound: 10);

Samples

You can find more examples under the sample project.

Functionality

  • ConversableAgent

  • Agent communication

    • Two-agent chat
    • Group chat
  • Enhanced LLM Inferences

  • Exclusive for dotnet

    • Source generator for type-safe function definition generation
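
To give a sense of the source generator, the usual pattern is to annotate an ordinary method with a [Function] attribute inside a partial class, and AutoGen.SourceGenerator emits the matching function contract at build time. The minimal sketch below assumes the attribute lives in AutoGen.Core; the names of the generated members (such as a GetWeatherFunctionContract property) are assumptions and may vary between versions.

using System.Threading.Tasks;
using AutoGen.Core;

// The class must be partial so that AutoGen.SourceGenerator can emit the
// companion function-contract code next to it at compile time.
public partial class WeatherFunctions
{
    /// <summary>
    /// Get the current weather for a city.
    /// </summary>
    /// <param name="city">name of the city</param>
    [Function]
    public Task<string> GetWeather(string city)
    {
        // A real implementation would call a weather API here.
        return Task.FromResult($"The weather in {city} is sunny.");
    }
}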

Update log

Update on 0.0.11 (2024-03-26)
  • Add a link to the Discord channel in the NuGet package's readme.md
  • Documentation improvements
Update on 0.0.10 (2024-03-12)
  • Rename Workflow to Graph
  • Rename AddInitializeMessage to SendIntroduction
  • Rename SequentialGroupChat to RoundRobinGroupChat
Update on 0.0.9 (2024-03-02)
  • Refactor @AutoGen.Message and introduce TextMessage, ImageMessage, MultiModalMessage, and so on. PR #1676
  • Add AutoGen.SemanticKernel to support seamless integration with Semantic Kernel
  • Move the agent contract abstraction to the AutoGen.Core package. AutoGen.Core provides the abstractions for message types, agents, and group chat, and does not depend on Azure.AI.OpenAI or Semantic Kernel. This is useful when you only want AutoGen's abstractions and want to avoid introducing other dependencies (see the sketch below).
  • Move GPTAgent, OpenAIChatAgent, and all OpenAI-related dependencies to AutoGen.OpenAI
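
As a minimal sketch of what depending only on AutoGen.Core looks like, the toy agent below implements the IAgent abstraction without referencing any OpenAI or Semantic Kernel packages. The IAgent and TextMessage signatures and the GetContent() helper are written from memory and may differ slightly between versions.

using System.Collections.Generic;
using System.Linq;
using System.Threading;
using System.Threading.Tasks;
using AutoGen.Core;

// A toy agent that echoes the last message back. It only references
// AutoGen.Core, so no OpenAI or Semantic Kernel dependencies are pulled in.
public class EchoAgent : IAgent
{
    public EchoAgent(string name) => Name = name;

    public string Name { get; }

    public Task<IMessage> GenerateReplyAsync(
        IEnumerable<IMessage> messages,
        GenerateReplyOptions? options = null,
        CancellationToken cancellationToken = default)
    {
        // GetContent() extracts the text of a message; treat the helper name
        // as an assumption if your version exposes a different one.
        var lastText = messages.LastOrDefault()?.GetContent() ?? string.Empty;
        IMessage reply = new TextMessage(Role.Assistant, $"echo: {lastText}", from: Name);
        return Task.FromResult(reply);
    }
}
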
Update on 0.0.8 (2024-02-28)
  • Fix #1804
  • Streaming support for IAgent #1656
  • Streaming support for middleware via MiddlewareStreamingAgent #1656
  • Graph chat support with conditional transition workflow #1761
  • AutoGen.SourceGenerator: Generate FunctionContract from FunctionAttribute #1736
Update on 0.0.7 (2024-02-11)
  • Add AutoGen.LMStudio to support consuming an OpenAI-like API from a local LM Studio server
Update on 0.0.6 (2024-01-23)
  • Add MiddlewareAgent (see the sketch below)
  • Use MiddlewareAgent to implement existing agent hooks (RegisterPreProcess, RegisterPostProcess, RegisterReply)
  • Remove AutoReplyAgent, PreProcessAgent, and PostProcessAgent, as they are replaced by MiddlewareAgent
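
A rough sketch of the MiddlewareAgent idea: wrap an existing agent and run a delegate around every reply so it can be logged or modified. The MiddlewareAgent constructor and Use delegate shape shown here are recalled from memory and may differ between versions; the wrapped EchoAgent comes from the sketch above.

using System;
using System.Linq;
using System.Threading.Tasks;
using AutoGen.Core;

// Wrap another agent so that every call passes through the middleware delegate.
var middlewareAgent = new MiddlewareAgent(innerAgent: new EchoAgent("echo"));
middlewareAgent.Use(async (messages, options, innerAgent, cancellationToken) =>
{
    // Log the call, then delegate to the wrapped agent unchanged.
    Console.WriteLine($"Forwarding {messages.Count()} message(s) to {innerAgent.Name}");
    return await innerAgent.GenerateReplyAsync(messages, options, cancellationToken);
});
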
Update on 0.0.5
  • Simplify the IAgent interface by removing the ChatLLM property
  • Add GenerateReplyOptions to IAgent.GenerateReplyAsync, which allows users to specify or override options when generating a reply
Update on 0.0.4
  • Remove the direct dependency on Semantic Kernel
  • Add the IChatLLM type as a connector to LLMs
Update on 0.0.3
  • In AutoGen.SourceGenerator, rename FunctionAttribution to FunctionAttribute
  • In AutoGen, refactor ConversationAgent, UserProxyAgent, and AssistantAgent
Update on 0.0.2
  • Update Azure.AI.OpenAI to 1.0.0-beta.12
  • Update Semantic Kernel to 1.0.1