autogen/dotnet

Latest commit: 7f635b4309 by Xiaoyun Zhang (2024-05-28 21:55:40 +00:00)
[.Net] Update website for AutoGen.SemanticKernel and AutoGen.Ollama (#2814)
  • update sk documents
  • add ollama doc
| Name | Last commit | Date |
| --- | --- | --- |
| .config | Bring Dotnet AutoGen (#924) | 2024-04-26 16:21:46 +00:00 |
| .tools | [Dotnet] Add dotnet build workflow (#946) | 2023-12-30 02:23:12 +00:00 |
| eng | [.Net] Fix #2660 and add tests for AutoGen.DotnetInteractive (#2676) | 2024-05-14 03:40:26 +00:00 |
| nuget | [.Net] fix code ql for dotnet build && update trigger for dotnet workflow (#2529) | 2024-04-29 20:27:57 +00:00 |
| sample | [.Net] Update website for AutoGen.SemanticKernel and AutoGen.Ollama (#2814) | 2024-05-28 21:55:40 +00:00 |
| src | [.Net] Update website for AutoGen.SemanticKernel and AutoGen.Ollama (#2814) | 2024-05-28 21:55:40 +00:00 |
| test | Introduce AnthropicClient and AnthropicClientAgent (#2769) | 2024-05-24 16:37:16 +00:00 |
| website | [.Net] Update website for AutoGen.SemanticKernel and AutoGen.Ollama (#2814) | 2024-05-28 21:55:40 +00:00 |
| .editorconfig | [.Net] rename Autogen.Ollama to AutoGen.Ollama and add more test cases to AutoGen.Ollama (#2772) | 2024-05-23 19:15:25 +00:00 |
| .gitignore | Bring Dotnet AutoGen (#924) | 2024-04-26 16:21:46 +00:00 |
| AutoGen.sln | Remove duplicate project declared in AutoGen.sln (#2789) | 2024-05-24 20:17:41 +00:00 |
| Directory.Build.props | [.Net] fix #2609 (#2618) | 2024-05-07 21:37:46 +00:00 |
| NuGet.config | [.Net] Fix #2660 and add tests for AutoGen.DotnetInteractive (#2676) | 2024-05-14 03:40:26 +00:00 |
| README.md | Bring Dotnet AutoGen (#924) | 2024-04-26 16:21:46 +00:00 |
| global.json | [.Net] fix code ql for dotnet build && update trigger for dotnet workflow (#2529) | 2024-04-29 20:27:57 +00:00 |

README.md

AutoGen for .NET

[dotnet-ci build status badge] [NuGet version badge]

[!NOTE] Nightly build is available at:

First, follow the installation guide to install the AutoGen packages.

Then you can start with the following code snippet to create a conversable agent and chat with it.

```csharp
using AutoGen;
using AutoGen.OpenAI;

var openAIKey = Environment.GetEnvironmentVariable("OPENAI_API_KEY") ?? throw new Exception("Please set OPENAI_API_KEY environment variable.");
var gpt35Config = new OpenAIConfig(openAIKey, "gpt-3.5-turbo");

var assistantAgent = new AssistantAgent(
    name: "assistant",
    systemMessage: "You are an assistant that help user to do some tasks.",
    llmConfig: new ConversableAgentConfig
    {
        Temperature = 0,
        ConfigList = [gpt35Config],
    })
    .RegisterPrintMessage(); // register a hook to print messages nicely to the console

// set human input mode to ALWAYS so that the user always provides input
var userProxyAgent = new UserProxyAgent(
    name: "user",
    humanInputMode: ConversableAgent.HumanInputMode.ALWAYS)
    .RegisterPrintMessage();

// start the conversation
await userProxyAgent.InitiateChatAsync(
    receiver: assistantAgent,
    message: "Hey assistant, please do me a favor.",
    maxRound: 10);
```
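
The RegisterPrintMessage call above is one instance of AutoGen's middleware mechanism (see MiddlewareAgent in the update log below). As a hedged illustration, the sketch below registers a custom middleware that logs what goes into and comes out of the agent; RegisterMiddleware, its delegate shape, and GenerateReplyAsync are written as understood from AutoGen.Core, so treat the exact signatures as assumptions rather than a definitive reference.

```csharp
// Minimal custom-middleware sketch (assumes `using System.Linq;` and `using AutoGen.Core;`).
// It wraps assistantAgent from the snippet above: the delegate receives the incoming
// messages, the reply options, and the inner agent, and must return the reply message.
var loggingAgent = assistantAgent.RegisterMiddleware(async (messages, options, innerAgent, ct) =>
{
    Console.WriteLine($"[before] sending {messages.Count()} message(s) to {innerAgent.Name}");
    var reply = await innerAgent.GenerateReplyAsync(messages, options, ct);
    Console.WriteLine($"[after] received reply from {innerAgent.Name}");
    return reply;
});
```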

Samples

You can find more examples under the sample project.

Functionality

  • ConversableAgent

  • Agent communication

    • Two-agent chat
    • Group chat (a group chat sketch follows this list)
  • Enhanced LLM Inferences

  • Exclusive for dotnet

    • Source generator for type-safe function definition generation (a source generator sketch follows this list)
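
Two-agent chat is exactly what the quick start above demonstrates. For group chat, the sketch below reuses those two agents in a round-robin group chat (RoundRobinGroupChat is the renamed SequentialGroupChat noted in the update log). The constructor and CallAsync parameters are written as understood from AutoGen.Core and should be treated as assumptions, not a definitive API reference.

```csharp
// Hedged round-robin group chat sketch (assumes `using AutoGen.Core;` for
// TextMessage and Role). Members take turns replying, up to 10 rounds.
var groupChat = new RoundRobinGroupChat([userProxyAgent, assistantAgent]);

// Seed the conversation with an initial user message and run the chat.
var chatHistory = await groupChat.CallAsync(
    [new TextMessage(Role.User, "Hey assistant, please do me a favor.", from: userProxyAgent.Name)],
    maxRound: 10);
```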
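
For the dotnet-exclusive source generator, the sketch below shows the typical pattern: annotate a method inside a partial class with [Function] and reference the AutoGen.SourceGenerator package, which generates a type-safe FunctionContract (plus a string-based wrapper) for the method at compile time. The class and method names (WeatherFunctions, GetWeather) are illustrative, and the exact names of the generated members are assumptions based on the generator's conventions.

```csharp
using System.Threading.Tasks;
using AutoGen.Core;

// The class must be partial so the source generator can add the generated
// members (e.g. a GetWeatherFunctionContract property and a GetWeatherWrapper
// method) into the other half of the class.
public partial class WeatherFunctions
{
    /// <summary>
    /// Get the current weather for a city.
    /// </summary>
    /// <param name="city">The city name.</param>
    [Function]
    public async Task<string> GetWeather(string city)
    {
        // Stub for illustration; replace with a real weather lookup.
        return await Task.FromResult($"The weather in {city} is sunny.");
    }
}
```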

Update log

Update on 0.0.11 (2024-03-26)
  • Add a link to the Discord channel in the NuGet package's README.md
  • Document improvements
Update on 0.0.10 (2024-03-12)
  • Rename Workflow to Graph
  • Rename AddInitializeMessage to SendIntroduction
  • Rename SequentialGroupChat to RoundRobinGroupChat
Update on 0.0.9 (2024-03-02)
  • Refactor @AutoGen.Message and introduce TextMessage, ImageMessage, MultiModalMessage, and so on. PR #1676
  • Add AutoGen.SemanticKernel to support seamless integration with Semantic Kernel
  • Move the agent contract abstraction to the AutoGen.Core package. AutoGen.Core provides the abstractions for message types, agents, and group chat, and has no dependency on Azure.AI.OpenAI or Semantic Kernel. This is useful when you only want AutoGen's abstractions and want to avoid introducing other dependencies.
  • Move GPTAgent, OpenAIChatAgent, and all OpenAI-related dependencies to AutoGen.OpenAI
Update on 0.0.8 (2024-02-28)
  • Fix #1804
  • Streaming support for IAgent #1656
  • Streaming support for middleware via MiddlewareStreamingAgent #1656
  • Graph chat support with conditional transition workflow #1761
  • AutoGen.SourceGenerator: Generate FunctionContract from FunctionAttribute #1736
Update on 0.0.7 (2024-02-11)
  • Add AutoGen.LMStudio to support consuming the OpenAI-like API from an LM Studio local server
Update on 0.0.6 (2024-01-23)
  • Add MiddlewareAgent
  • Use MiddlewareAgent to implement existing agent hooks (RegisterPreProcess, RegisterPostProcess, RegisterReply)
  • Remove AutoReplyAgent, PreProcessAgent, PostProcessAgent because they are replaced by MiddlewareAgent
Update on 0.0.5
  • Simplify the IAgent interface by removing the ChatLLM property
  • Add GenerateReplyOptions to IAgent.GenerateReplyAsync, which allows users to specify or override options when generating a reply
Update on 0.0.4
  • Remove the dependency on Semantic Kernel
  • Add the IChatLLM type as a connector to LLMs
Update on 0.0.3
  • In AutoGen.SourceGenerator, rename FunctionAttribution to FunctionAttribute
  • In AutoGen, refactor ConversationAgent, UserProxyAgent, and AssistantAgent
Update on 0.0.2
  • Update Azure.AI.OpenAI to 1.0.0-beta.12
  • Update Semantic Kernel to 1.0.1