tryAGI.OpenAI 3.9.2

.NET CLI:
    dotnet add package tryAGI.OpenAI --version 3.9.2

Package Manager (run within the Package Manager Console in Visual Studio, as it uses the NuGet module's version of Install-Package):
    NuGet\Install-Package tryAGI.OpenAI -Version 3.9.2

PackageReference (for projects that support PackageReference, copy this XML node into the project file to reference the package):
    <PackageReference Include="tryAGI.OpenAI" Version="3.9.2" />

Paket CLI:
    paket add tryAGI.OpenAI --version 3.9.2

Script & Interactive (the #r directive can be used in F# Interactive and Polyglot Notebooks; copy it into the interactive tool or the script source to reference the package):
    #r "nuget: tryAGI.OpenAI, 3.9.2"

Cake:
    // Install tryAGI.OpenAI as a Cake Addin
    #addin nuget:?package=tryAGI.OpenAI&version=3.9.2

    // Install tryAGI.OpenAI as a Cake Tool
    #tool nuget:?package=tryAGI.OpenAI&version=3.9.2

OpenAI


Features 🔥

  • Fully generated C# SDK based on the official OpenAI OpenAPI specification using AutoSDK
  • Same-day updates to support new features
  • Updated and supported automatically if there are no breaking changes
  • Contains a maintained list of constants such as current prices, models, and more
  • Source generator to define functions natively through C# interfaces
  • All modern .NET features: nullability, trimming, NativeAOT, etc.
  • Supports .NET Framework / .NET Standard 2.0
  • Supports all OpenAI API endpoints, including completions, chat, embeddings, images, assistants, and more
  • Regularly tested for compatibility with popular custom providers such as OpenRouter, DeepSeek, Ollama, LM Studio, and many others

Documentation

Examples and documentation can be found here: https://tryagi.github.io/OpenAI/

Usage

using OpenAI;

using var api = new OpenAiApi("API_KEY");

string response = await api.Chat.CreateChatCompletionAsync(
    messages: ["Generate five random words."],
    model: CreateChatCompletionRequestModel.Gpt4oMini);
Console.WriteLine(response); // "apple, banana, cherry, date, elderberry"

// Streaming: each item is the delta content of the first choice.
var enumerable = api.Chat.CreateChatCompletionAsStreamAsync(
    messages: ["Generate five random words."],
    model: CreateChatCompletionRequestModel.Gpt4oMini);

await foreach (string delta in enumerable)
{
    Console.WriteLine(delta);
}

It uses three implicit conversions:

  • from string to ChatCompletionRequestUserMessage: a plain string is always converted to a user message.
  • from ChatCompletionResponseMessage to string: the string always contains the content of the first choice's message.
  • from CreateChatCompletionStreamResponse to string: the string always contains the content of the first delta.

You can still use the full response objects if you need more information; just replace string response with var response.
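
For example, here is a minimal sketch of keeping the full response object instead of relying on the implicit string conversion. Choices.First().Message is the same accessor used in the Tools example below; the Content property name is assumed to follow the OpenAPI schema.

using var api = new OpenAiApi("API_KEY");

// var keeps the full response object instead of converting it to a string.
var response = await api.Chat.CreateChatCompletionAsync(
    messages: ["Generate five random words."],
    model: CreateChatCompletionRequestModel.Gpt4oMini);

// Content property name assumed from the OpenAPI schema.
Console.WriteLine(response.Choices.First().Message.Content);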

Tools

using OpenAI;
using CSharpToJsonSchema;

public enum Unit
{
    Celsius,
    Fahrenheit,
}

public class Weather
{
    public string Location { get; set; } = string.Empty;
    public double Temperature { get; set; }
    public Unit Unit { get; set; }
    public string Description { get; set; } = string.Empty;
}

[GenerateJsonSchema(Strict = true)] // Strict is false by default. Parameters with default values cannot be used in Strict mode.
public interface IWeatherFunctions
{
    [Description("Get the current weather in a given location")]
    public Task<Weather> GetCurrentWeatherAsync(
        [Description("The city and state, e.g. San Francisco, CA")] string location,
        Unit unit,
        CancellationToken cancellationToken = default);
}

public class WeatherService : IWeatherFunctions
{
    public Task<Weather> GetCurrentWeatherAsync(string location, Unit unit = Unit.Celsius, CancellationToken cancellationToken = default)
    {
        return Task.FromResult(new Weather
        {
            Location = location,
            Temperature = 22.0,
            Unit = unit,
            Description = "Sunny",
        });
    }
}

using var api = new OpenAiApi("API_KEY");

var service = new WeatherService();
// Convert the interface methods into OpenAI tool definitions.
var tools = service.AsTools().AsOpenAiTools();

var messages = new List<ChatCompletionRequestMessage>
{
    "You are a helpful weather assistant.".AsSystemMessage(),
    "What is the current temperature in Dubai, UAE in Celsius?".AsUserMessage(),
};
var model = CreateChatCompletionRequestModel.Gpt4oMini;
var result = await api.Chat.CreateChatCompletionAsync(
    messages,
    model: model,
    tools: tools);
var resultMessage = result.Choices.First().Message;
messages.Add(resultMessage.AsRequestMessage());

// Execute each tool call requested by the model and append the results as tool messages.
foreach (var call in resultMessage.ToolCalls)
{
    var json = await service.CallAsync(
        functionName: call.Function.Name,
        argumentsAsJson: call.Function.Arguments);
    messages.Add(json.AsToolMessage(call.Id));
}

// Second round-trip: send the tool results back to the model to get the final answer.
result = await api.Chat.CreateChatCompletionAsync(
    messages,
    model: model,
    tools: tools);
resultMessage = result.Choices.First().Message;
messages.Add(resultMessage.AsRequestMessage());

The resulting conversation looks like this:

> System: 
You are a helpful weather assistant.
> User: 
What is the current temperature in Dubai, UAE in Celsius?
> Assistant: 
call_3sptsiHzKnaxF8bs8BWxPo0B:
GetCurrentWeather({"location":"Dubai, UAE","unit":"celsius"})
> Tool(call_3sptsiHzKnaxF8bs8BWxPo0B):
{"location":"Dubai, UAE","temperature":22,"unit":"celsius","description":"Sunny"}
> Assistant: 
The current temperature in Dubai, UAE is 22°C with sunny weather.

Structured Outputs

using System.Text.Json;
using System.Text.Json.Serialization;
using OpenAI;

using var api = new OpenAiApi("API_KEY");

var response = await api.Chat.CreateChatCompletionAsAsync<Weather>(
    messages: ["Generate random weather."],
    model: CreateChatCompletionRequestModel.Gpt4oMini,
    jsonSerializerOptions: new JsonSerializerOptions
    {
        Converters = { new JsonStringEnumConverter() },
    });

// Or, if you need the trimmable/NativeAOT version:
var aotResponse = await api.Chat.CreateChatCompletionAsAsync(
    jsonTypeInfo: SourceGeneratedContext.Default.Weather,
    messages: ["Generate random weather."],
    model: CreateChatCompletionRequestModel.Gpt4oMini);

// response.Value1 contains the structured output
// response.Value2 contains the CreateChatCompletionResponse object
Weather:
Location: San Francisco, CA
Temperature: 65
Unit: Fahrenheit
Description: Partly cloudy with a light breeze and occasional sunshine.
Raw Response:
{"Location":"San Francisco, CA","Temperature":65,"Unit":"Fahrenheit","Description":"Partly cloudy with a light breeze and occasional sunshine."}

Additional code for trimmable/NativeAOT version:

[JsonSourceGenerationOptions(Converters = [typeof(JsonStringEnumConverter<Unit>)])]
[JsonSerializable(typeof(Weather))]
public partial class SourceGeneratedContext : JsonSerializerContext;

Custom providers

using OpenAI;

using var api = CustomProviders.GitHubModels("GITHUB_TOKEN");
using var api = CustomProviders.Azure("API_KEY", "ENDPOINT");
using var api = CustomProviders.DeepInfra("API_KEY");
using var api = CustomProviders.Groq("API_KEY");
using var api = CustomProviders.DeepSeek("API_KEY");
using var api = CustomProviders.Fireworks("API_KEY");
using var api = CustomProviders.OpenRouter("API_KEY");
using var api = CustomProviders.Together("API_KEY");
using var api = CustomProviders.Perplexity("API_KEY");
using var api = CustomProviders.SambaNova("API_KEY");
using var api = CustomProviders.Mistral("API_KEY");
using var api = CustomProviders.Codestral("API_KEY");
using var api = CustomProviders.Ollama();
using var api = CustomProviders.LmStudio();
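
Each helper returns a client pre-configured for that provider's OpenAI-compatible endpoint, so the rest of the SDK is used exactly as in the Usage section above. A minimal sketch, assuming the chosen provider serves a gpt-4o-mini model under that name:

using var api = CustomProviders.GitHubModels("GITHUB_TOKEN");

// Same chat API as the official endpoint; only the base address and credentials differ.
string answer = await api.Chat.CreateChatCompletionAsync(
    messages: ["Say hello in five languages."],
    model: CreateChatCompletionRequestModel.Gpt4oMini);
Console.WriteLine(answer);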

Constants

All TryGetXxx methods return null if the value is not found.
There are also non-Try methods that throw an exception if the value is not found.

using OpenAI;

// You can try to get the enum from a string using:
var model = CreateChatCompletionRequestModelExtensions.ToEnum("gpt-4o") ?? throw new Exception("Invalid model");

// Chat
var chatModel = CreateChatCompletionRequestModel.Gpt4oMini;
double? chatPriceInUsd = chatModel.TryGetPriceInUsd(
    inputTokens: 500,
    outputTokens: 500);
double? fineTunePriceInUsd = chatModel.TryGetFineTunePriceInUsd(
    trainingTokens: 500,
    inputTokens: 500,
    outputTokens: 500);
int? contextLength = chatModel.TryGetContextLength(); // 128_000
int? outputLength = chatModel.TryGetOutputLength(); // 16_000

// Embeddings
var embeddingModel = CreateEmbeddingRequestModel.TextEmbedding3Small;
int? maxInputTokens = embeddingModel.TryGetMaxInputTokens(); // 8191
double? embeddingPriceInUsd = embeddingModel.TryGetPriceInUsd(tokens: 500);

// Images
double? imagePriceInUsd = CreateImageRequestModel.DallE3.TryGetPriceInUsd(
    size: CreateImageRequestSize.x1024x1024,
    quality: CreateImageRequestQuality.Hd);

// Speech to Text
double? transcriptionPriceInUsd = CreateTranscriptionRequestModel.Whisper1.TryGetPriceInUsd(
    seconds: 60);

// Text to Speech
double? speechPriceInUsd = CreateSpeechRequestModel.Tts1Hd.TryGetPriceInUsd(
    characters: 1000);

Support

Priority place for bugs: https://github.com/tryAGI/OpenAI/issues
Priority place for ideas and general questions: https://github.com/tryAGI/OpenAI/discussions
Discord: https://discord.gg/Ca2xhfBf3v

Acknowledgments


This project is supported by JetBrains through the Open Source Support Program.


This project is supported by CodeRabbit through the Open Source Support Program.

Compatible and additional computed target framework versions

.NET: net8.0 is compatible. net5.0, net5.0-windows, net6.0, net6.0-android, net6.0-ios, net6.0-maccatalyst, net6.0-macos, net6.0-tvos, net6.0-windows, net7.0, net7.0-android, net7.0-ios, net7.0-maccatalyst, net7.0-macos, net7.0-tvos, net7.0-windows, net8.0-android, net8.0-browser, net8.0-ios, net8.0-maccatalyst, net8.0-macos, net8.0-tvos, and net8.0-windows were computed.
.NET Core: netcoreapp2.0, netcoreapp2.1, netcoreapp2.2, netcoreapp3.0, and netcoreapp3.1 were computed.
.NET Standard: netstandard2.0 is compatible. netstandard2.1 was computed.
.NET Framework: net462 is compatible. net461, net463, net47, net471, net472, net48, and net481 were computed.
MonoAndroid: monoandroid was computed.
MonoMac: monomac was computed.
MonoTouch: monotouch was computed.
Tizen: tizen40 and tizen60 were computed.
Xamarin.iOS: xamarinios was computed.
Xamarin.Mac: xamarinmac was computed.
Xamarin.TVOS: xamarintvos was computed.
Xamarin.WatchOS: xamarinwatchos was computed.

Learn more about Target Frameworks and .NET Standard.

NuGet packages (3)

Showing the top 3 NuGet packages that depend on tryAGI.OpenAI:

  • LangChain.Providers.OpenAI: OpenAI API LLM and Chat model provider.
  • Anyscale: SDK for Anyscale Endpoint that makes it easy and cheap to use LLama 2.
  • LangChain.Serve.OpenAI: LangChain Serve as an OpenAI SDK compatible API.

GitHub repositories (1)

Showing the top 1 popular GitHub repository that depends on tryAGI.OpenAI:

  • tryAGI/LangChain: C# implementation of LangChain. We try to be as close to the original as possible in terms of abstractions, but are open to new entities.

Version Downloads Last updated
3.9.3-dev.14 45 11/15/2024
3.9.3-dev.13 37 11/15/2024
3.9.3-dev.1 60 10/26/2024
3.9.2 680 10/26/2024
3.9.2-dev.7 126 10/25/2024
3.9.2-dev.6 34 10/25/2024
3.9.2-dev.4 41 10/25/2024
3.9.2-dev.1 151 10/24/2024
3.9.1 60 10/24/2024
3.8.2-dev.13 85 10/14/2024
3.8.2-dev.10 48 10/13/2024
3.8.2-dev.9 50 10/13/2024
3.8.1 913 10/9/2024
3.8.1-dev.14 66 10/2/2024
3.8.1-dev.13 53 10/2/2024
3.8.1-dev.6 45 9/28/2024
3.8.1-dev.3 50 9/28/2024
3.8.1-dev.2 46 9/26/2024
3.8.0 973 9/26/2024
3.7.1-dev.14 38 9/24/2024
3.7.1-dev.12 52 9/23/2024
3.7.1-dev.5 72 9/16/2024
3.7.1-dev.3 7,998 9/15/2024
3.7.0 312 9/14/2024
3.6.4-dev.10 57 9/14/2024
3.6.4-dev.9 53 9/14/2024
3.6.4-dev.7 45 9/13/2024
3.6.4-dev.5 57 9/9/2024
3.6.3 496 9/2/2024
3.6.2 122 9/1/2024
3.6.1 129 9/1/2024
3.5.2-dev.24 67 8/31/2024
3.5.2-dev.22 59 8/31/2024
3.5.2-dev.21 61 8/31/2024
3.5.2-dev.14 58 8/29/2024
3.5.2-dev.10 81 8/24/2024
3.5.2-dev.9 88 8/21/2024
3.5.2-dev.7 82 8/19/2024
3.5.2-dev.6 73 8/19/2024
3.5.2-dev.5 79 8/19/2024
3.5.2-dev.2 91 8/18/2024
3.5.1 737 8/18/2024
3.5.1-dev.4 75 8/18/2024
3.5.1-dev.3 83 8/18/2024
3.5.0 139 8/17/2024
3.4.2-dev.5 77 8/17/2024
3.4.2-dev.1 78 8/17/2024
3.4.1 230 8/16/2024
3.4.0 137 8/16/2024
3.3.1-dev.17 87 8/15/2024
3.3.1-dev.16 67 8/15/2024
3.3.1-dev.14 71 8/15/2024
3.3.1-dev.12 79 8/13/2024
3.3.1-dev.11 75 8/13/2024
3.3.1-dev.8 69 8/12/2024
3.3.1-dev.7 64 8/12/2024
3.3.1-dev.1 41 8/6/2024
3.3.0 297 8/6/2024
3.2.2-dev.3 37 8/6/2024
3.2.1 970 8/6/2024
3.2.1-dev.2 53 8/6/2024
3.2.1-dev.1 41 8/5/2024
3.2.0 89 8/5/2024
3.1.1-dev.1 41 8/4/2024
3.1.0 161 8/1/2024
3.0.4-dev.11 50 8/1/2024
3.0.4-dev.9 659 7/24/2024
3.0.4-dev.8 114 7/23/2024
3.0.4-dev.6 165 7/20/2024
3.0.4-dev.5 63 7/20/2024
3.0.4-dev.4 61 7/20/2024
3.0.4-dev.3 63 7/20/2024
3.0.4-dev.2 66 7/19/2024
3.0.4-dev.1 59 7/18/2024
3.0.3 158 7/18/2024
3.0.3-dev.1 53 7/18/2024
3.0.1 116 7/13/2024
3.0.0 92 7/11/2024
3.0.0-alpha.3 55 7/7/2024
3.0.0-alpha.2 53 7/6/2024
2.0.9 32,172 5/14/2024
2.0.8 146 4/29/2024
2.0.7 127 4/29/2024
2.0.6 9,324 4/22/2024
2.0.5 623 4/10/2024
2.0.4 362 4/4/2024
2.0.3 129 4/3/2024
2.0.2 205 4/3/2024
2.0.1 132 4/2/2024
2.0.0 305 3/22/2024
2.0.0-alpha.10 2,849 2/23/2024
2.0.0-alpha.9 18,030 1/27/2024
2.0.0-alpha.8 61 1/27/2024
2.0.0-alpha.7 104 1/20/2024
2.0.0-alpha.5 513 12/5/2023
2.0.0-alpha.4 190 11/16/2023
2.0.0-alpha.3 102 11/16/2023
2.0.0-alpha.2 69 11/16/2023
2.0.0-alpha.1 66 11/15/2023
2.0.0-alpha.0 66 11/15/2023
1.8.2 80,003 10/13/2023
1.8.1 449 10/11/2023
1.8.0 287 9/29/2023
1.7.2 1,653 8/15/2023
1.7.1 164 8/15/2023
1.7.0 333 8/15/2023
1.6.3 1,903 7/29/2023
1.6.1 164 7/29/2023
1.6.0 295 7/29/2023
1.5.0 550 7/13/2023
1.4.1 175 7/12/2023
1.4.0 169 7/12/2023
1.3.0 173 7/12/2023
1.2.0 591 7/5/2023
1.1.2 331 6/30/2023
1.1.1 170 6/30/2023
1.1.0 166 6/29/2023
1.0.0 180 6/28/2023
0.9.0 224 6/24/2023
0.0.0-dev.150 59 7/18/2024
0.0.0-dev 107 9/14/2024