Netizine.OpenAI 1.0.8
OpenAI
The unofficial OpenAI .NET library, supporting .NET Standard 2.0+, .NET Core 2.0+, and .NET Framework 4.6.2+.
Installation
Using the .NET Core command-line interface (CLI) tools:
dotnet add package OpenAI
Using the NuGet Command Line Interface (CLI):
nuget install OpenAI
Using the Package Manager Console:
Install-Package OpenAI
From within Visual Studio:
- Open the Solution Explorer.
- Right-click on a project within your solution.
- Click on Manage NuGet Packages...
- Click on the Browse tab and search for "OpenAI".
- Click on the OpenAI package, select the appropriate version in the right pane, and click Install.
Documentation
For a comprehensive list of examples, check out the API documentation. The video demonstrations also cover how to use the library.
Usage
Authentication
OpenAI authenticates API requests using your account’s secret key, which you can find in the OpenAI Dashboard. By default, secret keys can be used to perform any API request without restriction.
Use the OpenAI.OpenAIConfiguration.ApiKey property to set the secret key.
OpenAI.OpenAIConfiguration.ApiKey = Environment.GetEnvironmentVariable("OPENAI_API_KEY");
Retrieving a resource
The Get method of the service class can be used to retrieve a resource:
ModelService modelService = new ModelService();
Model model = modelService.Get("davinci");
Console.WriteLine(model.OwnedBy);
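If your application is asynchronous end-to-end, the service classes also expose async counterparts. The sketch below assumes a GetAsync method mirroring Get; verify the exact name and signature against the API documentation:
ModelService modelService = new ModelService();
// Assumption: GetAsync mirrors the synchronous Get shown above.
Model model = await modelService.GetAsync("davinci");
Console.WriteLine(model.OwnedBy);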
Creating a resource
The Create method of the service class can be used to create a new resource:
ChatCompletionMessage chatMessage = new ChatCompletionMessage
{
    Role = ChatRoles.User,
    Content = "Can you explain the meaning of life"
};
List<ChatCompletionMessage> chatMessageList = new List<ChatCompletionMessage>
{
    chatMessage
};
ChatGPT3CompletionService chatCompletionService = new ChatGPT3CompletionService();
ChatGPT3CompletionCreateOptions chatCompletionOptions = new ChatGPT3CompletionCreateOptions
{
    Model = "gpt-3.5-turbo",
    Messages = chatMessageList,
    Temperature = 0,
};
ChatCompletion chatCompletion = chatCompletionService.Create(chatCompletionOptions);
// Print the assistant's reply text.
Console.WriteLine(chatCompletion.Choices[0].Message.Content);
Text completions follow the same pattern:
CompletionService completionService = new CompletionService();
Completion completion = completionService.Create(new CompletionCreateOptions
{
Prompt = "Say this is a test",
Model = "text-davinci-003",
MaxTokens = 7,
Temperature = 0,
});
Console.WriteLine(completion.Id);
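The Id is just the identifier of the completion object. To print the generated text instead, read the first choice; this assumes the Completion type surfaces the response's choices array as a Choices collection with a Text property, mirroring the raw API response:
// Assumption: Completion.Choices mirrors the "choices" array in the API response.
Console.WriteLine(completion.Choices[0].Text);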
Deleting a resource
The Delete method of the service class can be used to delete a resource:
FileService fileService = new FileService();
// createdFile refers to a File returned by an earlier create/upload call.
File deleteFile = fileService.Delete(createdFile.Id, new FileDeleteOptions());
Console.WriteLine(deleteFile.Id);
Listing a resource
The List method on the service class can be used to list resources page-by-page:
ModelService modelService = new ModelService();
OpenAIList<Model> models = modelService.List();
// Enumerate the list
foreach (Model model in models)
{
Console.WriteLine(model.Id);
}
FormEncoder provides methods to serialize various objects with application/x-www-form-urlencoded encoding.
var imageService = new ImageService();
Image editedImage = imageService.Edit(new EditImageCreateOptions
{
Image = "otters.png",
ImageSource = System.IO.File.ReadAllBytes("otters.png"),
Mask = "otters-mask.png",
MaskSource = System.IO.File.ReadAllBytes("otters-mask.png"),
Prompt = "A cute baby sea otter wearing a beret",
N = 2,
Size = "1024x1024",
});
Console.WriteLine(editedImage.Data[0].Url);
Per-request configuration
All of the service methods accept an optional RequestOptions object. Use it to override settings for a single call, for example to pass a different secret API key or organization ID per request:
var requestOptions = new RequestOptions
{
ApiKey = "SECRET API KEY",
OrganizationId = "ORGANIZATION ID"
};
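The options object is then passed to the service call. A minimal sketch, assuming the service methods expose an optional requestOptions parameter in the usual Stripe-style shape (check the overloads in your version of the library):
ModelService modelService = new ModelService();
// Assumption: the trailing parameter is named requestOptions; adjust to match the actual overload.
Model model = modelService.Get("davinci", requestOptions: requestOptions);
Console.WriteLine(model.OwnedBy);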
Using a custom HttpClient
You can configure the library with your own custom HttpClient:
OpenAIConfiguration.OpenAIClient = new OpenAIClient(
    apiKey,
    httpClient: new SystemNetHttpClient(httpClient));
Please refer to the Advanced client usage page to see more examples of using custom clients, e.g. for using a proxy server, a custom message handler, etc.
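As one concrete example, routing requests through a proxy only requires a standard HttpClientHandler from System.Net.Http; the OpenAIClient and SystemNetHttpClient types are the same ones shown above, and the proxy address is a placeholder:
var handler = new System.Net.Http.HttpClientHandler
{
    // Placeholder proxy address; replace with your own.
    Proxy = new System.Net.WebProxy("http://localhost:8888"),
    UseProxy = true,
};
var httpClient = new System.Net.Http.HttpClient(handler);
OpenAIConfiguration.OpenAIClient = new OpenAIClient(
    apiKey,
    httpClient: new SystemNetHttpClient(httpClient));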
Automatic retries
The library automatically retries requests on intermittent failures such as a connection error, a timeout, or certain API responses like a 409 Conflict status.
By default, it will perform up to two retries. That number can be configured with OpenAIConfiguration.MaxNetworkRetries:
OpenAIConfiguration.MaxNetworkRetries = 0; // Zero retries
How to use parameters and properties
This is a typed library that supports all publicly documented properties and parameters. When OpenAI adds new features that introduce properties or parameters not yet available in the SDK, you can still use them with the approaches described below.
Parameters
To pass undocumented parameters to OpenAI using the OpenAI SDK, use the AddExtraParam() method, as shown below:
CompletionService completionService = new CompletionService();
var completionOptions = new CompletionCreateOptions
{
Prompt = "Say this is a test",
Model = "text-davinci-003",
MaxTokens = 7,
Temperature = 0,
};
completionOptions.AddExtraParam("new_feature_enabled", "true");
Completion completion = completionService.Create(completionOptions);
Console.WriteLine(completion.Id);
Properties
To retrieve undocumented properties from OpenAI responses, the library can expose the raw JSON object, from which you can read the property. An example of this is shown below:
ModelService modelService = new ModelService();
Model model = modelService.Get("davinci");
var featureEnabled = model.RawJObject["feature_enabled"];
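Since the library depends on Newtonsoft.Json, RawJObject can be treated as a regular JObject. A minimal sketch, assuming that type and reusing the hypothetical feature_enabled field from above:
// Assumption: RawJObject is a Newtonsoft.Json.Linq.JObject, so undocumented fields
// can be read and converted with the usual JToken operators.
bool? featureEnabled = (bool?)model.RawJObject["feature_enabled"];
string owner = (string)model.RawJObject["owned_by"];
Console.WriteLine($"{owner}: {featureEnabled}");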
Support
New features and bug fixes are released on the latest major version of the OpenAI .NET client library. If you are on an older major version, we recommend upgrading to the latest version to receive new features and bug fixes, including those for security vulnerabilities. Older major versions of the package will remain available, but will no longer receive updates.
Development
The test suite depends on openai-mock, so make sure to fetch and run it from a background terminal (openai-mock's README also contains instructions for installing via NuGet):
dotnet tool install --global OpenAI.Mock
openai-mock
Alternatively, if you have already installed it, run
dotnet tool update --global OpenAI.Mock
openai-mock
Run all tests from the tests/OpenAI.Tests directory:
dotnet test
Run some tests, filtering by name:
dotnet test --filter FullyQualifiedName~ModelServiceTest
Run tests for a single target framework:
dotnet test --framework netcoreapp2.1
CI/CD Tests
If you need to run tests in a CI/CD pipeline, you can set the OPENAI_MOCK_PORT environment variable to a specific port.
Then in your pipeline, add a step to install the openai-mock server before running unit tests.
- name: Install OpenAI.Mock
  run: dotnet tool install --global OpenAI.Mock
The library uses dotnet-format for code formatting. Code must be formatted before PRs are submitted, otherwise CI will fail. Run the formatter with:
dotnet format src/OpenAI.sln
For any requests, bugs, or comments, please open an issue or submit a pull request.
Product | Compatible and additional computed target framework versions
---|---
.NET | net5.0 was computed. net5.0-windows was computed. net6.0 is compatible. net6.0-android was computed. net6.0-ios was computed. net6.0-maccatalyst was computed. net6.0-macos was computed. net6.0-tvos was computed. net6.0-windows was computed. net7.0 is compatible. net7.0-android was computed. net7.0-ios was computed. net7.0-maccatalyst was computed. net7.0-macos was computed. net7.0-tvos was computed. net7.0-windows was computed. net8.0 was computed. net8.0-android was computed. net8.0-browser was computed. net8.0-ios was computed. net8.0-maccatalyst was computed. net8.0-macos was computed. net8.0-tvos was computed. net8.0-windows was computed. net9.0 was computed. net9.0-android was computed. net9.0-browser was computed. net9.0-ios was computed. net9.0-maccatalyst was computed. net9.0-macos was computed. net9.0-tvos was computed. net9.0-windows was computed. |
.NET Core | netcoreapp2.0 was computed. netcoreapp2.1 was computed. netcoreapp2.2 was computed. netcoreapp3.0 was computed. netcoreapp3.1 is compatible. |
.NET Standard | netstandard2.0 is compatible. netstandard2.1 was computed. |
.NET Framework | net461 was computed. net462 is compatible. net463 was computed. net47 was computed. net471 was computed. net472 was computed. net48 was computed. net481 was computed. |
MonoAndroid | monoandroid was computed. |
MonoMac | monomac was computed. |
MonoTouch | monotouch was computed. |
Tizen | tizen40 was computed. tizen60 was computed. |
Xamarin.iOS | xamarinios was computed. |
Xamarin.Mac | xamarinmac was computed. |
Xamarin.TVOS | xamarintvos was computed. |
Xamarin.WatchOS | xamarinwatchos was computed. |
Dependencies
- .NETCoreApp 3.1
  - Newtonsoft.Json (>= 13.0.2)
- .NETFramework 4.6.2
  - Microsoft.Bcl.AsyncInterfaces (>= 1.1.0)
  - Newtonsoft.Json (>= 13.0.2)
  - System.Runtime.InteropServices.RuntimeInformation (>= 4.3.0)
- .NETStandard 2.0
  - Microsoft.Bcl.AsyncInterfaces (>= 1.1.0)
  - Newtonsoft.Json (>= 13.0.2)
- net6.0
  - Newtonsoft.Json (>= 13.0.2)
- net7.0
  - Newtonsoft.Json (>= 13.0.2)
NuGet packages
This package is not used by any NuGet packages.
GitHub repositories
This package is not used by any popular GitHub repositories.