OllamaClientLibrary 1.0.1
There is a newer version of this package available.
dotnet add package OllamaClientLibrary --version 1.0.1
NuGet\Install-Package OllamaClientLibrary -Version 1.0.1
This command is intended to be used within the Package Manager Console in Visual Studio, as it uses the NuGet module's version of Install-Package.
<PackageReference Include="OllamaClientLibrary" Version="1.0.1" />
For projects that support PackageReference, copy this XML node into the project file to reference the package.
paket add OllamaClientLibrary --version 1.0.1
The NuGet Team does not provide support for this client. Please contact its maintainers for support.
#r "nuget: OllamaClientLibrary, 1.0.1"
The #r directive can be used in F# Interactive and Polyglot Notebooks. Copy this into the interactive tool or a script's source code to reference the package.
// Install OllamaClientLibrary as a Cake Addin
#addin nuget:?package=OllamaClientLibrary&version=1.0.1

// Install OllamaClientLibrary as a Cake Tool
#tool nuget:?package=OllamaClientLibrary&version=1.0.1
OllamaClientLibrary
OllamaClientLibrary is a .NET Standard 2.1 library for interacting with the Ollama API. It provides functionality to generate text completions and chat completions using various models.
Features
- Predefined configuration for the local Ollama setup, such as host, model, and temperature.
- Generate text completions. Provide a prompt and get a simple text response.
- Generate JSON completions with an automatically recognized JSON schema of the response DTO, so you no longer need to specify it in the prompt.
- Generate chat completions with streaming. The library provides access to the conversation history, allowing you to store it in the database if needed.
- List available local and remote models with filtering options. You can see all models installed on your local machine as well as every model available in Ollama's library.
Prerequisites: Install Ollama locally
- Download and install Ollama from https://ollama.com/download
- Ensure Ollama is running by executing `ollama run deepseek-r1` in your terminal
- (Optional) Install specific models with `ollama pull <model-name>`, e.g., `ollama pull llama3.2:latest`. A list of available models can be found in Ollama's library
- Verify the installation with `ollama list` to check the available models
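Beyond `ollama list`, you can also confirm that the Ollama server itself is reachable from your application. Per Ollama's API documentation, the server listens on port 11434 and `GET /api/tags` returns the locally installed models as JSON:

```shell
# Sanity check: if Ollama is running, this prints a JSON list of local models.
curl -s http://localhost:11434/api/tags
```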
Installation
You can install the package via NuGet:
dotnet add package OllamaClientLibrary
Usage
Generate JSON Completion sample
using System;
using System.Collections.Generic;
using System.Linq;
using OllamaClientLibrary; // root namespace assumed from the package name

// Set up the OllamaClient
using var client = new OllamaClient(new LocalOllamaOptions() // If no options are provided, LocalOllamaOptions is used by default.
{
    Host = "http://localhost:11434", // Default host is http://localhost:11434
    Model = "llama3.2:latest", // Default model is "deepseek-r1". Ensure this model is available in your Ollama installation.
    Temperature = Temperature.DataCleaningOrAnalysis, // Default temperature is Temperature.GeneralConversationOrTranslation
    ApiKey = "your-api-key" // Optional; not required by default, so it is null unless set.
});

// Call the Ollama API
var response = await client.GenerateCompletionJsonAsync<Response>(
    "You are a professional .NET developer. List all available .NET Core versions from the past five years."
);

// Display the results
if (response?.Data != null)
{
    foreach (var item in response.Data.OrderBy(s => s.ReleaseDate))
    {
        Console.WriteLine($"Version: {item.Version}, Release Date: {item.ReleaseDate}, End of Life: {item.EndOfLife}");
    }
}

// Configure the response DTO
class Response
{
    public List<DotNetCoreVersion>? Data { get; set; } // A wrapper class is required when the response data is an array.
}

class DotNetCoreVersion
{
    [JsonSchemaFormat("string", @"^([0-9]+)\.([0-9]+)\.([0-9]+)$")] // dots escaped so the pattern matches literal version separators
    public string? Version { get; set; }

    public DateTime? ReleaseDate { get; set; }

    public DateTime? EndOfLife { get; set; }
}
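The streaming chat feature listed above has no inline example on this page. The sketch below shows one plausible shape under stated assumptions: the method name `GetChatCompletionAsync` and the chunk's `Message.Content` property are hypothetical illustrations, not the confirmed API surface — see the linked Chat Completions sample for the library's actual usage.

```csharp
using System;
using OllamaClientLibrary; // root namespace assumed from the package name

// Hypothetical sketch of streamed chat completions.
using var client = new OllamaClient(); // falls back to LocalOllamaOptions

// Assumed: an IAsyncEnumerable-returning chat method that yields reply
// fragments as the model generates them, so tokens appear incrementally.
await foreach (var chunk in client.GetChatCompletionAsync("Hello! Who are you?"))
{
    Console.Write(chunk?.Message?.Content);
}
```

Because the library exposes the conversation history, each completed exchange could then be persisted to a database, as the feature list notes.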
More samples
- Chat Completions
- Generate JSON Completions
- Generate Text Completions
- List Local Models
- List Remote Models
License
This project is licensed under the MIT License.
Repository
For more information, visit the GitHub repository.
Author
Oleksandr Kushnir
Compatible and additional computed target framework versions.

Product | Versions
---|---
.NET | net5.0 was computed. net5.0-windows was computed. net6.0 was computed. net6.0-android was computed. net6.0-ios was computed. net6.0-maccatalyst was computed. net6.0-macos was computed. net6.0-tvos was computed. net6.0-windows was computed. net7.0 was computed. net7.0-android was computed. net7.0-ios was computed. net7.0-maccatalyst was computed. net7.0-macos was computed. net7.0-tvos was computed. net7.0-windows was computed. net8.0 was computed. net8.0-android was computed. net8.0-browser was computed. net8.0-ios was computed. net8.0-maccatalyst was computed. net8.0-macos was computed. net8.0-tvos was computed. net8.0-windows was computed. net9.0 was computed. net9.0-android was computed. net9.0-browser was computed. net9.0-ios was computed. net9.0-maccatalyst was computed. net9.0-macos was computed. net9.0-tvos was computed. net9.0-windows was computed. |
.NET Core | netcoreapp3.0 was computed. netcoreapp3.1 was computed. |
.NET Standard | netstandard2.1 is compatible. |
MonoAndroid | monoandroid was computed. |
MonoMac | monomac was computed. |
MonoTouch | monotouch was computed. |
Tizen | tizen60 was computed. |
Xamarin.iOS | xamarinios was computed. |
Xamarin.Mac | xamarinmac was computed. |
Xamarin.TVOS | xamarintvos was computed. |
Xamarin.WatchOS | xamarinwatchos was computed. |
Dependencies
.NETStandard 2.1
- HtmlAgilityPack (>= 1.11.72)
- Newtonsoft.Json.Schema (>= 4.0.1)
NuGet packages
This package is not used by any NuGet packages.
GitHub repositories
This package is not used by any popular GitHub repositories.