OllamaClientLibrary 1.3.0
dotnet add package OllamaClientLibrary --version 1.3.0
NuGet\Install-Package OllamaClientLibrary -Version 1.3.0
<PackageReference Include="OllamaClientLibrary" Version="1.3.0" />
paket add OllamaClientLibrary --version 1.3.0
#r "nuget: OllamaClientLibrary, 1.3.0"
// Install OllamaClientLibrary as a Cake Addin
#addin nuget:?package=OllamaClientLibrary&version=1.3.0

// Install OllamaClientLibrary as a Cake Tool
#tool nuget:?package=OllamaClientLibrary&version=1.3.0
OllamaClientLibrary
OllamaClientLibrary is a .NET Standard 2.1 library designed to interact seamlessly with the Ollama API. It offers robust functionality for generating text and chat completions using a variety of models. This library simplifies the process of configuring and utilizing the Ollama API, making it easier for developers to integrate advanced text generation capabilities into their applications.
Features
- Predefined configuration for the local Ollama setup, such as host, model, temperature, timeout, prompt size, etc.
- Auto-install a model if it is not available on your local machine.
- Get JSON completion with an automatically recognized JSON schema of the response DTO, so you no longer need to specify it in the prompt.
- Get chat completion with streaming. The library exposes the conversation history, so you can store it in a database if needed (see the streaming sketch after this list).
- Get text completion. Provide a prompt and get a simple text response.
- Get completion with tools. Based on the Ollama response, the library can dynamically call local methods and supply the necessary parameters (see the tools sketch after the samples list below).
- Get embedding completions for a given text. The library converts text into numerical vectors (embeddings) for machine learning tasks such as similarity search, clustering, and classification, which makes it useful for semantic search, recommendation systems, and natural language understanding (a sketch follows the JSON sample in the Usage section).
- List available local and remote models with filtering options. You can see all models installed on your local machine as well as all models available in Ollama's library.
- Pull a model from Ollama's library to your local machine (sketched in the installation section below).
- Delete a model from the local machine.
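For example, a minimal streaming chat loop might look like the sketch below. This is a hedged sketch: the method name GetChatCompletionAsync and the shape of the streamed chunks are assumptions based on the "Get Chat Completion" sample listed under Usage, not the confirmed API surface.

```csharp
using System;
using OllamaClientLibrary;
using OllamaClientLibrary.Abstractions;

using IOllamaClient client = new OllamaClient();

// Assumed: GetChatCompletionAsync streams chunks carrying a Message.Content
// payload. Check the repository samples for the exact signature.
await foreach (var chunk in client.GetChatCompletionAsync("Summarize .NET Standard 2.1 in one paragraph"))
{
    Console.Write(chunk?.Message?.Content);
}
```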
Prerequisites
Setting Up Ollama Server
Download and install Ollama from https://ollama.com/download
Installing a Model
- Execute ollama run qwen2.5:1.5b in your terminal to start the Ollama server and install the necessary model. You can find a list of available models in Ollama's library.
- Confirm the installation by running ollama list to see the models installed on your local machine.
Installing a Model (Alternative)
using var client = new OllamaClient(new OllamaOptions()
{
    AutoInstallModel = true // Default is false
});
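You can also pull a model explicitly. The sketch below assumes a PullModelAsync method, inferred from the "Pull Model" sample and the v1.2.0 changelog entry that added the pull model API; treat the exact name and signature as assumptions.

```csharp
using OllamaClientLibrary;
using OllamaClientLibrary.Abstractions;

using IOllamaClient client = new OllamaClient();

// Assumed API: PullModelAsync downloads the model from Ollama's library
// if it is not already present on the local machine.
await client.PullModelAsync("qwen2.5:1.5b");
```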
Installation
You can install the package via NuGet:
Install-Package OllamaClientLibrary
Usage
Generate JSON Completion sample
using OllamaClientLibrary;
using OllamaClientLibrary.Abstractions;
using OllamaClientLibrary.Constants;
using System;
using System.Collections.Generic;
using System.ComponentModel;
using System.Linq;
// Set up the OllamaClient
using IOllamaClient client = new OllamaClient(new OllamaOptions() // If no options are provided, OllamaOptions is used with its default settings
{
    Host = "http://localhost:11434", // Default host is http://localhost:11434
    Model = "qwen2.5:1.5b", // Default model is "qwen2.5:1.5b"
    Temperature = Temperature.DataCleaningOrAnalysis, // Default temperature is Temperature.GeneralConversationOrTranslation
    AutoInstallModel = true, // Default is false. The library will automatically install the model if it is not available on your local machine
    Timeout = TimeSpan.FromSeconds(30), // Default is 60 seconds
    MaxPromptTokenSize = 4096, // Default is 4096 tokens. Increase this value if you want to send larger prompts
    AssistantBehavior = "You are a professional .NET developer.", // Optional. Default is "You are a world class AI Assistant"
    ApiKey = "your-api-key", // Optional. Not required by default for a local setup
    Tools = null // Optional. You can use the ToolFactory to create tools, e.g. ToolFactory.Create<WeatherService>() where WeatherService is your class
});
// Call the Ollama API
var response = await client.GetJsonCompletionAsync<Response>(
    "Return a list of all available .NET Core versions from the past five years."
);
// Display the results
if (response?.Data != null)
{
    foreach (var item in response.Data.OrderBy(s => s.ReleaseDate))
    {
        Console.WriteLine($"Version: {item.Version}, Release Date: {item.ReleaseDate}, End of Life: {item.EndOfLife}");
    }
}
// Configure the response DTO
class Response
{
    public List<DotNetCore>? Data { get; set; } // A wrapper class is required when the response data is an array
}

class DotNetCore
{
    [Description("Version number in the format of Major.Minor.Patch")]
    public string? Version { get; set; }

    [Description("Release date of the version")]
    public DateTime? ReleaseDate { get; set; }

    [Description("End of life date of the version")]
    public DateTime? EndOfLife { get; set; }
}
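Embedding completions follow the same pattern. This is a hedged sketch: the method name GetEmbeddingCompletionAsync and its return shape (one vector per input string) are assumptions based on the "Get Embedding Completion" sample listed below, not the confirmed API.

```csharp
using System;
using OllamaClientLibrary;
using OllamaClientLibrary.Abstractions;

using IOllamaClient client = new OllamaClient();

// Assumed: the call returns one numerical vector per input string, suitable
// for similarity search, clustering, or classification.
var embeddings = await client.GetEmbeddingCompletionAsync(new[] { "semantic search", "vector similarity" });

Console.WriteLine($"Vectors returned: {embeddings.Length}");
```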
More samples
- Get Chat Completion
- Get JSON Completion
- Get JSON Completion with Tools
- Get Text Completion
- Get Text Completion with Tools
- Get Embedding Completion
- List Local Models
- List Remote Models
- Pull Model
- Delete Model
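The tools samples above follow the pattern sketched below. ToolFactory.Create<WeatherService>() is taken from the options comment in the Usage section; WeatherService is a hypothetical class, GetTextCompletionAsync is an assumed method name based on the "Get Text Completion with Tools" sample, and the ToolFactory namespace is a guess.

```csharp
using System;
using OllamaClientLibrary;
using OllamaClientLibrary.Abstractions;

// WeatherService is a hypothetical local class; per the v1.3.0 changelog,
// ToolFactory exposes all of its public methods as tools automatically.
using IOllamaClient client = new OllamaClient(new OllamaOptions
{
    Tools = ToolFactory.Create<WeatherService>()
});

// Assumed method name; per the v1.3.0 changelog, the library calls the tool,
// feeds its result back to the model, and returns the final answer.
var answer = await client.GetTextCompletionAsync("What is the weather in Kyiv today?");
Console.WriteLine(answer);

class WeatherService
{
    // The model's tool call supplies the city parameter.
    public string GetWeather(string city) => $"It is sunny in {city}.";
}
```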
Changelog
- v1.3.0: Introduced the MaxPromptTokenSize and AssistantBehavior properties on OllamaOptions, and added the DeleteModelAsync method. Removed KeepChatHistory. Fixed handling of multiple ToolCalls. Updated the flow to call the AI again after receiving a response from the tools. Modified ToolFactory to retrieve all public methods automatically instead of requiring them to be specified manually. Added support for async methods in ToolCalls.
- v1.2.0: Renamed methods to be more descriptive, added the AutoInstallModel and Timeout properties to OllamaOptions, switched from the generate API to the chat completion API, added the pull model API, and added the IOllamaClient interface.
- v1.1.0: Changed the default model to qwen2.5:1.5b, fixed parsing of ModifiedAt for the models list endpoint, added support for tools, added chat history, added integration tests, and configured CI.
- v1.0.1: Allowed setting the ApiKey in OllamaOptions.
- v1.0.0: Initial release with basic functionality for text and chat completions.
License
This project is licensed under the MIT License.
Repository
For more information, visit the GitHub repository.
Author
Oleksandr Kushnir
Product | Compatible and additional computed target framework versions
--- | ---
.NET | net5.0, net5.0-windows, net6.0, net6.0-android, net6.0-ios, net6.0-maccatalyst, net6.0-macos, net6.0-tvos, net6.0-windows, net7.0, net7.0-android, net7.0-ios, net7.0-maccatalyst, net7.0-macos, net7.0-tvos, net7.0-windows, net8.0, net8.0-android, net8.0-browser, net8.0-ios, net8.0-maccatalyst, net8.0-macos, net8.0-tvos, net8.0-windows, net9.0, net9.0-android, net9.0-browser, net9.0-ios, net9.0-maccatalyst, net9.0-macos, net9.0-tvos, net9.0-windows (all computed)
.NET Core | netcoreapp3.0, netcoreapp3.1 (computed)
.NET Standard | netstandard2.1 (compatible)
MonoAndroid | monoandroid (computed)
MonoMac | monomac (computed)
MonoTouch | monotouch (computed)
Tizen | tizen60 (computed)
Xamarin.iOS | xamarinios (computed)
Xamarin.Mac | xamarinmac (computed)
Xamarin.TVOS | xamarintvos (computed)
Xamarin.WatchOS | xamarinwatchos (computed)
Dependencies
.NETStandard 2.1
- HtmlAgilityPack (>= 1.11.72)
- Newtonsoft.Json.Schema (>= 4.0.1)
NuGet packages
This package is not used by any NuGet packages.
GitHub repositories
This package is not used by any popular GitHub repositories.