OllamaClientLibrary 1.4.0
dotnet add package OllamaClientLibrary --version 1.4.0
NuGet\Install-Package OllamaClientLibrary -Version 1.4.0
<PackageReference Include="OllamaClientLibrary" Version="1.4.0" />
paket add OllamaClientLibrary --version 1.4.0
#r "nuget: OllamaClientLibrary, 1.4.0"
// Install OllamaClientLibrary as a Cake Addin
#addin nuget:?package=OllamaClientLibrary&version=1.4.0

// Install OllamaClientLibrary as a Cake Tool
#tool nuget:?package=OllamaClientLibrary&version=1.4.0
OllamaClientLibrary
OllamaClientLibrary is a .NET Standard 2.1 library designed to interact seamlessly with the Ollama API. It offers robust functionality for generating text and chat completions using a variety of models. This library simplifies the process of configuring and utilizing the Ollama API, making it easier for developers to integrate advanced text generation capabilities into their applications.
Features
- Customizable configuration for Ollama (host, model, temperature, timeout, etc.)
- Automatic model installation
- JSON completions with automatic schema recognition
- Streaming chat completions with conversation history management
- Simple text completion API
- Tool-calling support for dynamic method invocation
- Text-to-embedding conversion for ML applications
- Get text completion from a file (doc, docx, xls, xlsx, pdf, txt, json, xml, csv, jpg, jpeg, png) with OCR capabilities
- Local and remote model management (list, pull, delete)
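As an illustration of the streaming chat feature above, a minimal sketch might look like the following. Note that the method name GetChatCompletionAsync and its IAsyncEnumerable-style return shape are assumptions inferred from the feature list and sample titles, not confirmed API:

```csharp
using OllamaClientLibrary;
using OllamaClientLibrary.Abstractions;

// Minimal sketch: stream a chat completion and print tokens as they arrive.
// NOTE: GetChatCompletionAsync and the shape of its streamed chunks are
// assumptions, not confirmed API. Requires a running Ollama server.
using IOllamaClient client = new OllamaClient(new OllamaOptions());

await foreach (var chunk in client.GetChatCompletionAsync("Tell me a short joke about C#."))
{
    // Conversation history is managed by the client between calls.
    Console.Write(chunk?.Message?.Content);
}
```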
Prerequisites
Setting Up Ollama Server
Download and install Ollama from https://ollama.com/download
Installing a Model
- Execute ollama run qwen2.5:1.5b in your terminal to start the Ollama server and install the necessary model. You can find a list of available models in Ollama's library.
- Confirm the installation by running ollama list to see the models installed on your local machine.
Installing a Model (Alternative)
using var client = new OllamaClient(new OllamaOptions()
{
AutoInstallModel = true, // Default is false
});
Installation
You can install the package via NuGet:
Install-Package OllamaClientLibrary
Usage
Generate JSON Completion sample
using OllamaClientLibrary;
using OllamaClientLibrary.Abstractions;
using OllamaClientLibrary.Constants;
using System.ComponentModel;
// Setup OllamaClient
using IOllamaClient client = new OllamaClient(new OllamaOptions() // If no options are provided, OllamaOptions will be used with the default settings
{
Host = "http://localhost:11434", // Default host is http://localhost:11434
Model = "qwen2.5:1.5b", // Default model is "qwen2.5:1.5b"
Temperature = Temperature.DataCleaningOrAnalysis, // Default temperature is Temperature.GeneralConversationOrTranslation
AutoInstallModel = true, // Default is false. The library will automatically install the model if it is not available on your local machine
Timeout = TimeSpan.FromSeconds(30), // Default is 60 seconds.
MaxPromptTokenSize = 4096, // Default is 4096 tokens. Increase this value if you want to send larger prompts
AssistantBehavior = "You are a professional .NET developer.", // Optional. Default is "You are a world class AI Assistant"
ApiKey = "your-api-key", // Optional. It is not required by default for the local setup
Tools = null // Optional. You can use the ToolFactory to create tools, e.g. ToolFactory.Create<WeatherService>() where WeatherService is your class
});
// Call Ollama API
var response = await client.GetJsonCompletionAsync<Response>(
"Return a list of all available .NET Core versions from the past five years."
);
// Display results
if (response?.Data != null)
{
foreach (var item in response.Data.OrderBy(s => s.ReleaseDate))
{
Console.WriteLine($"Version: {item.Version}, Release Date: {item.ReleaseDate}, End of Life: {item.EndOfLife}");
}
}
// Configure the response DTO
class Response
{
public List<DotNetCore>? Data { get; set; } // A wrapper class is required for the response if the data is an array.
}
class DotNetCore
{
[Description("Version number in the format of Major.Minor.Patch")]
public string? Version { get; set; }
[Description("Release date of the version")]
public DateTime? ReleaseDate { get; set; }
[Description("End of life date of the version")]
public DateTime? EndOfLife { get; set; }
}
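The Tools option shown in the setup above takes tools created with ToolFactory.Create<WeatherService>(). A minimal sketch of such a tool class follows; WeatherService, its GetTemperature method, and GetTextCompletionAsync are hypothetical names for illustration (the source only confirms that ToolFactory discovers public methods automatically):

```csharp
using OllamaClientLibrary;
using OllamaClientLibrary.Abstractions;
using System.ComponentModel;

// Hypothetical tool class: public methods are discovered by ToolFactory
// (automated public method discovery was added in v1.3.0).
public class WeatherService
{
    [Description("Gets the current temperature in Celsius for a city")]
    public double GetTemperature(string city) => city == "Oslo" ? 4.0 : 20.0; // stub data
}

// Register the tool and let the model call it when needed.
// NOTE: GetTextCompletionAsync is an assumed method name based on the samples.
using IOllamaClient client = new OllamaClient(new OllamaOptions
{
    Tools = ToolFactory.Create<WeatherService>()
});

var answer = await client.GetTextCompletionAsync("What is the temperature in Oslo?");
Console.WriteLine(answer);
```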
More samples
- Get Text Completion From a File
- Get Chat Completion
- Get JSON Completion
- Get JSON Completion with Tools
- Get Text Completion
- Get Text Completion with Tools
- Get Embedding Completion
- List Local Models
- List Remote Models
- Pull Model
- Delete Model
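The model-management samples above might be combined along these lines. The method names here are assumptions derived from the sample titles, not confirmed API:

```csharp
using OllamaClientLibrary;
using OllamaClientLibrary.Abstractions;

using IOllamaClient client = new OllamaClient(new OllamaOptions());

// List models installed locally (method and property names assumed).
var localModels = await client.ListLocalModelsAsync();
foreach (var model in localModels)
{
    Console.WriteLine(model.Name);
}

// Pull a model from the remote library, then delete it again.
await client.PullModelAsync("qwen2.5:1.5b");
await client.DeleteModelAsync("qwen2.5:1.5b");
```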
Changelog
- v1.4.0: Implemented dependency injection support, added file-based embedding generation, and integrated document processing capabilities with PdfPig (PDF text extraction), Tesseract (OCR for images and image-based PDFs), and NPOI (extraction from DOC, DOCX, XLS, XLSX formats).
- v1.3.0: Enhanced configuration with MaxPromptTokenSize and AssistantBehavior properties, added model deletion functionality, improved tool calling with multi-call support and async method compatibility, automated public method discovery in ToolFactory, and removed the KeepChatHistory option.
- v1.2.0: Improved API naming conventions, introduced AutoInstallModel and Timeout options, migrated from the generate API to the chat completion API, added model pull functionality, and implemented the IOllamaClient interface.
- v1.1.0: Set qwen2.5:1.5b as the default model, corrected ModifiedAt parsing for model listings, implemented tools support and conversation history, added comprehensive integration tests, and established a CI pipeline.
- v1.0.1: Added ApiKey configuration support in OllamaOptions.
- v1.0.0: Initial release with core text and chat completion functionality.
License
This project is licensed under the MIT License.
Repository
For more information, visit the GitHub repository.
Dependencies
- Tesseract (License: Apache-2.0) - OCR capabilities
- PdfPig (License: Apache-2.0) - Extract text from PDF files
- NPOI (License: Apache-2.0) - Extract text from Word and Excel files
Author
Oleksandr Kushnir
Product | Versions (compatible and additional computed target frameworks)
---|---
.NET | net5.0 was computed. net5.0-windows was computed. net6.0 was computed. net6.0-android was computed. net6.0-ios was computed. net6.0-maccatalyst was computed. net6.0-macos was computed. net6.0-tvos was computed. net6.0-windows was computed. net7.0 was computed. net7.0-android was computed. net7.0-ios was computed. net7.0-maccatalyst was computed. net7.0-macos was computed. net7.0-tvos was computed. net7.0-windows was computed. net8.0 was computed. net8.0-android was computed. net8.0-browser was computed. net8.0-ios was computed. net8.0-maccatalyst was computed. net8.0-macos was computed. net8.0-tvos was computed. net8.0-windows was computed. net9.0 was computed. net9.0-android was computed. net9.0-browser was computed. net9.0-ios was computed. net9.0-maccatalyst was computed. net9.0-macos was computed. net9.0-tvos was computed. net9.0-windows was computed. |
.NET Core | netcoreapp3.0 was computed. netcoreapp3.1 was computed. |
.NET Standard | netstandard2.1 is compatible. |
MonoAndroid | monoandroid was computed. |
MonoMac | monomac was computed. |
MonoTouch | monotouch was computed. |
Tizen | tizen60 was computed. |
Xamarin.iOS | xamarinios was computed. |
Xamarin.Mac | xamarinmac was computed. |
Xamarin.TVOS | xamarintvos was computed. |
Xamarin.WatchOS | xamarinwatchos was computed. |
.NETStandard 2.1
- DocumentFormat.OpenXml (>= 3.2.0)
- HtmlAgilityPack (>= 1.11.72)
- Microsoft.Extensions.DependencyInjection (>= 9.0.2)
- Newtonsoft.Json.Schema (>= 4.0.1)
- NPOI.HWPFCore (>= 2.3.0.1)
- PdfPig (>= 0.1.9)
- System.Drawing.Common (>= 9.0.2)
- System.Text.Encoding.CodePages (>= 10.0.0-preview.1.25080.5)
- Tesseract (>= 5.2.0)