OllamaSharp.ModelContextProtocol 5.1.4

dotnet add package OllamaSharp.ModelContextProtocol --version 5.1.4                
NuGet\Install-Package OllamaSharp.ModelContextProtocol -Version 5.1.4                
This command is intended to be used within the Package Manager Console in Visual Studio, as it uses the NuGet module's version of Install-Package.
<PackageReference Include="OllamaSharp.ModelContextProtocol" Version="5.1.4" />                
For projects that support PackageReference, copy this XML node into the project file to reference the package.
paket add OllamaSharp.ModelContextProtocol --version 5.1.4                
#r "nuget: OllamaSharp.ModelContextProtocol, 5.1.4"                
#r directive can be used in F# Interactive and Polyglot Notebooks. Copy this into the interactive tool or source code of the script to reference the package.
// Install OllamaSharp.ModelContextProtocol as a Cake Addin
#addin nuget:?package=OllamaSharp.ModelContextProtocol&version=5.1.4

// Install OllamaSharp.ModelContextProtocol as a Cake Tool
#tool nuget:?package=OllamaSharp.ModelContextProtocol&version=5.1.4                

<a href="https://www.nuget.org/packages/OllamaSharp"><img src="https://img.shields.io/nuget/v/OllamaSharp" alt="nuget version"></a> <a href="https://www.nuget.org/packages/OllamaSharp"><img src="https://img.shields.io/nuget/dt/OllamaSharp.svg" alt="nuget downloads"></a> <a href="https://awaescher.github.io/OllamaSharp"><img src="https://img.shields.io/badge/api_docs-8A2BE2" alt="Api docs"></a>

<p align="center">   <img alt="ollama" height="200px" src="https://github.com/awaescher/OllamaSharp/blob/main/Ollama.png"> </p>

OllamaSharp 🦙

OllamaSharp provides .NET bindings for the Ollama API, simplifying interactions with Ollama both locally and remotely.

Features

Usage

OllamaSharp wraps each Ollama API endpoint in awaitable methods that fully support response streaming.

The following list shows a few simple code examples.

Try our full-featured demo application that's included in this repository.

Initializing

// set up the client
var uri = new Uri("http://localhost:11434");
var ollama = new OllamaApiClient(uri);

// select a model which should be used for further operations
ollama.SelectedModel = "llama3.1:8b";

Listing all models that are available locally

var models = await ollama.ListLocalModelsAsync();
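Each returned model carries metadata you can inspect. A minimal sketch, assuming the `Name` and `Size` properties on OllamaSharp's model type (check the API docs for the version you use):

```csharp
// print each locally available model with its approximate size in MB
var models = await ollama.ListLocalModelsAsync();
foreach (var model in models)
    Console.WriteLine($"{model.Name} ({model.Size / 1024 / 1024} MB)");
```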

Pulling a model and reporting progress

await foreach (var status in ollama.PullModelAsync("llama3.1:405b"))
    Console.WriteLine($"{status.Percent}% {status.Status}");

Generating a completion directly into the console

await foreach (var stream in ollama.GenerateAsync("How are you today?"))
    Console.Write(stream.Response);

Building interactive chats

// messages including their roles and tool calls will automatically be tracked within the chat object
// and are accessible via the Messages property

var chat = new Chat(ollama);

while (true)
{
    var message = Console.ReadLine();
    await foreach (var answerToken in chat.SendAsync(message))
        Console.Write(answerToken);
}
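The chat can also be seeded with a system prompt. A sketch, assuming the `Chat` constructor overload that accepts a system prompt string (available in recent OllamaSharp versions; verify against yours):

```csharp
// seed the conversation; all exchanged messages, including the system
// prompt, remain accessible via chat.Messages
var chat = new Chat(ollama, "You are a concise assistant.");

await foreach (var answerToken in chat.SendAsync("Why is the sky blue?"))
    Console.Write(answerToken);
```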

Usage with Microsoft.Extensions.AI

Microsoft built an abstraction library to streamline the usage of different AI providers. This is a really interesting concept if you plan to build apps that might use different providers, like ChatGPT, Claude and local models with Ollama.

I encourage you to read their announcement Introducing Microsoft.Extensions.AI Preview – Unified AI Building Blocks for .NET.

OllamaSharp is the first full implementation of their IChatClient and IEmbeddingGenerator interfaces, making it possible to use Ollama just like any other chat provider.

To do this, simply use the OllamaApiClient as IChatClient instead of IOllamaApiClient.

// install package Microsoft.Extensions.AI.Abstractions

private static IChatClient CreateChatClient(Arguments arguments)
{
  if (arguments.Provider.Equals("ollama", StringComparison.OrdinalIgnoreCase))
    return new OllamaApiClient(arguments.Uri, arguments.Model);
  else
    return new OpenAIChatClient(new OpenAI.OpenAIClient(arguments.ApiKey), arguments.Model); // ChatGPT or compatible
}

[!NOTE] IOllamaApiClient provides many Ollama-specific methods that IChatClient and IEmbeddingGenerator lack. Because these are abstractions, IChatClient and IEmbeddingGenerator will never implement the full Ollama API specification. However, OllamaApiClient implements three interfaces: the native IOllamaApiClient plus Microsoft's IChatClient and IEmbeddingGenerator<string, Embedding<float>>, which allows you to cast a single instance to either of the two abstraction interfaces whenever you need them.
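For illustration, a sketch of holding one OllamaApiClient instance behind all three interfaces (namespaces and member names follow the Microsoft.Extensions.AI preview and may shift between preview releases):

```csharp
using Microsoft.Extensions.AI;
using OllamaSharp;

var ollama = new OllamaApiClient(new Uri("http://localhost:11434"), "llama3.1:8b");

// the same instance, viewed through each interface as needed
IOllamaApiClient native = ollama;                                 // full Ollama API
IChatClient chatClient = ollama;                                  // Microsoft chat abstraction
IEmbeddingGenerator<string, Embedding<float>> embedder = ollama;  // Microsoft embedding abstraction

// Ollama-specific operations go through the native interface
await foreach (var status in native.PullModelAsync("llama3.1:8b"))
    Console.WriteLine(status.Status);
```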

Credits

The icon and name were reused from the amazing Ollama project.

I would like to thank all the contributors who take the time to improve OllamaSharp. First and foremost mili-tan, who always keeps OllamaSharp in sync with the Ollama API. ❤

Product: Compatible and additional computed target framework versions
.NET: net8.0 and net9.0 are compatible. The platform-specific frameworks net8.0-android, net8.0-browser, net8.0-ios, net8.0-maccatalyst, net8.0-macos, net8.0-tvos, net8.0-windows and their net9.0 counterparts were computed.
Learn more about Target Frameworks and .NET Standard.

NuGet packages

This package is not used by any NuGet packages.

GitHub repositories

This package is not used by any popular GitHub repositories.

| Version | Downloads | Last updated |
|---------|-----------|--------------|
| 5.1.4   | 162       | 3/5/2025     |
| 5.1.3   | 163       | 3/4/2025     |