OllamaSharp 1.0.3

There is a newer version of this package available.
See the version list below for details.
.NET CLI:
dotnet add package OllamaSharp --version 1.0.3

Package Manager:
NuGet\Install-Package OllamaSharp -Version 1.0.3
This command is intended to be used within the Package Manager Console in Visual Studio, as it uses the NuGet module's version of Install-Package.

PackageReference:
<PackageReference Include="OllamaSharp" Version="1.0.3" />
For projects that support PackageReference, copy this XML node into the project file to reference the package.

Central Package Management (CPM):
For projects that support Central Package Management, copy these XML nodes into the solution's Directory.Packages.props file and the project file, respectively, to version the package.
Directory.Packages.props:
<PackageVersion Include="OllamaSharp" Version="1.0.3" />
Project file:
<PackageReference Include="OllamaSharp" />

Paket CLI:
paket add OllamaSharp --version 1.0.3

Script & Interactive:
#r "nuget: OllamaSharp, 1.0.3"
The #r directive can be used in F# Interactive and Polyglot Notebooks. Copy this into the interactive tool or the source code of a script to reference the package.

Install as a Cake Addin:
#addin nuget:?package=OllamaSharp&version=1.0.3

Install as a Cake Tool:
#tool nuget:?package=OllamaSharp&version=1.0.3

OllamaSharp 🦙

OllamaSharp is a .NET binding for the Ollama API, making it easy to interact with Ollama using your favorite .NET languages.

Features

  • Intuitive API client: Set up and interact with Ollama in just a few lines of code.
  • Support for various Ollama operations: Including streaming completions (chatting), listing local models, pulling new models, showing model information, creating new models, copying models, deleting models, pushing models, and generating embeddings.
  • Real-time streaming: Stream responses directly to your application.
  • Progress reporting: Get real-time progress feedback on tasks like model pulling.

Usage

Here's a simple example to get you started:

// set up the client
var uri = new Uri("http://localhost:11434");
var ollama = new OllamaApiClient(uri);

// list all local models
var models = await ollama.ListLocalModels();

// stream a completion and write to the console
// keep reusing the context to keep the chat topic going
ConversationContext context = null;
context = await ollama.StreamCompletion("How are you today?", "llama2", context, stream => Console.Write(stream.Response));

// pull a model and report progress
await ollama.PullModel("mistral", status => Console.WriteLine($"({status.Percent}%) {status.Status}"));
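
The feature list also mentions generating embeddings, which the example above does not cover. Here is a minimal sketch of what that could look like; note that the method name, its parameters, and the shape of the result object are assumptions for illustration and are not verified against the 1.0.3 API surface, so check the package's IntelliSense for the actual signatures:

```csharp
// set up the client as above
var ollama = new OllamaApiClient(new Uri("http://localhost:11434"));

// generate an embedding vector for a prompt with a local model
// NOTE: GenerateEmbeddings and the Embedding property are assumed names,
// not confirmed against this package version
var result = await ollama.GenerateEmbeddings("llama2", "The quick brown fox");
Console.WriteLine($"Embedding length: {result.Embedding.Length}");
```

Embeddings like this are typically stored and compared by cosine similarity for semantic search, which is how downstream packages such as Microsoft.KernelMemory.AI.Ollama use them.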

Credits

Icon and name were reused from the amazing Ollama project.

Compatible and additional computed target framework versions:
.NET: net5.0 is compatible. net5.0-windows was computed, as were net6.0 through net10.0 and their platform-specific variants (android, ios, maccatalyst, macos, tvos, windows; plus browser from net8.0 onward).
Included target framework (in package): net5.0 (no dependencies).
Learn more about Target Frameworks and .NET Standard.
NuGet packages (19)

Showing the top 5 NuGet packages that depend on OllamaSharp:

  • Microsoft.KernelMemory.AI.Ollama: Provides access to Ollama LLM models in Kernel Memory to generate embeddings and text.

  • Microsoft.SemanticKernel.Connectors.Ollama: Semantic Kernel connector for Ollama. Contains services for text generation, chat completion and text embeddings.

  • CommunityToolkit.Aspire.Hosting.Ollama: An Aspire integration leveraging the Ollama container with support for downloading a model on startup.

  • CommunityToolkit.Aspire.OllamaSharp: A .NET Aspire client integration for the OllamaSharp library.

  • EnergyAssembly: Package Description

GitHub repositories (7)

Showing the top 7 popular GitHub repositories that depend on OllamaSharp:

  • microsoft/semantic-kernel: Integrate cutting-edge LLM technology quickly and easily into your apps.

  • microsoft/kernel-memory: RAG architecture: index and query any data using LLM and natural language, track sources, show citations, asynchronous memory patterns.

  • dotnet/ai-samples

  • CommunityToolkit/Aspire: A community project with additional components and extensions for .NET Aspire.

  • PowerShell/AIShell: An interactive shell to work with AI-powered assistance providers.

  • davidfowl/aspire-ai-chat-demo: Aspire AI Chat is a full-stack chat sample that combines modern technologies to deliver a ChatGPT-like experience.

  • lindexi/lindexi_gd: Code used by the blog.
Version Downloads Last Updated
5.2.3 1,126 6/23/2025
5.2.2 8,753 5/30/2025
5.2.1 292 5/30/2025
5.1.20 313 5/30/2025
5.1.19 2,481 5/23/2025
5.1.18 6,055 5/19/2025
5.1.17 827 5/16/2025
5.1.16 1,130 5/13/2025
5.1.15 366 5/13/2025
5.1.14 10,191 5/6/2025
5.1.13 8,692 4/10/2025
5.1.12 88,256 4/10/2025
5.1.11 2,120 4/9/2025
5.1.10 1,560 4/7/2025
5.1.9 3,853 3/27/2025
5.1.8 208 3/27/2025
5.1.7 57,236 3/14/2025
5.1.6 193 3/14/2025
5.1.5 991 3/13/2025
5.1.4 2,753 3/5/2025
5.1.3 484 3/4/2025
5.1.2 21,667 2/24/2025
5.1.1 1,149 2/21/2025
5.1.0 197 2/21/2025
5.0.7 33,297 2/17/2025
5.0.6 27,248 2/3/2025
5.0.5 1,264 1/31/2025
5.0.4 3,943 1/27/2025
5.0.3 3,092 1/22/2025
5.0.2 8,506 1/15/2025
5.0.1 379 1/15/2025
4.0.22 9,411 1/10/2025
4.0.21 157 1/9/2025
4.0.20 396 1/8/2025
4.0.19 88 1/8/2025
4.0.18 453 1/8/2025
4.0.17 35,415 1/3/2025
4.0.16 128 1/3/2025
4.0.15 130 1/3/2025
4.0.14 136 1/3/2025
4.0.13 122 1/3/2025
4.0.12 123 1/3/2025
4.0.11 30,057 12/9/2024
4.0.10 117 12/9/2024
4.0.9 1,852 12/2/2024
4.0.8 34,514 11/22/2024
4.0.7 4,111 11/13/2024
4.0.6 13,244 11/7/2024
4.0.5 1,347 11/5/2024
4.0.4 598 11/4/2024
4.0.3 26,481 10/30/2024
4.0.2 300 10/29/2024
4.0.1 2,185 10/26/2024
4.0.0-preview.10 119 10/23/2024
4.0.0-preview.9 74 10/21/2024
4.0.0-preview.8 105 10/17/2024
3.0.15 4,981 10/21/2024
3.0.14 9,931 10/16/2024
3.0.13 134 10/16/2024
3.0.12 20,940 10/14/2024
3.0.11 1,490 10/9/2024
3.0.10 20,957 10/4/2024
3.0.9 147 10/4/2024
3.0.8 16,731 9/26/2024
3.0.7 33,551 9/12/2024
3.0.6 1,238 9/11/2024
3.0.5 828 9/11/2024
3.0.4 26,596 9/6/2024
3.0.3 151 9/6/2024
3.0.2 538 9/5/2024
3.0.1 22,337 9/2/2024
3.0.0 2,168 8/26/2024
2.1.3 2,727 8/23/2024
2.1.2 2,610 8/19/2024
2.1.1 4,370 8/5/2024
2.0.15 242 8/5/2024
2.0.14 141 8/3/2024
2.0.13 1,948 7/29/2024
2.0.12 153 7/28/2024
2.0.11 200 7/28/2024
2.0.10 4,937 7/12/2024
2.0.9 131 7/12/2024
2.0.8 171 7/12/2024
2.0.7 1,777 7/10/2024
2.0.6 3,162 6/25/2024
2.0.5 142 6/25/2024
2.0.4 607 6/24/2024
2.0.3 140 6/24/2024
2.0.2 198 6/24/2024
2.0.1 3,974 6/5/2024
1.1.13 522 6/5/2024
1.1.12 578 6/4/2024
1.1.11 335 6/2/2024
1.1.10 1,335 5/31/2024
1.1.9 7,287 5/15/2024
1.1.8 1,409 5/10/2024
1.1.7 156 5/10/2024
1.1.5 155 5/10/2024
1.1.4 286 5/10/2024
1.1.3 158 5/10/2024
1.1.2 155 5/10/2024
1.1.1 3,522 3/27/2024
1.1.0 2,200 1/8/2024
1.0.4 339 12/27/2023
1.0.3 415 11/30/2023
1.0.2 423 11/5/2023
1.0.1 250 10/16/2023
1.0.0 1,996 10/16/2023