OllamaSharp 4.0.7
See the version list below for details.
dotnet add package OllamaSharp --version 4.0.7
NuGet\Install-Package OllamaSharp -Version 4.0.7
<PackageReference Include="OllamaSharp" Version="4.0.7" />
paket add OllamaSharp --version 4.0.7
#r "nuget: OllamaSharp, 4.0.7"
// Install OllamaSharp as a Cake Addin
#addin nuget:?package=OllamaSharp&version=4.0.7
// Install OllamaSharp as a Cake Tool
#tool nuget:?package=OllamaSharp&version=4.0.7
<p align="center"> <img alt="ollama" height="200px" src="https://github.com/awaescher/OllamaSharp/blob/main/Ollama.png"> </p>
OllamaSharp 🦙
OllamaSharp provides .NET bindings for the Ollama API, simplifying interactions with Ollama both locally and remotely.
✅ Supporting Microsoft.Extensions.AI and Microsoft Semantic Kernel
Features
- Ease of use: Interact with Ollama in just a few lines of code.
- API endpoint coverage: Support for all the Ollama API endpoints, including chats, embeddings, listing models, pulling and creating new models, and more.
- Real-time streaming: Stream responses directly to your application.
- Progress reporting: Get real-time progress feedback on tasks like model pulling.
- Support for vision models and tools (function calling).
Usage
OllamaSharp wraps each Ollama API endpoint in awaitable methods that fully support response streaming.
The following sections show a few simple code examples.
ℹ Try our full-featured demo application that's included in this repository.
Initializing
// set up the client
var uri = new Uri("http://localhost:11434");
var ollama = new OllamaApiClient(uri);
// select a model which should be used for further operations
ollama.SelectedModel = "llama3.1:8b";
Listing all models that are available locally
var models = await ollama.ListLocalModelsAsync();
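The returned collection can be enumerated directly; a minimal sketch, assuming each entry exposes `Name` and `Size` properties as in current OllamaSharp versions:

```csharp
// list all models that Ollama holds locally and print name and size
var models = await ollama.ListLocalModelsAsync();

foreach (var model in models)
    Console.WriteLine($"{model.Name} ({model.Size} bytes)");
```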
Pulling a model and reporting progress
await foreach (var status in ollama.PullModelAsync("llama3.1:405b"))
Console.WriteLine($"{status.Percent}% {status.Status}");
Generating a completion directly into the console
await foreach (var stream in ollama.GenerateAsync("How are you today?"))
Console.Write(stream.Response);
Building interactive chats
var chat = new Chat(ollama);
while (true)
{
var message = Console.ReadLine();
await foreach (var answerToken in chat.SendAsync(message))
Console.Write(answerToken);
}
// messages including their roles and tool calls will automatically be tracked within the chat object
// and are accessible via the Messages property
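A chat can also be seeded with a system prompt, and the tracked history inspected afterwards. This is a sketch assuming the `Chat` constructor accepts an optional system prompt and that tracked messages expose `Role` and `Content`, as in recent OllamaSharp versions:

```csharp
// start the chat with a system prompt (optional constructor argument)
var chat = new Chat(ollama, "You are a helpful assistant that answers briefly.");

await foreach (var answerToken in chat.SendAsync("Why is the sky blue?"))
    Console.Write(answerToken);

// inspect the automatically tracked conversation history
foreach (var message in chat.Messages)
    Console.WriteLine($"{message.Role}: {message.Content}");
```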
Usage with Microsoft.Extensions.AI
Microsoft built an abstraction library to streamline the usage of different AI providers. This is a really interesting concept if you plan to build apps that might use different providers, like ChatGPT, Claude and local models with Ollama.
I encourage you to read their announcement Introducing Microsoft.Extensions.AI Preview – Unified AI Building Blocks for .NET.
OllamaSharp is the first full implementation of their IChatClient and IEmbeddingGenerator interfaces, which makes it possible to use Ollama just like any other chat provider. To do this, simply use the OllamaApiClient as an IChatClient instead of IOllamaApiClient.
// install package Microsoft.Extensions.AI.Abstractions
private static IChatClient CreateChatClient(Arguments arguments)
{
if (arguments.Provider.Equals("ollama", StringComparison.OrdinalIgnoreCase))
return new OllamaApiClient(arguments.Uri, arguments.Model);
else
return new OpenAIChatClient(new OpenAI.OpenAIClient(arguments.ApiKey), arguments.Model); // ChatGPT or compatible
}
[!NOTE]
IOllamaApiClient provides many Ollama-specific methods that IChatClient and IEmbeddingGenerator miss. Because these are abstractions, IChatClient and IEmbeddingGenerator will never implement the full Ollama API specification. However, OllamaApiClient implements three interfaces: the native IOllamaApiClient plus Microsoft's IChatClient and IEmbeddingGenerator<string, Embedding<float>>, which allows you to cast it to any of these interfaces as you need them at any time.
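Because a single OllamaApiClient instance implements all three interfaces, the cast is a plain assignment; a minimal sketch based on the interfaces named above:

```csharp
var client = new OllamaApiClient(new Uri("http://localhost:11434"), "llama3.1:8b");

// native Ollama surface, e.g. for pulling models or listing them
IOllamaApiClient ollamaApi = client;

// Microsoft.Extensions.AI abstractions for provider-agnostic code
IChatClient chatClient = client;
IEmbeddingGenerator<string, Embedding<float>> embeddingGenerator = client;
```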
Credits
The icon and name were reused from the amazing Ollama project.
I would like to thank all the contributors who take the time to improve OllamaSharp. First and foremost mili-tan, who always keeps OllamaSharp in sync with the Ollama API. ❤
Compatible and additional computed target framework versions:
Product | Versions |
---|---|
.NET | net5.0 was computed. net5.0-windows was computed. net6.0 was computed. net6.0-android was computed. net6.0-ios was computed. net6.0-maccatalyst was computed. net6.0-macos was computed. net6.0-tvos was computed. net6.0-windows was computed. net7.0 was computed. net7.0-android was computed. net7.0-ios was computed. net7.0-maccatalyst was computed. net7.0-macos was computed. net7.0-tvos was computed. net7.0-windows was computed. net8.0 was computed. net8.0-android was computed. net8.0-browser was computed. net8.0-ios was computed. net8.0-maccatalyst was computed. net8.0-macos was computed. net8.0-tvos was computed. net8.0-windows was computed. |
.NET Core | netcoreapp2.0 was computed. netcoreapp2.1 was computed. netcoreapp2.2 was computed. netcoreapp3.0 was computed. netcoreapp3.1 was computed. |
.NET Standard | netstandard2.0 is compatible. netstandard2.1 was computed. |
.NET Framework | net461 was computed. net462 was computed. net463 was computed. net47 was computed. net471 was computed. net472 was computed. net48 was computed. net481 was computed. |
MonoAndroid | monoandroid was computed. |
MonoMac | monomac was computed. |
MonoTouch | monotouch was computed. |
Tizen | tizen40 was computed. tizen60 was computed. |
Xamarin.iOS | xamarinios was computed. |
Xamarin.Mac | xamarinmac was computed. |
Xamarin.TVOS | xamarintvos was computed. |
Xamarin.WatchOS | xamarinwatchos was computed. |
Dependencies
.NETStandard 2.0
- Microsoft.Bcl.AsyncInterfaces (>= 8.0.0)
- Microsoft.Extensions.AI.Abstractions (>= 9.0.0-preview.9.24556.5)
NuGet packages (9)
Showing the top 5 NuGet packages that depend on OllamaSharp:
Package | Description |
---|---|
Microsoft.KernelMemory.AI.Ollama | Provides access to Ollama LLM models in Kernel Memory to generate embeddings and text |
Microsoft.SemanticKernel.Connectors.Ollama | Semantic Kernel connector for Ollama. Contains services for text generation, chat completion and text embeddings. |
CommunityToolkit.Aspire.Hosting.Ollama | An Aspire integration leveraging the Ollama container with support for downloading a model on startup. |
CommunityToolkit.Aspire.OllamaSharp | A .NET Aspire client integration for the OllamaSharp library. |
Atc.SemanticKernel.Connectors.Ollama | Contains a connector for integrating with local LLMs through Ollama. |
GitHub repositories
This package is not used by any popular GitHub repositories.
Version | Downloads | Last updated |
---|---|---|
4.0.8 | 623 | 11/22/2024 |
4.0.7 | 2,743 | 11/13/2024 |
4.0.6 | 6,667 | 11/7/2024 |
4.0.5 | 1,277 | 11/5/2024 |
4.0.4 | 285 | 11/4/2024 |
4.0.3 | 2,762 | 10/30/2024 |
4.0.2 | 247 | 10/29/2024 |
4.0.1 | 2,040 | 10/26/2024 |
4.0.0-preview.10 | 70 | 10/23/2024 |
4.0.0-preview.9 | 46 | 10/21/2024 |
4.0.0-preview.8 | 69 | 10/17/2024 |
3.0.15 | 2,746 | 10/21/2024 |
3.0.14 | 5,483 | 10/16/2024 |
3.0.13 | 76 | 10/16/2024 |
3.0.12 | 11,764 | 10/14/2024 |
3.0.11 | 1,292 | 10/9/2024 |
3.0.10 | 8,350 | 10/4/2024 |
3.0.9 | 91 | 10/4/2024 |
3.0.8 | 11,294 | 9/26/2024 |
3.0.7 | 15,039 | 9/12/2024 |
3.0.6 | 1,145 | 9/11/2024 |
3.0.5 | 782 | 9/11/2024 |
3.0.4 | 24,416 | 9/6/2024 |
3.0.3 | 108 | 9/6/2024 |
3.0.2 | 361 | 9/5/2024 |
3.0.1 | 15,766 | 9/2/2024 |
3.0.0 | 1,916 | 8/26/2024 |
2.1.3 | 2,502 | 8/23/2024 |
2.1.2 | 2,089 | 8/19/2024 |
2.1.1 | 4,097 | 8/5/2024 |
2.0.15 | 116 | 8/5/2024 |
2.0.14 | 95 | 8/3/2024 |
2.0.13 | 1,665 | 7/29/2024 |
2.0.12 | 98 | 7/28/2024 |
2.0.11 | 108 | 7/28/2024 |
2.0.10 | 3,640 | 7/12/2024 |
2.0.9 | 93 | 7/12/2024 |
2.0.8 | 125 | 7/12/2024 |
2.0.7 | 1,488 | 7/10/2024 |
2.0.6 | 3,073 | 6/25/2024 |
2.0.5 | 99 | 6/25/2024 |
2.0.4 | 566 | 6/24/2024 |
2.0.3 | 100 | 6/24/2024 |
2.0.2 | 139 | 6/24/2024 |
2.0.1 | 3,653 | 6/5/2024 |
1.1.13 | 175 | 6/5/2024 |
1.1.12 | 395 | 6/4/2024 |
1.1.11 | 288 | 6/2/2024 |
1.1.10 | 899 | 5/31/2024 |
1.1.9 | 6,594 | 5/15/2024 |
1.1.8 | 1,355 | 5/10/2024 |
1.1.7 | 113 | 5/10/2024 |
1.1.5 | 107 | 5/10/2024 |
1.1.4 | 159 | 5/10/2024 |
1.1.3 | 107 | 5/10/2024 |
1.1.2 | 110 | 5/10/2024 |
1.1.1 | 3,030 | 3/27/2024 |
1.1.0 | 1,786 | 1/8/2024 |
1.0.4 | 289 | 12/27/2023 |
1.0.3 | 332 | 11/30/2023 |
1.0.2 | 303 | 11/5/2023 |
1.0.1 | 185 | 10/16/2023 |
1.0.0 | 1,419 | 10/16/2023 |