Microsoft.DeepDev.TokenizerLib 1.3.1

There is a newer version of this package available.
See the version list below for details.
dotnet add package Microsoft.DeepDev.TokenizerLib --version 1.3.1                
NuGet\Install-Package Microsoft.DeepDev.TokenizerLib -Version 1.3.1                
This command is intended to be used within the Package Manager Console in Visual Studio, as it uses the NuGet module's version of Install-Package.
<PackageReference Include="Microsoft.DeepDev.TokenizerLib" Version="1.3.1" />                
For projects that support PackageReference, copy this XML node into the project file to reference the package.
paket add Microsoft.DeepDev.TokenizerLib --version 1.3.1                
#r "nuget: Microsoft.DeepDev.TokenizerLib, 1.3.1"                
#r directive can be used in F# Interactive and Polyglot Notebooks. Copy this into the interactive tool or source code of the script to reference the package.
// Install Microsoft.DeepDev.TokenizerLib as a Cake Addin
#addin nuget:?package=Microsoft.DeepDev.TokenizerLib&version=1.3.1

// Install Microsoft.DeepDev.TokenizerLib as a Cake Tool
#tool nuget:?package=Microsoft.DeepDev.TokenizerLib&version=1.3.1                

Tokenizer

This repo contains C# and TypeScript implementations of a byte pair encoding (BPE) tokenizer for OpenAI LLMs, based on the open-sourced Rust implementation in OpenAI's tiktoken. Both implementations make it possible to run prompt tokenization in .NET and Node.js environments before feeding a prompt into an LLM.

C# implementation

TokenizerLib targets .NET Standard 2.0, so it can be consumed by projects on .NET Core 2.0, .NET Framework 4.6.1, or any later version of .NET.

You can download and install the TokenizerLib NuGet package here.

Example C# code using TokenizerLib. In a production setting, you should pre-download the BPE rank file and call the TokenizerBuilder.CreateTokenizer API so the file is not downloaded on the fly.

```csharp
using System;
using System.Collections.Generic;
using Microsoft.DeepDev;

var IM_START = "<|im_start|>";
var IM_END = "<|im_end|>";

// Map the chat special tokens to their token ids.
var specialTokens = new Dictionary<string, int>
{
    { IM_START, 100264 },
    { IM_END, 100265 },
};

// Downloads the BPE rank file for the model on the fly (see the note above).
var tokenizer = TokenizerBuilder.CreateByModelName("gpt-4", specialTokens);

var text = "<|im_start|>Hello World<|im_end|>";

// Pass the special tokens that are allowed to appear in the input.
var encoded = tokenizer.Encode(text, new HashSet<string>(specialTokens.Keys));
Console.WriteLine(encoded.Count);

var decoded = tokenizer.Decode(encoded.ToArray());
Console.WriteLine(decoded);
```
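
For the production path noted above, a minimal sketch of the offline setup might look like the following. The local file name cl100k_base.tiktoken, the download URL, the cl100k_base split regex, and the exact CreateTokenizer parameter list are assumptions for illustration; check TokenizerBuilder in the repo for the current signature and use the rank file that matches your model.

```csharp
using System;
using System.Collections.Generic;
using System.IO;
using Microsoft.DeepDev;

// Same special-token map as in the example above.
var specialTokens = new Dictionary<string, int>
{
    { "<|im_start|>", 100264 },
    { "<|im_end|>", 100265 },
};

// Token-splitting regex of the cl100k_base encoding (as used by tiktoken);
// verify it matches the encoding of your model.
const string cl100kPattern =
    @"(?i:'s|'t|'re|'ve|'m|'ll|'d)|[^\r\n\p{L}\p{N}]?\p{L}+|\p{N}{1,3}| ?[^\s\p{L}\p{N}]+[\r\n]*|\s*[\r\n]+|\s+(?!\S)|\s+";

// Assumes cl100k_base.tiktoken was downloaded ahead of time (for example from
// https://openaipublic.blob.core.windows.net/encodings/cl100k_base.tiktoken)
// and is shipped alongside the application, so no network call happens here.
using var bpeStream = File.OpenRead("cl100k_base.tiktoken");

// Parameter order is an assumption; consult TokenizerBuilder for the exact signature.
var tokenizer = TokenizerBuilder.CreateTokenizer(bpeStream, specialTokens, cl100kPattern);

var encoded = tokenizer.Encode("<|im_start|>Hello World<|im_end|>",
                               new HashSet<string>(specialTokens.Keys));
Console.WriteLine(encoded.Count);
```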

C# performance benchmark

Benchmark results from PerfBenchmark.csproj:

BenchmarkDotNet=v0.13.3, OS=Windows 11 (10.0.22621.1702)
Intel Core i7-1065G7 CPU 1.30GHz, 1 CPU, 8 logical and 4 physical cores
.NET SDK=7.0.300-preview.23179.2
  [Host]     : .NET 6.0.16 (6.0.1623.17311), X64 RyuJIT AVX2
  DefaultJob : .NET 6.0.16 (6.0.1623.17311), X64 RyuJIT AVX2

| Method |    Mean |    Error |   StdDev |
|------- |--------:|---------:|---------:|
| Encode | 2.414 s | 0.0303 s | 0.0253 s |
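
PerfBenchmark.csproj itself is not reproduced here; the sketch below shows how an equivalent Encode benchmark could be written with BenchmarkDotNet. The corpus file name, the gpt-4 model choice with the chat special tokens, and the ITokenizer field type are assumptions for illustration rather than the actual contents of the project.

```csharp
using System.Collections.Generic;
using System.IO;
using BenchmarkDotNet.Attributes;
using BenchmarkDotNet.Running;
using Microsoft.DeepDev;

public class EncodeBenchmark
{
    private ITokenizer _tokenizer = null!;
    private HashSet<string> _allowedSpecial = null!;
    private string _text = null!;

    [GlobalSetup]
    public void Setup()
    {
        var specialTokens = new Dictionary<string, int>
        {
            { "<|im_start|>", 100264 },
            { "<|im_end|>", 100265 },
        };
        _tokenizer = TokenizerBuilder.CreateByModelName("gpt-4", specialTokens);
        _allowedSpecial = new HashSet<string>(specialTokens.Keys);

        // Placeholder input: any sufficiently large text file to tokenize.
        _text = File.ReadAllText("sample_corpus.txt");
    }

    [Benchmark]
    public int Encode() => _tokenizer.Encode(_text, _allowedSpecial).Count;
}

public static class Program
{
    public static void Main() => BenchmarkRunner.Run<EncodeBenchmark>();
}
```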

TypeScript implementation

Install the npm package in your project:

```bash
npm install @microsoft/tiktokenizer
```

Example TypeScript code using @microsoft/tiktokenizer in your code:

```typescript
import { createByModelName } from '@microsoft/tiktokenizer';

const IM_START = "<|im_start|>";
const IM_END = "<|im_end|>";

// Map the chat special tokens to their token ids.
const specialTokens: ReadonlyMap<string, number> = new Map([
  [IM_START, 100264],
  [IM_END, 100265],
]);

const str = "<|im_start|>Hello World<|im_end|>";
let tokenizer = null;

const createTokenizer = async () => {
  // Create a tokenizer for the given model with the extra special tokens.
  tokenizer = await createByModelName("gpt-3.5-turbo", specialTokens);
  // encodeTrimSuffix limits the output to at most 3 tokens, trimming from the end.
  const out = tokenizer.encodeTrimSuffix(str, 3, Array.from(specialTokens.keys()));
  console.log(out);
};
createTokenizer();
```

Contributing

We welcome contributions. Please follow these guidelines.

Trademarks

This project may contain trademarks or logos for projects, products, or services. Authorized use of Microsoft trademarks or logos is subject to and must follow Microsoft's Trademark & Brand Guidelines. Use of Microsoft trademarks or logos in modified versions of this project must not cause confusion or imply Microsoft sponsorship. Any use of third-party trademarks or logos is subject to those third parties' policies.

| Product | Compatible and additional computed target framework versions |
| ------- | ------------------------------------------------------------- |
| .NET | net5.0, net5.0-windows, net6.0, net6.0-android, net6.0-ios, net6.0-maccatalyst, net6.0-macos, net6.0-tvos, net6.0-windows, net7.0, net7.0-android, net7.0-ios, net7.0-maccatalyst, net7.0-macos, net7.0-tvos, net7.0-windows, net8.0, net8.0-android, net8.0-browser, net8.0-ios, net8.0-maccatalyst, net8.0-macos, net8.0-tvos, net8.0-windows (all computed) |
| .NET Core | netcoreapp2.0, netcoreapp2.1, netcoreapp2.2, netcoreapp3.0, netcoreapp3.1 (all computed) |
| .NET Standard | netstandard2.0 (compatible), netstandard2.1 (computed) |
| .NET Framework | net461, net462, net463, net47, net471, net472, net48, net481 (all computed) |
| MonoAndroid | monoandroid (computed) |
| MonoMac | monomac (computed) |
| MonoTouch | monotouch (computed) |
| Tizen | tizen40, tizen60 (computed) |
| Xamarin.iOS | xamarinios (computed) |
| Xamarin.Mac | xamarinmac (computed) |
| Xamarin.TVOS | xamarintvos (computed) |
| Xamarin.WatchOS | xamarinwatchos (computed) |
The package itself targets .NETStandard 2.0 and has no dependencies.

NuGet packages (5)

Showing the top 5 NuGet packages that depend on Microsoft.DeepDev.TokenizerLib:

• Microsoft.DotNet.Interactive.AIUtilities: Utilities for AI workloads in .NET Interactive and Polyglot Notebooks
• LangChain.NET: Provides the ability to build applications with LLMs through composability
• Cnblogs.KernelMemory.AI.DashScope: Provides access to DashScope LLM models in Kernel Memory to generate embeddings and text
• FoundationaLLM.Common: A .NET library that the FoundationaLLM.Client.Core and FoundationaLLM.Client.Management client libraries share as a common dependency. Do not directly import and use this library.
• ContextFlow: Package Description

GitHub repositories (5)

Showing the top 5 popular GitHub repositories that depend on Microsoft.DeepDev.TokenizerLib:

• microsoft/semantic-kernel: Integrate cutting-edge LLM technology quickly and easily into your apps
• dotnet/ResXResourceManager: Manage localization of all ResX-based resources in one central place.
• axzxs2001/Asp.NetCoreExperiment: All earlier projects have been moved to the **OleVersion** directory for preservation. New samples mainly target .NET 5.0; some upgrade earlier samples and others distill past working experience, for reference.
• dmitry-brazhenko/SharpToken: SharpToken is a C# library for tokenizing natural language text. It's based on the tiktoken Python library and designed to be fast and accurate.
• Azure/Vector-Search-AI-Assistant: Microsoft official Build Modern AI Apps reference solutions and content. Demonstrates how to build Copilot applications that incorporate hero Azure services including Azure OpenAI Service, Azure Container Apps (or AKS), and Azure Cosmos DB for NoSQL with vector search.

| Version | Downloads | Last updated |
| ------- | --------: | ------------ |
| 1.3.3   | 268,456   | 1/11/2024    |
| 1.3.2   | 199,792   | 6/20/2023    |
| 1.3.1   | 11,866    | 5/12/2023    |
| 1.3.0   | 2,749     | 4/6/2023     |