StableDiffusion.ML.OnnxRuntime
1.1.1
dotnet add package StableDiffusion.ML.OnnxRuntime --version 1.1.1
NuGet\Install-Package StableDiffusion.ML.OnnxRuntime -Version 1.1.1
<PackageReference Include="StableDiffusion.ML.OnnxRuntime" Version="1.1.1" />
paket add StableDiffusion.ML.OnnxRuntime --version 1.1.1
#r "nuget: StableDiffusion.ML.OnnxRuntime, 1.1.1"
// Install StableDiffusion.ML.OnnxRuntime as a Cake Addin
#addin nuget:?package=StableDiffusion.ML.OnnxRuntime&version=1.1.1

// Install StableDiffusion.ML.OnnxRuntime as a Cake Tool
#tool nuget:?package=StableDiffusion.ML.OnnxRuntime&version=1.1.1
Inference Stable Diffusion with C# and ONNX Runtime
This package contains the logic to run inference with the popular Stable Diffusion deep learning model in C#. Stable Diffusion models take a text prompt and create an image that represents the text.
How to use this NuGet package
Download the ONNX Stable Diffusion models from Hugging Face
Once you have selected a model version repo, click Files and Versions, then select the ONNX branch. If there isn't an ONNX model branch available, use the main branch and convert it to ONNX. See the ONNX conversion tutorial for PyTorch for more information.
Clone the model repo:
git lfs install
git clone https://huggingface.co/CompVis/stable-diffusion-v1-4 -b onnx
Copy the folders with the ONNX files to the C# project folder models. The folders to copy are: unet, vae_decoder, text_encoder, safety_checker.
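To make the model files available at runtime, one option is to copy the models folder to the build output. A minimal sketch for an SDK-style project file (the models path matches the folder from the step above; adjust it if yours differs):

<ItemGroup>
  <None Include="models\**\*" CopyToOutputDirectory="PreserveNewest" />
</ItemGroup>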
Install the following NuGets for DirectML:
<PackageReference Include="Microsoft.ML" Version="2.0.1" />
<PackageReference Include="Microsoft.ML.OnnxRuntime.DirectML" Version="1.14.1" />
CUDA support coming soon.
Sample logic for implementing in your project
// Default arguments
var prompt = "a fireplace in an old cabin in the woods";
Console.WriteLine(prompt);

var config = new StableDiffusionConfig
{
    // Number of denoising steps
    NumInferenceSteps = 15,
    // Scale for classifier-free guidance
    GuidanceScale = 7.5,
    // Set your preferred Execution Provider. Currently DirectML and CPU are supported.
    ExecutionProviderTarget = StableDiffusionConfig.ExecutionProvider.DirectML,
    // Set the GPU device ID.
    DeviceId = 1,
    // Update the paths to your models.
    TextEncoderOnnxPath = @".\models\text_encoder\model.onnx",
    UnetOnnxPath = @".\models\unet\model.onnx",
    VaeDecoderOnnxPath = @".\models\vae_decoder\model.onnx",
    SafetyModelPath = @".\models\safety_checker\model.onnx",
};

// Run Stable Diffusion inference
var image = UNet.Inference(prompt, config);

// If image generation failed or the result was flagged as unsafe, null is returned.
if (image == null)
{
    Console.WriteLine("Unable to create image, please try again.");
}
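The package depends on SixLabors.ImageSharp, so the returned image can most likely be written to disk directly. A minimal sketch, assuming Inference returns an ImageSharp Image and that "sample.png" is just an example output path:

if (image != null)
{
    // Assumption: image is a SixLabors.ImageSharp.Image; Save infers the encoder from the file extension.
    image.Save("sample.png");
}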
Set the build target to x64.
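One way to do this is in the project file rather than the Visual Studio configuration manager; a minimal sketch, assuming an SDK-style project:

<PropertyGroup>
  <PlatformTarget>x64</PlatformTarget>
</PropertyGroup>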
Hit F5 to run the project in Visual Studio, or dotnet run in the terminal to run the project in VS Code.
Compatible and additional computed target framework versions:

Product | Versions
---|---
.NET | net6.0 is compatible. net6.0-android was computed. net6.0-ios was computed. net6.0-maccatalyst was computed. net6.0-macos was computed. net6.0-tvos was computed. net6.0-windows was computed. net7.0 was computed. net7.0-android was computed. net7.0-ios was computed. net7.0-maccatalyst was computed. net7.0-macos was computed. net7.0-tvos was computed. net7.0-windows was computed. net8.0 was computed. net8.0-android was computed. net8.0-browser was computed. net8.0-ios was computed. net8.0-maccatalyst was computed. net8.0-macos was computed. net8.0-tvos was computed. net8.0-windows was computed. |
Dependencies (net6.0)
- MathNet.Numerics (>= 5.0.0)
- Microsoft.ML (>= 2.0.1)
- Microsoft.ML.OnnxRuntime.DirectML (>= 1.14.1)
- Microsoft.ML.OnnxRuntime.Managed (>= 1.14.1)
- NumSharp (>= 0.30.0)
- SixLabors.ImageSharp (>= 2.1.4)
NuGet packages (1)
Showing the top 1 NuGet packages that depend on StableDiffusion.ML.OnnxRuntime:
Package | Downloads
---|---
Frank.SemanticKernel.Connectors.OnnxRuntime.StableDiffusion (Package Description) |
GitHub repositories
This package is not used by any popular GitHub repositories.