Anthropic.SDK
4.4.0
See the version list below for details.
dotnet add package Anthropic.SDK --version 4.4.0
NuGet\Install-Package Anthropic.SDK -Version 4.4.0
<PackageReference Include="Anthropic.SDK" Version="4.4.0" />
paket add Anthropic.SDK --version 4.4.0
#r "nuget: Anthropic.SDK, 4.4.0"
// Install Anthropic.SDK as a Cake Addin
#addin nuget:?package=Anthropic.SDK&version=4.4.0
// Install Anthropic.SDK as a Cake Tool
#tool nuget:?package=Anthropic.SDK&version=4.4.0
Anthropic.SDK
Anthropic.SDK is an unofficial C# client for the Claude AI API. It simplifies integrating Claude AI into your C# applications and targets .NET Standard 2.0, .NET 6.0, and .NET 8.0.
Installation
Install Anthropic.SDK via the NuGet package manager:
PM> Install-Package Anthropic.SDK
API Keys
The API key is loaded from an environment variable named ANTHROPIC_API_KEY by default. Alternatively, you can supply it as a string to the AnthropicClient constructor.
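For example, a minimal sketch of both approaches (the key shown is a placeholder, not a real value):
// Reads the key from the ANTHROPIC_API_KEY environment variable
var clientFromEnv = new AnthropicClient();
// Or pass the key directly to the constructor
var clientWithKey = new AnthropicClient("my-api-key-placeholder");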
HttpClient
The AnthropicClient constructor can optionally take a custom HttpClient, which allows you to control elements such as retries and timeouts. Note: if you provide your own HttpClient, you are responsible for disposing of it.
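A brief sketch of supplying your own HttpClient, assuming a constructor overload that accepts the API key followed by the HttpClient (check the constructor signatures for the exact overloads; the key is a placeholder):
// Configure your own HttpClient to control timeouts, retries, etc.
var httpClient = new HttpClient { Timeout = TimeSpan.FromSeconds(120) };
var client = new AnthropicClient("my-api-key-placeholder", httpClient);
// Since you supplied the HttpClient, dispose of it yourself when finished.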
Usage
There are two ways to start using the AnthropicClient. The first is to simply create an instance of the AnthropicClient and start using it; the second is to use the messaging client with Microsoft.SemanticKernel. Brief examples of each are below.
Option 1:
var client = new AnthropicClient();
Option 2:
using Microsoft.SemanticKernel;
var skChatService =
new ChatClientBuilder()
.UseFunctionInvocation()
.Use(new AnthropicClient().Messages)
.AsChatCompletionService();
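Once built, the service can be used like any other Semantic Kernel chat completion service. A hedged sketch follows; GetChatMessageContentAsync is a Microsoft.SemanticKernel.ChatCompletion extension method, and the assumption that PromptExecutionSettings.ModelId flows through to the underlying Claude model selection should be verified against your setup:
using Microsoft.SemanticKernel;
using Microsoft.SemanticKernel.ChatCompletion;
// Assumption: the ModelId set here is mapped to the underlying ChatOptions.ModelId
var settings = new PromptExecutionSettings { ModelId = AnthropicModels.Claude35Sonnet };
var reply = await skChatService.GetChatMessageContentAsync("Write a haiku about the sea.", settings);
Console.WriteLine(reply);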
Examples
Non-Streaming Call
Here's an example of a non-streaming call to the new Claude 3.5 Sonnet model:
var client = new AnthropicClient();
var messages = new List<Message>()
{
new Message(RoleType.User, "Who won the world series in 2020?"),
new Message(RoleType.Assistant, "The Los Angeles Dodgers won the World Series in 2020."),
new Message(RoleType.User, "Where was it played?"),
};
var parameters = new MessageParameters()
{
Messages = messages,
MaxTokens = 1024,
Model = AnthropicModels.Claude35Sonnet,
Stream = false,
Temperature = 1.0m,
};
var firstResult = await client.Messages.GetClaudeMessageAsync(parameters);
//print result
Console.WriteLine(firstResult.Message.ToString());
//print remaining Request Limit
Console.WriteLine(firstResult.RateLimits.RequestsLimit.ToString());
//add assistant message to chain for second call
messages.Add(firstResult.Message);
//ask followup question in chain
messages.Add(new Message(RoleType.User,"Who were the starting pitchers for the Dodgers?"));
var finalResult = await client.Messages.GetClaudeMessageAsync(parameters);
//print result
Console.WriteLine(finalResult.Message.ToString());
Streaming Call
The following is an example of a streaming call to the Claude 3 Opus model that provides an image for analysis:
string resourceName = "Anthropic.SDK.Tests.Red_Apple.jpg";
// Get the current assembly
Assembly assembly = Assembly.GetExecutingAssembly();
// Get a stream to the embedded resource
await using Stream stream = assembly.GetManifestResourceStream(resourceName);
// Read the stream into a byte array
byte[] imageBytes;
using (var memoryStream = new MemoryStream())
{
await stream.CopyToAsync(memoryStream);
imageBytes = memoryStream.ToArray();
}
// Convert the byte array to a base64 string
string base64String = Convert.ToBase64String(imageBytes);
var client = new AnthropicClient();
var messages = new List<Message>();
messages.Add(new Message()
{
Role = RoleType.User,
Content = new List<ContentBase>()
{
new ImageContent()
{
Source = new ImageSource()
{
MediaType = "image/jpeg",
Data = base64String
}
},
new TextContent()
{
Text = "What is this a picture of?"
}
}
});
var parameters = new MessageParameters()
{
Messages = messages,
MaxTokens = 512,
Model = AnthropicModels.Claude3Opus,
Stream = true,
Temperature = 1.0m,
};
var outputs = new List<MessageResponse>();
await foreach (var res in client.Messages.StreamClaudeMessageAsync(parameters))
{
if (res.Delta != null)
{
Console.Write(res.Delta.Text);
}
outputs.Add(res);
}
Console.WriteLine(string.Empty);
Console.WriteLine($@"Used Tokens - Input:{outputs.First().StreamStartMessage.Usage.InputTokens}.
Output: {outputs.Last().Usage.OutputTokens}");
IChatClient
The AnthropicClient has support for the new IChatClient from Microsoft and offers a slightly different mechanism for using the client. Below are a few examples.
//function calling
IChatClient client = new ChatClientBuilder()
.UseFunctionInvocation()
.Use(new AnthropicClient().Messages);
ChatOptions options = new()
{
ModelId = AnthropicModels.Claude3Haiku,
MaxOutputTokens = 512,
Tools = [AIFunctionFactory.Create((string personName) => personName switch {
"Alice" => "25",
_ => "40"
}, "GetPersonAge", "Gets the age of the person whose name is specified.")]
};
var res = await client.CompleteAsync("How old is Alice?", options);
Assert.IsTrue(
res.Message.Text?.Contains("25") is true,
res.Message.Text);
//non-streaming
IChatClient client = new AnthropicClient().Messages;
ChatOptions options = new()
{
ModelId = AnthropicModels.Claude_v2_1,
MaxOutputTokens = 512,
Temperature = 1.0f,
};
var res = await client.CompleteAsync("Write a sonnet about the Statue of Liberty. The response must include the word green.", options);
Assert.IsTrue(res.Message.Text?.Contains("green") is true, res.Message.Text);
//streaming call
IChatClient client = new AnthropicClient().Messages;
ChatOptions options = new()
{
ModelId = AnthropicModels.Claude_v2_1,
MaxOutputTokens = 512,
Temperature = 1.0f,
};
StringBuilder sb = new();
await foreach (var res in client.CompleteStreamingAsync("Write a sonnet about the Statue of Liberty. The response must include the word green.", options))
{
sb.Append(res);
}
Assert.IsTrue(sb.ToString().Contains("green") is true, sb.ToString());
//Image call
string resourceName = "Anthropic.SDK.Tests.Red_Apple.jpg";
Assembly assembly = Assembly.GetExecutingAssembly();
await using Stream stream = assembly.GetManifestResourceStream(resourceName)!;
byte[] imageBytes;
using (var memoryStream = new MemoryStream())
{
await stream.CopyToAsync(memoryStream);
imageBytes = memoryStream.ToArray();
}
IChatClient client = new AnthropicClient().Messages;
var res = await client.CompleteAsync(
[
new ChatMessage(ChatRole.User,
[
new ImageContent(imageBytes, "image/jpeg"),
new TextContent("What is this a picture of?"),
])
], new()
{
ModelId = AnthropicModels.Claude3Opus,
MaxOutputTokens = 512,
Temperature = 0f,
});
Assert.IsTrue(res.Message.Text?.Contains("apple", StringComparison.OrdinalIgnoreCase) is true, res.Message.Text);
Please see the unit tests for even more examples.
Prompt Caching
The AnthropicClient supports prompt caching of system messages, user messages (including images), assistant messages, tool_results, and tools, in accordance with model limitations. Because the AnthropicClient does not have its own tokenizer, you must ensure that you provide enough context for the qualifying model to cache; otherwise nothing will be cached. Check the documentation on Anthropic's website for specific model limitations and requirements.
//load up a long form text you want to cache and ask questions of
string resourceName = "Anthropic.SDK.Tests.BillyBudd.txt";
Assembly assembly = Assembly.GetExecutingAssembly();
await using Stream stream = assembly.GetManifestResourceStream(resourceName);
using StreamReader reader = new StreamReader(stream);
string content = await reader.ReadToEndAsync();
var client = new AnthropicClient();
var systemMessages = new List<SystemMessage>()
{
//typical system message
new SystemMessage("You are an expert at analyzing literary texts."),
//entire contents of the long form text
new SystemMessage(content)
};
var messages = new List<Message>()
{
//first question to ask
new Message(RoleType.User, "What are the key literary themes of this novel?"),
};
var parameters = new MessageParameters()
{
Messages = messages,
MaxTokens = 1024,
Model = AnthropicModels.Claude35Sonnet,
Stream = false,
Temperature = 1.0m,
System = systemMessages,
//Key ingredient: we tell Claude we want it to cache messages
PromptCaching = PromptCacheType.Messages
};
var res = await client.Messages.GetClaudeMessageAsync(parameters);
Console.WriteLine(res.Message);
//proof that our messages were cached
Console.WriteLine(res.Usage.CacheCreationInputTokens);
//add assistant message
messages.Add(res.Message);
//ask question 2
messages.Add(new Message(RoleType.User, "Who is the main character and how old are they?"));
var res2 = await client.Messages.GetClaudeMessageAsync(parameters);
//proof that we hit the cache, this will be greater than 0
Console.WriteLine(res2.Usage.CacheReadInputTokens);
To cache tools (if you have a LOT of tools registered) and cache messages at the same time, you can simply combine the prompt caching types with a bitwise OR, like so:
var parameters = new MessageParameters()
{
Messages = messages,
MaxTokens = 1024,
Model = AnthropicModels.Claude35Sonnet,
Stream = false,
Temperature = 1.0m,
//Set caching as enabled for both messages and tools
PromptCaching = PromptCacheType.Messages | PromptCacheType.Tools,
Tools = tools
};
var res = await client.Messages.GetClaudeMessageAsync(parameters);
Additionally, there is a mode for fine-grained control of caching, where you manage the cache points yourself. Here, you declare the cache control setting at the message and tool level, giving you complete control.
string resourceName = "Anthropic.SDK.Tests.BillyBudd.txt";
Assembly assembly = Assembly.GetExecutingAssembly();
await using Stream stream = assembly.GetManifestResourceStream(resourceName);
using StreamReader reader = new StreamReader(stream);
string content = await reader.ReadToEndAsync();
var client = new AnthropicClient();
var messages = new List<Message>()
{
new Message(RoleType.User, "What are the key literary themes of this novel?"),
};
var systemMessages = new List<SystemMessage>()
{
new SystemMessage("You are an expert at analyzing literary texts."),
//set cache control manually
new SystemMessage(content, new CacheControl() { Type = CacheControlType.ephemeral })
};
var parameters = new MessageParameters()
{
Messages = messages,
MaxTokens = 1024,
Model = AnthropicModels.Claude35Sonnet,
Stream = false,
Temperature = 0m,
System = systemMessages,
//Set to fine-grained, manual checkpoint caching
PromptCaching = PromptCacheType.FineGrained
};
var res = await client.Messages.GetClaudeMessageAsync(parameters);
Console.WriteLine(res.Message);
//will be greater than 0
Console.WriteLine(res.Usage.CacheCreationInputTokens);
//cache an assistant message
res.Message.Content.First().CacheControl = new CacheControl() { Type = CacheControlType.ephemeral };
messages.Add(res.Message);
messages.Add(new Message(RoleType.User, "Who is the main character and how old are they?"));
var res2 = await client.Messages.GetClaudeMessageAsync(parameters);
//will be greater than 0
Console.WriteLine(res2.Usage.CacheReadInputTokens);
//more turns
See unit tests for additional examples.
PDF Support
The AnthropicClient supports the new PDF upload mechanism enabled by Claude.
string resourceName = "Anthropic.SDK.Tests.Claude3ModelCard.pdf";
Assembly assembly = Assembly.GetExecutingAssembly();
await using Stream stream = assembly.GetManifestResourceStream(resourceName);
//read stream into byte array
using var ms = new MemoryStream();
await stream.CopyToAsync(ms);
byte[] pdfBytes = ms.ToArray();
string base64String = Convert.ToBase64String(pdfBytes);
var client = new AnthropicClient();
var messages = new List<Message>()
{
new Message(RoleType.User, new DocumentContent()
{
Source = new ImageSource()
{
Data = base64String,
MediaType = "application/pdf"
},
CacheControl = new CacheControl()
{
Type = CacheControlType.ephemeral
}
}),
new Message(RoleType.User, "Which model has the highest human preference win rates across each use-case?"),
};
var parameters = new MessageParameters()
{
Messages = messages,
MaxTokens = 1024,
Model = AnthropicModels.Claude35Sonnet,
Stream = false,
Temperature = 0m,
PromptCaching = PromptCacheType.FineGrained
};
var res = await client.Messages.GetClaudeMessageAsync(parameters);
Console.WriteLine(res.Message);
Batching
The AnthropicClient supports the new batching API. Abbreviated call examples are listed below; please check the Anthropic.SDK.BatchTester project for a more comprehensive example.
//list batches
var list = await client.Batches.ListBatchesAsync();
foreach (var batch in list.Batches)
{
Console.WriteLine("Batch: " + batch.Id);
}
//create batch
var messages = new List<Message>();
messages.Add(new Message(RoleType.User, "Write me a sonnet about the Statue of Liberty"));
var parameters = new MessageParameters()
{
Messages = messages,
MaxTokens = 512,
Model = AnthropicModels.Claude35Sonnet,
Stream = false,
Temperature = 1.0m,
};
var batchRequest = new BatchRequest()
{
CustomId = "BatchTester",
MessageParameters = parameters
};
var response = await client.Batches.CreateBatchAsync(new List<BatchRequest> { batchRequest });
Console.WriteLine("Batch created: " + response.Id);
//cancel batch
var cancelResponse = await client.Batches.CancelBatchAsync(response.Id);
//check batch status
var status = await client.Batches.RetrieveBatchStatusAsync(response.Id);
//stream strongly typed batch results when complete
await foreach (var result in client.Batches.RetrieveBatchResultsAsync(response.Id))
{
//do something with results (which are wrapped messages)
}
//stream jsonl batch results when complete
await foreach (var result in client.Batches.RetrieveBatchResultsJsonlAsync(response.Id))
{
Console.WriteLine("Result: " + result);
}
Tools
The AnthropicClient supports function calling through a variety of methods; see some examples below or check out the unit tests in this repo:
//From a globally declared static function:
public enum TempType
{
Fahrenheit,
Celsius
}
[Function("This function returns the weather for a given location")]
public static async Task<string> GetWeather([FunctionParameter("Location of the weather", true)]string location,
[FunctionParameter("Unit of temperature, celsius or fahrenheit", true)] TempType tempType)
{
return "72 degrees and sunny";
}
var client = new AnthropicClient();
var messages = new List<Message>
{
new Message(RoleType.User, "What is the weather in San Francisco, CA in fahrenheit?")
};
var tools = Common.Tool.GetAllAvailableTools(includeDefaults: false,
forceUpdate: true, clearCache: true);
var parameters = new MessageParameters()
{
Messages = messages,
MaxTokens = 2048,
Model = AnthropicModels.Claude3Sonnet,
Stream = false,
Temperature = 1.0m,
Tools = tools.ToList()
};
var res = await client.Messages.GetClaudeMessageAsync(parameters);
messages.Add(res.Message);
foreach (var toolCall in res.ToolCalls)
{
var response = await toolCall.InvokeAsync<string>();
messages.Add(new Message(toolCall, response));
}
var finalResult = await client.Messages.GetClaudeMessageAsync(parameters);
//The weather in San Francisco, CA is currently 72 degrees Fahrenheit and sunny.
//Streaming example
var client = new AnthropicClient();
var messages = new List<Message>();
messages.Add(new Message(RoleType.User, "What's the temperature in San diego right now in Fahrenheit?"));
var tools = Common.Tool.GetAllAvailableTools(includeDefaults: false, forceUpdate: true, clearCache: true);
var parameters = new MessageParameters()
{
Messages = messages,
MaxTokens = 512,
Model = AnthropicModels.Claude35Sonnet,
Stream = true,
Temperature = 1.0m,
Tools = tools.ToList()
};
var outputs = new List<MessageResponse>();
await foreach (var res in client.Messages.StreamClaudeMessageAsync(parameters))
{
if (res.Delta != null)
{
Console.Write(res.Delta.Text);
}
outputs.Add(res);
}
messages.Add(new Message(outputs));
foreach (var output in outputs)
{
if (output.ToolCalls != null)
{
foreach (var toolCall in output.ToolCalls)
{
var response = await toolCall.InvokeAsync<string>();
messages.Add(new Message(toolCall, response));
}
}
}
await foreach (var res in client.Messages.StreamClaudeMessageAsync(parameters))
{
if (res.Delta != null)
{
Console.Write(res.Delta.Text);
}
outputs.Add(res);
}
//The weather in San Diego, CA is currently 72 degrees Fahrenheit and sunny.
//From a Func:
var client = new AnthropicClient();
var messages = new List<Message>
{
new Message(RoleType.User, "What is the weather in San Francisco, CA?")
};
var tools = new List<Common.Tool>
{
Common.Tool.FromFunc("Get_Weather",
([FunctionParameter("Location of the weather", true)]string location)=> "72 degrees and sunny")
};
var parameters = new MessageParameters()
{
Messages = messages,
MaxTokens = 2048,
Model = AnthropicModels.Claude3Sonnet,
Stream = false,
Temperature = 1.0m,
Tools = tools
};
var res = await client.Messages.GetClaudeMessageAsync(parameters);
messages.Add(res.Message);
foreach (var toolCall in res.ToolCalls)
{
var response = toolCall.Invoke<string>();
messages.Add(new Message(toolCall, response));
}
var finalResult = await client.Messages.GetClaudeMessageAsync(parameters);
//From a static Object
public static class StaticObjectTool
{
public static string GetWeather(string location)
{
return "72 degrees and sunny";
}
}
var client = new AnthropicClient();
var messages = new List<Message>
{
new Message(RoleType.User, "What is the weather in San Francisco, CA?")
};
var tools = new List<Common.Tool>
{
Common.Tool.GetOrCreateTool(typeof(StaticObjectTool), nameof(GetWeather), "This function returns the weather for a given location")
};
var parameters = new MessageParameters()
{
Messages = messages,
MaxTokens = 2048,
Model = AnthropicModels.Claude3Sonnet,
Stream = false,
Temperature = 1.0m,
Tools = tools
};
var res = await client.Messages.GetClaudeMessageAsync(parameters);
messages.Add(res.Message);
foreach (var toolCall in res.ToolCalls)
{
var response = toolCall.Invoke<string>();
messages.Add(new Message(toolCall, response));
}
var finalResult = await client.Messages.GetClaudeMessageAsync(parameters);
//From an object instance
public class InstanceObjectTool
{
public string GetWeather(string location)
{
return "72 degrees and sunny";
}
}
var client = new AnthropicClient();
var messages = new List<Message>
{
new Message(RoleType.User, "What is the weather in San Francisco, CA?")
};
var objectInstance = new InstanceObjectTool();
var tools = new List<Common.Tool>
{
Common.Tool.GetOrCreateTool(objectInstance, nameof(GetWeather), "This function returns the weather for a given location")
};
....
//Manual
var client = new AnthropicClient();
var messages = new List<Message>
{
new Message(RoleType.User, "What is the weather in San Francisco, CA in fahrenheit?")
};
var inputschema = new InputSchema()
{
Type = "object",
Properties = new Dictionary<string, Property>()
{
{ "location", new Property() { Type = "string", Description = "The location of the weather" } },
{
"tempType", new Property()
{
Type = "string", Enum = Enum.GetNames(typeof(TempType)),
Description = "The unit of temperature, celsius or fahrenheit"
}
}
},
Required = new List<string>() { "location", "tempType" }
};
JsonSerializerOptions jsonSerializationOptions = new()
{
DefaultIgnoreCondition = JsonIgnoreCondition.WhenWritingNull,
Converters = { new JsonStringEnumConverter() },
ReferenceHandler = ReferenceHandler.IgnoreCycles,
};
string jsonString = JsonSerializer.Serialize(inputschema, jsonSerializationOptions);
var tools = new List<Common.Tool>()
{
new Function("GetWeather", "This function returns the weather for a given location",
JsonNode.Parse(jsonString))
};
var parameters = new MessageParameters()
{
Messages = messages,
MaxTokens = 2048,
Model = AnthropicModels.Claude3Sonnet,
Stream = false,
Temperature = 1.0m,
Tools = tools
};
var res = await client.Messages.GetClaudeMessageAsync(parameters);
messages.Add(res.Message);
var toolUse = res.Content.OfType<ToolUseContent>().First();
var id = toolUse.Id;
var param1 = toolUse.Input["location"].ToString();
var param2 = Enum.Parse<TempType>(toolUse.Input["tempType"].ToString());
var weather = await GetWeather(param1, param2);
messages.Add(new Message()
{
Role = RoleType.User,
Content = new List<ContentBase>() { new ToolResultContent()
{
ToolUseId = id,
Content = new List<ContentBase>() { new TextContent() { Text = weather } }
}
}});
var finalResult = await client.Messages.GetClaudeMessageAsync(parameters);
//Json Mode - Advanced Usage
string resourceName = "Anthropic.SDK.Tests.Red_Apple.jpg";
Assembly assembly = Assembly.GetExecutingAssembly();
await using Stream stream = assembly.GetManifestResourceStream(resourceName);
byte[] imageBytes;
using (var memoryStream = new MemoryStream())
{
await stream.CopyToAsync(memoryStream);
imageBytes = memoryStream.ToArray();
}
string base64String = Convert.ToBase64String(imageBytes);
var client = new AnthropicClient();
var messages = new List<Message>();
messages.Add(new Message()
{
Role = RoleType.User,
Content = new List<ContentBase>()
{
new ImageContent()
{
Source = new ImageSource()
{
MediaType = "image/jpeg",
Data = base64String
}
},
new TextContent()
{
Text = "Use `record_summary` to describe this image."
}
}
});
var imageSchema = new ImageSchema
{
Type = "object",
Required = new string[] { "key_colors", "description"},
Properties = new Properties()
{
KeyColors = new KeyColorsProperty
{
Items = new ItemProperty
{
Properties = new Dictionary<string, ColorProperty>
{
{ "r", new ColorProperty { Type = "number", Description = "red value [0.0, 1.0]" } },
{ "g", new ColorProperty { Type = "number", Description = "green value [0.0, 1.0]" } },
{ "b", new ColorProperty { Type = "number", Description = "blue value [0.0, 1.0]" } },
{ "name", new ColorProperty { Type = "string", Description = "Human-readable color name in snake_case, e.g. 'olive_green' or 'turquoise'" } }
}
}
},
Description = new DescriptionDetail { Type = "string", Description = "Image description. One to two sentences max." },
EstimatedYear = new EstimatedYear { Type = "number", Description = "Estimated year that the image was taken, if it is a photo. Only set this if the image appears to be non-fictional. Rough estimates are okay!" }
}
};
JsonSerializerOptions jsonSerializationOptions = new()
{
DefaultIgnoreCondition = JsonIgnoreCondition.WhenWritingNull,
Converters = { new JsonStringEnumConverter() },
ReferenceHandler = ReferenceHandler.IgnoreCycles,
};
string jsonString = JsonSerializer.Serialize(imageSchema, jsonSerializationOptions);
var tools = new List<Common.Tool>()
{
new Function("record_summary", "Record summary of an image into well-structured JSON.",
JsonNode.Parse(jsonString))
};
//with ToolChoice selection
var parameters = new MessageParameters()
{
Messages = messages,
MaxTokens = 1024,
Model = AnthropicModels.Claude3Sonnet,
Stream = false,
Temperature = 1.0m,
Tools = tools,
ToolChoice = new ToolChoice()
{
Type = ToolChoiceType.Tool,
Name = "record_summary"
}
};
var res = await client.Messages.GetClaudeMessageAsync(parameters);
var toolResult = res.Content.OfType<ToolUseContent>().First();
var json = toolResult.Input.ToJsonString();
Output From Json Mode
{
"description": "This image shows a close-up view of a ripe, red apple with shades of yellow and orange. The apple has a shiny, waxy surface with water droplets visible, giving it a fresh appearance.",
"estimated_year": 2020,
"key_colors": [
{
"r": 1,
"g": 0.2,
"b": 0.2,
"name": "red"
},
{
"r": 1,
"g": 0.6,
"b": 0.2,
"name": "orange"
},
{
"r": 0.8,
"g": 0.8,
"b": 0.2,
"name": "yellow"
}
]
}
Computer Use
The AnthropicClient supports computer use functionality. This repository includes a demonstration application that should work reasonably well on Windows and mirrors in many ways the example application provided by Anthropic.
Please see the Anthropic.SDK.ComputerUse application for a complete example.
var client = new AnthropicClient();
var messages = new List<Message>();
messages.Add(new Message()
{
Role = RoleType.User,
Content = new List<ContentBase>()
{
new TextContent()
{
Text = """
Find Flights between ATL and NYC using a Google Search.
Once you've searched for the flights and have viewed the initial results,
switch the toggle to first class and take a screenshot of the results and tell me the price of the flights.
"""
}
}
});
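//scaledX, scaledY and displayNumber are supplied by the host application
//(the scaled screen resolution and the target display); see the
//Anthropic.SDK.ComputerUse project for how they are computed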
var tools = new List<Common.Tool>()
{
new Function("computer", "computer_20241022", new Dictionary<string, object>()
{
{"display_width_px", scaledX },
{"display_height_px", scaledY },
{"display_number", displayNumber }
})
};
Contributing
Pull requests are welcome with associated unit tests. If you're planning to make a major change, please open an issue first to discuss your proposed changes.
License
This project is licensed under the MIT License.
Product | Compatible and additional computed target framework versions |
---|---|
.NET | net5.0 was computed. net5.0-windows was computed. net6.0 is compatible. net6.0-android was computed. net6.0-ios was computed. net6.0-maccatalyst was computed. net6.0-macos was computed. net6.0-tvos was computed. net6.0-windows was computed. net7.0 was computed. net7.0-android was computed. net7.0-ios was computed. net7.0-maccatalyst was computed. net7.0-macos was computed. net7.0-tvos was computed. net7.0-windows was computed. net8.0 is compatible. net8.0-android was computed. net8.0-browser was computed. net8.0-ios was computed. net8.0-maccatalyst was computed. net8.0-macos was computed. net8.0-tvos was computed. net8.0-windows was computed. |
.NET Core | netcoreapp2.0 was computed. netcoreapp2.1 was computed. netcoreapp2.2 was computed. netcoreapp3.0 was computed. netcoreapp3.1 was computed. |
.NET Standard | netstandard2.0 is compatible. netstandard2.1 was computed. |
.NET Framework | net461 was computed. net462 was computed. net463 was computed. net47 was computed. net471 was computed. net472 was computed. net48 was computed. net481 was computed. |
MonoAndroid | monoandroid was computed. |
MonoMac | monomac was computed. |
MonoTouch | monotouch was computed. |
Tizen | tizen40 was computed. tizen60 was computed. |
Xamarin.iOS | xamarinios was computed. |
Xamarin.Mac | xamarinmac was computed. |
Xamarin.TVOS | xamarintvos was computed. |
Xamarin.WatchOS | xamarinwatchos was computed. |
Dependencies

.NETStandard 2.0
- Microsoft.Bcl.AsyncInterfaces (>= 8.0.0)
- Microsoft.CSharp (>= 4.7.0)
- Microsoft.Extensions.AI.Abstractions (>= 9.0.0-preview.9.24556.5)

net6.0
- Microsoft.Bcl.AsyncInterfaces (>= 8.0.0)
- Microsoft.Extensions.AI.Abstractions (>= 9.0.0-preview.9.24556.5)

net8.0
- Microsoft.Bcl.AsyncInterfaces (>= 8.0.0)
- Microsoft.Extensions.AI.Abstractions (>= 9.0.0-preview.9.24556.5)
NuGet packages (2)
Showing the top 2 NuGet packages that depend on Anthropic.SDK:
- OBotService (OBase Framework)
- BotSharp.Plugin.AnthropicAI
GitHub repositories (1)
Showing the top 1 popular GitHub repository that depends on Anthropic.SDK:
- SciSharp/BotSharp: AI Multi-Agent Framework in .NET
Version | Downloads | Last updated |
---|---|---|
4.4.2 | 547 | 12/2/2024 |
4.4.1 | 1,741 | 11/22/2024 |
4.4.0 | 914 | 11/20/2024 |
4.3.1 | 1,561 | 11/13/2024 |
4.3.0 | 2,500 | 10/30/2024 |
4.2.0 | 466 | 10/26/2024 |
4.1.1 | 8,227 | 8/30/2024 |
4.1.0 | 6,238 | 8/18/2024 |
4.0.0 | 2,876 | 8/2/2024 |
3.3.0 | 9,513 | 7/23/2024 |
3.2.3 | 43,224 | 6/21/2024 |
3.2.2 | 2,423 | 6/16/2024 |
3.2.1 | 6,749 | 4/25/2024 |
3.2.0 | 8,771 | 4/24/2024 |
3.1.0 | 1,016 | 4/17/2024 |
3.0.1 | 1,493 | 4/12/2024 |
3.0.0 | 223 | 4/11/2024 |
2.0.1 | 11,112 | 3/17/2024 |
2.0.0 | 1,713 | 3/5/2024 |
1.3.0 | 2,310 | 11/21/2023 |
1.2.0 | 8,470 | 8/19/2023 |
1.1.2 | 1,837 | 8/9/2023 |
1.1.1 | 1,090 | 7/24/2023 |
1.1.0 | 1,017 | 7/16/2023 |
1.0.0 | 1,262 | 7/1/2023 |
Support for Computer Use Tooling, PDF Support