SemanticKernel.Agents.DatabaseAgent.MCPServer 1.3.0

This package contains a .NET tool you can call from the shell/command line.

Database Agent MCP Server

The Database Agent MCP Server listens for incoming connections from a Database Agent, processes the messages it sends, and executes the appropriate actions based on each message type.

Installation

To install the MCP server, you should first install the .NET SDK. You can download it from the following link: https://dotnet.microsoft.com/download

After installing the .NET Core SDK, you can install the MCP server tool by running the following command:

dotnet tool install --global SemanticKernel.Agents.DatabaseAgent.MCPServer

Usage

To start the MCP server, run the following command:

modelcontextprotocol-database-agent [options]

Example

Here is an example of how to start the MCP server with a SQLite database and Azure OpenAI services:

modelcontextprotocol-database-agent \
  --database:Provider=sqlite \
  --database:ConnectionString="Data Source=northwind.db;Mode=ReadWrite" \
  --memory:Kind=Volatile \
  --kernel:Completion=gpt-4o-mini \
  --kernel:Embedding=text-embedding-ada-002 \
  --services:gpt-4o-mini:Type=AzureOpenAI \
  --services:gpt-4o-mini:Endpoint=https://xxx.openai.azure.com/ \
  --services:gpt-4o-mini:Auth=APIKey \
  --services:gpt-4o-mini:APIKey=xxx \
  --services:gpt-4o-mini:Deployment=gpt-4o-mini \
  --services:text-embedding-ada-002:Type=AzureOpenAI \
  --services:text-embedding-ada-002:Endpoint=https://xxx.openai.azure.com/ \
  --services:text-embedding-ada-002:Auth=APIKey \
  --services:text-embedding-ada-002:APIKey=xxx \
  --services:text-embedding-ada-002:Deployment=text-embedding-ada-002
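For a fully local setup, the same server can be pointed at an Ollama backend instead of Azure OpenAI. This is a sketch built from the Ollama options documented below; the model names (`qwen2.5-coder`, `nomic-embed-text`) are illustrative — substitute whichever completion and embedding models you have pulled locally:

```shell
# Assumes an Ollama server running on localhost:11434 with both models pulled.
modelcontextprotocol-database-agent \
  --database:Provider=sqlite \
  --database:ConnectionString="Data Source=northwind.db;Mode=ReadWrite" \
  --memory:Kind=Volatile \
  --kernel:Completion=qwen2.5-coder \
  --kernel:Embedding=nomic-embed-text \
  --services:qwen2.5-coder:Type=Ollama \
  --services:qwen2.5-coder:ModelId="qwen2.5-coder:latest" \
  --services:qwen2.5-coder:Host="http://localhost:11434" \
  --services:nomic-embed-text:Type=Ollama \
  --services:nomic-embed-text:ModelId="nomic-embed-text:latest" \
  --services:nomic-embed-text:Host="http://localhost:11434"
```

Note that each `--services:<model>:*` group is keyed by the name referenced in `--kernel:Completion` and `--kernel:Embedding`.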

Options

The following options are available:

Global options

--database:Provider
- Description: Specifies the database provider (e.g., SQLite, SQL Server, etc.).
- Type: string
- Example: --database:Provider=sqlite

--database:ConnectionString
- Description: The connection string for connecting to the database.
- Type: string
- Example: --database:ConnectionString="Data Source=northwind.db;Mode=ReadWrite"

Supported database providers

The following database providers are supported:

  • sqlite: SQLite database provider
  • sqlserver: SQL Server database provider
  • mysql: MySQL database provider
  • postgresql: PostgreSQL database provider
  • oracle: Oracle database provider
  • oledb: OLE DB database provider

Memory options

Memory options configure the memory settings for the kernel. By default, the memory kind is Volatile, meaning the memory is not persisted and is lost when the kernel stops.

--memory:Kind
- Description: Defines the kind of memory to be used for the kernel (e.g., Volatile).
- Type: string
- Example: --memory:Kind=Volatile

--memory:Dimensions
- Description: The number of dimensions for the memory vectors. This is only used when the memory kind is set to a persistent memory provider.
- Type: int
- Example: --memory:Dimensions=1536

You can also configure the memory to persist data in a database. At the moment, the only supported providers are SQLite and Qdrant, but more providers will be added in the future.

SQLite options

--memory:ConnectionString
- Description: The connection string for connecting to the SQLite database.
- Type: string
- Example: --memory:ConnectionString="Data Source=northwind.db;Mode=ReadWrite"

Qdrant options

--memory:Host
- Description: The host name or IP address of the Qdrant server.
- Type: string
- Example: --memory:Host="localhost"

--memory:Port
- Description: The port number of the Qdrant server.
- Type: int
- Example: --memory:Port=6333

--memory:Https
- Description: Specifies whether to use HTTPS for the connection.
- Type: bool
- Default: false
- Example: --memory:Https=true

--memory:ApiKey
- Description: The API key for authenticating with the Qdrant server.
- Type: string
- Example: --memory:ApiKey="xxx"
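Putting the Qdrant options together, a persistent memory configuration might look like the following fragment. This is a sketch under the assumption that the Qdrant provider is selected with `--memory:Kind=Qdrant` (the exact kind value is not stated above); the host, port, and API key are placeholders, and the database, kernel, and services options from the earlier example are still required:

```shell
# Memory fragment only: persists embeddings to a Qdrant server
# instead of the default in-process Volatile store.
modelcontextprotocol-database-agent \
  --memory:Kind=Qdrant \
  --memory:Host="localhost" \
  --memory:Port=6333 \
  --memory:Https=false \
  --memory:ApiKey="xxx" \
  --memory:Dimensions=1536
```

The `--memory:Dimensions` value must match the output size of the configured embedding model (1536 for text-embedding-ada-002).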

Kernel options

--kernel:Completion
- Description: Defines the completion model used by the kernel, as configured in the services section.
- Type: string
- Example: --kernel:Completion=gpt-4o-mini

--kernel:Embedding
- Description: Specifies the embedding model for the kernel's embedding operations, as configured in the services section.
- Type: string
- Example: --kernel:Embedding=text-embedding-ada-002

Services options

The services options configure the services used by the kernel. At this time, Azure OpenAI and Ollama are supported as backends, but more providers will be added in the future.

--services:<model>:Type
- Description: Specifies the type of service to be used (e.g., AzureOpenAI, Ollama).
- Type: string
- Example: --services:gpt-4o-mini:Type=AzureOpenAI

Azure OpenAI

--services:<model>:Endpoint
- Description: The endpoint URL for the service.
- Type: string
- Example: --services:gpt-4o-mini:Endpoint="https://xxx.openai.azure.com/"

--services:<model>:Auth
- Description: The authentication method for the service (e.g., APIKey).
- Type: string
- Example: --services:gpt-4o-mini:Auth=APIKey

--services:<model>:APIKey
- Description: The API key for authenticating with the service.
- Type: string
- Example: --services:gpt-4o-mini:APIKey="xxx"

--services:<model>:Deployment
- Description: The deployment name for the service.
- Type: string
- Example: --services:gpt-4o-mini:Deployment="gpt-4o-mini"

Ollama

--services:<model>:ModelId
- Description: The model name for the Ollama service.
- Type: string
- Example: --services:qwen2.5-coder:ModelId="qwen2.5-coder:latest"

--services:<model>:Host
- Description: The host name or IP address of the Ollama server.
- Type: string
- Example: --services:qwen2.5-coder:Host="http://localhost:11434"

Quality assurance

You can set the quality assurance settings by adding these specific configuration options:

--agent:QualityAssurance:EnableQueryRelevancyFilter
- Description: Enables or disables the query relevancy filter in the quality assurance process.
- Type: bool
- Default: true
- Example: --agent:QualityAssurance:EnableQueryRelevancyFilter=false

--agent:QualityAssurance:QueryRelevancyThreshold
- Description: The minimum relevancy score a generated query must reach to pass the query relevancy filter.
- Type: float
- Example: --agent:QualityAssurance:QueryRelevancyThreshold=0.9
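For example, to keep the relevancy filter enabled but make it stricter, you could raise the threshold. The value shown is illustrative, and the database, kernel, and services options from the earlier example are still required alongside these flags:

```shell
# Quality assurance fragment: reject generated queries whose
# relevancy score falls below 0.95.
modelcontextprotocol-database-agent \
  --agent:QualityAssurance:EnableQueryRelevancyFilter=true \
  --agent:QualityAssurance:QueryRelevancyThreshold=0.95
```

Setting `EnableQueryRelevancyFilter=false` skips the relevancy check entirely, in which case the threshold has no effect.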

Contributing

Contributions are welcome! For more information, please see the CONTRIBUTING file.

License

This project is licensed under the MIT License. See the LICENSE file for details.

Product compatible and additional computed target framework versions:
.NET net8.0 is compatible. net8.0-android, net8.0-browser, net8.0-ios, net8.0-maccatalyst, net8.0-macos, net8.0-tvos, net8.0-windows, net9.0, net9.0-android, net9.0-browser, net9.0-ios, net9.0-maccatalyst, net9.0-macos, net9.0-tvos, and net9.0-windows were computed.

This package has no dependencies.

Version Downloads Last updated
1.8.0 205 4/21/2025
1.7.1 250 4/14/2025
1.7.1-beta01 268 4/14/2025
1.7.0 241 4/13/2025
1.7.0-beta16 195 4/13/2025
1.7.0-beta15 216 4/13/2025
1.7.0-beta14 203 4/13/2025
1.7.0-beta13 202 4/13/2025
1.7.0-beta12 193 4/13/2025
1.7.0-beta11 144 4/12/2025
1.7.0-beta10 156 4/12/2025
1.7.0-beta09 186 4/11/2025
1.7.0-beta08 178 4/11/2025
1.7.0-beta07 196 4/11/2025
1.7.0-beta06 178 4/11/2025
1.7.0-beta05 194 4/11/2025
1.7.0-beta04 160 4/11/2025
1.7.0-beta03 172 4/11/2025
1.7.0-beta02 196 4/11/2025
1.7.0-beta01 175 4/11/2025
1.6.0 227 4/11/2025
1.5.1 179 4/11/2025
1.5.0 231 4/10/2025
1.5.0-beta03 208 4/10/2025
1.5.0-beta02 214 4/9/2025
1.5.0-beta01 200 4/9/2025
1.4.1-beta01 201 4/9/2025
1.4.0 219 4/8/2025
1.3.1 238 4/7/2025
1.3.1-beta02 207 4/7/2025
1.3.1-beta01 225 4/7/2025
1.3.0 203 4/6/2025
1.3.0-beta03 210 4/6/2025
1.3.0-beta02 163 4/5/2025
1.3.0-beta01 151 4/5/2025
1.2.2 177 4/4/2025
1.2.1 168 4/4/2025
1.2.1-beta02 190 4/4/2025
1.2.1-beta01 190 4/4/2025
1.2.0 194 4/4/2025
1.2.0-beta01 195 4/4/2025
1.1.0 185 3/31/2025
1.1.0-beta01 160 3/31/2025
1.0.2 160 3/31/2025
1.0.1 165 3/31/2025
1.0.0 160 3/31/2025
0.0.3 93 3/29/2025
0.0.3-alpha03 135 3/28/2025
0.0.3-alpha01 130 3/28/2025