MASES.KNet 2.1.1

There is a newer version of this package available.
See the version list below for details.
.NET CLI
    dotnet add package MASES.KNet --version 2.1.1

Package Manager
    NuGet\Install-Package MASES.KNet -Version 2.1.1
This command is intended to be used within the Package Manager Console in Visual Studio, as it uses the NuGet module's version of Install-Package.

PackageReference
    <PackageReference Include="MASES.KNet" Version="2.1.1" />
For projects that support PackageReference, copy this XML node into the project file to reference the package.

Paket CLI
    paket add MASES.KNet --version 2.1.1

Script & Interactive
    #r "nuget: MASES.KNet, 2.1.1"
#r directive can be used in F# Interactive and Polyglot Notebooks. Copy this into the interactive tool or source code of the script to reference the package.

Cake
    // Install MASES.KNet as a Cake Addin
    #addin nuget:?package=MASES.KNet&version=2.1.1

    // Install MASES.KNet as a Cake Tool
    #tool nuget:?package=MASES.KNet&version=2.1.1

KNet: library usage

To use the KNet classes, the developer can write .NET code using the same classes available in the official Apache Kafka package. If a class or method is not available yet, it is possible to use the approach described in What to do if an API was not yet implemented.

Environment setup

KNet accepts many command-line switches to customize its behavior. The full list is available on the Command line switch page.

JVM identification

One of the most important command-line switches is JVMPath, available among the JCOBridge switches: it can be used to set the location of the JVM library when JCOBridge is not able to identify a suitable JRE installation. A developer using KNet within their own product can override the JVMPath property with a snippet like the following one:

    class MyKNetCore : KNetCore
    {
        public override string JVMPath
        {
            get
            {
                string pathToJVM = "Set here the path to JVM library or use your own search method";
                return pathToJVM;
            }
        }
    }
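
As a minimal sketch of what such an override could return, assuming the path is supplied through an environment variable (the variable name JVM_LIBRARY_PATH and the fallback path are illustrations, not KNet conventions):

    class MyKNetCore : KNetCore
    {
        public override string JVMPath
        {
            get
            {
                // Illustrative only: read the JVM library location from an environment
                // variable and fall back to a typical Windows JVM path; any custom
                // search strategy can be plugged in here instead.
                string pathToJVM = Environment.GetEnvironmentVariable("JVM_LIBRARY_PATH")
                                   ?? @"C:\Program Files\Java\jre-11\bin\server\jvm.dll";
                return pathToJVM;
            }
        }
    }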

Producer example

Below the reader can find two different versions of the producer example.

Simple producer

A basic producer looks like the following one:

using MASES.KNet;
using Org.Apache.Kafka.Clients.Producer;
using Java.Util;
using System;
using System.Threading;

namespace MASES.KNetTemplate.KNetProducer
{
    class Program
    {
        const string theServer = "localhost:9092";
        const string theTopic = "myTopic";

        static string serverToUse = theServer;
        static string topicToUse = theTopic;

        static readonly ManualResetEvent resetEvent = new ManualResetEvent(false);

        static void Main(string[] args)
        {
            KNetCore.CreateGlobalInstance();
            var appArgs = KNetCore.FilteredArgs;

            if (appArgs.Length != 0)
            {
                serverToUse = appArgs[0]; // the first filtered argument overrides the default server
            }

            /**** Direct mode ******
            Properties props = new Properties();
            props.Put(ProducerConfig.BOOTSTRAP_SERVERS_CONFIG, serverToUse);
            props.Put(ProducerConfig.ACKS_CONFIG, "all");
            props.Put(ProducerConfig.RETRIES_CONFIG, 0);
            props.Put(ProducerConfig.LINGER_MS_CONFIG, 1);
            props.Put(ProducerConfig.KEY_SERIALIZER_CLASS_CONFIG, "org.apache.kafka.common.serialization.StringSerializer");
            props.Put(ProducerConfig.VALUE_SERIALIZER_CLASS_CONFIG, "org.apache.kafka.common.serialization.StringSerializer");
            ******/

            Properties props = ProducerConfigBuilder.Create()
                                                    .WithBootstrapServers(serverToUse)
                                                    .WithAcks(ProducerConfig.Acks.All)
                                                    .WithRetries(0)
                                                    .WithLingerMs(1)
                                                    .WithKeySerializerClass("org.apache.kafka.common.serialization.StringSerializer")
                                                    .WithValueSerializerClass("org.apache.kafka.common.serialization.StringSerializer")
                                                    .ToProperties();

            Console.CancelKeyPress += Console_CancelKeyPress;
            Console.WriteLine("Press Ctrl-C to exit");

            using (KafkaProducer producer = new KafkaProducer(props))
            {
                int i = 0;
                while (!resetEvent.WaitOne(0))
                {
                    var record = new ProducerRecord<string, string>(topicToUse, i.ToString(), i.ToString());
                    var result = producer.Send(record);
                    Console.WriteLine($"Producing: {record} with result: {result.Get()}");
                    producer.Flush();
                    i++;
                }
            }
        }

        private static void Console_CancelKeyPress(object sender, ConsoleCancelEventArgs e)
        {
            if (e.Cancel) resetEvent.Set();
        }
    }
}

The example above can be found in the templates package. Its behavior is:

  • during initialization it prepares the properties
  • it creates a producer using the properties
  • it creates a ProducerRecord and sends it
  • it prints out the produced data and the resulting RecordMetadata
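
Send returns the Java Future of the RecordMetadata, so calling Get() on the result blocks until the broker acknowledges the record. A small sketch of inspecting that metadata, assuming the returned object exposes the same Topic and Offset members used in the callback example below:

var record = new ProducerRecord<string, string>(topicToUse, i.ToString(), i.ToString());
var future = producer.Send(record);
var metadata = future.Get();   // assumption: Get() yields the RecordMetadata once the send completes
Console.WriteLine($"Stored on topic {metadata.Topic} at offset {metadata.Offset}");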

Producer with Callback

A producer with a Callback looks like the following one. In this example the reader can notice a slight difference from the corresponding Java code. See JVM callbacks for details on how callbacks coming from the JVM are managed.

using MASES.KNet;
using Org.Apache.Kafka.Clients.Producer;
using Java.Util;
using System;
using System.Threading;

namespace MASES.KNetTemplate.KNetProducer
{
    class Program
    {
        const string theServer = "localhost:9092";
        const string theTopic = "myTopic";

        static string serverToUse = theServer;
        static string topicToUse = theTopic;

        static readonly ManualResetEvent resetEvent = new ManualResetEvent(false);

        static void Main(string[] args)
        {
            KNetCore.CreateGlobalInstance();
            var appArgs = KNetCore.FilteredArgs;

            if (appArgs.Length != 0)
            {
                serverToUse = appArgs[0]; // the first filtered argument overrides the default server
            }

            /**** Direct mode ******
            Properties props = new Properties();
            props.Put(ProducerConfig.BOOTSTRAP_SERVERS_CONFIG, serverToUse);
            props.Put(ProducerConfig.ACKS_CONFIG, "all");
            props.Put(ProducerConfig.RETRIES_CONFIG, 0);
            props.Put(ProducerConfig.LINGER_MS_CONFIG, 1);
            props.Put(ProducerConfig.KEY_SERIALIZER_CLASS_CONFIG, "org.apache.kafka.common.serialization.StringSerializer");
            props.Put(ProducerConfig.VALUE_SERIALIZER_CLASS_CONFIG, "org.apache.kafka.common.serialization.StringSerializer");
            ******/

            Properties props = ProducerConfigBuilder.Create()
                                                    .WithBootstrapServers(serverToUse)
                                                    .WithAcks(ProducerConfig.Acks.All)
                                                    .WithRetries(0)
                                                    .WithLingerMs(1)
                                                    .WithKeySerializerClass("org.apache.kafka.common.serialization.StringSerializer")
                                                    .WithValueSerializerClass("org.apache.kafka.common.serialization.StringSerializer")
                                                    .ToProperties();

            Console.CancelKeyPress += Console_CancelKeyPress;
            Console.WriteLine("Press Ctrl-C to exit");

            using (KafkaProducer producer = new KafkaProducer(props))
            {
                int i = 0;
                using (var callback = new Callback((o1, o2) =>
                {
                    if (o2 != null) Console.WriteLine(o2.ToString());
                    else Console.WriteLine($"Produced on topic {o1.Topic} at offset {o1.Offset}");
                }))
                {
                    while (!resetEvent.WaitOne(0))
                    {
                        var record = new ProducerRecord<string, string>(topicToUse, i.ToString(), i.ToString());
                        var result = producer.Send(record, callback);
                        Console.WriteLine($"Producing: {record} with result: {result.Get()}");
                        producer.Flush();
                        i++;
                    }
                }
            }
        }

        private static void Console_CancelKeyPress(object sender, ConsoleCancelEventArgs e)
        {
            if (e.Cancel) resetEvent.Set();
        }
    }
}

The example above can be found in the templates package. Its behavior is:

  • during initialization it prepares the properties
  • it creates a producer using the properties
  • it creates a ProducerRecord and sends it using the Send API with the attached Callback
  • when the operation completes the Callback is invoked:
    • if an Exception was raised it is printed out
    • otherwise the RecordMetadata is printed out
  • it prints out the produced data and the resulting RecordMetadata
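
Since the Callback already reports the outcome of each send, a fire-and-forget variant is possible. The sketch below reuses the objects from the example above and simply omits the blocking Get() and the explicit Flush:

while (!resetEvent.WaitOne(0))
{
    var record = new ProducerRecord<string, string>(topicToUse, i.ToString(), i.ToString());
    producer.Send(record, callback);   // returns immediately; the Callback reports success or failure
    i++;
}

Without the explicit Flush the producer batches records according to the linger.ms value configured above.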

Consumer example

A basic consumer looks like the following one:

using MASES.KNet;
using Org.Apache.Kafka.Clients.Consumer;
using Java.Util;
using System;
using System.Threading;

namespace MASES.KNetTemplate.KNetConsumer
{
    class Program
    {
        const string theServer = "localhost:9092";
        const string theTopic = "myTopic";

        static string serverToUse = theServer;
        static string topicToUse = theTopic;

        static readonly ManualResetEvent resetEvent = new ManualResetEvent(false);

        static void Main(string[] args)
        {
            KNetCore.CreateGlobalInstance();
            var appArgs = KNetCore.FilteredArgs;

            if (appArgs.Length != 0)
            {
                serverToUse = appArgs[0]; // the first filtered argument overrides the default server
            }

            /**** Direct mode ******
            Properties props = new Properties();
            props.Put(ConsumerConfig.BOOTSTRAP_SERVERS_CONFIG, serverToUse);
            props.Put(ConsumerConfig.GROUP_ID_CONFIG, "test");
            props.Put(ConsumerConfig.ENABLE_AUTO_COMMIT_CONFIG, "true");
            props.Put(ConsumerConfig.AUTO_COMMIT_INTERVAL_MS_CONFIG, "1000");
            props.Put(ConsumerConfig.KEY_DESERIALIZER_CLASS_CONFIG, "org.apache.kafka.common.serialization.StringDeserializer");
            props.Put(ConsumerConfig.VALUE_DESERIALIZER_CLASS_CONFIG, "org.apache.kafka.common.serialization.StringDeserializer");
            *******/

            Properties props = ConsumerConfigBuilder.Create()
                                                    .WithBootstrapServers(serverToUse)
                                                    .WithGroupId("test")
                                                    .WithEnableAutoCommit(true)
                                                    .WithAutoCommitIntervalMs(1000)
                                                    .WithKeyDeserializerClass("org.apache.kafka.common.serialization.StringDeserializer")
                                                    .WithValueDeserializerClass("org.apache.kafka.common.serialization.StringDeserializer")
                                                    .ToProperties();

            Console.CancelKeyPress += Console_CancelKeyPress;
            Console.WriteLine("Press Ctrl-C to exit");

            using (var consumer = new KafkaConsumer<string, string>(props))
            {
                consumer.Subscribe(Collections.Singleton(topicToUse));
                while (!resetEvent.WaitOne(0))
                {
                    var records = consumer.Poll((long)TimeSpan.FromMilliseconds(200).TotalMilliseconds);
                    foreach (var item in records)
                    {
                        Console.WriteLine($"Offset = {item.Offset}, Key = {item.Key}, Value = {item.Value}");
                    }
                }
            }
        }

        private static void Console_CancelKeyPress(object sender, ConsoleCancelEventArgs e)
        {
            if (e.Cancel) resetEvent.Set();
        }
    }
}

The example above can be found in the templates package. Its behavior is:

  • during initialization it prepares the properties
  • it creates a consumer using the properties
  • it subscribes to the topic and starts consuming
  • when data is received, the information is logged to the console
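
The template overrides only the bootstrap server from the command line; as a small sketch, the same argument handling could also select the topic through a second, optional positional argument (an illustrative extension, not part of the template):

if (appArgs.Length != 0)
{
    serverToUse = appArgs[0];                         // first argument: bootstrap server
    if (appArgs.Length > 1) topicToUse = appArgs[1];  // second argument: topic name (illustrative)
}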

Compatible and additional computed target framework versions

.NET: net6.0 is compatible. net6.0-android was computed. net6.0-ios was computed. net6.0-maccatalyst was computed. net6.0-macos was computed. net6.0-tvos was computed. net6.0-windows was computed. net7.0 is compatible. net7.0-android was computed. net7.0-ios was computed. net7.0-maccatalyst was computed. net7.0-macos was computed. net7.0-tvos was computed. net7.0-windows was computed. net8.0 was computed. net8.0-android was computed. net8.0-browser was computed. net8.0-ios was computed. net8.0-maccatalyst was computed. net8.0-macos was computed. net8.0-tvos was computed. net8.0-windows was computed.

.NET Framework: net462 is compatible. net463 was computed. net47 was computed. net471 was computed. net472 was computed. net48 was computed. net481 was computed.

NuGet packages (5)

Showing the top 5 NuGet packages that depend on MASES.KNet:

  • MASES.EntityFrameworkCore.KNet.Serialization: EntityFrameworkCore KNet - Serialization support for EntityFrameworkCore provider for Apache Kafka
  • MASES.KNet.Serialization.Avro: Avro Serializer/Deserializer of .NET suite for Apache Kafka. KNet is a comprehensive .NET suite for Apache Kafka providing all features: Producer, Consumer, Admin, Streams, Connect, backends (ZooKeeper and Kafka).
  • MASES.KNet.Serialization.Json: Json Serializer/Deserializer of .NET suite for Apache Kafka. KNet is a comprehensive .NET suite for Apache Kafka providing all features: Producer, Consumer, Admin, Streams, Connect, backends (ZooKeeper and Kafka).
  • MASES.KNet.Serialization.MessagePack: MessagePack Serializer/Deserializer of .NET suite for Apache Kafka. KNet is a comprehensive .NET suite for Apache Kafka providing all features: Producer, Consumer, Admin, Streams, Connect, backends (ZooKeeper and Kafka).
  • MASES.KNet.Serialization.Protobuf: Protobuf Serializer/Deserializer of .NET suite for Apache Kafka. KNet is a comprehensive .NET suite for Apache Kafka providing all features: Producer, Consumer, Admin, Streams, Connect, backends (ZooKeeper and Kafka).

GitHub repositories

This package is not used by any popular GitHub repositories.

Version Downloads Last updated
2.8.2 13,460 11/5/2024
2.8.1 143,884 9/20/2024
2.8.0 41,401 8/6/2024
2.7.10 277 11/5/2024
2.7.9 529 9/20/2024
2.7.8 13,872 7/31/2024
2.7.7 5,800 7/30/2024
2.7.6 271 7/29/2024
2.7.5 32,261 7/2/2024
2.7.4 55,433 6/27/2024
2.7.3 28,167 6/24/2024
2.7.2 41,906 5/25/2024
2.7.1 13,949 5/18/2024
2.7.0 869 5/16/2024
2.6.7 312 11/5/2024
2.6.6 419 9/20/2024
2.6.5 504 9/16/2024
2.6.4 421 6/14/2024
2.6.3 337 6/11/2024
2.6.2 622 5/17/2024
2.6.1 19,389 5/3/2024
2.6.0 139,682 3/1/2024
2.5.0 681 2/28/2024
2.4.3 58,309 2/11/2024
2.4.2 47,380 1/27/2024
2.4.1 11,689 1/21/2024
2.4.0 369 1/20/2024
2.3.0 93,002 11/25/2023
2.2.0 48,499 10/19/2023
2.1.3 26,530 10/11/2023
2.1.2 26,723 10/6/2023
2.1.1 1,411 10/5/2023
2.1.0 978 9/27/2023
2.0.2 135,928 8/2/2023
2.0.1 10,066 7/11/2023
2.0.0 37,422 7/8/2023
1.5.5 32,631 7/1/2023
1.5.4 25,585 5/28/2023
1.5.3 46,242 4/16/2023
1.5.2 33,285 4/11/2023
1.5.1 65,159 3/15/2023
1.5.0 55,791 2/9/2023
1.4.8 143,670 11/28/2022
1.4.7 792 11/23/2022
1.4.6 884 11/22/2022
1.4.5 740 11/21/2022
1.4.4 19,803 11/1/2022
1.4.3 19,969 10/21/2022
1.4.2 881 10/17/2022
1.4.1 1,462 10/10/2022
1.4.0 52,713 10/6/2022
1.3.6 36,935 9/19/2022
1.3.5 54,278 9/8/2022
1.3.4 66,304 8/18/2022
1.3.3 932 8/5/2022
1.3.2 935 6/19/2022
1.3.1 35,614 5/23/2022
1.2.4 1,087 5/11/2022
1.2.3 982 5/7/2022
1.2.2 972 5/2/2022
1.2.1 6,053 3/28/2022
1.2.0 1,557 3/20/2022