KEFCore: KafkaDbContext
KafkaDbContext is a special class which helps to define the DbContext and use the Entity Framework Core provider for Apache Kafka™. KafkaDbContext inherits from DbContext: to define the model and/or create the database, see the getting started guide in the docs and KEFCore usage.

KafkaDbContext defines the following properties (a minimal configuration sketch follows the list):
- KeySerDesSelectorType: the .NET type to be used to allocate an external serializer for Apache Kafka™ record key
- ValueSerDesSelectorType: the .NET type to be used to allocate an external serializer for Apache Kafka™ record value
- ValueContainerType: the .NET type to be used to allocate an external container class for Apache Kafka™ record value
- UseNameMatching: set to false to avoid Entity matching based on Name
- BootstrapServers: the server(s) hosting the Apache Kafka™ broker(s) to connect to
- ApplicationId: the identifier of the application using the context
- DatabaseName: the user-defined database name; it is prepended to every topic which belongs to this database
- DefaultNumPartitions: the default number of partitions used when topics are created for each entity
- DefaultReplicationFactor: the replication factor to use when data are stored in Apache Kafka™
- DefaultConsumerInstances: the consumer instances to be allocated when UseCompactedReplicator is true
- UsePersistentStorage: set to true to use persistent storage across multiple application startups
- UseEnumeratorWithPrefetch: set to true to prefer enumerator instances that prefetch data, speeding up execution; used if UseKNetStreams is true and UseCompactedReplicator is false
- UseByteBufferDataTransfer: set to true to prefer ByteBuffer-based data exchange in serializer instances
- UseDeletePolicyForTopic: set to true to enable the delete cleanup policy
- UseCompactedReplicator: set to true to use KNetCompactedReplicator instead of Apache Kafka™ Streams to manage data to or from topics
- UseKNetStreams: set to true to use the KNet version of Apache Kafka™ Streams instead of standard Apache Kafka™ Streams; used if UseCompactedReplicator is false
- ConsumerConfig: parameters to use for the Consumer
- ProducerConfig: parameters to use for the Producer
- StreamsConfig: parameters to use for the Apache Kafka™ Streams application
- TopicConfig: parameters to use on topic creation for each entity
- OnChangeEvent: handler to receive change events from back-end
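As a quick orientation, a minimal configuration sketch is shown below. It assumes the MASES.EntityFrameworkCore.KNet package with KafkaDbContext living in the MASES.EntityFrameworkCore.KNet.Infrastructure namespace, and uses illustrative Blog/Post entities; adapt the names to your own model.

```csharp
using System.Collections.Generic;
using Microsoft.EntityFrameworkCore;
using MASES.EntityFrameworkCore.KNet.Infrastructure; // assumed namespace for KafkaDbContext

// The mandatory options are plain properties; OnConfiguring is handled by KafkaDbContext
using (var context = new BloggingContext()
{
    BootstrapServers = "localhost:9092", // Apache Kafka™ broker(s) to connect to
    ApplicationId = "MyApplicationId",   // identifies this context/application
    DatabaseName = "MyDBName",           // prefix for the per-entity topics
})
{
    context.Database.EnsureCreated();    // creates the topics backing the entities
}

// A context inherits from KafkaDbContext instead of DbContext
public class BloggingContext : KafkaDbContext
{
    public DbSet<Blog> Blogs { get; set; }
    public DbSet<Post> Posts { get; set; }
}

// Illustrative entities
public class Blog
{
    public int BlogId { get; set; }
    public string Url { get; set; }
    public List<Post> Posts { get; set; } = new();
}

public class Post
{
    public int PostId { get; set; }
    public string Content { get; set; }
    public int BlogId { get; set; }
}
```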
How to use the KafkaDbContext class
The simplest usage example can be found in KEFCore usage. By default, KafkaDbContext automatically manages the OnConfiguring method of DbContext:
- KafkaDbContext checks the mandatory options like BootstrapServers and DatabaseName
- KafkaDbContext sets up the options needed to use an Apache Kafka™ cluster, as shown in the sketch after this list:
  - the default ConsumerConfig can be overridden using the ConsumerConfig property of KafkaDbContext
  - the default ProducerConfig can be overridden using the ProducerConfig property of KafkaDbContext
  - the default StreamsConfig can be overridden using the StreamsConfig property of KafkaDbContext
  - the default TopicConfig can be overridden using the TopicConfig property of KafkaDbContext
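A possible way to override those defaults is sketched below; the ConsumerConfigBuilder/ProducerConfigBuilder types, their namespaces and their fluent With* setters are assumptions modeled on the KNet configuration builders, so verify the exact names against the KNet API before relying on them.

```csharp
using MASES.EntityFrameworkCore.KNet.Infrastructure; // assumed namespace for KafkaDbContext
using MASES.KNet.Consumer;                            // assumed namespace for ConsumerConfigBuilder
using MASES.KNet.Producer;                            // assumed namespace for ProducerConfigBuilder

// BloggingContext is the illustrative context from the previous sketch.
// Assign the *Config properties before the context is used to replace the defaults.
using var context = new BloggingContext()
{
    BootstrapServers = "localhost:9092",
    ApplicationId = "MyApplicationId",
    DatabaseName = "MyDBName",
    ConsumerConfig = ConsumerConfigBuilder.Create()
        .WithEnableAutoCommit(false),  // assumed fluent setter for enable.auto.commit
    ProducerConfig = ProducerConfigBuilder.Create()
        .WithLingerMs(5),              // assumed fluent setter for linger.ms
};
```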
Default ConsumerConfig
On top of the Apache Kafka™ defaults, it applies:
- EnableAutoCommit set to true
- AutoOffsetReset set to EARLIEST
- AllowAutoCreateTopics set to false
Default ProducerConfig
Does not change anything over the Apache Kafka™ defaults
Default StreamsConfig
Does not change anything over the Apache Kafka™ defaults
Default TopicConfig
On top of the Apache Kafka™ defaults, it applies:
- DeleteRetentionMs set to 100 ms
- MinCleanableDirtyRatio set to 0.01
- SegmentMs set to 100 ms
- RetentionBytes set to 1073741824 bytes (1 GiB)
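If these topic defaults do not fit a workload, they can presumably be replaced through the TopicConfig property; TopicConfigBuilder, its namespace and its With* setters below are assumptions modeled on the KNet configuration builders, so check the actual KNet API for the exact names.

```csharp
using MASES.EntityFrameworkCore.KNet.Infrastructure; // assumed namespace for KafkaDbContext
using MASES.KNet.Common;                              // assumed namespace for TopicConfigBuilder

// BloggingContext is the illustrative context from the earlier sketch.
// Replace the default topic-creation parameters listed above.
using var context = new BloggingContext()
{
    BootstrapServers = "localhost:9092",
    ApplicationId = "MyApplicationId",
    DatabaseName = "MyDBName",
    TopicConfig = TopicConfigBuilder.Create()
        .WithDeleteRetentionMs(1000)        // assumed setter for delete.retention.ms
        .WithMinCleanableDirtyRatio(0.5)    // assumed setter for min.cleanable.dirty.ratio
        .WithSegmentMs(600_000)             // assumed setter for segment.ms
        .WithRetentionBytes(536_870_912),   // assumed setter for retention.bytes (512 MiB)
};
```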