All Classes and Interfaces

Class
Description
Manages the ConsumerSeekAware.ConsumerSeekCallback instances for the listener.
Top level class for all listener adapters.
An abstract message listener adapter that implements record filter logic via a RecordFilterStrategy.
Abstract type mapper.
Base class for KafkaBackOffManagerFactory implementations.
Base for Kafka header mappers.
A matcher for headers.
A matcher that never matches a set of headers.
A pattern-based header matcher that matches if the specified header matches the specified simple pattern.
Base KafkaListenerContainerFactory for Spring's base container implementation.
Base model for a Kafka listener endpoint.
The base implementation for the MessageListenerContainer.
Base class for retrying message listener adapters.
A Supplier for bootstrap servers that can toggle between 2 lists of servers.
Listener for handling incoming Kafka messages, propagating an acknowledgment handle that recipients can invoke when the message has been processed.
Listener for handling incoming Kafka messages, propagating an acknowledgment handle that recipients can invoke when the message has been processed.
Handle for acknowledging the processing of a ConsumerRecord.
Utilities for listener adapters.
Invoked by a listener container with the remaining, unprocessed records (including the failed record).
A replying template that aggregates multiple replies with the same correlation id.
Class for managing Allow / Deny collections and their predicates.
Handler for the provided back off time, listener container and exception.
Generates the backoff values from the provided maxAttempts value and BackOffPolicy.
Listener for handling a batch of incoming Kafka messages, propagating an acknowledgment handle that recipients can invoke when the message has been processed.
Listener for handling a batch of incoming Kafka messages, propagating an acknowledgment handle that recipients can invoke when the message has been processed.
Listener for handling a batch of incoming Kafka messages; the list is created from the consumer records object returned by a poll.
An interceptor for batches of records.
An exception thrown by user code to inform the framework which record in a batch has failed.
A Kafka-specific Message converter strategy.
Listener for handling a batch of incoming Kafka messages; the list is created from the consumer records object returned by a poll.
A Messaging MessageConverter implementation used with a batch message listener; the consumer record values are extracted into a collection in the message payload.
A MessageListener adapter that invokes a configurable HandlerAdapter; used when the factory is configured for the listener to receive batches of messages.
An adapter that adapts a batch listener to a record listener method.
A callback for each message.
Encapsulates the address of a Kafka broker.
JSON Message converter - byte[] on output, String, Bytes, or byte[] on input.
JSON Message converter - Bytes on output, String, Bytes, or byte[] on input.
Deprecated; refer to the ChainedTransactionManager javadocs.
Strategy for setting metadata on messages such that one can create the class that needs to be instantiated when receiving a message.
Specifies the time of KafkaStreams.cleanUp() execution.
A CommonErrorHandler that stops the container when an error occurs.
An error handler that delegates to different error handlers, depending on the exception type.
Listener container error handling contract.
The CommonErrorHandler implementation for logging exceptions.
A CommonErrorHandler that delegates to different CommonErrorHandlers for record and batch listeners.
A BatchInterceptor that delegates to one or more BatchInterceptors in order.
Composite KafkaStreamsCustomizer customizes KafkaStreams by delegating to a list of provided KafkaStreamsCustomizer.
Composite KafkaStreamsInfrastructureCustomizer customizes KafkaStreams by delegating to a list of provided KafkaStreamsInfrastructureCustomizer.
A ProducerInterceptor that delegates to a collection of interceptors.
A ProducerListener that delegates to a collection of listeners.
A RecordInterceptor that delegates to one or more RecordInterceptors in order.
Creates one or more KafkaMessageListenerContainers based on concurrency.
An error handler that has access to the consumer.
Listener for handling individual incoming Kafka messages.
A rebalance listener that provides access to the consumer object.
A ConsumerRecordRecoverer that supports getting a reference to the Consumer.
The strategy to produce a Consumer instance.
Called whenever a consumer is added or removed.
An event published when a consumer fails to start.
An event published when a consumer partition is paused.
An event published when a consumer partition is resumed.
An event published when a consumer is paused.
Objects that can publish consumer pause/resume events.
Called by consumer factories to perform post processing on newly created consumers.
Common consumer properties.
Used to provide a listener method argument when the user supplies such a parameter.
A BiConsumer extension for recovering consumer records.
An event published when a consumer is resumed.
An event published when authentication or authorization of a consumer fails and is being retried.
Reasons for retrying authentication or authorization of a consumer.
An event published when authentication or authorization has been retried successfully.
Listeners that implement this interface are provided with a ConsumerSeekAware.ConsumerSeekCallback which can be used to perform a seek operation (see the usage sketch after this index).
A callback that a listener can invoke to seek to a specific offset.
An event published when a consumer has started.
An event published when a consumer is initializing.
An event published when a consumer is stopped.
Reasons for stopping a consumer.
An event published when a consumer is stopped.
Called by the container factory after the container is created and configured.
A group of listener containers.
Sequence the starting of container groups when all containers in the previous group are idle.
A manager that backs off consumption for a given topic if the timestamp provided is not due.
A BackOffHandler that pauses the container for the backoff.
Called by the container factory after the container is created and configured.
Contains runtime properties for a listener container.
The offset commit behavior enumeration.
Offset commit behavior during assignment.
Mode for exactly once semantics.
An event published when a container is stopped.
Utilities for testing listener containers.
Exception for conversions.
An AcknowledgingConsumerAwareMessageListener adapter that converts the received ConsumerRecord using the specified MessageConverter and then passes the result to the specified MessageListener.
Wrapper for byte[] that can be used as a hash key.
A ConsumerRecordRecoverer that publishes a failed record to a dead-letter topic (see the usage sketch after this index).
Use this to provide a custom implementation to take complete control over exception header creation for the output record.
Container class for the name of the headers that will be added to the produced record.
Provides a convenient API for creating DeadLetterPublishingRecoverer.HeaderNames.
Header names for exception headers.
Bits representing which headers to add.
Header names for original record property headers.
A Header that indicates that this header should replace any existing headers with this name, rather than being appended to the headers, which is the normal behavior.
Creates and configures the DeadLetterPublishingRecoverer that will be used to forward the messages using the DestinationTopicResolver.
Implement this interface to create each DeadLetterPublishingRecoverer.
Default implementation of AfterRollbackProcessor.
Default BackOffHandler; suspends the thread for the back off.
The default BatchToRecordAdapter implementation; if the supplied recoverer throws an exception, the batch will be aborted; otherwise the next record will be processed.
Default implementation of the DestinationTopicProcessor interface.
Default implementation of the DestinationTopicResolver interface.
An error handler that, for record listeners, seeks to the current offset for each topic in the remaining records.
Jackson 2 type mapper.
The ConsumerFactory implementation that produces new Consumer instances from the provided Map of configs and optional Deserializers on each ConsumerFactory.createConsumer() invocation.
Default header mapper for Apache Kafka.
Represents a header that could not be decoded due to an untrusted type.
The ProducerFactory implementation for a singleton shared Producer instance.
A wrapper class for the delegate.
A Deserializer that delegates to other deserializers based on the topic name.
Base class with common code for delegating by topic serialization.
A Serializer that delegates to other serializers based on a topic pattern.
Delegates to a serializer based on type.
A Deserializer that delegates to other deserializers based on a serialization selector header.
Delegates to an InvocableHandlerMethod based on the message payload type.
Classes implementing this interface allow containers to determine the type of the ultimate listener.
A Serializer that delegates to other serializers based on a serialization selector header.
A component implementing this interface can provide the next delivery attempt.
Exception returned in the consumer record value or key when a deserialization failure occurs.
Representation of a Destination Topic to which messages can be forwarded, such as retry topics and the DLT.
Provides methods to store and retrieve DestinationTopic instances.
The DestinationTopicProcessor creates and registers the DestinationTopic instances in the provided DestinationTopicProcessor.Context, also providing callback interfaces to be called upon the context properties.
Creates a list of DestinationTopic.Properties based on the provided configurations.
Provides methods for resolving the destination to which a message that failed to be processed should be forwarded.
Annotation to determine the method that should process the DLT topic message.
Strategies for handling DLT processing.
Annotation that can be specified on a test class that runs Spring for Apache Kafka based tests.
JUnit5 condition for an embedded broker.
One or more embedded Kafka brokers using KRaft.
A TestRule wrapper around an EmbeddedKafkaBroker.
A manager for one or more embedded Kafka brokers and ZooKeeper.
Ported from Scala to allow setting the port.
Enable Kafka listener annotated endpoints that are created under the covers by an AbstractListenerContainerFactory.
Enables the non-blocking topic-based delayed retries feature.
Enable default Kafka Streams components.
Customizes main, retry and DLT endpoints in the Retry Topic functionality and returns the resulting topic names.
Creates the EndpointCustomizer that will be used by the RetryTopicConfigurer to customize the main, retry and DLT endpoints.
Handler method for retrying endpoints.
Delegating key/value deserializer that catches exceptions, returning them in the headers as serialized Java objects.
Utilities for error handling.
Supports exception classification.
Subclass of ExponentialBackOff that allows the specification of the maximum number of retries rather than the maximum elapsed time.
Subclass of FailedRecordProcessor that can process (and recover) a batch.
Class containing all the contextual information around a deserialization error.
Common super class for classes that deal with failing to consume a consumer record.
A BatchMessageListener adapter that implements filter logic via a RecordFilterStrategy.
A MessageListener adapter that implements filter logic via a RecordFilterStrategy.
Top level interface for listeners.
Generic message listener container; adds parameters.
The TestExecutionListener to start an EmbeddedKafkaBroker in the beginning of the test plan and stop in the end.
Manipulate the headers.
Container object for SpEL evaluation.
The result of a method invocation.
Strategy for setting metadata on messages such that one can create the class that needs to be instantiated when receiving a message.
The precedence for type conversion - inferred from the method parameter or message headers.
A SimpleModule extension for MimeType serialization.
The utility to check if Jackson JSON processor is present in the classpath.
The utilities for Jackson ObjectMapper instances.
Chained utility methods to simplify some repetitive Java code.
Generic Deserializer for receiving JSON from Kafka and returning Java objects (see the usage sketch after this index).
Base class for JSON message converters; on the consumer side, it can handle byte[], Bytes and String record values.
A Serde that provides serialization and deserialization in JSON format.
Generic Serializer for sending Java objects to Kafka as JSON.
Determine the JavaType from the topic/data/headers.
Utility methods for JUnit rules and conditions.
An admin that delegates to an AdminClient to create topics defined in the application context.
Wrapper for a collection of NewTopic to facilitate declaring multiple topics as a single bean.
Provides a number of convenience methods wrapping AdminClient.
Detect and register Avro types for Apache Kafka listeners.
A transaction manager that can provide a ProducerFactory.
An AcknowledgingConsumerAwareMessageListener implementation that looks for a backoff dueTimestamp header and invokes a KafkaConsumerBackoffManager instance that will back off if necessary.
Exception thrown when the consumer should not yet consume the message due to backOff.
Creates a KafkaBackOffManagerFactory instance.
An ImportBeanDefinitionRegistrar class that registers a KafkaListenerAnnotationBeanPostProcessor bean capable of processing Spring's @KafkaListener annotation.
AssertJ custom Conditions.
Interface for backing off a MessageListenerContainer until a given dueTimestamp, if such timestamp is in the future.
Provides the state that will be used for backing off.
Base class for events.
The Spring for Apache Kafka specific NestedRuntimeException implementation.
The log level for KafkaException.
A top level abstract class for classes that can be configured with a KafkaException.Level.
Annotation that marks a method to be the target of a Kafka message listener within a class that is annotated with KafkaListener (see the usage sketch after this index).
Header mapper for Apache Kafka.
The Kafka specific message headers constants.
Contains properties for setting up an AppConfigurationEntry that can be used for the Kafka client.
Control flag values for login configuration.
Annotation that marks a method to be the target of a Kafka message listener on the specified topics (see the usage sketch after this index).
Bean post-processor that registers methods annotated with KafkaListener to be invoked by a Kafka message listener container created under the covers by a KafkaListenerContainerFactory according to the parameters of the annotation.
Post processes each set of annotation attributes.
A DeferredImportSelector implementation with the lowest order to import a KafkaBootstrapConfiguration as late as possible.
Optional interface to be implemented by a Spring-managed bean willing to customize how Kafka listener endpoints are configured.
Configuration constants for internal sharing across subpackages.
Model for a Kafka listener endpoint.
Helper bean for registering KafkaListenerEndpoint with a KafkaListenerEndpointRegistry.
Creates the necessary MessageListenerContainer instances for the registered endpoints.
An error handler which is called when a @KafkaListener method throws an exception.
Spring for Apache Kafka Observation for listeners.
Default KafkaListenerObservationConvention for Kafka listener key values.
Low cardinality tags.
ObservationConvention for Kafka listener key values.
Container annotation that aggregates several KafkaListener annotations.
Hamcrest Matchers utilities.
A header accessor to provide convenient access to certain headers in a type specific manner.
Overload of message headers configurable for adding id and timestamp headers.
Single-threaded Message listener container using the Java Consumer supporting auto-partition assignment or user-configured assignment.
This class represents a NULL Kafka payload.
PayloadMethodArgumentResolver that can properly decode KafkaNull payloads, returning null.
The basic Kafka operations contract returning CompletableFutures.
A callback for executing arbitrary operations on the KafkaOperations.
A callback for executing arbitrary operations on the Producer.
Exceptions when producing.
ReceiverContext for ConsumerRecords.
SenderContext for ProducerRecords.
Exception when a reply is not received within a timeout.
Base class for consumer/producer/admin creators.
Kafka resource holder, wrapping a Kafka producer.
RuntimeHintsRegistrar for Spring for Apache Kafka.
Provides a method-chaining way to build branches in Kafka Streams processor topology.
Wrapper for StreamsBuilder properties.
Callback interface that can be used to configure KafkaStreams directly.
@Configuration class that registers a StreamsBuilderFactoryBean if StreamsConfig with the name KafkaStreamsDefaultConfiguration.DEFAULT_STREAMS_CONFIG_BEAN_NAME is present in the application context.
A customizer for infrastructure components such as the StreamsBuilder and Topology.
Creates a KafkaStreamsMetrics for the KafkaStreams.
A template for executing high-level operations (see the usage sketch after this index).
Spring for Apache Kafka Observation for KafkaTemplate.
Default KafkaTemplateObservationConvention for Kafka template key values.
Low cardinality tags.
ObservationConvention for Kafka template key values.
Kafka testing utilities.
PlatformTransactionManager implementation for a single Kafka ProducerFactory.
Utility methods.
An event that is emitted when a container is idle if the container is configured to do so.
An event that is emitted when a container is no longer idle if configured to publish idle events.
An event that is emitted when a container partition is idle if the container is configured to do so.
An event that is emitted when a partition is no longer idle if configured to publish idle events.
Service for pausing and resuming MessageListenerContainer instances.
A registry for listener containers.
The listener specific KafkaException extension.
Metadata associated to a KafkaListener.
Defines the listener type.
Listener utilities.
A JUnit method @Rule that changes the logger level for a set of classes while a test method is running.
Logs commit results at DEBUG level for success and ERROR for failures.
The ProducerListener that logs exceptions thrown when sending messages.
Wrapper for a commons-logging Log supporting configurable logging levels.
Logging levels.
Test classes annotated with this will change logging levels between tests.
JUnit condition that adjusts and reverts log levels before/after each test.
A KafkaListenerErrorHandler that supports manual acks.
Subclass of MappingJackson2MessageConverter that can handle parameterized (generic) types.
A top level interface for message converters.
Listener for handling individual incoming Kafka messages.
Internal abstraction used by the framework representing a message listener container.
A function that receives a spring-messaging Message and returns a Message.
A Messaging MessageConverter implementation for a message listener that receives individual messages.
An abstract MessageListener adapter providing the necessary infrastructure to extract the payload of a Message.
Root object for reply expression evaluation.
A Transformer implementation that invokes a MessagingFunction converting to/from spring-messaging Message.
A KafkaListenerEndpoint providing the method to invoke to process an incoming message for this endpoint.
A consumer factory listener that manages KafkaClientMetrics.
A wrapper for micrometer timers when available on the class path.
A producer factory listener that manages KafkaClientMetrics.
Support the use of MockConsumer in tests.
Support the use of MockProducer in tests.
The MethodKafkaListenerEndpoint extension for several POJO methods based on the KafkaHandler.
An event that is emitted when a consumer is not responding to the poll; with early versions of the kafka-clients, this was a possible indication that the broker is down.
Provider for OffsetAndMetadata.
Generic Deserializer for deserialization of an entity from its String representation received from Kafka (a.k.a. parsing).
Used to add partition/initial offset information to a KafkaListener.
The strategy to produce Producer instance(s).
Called whenever a producer is added or removed.
Helper class for managing a Spring-based Kafka DefaultKafkaProducerFactory, in particular for obtaining transactional Kafka resources for a given ProducerFactory.
Listener for handling outbound Kafka messages.
Called by producer factories to perform post processing on newly created producers.
A MessageConverter implementation that uses a Spring Data ProjectionFactory to bind incoming messages to projection interfaces.
Reactive Kafka consumer operations implementation.
Reactive Kafka producer operations implementation.
Implementations of this interface can signal that a record about to be delivered to a message listener should be discarded instead of being delivered.
An interceptor for ConsumerRecord invoked by the listener container before and after invoking the listener.
A Kafka-specific Message converter strategy.
A MessageListener adapter that invokes a configurable HandlerAdapter; used when the factory is configured for the listener to receive individual messages.
A DeserializationExceptionHandler that calls a ConsumerRecordRecoverer.
Called to determine whether a record should be skipped.
A strategy for configuring which headers, if any, should be set in a reply message.
Request/reply operations.
A KafkaTemplate that implements request/reply semantics (see the usage sketch after this index).
A CompletableFuture for requests/replies.
A listenable future for Message replies.
A listenable future for Message replies with a specific payload type.
Annotation to create the retry and DLT topics for a KafkaListener-annotated listener (see the usage sketch after this index).
Processes the provided RetryableTopic annotation returning a RetryTopicConfiguration.
A deserializer configured with a delegate and a RetryOperations to retry deserialization in case of transient errors.
A listener for retry activity.
The bean names for the non-blocking topic-based delayed retries feature.
Provide the component instances that will be used with RetryTopicConfigurationSupport.
Contains the provided configuration for the retryable topics.
Builder class to create RetryTopicConfiguration instances.
Attempts to provide an instance of RetryTopicConfiguration by either creating one from a RetryableTopic annotation, or from the bean container if no annotation is available.
This is the main class providing the configuration behind the non-blocking, topic-based delayed retries feature.
Configure blocking retries to be used along with the non-blocking retries.
Configure customizers for components instantiated by the retry topics feature.
Configures main, retry and DLT topics based on a main endpoint and provided configurations to accomplish a distributed retry / DLT pattern in a non-blocking fashion, at the expense of ordering guarantees.
Constants for the RetryTopic functionality.
Contains the headers that will be used in the forwarded messages.
Handles the naming related to the retry and dead letter topics.
A wrapper class for a TaskScheduler to use for scheduling container resumption when a partition has been paused for a retry topic.
A KafkaTemplate that routes messages based on the topic name.
Strategy for topic reuse when multiple, sequential retries have the same backoff interval.
Seek utilities.
Result for a CompletableFuture after a send.
Utilities for serialization.
A simple header mapper that maps headers directly; for outbound, only byte[] headers are mapped; for inbound, headers are mapped unchanged, as byte[].
An AbstractFactoryBean for the StreamsBuilder instance and lifecycle control for the internal KafkaStreams instance.
Called whenever a KafkaStreams is added or removed.
A configurer for StreamsBuilderFactoryBean.
JSON Message converter - String on output, String, Bytes, or byte[] on input.
A serializer that can handle byte[], Bytes and String.
Utility class that suffixes strings.
Retry and dead letter naming handling that adds a suffix to each name.
A general interface for managing thread-bound resources when a Consumer is available.
A KafkaException that records the timestamp of when it was thrown.
A Serde that delegates to a ToStringSerializer and ParseStringDeserializer.
Builder for a NewTopic (see the usage sketch after this index).
Marker to indicate this NewTopic is for retryable topics; admin will ignore these if a regular NewTopic exists.
Used to add topic/partition information to a KafkaListener.
A configuration container to represent a topic name, partition number and, optionally, an offset for it.
Enumeration for "special" seeks.
Constants for the RetryTopic functionality.
Generic Serializer that relies on Object.toString() to get the serialized representation of an entity.
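
The sketches below illustrate a few of the classes and annotations listed above. They are minimal, hedged examples rather than definitive implementations: topic names, listener ids, group ids, bean wiring, and payload types are assumptions chosen for illustration.

A minimal sketch of a @KafkaListener method that acknowledges manually; it assumes the container is configured with AckMode.MANUAL or MANUAL_IMMEDIATE:

    import org.apache.kafka.clients.consumer.ConsumerRecord;
    import org.springframework.kafka.annotation.KafkaListener;
    import org.springframework.kafka.support.Acknowledgment;
    import org.springframework.stereotype.Component;

    @Component
    public class OrderListener {

        // The "orders" topic, listener id and String payload type are assumptions for this sketch.
        @KafkaListener(id = "orderListener", topics = "orders")
        public void listen(ConsumerRecord<String, String> record, Acknowledgment ack) {
            // process the record ...
            ack.acknowledge(); // signal that processing succeeded so the offset can be committed
        }
    }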
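A minimal sketch of sending with KafkaTemplate; in recent versions send(...) returns a CompletableFuture<SendResult<K, V>> (see the KafkaOperations entry above). The topic, key and payload are assumptions:

    import java.util.concurrent.CompletableFuture;
    import org.springframework.kafka.core.KafkaTemplate;
    import org.springframework.kafka.core.ProducerFactory;
    import org.springframework.kafka.support.SendResult;

    public class OrderSender {

        private final KafkaTemplate<String, String> template;

        public OrderSender(ProducerFactory<String, String> producerFactory) {
            this.template = new KafkaTemplate<>(producerFactory);
        }

        public void send(String orderId, String payload) {
            CompletableFuture<SendResult<String, String>> future =
                    this.template.send("orders", orderId, payload);
            future.whenComplete((result, ex) -> {
                if (ex != null) {
                    // handle the send failure (log, retry, alert, ...)
                }
            });
        }
    }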
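A minimal sketch of declaring a NewTopic bean with TopicBuilder so that the admin can create it on startup; the topic name, partition count and replication factor are assumptions:

    import org.apache.kafka.clients.admin.NewTopic;
    import org.springframework.context.annotation.Bean;
    import org.springframework.context.annotation.Configuration;
    import org.springframework.kafka.config.TopicBuilder;

    @Configuration
    public class TopicConfig {

        @Bean
        public NewTopic ordersTopic() {
            // Three partitions and a single replica are illustrative values only.
            return TopicBuilder.name("orders")
                    .partitions(3)
                    .replicas(1)
                    .build();
        }
    }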
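A minimal sketch that wires the error handler with a dead-letter recoverer and a capped exponential back off; the retry count and intervals are assumptions. The handler would then typically be set on the listener container factory via setCommonErrorHandler(...):

    import org.springframework.context.annotation.Bean;
    import org.springframework.context.annotation.Configuration;
    import org.springframework.kafka.core.KafkaOperations;
    import org.springframework.kafka.listener.DeadLetterPublishingRecoverer;
    import org.springframework.kafka.listener.DefaultErrorHandler;
    import org.springframework.kafka.support.ExponentialBackOffWithMaxRetries;

    @Configuration
    public class ErrorHandlingConfig {

        @Bean
        public DefaultErrorHandler errorHandler(KafkaOperations<Object, Object> template) {
            // After retries are exhausted, publish the failed record to a dead-letter topic.
            DeadLetterPublishingRecoverer recoverer = new DeadLetterPublishingRecoverer(template);
            // Retry up to 3 times with delays of roughly 1s, 2s and 4s (assumed values).
            ExponentialBackOffWithMaxRetries backOff = new ExponentialBackOffWithMaxRetries(3);
            backOff.setInitialInterval(1_000L);
            backOff.setMultiplier(2.0);
            return new DefaultErrorHandler(recoverer, backOff);
        }
    }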
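A minimal sketch of a consumer factory that configures the JSON deserializer programmatically; the Order record, bootstrap servers, group id and trusted package are assumptions:

    import java.util.HashMap;
    import java.util.Map;
    import org.apache.kafka.clients.consumer.ConsumerConfig;
    import org.apache.kafka.common.serialization.StringDeserializer;
    import org.springframework.kafka.core.ConsumerFactory;
    import org.springframework.kafka.core.DefaultKafkaConsumerFactory;
    import org.springframework.kafka.support.serializer.JsonDeserializer;

    public class JsonConsumerFactoryExample {

        public record Order(String id, int quantity) { }

        public ConsumerFactory<String, Order> consumerFactory() {
            Map<String, Object> props = new HashMap<>();
            props.put(ConsumerConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");
            props.put(ConsumerConfig.GROUP_ID_CONFIG, "orders");

            // Bind incoming JSON to Order; the trusted package matters only when type headers are present.
            JsonDeserializer<Order> valueDeserializer = new JsonDeserializer<>(Order.class);
            valueDeserializer.addTrustedPackages("com.example");

            // Deserializer instances are passed directly, so no deserializer class properties are needed.
            return new DefaultKafkaConsumerFactory<>(props, new StringDeserializer(), valueDeserializer);
        }
    }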
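A minimal sketch of a listener that implements ConsumerSeekAware to replay a topic from the beginning whenever partitions are assigned; the listener id and topic are assumptions:

    import java.util.Map;
    import org.apache.kafka.common.TopicPartition;
    import org.springframework.kafka.annotation.KafkaListener;
    import org.springframework.kafka.listener.ConsumerSeekAware;
    import org.springframework.stereotype.Component;

    @Component
    public class ReplayingListener implements ConsumerSeekAware {

        @KafkaListener(id = "replayer", topics = "audit")
        public void listen(String value) {
            // process the record value ...
        }

        @Override
        public void onPartitionsAssigned(Map<TopicPartition, Long> assignments, ConsumerSeekCallback callback) {
            // Seek each newly assigned partition back to the beginning before records are delivered.
            assignments.keySet().forEach(tp -> callback.seekToBeginning(tp.topic(), tp.partition()));
        }
    }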
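A minimal sketch of a class-level @KafkaListener with @KafkaHandler methods; the payload types and topic are assumptions, and routing by payload type presumes the configured deserializer actually produces those distinct types:

    import org.springframework.kafka.annotation.KafkaHandler;
    import org.springframework.kafka.annotation.KafkaListener;
    import org.springframework.stereotype.Component;

    @Component
    @KafkaListener(id = "multiHandler", topics = "events")
    public class EventListener {

        public record OrderEvent(String id) { }

        @KafkaHandler
        public void handle(OrderEvent event) {
            // handle order events
        }

        @KafkaHandler(isDefault = true)
        public void handleOther(Object event) {
            // fallback for any other payload type
        }
    }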
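A minimal sketch of non-blocking, topic-based retries with @RetryableTopic and a @DltHandler method; the attempt count, back off and topic name are assumptions:

    import org.springframework.kafka.annotation.DltHandler;
    import org.springframework.kafka.annotation.KafkaListener;
    import org.springframework.kafka.annotation.RetryableTopic;
    import org.springframework.retry.annotation.Backoff;
    import org.springframework.stereotype.Component;

    @Component
    public class RetryingOrderListener {

        @RetryableTopic(attempts = "4", backoff = @Backoff(delay = 1000, multiplier = 2.0))
        @KafkaListener(id = "retryingOrders", topics = "orders")
        public void listen(String order) {
            // an exception thrown here forwards the record to the next retry topic,
            // and eventually to the DLT once the attempts are exhausted
        }

        @DltHandler
        public void handleDlt(String order) {
            // handle the record that exhausted all retries
        }
    }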
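A minimal sketch of request/reply with ReplyingKafkaTemplate; the request and reply topic names, group id and String generics are assumptions, and the server side is expected to send a reply (for example from a @KafkaListener method annotated with @SendTo):

    import java.util.concurrent.TimeUnit;
    import org.apache.kafka.clients.consumer.ConsumerRecord;
    import org.apache.kafka.clients.producer.ProducerRecord;
    import org.springframework.context.annotation.Bean;
    import org.springframework.context.annotation.Configuration;
    import org.springframework.kafka.config.ConcurrentKafkaListenerContainerFactory;
    import org.springframework.kafka.core.ProducerFactory;
    import org.springframework.kafka.listener.ConcurrentMessageListenerContainer;
    import org.springframework.kafka.requestreply.ReplyingKafkaTemplate;
    import org.springframework.kafka.requestreply.RequestReplyFuture;

    @Configuration
    public class RequestReplyConfig {

        @Bean
        public ReplyingKafkaTemplate<String, String, String> replyingTemplate(
                ProducerFactory<String, String> pf,
                ConcurrentMessageListenerContainer<String, String> repliesContainer) {
            return new ReplyingKafkaTemplate<>(pf, repliesContainer);
        }

        @Bean
        public ConcurrentMessageListenerContainer<String, String> repliesContainer(
                ConcurrentKafkaListenerContainerFactory<String, String> factory) {
            // A dedicated container that consumes the reply topic; the replying template starts it.
            ConcurrentMessageListenerContainer<String, String> container = factory.createContainer("replies");
            container.getContainerProperties().setGroupId("replies-group");
            container.setAutoStartup(false);
            return container;
        }

        // Example call site (an injected ReplyingKafkaTemplate is assumed).
        public String ping(ReplyingKafkaTemplate<String, String, String> template) throws Exception {
            RequestReplyFuture<String, String, String> future =
                    template.sendAndReceive(new ProducerRecord<>("requests", "ping"));
            ConsumerRecord<String, String> reply = future.get(10, TimeUnit.SECONDS);
            return reply.value();
        }
    }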