Interface StreamingModelClient<TReq extends ModelRequest<?>, TResChunk extends ModelResponse<?>>

Type Parameters:
TReq - the generic type of the request to the AI model
TResChunk - the generic type of a single item in the streaming response from the AI model
All Known Subinterfaces:
StreamingChatClient
All Known Implementing Classes:
AzureOpenAiChatClient, BedrockAnthropicChatClient, BedrockCohereChatClient, BedrockLlama2ChatClient, BedrockTitanChatClient, MistralAiChatClient, OllamaChatClient, OpenAiChatClient, VertexAiGeminiChatClient

public interface StreamingModelClient<TReq extends ModelRequest<?>, TResChunk extends ModelResponse<?>>
The StreamingModelClient interface provides a generic API for invoking AI models with a streaming response. It abstracts the process of sending a request and receiving the response as a stream of chunks. The interface uses Java generics to accommodate different request and response-chunk types, enhancing flexibility and adaptability across different AI model implementations. A brief consumption sketch is shown below.
Since:
0.8.0
Author:
Christian Tzolov
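The following is a minimal consumption sketch, not part of this interface's documentation. It works against the StreamingChatClient subinterface listed above and assumes the Spring AI 0.8.0 Prompt and ChatResponse types bind the generics, with the accessor chain getResult().getOutput().getContent() used to read each partial generation; the StreamingChatDemo class and streamStory method are illustrative names only.

    import org.springframework.ai.chat.ChatResponse;
    import org.springframework.ai.chat.StreamingChatClient;
    import org.springframework.ai.chat.prompt.Prompt;
    import reactor.core.publisher.Flux;

    public class StreamingChatDemo {

        // 'client' is assumed to be one of the implementing classes listed above,
        // e.g. an OpenAiChatClient provided by Spring Boot auto-configuration.
        void streamStory(StreamingChatClient client) {
            Flux<ChatResponse> chunks = client.stream(new Prompt("Tell me a short story"));

            // Print each partial generation as it arrives. blockLast() is demo-only;
            // real applications should stay reactive end to end.
            chunks.doOnNext(chunk -> System.out.println(chunk.getResult().getOutput().getContent()))
                  .blockLast();
        }
    }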
  • Method Summary

    Modifier and Type: reactor.core.publisher.Flux<TResChunk>
    Method: stream(TReq request)
    Description: Executes a streaming call to the AI model and returns the response as a Flux of chunks.
  • Method Details

    • stream

      reactor.core.publisher.Flux<TResChunk> stream(TReq request)
      Executes a streaming call to the AI model and returns the response as a reactive stream of chunks.
      Parameters:
      request - the request object to be sent to the AI model
      Returns:
      the streaming response from the AI model, delivered as a Flux of response chunks
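
To make the reactive contract of stream(TReq) concrete, here is a small hedged helper that works against any StreamingModelClient. The type bounds mirror the interface declaration above; the package imports assume the org.springframework.ai.model location of these types in Spring AI 0.8.0, and the StreamingClients class, collectChunks method, and its chunk-limit/timeout parameters are illustrative assumptions rather than part of the API.

    import java.time.Duration;
    import java.util.List;

    import org.springframework.ai.model.ModelRequest;
    import org.springframework.ai.model.ModelResponse;
    import org.springframework.ai.model.StreamingModelClient;

    import reactor.core.publisher.Flux;

    public final class StreamingClients {

        private StreamingClients() {
        }

        // Collects at most maxChunks items from the streaming response, giving up
        // after the supplied timeout. Purely an illustrative wrapper around stream(TReq).
        public static <TReq extends ModelRequest<?>, TResChunk extends ModelResponse<?>>
                List<TResChunk> collectChunks(StreamingModelClient<TReq, TResChunk> client,
                                              TReq request,
                                              int maxChunks,
                                              Duration timeout) {
            Flux<TResChunk> chunks = client.stream(request);
            return chunks.take(maxChunks)
                         .timeout(timeout)
                         .collectList()
                         .block(); // blocking suits a throwaway demo; compose reactively otherwise
        }
    }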