Interface StreamingModel<TReq extends ModelRequest<?>, TResChunk extends ModelResponse<?>>

Type Parameters:
TReq - the generic type of the request to the AI model
TResChunk - the generic type of a single item in the streaming response from the AI model
All Known Subinterfaces:
ChatModel, StreamingChatModel, StreamingSpeechModel
All Known Implementing Classes:
AnthropicChatModel, AzureOpenAiChatModel, BedrockAi21Jurassic2ChatModel, BedrockAnthropic3ChatModel, BedrockAnthropicChatModel, BedrockCohereChatModel, BedrockLlamaChatModel, BedrockProxyChatModel, BedrockTitanChatModel, HuggingfaceChatModel, MiniMaxChatModel, MistralAiChatModel, MoonshotChatModel, OCICohereChatModel, OllamaChatModel, OpenAiAudioSpeechModel, OpenAiChatModel, QianFanChatModel, VertexAiGeminiChatModel, WatsonxAiChatModel, ZhiPuAiChatModel

public interface StreamingModel<TReq extends ModelRequest<?>, TResChunk extends ModelResponse<?>>
The StreamingModel interface provides a generic API for invoking AI models with a streaming response. It abstracts the process of sending a request and receiving the response as a stream of chunks. The interface uses Java generics to accommodate different types of requests and responses, enhancing flexibility and adaptability across different AI model implementations.
Since:
0.8.0
Author:
Christian Tzolov
  • Method Summary

    Modifier and Type
    Method
    Description
    reactor.core.publisher.Flux<TResChunk>
    stream(TReq request)
Sends the given request to the AI model and streams the response back as it is produced.
  • Method Details

    • stream

      reactor.core.publisher.Flux<TResChunk> stream(TReq request)
Sends the given request to the AI model and streams the response back as it is produced.
      Parameters:
      request - the request object to be sent to the AI model
      Returns:
      the streaming response from the AI model
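The contract above can be sketched with a minimal, self-contained example. The sketch below defines simplified stand-ins for ModelRequest and ModelResponse (the real interfaces live in the org.springframework.ai.model package and carry additional methods), plus a toy implementation of StreamingModel that emits one response chunk per whitespace-separated token. TextRequest, TextChunk, and EchoStreamingModel are hypothetical names introduced here for illustration, not part of Spring AI.

```java
import reactor.core.publisher.Flux;

public class StreamingModelSketch {

    // Simplified stand-ins for Spring AI's ModelRequest and ModelResponse
    // (illustrative only; the real interfaces define further methods).
    interface ModelRequest<T> { T getInstructions(); }
    interface ModelResponse<T> { T getResult(); }

    // The generic streaming contract: one request in, a Flux of chunks out.
    interface StreamingModel<TReq extends ModelRequest<?>, TResChunk extends ModelResponse<?>> {
        Flux<TResChunk> stream(TReq request);
    }

    // A request carrying a plain text instruction.
    static class TextRequest implements ModelRequest<String> {
        private final String text;
        TextRequest(String text) { this.text = text; }
        public String getInstructions() { return text; }
    }

    // A single chunk of the streaming response.
    static class TextChunk implements ModelResponse<String> {
        private final String token;
        TextChunk(String token) { this.token = token; }
        public String getResult() { return token; }
    }

    // A toy model that "streams" the request back, one token per chunk.
    static class EchoStreamingModel implements StreamingModel<TextRequest, TextChunk> {
        @Override
        public Flux<TextChunk> stream(TextRequest request) {
            return Flux.fromArray(request.getInstructions().split("\\s+"))
                       .map(TextChunk::new);
        }
    }

    public static void main(String[] args) {
        // Consume the stream chunk by chunk, as a caller of a real
        // StreamingChatModel would.
        new EchoStreamingModel()
            .stream(new TextRequest("hello streaming world"))
            .map(TextChunk::getResult)
            .subscribe(System.out::println);
    }
}
```

Because stream returns a Reactor Flux, callers can apply the usual reactive operators (map, buffer, take, and so on) to the chunks, and backpressure is handled by the Reactive Streams machinery rather than by the model implementation itself.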