Class AnthropicChatModel

java.lang.Object
org.springframework.ai.anthropic.AnthropicChatModel
All Implemented Interfaces:
ChatModel, StreamingChatModel, Model<Prompt,ChatResponse>, StreamingModel<Prompt,ChatResponse>

public final class AnthropicChatModel extends Object implements ChatModel, StreamingChatModel
ChatModel and StreamingChatModel implementation using the official Anthropic Java SDK.

Supports synchronous and streaming completions, tool calling, and Micrometer-based observability. API credentials are auto-detected from ANTHROPIC_API_KEY if not configured.

Since:
1.0.0
Author:
Christian Tzolov, luocongqiu, Mariusz Bernacki, Thomas Vitale, Claudio Silva Junior, Alexandros Pappas, Jonghoon Park, Soby Chacko, Austin Dase
  • Method Details

    • builder

      public static AnthropicChatModel.Builder builder()
      Creates a new builder for AnthropicChatModel.
      Returns:
      a new builder instance
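      A minimal construction sketch. The builder setter names below (defaultOptions and the nested AnthropicChatOptions builder methods) are assumptions based on common Spring AI conventions and are not confirmed by this page:

```java
// Sketch: building an AnthropicChatModel. With no API key configured,
// credentials are auto-detected from ANTHROPIC_API_KEY (per the class docs).
// Builder method names here are assumptions; check the Builder's own docs.
AnthropicChatModel chatModel = AnthropicChatModel.builder()
        .defaultOptions(AnthropicChatOptions.builder()
                .model("claude-sonnet-4-0") // illustrative model id
                .maxTokens(1024)
                .build())
        .build();
```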
    • getOptions

      public AnthropicChatOptions getOptions()
      Gets the chat options for this model.
      Returns:
      the chat options
    • getAnthropicClient

      public com.anthropic.client.AnthropicClient getAnthropicClient()
      Returns the underlying synchronous Anthropic SDK client. Useful for accessing SDK features directly, such as the Files API (client.beta().files()).
      Returns:
      the sync client
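      The class docs call out the beta Files API as a reason to reach for the raw client; a hedged sketch (only the client.beta().files() accessor is taken from this page; anything beyond it belongs to the SDK's own API):

```java
// Drop down to the official Anthropic Java SDK for features that
// Spring AI does not wrap, e.g. the beta Files API.
com.anthropic.client.AnthropicClient client = chatModel.getAnthropicClient();
var files = client.beta().files();
// From here, use the SDK's file service methods directly; see the
// Anthropic Java SDK documentation for the available operations.
```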
    • getAnthropicClientAsync

      public com.anthropic.client.AnthropicClientAsync getAnthropicClientAsync()
      Returns the underlying asynchronous Anthropic SDK client. Useful for non-blocking access to SDK features directly, such as the Files API.
      Returns:
      the async client
    • call

      public ChatResponse call(Prompt prompt)
      Description copied from interface: Model
      Executes a method call to the AI model.
      Specified by:
      call in interface ChatModel
      Specified by:
      call in interface Model<Prompt,ChatResponse>
      Parameters:
      prompt - the request object to be sent to the AI model
      Returns:
      the response from the AI model
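      A blocking call sketch. The ChatResponse accessor chain (getResult().getOutput().getText()) follows Spring AI 1.x conventions and is an assumption here:

```java
// Sketch: synchronous chat completion.
Prompt prompt = new Prompt("Summarize the Reactive Streams spec in one sentence.");
ChatResponse response = chatModel.call(prompt);
// Accessor names are assumptions based on Spring AI 1.x:
String text = response.getResult().getOutput().getText();
```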
    • stream

      public reactor.core.publisher.Flux<ChatResponse> stream(Prompt prompt)
      Description copied from interface: StreamingModel
      Executes a method call to the AI model.
      Specified by:
      stream in interface ChatModel
      Specified by:
      stream in interface StreamingChatModel
      Specified by:
      stream in interface StreamingModel<Prompt,ChatResponse>
      Parameters:
      prompt - the request object to be sent to the AI model
      Returns:
      the streaming response from the AI model
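      A streaming consumption sketch; blockLast() is used only to keep the demo alive and should be replaced by reactive composition in real services:

```java
// Sketch: each emitted ChatResponse carries an incremental chunk.
Flux<ChatResponse> flux = chatModel.stream(new Prompt("Tell me a short story."));
flux.mapNotNull(r -> r.getResult() != null
                ? r.getResult().getOutput().getText()
                : null)
    .doOnNext(System.out::print)
    .blockLast(); // demo only; avoid blocking in production code
```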
    • internalStream

      public reactor.core.publisher.Flux<ChatResponse> internalStream(Prompt prompt, @Nullable ChatResponse previousChatResponse)
      Internal method to handle streaming chat completion calls with tool execution support. This method is called recursively to support multi-turn tool calling.
      Parameters:
      prompt - The prompt for the chat completion. In a recursive tool-call scenario, this prompt will contain the full conversation history including the tool results.
      previousChatResponse - The chat response from the preceding API call. This is used to accumulate token usage correctly across multiple API calls in a single user turn.
      Returns:
      A Flux of ChatResponse events, which can include text chunks and the final response with tool call information or the model's final answer.
    • internalCall

      public ChatResponse internalCall(Prompt prompt, @Nullable ChatResponse previousChatResponse)
      Internal method to handle synchronous chat completion calls with tool execution support. This method is called recursively to support multi-turn tool calling.
      Parameters:
      prompt - The prompt for the chat completion. In a recursive tool-call scenario, this prompt will contain the full conversation history including the tool results.
      previousChatResponse - The chat response from the preceding API call. This is used to accumulate token usage correctly across multiple API calls in a single user turn.
      Returns:
      The final ChatResponse after all tool calls (if any) are resolved.
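      The recursive tool loop described above is normally triggered through call() with tool-aware options rather than by invoking internalCall directly. A hedged sketch; the @Tool annotation, the ToolCallbacks helper, and the toolCallbacks(...) builder method are assumptions drawn from Spring AI's tool-calling API, not from this page:

```java
// Hypothetical tool the model may invoke during the conversation.
class WeatherTools {
    @Tool(description = "Get the current temperature for a city")
    String currentTemperature(String city) {
        return "21 degrees C"; // illustrative stub, not a real lookup
    }
}

Prompt prompt = new Prompt(
        "What's the temperature in Oslo?",
        AnthropicChatOptions.builder()
                .toolCallbacks(ToolCallbacks.from(new WeatherTools()))
                .build());
// call() drives internalCall recursively: it executes requested tools,
// feeds results back, and accumulates token usage across the turns,
// returning only the final answer.
ChatResponse response = chatModel.call(prompt);
```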
    • getDefaultOptions

      public ChatOptions getDefaultOptions()
      Returns the default chat options configured for this model.
      Specified by:
      getDefaultOptions in interface ChatModel
    • setObservationConvention

      public void setObservationConvention(ChatModelObservationConvention observationConvention)
      Use the provided convention for reporting observation data.
      Parameters:
      observationConvention - the provided convention
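      A sketch of plugging in a custom convention, e.g. to rename the observation. DefaultChatModelObservationConvention is assumed to be Spring AI's default implementation:

```java
// Sketch: override the observation name emitted for each chat call.
chatModel.setObservationConvention(new DefaultChatModelObservationConvention() {
    @Override
    public String getName() {
        return "my.app.chat"; // custom metric/span name
    }
});
```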