Class GoogleGenAiChatModel

java.lang.Object
org.springframework.ai.google.genai.GoogleGenAiChatModel
All Implemented Interfaces:
ChatModel, StreamingChatModel, Model<Prompt,ChatResponse>, StreamingModel<Prompt,ChatResponse>, org.springframework.beans.factory.DisposableBean

public class GoogleGenAiChatModel extends Object implements ChatModel, org.springframework.beans.factory.DisposableBean
Google GenAI Chat Model implementation that provides access to Google's Gemini language models.

Key features include:

  • Support for multiple Gemini model versions including Gemini Pro, Gemini 1.5 Pro, Gemini 1.5/2.0 Flash variants
  • Tool/Function calling capabilities through ToolCallingManager
  • Streaming support via stream(Prompt) method
  • Configurable safety settings through GoogleGenAiSafetySetting
  • Support for system messages and multi-modal content (text and images)
  • Built-in retry mechanism and observability through Micrometer
  • Google Search Retrieval integration

The model can be configured with various options including temperature, top-k, top-p sampling, maximum output tokens, and candidate count through GoogleGenAiChatOptions.
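For illustration, options might be configured as follows (a minimal sketch; the exact GoogleGenAiChatOptions builder method names are assumed to mirror the option names above):

 GoogleGenAiChatOptions options = GoogleGenAiChatOptions.builder()
 		.model("gemini-2.0-flash")   // model name shown for illustration only
 		.temperature(0.4)            // sampling temperature
 		.topP(0.95)                  // top-p (nucleus) sampling
 		.maxOutputTokens(1024)       // cap on generated tokens
 		.build();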

Use the GoogleGenAiChatModel.Builder to create instances with custom configurations:


 GoogleGenAiChatModel model = GoogleGenAiChatModel.builder()
 		.genAiClient(genAiClient)
 		.defaultOptions(options)
 		.toolCallingManager(toolManager)
 		.build();
 
Since:
0.8.1
Author:
Christian Tzolov, Grogdunn, luocongqiu, Chris Turchin, Mark Pollack, Soby Chacko, Jihoon Kim, Alexandros Pappas, Ilayaperumal Gopinathan, Dan Dobrin
  • Constructor Details

    • GoogleGenAiChatModel

      public GoogleGenAiChatModel(com.google.genai.Client genAiClient, GoogleGenAiChatOptions defaultOptions, ToolCallingManager toolCallingManager, org.springframework.retry.support.RetryTemplate retryTemplate, io.micrometer.observation.ObservationRegistry observationRegistry)
      Creates a new instance of GoogleGenAiChatModel.
      Parameters:
      genAiClient - the GenAI Client instance to use
      defaultOptions - the default options to use
      toolCallingManager - the tool calling manager to use. It is wrapped in a GoogleGenAiToolCallingManager to ensure compatibility with Vertex AI's OpenAPI schema format.
      retryTemplate - the retry template to use
      observationRegistry - the observation registry to use
    • GoogleGenAiChatModel

      public GoogleGenAiChatModel(com.google.genai.Client genAiClient, GoogleGenAiChatOptions defaultOptions, ToolCallingManager toolCallingManager, org.springframework.retry.support.RetryTemplate retryTemplate, io.micrometer.observation.ObservationRegistry observationRegistry, ToolExecutionEligibilityPredicate toolExecutionEligibilityPredicate)
      Creates a new instance of GoogleGenAiChatModel.
      Parameters:
      genAiClient - the GenAI Client instance to use
      defaultOptions - the default options to use
      toolCallingManager - the tool calling manager to use. It is wrapped in a GoogleGenAiToolCallingManager to ensure compatibility with Vertex AI's OpenAPI schema format.
      retryTemplate - the retry template to use
      observationRegistry - the observation registry to use
      toolExecutionEligibilityPredicate - the tool execution eligibility predicate
  • Method Details

    • call

      public ChatResponse call(Prompt prompt)
      Description copied from interface: Model
      Executes a method call to the AI model.
      Specified by:
      call in interface ChatModel
      Specified by:
      call in interface Model<Prompt,ChatResponse>
      Parameters:
      prompt - the request object to be sent to the AI model
      Returns:
      the response from the AI model
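      A minimal synchronous usage sketch (assumes a configured model instance; accessor names follow the Spring AI ChatResponse API):

       // Blocking call: send a prompt and read the generated text
       ChatResponse response = model.call(new Prompt("Tell me a joke"));
       String text = response.getResult().getOutput().getText();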
    • stream

      public reactor.core.publisher.Flux<ChatResponse> stream(Prompt prompt)
      Description copied from interface: StreamingModel
      Executes a method call to the AI model.
      Specified by:
      stream in interface ChatModel
      Specified by:
      stream in interface StreamingChatModel
      Specified by:
      stream in interface StreamingModel<Prompt,ChatResponse>
      Parameters:
      prompt - the request object to be sent to the AI model
      Returns:
      the streaming response from the AI model
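      A minimal streaming usage sketch (illustrative; the caller is responsible for assembling the partial chunks):

       // Streaming call: print each partial chunk as it arrives
       model.stream(new Prompt("Write a haiku about spring"))
       		.filter(response -> response.getResult() != null)
       		.map(response -> response.getResult().getOutput().getText())
       		.subscribe(System.out::print);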
    • internalStream

      public reactor.core.publisher.Flux<ChatResponse> internalStream(Prompt prompt, ChatResponse previousChatResponse)
    • responseCandidateToGeneration

      protected List<Generation> responseCandidateToGeneration(com.google.genai.types.Candidate candidate)
    • getDefaultOptions

      public ChatOptions getDefaultOptions()
      Specified by:
      getDefaultOptions in interface ChatModel
    • getCachedContentService

      public GoogleGenAiCachedContentService getCachedContentService()
      Gets the cached content service for managing cached content.
      Returns:
      the cached content service
    • destroy

      public void destroy() throws Exception
      Specified by:
      destroy in interface org.springframework.beans.factory.DisposableBean
      Throws:
      Exception
    • setObservationConvention

      public void setObservationConvention(ChatModelObservationConvention observationConvention)
      Use the provided convention for reporting observation data.
      Parameters:
      observationConvention - The provided convention
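      For example (a sketch assuming Spring AI's DefaultChatModelObservationConvention is supplied as the convention):

       // Override the observation convention used for metrics and tracing
       model.setObservationConvention(new DefaultChatModelObservationConvention());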
    • builder

      public static GoogleGenAiChatModel.Builder builder()