Package org.springframework.ai.ollama
Class OllamaChatModel
java.lang.Object
org.springframework.ai.ollama.OllamaChatModel
- All Implemented Interfaces:
ChatModel, StreamingChatModel, Model<Prompt,ChatResponse>, StreamingModel<Prompt,ChatResponse>
ChatModel implementation for Ollama. Ollama allows developers to run
large language models and generate embeddings locally. It supports open-source models
available on [Ollama AI Library](...) and on
Hugging Face. Please refer to the official Ollama
website for the most up-to-date information on available models.
- Since:
- 1.0.0
- Author:
- Christian Tzolov, luocongqiu, Thomas Vitale, Jihoon Kim, Alexandros Pappas, Ilayaperumal Gopinathan, Sun Yuhan
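A minimal usage sketch, assuming a local Ollama server at the default endpoint and an already-pulled model; the base URL and the model name `llama3.2` are assumptions, not part of this API doc:

```java
import org.springframework.ai.chat.model.ChatResponse;
import org.springframework.ai.chat.prompt.Prompt;
import org.springframework.ai.ollama.OllamaChatModel;
import org.springframework.ai.ollama.api.OllamaApi;
import org.springframework.ai.ollama.api.OllamaOptions;

public class OllamaChatExample {

    public static void main(String[] args) {
        // Low-level client for the Ollama HTTP API; the base URL below is an
        // assumption matching Ollama's default local endpoint.
        OllamaApi ollamaApi = OllamaApi.builder()
                .baseUrl("http://localhost:11434")
                .build();

        // Build the chat model with default options; "llama3.2" must already
        // be pulled on the local Ollama instance.
        OllamaChatModel chatModel = OllamaChatModel.builder()
                .ollamaApi(ollamaApi)
                .defaultOptions(OllamaOptions.builder().model("llama3.2").build())
                .build();

        // Synchronous call: one Prompt in, one ChatResponse out.
        ChatResponse response = chatModel.call(new Prompt("Why is the sky blue?"));
        System.out.println(response.getResult().getOutput().getText());
    }
}
```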
-
Nested Class Summary
Nested Classes
- OllamaChatModel.Builder
Constructor Summary
Constructors
- OllamaChatModel(OllamaApi ollamaApi, OllamaOptions defaultOptions, ToolCallingManager toolCallingManager, io.micrometer.observation.ObservationRegistry observationRegistry, ModelManagementOptions modelManagementOptions)
- OllamaChatModel(OllamaApi ollamaApi, OllamaOptions defaultOptions, ToolCallingManager toolCallingManager, io.micrometer.observation.ObservationRegistry observationRegistry, ModelManagementOptions modelManagementOptions, ToolExecutionEligibilityPredicate toolExecutionEligibilityPredicate, org.springframework.retry.support.RetryTemplate retryTemplate)
Method Summary
- static OllamaChatModel.Builder builder()
- ChatResponse call(Prompt prompt): Executes a method call to the AI model.
- ChatOptions getDefaultOptions()
- void setObservationConvention(ChatModelObservationConvention observationConvention): Use the provided convention for reporting observation data.
- reactor.core.publisher.Flux<ChatResponse> stream(Prompt prompt): Executes a method call to the AI model.

Methods inherited from class java.lang.Object:
clone, equals, finalize, getClass, hashCode, notify, notifyAll, toString, wait, wait, wait

Methods inherited from interface org.springframework.ai.chat.model.StreamingChatModel:
stream, stream
-
Constructor Details
-
OllamaChatModel
public OllamaChatModel(OllamaApi ollamaApi, OllamaOptions defaultOptions, ToolCallingManager toolCallingManager, io.micrometer.observation.ObservationRegistry observationRegistry, ModelManagementOptions modelManagementOptions)
-
OllamaChatModel
public OllamaChatModel(OllamaApi ollamaApi, OllamaOptions defaultOptions, ToolCallingManager toolCallingManager, io.micrometer.observation.ObservationRegistry observationRegistry, ModelManagementOptions modelManagementOptions, ToolExecutionEligibilityPredicate toolExecutionEligibilityPredicate, org.springframework.retry.support.RetryTemplate retryTemplate)
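As a sketch, the seven-argument constructor can be wired with framework defaults. The default-providing calls below (`ToolCallingManager.builder().build()`, `ObservationRegistry.NOOP`, `ModelManagementOptions.defaults()`, `DefaultToolExecutionEligibilityPredicate`, `RetryTemplate.defaultInstance()`) are assumptions about the collaborating APIs, as is the model name:

```java
import io.micrometer.observation.ObservationRegistry;
import org.springframework.ai.model.tool.DefaultToolExecutionEligibilityPredicate;
import org.springframework.ai.model.tool.ToolCallingManager;
import org.springframework.ai.ollama.OllamaChatModel;
import org.springframework.ai.ollama.api.OllamaApi;
import org.springframework.ai.ollama.api.OllamaOptions;
import org.springframework.ai.ollama.management.ModelManagementOptions;
import org.springframework.retry.support.RetryTemplate;

public class OllamaConstructorExample {

    static OllamaChatModel create() {
        return new OllamaChatModel(
                OllamaApi.builder().build(),                       // default local endpoint
                OllamaOptions.builder().model("llama3.2").build(), // model name is an assumption
                ToolCallingManager.builder().build(),              // default tool-calling support
                ObservationRegistry.NOOP,                          // no-op observations
                ModelManagementOptions.defaults(),                 // no automatic model pulling
                new DefaultToolExecutionEligibilityPredicate(),
                RetryTemplate.defaultInstance());                  // spring-retry default policy
    }
}
```

In most applications the `builder()` shown above is the simpler entry point; the full constructor is useful when the collaborators are managed explicitly.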
-
-
Method Details
-
builder
public static OllamaChatModel.Builder builder()
-
call
Description copied from interface: Model
Executes a method call to the AI model.
-
stream
Description copied from interface: StreamingModel
Executes a method call to the AI model.
- Specified by:
- stream in interface ChatModel
- Specified by:
- stream in interface StreamingChatModel
- Specified by:
- stream in interface StreamingModel<Prompt,ChatResponse>
- Parameters:
- prompt - the request object to be sent to the AI model
- Returns:
- the streaming response from the AI model
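The streaming variant can be sketched as follows, assuming a configured `OllamaChatModel` as built earlier; the helper method and prompt text are illustrative:

```java
import org.springframework.ai.chat.model.ChatResponse;
import org.springframework.ai.chat.prompt.Prompt;
import org.springframework.ai.ollama.OllamaChatModel;

import reactor.core.publisher.Flux;

public class OllamaStreamingExample {

    // Prints each partial chunk of the answer as it arrives from the model.
    static void streamAnswer(OllamaChatModel chatModel) {
        Flux<ChatResponse> responses = chatModel.stream(new Prompt("Tell me a story"));
        responses
                .mapNotNull(r -> r.getResult().getOutput().getText()) // skip chunks without text
                .doOnNext(System.out::print)
                .blockLast(); // blocking is for this demo only; subscribe in reactive flows
    }
}
```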
-
getDefaultOptions
- Specified by:
- getDefaultOptions in interface ChatModel
-
setObservationConvention
Use the provided convention for reporting observation data.
- Parameters:
- observationConvention - The provided convention
-
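A hedged sketch of supplying an observation convention; `DefaultChatModelObservationConvention` and the builder's `observationRegistry(...)` method are assumptions about the surrounding Spring AI observability API:

```java
import io.micrometer.observation.ObservationRegistry;
import org.springframework.ai.chat.observation.DefaultChatModelObservationConvention;
import org.springframework.ai.ollama.OllamaChatModel;
import org.springframework.ai.ollama.api.OllamaApi;

public class ObservationSetupExample {

    static OllamaChatModel withObservability(ObservationRegistry registry) {
        OllamaChatModel chatModel = OllamaChatModel.builder()
                .ollamaApi(OllamaApi.builder().build())
                .observationRegistry(registry) // registry that receives the observations
                .build();
        // Override the convention used when reporting observation data;
        // a custom ChatModelObservationConvention could be supplied instead.
        chatModel.setObservationConvention(new DefaultChatModelObservationConvention());
        return chatModel;
    }
}
```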