Package org.springframework.ai.ollama
Class OllamaChatModel
java.lang.Object
org.springframework.ai.chat.model.AbstractToolCallSupport
org.springframework.ai.ollama.OllamaChatModel
- All Implemented Interfaces:
ChatModel, StreamingChatModel, Model<Prompt,ChatResponse>, StreamingModel<Prompt,ChatResponse>
ChatModel implementation for Ollama. Ollama allows developers to run large language models and generate embeddings locally. It supports open-source models available on the [Ollama AI Library](...) and on Hugging Face. Please refer to the official Ollama website for the most up-to-date information on available models.
- Since:
- 1.0.0
- Author:
- Christian Tzolov, luocongqiu, Thomas Vitale, Jihoon Kim
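The sketch below is not part of the original Javadoc; it illustrates typical standalone usage. It assumes an Ollama server on the default local port, the no-arg OllamaApi constructor, and with*-style Builder setters; the builder method names and the "llama3" model are assumptions that may differ across Spring AI versions.

```java
import org.springframework.ai.chat.model.ChatResponse;
import org.springframework.ai.chat.prompt.Prompt;
import org.springframework.ai.ollama.OllamaChatModel;
import org.springframework.ai.ollama.api.OllamaApi;
import org.springframework.ai.ollama.api.OllamaOptions;

public class OllamaChatModelExample {

    public static void main(String[] args) {
        // Low-level API client; by default it targets a local Ollama server (http://localhost:11434).
        OllamaApi ollamaApi = new OllamaApi();

        // Builder setter names (withOllamaApi/withDefaultOptions) are assumptions; check your version.
        OllamaChatModel chatModel = OllamaChatModel.builder()
                .withOllamaApi(ollamaApi)
                .withDefaultOptions(OllamaOptions.create().withModel("llama3"))
                .build();

        // Blocking call; returns the full ChatResponse once generation completes.
        ChatResponse response = chatModel.call(new Prompt("Tell me a joke about llamas."));
        System.out.println(response.getResult().getOutput().getContent());
    }
}
```

In a Spring Boot application this bean would normally be auto-configured by the Ollama starter rather than built by hand.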
-
Nested Class Summary
Nested Classes: static class OllamaChatModel.Builder
-
Field Summary
Fields inherited from class org.springframework.ai.chat.model.AbstractToolCallSupport
functionCallbackContext, functionCallbackRegister, IS_RUNTIME_CALL
-
Constructor Summary
Constructor | Description
OllamaChatModel(OllamaApi ollamaApi, OllamaOptions defaultOptions, FunctionCallbackContext functionCallbackContext, List<FunctionCallback> toolFunctionCallbacks, io.micrometer.observation.ObservationRegistry observationRegistry, ModelManagementOptions modelManagementOptions)
-
Method Summary
Modifier and Type | Method | Description
static OllamaChatModel.Builder | builder()
ChatResponse | call(Prompt prompt) | Executes a method call to the AI model.
static ChatResponseMetadata | from(OllamaApi.ChatResponse response)
ChatOptions | getDefaultOptions()
void | setObservationConvention(ChatModelObservationConvention observationConvention) | Use the provided convention for reporting observation data
reactor.core.publisher.Flux<ChatResponse> | stream(Prompt prompt) | Executes a method call to the AI model.
Methods inherited from class org.springframework.ai.chat.model.AbstractToolCallSupport
buildToolCallConversation, executeFunctions, getFunctionCallbackRegister, handleToolCalls, isProxyToolCalls, isToolCall, isToolCall, resolveFunctionCallbacks, runtimeFunctionCallbackConfigurations
Methods inherited from class java.lang.Object
clone, equals, finalize, getClass, hashCode, notify, notifyAll, toString, wait, wait, wait
Methods inherited from interface org.springframework.ai.chat.model.StreamingChatModel
stream, stream
-
Constructor Details
-
OllamaChatModel
public OllamaChatModel(OllamaApi ollamaApi, OllamaOptions defaultOptions, FunctionCallbackContext functionCallbackContext, List<FunctionCallback> toolFunctionCallbacks, io.micrometer.observation.ObservationRegistry observationRegistry, ModelManagementOptions modelManagementOptions)
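A minimal construction sketch using the signature above; the argument values (no-arg FunctionCallbackContext, ObservationRegistry.NOOP, ModelManagementOptions.defaults(), and the "llama3" model name) are assumed sensible defaults, not values mandated by this constructor.

```java
import java.util.List;

import io.micrometer.observation.ObservationRegistry;
import org.springframework.ai.model.function.FunctionCallbackContext;
import org.springframework.ai.ollama.OllamaChatModel;
import org.springframework.ai.ollama.api.OllamaApi;
import org.springframework.ai.ollama.api.OllamaOptions;
import org.springframework.ai.ollama.management.ModelManagementOptions;

class OllamaChatModelFactory {

    static OllamaChatModel create() {
        return new OllamaChatModel(
                new OllamaApi(),                             // default local endpoint
                OllamaOptions.create().withModel("llama3"),  // default options applied to every request
                new FunctionCallbackContext(),               // function-call resolution context (assumed default)
                List.of(),                                   // no tool/function callbacks registered up front
                ObservationRegistry.NOOP,                    // disable observation reporting
                ModelManagementOptions.defaults());          // assumed factory for default management options
    }
}
```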
-
Method Details
-
builder
public static OllamaChatModel.Builder builder()
-
from
public static ChatResponseMetadata from(OllamaApi.ChatResponse response)
-
call
public ChatResponse call(Prompt prompt)
Description copied from interface: Model
Executes a method call to the AI model.
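A hedged example of a blocking call with a multi-message Prompt; chatModel is assumed to have been constructed as in the earlier sketches, and getContent() reflects the pre-1.0 accessor name, which may differ in later releases.

```java
import java.util.List;

import org.springframework.ai.chat.messages.SystemMessage;
import org.springframework.ai.chat.messages.UserMessage;
import org.springframework.ai.chat.model.ChatResponse;
import org.springframework.ai.chat.prompt.Prompt;

// Combine a system instruction with a user message into a single Prompt.
Prompt prompt = new Prompt(List.of(
        new SystemMessage("You are a concise assistant."),
        new UserMessage("Summarize what Ollama does in one sentence.")));

// call(..) blocks until the model has produced the full response.
ChatResponse response = chatModel.call(prompt);
String text = response.getResult().getOutput().getContent();
```
-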
stream
public reactor.core.publisher.Flux<ChatResponse> stream(Prompt prompt)
Description copied from interface: StreamingModel
Executes a method call to the AI model.
- Specified by:
stream in interface ChatModel
- Specified by:
stream in interface StreamingChatModel
- Specified by:
stream in interface StreamingModel<Prompt,ChatResponse>
- Parameters:
prompt - the request object to be sent to the AI model
- Returns:
- the streaming response from the AI model
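A hedged streaming sketch; chatModel is assumed from the earlier construction, and the final chunk may carry empty content, hence the null checks.

```java
import org.springframework.ai.chat.prompt.Prompt;

// stream(..) returns a Flux that emits partial ChatResponse chunks as tokens arrive.
chatModel.stream(new Prompt("Write a haiku about local LLMs."))
        .doOnNext(chunk -> {
            var generation = chunk.getResult();
            if (generation != null && generation.getOutput().getContent() != null) {
                System.out.print(generation.getOutput().getContent());
            }
        })
        .blockLast(); // blocking here only to keep the demo alive until the stream completes
```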
-
getDefaultOptions
public ChatOptions getDefaultOptions()
- Specified by:
getDefaultOptions in interface ChatModel
-
setObservationConvention
public void setObservationConvention(ChatModelObservationConvention observationConvention)
Use the provided convention for reporting observation data.
- Parameters:
observationConvention - the provided convention
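A short sketch; DefaultChatModelObservationConvention from spring-ai-core is assumed to be the stock convention, and any custom ChatModelObservationConvention implementation could be supplied instead.

```java
import org.springframework.ai.chat.observation.DefaultChatModelObservationConvention;

// Override the convention used when reporting observation data for this chat model.
chatModel.setObservationConvention(new DefaultChatModelObservationConvention());
```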
-