Class OllamaApi
java.lang.Object
    org.springframework.ai.ollama.api.OllamaApi
Java client for the Ollama API (https://ollama.ai).

- Since:
  - 0.8.0
- Author:
  - Christian Tzolov, Thomas Vitale, Jonghoon Park
Nested Class Summary

Nested Classes
Modifier and Type      Class                          Description
static class           OllamaApi.Builder
static final record    OllamaApi.ChatRequest          Chat request object.
static final record    OllamaApi.ChatResponse         Ollama chat response object.
static final record    OllamaApi.CopyModelRequest
static final record    OllamaApi.DeleteModelRequest
static final record    OllamaApi.EmbeddingsRequest    Generate embeddings from a model.
static final record    OllamaApi.EmbeddingsResponse   The response object returned from the /embedding endpoint.
static final record    OllamaApi.ListModelResponse
static final record    OllamaApi.Message              Chat message object.
static final record    OllamaApi.Model
static final record    OllamaApi.ProgressResponse
static final record    OllamaApi.PullModelRequest
static final record    OllamaApi.ShowModelRequest
static final record    OllamaApi.ShowModelResponse
Field Summary

Fields
    REQUEST_BODY_NULL_ERROR
Method Summary

static OllamaApi.Builder builder()

OllamaApi.ChatResponse chat(OllamaApi.ChatRequest chatRequest)
    Generate the next message in a chat with a provided model.

org.springframework.http.ResponseEntity<Void> copyModel(OllamaApi.CopyModelRequest copyModelRequest)
    Copy a model.

org.springframework.http.ResponseEntity<Void> deleteModel(OllamaApi.DeleteModelRequest deleteModelRequest)
    Delete a model and its data.

OllamaApi.EmbeddingsResponse embed(OllamaApi.EmbeddingsRequest embeddingsRequest)
    Generate embeddings from a model.

OllamaApi.ListModelResponse listModels()
    List models that are available locally on the machine where Ollama is running.

reactor.core.publisher.Flux<OllamaApi.ProgressResponse> pullModel(OllamaApi.PullModelRequest pullModelRequest)
    Download a model from the Ollama library.

OllamaApi.ShowModelResponse showModel(OllamaApi.ShowModelRequest showModelRequest)
    Show information about a model available locally on the machine where Ollama is running.

reactor.core.publisher.Flux<OllamaApi.ChatResponse> streamingChat(OllamaApi.ChatRequest chatRequest)
    Streaming response for the chat completion request.
Field Details

REQUEST_BODY_NULL_ERROR
    See Also:
        Constant Field Values
Method Details

builder

public static OllamaApi.Builder builder()
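A minimal construction sketch. The `baseUrl` builder property and the default Ollama address shown here are assumptions based on recent Spring AI releases:

```java
import org.springframework.ai.ollama.api.OllamaApi;

// Build a client pointing at a locally running Ollama server.
// If baseUrl is omitted, the client is assumed to default to
// http://localhost:11434.
OllamaApi ollamaApi = OllamaApi.builder()
        .baseUrl("http://localhost:11434")
        .build();
```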
chat

public OllamaApi.ChatResponse chat(OllamaApi.ChatRequest chatRequest)

Generate the next message in a chat with a provided model. This is a streaming endpoint (controlled by the 'stream' request property), so there will be a series of responses. The final response object will include statistics and additional data from the request.

Parameters:
    chatRequest - Chat request.
Returns:
    Chat response.
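A hedged usage sketch. The builder entry points (`ChatRequest.builder(model)`, `Message.builder(role)`) follow the nested records listed above; the model name `llama3` and a locally running server with that model pulled are assumptions:

```java
import java.util.List;
import org.springframework.ai.ollama.api.OllamaApi;
import org.springframework.ai.ollama.api.OllamaApi.ChatRequest;
import org.springframework.ai.ollama.api.OllamaApi.ChatResponse;
import org.springframework.ai.ollama.api.OllamaApi.Message;

OllamaApi ollamaApi = OllamaApi.builder().build();

// Non-streaming request: chat(...) blocks and returns one complete
// ChatResponse, including the final statistics described above.
ChatRequest request = ChatRequest.builder("llama3") // model name is an assumption
        .stream(false)
        .messages(List.of(
                Message.builder(Message.Role.USER)
                        .content("Why is the sky blue?")
                        .build()))
        .build();

ChatResponse response = ollamaApi.chat(request);
System.out.println(response.message().content());
```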
streamingChat

public reactor.core.publisher.Flux<OllamaApi.ChatResponse> streamingChat(OllamaApi.ChatRequest chatRequest)

Streaming response for the chat completion request.

Parameters:
    chatRequest - Chat request. The request must set the stream property to true.
Returns:
    Chat response as a Flux stream.
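A sketch of the streaming variant, under the same assumptions as above (builder entry points per the nested records, `llama3` pulled locally). Note `stream(true)`, which this method requires:

```java
import java.util.List;
import org.springframework.ai.ollama.api.OllamaApi;
import org.springframework.ai.ollama.api.OllamaApi.ChatRequest;
import org.springframework.ai.ollama.api.OllamaApi.ChatResponse;
import org.springframework.ai.ollama.api.OllamaApi.Message;
import reactor.core.publisher.Flux;

OllamaApi ollamaApi = OllamaApi.builder().build();

// stream(true) is required for streamingChat; each emitted ChatResponse
// carries a partial message, and the final element holds the statistics.
ChatRequest request = ChatRequest.builder("llama3") // model name is an assumption
        .stream(true)
        .messages(List.of(
                Message.builder(Message.Role.USER)
                        .content("Tell me a short story.")
                        .build()))
        .build();

Flux<ChatResponse> stream = ollamaApi.streamingChat(request);
stream.subscribe(part -> System.out.print(part.message().content()));
```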
embed

public OllamaApi.EmbeddingsResponse embed(OllamaApi.EmbeddingsRequest embeddingsRequest)

Generate embeddings from a model.

Parameters:
    embeddingsRequest - Embedding request.
Returns:
    Embeddings response.
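A hedged sketch. The two-argument `(model, input)` convenience constructor and the `embeddings()` accessor are assumptions; `nomic-embed-text` is an example embedding model that would need to be pulled locally:

```java
import java.util.List;
import org.springframework.ai.ollama.api.OllamaApi;
import org.springframework.ai.ollama.api.OllamaApi.EmbeddingsRequest;
import org.springframework.ai.ollama.api.OllamaApi.EmbeddingsResponse;

OllamaApi ollamaApi = OllamaApi.builder().build();

// Request an embedding for a single input string.
EmbeddingsResponse response = ollamaApi.embed(
        new EmbeddingsRequest("nomic-embed-text", "Hello, Ollama!"));

// One float[] vector per input string.
List<float[]> vectors = response.embeddings();
System.out.println(vectors.get(0).length);
```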
listModels

public OllamaApi.ListModelResponse listModels()

List models that are available locally on the machine where Ollama is running.
showModel

public OllamaApi.ShowModelResponse showModel(OllamaApi.ShowModelRequest showModelRequest)

Show information about a model available locally on the machine where Ollama is running.
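A sketch combining the two local-model queries. The record component names (`models()`, `name()`) and the single-name `ShowModelRequest` constructor mirror the Ollama `/api/tags` and `/api/show` payloads and are assumptions:

```java
import org.springframework.ai.ollama.api.OllamaApi;

OllamaApi ollamaApi = OllamaApi.builder().build();

// Enumerate locally pulled models, then show details for one of them.
ollamaApi.listModels().models()
        .forEach(model -> System.out.println(model.name()));

var details = ollamaApi.showModel(
        new OllamaApi.ShowModelRequest("llama3")); // model name is an assumption
System.out.println(details);
```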
copyModel

public org.springframework.http.ResponseEntity<Void> copyModel(OllamaApi.CopyModelRequest copyModelRequest)

Copy a model. Creates a model with another name from an existing model.
deleteModel

public org.springframework.http.ResponseEntity<Void> deleteModel(OllamaApi.DeleteModelRequest deleteModelRequest)

Delete a model and its data.
pullModel

public reactor.core.publisher.Flux<OllamaApi.ProgressResponse> pullModel(OllamaApi.PullModelRequest pullModelRequest)

Download a model from the Ollama library. Cancelled pulls are resumed from where they left off, and multiple calls will share the same download progress.
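A hedged sketch of pulling a model with progress logging. The single-name `PullModelRequest` constructor and the `status()` accessor on `ProgressResponse` are assumptions based on the Ollama `/api/pull` payload:

```java
import org.springframework.ai.ollama.api.OllamaApi;

OllamaApi ollamaApi = OllamaApi.builder().build();

// Pull a model, printing each ProgressResponse status line; blockLast()
// waits until the download (and verification) completes. As noted above,
// an interrupted pull resumes from where it left off on the next call.
ollamaApi.pullModel(new OllamaApi.PullModelRequest("llama3"))
        .doOnNext(progress -> System.out.println(progress.status()))
        .blockLast();
```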