Class OllamaApi

java.lang.Object
org.springframework.ai.ollama.api.OllamaApi

public final class OllamaApi extends Object
Java Client for the Ollama API. https://ollama.ai
Since:
0.8.0
Author:
Christian Tzolov, Thomas Vitale, Jonghoon Park
  • Method Details

    • builder

      public static OllamaApi.Builder builder()
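
      A minimal construction sketch. The base URL below is a hypothetical local Ollama endpoint, and the baseUrl(String) customizer on the Builder is assumed; other customizers may also be available depending on the version.

      OllamaApi ollamaApi = OllamaApi.builder()
          .baseUrl("http://localhost:11434") // hypothetical local Ollama endpoint
          .build();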
    • chat

      public OllamaApi.ChatResponse chat(OllamaApi.ChatRequest chatRequest)
      Generate the next message in a chat with a provided model and return the complete result in a single, blocking call. The response object includes statistics and additional data about the request. Use streamingChat(OllamaApi.ChatRequest) for a streamed series of partial responses.
      Parameters:
      chatRequest - Chat request. The request must set the stream property to false.
      Returns:
      Chat response.
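
      A blocking-call sketch, assuming an OllamaApi instance named ollamaApi (see the builder example above) and builder-style ChatRequest and Message construction; the model name, prompt, and exact accessor names are assumptions and may differ by version.

      OllamaApi.ChatRequest request = OllamaApi.ChatRequest.builder("llama3")
          .stream(false) // blocking call: a single, complete response
          .messages(java.util.List.of(OllamaApi.Message.builder(OllamaApi.Message.Role.USER)
              .content("Why is the sky blue?")
              .build()))
          .build();
      OllamaApi.ChatResponse response = ollamaApi.chat(request);
      System.out.println(response.message().content());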
    • streamingChat

      public reactor.core.publisher.Flux<OllamaApi.ChatResponse> streamingChat(OllamaApi.ChatRequest chatRequest)
      Streaming response for the chat completion request.
      Parameters:
      chatRequest - Chat request. The request must set the stream property to true.
      Returns:
      Chat response as a Flux stream.
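
      A streaming sketch under the same assumptions as the chat example above; each emitted ChatResponse is assumed to carry a partial assistant message until the final element arrives with statistics.

      OllamaApi.ChatRequest request = OllamaApi.ChatRequest.builder("llama3")
          .stream(true) // required for the streaming endpoint
          .messages(java.util.List.of(OllamaApi.Message.builder(OllamaApi.Message.Role.USER)
              .content("Tell me a short story.")
              .build()))
          .build();
      reactor.core.publisher.Flux<OllamaApi.ChatResponse> stream = ollamaApi.streamingChat(request);
      stream.subscribe(chunk -> System.out.print(chunk.message().content()));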
    • embed

      public OllamaApi.EmbeddingsResponse embed(OllamaApi.EmbeddingsRequest embeddingsRequest)
      Generate embeddings from a model.
      Parameters:
      embeddingsRequest - Embedding request.
      Returns:
      Embeddings response.
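
      A sketch assuming a shortcut (model, input) constructor on EmbeddingsRequest and an embeddings() accessor on EmbeddingsResponse returning the vectors; the model name and input text are placeholders.

      OllamaApi.EmbeddingsResponse response = ollamaApi.embed(
          new OllamaApi.EmbeddingsRequest("nomic-embed-text", "Hello, world"));
      response.embeddings().forEach(vector -> System.out.println(vector.length));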
    • listModels

      public OllamaApi.ListModelResponse listModels()
      List models that are available locally on the machine where Ollama is running.
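
      A sketch assuming ListModelResponse exposes the locally available models via models() and each model entry exposes name().

      OllamaApi.ListModelResponse localModels = ollamaApi.listModels();
      localModels.models().forEach(model -> System.out.println(model.name()));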
    • showModel

      public OllamaApi.ShowModelResponse showModel(OllamaApi.ShowModelRequest showModelRequest)
      Show information about a model available locally on the machine where Ollama is running.
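
      A sketch assuming ShowModelRequest offers a single-argument constructor taking the model name; the name is a placeholder.

      OllamaApi.ShowModelResponse details = ollamaApi.showModel(
          new OllamaApi.ShowModelRequest("llama3"));
      System.out.println(details); // model metadata such as family, parameters, license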
    • copyModel

      public org.springframework.http.ResponseEntity<Void> copyModel(OllamaApi.CopyModelRequest copyModelRequest)
      Copy a model, creating a new model with a different name from an existing one.
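
      A sketch assuming CopyModelRequest carries a (source, destination) pair; both names are placeholders.

      org.springframework.http.ResponseEntity<Void> copyResult = ollamaApi.copyModel(
          new OllamaApi.CopyModelRequest("llama3", "llama3-backup"));
      System.out.println(copyResult.getStatusCode());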
    • deleteModel

      public org.springframework.http.ResponseEntity<Void> deleteModel(OllamaApi.DeleteModelRequest deleteModelRequest)
      Delete a model and its data.
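
      A sketch assuming DeleteModelRequest takes the name of the model to remove; the name is a placeholder.

      org.springframework.http.ResponseEntity<Void> deleteResult = ollamaApi.deleteModel(
          new OllamaApi.DeleteModelRequest("llama3-backup"));
      System.out.println(deleteResult.getStatusCode());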
    • pullModel

      public reactor.core.publisher.Flux<OllamaApi.ProgressResponse> pullModel(OllamaApi.PullModelRequest pullModelRequest)
      Download a model from the Ollama library. Cancelled pulls are resumed from where they left off, and multiple calls will share the same download progress.
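
      A sketch assuming PullModelRequest takes the model name to download and ProgressResponse exposes a status() accessor; blockLast() simply waits here until the pull completes.

      reactor.core.publisher.Flux<OllamaApi.ProgressResponse> progress = ollamaApi.pullModel(
          new OllamaApi.PullModelRequest("llama3"));
      progress.doOnNext(p -> System.out.println(p.status())).blockLast();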