Mistral AI Embeddings
Spring AI supports Mistral AI’s text embedding models. Embeddings are vector representations of text that capture the semantic meaning of paragraphs through their position in a high-dimensional vector space. The Mistral AI Embeddings API offers cutting-edge, state-of-the-art embeddings for text, which can be used for many NLP tasks.
Prerequisites
You will need to create an API key with Mistral AI to access Mistral AI embedding models.
Create an account at the MistralAI registration page and generate the token on the API Keys page.
The Spring AI project defines a configuration property named spring.ai.mistralai.api-key
that you should set to the value of the API Key
obtained from console.mistral.ai.
Exporting an environment variable is one way to set that configuration property:
export SPRING_AI_MISTRALAI_API_KEY=<INSERT KEY HERE>
Add Repositories and BOM
Spring AI artifacts are published in Spring Milestone and Snapshot repositories. Refer to the Repositories section to add these repositories to your build system.
To help with dependency management, Spring AI provides a BOM (bill of materials) to ensure that a consistent version of Spring AI is used throughout the entire project. Refer to the Dependency Management section to add the Spring AI BOM to your build system.
Auto-configuration
Spring AI provides Spring Boot auto-configuration for the MistralAI Embedding Model.
To enable it, add the following dependency to your project’s Maven pom.xml file:
<dependency>
    <groupId>org.springframework.ai</groupId>
    <artifactId>spring-ai-mistral-ai-spring-boot-starter</artifactId>
</dependency>
or to your Gradle build.gradle
build file.
dependencies {
    implementation 'org.springframework.ai:spring-ai-mistral-ai-spring-boot-starter'
}
Refer to the Dependency Management section to add the Spring AI BOM to your build file.
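As a rough sketch, the Maven BOM import looks like the following; spring-ai-bom is the published BOM artifact, but the version shown here is a placeholder, so check the Dependency Management section for the coordinates that match your release:
<dependencyManagement>
    <dependencies>
        <dependency>
            <groupId>org.springframework.ai</groupId>
            <artifactId>spring-ai-bom</artifactId>
            <!-- placeholder version; use the release you are targeting -->
            <version>1.0.0-SNAPSHOT</version>
            <type>pom</type>
            <scope>import</scope>
        </dependency>
    </dependencies>
</dependencyManagement>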
Embedding Properties
Retry Properties
The prefix spring.ai.retry
is used as the property prefix that lets you configure the retry mechanism for the Mistral AI Embedding model.
Property | Description | Default |
---|---|---|
spring.ai.retry.max-attempts | Maximum number of retry attempts. | 10 |
spring.ai.retry.backoff.initial-interval | Initial sleep duration for the exponential backoff policy. | 2 sec. |
spring.ai.retry.backoff.multiplier | Backoff interval multiplier. | 5 |
spring.ai.retry.backoff.max-interval | Maximum backoff duration. | 3 min. |
spring.ai.retry.on-client-errors | If false, throw a NonTransientAiException and do not attempt retry for 4xx client error codes. | false |
spring.ai.retry.exclude-on-http-codes | List of HTTP status codes that should not trigger a retry (e.g. to throw NonTransientAiException). | empty |
spring.ai.retry.on-http-codes | List of HTTP status codes that should trigger a retry (e.g. to throw TransientAiException). | empty |
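For example, to tighten the retry behavior you might set the following in application.properties (the values are illustrative, and the duration format assumes Spring Boot's standard Duration binding):
# illustrative retry settings; adjust to your needs
spring.ai.retry.max-attempts=5
spring.ai.retry.backoff.initial-interval=1s
spring.ai.retry.backoff.multiplier=2
spring.ai.retry.backoff.max-interval=60s
spring.ai.retry.on-client-errors=false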
Connection Properties
The prefix spring.ai.mistralai
is used as the property prefix that lets you connect to MistralAI.
Property | Description | Default |
---|---|---|
spring.ai.mistralai.base-url | The URL to connect to | api.mistral.ai |
spring.ai.mistralai.api-key | The API Key | - |
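In application.properties this typically looks like the following (the MISTRAL_AI_API_KEY environment variable is an assumption; use whatever variable or secret store holds your key):
# base-url can normally be left at its default; shown here only for completeness
spring.ai.mistralai.base-url=https://api.mistral.ai
# resolve the key from an environment variable instead of hard-coding it
spring.ai.mistralai.api-key=${MISTRAL_AI_API_KEY}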
Configuration Properties
The prefix spring.ai.mistralai.embedding is the property prefix that configures the EmbeddingModel implementation for Mistral AI.
Property | Description | Default |
---|---|---|
spring.ai.mistralai.embedding.enabled | Enable the Mistral AI embedding model. | true |
spring.ai.mistralai.embedding.base-url | Optionally overrides spring.ai.mistralai.base-url to provide an embedding-specific URL. | - |
spring.ai.mistralai.embedding.api-key | Optionally overrides spring.ai.mistralai.api-key to provide an embedding-specific API key. | - |
spring.ai.mistralai.embedding.metadata-mode | Document content extraction mode. | EMBED |
spring.ai.mistralai.embedding.options.model | The model to use. | mistral-embed |
spring.ai.mistralai.embedding.options.encodingFormat | The format to return the embeddings in. Can be either float or base64. | - |
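For example, to pin the embedding model and request float vectors, you might add the following to application.properties (illustrative values):
spring.ai.mistralai.embedding.options.model=mistral-embed
# encodingFormat: float or base64
spring.ai.mistralai.embedding.options.encodingFormat=float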
You can override the common spring.ai.mistralai.base-url and spring.ai.mistralai.api-key for the ChatModel and EmbeddingModel implementations.
The spring.ai.mistralai.embedding.base-url and spring.ai.mistralai.embedding.api-key properties, if set, take precedence over the common properties.
Similarly, the spring.ai.mistralai.chat.base-url and spring.ai.mistralai.chat.api-key properties, if set, take precedence over the common properties.
This is useful if you want to use different MistralAI accounts for different models and different model endpoints.
All properties prefixed with spring.ai.mistralai.embedding.options can be overridden at runtime by adding request-specific Runtime Options to the EmbeddingRequest call.
Runtime Options
The MistralAiEmbeddingOptions.java class provides the Mistral AI configuration, such as the model to use.
The default options can be configured using the spring.ai.mistralai.embedding.options
properties as well.
At start-time, use the MistralAiEmbeddingModel constructor to set the default options used for all embedding requests.
At run-time, you can override the default options by passing a MistralAiEmbeddingOptions instance as part of your EmbeddingRequest.
For example, to override the default model name for a specific request:
EmbeddingResponse embeddingResponse = embeddingModel.call(
    new EmbeddingRequest(List.of("Hello World", "World is big and salvation is near"),
        MistralAiEmbeddingOptions.builder()
            .withModel("Different-Embedding-Model-Deployment-Name")
            .build()));
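To read the resulting vectors back, you can walk the response. The accessors below (getResults(), getOutput()) and the float[] return type reflect recent Spring AI versions and should be verified against the API of the release you are using:
// one Embedding result per input string, in request order (assumed accessors)
var firstResult = embeddingResponse.getResults().get(0);
float[] vector = firstResult.getOutput();
System.out.println("Dimensions: " + vector.length);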
Sample Controller
This will create an EmbeddingModel implementation that you can inject into your classes.
Here is an example of a simple @Controller class that uses the EmbeddingModel implementation, together with the application.properties settings that enable and configure it.
spring.ai.mistralai.api-key=YOUR_API_KEY
spring.ai.mistralai.embedding.options.model=mistral-embed
@RestController
public class EmbeddingController {

    private final EmbeddingModel embeddingModel;

    @Autowired
    public EmbeddingController(EmbeddingModel embeddingModel) {
        this.embeddingModel = embeddingModel;
    }

    @GetMapping("/ai/embedding")
    public Map embed(@RequestParam(value = "message", defaultValue = "Tell me a joke") String message) {
        var embeddingResponse = this.embeddingModel.embedForResponse(List.of(message));
        return Map.of("embedding", embeddingResponse);
    }
}
Manual Configuration
If you are not using Spring Boot, you can manually configure the Mistral AI Embedding Model.
For this, add the spring-ai-mistral-ai dependency to your project’s Maven pom.xml file:
<dependency>
    <groupId>org.springframework.ai</groupId>
    <artifactId>spring-ai-mistral-ai</artifactId>
</dependency>
or to your Gradle build.gradle
build file.
dependencies {
    implementation 'org.springframework.ai:spring-ai-mistral-ai'
}
Refer to the Dependency Management section to add the Spring AI BOM to your build file.
The spring-ai-mistral-ai dependency also provides access to the MistralAiChatModel.
For more information about the MistralAiChatModel, refer to the MistralAI Chat Client section.
Next, create a MistralAiEmbeddingModel
instance and use it to compute the similarity between two input texts:
var mistralAiApi = new MistralAiApi(System.getenv("MISTRAL_AI_API_KEY"));

var embeddingModel = new MistralAiEmbeddingModel(mistralAiApi,
        MistralAiEmbeddingOptions.builder()
            .withModel("mistral-embed")
            .withEncodingFormat("float")
            .build());

EmbeddingResponse embeddingResponse = embeddingModel
    .embedForResponse(List.of("Hello World", "World is big and salvation is near"));
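The call above only returns the embeddings; the similarity computation itself is up to you. A minimal cosine-similarity sketch, again assuming getResults()/getOutput() expose the raw vectors as float[] (verify against the EmbeddingResponse API of your Spring AI version):
// embeddings for the two input texts, in request order (assumed accessors)
float[] a = embeddingResponse.getResults().get(0).getOutput();
float[] b = embeddingResponse.getResults().get(1).getOutput();

// cosine similarity = dot(a, b) / (|a| * |b|)
double dot = 0, normA = 0, normB = 0;
for (int i = 0; i < a.length; i++) {
    dot += a[i] * b[i];
    normA += a[i] * a[i];
    normB += b[i] * b[i];
}
double similarity = dot / (Math.sqrt(normA) * Math.sqrt(normB));
System.out.println("Cosine similarity: " + similarity);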
The MistralAiEmbeddingOptions
provides the configuration information for the embedding requests.
The options class offers a builder()
for easy options creation.