This version is still in development and is not considered stable yet. For the latest snapshot version, please use Spring AI 1.0.0-SNAPSHOT!

Azure OpenAI Embeddings

Azure OpenAI extends the OpenAI capabilities, offering safe text generation and embedding computation models for various tasks:

  • Similarity embeddings are good at capturing semantic similarity between two or more pieces of text.

  • Text search embeddings help measure whether long documents are relevant to a short query.

  • Code search embeddings are useful for embedding code snippets and embedding natural language search queries.

The Azure OpenAI embeddings rely on cosine similarity to compute similarity between documents and a query.
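
For illustration, here is a minimal sketch (not part of the Spring AI API) of how cosine similarity between two embedding vectors can be computed:

// Minimal sketch: cosine similarity between two embedding vectors.
// Assumes both vectors have the same dimensionality
// (1536 dimensions for text-embedding-ada-002).
public static double cosineSimilarity(float[] a, float[] b) {
    double dot = 0.0;
    double normA = 0.0;
    double normB = 0.0;
    for (int i = 0; i < a.length; i++) {
        dot += a[i] * b[i];
        normA += a[i] * a[i];
        normB += b[i] * b[i];
    }
    return dot / (Math.sqrt(normA) * Math.sqrt(normB));
}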

Prerequisites

The Azure OpenAI client offers three options to connect: using an Azure API key, an OpenAI API key, or Microsoft Entra ID.

Azure API Key & Endpoint

Obtain your Azure OpenAI endpoint and api-key from the Azure OpenAI Service section on the Azure Portal.

Spring AI defines two configuration properties:

  1. spring.ai.azure.openai.api-key: Set this to the value of the API Key obtained from Azure.

  2. spring.ai.azure.openai.endpoint: Set this to the endpoint URL obtained when provisioning your model in Azure.

You can set these configuration properties by exporting environment variables:

export SPRING_AI_AZURE_OPENAI_API_KEY=<INSERT AZURE KEY HERE>
export SPRING_AI_AZURE_OPENAI_ENDPOINT=<INSERT ENDPOINT URL HERE>

OpenAI Key

To authenticate with the OpenAI service (not Azure), provide an OpenAI API key. This will automatically set the endpoint to api.openai.com/v1.

When using this approach, set the spring.ai.azure.openai.embedding.options.deployment-name property to the name of the OpenAI model you wish to use.

export SPRING_AI_AZURE_OPENAI_OPENAI_API_KEY=<INSERT OPENAI KEY HERE>
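
The equivalent application.properties entries would look similar to the following (the model name shown is only an example):

spring.ai.azure.openai.openai-api-key=<INSERT OPENAI KEY HERE>
spring.ai.azure.openai.embedding.options.deployment-name=text-embedding-ada-002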

Microsoft Entra ID

To authenticate using Microsoft Entra ID (formerly Azure Active Directory), create a TokenCredential bean in your configuration. If this bean is available, an OpenAIClient instance will be created using the token credentials.
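
A minimal configuration sketch is shown below; the class name is arbitrary, and DefaultAzureCredentialBuilder from the azure-identity library is used here only as one possible TokenCredential source:

import com.azure.core.credential.TokenCredential;
import com.azure.identity.DefaultAzureCredentialBuilder;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;

@Configuration
public class EntraIdConfiguration {

    // Expose a TokenCredential bean. When this bean is present, the auto-configuration
    // uses it (together with the configured endpoint) to build the OpenAIClient instance.
    @Bean
    public TokenCredential tokenCredential() {
        return new DefaultAzureCredentialBuilder().build();
    }
}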

Add Repositories and BOM

Spring AI artifacts are published in Spring Milestone and Snapshot repositories. Refer to the Repositories section to add these repositories to your build system.

To help with dependency management, Spring AI provides a BOM (bill of materials) to ensure that a consistent version of Spring AI is used throughout the entire project. Refer to the Dependency Management section to add the Spring AI BOM to your build system.
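
For example, with Maven the BOM is typically imported in the dependencyManagement section (the version shown is a placeholder; use the release you are targeting):

<dependencyManagement>
    <dependencies>
        <dependency>
            <groupId>org.springframework.ai</groupId>
            <artifactId>spring-ai-bom</artifactId>
            <version>1.0.0-SNAPSHOT</version>
            <type>pom</type>
            <scope>import</scope>
        </dependency>
    </dependencies>
</dependencyManagement>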

Auto-configuration

Spring AI provides Spring Boot auto-configuration for the Azure OpenAI Embedding Model. To enable it, add the following dependency to your project’s Maven pom.xml file:

<dependency>
    <groupId>org.springframework.ai</groupId>
    <artifactId>spring-ai-azure-openai-spring-boot-starter</artifactId>
</dependency>

or to your Gradle build.gradle build file.

dependencies {
    implementation 'org.springframework.ai:spring-ai-azure-openai-spring-boot-starter'
}
Refer to the Dependency Management section to add the Spring AI BOM to your build file.

Embedding Properties

The prefix spring.ai.azure.openai is the property prefix to configure the connection to Azure OpenAI.

Property | Description | Default
spring.ai.azure.openai.api-key | The key from the Azure AI OpenAI "Keys and Endpoint" section under Resource Management | -
spring.ai.azure.openai.endpoint | The endpoint from the Azure AI OpenAI "Keys and Endpoint" section under Resource Management | -
spring.ai.azure.openai.openai-api-key | (non-Azure) OpenAI API key. Used to authenticate with the OpenAI service instead of Azure OpenAI. This automatically sets the endpoint to api.openai.com/v1. Use either the api-key or the openai-api-key property. With this configuration, spring.ai.azure.openai.embedding.options.deployment-name is treated as an OpenAI model name. | -

The prefix spring.ai.azure.openai.embedding is the property prefix that configures the EmbeddingModel implementation for Azure OpenAI.

Property | Description | Default
spring.ai.azure.openai.embedding.enabled | Enable the Azure OpenAI embedding model. | true
spring.ai.azure.openai.embedding.metadata-mode | Document content extraction mode. | EMBED
spring.ai.azure.openai.embedding.options.deployment-name | The value of the 'Deployment Name' as presented in the Azure AI Portal. | text-embedding-ada-002
spring.ai.azure.openai.embedding.options.user | An identifier for the caller or end user of the operation. May be used for tracking or rate-limiting purposes. | -

All properties prefixed with spring.ai.azure.openai.embedding.options can be overridden at runtime by adding request-specific Runtime Options to the EmbeddingRequest call.

Runtime Options

The AzureOpenAiEmbeddingOptions class provides the configuration for embedding requests and offers a builder to create the options.

At start time, use the AzureOpenAiEmbeddingModel constructor to set the default options used for all embedding requests. At run time, you can override the default options by passing an AzureOpenAiEmbeddingOptions instance with your custom options to the EmbeddingRequest.

For example, to override the default model name for a specific request:

EmbeddingResponse embeddingResponse = embeddingModel.call(
    new EmbeddingRequest(List.of("Hello World", "World is big and salvation is near"),
        AzureOpenAiEmbeddingOptions.builder()
            .withModel("Different-Embedding-Model-Deployment-Name")
            .build()));

Sample Code

Add an application.properties file under the src/main/resources directory to configure the embedding model:

spring.ai.azure.openai.api-key=YOUR_API_KEY
spring.ai.azure.openai.endpoint=YOUR_ENDPOINT
spring.ai.azure.openai.embedding.options.deployment-name=text-embedding-ada-002

This will create an EmbeddingModel implementation that you can inject into your classes. Here is an example of a simple @Controller class that uses the EmbeddingModel implementation:

@RestController
public class EmbeddingController {

    private final EmbeddingModel embeddingModel;

    @Autowired
    public EmbeddingController(EmbeddingModel embeddingModel) {
        this.embeddingModel = embeddingModel;
    }

    @GetMapping("/ai/embedding")
    public Map embed(@RequestParam(value = "message", defaultValue = "Tell me a joke") String message) {
        EmbeddingResponse embeddingResponse = this.embeddingModel.embedForResponse(List.of(message));
        return Map.of("embedding", embeddingResponse);
    }
}
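
With the application running, a GET request to /ai/embedding (optionally with a message request parameter) returns a JSON map containing the EmbeddingResponse under the embedding key.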

Manual Configuration

If you prefer not to use the Spring Boot auto-configuration, you can manually configure the AzureOpenAiEmbeddingModel in your application. For this, add the spring-ai-azure-openai dependency to your project’s Maven pom.xml file:

<dependency>
    <groupId>org.springframework.ai</groupId>
    <artifactId>spring-ai-azure-openai</artifactId>
</dependency>

or to your Gradle build.gradle build file.

dependencies {
    implementation 'org.springframework.ai:spring-ai-azure-openai'
}
Refer to the Dependency Management section to add the Spring AI BOM to your build file.
The spring-ai-azure-openai dependency also provides access to the AzureOpenAiChatModel. For more information about the AzureOpenAiChatModel, refer to the Azure OpenAI Chat section.

Next, create an AzureOpenAiEmbeddingModel instance and use it to compute the similarity between two input texts:

var openAIClient = new OpenAIClientBuilder()
    .credential(new AzureKeyCredential(System.getenv("AZURE_OPENAI_API_KEY")))
    .endpoint(System.getenv("AZURE_OPENAI_ENDPOINT"))
    .buildClient();

var embeddingModel = new AzureOpenAiEmbeddingModel(openAIClient)
    .withDefaultOptions(AzureOpenAiEmbeddingOptions.builder()
        .withModel("text-embedding-ada-002")
        .withUser("user-6")
        .build());

EmbeddingResponse embeddingResponse = embeddingModel
    .embedForResponse(List.of("Hello World", "World is big and salvation is near"));

Note that text-embedding-ada-002 is actually the 'Deployment Name' as presented in the Azure AI Portal.
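
The returned EmbeddingResponse contains one embedding vector per input text; their similarity can then be computed with cosine similarity, as sketched earlier on this page.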