OCI GenAI Cohere Chat

OCI GenAI Service offers generative AI chat with on-demand models, or dedicated AI clusters.

The OCI Chat Models Page and OCI Generative AI Playground provide detailed information about using and hosting chat models on OCI.

Prerequisites

You will need an active Oracle Cloud Infrastructure (OCI) account to use the OCI GenAI Cohere Chat client. The client offers four different ways to connect, including simple authentication with a user and private key, workload identity, instance principal, or OCI configuration file authentication.
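
The connection method is chosen through configuration. As a minimal sketch (the property names are detailed in the Connection Properties section below), instance principal authentication on an OCI compute instance needs nothing more than:

spring.ai.oci.genai.authenticationType=instance-principal
spring.ai.oci.genai.region=us-chicago-1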

Add Repositories and BOM

Spring AI artifacts are published in Maven Central and Spring Snapshot repositories. Refer to the Artifact Repositories section to add these repositories to your build system.

To help with dependency management, Spring AI provides a BOM (bill of materials) to ensure that a consistent version of Spring AI is used throughout the entire project. Refer to the Dependency Management section to add the Spring AI BOM to your build system.

Auto-configuration

There has been a significant change in the artifact names of the Spring AI auto-configuration and starter modules. Please refer to the upgrade notes for more information.

Spring AI provides Spring Boot auto-configuration for the OCI GenAI Cohere Chat Client. To enable it, add the following dependency to your project’s Maven pom.xml file:

<dependency>
    <groupId>org.springframework.ai</groupId>
    <artifactId>spring-ai-starter-model-oci-genai</artifactId>
</dependency>

or to your Gradle build.gradle build file.

dependencies {
    implementation 'org.springframework.ai:spring-ai-starter-model-oci-genai'
}

Refer to the Dependency Management section to add the Spring AI BOM to your build file.
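
As an illustration, the BOM is typically imported in the Maven dependencyManagement section; this is a minimal sketch that assumes the spring-ai-bom artifact and a spring-ai.version property defined in your build:

<dependencyManagement>
    <dependencies>
        <dependency>
            <groupId>org.springframework.ai</groupId>
            <artifactId>spring-ai-bom</artifactId>
            <version>${spring-ai.version}</version>
            <type>pom</type>
            <scope>import</scope>
        </dependency>
    </dependencies>
</dependencyManagement>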

Chat Properties

Connection Properties

The prefix spring.ai.oci.genai is the property prefix to configure the connection to OCI GenAI.

Property | Description | Default
spring.ai.oci.genai.authenticationType | The type of authentication to use when authenticating to OCI. May be file, instance-principal, workload-identity, or simple. | file
spring.ai.oci.genai.region | OCI service region. | us-chicago-1
spring.ai.oci.genai.tenantId | OCI tenant OCID, used when authenticating with simple auth. | -
spring.ai.oci.genai.userId | OCI user OCID, used when authenticating with simple auth. | -
spring.ai.oci.genai.fingerprint | Private key fingerprint, used when authenticating with simple auth. | -
spring.ai.oci.genai.privateKey | Private key content, used when authenticating with simple auth. | -
spring.ai.oci.genai.passPhrase | Optional private key passphrase, used when authenticating with simple auth and a passphrase-protected private key. | -
spring.ai.oci.genai.file | Path to the OCI config file. Used when authenticating with file auth. | <user’s home directory>/.oci/config
spring.ai.oci.genai.profile | OCI profile name. Used when authenticating with file auth. | DEFAULT
spring.ai.oci.genai.endpoint | Optional OCI GenAI endpoint. | -
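
For example, a connection using simple authentication could be configured as follows (a minimal sketch; the OCIDs, fingerprint, and key references are placeholders to be replaced with your own values):

spring.ai.oci.genai.authenticationType=simple
spring.ai.oci.genai.region=us-chicago-1
spring.ai.oci.genai.tenantId=my-tenant-ocid
spring.ai.oci.genai.userId=my-user-ocid
spring.ai.oci.genai.fingerprint=my-key-fingerprint
spring.ai.oci.genai.privateKey=${OCI_PRIVATE_KEY}
# only needed when the private key is passphrase protected
spring.ai.oci.genai.passPhrase=${OCI_KEY_PASSPHRASE}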

Configuration Properties

Enabling and disabling of the chat auto-configuration is now controlled via top-level properties with the prefix spring.ai.model.chat. To enable it, set spring.ai.model.chat=oci-genai (it is enabled by default). To disable it, set spring.ai.model.chat=none (or any value that does not match oci-genai). This change was made to allow the configuration of multiple models.
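
For example, either of the following lines controls the auto-configuration:

# disable all chat auto-configuration
spring.ai.model.chat=none
# or enable the OCI GenAI chat model explicitly (the default)
spring.ai.model.chat=oci-genai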

The prefix spring.ai.oci.genai.chat.cohere is the property prefix that configures the ChatModel implementation for OCI GenAI Cohere Chat.

Property | Description | Default
spring.ai.model.chat | Enable the OCI GenAI Cohere chat model. | oci-genai
spring.ai.oci.genai.chat.cohere.enabled (no longer valid) | Enable the OCI GenAI Cohere chat model. | true
spring.ai.oci.genai.chat.cohere.options.model | Model OCID or endpoint. | -
spring.ai.oci.genai.chat.cohere.options.compartment | Model compartment OCID. | -
spring.ai.oci.genai.chat.cohere.options.servingMode | The model serving mode to be used. May be on-demand or dedicated. | on-demand
spring.ai.oci.genai.chat.cohere.options.preambleOverride | Override the chat model’s prompt preamble. | -
spring.ai.oci.genai.chat.cohere.options.temperature | Inference temperature. | -
spring.ai.oci.genai.chat.cohere.options.topP | Top P parameter. | -
spring.ai.oci.genai.chat.cohere.options.topK | Top K parameter. | -
spring.ai.oci.genai.chat.cohere.options.frequencyPenalty | Higher values will reduce repeated tokens and outputs will be more random. | -
spring.ai.oci.genai.chat.cohere.options.presencePenalty | Higher values encourage generating outputs with tokens that haven’t been used. | -
spring.ai.oci.genai.chat.cohere.options.stop | List of textual sequences that will end completions generation. | -
spring.ai.oci.genai.chat.cohere.options.documents | List of documents used in chat context. | -
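
For example, the default model and a couple of sampling parameters can be set declaratively (a minimal sketch with placeholder OCIDs and illustrative values):

spring.ai.oci.genai.chat.cohere.options.model=my-model-ocid
spring.ai.oci.genai.chat.cohere.options.compartment=my-compartment-ocid
spring.ai.oci.genai.chat.cohere.options.servingMode=on-demand
spring.ai.oci.genai.chat.cohere.options.temperature=0.5
spring.ai.oci.genai.chat.cohere.options.topP=0.9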

All properties prefixed with spring.ai.oci.genai.chat.cohere.options can be overridden at runtime by adding request-specific Runtime Options to the Prompt call.

Runtime Options

The OCICohereChatOptions.java provides model configurations, such as the model to use, the temperature, the frequency penalty, etc.

On start-up, the default options can be configured with the OCICohereChatModel(api, options) constructor or the spring.ai.oci.genai.chat.cohere.options.* properties.

At run-time you can override the default options by adding new, request-specific options to the Prompt call. For example, to override the default model and temperature for a specific request:

ChatResponse response = chatModel.call(
    new Prompt(
        "Generate the names of 5 famous pirates.",
        OCICohereChatOptions.builder()
            .model("my-model-ocid")
            .compartment("my-compartment-ocid")
            .temperature(0.5)
        .build()
    ));

Sample Controller

Create a new Spring Boot project and add the spring-ai-starter-model-oci-genai to your pom (or gradle) dependencies.

Add an application.properties file under the src/main/resources directory to enable and configure the OCI GenAI Cohere chat model:

spring.ai.oci.genai.authenticationType=file
spring.ai.oci.genai.file=/path/to/oci/config/file
spring.ai.oci.genai.chat.cohere.options.compartment=my-compartment-ocid
spring.ai.oci.genai.chat.cohere.options.servingMode=on-demand
spring.ai.oci.genai.chat.cohere.options.model=my-chat-model-ocid

Replace the file, compartment, and model values with those from your OCI account.

This will create an OCICohereChatModel implementation that you can inject into your classes. Here is an example of a simple @RestController class that uses the chat model for text generation.

@RestController
public class ChatController {

    private final OCICohereChatModel chatModel;

    @Autowired
    public ChatController(OCICohereChatModel chatModel) {
        this.chatModel = chatModel;
    }

    @GetMapping("/ai/generate")
    public Map generate(@RequestParam(value = "message", defaultValue = "Tell me a joke") String message) {
        return Map.of("generation", chatModel.call(message));
    }

    @GetMapping("/ai/generateStream")
    public Flux<ChatResponse> generateStream(@RequestParam(value = "message", defaultValue = "Tell me a joke") String message) {
        var prompt = new Prompt(new UserMessage(message));
        return chatModel.stream(prompt);
    }
}
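
Once the application is running (on port 8080 by default), the synchronous endpoint can be tried at http://localhost:8080/ai/generate, which falls back to the default "Tell me a joke" message when no message parameter is supplied.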

Manual Configuration

The OCICohereChatModel implements the ChatModel and uses the OCI Java SDK to connect to the OCI GenAI service.

Add the spring-ai-oci-genai dependency to your project’s Maven pom.xml file:

<dependency>
    <groupId>org.springframework.ai</groupId>
    <artifactId>spring-ai-oci-genai</artifactId>
</dependency>

or to your Gradle build.gradle build file.

dependencies {
    implementation 'org.springframework.ai:spring-ai-oci-genai'
}

Refer to the Dependency Management section to add the Spring AI BOM to your build file.

Next, create an OCICohereChatModel and use it for text generation:

var CONFIG_FILE = Paths.get(System.getProperty("user.home"), ".oci", "config").toString();
var COMPARTMENT_ID = System.getenv("OCI_COMPARTMENT_ID");
var MODEL_ID = System.getenv("OCI_CHAT_MODEL_ID");

ConfigFileAuthenticationDetailsProvider authProvider = new ConfigFileAuthenticationDetailsProvider(
        CONFIG_FILE,
        "DEFAULT"
);
var genAi = GenerativeAiInferenceClient.builder()
        .region(Region.valueOf("us-chicago-1"))
        .build(authProvider);

var chatModel = new OCICohereChatModel(genAi, OCICohereChatOptions.builder()
        .model(MODEL_ID)
        .compartment(COMPARTMENT_ID)
        .servingMode("on-demand")
        .build());

ChatResponse response = chatModel.call(
        new Prompt("Generate the names of 5 famous pirates."));
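
Streaming uses the same ChatModel instance; this is a minimal sketch that reuses the chatModel created above and simply collects all streamed chunks:

// stream the chat response and block until every chunk has been received
Flux<ChatResponse> stream = chatModel.stream(
        new Prompt("Generate the names of 5 famous pirates."));
List<ChatResponse> chunks = stream.collectList().block();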

The OCICohereChatOptions provides the configuration information for the chat requests. The OCICohereChatOptions.Builder is a fluent options builder.
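
For example, a richer set of options can be assembled with the builder; this is a minimal sketch that assumes builder methods named after the option properties listed above, with illustrative values:

OCICohereChatOptions options = OCICohereChatOptions.builder()
        .model("my-model-ocid")
        .compartment("my-compartment-ocid")
        .servingMode("on-demand")
        .temperature(0.5)
        .topP(0.9)              // nucleus sampling cut-off (method name assumed)
        .frequencyPenalty(0.2)  // discourage repeated tokens (method name assumed)
        .build();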