Anthropic 3 Chat
Anthropic Claude is a family of foundational AI models that can be used in a variety of applications. For developers and businesses, you can leverage the API access and build directly on top of Anthropic’s AI infrastructure.
Spring AI supports the Anthropic Messaging API for sync and streaming text generations.
Anthropic’s Claude models are also available through Amazon Bedrock Converse. Spring AI provides dedicated Amazon Bedrock Converse Anthropic client implementations as well.
Prerequisites
You will need to create an API key on the Anthropic portal.
Create an account at the Anthropic API dashboard and generate the API key on the Get API Keys page.
The Spring AI project defines a configuration property named `spring.ai.anthropic.api-key` that you should set to the value of the API key obtained from anthropic.com.
You can set this configuration property in your `application.properties` file:
spring.ai.anthropic.api-key=<your-anthropic-api-key>
For enhanced security when handling sensitive information like API keys, you can use Spring Expression Language (SpEL) to reference a custom environment variable:
# In application.yml
spring:
  ai:
    anthropic:
      api-key: ${ANTHROPIC_API_KEY}
# In your environment or .env file
export ANTHROPIC_API_KEY=<your-anthropic-api-key>
You can also set this configuration programmatically in your application code:
// Retrieve API key from a secure source or environment variable
String apiKey = System.getenv("ANTHROPIC_API_KEY");
Add Repositories and BOM
Spring AI artifacts are published in Maven Central and Spring Snapshot repositories. Refer to the Artifact Repositories section to add these repositories to your build system.
To help with dependency management, Spring AI provides a BOM (bill of materials) to ensure that a consistent version of Spring AI is used throughout the entire project. Refer to the Dependency Management section to add the Spring AI BOM to your build system.
Auto-configuration
There has been a significant change in the Spring AI auto-configuration and starter modules' artifact names. Please refer to the upgrade notes for more information.
Spring AI provides Spring Boot auto-configuration for the Anthropic Chat Client. To enable it, add the following dependency to your project’s Maven `pom.xml` or Gradle `build.gradle` file:
Maven:
<dependency>
    <groupId>org.springframework.ai</groupId>
    <artifactId>spring-ai-starter-model-anthropic</artifactId>
</dependency>
Gradle:

dependencies {
    implementation 'org.springframework.ai:spring-ai-starter-model-anthropic'
}
Refer to the Dependency Management section to add the Spring AI BOM to your build file.
Chat Properties
Retry Properties
The prefix `spring.ai.retry` is used as the property prefix that lets you configure the retry mechanism for the Anthropic chat model.
Property | Description | Default |
---|---|---|
spring.ai.retry.max-attempts | Maximum number of retry attempts. | 10 |
spring.ai.retry.backoff.initial-interval | Initial sleep duration for the exponential backoff policy. | 2 sec. |
spring.ai.retry.backoff.multiplier | Backoff interval multiplier. | 5 |
spring.ai.retry.backoff.max-interval | Maximum backoff duration. | 3 min. |
spring.ai.retry.on-client-errors | If false, throw a NonTransientAiException and do not attempt retry for `4xx` client error codes. | false |
spring.ai.retry.exclude-on-http-codes | List of HTTP status codes that should NOT trigger a retry (e.g. to throw NonTransientAiException). | empty |
spring.ai.retry.on-http-codes | List of HTTP status codes that should trigger a retry (e.g. to throw TransientAiException). | empty |
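As a concrete illustration, a service that prefers to fail fast could tighten these defaults in `application.properties`. This is a hedged sketch; the values are illustrative, not recommendations:

```properties
# Fewer attempts and shorter waits than the defaults (10 attempts, up to 3 min backoff)
spring.ai.retry.max-attempts=3
spring.ai.retry.backoff.initial-interval=1s
spring.ai.retry.backoff.multiplier=2
spring.ai.retry.backoff.max-interval=30s
# Only retry on rate-limit (429) and Anthropic overloaded (529) responses
spring.ai.retry.on-http-codes=429,529
```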
Currently the retry policies are not applicable for the streaming API.
Connection Properties
The prefix `spring.ai.anthropic` is used as the property prefix that lets you connect to Anthropic.
Property | Description | Default |
---|---|---|
spring.ai.anthropic.base-url | The URL to connect to. | https://api.anthropic.com |
spring.ai.anthropic.completions-path | The path to append to the base URL. | |
spring.ai.anthropic.version | Anthropic API version. | 2023-06-01 |
spring.ai.anthropic.api-key | The API key. | - |
spring.ai.anthropic.beta-version | Enables new/experimental features. | |
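For instance, pointing the client at a compatible internal gateway only requires overriding the base URL. A sketch, where `llm-gateway.internal.example.com` is a hypothetical host and not part of any real deployment:

```properties
spring.ai.anthropic.api-key=${ANTHROPIC_API_KEY}
# Hypothetical gateway that forwards requests to api.anthropic.com
spring.ai.anthropic.base-url=https://llm-gateway.internal.example.com
spring.ai.anthropic.version=2023-06-01
```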
Configuration Properties
Enabling and disabling of the chat auto-configurations are now configured via top-level properties with the prefix `spring.ai.model.chat`. To enable: `spring.ai.model.chat=anthropic` (it is enabled by default). To disable: `spring.ai.model.chat=none` (or any value which doesn’t match `anthropic`). This change was made to allow configuration of multiple models.
The prefix `spring.ai.anthropic.chat` is the property prefix that lets you configure the chat model implementation for Anthropic.
Property | Description | Default |
---|---|---|
spring.ai.anthropic.chat.enabled (Removed and no longer valid) | Enable the Anthropic chat model. | true |
spring.ai.model.chat | Enable the Anthropic chat model. | anthropic |
spring.ai.anthropic.chat.options.model | The Anthropic Chat model to use. | |
spring.ai.anthropic.chat.options.temperature | The sampling temperature to use that controls the apparent creativity of generated completions. Higher values will make output more random while lower values will make results more focused and deterministic. It is not recommended to modify temperature and top_p for the same completions request, as the interaction of these two settings is difficult to predict. | 0.8 |
spring.ai.anthropic.chat.options.max-tokens | The maximum number of tokens to generate in the chat completion. The total length of input tokens and generated tokens is limited by the model’s context length. | 500 |
spring.ai.anthropic.chat.options.stop-sequence | Custom text sequences that will cause the model to stop generating. Models will normally stop when they have naturally completed their turn, which results in a response stop_reason of "end_turn". If you want the model to stop generating when it encounters custom strings of text, use the stop_sequences parameter. If the model encounters one of the custom sequences, the response stop_reason value will be "stop_sequence" and the response stop_sequence value will contain the matched stop sequence. | - |
spring.ai.anthropic.chat.options.top-p | Use nucleus sampling. In nucleus sampling, we compute the cumulative distribution over all the options for each subsequent token in decreasing probability order and cut it off once it reaches a particular probability specified by top_p. You should either alter temperature or top_p, but not both. Recommended for advanced use cases only; you usually only need to use temperature. | - |
spring.ai.anthropic.chat.options.top-k | Only sample from the top K options for each subsequent token. Used to remove "long tail" low-probability responses. Recommended for advanced use cases only; you usually only need to use temperature. | - |
spring.ai.anthropic.chat.options.toolNames | List of tools, identified by their names, to enable for tool calling in a single prompt request. Tools with those names must exist in the toolCallbacks registry. | - |
spring.ai.anthropic.chat.options.toolCallbacks | Tool Callbacks to register with the ChatModel. | - |
spring.ai.anthropic.chat.options.internal-tool-execution-enabled | If false, Spring AI will not handle the tool calls internally but will proxy them to the client; it is then the client’s responsibility to handle the tool calls, dispatch them to the appropriate function, and return the results. If true (the default), Spring AI will handle the tool calls internally. Applicable only for chat models with tool calling support. | true |
(deprecated - replaced by toolNames) | List of functions, identified by their names, to enable for function calling in a single prompt request. Functions with those names must exist in the functionCallbacks registry. | - |
(deprecated - replaced by toolCallbacks) | Tool Function Callbacks to register with the ChatModel. | - |
(deprecated - replaced by a negated internal-tool-execution-enabled) | If true, Spring AI will not handle the function calls internally but will proxy them to the client; it is then the client’s responsibility to handle the function calls, dispatch them to the appropriate function, and return the results. If false (the default), Spring AI will handle the function calls internally. Applicable only for chat models with function calling support. | false |
spring.ai.anthropic.chat.options.http-headers | Optional HTTP headers to be added to the chat completion request. | - |
All properties prefixed with `spring.ai.anthropic.chat.options` can be overridden at runtime by adding request-specific Runtime Options to the `Prompt` call.
Runtime Options
The AnthropicChatOptions.java provides model configurations, such as the model to use, the temperature, the max token count, etc.
On start-up, the default options can be configured with the `AnthropicChatModel(api, options)` constructor or the `spring.ai.anthropic.chat.options.*` properties.
At run-time you can override the default options by adding new, request-specific options to the `Prompt` call. For example, to override the default model and temperature for a specific request:
ChatResponse response = chatModel.call(
    new Prompt(
        "Generate the names of 5 famous pirates.",
        AnthropicChatOptions.builder()
            .model("claude-3-7-sonnet-latest")
            .temperature(0.4)
            .build()
    ));
In addition to the model-specific AnthropicChatOptions you can use a portable ChatOptions instance, created with ChatOptionsBuilder#builder().
Tool/Function Calling
You can register custom Java Tools with the `AnthropicChatModel` and have the Anthropic Claude model intelligently choose to output a JSON object containing arguments to call one or many of the registered functions. This is a powerful technique to connect the LLM capabilities with external tools and APIs. Read more about Tool Calling.
Multimodal
Multimodality refers to a model’s ability to simultaneously understand and process information from various sources, including text, PDF, images, and other data formats.
Images
Currently, Anthropic Claude 3 supports the `base64` source type for `images`, and the `image/jpeg`, `image/png`, `image/gif`, and `image/webp` media types. Check the Vision guide for more information. Anthropic Claude 3.5 Sonnet also supports the `pdf` source type for `application/pdf` files.
Spring AI’s `Message` interface supports multimodal AI models by introducing the Media type. This type contains data and information about media attachments in messages, using Spring’s `org.springframework.util.MimeType` and a `java.lang.Object` for the raw media data.
Below is a simple code example extracted from AnthropicChatModelIT.java, demonstrating the combination of user text with an image.
var imageData = new ClassPathResource("/multimodal.test.png");

var userMessage = new UserMessage("Explain what do you see on this picture?",
        List.of(new Media(MimeTypeUtils.IMAGE_PNG, imageData)));

ChatResponse response = chatModel.call(new Prompt(List.of(userMessage)));

logger.info(response.getResult().getOutput().getContent());
It takes as an input the `multimodal.test.png` image:

along with the text message "Explain what do you see on this picture?", and generates a response something like:
The image shows a close-up view of a wire fruit basket containing several pieces of fruit. ...
Starting with Sonnet 3.5, PDF support (beta) is provided. Use the `application/pdf` media type to attach a PDF file to the message:
var pdfData = new ClassPathResource("/spring-ai-reference-overview.pdf");

var userMessage = new UserMessage(
        "You are a very professional document summarization specialist. Please summarize the given document.",
        List.of(new Media(new MimeType("application", "pdf"), pdfData)));

var response = chatModel.call(new Prompt(List.of(userMessage)));
Sample Controller
Create a new Spring Boot project and add the `spring-ai-starter-model-anthropic` to your pom (or gradle) dependencies.
Add an `application.properties` file under the `src/main/resources` directory to enable and configure the Anthropic chat model:
spring.ai.anthropic.api-key=YOUR_API_KEY
spring.ai.anthropic.chat.options.model=claude-3-5-sonnet-latest
spring.ai.anthropic.chat.options.temperature=0.7
spring.ai.anthropic.chat.options.max-tokens=450
Replace `YOUR_API_KEY` with your Anthropic API key.
This will create an AnthropicChatModel implementation that you can inject into your classes. Here is an example of a simple `@Controller` class that uses the chat model for text generation.
@RestController
public class ChatController {

    private final AnthropicChatModel chatModel;

    @Autowired
    public ChatController(AnthropicChatModel chatModel) {
        this.chatModel = chatModel;
    }

    @GetMapping("/ai/generate")
    public Map generate(@RequestParam(value = "message", defaultValue = "Tell me a joke") String message) {
        return Map.of("generation", this.chatModel.call(message));
    }

    @GetMapping("/ai/generateStream")
    public Flux<ChatResponse> generateStream(@RequestParam(value = "message", defaultValue = "Tell me a joke") String message) {
        Prompt prompt = new Prompt(new UserMessage(message));
        return this.chatModel.stream(prompt);
    }
}
Manual Configuration
The AnthropicChatModel implements the `ChatModel` and `StreamingChatModel` and uses the Low-level AnthropicApi Client to connect to the Anthropic service.
Add the `spring-ai-anthropic` dependency to your project’s Maven `pom.xml` file:
<dependency>
    <groupId>org.springframework.ai</groupId>
    <artifactId>spring-ai-anthropic</artifactId>
</dependency>
or to your Gradle `build.gradle` build file:
dependencies {
    implementation 'org.springframework.ai:spring-ai-anthropic'
}
Refer to the Dependency Management section to add the Spring AI BOM to your build file.
Next, create an `AnthropicChatModel` and use it for text generation:
var anthropicApi = new AnthropicApi(System.getenv("ANTHROPIC_API_KEY"));

var chatModel = new AnthropicChatModel(anthropicApi,
        AnthropicChatOptions.builder()
            .model("claude-3-opus-20240229")
            .temperature(0.4)
            .maxTokens(200)
            .build());

ChatResponse response = chatModel.call(
    new Prompt("Generate the names of 5 famous pirates."));

// Or with streaming responses
Flux<ChatResponse> streamResponse = chatModel.stream(
    new Prompt("Generate the names of 5 famous pirates."));
The `AnthropicChatOptions` provides the configuration information for the chat requests. The `AnthropicChatOptions.Builder` is a fluent options builder.
Low-level AnthropicApi Client
The AnthropicApi provides a lightweight Java client for the Anthropic Message API.
The following class diagram illustrates the `AnthropicApi` chat interfaces and building blocks:


Here is a simple snippet showing how to use the API programmatically:
AnthropicApi anthropicApi = new AnthropicApi(System.getenv("ANTHROPIC_API_KEY"));

AnthropicMessage chatCompletionMessage = new AnthropicMessage(
        List.of(new ContentBlock("Tell me a Joke?")), Role.USER);

// Sync request
ResponseEntity<ChatCompletionResponse> response = anthropicApi
    .chatCompletionEntity(new ChatCompletionRequest(AnthropicApi.ChatModel.CLAUDE_3_OPUS.getValue(),
            List.of(chatCompletionMessage), null, 100, 0.8, false));

// Streaming request
Flux<StreamResponse> streamResponse = anthropicApi
    .chatCompletionStream(new ChatCompletionRequest(AnthropicApi.ChatModel.CLAUDE_3_OPUS.getValue(),
            List.of(chatCompletionMessage), null, 100, 0.8, true));
Refer to the AnthropicApi.java JavaDoc for further information.
Low-level API Examples
The AnthropicApiIT.java test provides some general examples of how to use the lightweight library.