Bedrock Converse¶
Basic Usage¶
Call complete with a prompt¶
If you're opening this Notebook on Colab, you will probably need to install LlamaIndex 🦙.
```python
%pip install llama-index-llms-bedrock-converse
```
```python
!pip install llama-index
```
```python
from llama_index.llms.bedrock_converse import BedrockConverse

profile_name = "Your aws profile name"

resp = BedrockConverse(
    model="anthropic.claude-3-haiku-20240307-v1:0",
    profile_name=profile_name,
).complete("Paul Graham is ")
```
```python
print(resp)
```
Call chat with a list of messages¶
```python
from llama_index.core.llms import ChatMessage
from llama_index.llms.bedrock_converse import BedrockConverse

messages = [
    ChatMessage(
        role="system", content="You are a pirate with a colorful personality"
    ),
    ChatMessage(role="user", content="Tell me a story"),
]

resp = BedrockConverse(
    model="anthropic.claude-3-haiku-20240307-v1:0",
    profile_name=profile_name,
).chat(messages)
```
```python
print(resp)
```
Streaming¶
Using the stream_complete endpoint¶
```python
from llama_index.llms.bedrock_converse import BedrockConverse

llm = BedrockConverse(
    model="anthropic.claude-3-haiku-20240307-v1:0",
    profile_name=profile_name,
)

resp = llm.stream_complete("Paul Graham is ")
```
```python
for r in resp:
    print(r.delta, end="")
```
Using the stream_chat endpoint¶
```python
from llama_index.llms.bedrock_converse import BedrockConverse

llm = BedrockConverse(
    model="anthropic.claude-3-haiku-20240307-v1:0",
    profile_name=profile_name,
)

messages = [
    ChatMessage(
        role="system", content="You are a pirate with a colorful personality"
    ),
    ChatMessage(role="user", content="Tell me a story"),
]

resp = llm.stream_chat(messages)
```
```python
for r in resp:
    print(r.delta, end="")
```
Configure Model¶
```python
from llama_index.llms.bedrock_converse import BedrockConverse

llm = BedrockConverse(
    model="anthropic.claude-3-haiku-20240307-v1:0",
    profile_name=profile_name,
)
```
```python
resp = llm.complete("Paul Graham is ")
```
```python
print(resp)
```
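Beyond the model ID and credentials, sampling parameters can also be set at construction time. A minimal sketch, assuming the standard LlamaIndex LLM keyword arguments `temperature` and `max_tokens` (the exact values shown here are illustrative; check the constructor signature of your installed version):

```python
from llama_index.llms.bedrock_converse import BedrockConverse

# Illustrative tuning values; temperature and max_tokens are standard
# LlamaIndex LLM constructor arguments.
llm = BedrockConverse(
    model="anthropic.claude-3-haiku-20240307-v1:0",
    profile_name=profile_name,
    temperature=0.2,  # lower temperature for more deterministic output
    max_tokens=256,  # cap the length of the generated response
)

resp = llm.complete("Paul Graham is ")
```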
Connect to Bedrock with Access Keys¶
```python
from llama_index.llms.bedrock_converse import BedrockConverse

llm = BedrockConverse(
    model="us.amazon.nova-lite-v1:0",
    aws_access_key_id="AWS Access Key ID to use",
    aws_secret_access_key="AWS Secret Access Key to use",
    aws_session_token="AWS Session Token to use",
    region_name="AWS Region to use, eg. us-east-1",
)

resp = llm.complete("Paul Graham is ")
```
```python
print(resp)
```
Function Calling¶
Claude, Command, and Mistral Large models support native function calling through AWS Bedrock Converse. This integrates seamlessly with LlamaIndex tools via the predict_and_call function on the llm.

This allows the user to attach any tools and let the LLM decide which tools to call, if any.

If you wish to perform tool calling as part of an agent loop, check out our agent guides instead.

NOTE: Not all AWS Bedrock models support function calling and the Converse API. Check the available features of each LLM here.
```python
from llama_index.llms.bedrock_converse import BedrockConverse
from llama_index.core.tools import FunctionTool


def multiply(a: int, b: int) -> int:
    """Multiply two integers and return the resulting integer"""
    return a * b


def mystery(a: int, b: int) -> int:
    """Mystery function on two integers."""
    return a * b + a + b


mystery_tool = FunctionTool.from_defaults(fn=mystery)
multiply_tool = FunctionTool.from_defaults(fn=multiply)

llm = BedrockConverse(
    model="anthropic.claude-3-haiku-20240307-v1:0",
    profile_name=profile_name,
)
```
```python
response = llm.predict_and_call(
    [mystery_tool, multiply_tool],
    user_msg="What happens if I run the mystery function on 5 and 7",
)
```
```python
print(str(response))
```
```python
response = llm.predict_and_call(
    [mystery_tool, multiply_tool],
    user_msg=(
        """What happens if I run the mystery function on the following pairs of numbers? Generate a separate result for each row:
- 1 and 2
- 8 and 4
- 100 and 20
NOTE: you need to run the mystery function for all of the pairs above at the same time \
"""
    ),
    allow_parallel_tool_calls=True,
)
```
```python
print(str(response))
```
```python
for s in response.sources:
    print(f"Name: {s.tool_name}, Input: {s.raw_input}, Output: {str(s)}")
```
Async¶
```python
from llama_index.llms.bedrock_converse import BedrockConverse

llm = BedrockConverse(
    model="anthropic.claude-3-haiku-20240307-v1:0",
    aws_access_key_id="AWS Access Key ID to use",
    aws_secret_access_key="AWS Secret Access Key to use",
    aws_session_token="AWS Session Token to use",
    region_name="AWS Region to use, eg. us-east-1",
)

resp = await llm.acomplete("Paul Graham is ")
```
```python
print(resp)
```