Setup
If you're opening this notebook on Colab, you will probably need to install LlamaIndex 🦙.
%pip install llama-index-llms-oci-genai
!pip install llama-index
You will also need to install the OCI SDK.
!pip install -U oci
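If you authenticate with an API key (the default), the OCI SDK reads credentials from a config file, typically ~/.oci/config. A minimal profile looks roughly like the following; all OCIDs, the fingerprint, and the key path are placeholders you must replace.

[DEFAULT]
user=ocid1.user.oc1..<your_user_ocid>
fingerprint=<your_api_key_fingerprint>
key_file=~/.oci/oci_api_key.pem
tenancy=ocid1.tenancy.oc1..<your_tenancy_ocid>
region=us-chicago-1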
Basic Usage
To use an LLM hosted on OCI Generative AI with LlamaIndex, simply initialize the OCIGenAI interface with your OCI endpoint, model ID, compartment OCID, and authentication method.
Call complete with a prompt
from llama_index.llms.oci_genai import OCIGenAI

llm = OCIGenAI(
    model="MY_MODEL",
    service_endpoint="https://inference.generativeai.us-chicago-1.oci.oraclecloud.com",
    compartment_id="MY_OCID",
)

resp = llm.complete("Paul Graham is ")
print(resp)
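complete returns a CompletionResponse; if you need the raw string rather than the printable response object, the text attribute should hold it (standard behavior across LlamaIndex LLMs):

# CompletionResponse exposes the generated string via .text
print(resp.text)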
Call chat with a list of messages
from llama_index.llms.oci_genai import OCIGenAI
from llama_index.core.llms import ChatMessage

messages = [
    ChatMessage(
        role="system", content="You are a pirate with a colorful personality"
    ),
    ChatMessage(role="user", content="Tell me a story"),
]

llm = OCIGenAI(
    model="MY_MODEL",
    service_endpoint="https://inference.generativeai.us-chicago-1.oci.oraclecloud.com",
    compartment_id="MY_OCID",
)

resp = llm.chat(messages)
print(resp)
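chat returns a ChatResponse wrapping the assistant's ChatMessage; the reply text itself is available via resp.message.content (standard LlamaIndex behavior):

# ChatResponse wraps the assistant's ChatMessage
print(resp.message.role)     # MessageRole.ASSISTANT
print(resp.message.content)  # the reply text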
Streaming
Using the stream_complete endpoint
from llama_index.llms.oci_genai import OCIGenAI

llm = OCIGenAI(
    model="MY_MODEL",
    service_endpoint="https://inference.generativeai.us-chicago-1.oci.oraclecloud.com",
    compartment_id="MY_OCID",
)

resp = llm.stream_complete("Paul Graham is ")
for r in resp:
    print(r.delta, end="")
Using the stream_chat endpoint
from llama_index.llms.oci_genai import OCIGenAI
from llama_index.core.llms import ChatMessage

messages = [
    ChatMessage(
        role="system", content="You are a pirate with a colorful personality"
    ),
    ChatMessage(role="user", content="Tell me a story"),
]

llm = OCIGenAI(
    model="MY_MODEL",
    service_endpoint="https://inference.generativeai.us-chicago-1.oci.oraclecloud.com",
    compartment_id="MY_OCID",
)

resp = llm.stream_chat(messages)
for r in resp:
    print(r.delta, end="")
Async
Native async is currently not supported; async calls fall back to their synchronous equivalents.
from llama_index.llms.oci_genai import OCIGenAI
from llama_index.core.llms import ChatMessage

messages = [
    ChatMessage(
        role="system", content="You are a pirate with a colorful personality"
    ),
    ChatMessage(role="user", content="Tell me a story"),
]

llm = OCIGenAI(
    model="MY_MODEL",
    service_endpoint="https://inference.generativeai.us-chicago-1.oci.oraclecloud.com",
    compartment_id="MY_OCID",
)

resp = await llm.achat(messages)
print(resp)

resp = await llm.astream_chat(messages)
async for r in resp:
    print(r.delta, end="")
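Top-level await works inside a notebook. In a plain Python script you would drive the same calls through an event loop, roughly like this (reusing the llm and messages defined above):

import asyncio

async def main() -> None:
    # Same call as above; it currently falls back to sync under the hood.
    resp = await llm.achat(messages)
    print(resp)

asyncio.run(main())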
Configuring the Model
from llama_index.llms.oci_genai import OCIGenAI

llm = OCIGenAI(
    model="cohere.command",
    service_endpoint="https://inference.generativeai.us-chicago-1.oci.oraclecloud.com",
    compartment_id="MY_OCID",
)

resp = llm.complete("Paul Graham is ")
print(resp)
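The constructor should also accept the usual LlamaIndex sampling controls; the temperature and max_tokens parameters below are assumed to behave as in other LlamaIndex LLM integrations:

llm = OCIGenAI(
    model="cohere.command",
    service_endpoint="https://inference.generativeai.us-chicago-1.oci.oraclecloud.com",
    compartment_id="MY_OCID",
    temperature=0.2,  # assumed: standard LlamaIndex sampling controls
    max_tokens=512,
)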
from llama_index.llms.oci_genai import OCIGenAI

llm = OCIGenAI(
    model="MY_MODEL",
    service_endpoint="https://inference.generativeai.us-chicago-1.oci.oraclecloud.com",
    compartment_id="MY_OCID",
    auth_type="SECURITY_TOKEN",
    auth_profile="MY_PROFILE",  # replace with your profile name
    auth_file_location="MY_CONFIG_FILE_LOCATION",  # replace with the path to the config file containing this profile
)

resp = llm.complete("Paul Graham is ")
print(resp)
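Besides the default API key and SECURITY_TOKEN, the integration is assumed to mirror the OCI SDK's other auth modes, e.g. instance principals for code running on an OCI compute instance:

# Assumed: auth_type values mirror the OCI SDK (API_KEY is the default).
llm = OCIGenAI(
    model="MY_MODEL",
    service_endpoint="https://inference.generativeai.us-chicago-1.oci.oraclecloud.com",
    compartment_id="MY_OCID",
    auth_type="INSTANCE_PRINCIPAL",
)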
from llama_index.llms.oci_genai import OCIGenAI
from llama_index.core.llms import ChatMessage

llm = OCIGenAI(
    model="ocid1.generativeaiendpoint.oc1.us-chicago-1....",
    service_endpoint="https://inference.generativeai.us-chicago-1.oci.oraclecloud.com",
    compartment_id="DEDICATED_COMPARTMENT_OCID",
    auth_profile="MY_PROFILE",  # replace with your profile name
    auth_file_location="MY_CONFIG_FILE_LOCATION",  # replace with the path to the config file containing this profile
    provider="MODEL_PROVIDER",  # e.g., "cohere" or "meta"
    context_size="MODEL_CONTEXT_SIZE",  # e.g., 128000
)

messages = [
    ChatMessage(
        role="system", content="You are a pirate with a colorful personality"
    ),
    ChatMessage(role="user", content="Tell me a story"),
]

resp = llm.chat(messages)
print(resp)
Basic Tool Calling in LlamaIndex
Tool calling is currently supported only for Cohere models.
from llama_index.llms.oci_genai import OCIGenAI
from llama_index.core.tools import FunctionTool

llm = OCIGenAI(
    model="MY_MODEL",
    service_endpoint="https://inference.generativeai.us-chicago-1.oci.oraclecloud.com",
    compartment_id="MY_OCID",
)

def multiply(a: int, b: int) -> int:
    """Multiply two integers and return the result."""
    return a * b

def add(a: int, b: int) -> int:
    """Add two integers and return the result."""
    return a + b

add_tool = FunctionTool.from_defaults(fn=add)
multiply_tool = FunctionTool.from_defaults(fn=multiply)

response = llm.chat_with_tools(
    tools=[add_tool, multiply_tool],
    user_msg="What is 3 * 12? Also, what is 11 + 49?",
)
print(response)

tool_calls = response.message.additional_kwargs.get("tool_calls", [])
print(tool_calls)
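To actually execute the requested tools, you can resolve each call against your FunctionTool objects. The sketch below assumes OCIGenAI implements LlamaIndex's standard function-calling interface (get_tool_calls_from_response); FunctionTool instances are directly callable:

# Hypothetical follow-up: run each requested tool and print its output.
tools_by_name = {t.metadata.name: t for t in [add_tool, multiply_tool]}

for tc in llm.get_tool_calls_from_response(response, error_on_no_tool_call=False):
    tool = tools_by_name[tc.tool_name]
    output = tool(**tc.tool_kwargs)  # returns a ToolOutput
    print(f"{tc.tool_name}({tc.tool_kwargs}) -> {output}")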