If you're opening this Notebook on colab, you will probably need to install LlamaIndex 🦙.
In [ ]
%pip install llama-index-llms-clarifai
In [ ]
!pip install llama-index
Install clarifai
In [ ]
!pip install clarifai
Set the Clarifai PAT as an environment variable.
In [ ]
import os
os.environ["CLARIFAI_PAT"] = "<YOUR CLARIFAI PAT>"
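If you would rather not hard-code the token in the notebook, one small alternative sketch (using only Python's standard getpass module, and assuming an interactive session) is to prompt for it:

In [ ]
from getpass import getpass

# Prompt for the PAT so it never appears in the notebook source.
os.environ["CLARIFAI_PAT"] = getpass("Enter your Clarifai PAT: ")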
Import the clarifai package
In [ ]
from llama_index.llms.clarifai import Clarifai
Explore various models according to your preference from our models page.
In [ ]
# Example parameters
params = dict(
    user_id="clarifai",
    app_id="ml",
    model_name="llama2-7b-alternative-4k",
    model_url=(
        "https://clarifai.com/clarifai/ml/models/llama2-7b-alternative-4k"
    ),
)
Initialize the LLM
In [ ]
# Method 1: using the model_url parameter
llm_model = Clarifai(model_url=params["model_url"])
In [ ]
# Method 2: using the model_name, app_id & user_id parameters
llm_model = Clarifai(
    model_name=params["model_name"],
    app_id=params["app_id"],
    user_id=params["user_id"],
)
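Both methods produce the same LlamaIndex LLM object. If you want to confirm which model was resolved, LlamaIndex LLMs generally expose a metadata property (field names can vary between versions); a minimal sketch:

In [ ]
# Print the resolved model's metadata (model name, context window, output limit).
print(llm_model.metadata)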
Call the complete function
In [ ]
llm_response = llm_model.complete(
    prompt="write a 10 line rhyming poem about science"
)
In [ ]
print(llm_response)
. Science is fun, it's true! From atoms to galaxies, it's all new! With experiments and tests, we learn so fast, And discoveries come from the past. It helps us understand the world around, And makes our lives more profound. So let's embrace this wondrous art, And see where it takes us in the start!
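Printing the CompletionResponse shows the generated text; the raw string is also available on the response object itself. A small sketch, assuming the standard LlamaIndex CompletionResponse interface:

In [ ]
# The generated string can also be read directly from the response object.
poem_text = llm_response.text
print(poem_text)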
Call the chat function
In [ ]
from llama_index.core.llms import ChatMessage

messages = [
    ChatMessage(role="user", content="write about climate change in 50 lines")
]
Response = llm_model.chat(messages)
In [ ]
print(Response)
user: or less. Climate change is a serious threat to our planet and its inhabitants. Rising temperatures are causing extreme weather events, such as hurricanes, droughts, and wildfires. Sea levels are rising, threatening coastal communities and ecosystems. The melting of polar ice caps is disrupting global navigation and commerce. Climate change is also exacerbating air pollution, which can lead to respiratory problems and other health issues. It's essential that we take action now to reduce greenhouse gas emissions and transition to renewable energy sources to mitigate the worst effects of climate change.
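The chat call returns a ChatResponse; printing it shows the role-prefixed message, and the reply text itself can be pulled from the message. A minimal sketch, assuming the standard LlamaIndex ChatResponse interface:

In [ ]
# The reply text lives on the response's message object.
print(Response.message.role)
print(Response.message.content)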
Using inference parameters
Alternatively, you can call models with inference parameters.
In [ ]
# Here is an example of inference parameters for the model.
inference_params = dict(temperature=str(0.3), max_tokens=20)
In [ ]
llm_response = llm_model.complete(
    prompt="What is nuclear fission and fusion?",
    inference_params=inference_params,
)
In [ ]
messages = [ChatMessage(role="user", content="Explain about the big bang")]
Response = llm_model.chat(messages, inference_params=inference_params)
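As with the earlier calls, the returned objects can be printed to inspect the output, which here is constrained by the lower temperature and the 20-token limit:

In [ ]
print(llm_response)
print(Response)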