ModelScope LLMs
In this notebook, we show how to use ModelScope LLM models in LlamaIndex. Visit the ModelScope website for more details about the available models.
If you're opening this Notebook on Colab, you will probably need to install LlamaIndex 🦙 and ModelScope.
In [ ]
!pip install llama-index-llms-modelscope
Basic Usage
In [ ]
from llama_index.llms.modelscope import ModelScopeLLM

# Download and load Qwen1.5-7B-Chat from the ModelScope hub.
llm = ModelScopeLLM(model_name="qwen/Qwen1.5-7B-Chat", model_revision="master")

rsp = llm.complete("Hello, who are you?")
print(rsp)
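Beyond calling the model directly, you can register it as the default LLM for the rest of LlamaIndex. The cell below is a minimal sketch: it reuses the `llm` instance created above and assumes your downstream components (query engines, chat engines, agents) rely on the global `Settings` object.

In [ ]
from llama_index.core import Settings

# Make the ModelScope model the default LLM so components built later
# (query engines, chat engines, etc.) pick it up automatically.
Settings.llm = llm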
Using Message Requests
In [ ]
from llama_index.core.base.llms.types import MessageRole, ChatMessage

# Build a chat history with a system prompt followed by a user question.
messages = [
    ChatMessage(
        role=MessageRole.SYSTEM, content="You are a helpful assistant."
    ),
    ChatMessage(role=MessageRole.USER, content="How to make cake?"),
]
resp = llm.chat(messages)
print(resp)
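Note that `chat` returns a `ChatResponse` object rather than a plain string; printing it includes the role prefix. If you only want the generated text, it is available on the response's message, as in this small sketch:

In [ ]
# resp.message is the assistant's ChatMessage; its content field
# holds the generated text without the role prefix.
print(resp.message.content)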