Chat Engine - ReAct Agent Mode¶
ReAct is an agent-based chat mode built on top of a query engine over your data.
For each chat interaction, the agent enters a ReAct loop:
- first decide whether to use the query engine tool and come up with appropriate input
- (optionally) use the query engine tool and observe its output
- decide whether to repeat or give a final response
This approach is flexible, since it can choose whether or not to query the knowledge base. However, performance also depends more heavily on the quality of the LLM. You might need to do more coercing to make sure it chooses to query the knowledge base at the right times, instead of hallucinating an answer.
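The loop above can be sketched in plain Python. This is an illustrative toy, not LlamaIndex's actual internals: `query_engine_tool` and `fake_llm` are canned stubs standing in for the real query engine tool and LLM.

```python
# Illustrative sketch of the ReAct loop -- a toy, not LlamaIndex internals.
# query_engine_tool and fake_llm are canned stubs for the real tool and LLM.

def query_engine_tool(query: str) -> str:
    """Stand-in for the query engine over the indexed documents."""
    return "In the summer of 1995, Paul Graham worked on a web application."

def fake_llm(history: list) -> dict:
    """Stub LLM: request the tool once, then give a final response."""
    observations = [s for s in history if s["type"] == "observation"]
    if not observations:
        # Thought: decide to use the query engine tool with a suitable input
        return {"type": "action", "tool_input": history[0]["content"]}
    # Enough evidence gathered: give the final response
    return {"type": "response", "content": observations[-1]["content"]}

def react_chat(message: str, max_iterations: int = 5) -> str:
    history = [{"type": "user", "content": message}]
    for _ in range(max_iterations):
        step = fake_llm(history)
        if step["type"] == "action":
            # Action/Observation: call the tool and record its output
            history.append(
                {"type": "observation",
                 "content": query_engine_tool(step["tool_input"])}
            )
        else:
            # Final response: exit the loop
            return step["content"]
    return "Reached max iterations."

print(react_chat("What did Paul Graham do in the summer of 1995?"))
```

Each iteration mirrors the Thought/Action/Observation/Response trace printed by the verbose chat engine below.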
If you're opening this Notebook on colab, you will probably need to install LlamaIndex 🦙.
In [ ]
%pip install llama-index-llms-anthropic
%pip install llama-index-llms-openai
In [ ]
!pip install llama-index
Download Data¶
In [ ]
!mkdir -p 'data/paul_graham/'
!wget 'https://raw.githubusercontent.com/run-llama/llama_index/main/docs/docs/examples/data/paul_graham/paul_graham_essay.txt' -O 'data/paul_graham/paul_graham_essay.txt'
Get started in 5 lines of code¶
Load data and build index
In [ ]
from llama_index.core import VectorStoreIndex, SimpleDirectoryReader
from llama_index.llms.openai import OpenAI
from llama_index.llms.anthropic import Anthropic
llm = OpenAI()
data = SimpleDirectoryReader(input_dir="./data/paul_graham/").load_data()
index = VectorStoreIndex.from_documents(data)
Configure chat engine
In [ ]
chat_engine = index.as_chat_engine(chat_mode="react", llm=llm, verbose=True)
Chat with your data
In [ ]
response = chat_engine.chat(
"Use the tool to answer what did Paul Graham do in the summer of 1995?"
)
Thought: I need to use a tool to help me answer the question. Action: query_engine_tool Action Input: {'input': 'What did Paul Graham do in the summer of 1995?'} Observation: In the summer of 1995, Paul Graham worked on building a web application for making web applications. He recruited Dan Giffin, who had worked for Viaweb, and two undergrads who wanted summer jobs, and they got to work trying to build what it's now clear is about twenty companies and several open source projects worth of software. The language for defining applications would of course be a dialect of Lisp. Response: In the summer of 1995, Paul Graham worked on building a web application for making web applications. He recruited Dan Giffin, who had worked for Viaweb, and two undergrads who wanted summer jobs, and they got to work trying to build what it's now clear is about twenty companies and several open source projects worth of software. The language for defining applications would of course be a dialect of Lisp.
In [ ]
print(response)
In the summer of 1995, Paul Graham worked on building a web application for making web applications. He recruited Dan Giffin, who had worked for Viaweb, and two undergrads who wanted summer jobs, and they got to work trying to build what it's now clear is about twenty companies and several open source projects worth of software. The language for defining applications would of course be a dialect of Lisp.
Customize LLM¶
Use Anthropic ("claude-2")
In [ ]
llm = Anthropic()
Configure chat engine
In [ ]
chat_engine = index.as_chat_engine(llm=llm, chat_mode="react", verbose=True)
In [ ]
response = chat_engine.chat("what did Paul Graham do in the summer of 1995?")
Thought: I need to use a tool to help me answer the question. Action: query_engine_tool Action Input: {'input': 'what did Paul Graham do in the summer of 1995?'} Observation: Based on the context, in the summer of 1995 Paul Graham: - Painted a second still life using the same objects he had used for a previous still life painting. - Looked for an apartment to buy in New York, trying to find a neighborhood similar to Cambridge, MA. - Realized there wasn't really a "Cambridge of New York" after visiting the actual Cambridge. The passage does not mention what Paul Graham did in the summer of 1995 specifically. It talks about painting a second still life at some point and looking for an apartment in New York at some point, but it does not connect those events to the summer of 1995. Response: The passage does not provide enough information to know specifically what Paul Graham did in the summer of 1995. It mentions some activities like painting and looking for an apartment in New York, but does not say these occurred specifically in the summer of 1995.
In [ ]
print(response)
The passage does not provide enough information to know specifically what Paul Graham did in the summer of 1995. It mentions some activities like painting and looking for an apartment in New York, but does not say these occurred specifically in the summer of 1995.
In [ ]
response = chat_engine.chat("What did I ask you before?")
Response: You asked me "what did Paul Graham do in the summer of 1995?".
In [ ]
print(response)
You asked me "what did Paul Graham do in the summer of 1995?".
Reset chat engine
In [ ]
chat_engine.reset()
In [ ]
response = chat_engine.chat("What did I ask you before?")
Response: I'm afraid I don't have any context about previous questions in our conversation. This seems to be the start of a new conversation between us.
In [ ]
print(response)
I'm afraid I don't have any context about previous questions in our conversation. This seems to be the start of a new conversation between us.
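The effect of `reset()` on conversation memory, as demonstrated above, can be mimicked with a toy engine that keeps its own history buffer (a sketch, not LlamaIndex internals — `ToyChatEngine` is illustrative only):

```python
# Toy illustration of chat-engine memory and reset() -- not LlamaIndex internals.

class ToyChatEngine:
    def __init__(self):
        self.history: list[str] = []

    def chat(self, message: str) -> str:
        # Answer from memory if we have any, like the agent does above
        if self.history:
            reply = 'You asked me "%s".' % self.history[-1]
        else:
            reply = "I don't have any context about previous questions."
        self.history.append(message)
        return reply

    def reset(self) -> None:
        # Clears the conversation memory, as chat_engine.reset() does
        self.history.clear()

engine = ToyChatEngine()
engine.chat("what did Paul Graham do in the summer of 1995?")
print(engine.chat("What did I ask you before?"))
engine.reset()
print(engine.chat("What did I ask you before?"))
```

After `reset()`, the engine has no record of earlier turns, which is why the real chat engine replies that the conversation appears to be starting fresh.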