PromptLayer Handler¶
PromptLayer is an LLMOps tool for managing prompts; check out their features. Currently, this integration only supports OpenAI.
If you're opening this notebook on Colab, you will probably need to install LlamaIndex 🦙 and PromptLayer.
In [ ]
!pip install llama-index
!pip install promptlayer
Configure API Key¶
In [ ]
import os
os.environ["OPENAI_API_KEY"] = "sk-..."
os.environ["PROMPTLAYER_API_KEY"] = "pl_..."
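If you'd rather not hardcode keys in the notebook, a minimal variation using Python's standard-library getpass prompts for them at runtime instead (an optional sketch, not part of the original example):
In [ ]
import getpass
import os

# Prompt for keys interactively so they are not stored in the notebook source
os.environ["OPENAI_API_KEY"] = getpass.getpass("OpenAI API key: ")
os.environ["PROMPTLAYER_API_KEY"] = getpass.getpass("PromptLayer API key: ")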
Download Data¶
In [ ]
!mkdir -p 'data/paul_graham/'
!wget 'https://raw.githubusercontent.com/run-llama/llama_index/main/docs/docs/examples/data/paul_graham/paul_graham_essay.txt' -O 'data/paul_graham/paul_graham_essay.txt'
Will not apply HSTS. The HSTS database must be a regular and non-world-writable file.
ERROR: could not open HSTS store at '/home/loganm/.wget-hsts'. HSTS will be disabled.
--2023-11-29 21:09:27--  https://raw.githubusercontent.com/run-llama/llama_index/main/docs/docs/examples/data/paul_graham/paul_graham_essay.txt
Resolving raw.githubusercontent.com (raw.githubusercontent.com)... 185.199.110.133, 185.199.109.133, 185.199.108.133, ...
Connecting to raw.githubusercontent.com (raw.githubusercontent.com)|185.199.110.133|:443... connected.
HTTP request sent, awaiting response... 200 OK
Length: 75042 (73K) [text/plain]
Saving to: ‘data/paul_graham/paul_graham_essay.txt’

data/paul_graham/pa 100%[===================>]  73.28K  --.-KB/s    in 0.04s

2023-11-29 21:09:28 (1.76 MB/s) - ‘data/paul_graham/paul_graham_essay.txt’ saved [75042/75042]
In [ ]
from llama_index.core import SimpleDirectoryReader
docs = SimpleDirectoryReader("./data/paul_graham/").load_data()
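As an optional sanity check (not part of the original notebook), you can confirm the essay was loaded; SimpleDirectoryReader returns a list of Document objects:
In [ ]
# One text file should yield at least one Document
print(f"Loaded {len(docs)} document(s)")
# Preview the first 200 characters of the essay
print(docs[0].text[:200])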
Callback Manager Setup¶
In [ ]
from llama_index.core import set_global_handler
# pl_tags are optional, to help you organize your prompts and apps
set_global_handler("promptlayer", pl_tags=["paul graham", "essay"])
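set_global_handler registers a PromptLayer callback handler globally, so the LLM calls made below are logged to PromptLayer along with the given pl_tags. As a rough sketch (assuming the global_handler attribute that set_global_handler populates on llama_index.core), you can inspect what was registered:
In [ ]
import llama_index.core

# After set_global_handler, the registered handler is stored at the package level
print(type(llama_index.core.global_handler))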
Trigger the callback with a query¶
In [ ]
from llama_index.core import VectorStoreIndex
index = VectorStoreIndex.from_documents(docs)
query_engine = index.as_query_engine()
In [ ]
response = query_engine.query("What did the author do growing up?")
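Printing the response shows the synthesized answer; the underlying OpenAI request made for this query should now also appear in your PromptLayer dashboard, tagged with "paul graham" and "essay".
In [ ]
# Display the synthesized answer returned by the query engine
print(response)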