Observability with OpenLLMetry¶
OpenLLMetry is an open-source project built on OpenTelemetry for tracing and monitoring LLM applications. It connects to all major observability platforms (such as Datadog, Dynatrace, Honeycomb, and New Relic) and can be installed in minutes.
If you're opening this notebook on Colab, you will probably need to install LlamaIndex 🦙 and OpenLLMetry.
!pip install llama-index
!pip install traceloop-sdk
Configure API keys¶
Sign up for Traceloop at app.traceloop.com. Then go to the API keys page and create a new API key. Copy the key and paste it in the cell below.
If you prefer a different observability platform, such as Datadog, Dynatrace, Honeycomb, or others, you can find instructions for configuring it here.
import os
os.environ["OPENAI_API_KEY"] = "sk-..."
os.environ["TRACELOOP_API_KEY"] = "..."
Initialize OpenLLMetry¶
from traceloop.sdk import Traceloop
Traceloop.init()
Traceloop syncing configuration and prompts
Traceloop exporting traces to https://api.traceloop.com authenticating with bearer token
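Conceptually, OpenLLMetry instruments your LLM and framework calls by wrapping them in OpenTelemetry spans that record timing and metadata. As a rough stdlib-only illustration of that pattern (a toy sketch, not the real SDK internals):

```python
import functools
import time


def traced(span_name: str):
    """Toy decorator mimicking span creation: record a name and duration."""

    def decorator(fn):
        @functools.wraps(fn)
        def wrapper(*args, **kwargs):
            start = time.perf_counter()
            try:
                return fn(*args, **kwargs)
            finally:
                duration_ms = (time.perf_counter() - start) * 1000
                # A real exporter would send this span to a backend;
                # here we just print it.
                print(f"span={span_name} duration_ms={duration_ms:.1f}")

        return wrapper

    return decorator


@traced("llm.query")
def fake_query(prompt: str) -> str:
    # Stand-in for an actual LLM call.
    return f"answer to: {prompt}"
```

With the real SDK, `Traceloop.init()` applies this kind of instrumentation automatically to supported libraries, so no decorators are needed for the LlamaIndex calls below.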
Download data¶
!mkdir -p 'data/paul_graham/'
!wget 'https://raw.githubusercontent.com/run-llama/llama_index/main/docs/docs/examples/data/paul_graham/paul_graham_essay.txt' -O 'data/paul_graham/paul_graham_essay.txt'
--2024-01-12 12:43:16--  https://raw.githubusercontent.com/run-llama/llama_index/main/docs/docs/examples/data/paul_graham/paul_graham_essay.txt
Resolving raw.githubusercontent.com (raw.githubusercontent.com)... 185.199.109.133, 185.199.108.133, 185.199.111.133, ...
Connecting to raw.githubusercontent.com (raw.githubusercontent.com)|185.199.109.133|:443... connected.
HTTP request sent, awaiting response... 200 OK
Length: 75042 (73K) [text/plain]
Saving to: ‘data/paul_graham/paul_graham_essay.txt’

data/paul_graham/pa 100%[===================>]  73.28K  --.-KB/s    in 0.02s

2024-01-12 12:43:17 (3.68 MB/s) - ‘data/paul_graham/paul_graham_essay.txt’ saved [75042/75042]
from llama_index.core import SimpleDirectoryReader
docs = SimpleDirectoryReader("./data/paul_graham/").load_data()
Run a query¶
from llama_index.core import VectorStoreIndex
index = VectorStoreIndex.from_documents(docs)
query_engine = index.as_query_engine()
response = query_engine.query("What did the author do growing up?")
print(response)
The author wrote short stories and also worked on programming, specifically on an IBM 1401 computer in 9th grade. They used an early version of Fortran and typed programs on punch cards. They also mentioned getting a microcomputer, a TRS-80, in about 1980 and started programming on it.