
Chat Stores#

A chat store serves as a centralized interface for storing your chat history. Chat history is unique compared to other storage formats, since the order of messages is essential for maintaining a coherent conversation.

A chat store can organize sequences of chat messages by key (such as a user_id or another unique identifying string) and handles delete, insert, and get operations.
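Conceptually, a chat store is just a mapping from a key to an ordered list of messages. Below is a minimal pure-Python sketch of that idea; the class and method names here are illustrative only, loosely mirroring the interface described above, not llama_index's actual implementation.

```python
# Illustrative only: a toy key-based chat store.
from collections import defaultdict


class MiniChatStore:
    """Maps a key (e.g. a user_id) to an ordered list of messages."""

    def __init__(self):
        self._store = defaultdict(list)

    def set_messages(self, key, messages):
        # Replace the whole history for this key.
        self._store[key] = list(messages)

    def get_messages(self, key):
        # An unknown key yields an empty history.
        return self._store.get(key, [])

    def add_message(self, key, message):
        # Append, preserving conversation order.
        self._store[key].append(message)

    def delete_messages(self, key):
        # Drop the whole history for this key.
        return self._store.pop(key, None)


store = MiniChatStore()
store.set_messages("user1", ["Hello"])
store.add_message("user1", "Hi there!")
print(store.get_messages("user1"))  # ['Hello', 'Hi there!']
```

The important property is that per-key message order is preserved, which a plain list gives for free.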

SimpleChatStore#

The most basic chat store is SimpleChatStore, which stores messages in memory, can be saved to and loaded from disk, or can be serialized and stored somewhere else.

Usually, you will instantiate a chat store and pass it to a memory module. Memory modules that use a chat store will default to SimpleChatStore if one is not provided.

from llama_index.core.storage.chat_store import SimpleChatStore
from llama_index.core.memory import ChatMemoryBuffer

chat_store = SimpleChatStore()

chat_memory = ChatMemoryBuffer.from_defaults(
    token_limit=3000,
    chat_store=chat_store,
    chat_store_key="user1",
)

Once you have created the memory, you can include it in an agent or chat engine:

agent = OpenAIAgent.from_tools(tools, memory=chat_memory)
# OR
chat_engine = index.as_chat_engine(memory=chat_memory)

To save the chat store for later use, you can save it to and load it from disk:

chat_store.persist(persist_path="chat_store.json")
loaded_chat_store = SimpleChatStore.from_persist_path(
    persist_path="chat_store.json"
)

Or you can convert it to and from a string, saving the string somewhere else along the way:

chat_store_string = chat_store.json()
loaded_chat_store = SimpleChatStore.parse_raw(chat_store_string)

UpstashChatStore#

With UpstashChatStore, you can store your chat history remotely using Upstash Redis, which offers a serverless Redis solution, making it ideal for applications that require scalable and efficient chat storage. This chat store supports both synchronous and asynchronous operations.

Installation#

pip install llama-index-storage-chat-store-upstash

Usage#

from llama_index.storage.chat_store.upstash import UpstashChatStore
from llama_index.core.memory import ChatMemoryBuffer

chat_store = UpstashChatStore(
    redis_url="YOUR_UPSTASH_REDIS_URL",
    redis_token="YOUR_UPSTASH_REDIS_TOKEN",
    ttl=300,  # Optional: Time to live in seconds
)

chat_memory = ChatMemoryBuffer.from_defaults(
    token_limit=3000,
    chat_store=chat_store,
    chat_store_key="user1",
)

UpstashChatStore supports both synchronous and asynchronous operations. Here is an example using the async methods:

import asyncio
from llama_index.core.llms import ChatMessage


async def main():
    # Add messages
    messages = [
        ChatMessage(content="Hello", role="user"),
        ChatMessage(content="Hi there!", role="assistant"),
    ]
    await chat_store.async_set_messages("conversation1", messages)

    # Retrieve messages
    retrieved_messages = await chat_store.async_get_messages("conversation1")
    print(retrieved_messages)

    # Delete last message
    deleted_message = await chat_store.async_delete_last_message(
        "conversation1"
    )
    print(f"Deleted message: {deleted_message}")


asyncio.run(main())

RedisChatStore#

Using RedisChatStore, you can store your chat history remotely, without having to worry about manually persisting and loading the chat history.
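Like the other remote stores on this page, the Redis chat store lives in its own integration package. Assuming the same package-naming convention as the other stores shown here (worth verifying against the package index), install it with:

```shell
pip install llama-index-storage-chat-store-redis
```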

from llama_index.storage.chat_store.redis import RedisChatStore
from llama_index.core.memory import ChatMemoryBuffer

chat_store = RedisChatStore(redis_url="redis://localhost:6379", ttl=300)

chat_memory = ChatMemoryBuffer.from_defaults(
    token_limit=3000,
    chat_store=chat_store,
    chat_store_key="user1",
)

AzureChatStore#

Using AzureChatStore, you can store your chat history remotely in Azure Table Storage or CosmosDB, without having to worry about manually persisting and loading the chat history.

pip install llama-index
pip install llama-index-llms-azure-openai
pip install llama-index-storage-chat-store-azure

from llama_index.core import Settings
from llama_index.core.chat_engine import SimpleChatEngine
from llama_index.core.memory import ChatMemoryBuffer
from llama_index.storage.chat_store.azure import AzureChatStore

chat_store = AzureChatStore.from_account_and_key(
    account_name="",
    account_key="",
    chat_table_name="ChatUser",
)

memory = ChatMemoryBuffer.from_defaults(
    token_limit=3000,
    chat_store=chat_store,
    chat_store_key="conversation1",
)

chat_engine = SimpleChatEngine(
    memory=memory, llm=Settings.llm, prefix_messages=[]
)

response = chat_engine.chat("Hello.")

DynamoDBChatStore#

Using DynamoDBChatStore, you can store your chat history in AWS DynamoDB.

Installation#

pip install llama-index-storage-chat-store-dynamodb

Usage#

Make sure you have created a DynamoDB table with the appropriate schema. Here is an example using the default settings:

import boto3

# Get the service resource.
dynamodb = boto3.resource("dynamodb")

# Create the DynamoDB table.
table = dynamodb.create_table(
    TableName="EXAMPLE_TABLE",
    KeySchema=[{"AttributeName": "SessionId", "KeyType": "HASH"}],
    AttributeDefinitions=[
        {"AttributeName": "SessionId", "AttributeType": "S"}
    ],
    BillingMode="PAY_PER_REQUEST",
)

You can then use the DynamoDBChatStore class to persist and retrieve chat histories:

import os
from llama_index.core.llms import ChatMessage, MessageRole
from llama_index.storage.chat_store.dynamodb.base import DynamoDBChatStore

# Initialize DynamoDB chat store
chat_store = DynamoDBChatStore(
    table_name="EXAMPLE_TABLE", profile_name=os.getenv("AWS_PROFILE")
)

# A chat history, which doesn't exist yet, returns an empty array.
print(chat_store.get_messages("123"))
# >>> []

# Initializing a chat history with a key of "SessionID = 123"
messages = [
    ChatMessage(role=MessageRole.USER, content="Who are you?"),
    ChatMessage(
        role=MessageRole.ASSISTANT, content="I am your helpful AI assistant."
    ),
]
chat_store.set_messages(key="123", messages=messages)
print(chat_store.get_messages("123"))
# >>> [ChatMessage(role=<MessageRole.USER: 'user'>, content='Who are you?', additional_kwargs={}),
#      ChatMessage(role=<MessageRole.ASSISTANT: 'assistant'>, content='I am your helpful AI assistant.', additional_kwargs={})]

# Appending a message to an existing chat history
message = ChatMessage(role=MessageRole.USER, content="What can you do?")
chat_store.add_message(key="123", message=message)
print(chat_store.get_messages("123"))
# >>> [ChatMessage(role=<MessageRole.USER: 'user'>, content='Who are you?', additional_kwargs={}),
#      ChatMessage(role=<MessageRole.ASSISTANT: 'assistant'>, content='I am your helpful AI assistant.', additional_kwargs={}),
#      ChatMessage(role=<MessageRole.USER: 'user'>, content='What can you do?', additional_kwargs={})]

PostgresChatStore#

Using PostgresChatStore, you can store your chat history remotely, without having to worry about manually persisting and loading the chat history.
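The Postgres chat store also ships as its own integration package. Assuming the same package-naming convention as the other stores shown here (worth verifying against the package index), install it with:

```shell
pip install llama-index-storage-chat-store-postgres
```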

from llama_index.storage.chat_store.postgres import PostgresChatStore
from llama_index.core.memory import ChatMemoryBuffer

chat_store = PostgresChatStore.from_uri(
    uri="postgresql+asyncpg://postgres:[email protected]:5432/database",
)

chat_memory = ChatMemoryBuffer.from_defaults(
    token_limit=3000,
    chat_store=chat_store,
    chat_store_key="user1",
)

TablestoreChatStore#

Using TablestoreChatStore, you can store your chat history remotely, without having to worry about manually persisting and loading the chat history.

Installation#

pip install llama-index-storage-chat-store-tablestore

Usage#

from llama_index.storage.chat_store.tablestore import TablestoreChatStore
from llama_index.core.memory import ChatMemoryBuffer

# 1. create tablestore chat store
chat_store = TablestoreChatStore(
    endpoint="<end_point>",
    instance_name="<instance_name>",
    access_key_id="<access_key_id>",
    access_key_secret="<access_key_secret>",
)
# You need to create a table for the first use
chat_store.create_table_if_not_exist()

chat_memory = ChatMemoryBuffer.from_defaults(
    token_limit=3000,
    chat_store=chat_store,
    chat_store_key="user1",
)

Google AlloyDB ChatStore#

Using AlloyDBChatStore, you can store your chat history in AlloyDB, without having to worry about manually persisting and loading the chat history.

This tutorial demonstrates the synchronous interface. All synchronous methods have corresponding asynchronous counterparts.

Installation#

pip install llama-index
pip install llama-index-alloydb-pg
pip install llama-index-llms-vertex

Usage#

from llama_index.core.chat_engine import SimpleChatEngine
from llama_index.core.memory import ChatMemoryBuffer
from llama_index_alloydb_pg import AlloyDBChatStore, AlloyDBEngine
from llama_index.llms.vertex import Vertex
import asyncio

# Replace with your own AlloyDB info
engine = AlloyDBEngine.from_instance(
    project_id=PROJECT_ID,
    region=REGION,
    cluster=CLUSTER,
    instance=INSTANCE,
    database=DATABASE,
    user=USER,
    password=PASSWORD,
)

engine.init_chat_store_table(table_name=TABLE_NAME)

chat_store = AlloyDBChatStore.create_sync(
    engine=engine,
    table_name=TABLE_NAME,
)

memory = ChatMemoryBuffer.from_defaults(
    token_limit=3000,
    chat_store=chat_store,
    chat_store_key="user1",
)

llm = Vertex(model="gemini-1.5-flash-002", project=PROJECT_ID)

chat_engine = SimpleChatEngine(memory=memory, llm=llm, prefix_messages=[])

response = chat_engine.chat("Hello.")

print(response)

Google Cloud SQL for PostgreSQL ChatStore#

Using PostgresChatStore, you can store your chat history in Cloud SQL for Postgres, without having to worry about manually persisting and loading the chat history.

This tutorial demonstrates the synchronous interface. All synchronous methods have corresponding asynchronous counterparts.

Installation#

pip install llama-index
pip install llama-index-cloud-sql-pg
pip install llama-index-llms-vertex

Usage#

from llama_index.core.chat_engine import SimpleChatEngine
from llama_index.core.memory import ChatMemoryBuffer
from llama_index_cloud_sql_pg import PostgresChatStore, PostgresEngine
from llama_index.llms.vertex import Vertex
import asyncio

# Replace with your own Cloud SQL info
engine = PostgresEngine.from_instance(
    project_id=PROJECT_ID,
    region=REGION,
    instance=INSTANCE,
    database=DATABASE,
    user=USER,
    password=PASSWORD,
)

engine.init_chat_store_table(table_name=TABLE_NAME)

chat_store = PostgresChatStore.create_sync(
    engine=engine,
    table_name=TABLE_NAME,
)

memory = ChatMemoryBuffer.from_defaults(
    token_limit=3000,
    chat_store=chat_store,
    chat_store_key="user1",
)

llm = Vertex(model="gemini-1.5-flash-002", project=PROJECT_ID)

chat_engine = SimpleChatEngine(memory=memory, llm=llm, prefix_messages=[])

response = chat_engine.chat("Hello.")

print(response)