Aleph Alpha Embeddings
If you're opening this Notebook on Colab, you will probably need to install LlamaIndex 🦙.
In [ ]
%pip install llama-index-embeddings-alephalpha
In [ ]
!pip install llama-index
In [ ]
# Initialise with your AA token
import os
os.environ["AA_TOKEN"] = "your_token_here"
Using luminous-base embeddings
- representation="Document": for texts (documents) you want to store in a vector database
- representation="Query": for search queries, used to find the most relevant documents in the vector database
- representation="Symmetric": for tasks such as clustering, classification, anomaly detection, or visualization
In [ ]
from llama_index.embeddings.alephalpha import AlephAlphaEmbedding
# To customize your token, pass it explicitly;
# otherwise it will look up AA_TOKEN from your env variable
# embed_model = AlephAlphaEmbedding(token="<aa_token>")

# with representation="Query"
embed_model = AlephAlphaEmbedding(
model="luminous-base",
representation="Query",
)
embeddings = embed_model.get_text_embedding("Hello Aleph Alpha!")
print(len(embeddings))
print(embeddings[:5])
representation_enum: SemanticRepresentation.Query
5120
[0.14257812, 2.59375, 0.33203125, -0.33789062, -0.94140625]
In [ ]
# with representation="Document"
embed_model = AlephAlphaEmbedding(
model="luminous-base",
representation="Document",
)
embeddings = embed_model.get_text_embedding("Hello Aleph Alpha!")
print(len(embeddings))
print(embeddings[:5])
representation_enum: SemanticRepresentation.Document
5120
[0.14257812, 2.59375, 0.33203125, -0.33789062, -0.94140625]
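The third representation from the list above, representation="Symmetric", is not demonstrated in the cells above, but it can be used the same way. The cell below is a minimal sketch assuming the same AA_TOKEN environment variable is set; its output is not shown here.

In [ ]

# with representation="Symmetric"
# suited for clustering, classification, anomaly detection, or visualization,
# where texts are compared directly with each other
embed_model = AlephAlphaEmbedding(
    model="luminous-base",
    representation="Symmetric",
)

embeddings = embed_model.get_text_embedding("Hello Aleph Alpha!")
print(len(embeddings))
print(embeddings[:5])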