RedisVectorStore(
    self,
    embeddings: Embeddings,
    config: Optional[RedisConfig] = None,
    ttl: Optional[int] = None,
    **kwargs: Any,
)

| Name | Type | Description |
|---|---|---|
| embeddings* | Embeddings | Embedding function to use. |
| config | Optional[RedisConfig] | If not provided, a new one will be created from kwargs. Default: None |
| ttl | Optional[int] | Optional time-to-live for Redis keys. Default: None |
| **kwargs | Any | Additional keyword arguments for RedisConfig if config is not provided. Default: {} |
Redis vector store integration.
Setup:
Install langchain-redis and run the Redis docker container:
pip install -qU langchain-redis
docker run -p 6379:6379 redis/redis-stack-server:latest
Key init args — indexing params:
index_name: str
Name of the index to create.
embedding: Embeddings
Embedding function to use.
distance_metric: str
Distance metric to use for similarity search. Default is 'COSINE'.
indexing_algorithm: str
Indexing algorithm to use. Default is 'FLAT'.
vector_datatype: str
Data type of the vector. Default is 'FLOAT32'.
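As a point of reference for the distance_metric option: 'COSINE' ranks vectors by the angle between them rather than their magnitude. A minimal pure-Python sketch (independent of Redis, for illustration only) of the distance it computes:

```python
import math

def cosine_distance(a: list[float], b: list[float]) -> float:
    """Cosine distance = 1 - cosine similarity; 0.0 means same direction."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(y * y for y in b))
    return 1.0 - dot / (norm_a * norm_b)

print(cosine_distance([1.0, 0.0], [1.0, 0.0]))  # identical direction -> 0.0
print(cosine_distance([1.0, 0.0], [0.0, 1.0]))  # orthogonal -> 1.0
```

Scores reported by the store depend on the configured metric, so switching distance_metric changes what "closest" means for every search below.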
Key init args — client params:
redis_url: Optional[str]
URL of the Redis instance to connect to.
redis_client: Optional[Redis]
Pre-existing Redis connection.
ttl: Optional[int]
Time-to-live for the Redis keys.
Instantiate:
from langchain_redis import RedisVectorStore
from langchain_openai import OpenAIEmbeddings
vector_store = RedisVectorStore(
    index_name="langchain-demo",
    embedding=OpenAIEmbeddings(),
    redis_url="redis://localhost:6379",
)
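The example above assumes an OpenAI API key. For local experimentation, any object exposing the Embeddings interface's embed_documents and embed_query methods can stand in. A hypothetical deterministic embedder (not part of langchain-redis, names are illustrative):

```python
import hashlib

class FakeDeterministicEmbeddings:
    """Illustrative stand-in embedder: maps each text to a fixed-size
    vector derived from its SHA-256 digest. Not semantically meaningful,
    but deterministic and dependency-free."""

    def __init__(self, dims: int = 8):
        self.dims = dims

    def _embed(self, text: str) -> list[float]:
        digest = hashlib.sha256(text.encode("utf-8")).digest()
        # Scale bytes into [0, 1) so each text gets a stable vector.
        return [digest[i % len(digest)] / 256 for i in range(self.dims)]

    def embed_documents(self, texts: list[str]) -> list[list[float]]:
        return [self._embed(t) for t in texts]

    def embed_query(self, text: str) -> list[float]:
        return self._embed(text)
```

Such a class could then be passed wherever an embedding function is expected, e.g. `embedding=FakeDeterministicEmbeddings()`, to exercise the store without network calls.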
You can also connect to an existing Redis instance by passing in a
pre-existing Redis connection via the redis_client argument.
from langchain_redis import RedisVectorStore
from langchain_openai import OpenAIEmbeddings
from redis import Redis
redis_client = Redis.from_url("redis://localhost:6379")
store = RedisVectorStore(
    embedding=OpenAIEmbeddings(),
    index_name="langchain-demo",
    redis_client=redis_client,
)

Add Documents:
from langchain_core.documents import Document
document_1 = Document(page_content="foo", metadata={"baz": "bar"})
document_2 = Document(page_content="bar", metadata={"foo": "baz"})
document_3 = Document(page_content="to be deleted")
documents = [document_1, document_2, document_3]
ids = ["1", "2", "3"]
vector_store.add_documents(documents=documents, ids=ids)

Delete Documents:
vector_store.delete(ids=["3"])

Search:
results = vector_store.similarity_search(query="foo", k=1)
for doc in results:
    print(f"* {doc.page_content} [{doc.metadata}]")
* foo [{'baz': 'bar'}]

Search with filter:
from redisvl.query.filter import Tag
results = vector_store.similarity_search(
    query="foo",
    k=1,
    filter=Tag("baz") == "bar",
)
for doc in results:
    print(f"* {doc.page_content} [{doc.metadata}]")
* foo [{'baz': 'bar'}]

Search with score:
results = vector_store.similarity_search_with_score(query="foo", k=1)
for doc, score in results:
    print(f"* [SIM={score:.3f}] {doc.page_content} [{doc.metadata}]")
* [SIM=0.916] foo [{'baz': 'bar'}]

Use as Retriever:
retriever = vector_store.as_retriever(
    search_type="mmr",
    search_kwargs={"k": 1, "fetch_k": 2, "lambda_mult": 0.5},
)
retriever.get_relevant_documents("foo")
[Document(page_content='foo', metadata={'baz': 'bar'})]
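The "mmr" search type above stands for maximal marginal relevance, which trades off relevance to the query (weighted by lambda_mult) against diversity among the documents already selected. A rough pure-Python sketch of the greedy selection loop (an illustration of the idea, not the implementation langchain-redis uses):

```python
def mmr_select(query_scores, pairwise_sim, k, lambda_mult=0.5):
    """Greedy MMR: repeatedly pick the candidate maximizing
    lambda * relevance - (1 - lambda) * max-similarity-to-selected."""
    selected = []
    candidates = list(range(len(query_scores)))
    while candidates and len(selected) < k:
        def mmr_score(i):
            # Penalty grows when i resembles something already chosen.
            redundancy = max((pairwise_sim[i][j] for j in selected), default=0.0)
            return lambda_mult * query_scores[i] - (1 - lambda_mult) * redundancy
        best = max(candidates, key=mmr_score)
        selected.append(best)
        candidates.remove(best)
    return selected

# Docs 0 and 1 are near-duplicates; MMR skips doc 1 in favor of doc 2.
scores = [0.9, 0.85, 0.6]            # relevance to the query
sim = [[1.0, 0.95, 0.1],
       [0.95, 1.0, 0.1],
       [0.1, 0.1, 1.0]]              # pairwise document similarity
print(mmr_select(scores, sim, k=2))  # → [0, 2]
```

This is why fetch_k is larger than k in the retriever call: the store first fetches fetch_k candidates by similarity, then MMR picks the final k from among them.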