```python
from_texts_return_keys(
    cls,
    texts: List[str],
    embedding: Embeddings,
    metadatas: Optional[List[dict]] = None,
    index_name: Optional[str] = None,
    index_schema: Optional[Union[Dict[str, ListOfDict], str, os.PathLike]] = None,
    vector_schema: Optional[Dict[str, Union[str, int]]] = None,
    **kwargs: Any,
)
```

| Name | Type | Default | Description |
|---|---|---|---|
| `texts`* | `List[str]` | required | List of texts to add to the vectorstore. |
| `embedding`* | `Embeddings` | required | Embeddings to use for the vectorstore. |
| `metadatas` | `Optional[List[dict]]` | `None` | Optional list of metadata dicts to add to the VectorStore. |
| `index_name` | `Optional[str]` | `None` | Optional name of the index to create or add to. |
| `index_schema` | `Optional[Union[Dict[str, ListOfDict], str, os.PathLike]]` | `None` | Optional fields to index within the metadata. Overrides generated schema. |
| `vector_schema` | `Optional[Dict[str, Union[str, int]]]` | `None` | Optional vector schema to use. |
| `**kwargs` | `Any` | `{}` | Additional keyword arguments to pass to the Redis client. |
Create an InMemoryVectorStore vectorstore from raw documents.

This is a user-friendly interface that embeds the given texts, creates a new index if one does not already exist, adds the documents to it, and returns the keys of the stored documents.

This method generates a schema from the metadata passed in if `index_schema` is not defined. If `index_schema` is defined, it is compared against the generated schema, and a warning is emitted if they differ. If you are purposefully defining the schema for the metadata, you can ignore that warning.

To examine the schema options, initialize an instance of this class and print the schema using the `InMemoryVectorStore.schema` property. This will include the `content` and `content_vector` fields, which are always present in the langchain schema.
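The schema-generation behavior described above can be illustrated with a small, self-contained sketch. This is not the library's actual implementation: the `generate_field_schema` helper and the text/numeric/tag grouping are assumptions made for illustration, based on the behavior the docstring describes.

```python
import numbers

def generate_field_schema(metadata: dict) -> dict:
    """Hypothetical sketch: infer index fields from a metadata dict.

    Strings become text fields, numbers become numeric fields, and
    lists of strings become tag fields. This mirrors the documented
    behavior only loosely and is not the library's real code.
    """
    schema = {"text": [], "numeric": [], "tag": []}
    for name, value in metadata.items():
        if isinstance(value, str):
            schema["text"].append({"name": name})
        elif isinstance(value, numbers.Number):
            schema["numeric"].append({"name": name})
        elif isinstance(value, list) and all(isinstance(v, str) for v in value):
            schema["tag"].append({"name": name})
        else:
            raise ValueError(f"Unsupported metadata type for field '{name}'")
    return schema

# Each metadata key is sorted into a field group by its value's type.
schema = generate_field_schema({"author": "alice", "year": 2023, "tags": ["a", "b"]})
```

Passing an explicit `index_schema` bypasses this inference, which is why a mismatch between the two only produces a warning rather than an error.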
Example:

```python
from langchain_aws.vectorstores import InMemoryVectorStore

embeddings = OpenAIEmbeddings()
redis, keys = InMemoryVectorStore.from_texts_return_keys(
    texts,
    embeddings,
    redis_url="redis://cluster_endpoint:6379"
)
```
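The compare-and-warn behavior for a user-supplied `index_schema` can be sketched as follows. The `check_index_schema` helper is hypothetical, not part of the library; the sketch assumes a simple equality check between the two schema dicts.

```python
import warnings

def check_index_schema(user_schema: dict, generated_schema: dict) -> None:
    """Hypothetical helper: warn when a user-supplied index schema
    differs from the schema generated from the metadata."""
    if user_schema != generated_schema:
        warnings.warn(
            "index_schema does not match the schema generated from the "
            "metadata. If you are purposefully overriding the schema, "
            "you can ignore this warning."
        )

generated = {"text": [{"name": "author"}], "numeric": [{"name": "year"}]}
user = {"text": [{"name": "author"}]}  # user intentionally omits "year"
check_index_schema(user, generated)  # emits a UserWarning
```

Because it is only a warning, an intentionally narrower schema (as above) still works; the index is simply built from the user-supplied fields.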