Weaviate
This notebook covers how to get started with the Weaviate vector store in LangChain, using the langchain-weaviate
package.
Weaviate is an open-source vector database. It allows you to store data objects and vector embeddings from your favorite ML models, and scale seamlessly into billions of data objects.
To use this integration, you need to have a running Weaviate database instance.
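You will also need the langchain-weaviate package itself installed, for example via pip (the flags below are optional):

```python
# Install the LangChain Weaviate integration; it depends on the Weaviate v4 Python client.
%pip install -qU langchain-weaviate
```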
Minimum versions
This module requires Weaviate 1.23.7 or higher. However, we recommend you use the latest version of Weaviate.
Connecting to Weaviate
In this notebook, we assume that you have a local instance of Weaviate running on http://localhost:8080, with port 50051 open for gRPC traffic. So, we will connect to Weaviate with:
import weaviate

weaviate_client = weaviate.connect_to_local()
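If your local instance does not use the default ports, the same helper accepts them explicitly. A minimal sketch, restating the defaults mentioned above as explicit arguments:

```python
import weaviate

# Connect to a local Weaviate instance, spelling out the default
# HTTP (REST) and gRPC ports explicitly.
weaviate_client = weaviate.connect_to_local(
    host="localhost",
    port=8080,        # HTTP/REST port
    grpc_port=50051,  # gRPC port
)

# Close the client when you are finished with it.
# weaviate_client.close()
```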
Other deployment options
Weaviate can be deployed in many different ways, such as using Weaviate Cloud Services (WCS), Docker, or Kubernetes.
If your Weaviate instance is deployed in another way, read more here about different ways to connect to Weaviate. You can use different helper functions or create a custom instance.
Note that you require a v4 client API, which will create a weaviate.WeaviateClient object.
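For instance, a self-hosted deployment that is not on localhost can be reached with the connect_to_custom helper; the host names below are placeholders, not part of the integration:

```python
import weaviate

# Connect to a Weaviate instance whose HTTP and gRPC endpoints you specify yourself.
# Like the other helpers, this returns a weaviate.WeaviateClient object.
weaviate_client = weaviate.connect_to_custom(
    http_host="weaviate.example.internal",  # placeholder host name
    http_port=8080,
    http_secure=False,
    grpc_host="weaviate.example.internal",  # placeholder host name
    grpc_port=50051,
    grpc_secure=False,
)
```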
Authentication
Some Weaviate instances, such as those running on WCS, have authentication enabled, for example via an API key and/or a username and password.
Read the client authentication guide for more information, as well as the in-depth authentication configuration page.
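As a minimal sketch, connecting to an API-key-protected Weaviate Cloud (WCS) cluster could look like this; the environment variable names are placeholders:

```python
import os

import weaviate
from weaviate.classes.init import Auth

# Connect to a Weaviate Cloud cluster that requires an API key.
# WEAVIATE_URL and WEAVIATE_API_KEY are placeholder environment variable names.
weaviate_client = weaviate.connect_to_weaviate_cloud(
    cluster_url=os.environ["WEAVIATE_URL"],
    auth_credentials=Auth.api_key(os.environ["WEAVIATE_API_KEY"]),
)
```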