Option 2: A local Docker instance
If you have created a cloud instance of Weaviate, you can skip this page and continue with Communicate with Weaviate.
Here, you will create a Weaviate instance using Docker.
Create and run the docker-compose file
Install Docker on your machine. We recommend following the official Docker installation guide.
Create a new directory and navigate to it in your terminal. Then, create a new file called docker-compose.yml and add the following content:
---
services:
  weaviate_anon:
    command:
      - --host
      - 0.0.0.0
      - --port
      - '8080'
      - --scheme
      - http
    image: cr.weaviate.io/semitechnologies/weaviate:1.27.4
    ports:
      - 8080:8080
      - 50051:50051
    restart: on-failure:0
    environment:
      OPENAI_APIKEY: $OPENAI_APIKEY
      QUERY_DEFAULTS_LIMIT: 25
      AUTHENTICATION_ANONYMOUS_ACCESS_ENABLED: 'true'
      PERSISTENCE_DATA_PATH: '/var/lib/weaviate'
      DEFAULT_VECTORIZER_MODULE: 'none'
      ENABLE_MODULES: 'text2vec-cohere,text2vec-huggingface,text2vec-openai,generative-openai,generative-cohere'
      BACKUP_FILESYSTEM_PATH: '/var/lib/weaviate/backups'
      CLUSTER_HOSTNAME: 'node1'
...
Create a Weaviate instance
From the directory containing docker-compose.yml, run the following command to start Weaviate:
docker compose up
Your Weaviate instance details
Once the instance is created, you can access it at http://localhost:8080.
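Weaviate exposes a readiness endpoint at /v1/.well-known/ready that returns HTTP 200 once the instance can serve requests. The following sketch (standard library only; the default URL assumes the port mapping from the docker-compose file above) polls that endpoint so you can confirm the instance is up before connecting:

```python
import urllib.error
import urllib.request


def weaviate_ready(base_url="http://localhost:8080", timeout=2):
    """Return True once the Weaviate instance answers its readiness probe."""
    try:
        with urllib.request.urlopen(
            f"{base_url}/v1/.well-known/ready", timeout=timeout
        ) as resp:
            return resp.status == 200
    except (urllib.error.URLError, OSError):
        # Connection refused or timed out: the instance is not (yet) ready.
        return False
```

Calling weaviate_ready() in a short retry loop right after docker compose up avoids connection errors while the container is still starting.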
Connect to your Weaviate instance
To connect to the Weaviate instance, use the connect_to_local function.
import weaviate
client = weaviate.connect_to_local()
Provide inference API keys
Some Weaviate modules can use inference APIs for vectorizing data or large language model integration. You can provide the API keys for these services to Weaviate at instantiation.
This course uses OpenAI, so you can provide the OpenAI API key to Weaviate through headers={"X-OpenAI-Api-Key": <YOUR_KEY>} as shown below:
import os

import weaviate

headers = {
    "X-OpenAI-Api-Key": os.getenv("OPENAI_APIKEY")  # Replace with your OpenAI API key
}
client = weaviate.connect_to_local(headers=headers)
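If the environment variable is not set, the snippet above would send an empty header value. One way to avoid that, sketched below, is to build the headers dict conditionally (OPENAI_APIKEY is the same variable name referenced in the docker-compose file above):

```python
import os

# OPENAI_APIKEY matches the environment variable used in docker-compose.yml.
api_key = os.getenv("OPENAI_APIKEY")

# Only send the header when a key is actually present; connecting still works
# without it, but queries that call OpenAI's inference API will need the key.
headers = {"X-OpenAI-Api-Key": api_key} if api_key else None
```

You can then pass headers to weaviate.connect_to_local(headers=headers) exactly as before; passing None simply omits the extra headers.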
Questions and feedback
If you have any questions or feedback, let us know in the user forum.