This page describes how to run Weaviate from source (git checkout / tarball) locally.
You can find the source code at the Weaviate repo.
Running from source
The fastest way to run Weaviate from source is to run the development server script:

tools/dev/run_dev_server.sh <configuration>

where <configuration> is one of the server configuration ($CONFIG) values in /tools/dev/run_dev_server.sh. For example, you can choose a configuration that runs the server locally with the OpenAI module enabled.
The default configuration is local-development, which runs the server locally with a default set of modules.
You can also create your own configuration. For instance, clone an existing entry (local-all-openai-cohere-palm is a good starting point) and add the required environment variables.
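As a sketch, a custom entry would add a case to the script's configuration dispatch and export the module variables it needs. The entry name local-my-openai and the exact variable values below are hypothetical examples, not taken from the script:

```shell
#!/usr/bin/env bash
# Hypothetical sketch of a custom $CONFIG entry, modeled on the existing
# cases in tools/dev/run_dev_server.sh. The name "local-my-openai" and the
# variable values are illustrative assumptions.
CONFIG=${1:-local-my-openai}

case $CONFIG in
  local-my-openai)
    # Enable only the modules this configuration needs.
    export ENABLE_MODULES="text2vec-openai"
    export DEFAULT_VECTORIZER_MODULE="text2vec-openai"
    ;;
esac

echo "CONFIG=$CONFIG modules=$ENABLE_MODULES"
```

Any environment variable the chosen modules require (API keys, endpoints, and so on) would be exported in the same case branch.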
Running with Docker
To run with Docker, start the Weaviate container and the container(s) for any additional services with tools/dev/restart_dev_environment.sh, then run the development server as described in the section above.
For example, the setup below uses Docker Compose to spin up Prometheus and Grafana instances. Those are pre-configured to scrape metrics from Weaviate. Using this setup, you can:
- access Weaviate on port 8080 (the default)
- access Grafana on port 3000 (its default)
- if necessary for debugging, access Prometheus directly on port 9090 (its default)
tools/dev/restart_dev_environment.sh --prometheus && tools/dev/run_dev_server.sh local-no-modules
Below are more examples of running Weaviate with Docker.
Transformers t2v only
tools/dev/restart_dev_environment.sh --transformers && tools/dev/run_dev_server.sh local-transformers
Contextionary t2v & Transformers QnA
tools/dev/restart_dev_environment.sh --qna && tools/dev/run_dev_server.sh local-qna
The above commands are subject to change as we add more modules and require specific combinations for local testing. You can always inspect restart_dev_environment.sh and run_dev_server.sh to see which options are available. Running either script without any arguments (the first option) is always guaranteed to work.
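When inspecting the scripts, one quick way to list the available configurations is to grep the dev-server script for its case labels. The local-* pattern below is an assumption about how the entries are named, and the path assumes you are in the repository root:

```shell
# List candidate $CONFIG values by grepping run_dev_server.sh for its
# "local-*" case labels. Pattern and path are assumptions; adjust as needed.
SCRIPT=tools/dev/run_dev_server.sh
if [ -f "$SCRIPT" ]; then
  grep -oE 'local-[a-z0-9-]+' "$SCRIPT" | sort -u
else
  echo "not in the weaviate repo root (missing $SCRIPT)" >&2
fi
```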
To make queries from a web interface, use the WCS console to connect to your local Weaviate instance.
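You can also verify from the command line that the server is up. The sketch below probes Weaviate's /v1/.well-known/ready readiness endpoint and assumes the default port 8080:

```shell
# Probe the readiness endpoint of a local dev server.
# Assumes the default port 8080; override with WEAVIATE_URL if needed.
BASE=${WEAVIATE_URL:-http://localhost:8080}
if curl -sf --max-time 2 "$BASE/v1/.well-known/ready" >/dev/null 2>&1; then
  WEAVIATE_STATUS=ready
else
  WEAVIATE_STATUS=unreachable
fi
echo "Weaviate at $BASE is $WEAVIATE_STATUS"
```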
For additional information, try these sources.