The AI-Native Vector Database

Purpose-built for a new breed of software applications.

Weaviate is an open-source vector database that simplifies the development of AI applications. Built-in vector and hybrid search, easy-to-connect machine learning models, and a focus on data privacy enable developers of all levels to build, iterate, and scale AI capabilities faster.

Bring the power of AI to more developers

Help developers build and scale AI-powered applications more easily with an open-source, developer-friendly platform and ecosystem.

Easily connect to popular ML models

Build and iterate faster with integrations for 20+ ML models and frameworks. Quickly adopt and test new models as the ecosystem evolves.

Get the best of vector and keyword search

Improve semantic understanding and accuracy to deliver better insights. Leverage both vector search and BM25 keyword search without any extra overhead.
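The idea of combining both result sets can be sketched with an alpha-weighted score fusion: normalize each method's scores, then blend them with a weight `alpha` (1.0 is pure vector search, 0.0 is pure keyword search). This is a toy illustration, not Weaviate's internal implementation.

```python
# Toy alpha-weighted hybrid fusion (illustrative sketch only, not
# Weaviate's actual fusion code). Scores from each method are
# min-max normalized, then blended per document.

def normalize(scores):
    lo, hi = min(scores.values()), max(scores.values())
    if hi == lo:
        return {doc: 1.0 for doc in scores}
    return {doc: (s - lo) / (hi - lo) for doc, s in scores.items()}

def hybrid_fuse(vector_scores, bm25_scores, alpha=0.5):
    v, k = normalize(vector_scores), normalize(bm25_scores)
    docs = set(v) | set(k)
    fused = {d: alpha * v.get(d, 0.0) + (1 - alpha) * k.get(d, 0.0)
             for d in docs}
    return sorted(fused.items(), key=lambda x: x[1], reverse=True)

# Hypothetical scores: cosine similarities and raw BM25 values.
vector_scores = {"doc1": 0.91, "doc2": 0.72, "doc3": 0.55}
bm25_scores = {"doc2": 12.4, "doc4": 9.1, "doc1": 3.2}
ranking = hybrid_fuse(vector_scores, bm25_scores, alpha=0.5)
print(ranking[0][0])  # doc2 ranks first: strong in both methods
```

A document that scores moderately well in both searches (doc2) can outrank one that dominates only a single method, which is the practical payoff of hybrid retrieval.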

Run securely, flexibly, and reliably—at scale

Run where you want, how you want. Weaviate is available as a self-hosted database, a managed service, or a Kubernetes package in your VPC.

Learn more about Trust and Security with Weaviate

Feature Overview

Built-in hybrid search

Merge different search algorithms and re-rank results accordingly.

Advanced filtering

Apply complex filters across large datasets in milliseconds. 

Out-of-the-box RAG

Use proprietary data to securely interact with ML models.
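The retrieval-augmented generation (RAG) pattern behind this feature can be sketched in a few lines: retrieve the passages most relevant to a question, then assemble them into a grounded prompt for a model. The corpus, retriever, and prompt template below are illustrative assumptions; in Weaviate, retrieval and generation run inside the database with a connected model provider.

```python
# Minimal RAG sketch: retrieve, then build a grounded prompt.
# The keyword-overlap retriever is a stand-in for vector search.

def retrieve(query, corpus, top_k=2):
    q = set(query.lower().split())
    scored = sorted(corpus,
                    key=lambda doc: len(q & set(doc.lower().split())),
                    reverse=True)
    return scored[:top_k]

def build_prompt(query, passages):
    context = "\n".join(f"- {p}" for p in passages)
    return f"Answer using only this context:\n{context}\n\nQuestion: {query}"

corpus = [
    "Weaviate supports hybrid search combining vectors and BM25.",
    "Backups can run with zero downtime.",
    "Multi-tenancy isolates each tenant's data.",
]
passages = retrieve("hybrid search vectors", corpus)
prompt = build_prompt("How does hybrid search work?", passages)
print(prompt)
```

Because only the retrieved passages reach the model, proprietary data stays in the database and the model sees just the context needed to answer.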

Vectorizer modules

Easily generate new vector embeddings or bring your own.
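The "generate or bring your own" idea can be sketched as follows: some objects get a vector computed by an embedding function, others are stored with a user-supplied vector, and search treats both the same. The hash-based vectorizer and in-memory store here are illustrative stand-ins, not Weaviate APIs.

```python
# Sketch of "generate or bring your own" embeddings. The hash-based
# toy_vectorize is a deterministic stand-in for a real embedding model.
import hashlib
import math

def toy_vectorize(text, dim=8):
    # Deterministic pseudo-embedding derived from a SHA-256 digest.
    digest = hashlib.sha256(text.encode()).digest()
    return [b / 255 for b in digest[:dim]]

def cosine(a, b):
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return dot / (na * nb)

store = [
    # Vector generated by the (toy) vectorizer module:
    {"text": "hybrid search", "vector": toy_vectorize("hybrid search")},
    # Bring-your-own vector, e.g. precomputed by an external model:
    {"text": "backups", "vector": [0.1] * 8},
]
query_vec = toy_vectorize("hybrid search")
best = max(store, key=lambda o: cosine(query_vec, o["vector"]))
print(best["text"])
```

Search is indifferent to where a vector came from, which is what makes mixing generated and precomputed embeddings in one collection practical.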

Configurable backups

Back up as often as needed with zero downtime.

Native multi-tenancy

Scale horizontally and consume resources efficiently.

Vector index compression

Improve the memory footprint of large datasets.
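The memory saving can be illustrated with a toy scalar quantization: map each 32-bit float to an 8-bit code, cutting vector storage roughly 4x at the cost of a small reconstruction error. This is a simplified sketch; production vector index compression (such as product quantization) is more sophisticated.

```python
# Toy scalar quantization sketch: floats -> uint8 codes and back.
import array

def quantize(vec):
    lo, hi = min(vec), max(vec)
    scale = (hi - lo) / 255 or 1.0  # avoid div-by-zero for flat vectors
    codes = array.array("B", (round((x - lo) / scale) for x in vec))
    return codes, lo, scale

def dequantize(codes, lo, scale):
    return [lo + c * scale for c in codes]

vec = [0.12, -0.5, 0.33, 0.99]
codes, lo, scale = quantize(vec)
approx = dequantize(codes, lo, scale)
max_err = max(abs(a - b) for a, b in zip(vec, approx))
print(max_err)  # bounded by roughly half the quantization step
```

Each dimension now costs 1 byte instead of 4, and the maximum error is bounded by about half the quantization step, which is why compressed indexes can trade a little recall for a much smaller memory footprint.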

Tenant isolation

Ensure security with strict resource isolation.

20+ Ecosystem Integrations

Move faster with direct integration into the ML ecosystem.

“Weaviate's batteries-included approach, incorporating both model serving and multi-tenancy, has helped us quickly prototype and build our vector search at Stack.”

Constantine Kokkinos,
Stack Overflow