Integrations

Weaviate's integration ecosystem enables developers to build applications that combine Weaviate with other technologies.

All the notebooks and code examples are on Weaviate Recipes!

About the Categories

The ecosystem is divided into these categories:

  • Cloud Hyperscalers - Large-scale computing and storage
  • Compute Infrastructure - Run and scale containerized applications
  • Data Platforms - Data ingestion and web scraping
  • LLM Frameworks - Build generative AI applications
  • Operations - Tools for monitoring and analyzing generative AI workflows

List of Companies

| Category | Companies |
| --- | --- |
| Cloud Hyperscalers | AWS, Google |
| Compute Infrastructure | Modal, Replicate |
| Data Platforms | Aryn, Confluent Cloud, Context Data, Databricks, Firecrawl, Unstructured |
| LLM Frameworks | Composio, DSPy, Haystack, LangChain, LlamaIndex, Semantic Kernel |
| Operations | Arize, Langtrace, LangWatch, Nomic, Ragas, Weights & Biases |

Model Provider Integrations

Weaviate integrates with self-hosted and API-based embedding models from a range of providers.
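As an illustrative sketch, the snippet below configures a collection that vectorizes its objects with an API-based embedding model through Weaviate's OpenAI integration. It assumes the Weaviate Python client v4 and a locally running Weaviate instance; the collection name `Article` and the environment variable holding the API key are placeholders.

```python
import os
import weaviate
from weaviate.classes.config import Configure

# Connect to a local Weaviate instance. The provider API key is passed as a
# request header so Weaviate can call the embedding model on your behalf.
client = weaviate.connect_to_local(
    headers={"X-OpenAI-Api-Key": os.environ["OPENAI_API_KEY"]}  # placeholder env var
)

# Create a collection whose objects are embedded with an OpenAI model.
client.collections.create(
    name="Article",  # placeholder collection name
    vectorizer_config=Configure.Vectorizer.text2vec_openai(),
)

client.close()
```

Other model provider integrations follow the same pattern: choose the matching vectorizer module when creating the collection and supply the provider's API key (or point to a self-hosted model endpoint) in the connection configuration.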

Refer to the model providers documentation page for the full list of supported providers.