
Ollama + Weaviate

The Ollama library allows you to easily run a wide range of models on your own device. Weaviate seamlessly integrates with the Ollama library, allowing users to leverage compatible models directly within the Weaviate database.

These integrations empower developers to build sophisticated AI-driven applications with ease.

Integrations with Ollama

Weaviate integrates with compatible Ollama models by accessing the locally hosted Ollama API.

Embedding models for semantic search

[Figure: Embedding integration illustration]

Ollama's embedding models transform text data into high-dimensional vector representations, capturing semantic meaning and context.

Weaviate integrates with Ollama's embedding models to enable seamless vectorization of data. This integration allows users to perform semantic and hybrid search operations without the need for additional preprocessing or data transformation steps.
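As a minimal sketch using the v4 Python client: the collection name, the nomic-embed-text model, and the endpoint below are illustrative assumptions, not fixed requirements. It configures a collection to vectorize text with a locally served Ollama embedding model, then runs a semantic search:

```python
import weaviate
from weaviate.classes.config import Configure

client = weaviate.connect_to_local()

# Create a collection that vectorizes text with a local Ollama embedding model.
# If Weaviate runs in Docker, the endpoint is typically
# http://host.docker.internal:11434 instead of localhost.
client.collections.create(
    name="Article",  # illustrative collection name
    vectorizer_config=Configure.Vectorizer.text2vec_ollama(
        api_endpoint="http://localhost:11434",
        model="nomic-embed-text",  # any embedding model pulled into Ollama
    ),
)

articles = client.collections.get("Article")
articles.data.insert({"title": "Running language models on your own device"})

# Semantic search: the query text is vectorized by the same Ollama model.
response = articles.query.near_text(query="local AI models", limit=2)
for obj in response.objects:
    print(obj.properties)

client.close()
```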

See the Ollama embedding integration page for configuration details.

Generative AI models for RAG

[Figure: Single prompt RAG integration generates individual outputs per search result]

Ollama's generative AI models can generate human-like text based on given prompts and contexts.

Weaviate's generative AI integration enables users to perform retrieval augmented generation (RAG) directly within the Weaviate database. This combines Weaviate's efficient storage and fast retrieval capabilities with Ollama's generative AI models to generate personalized and context-aware responses.
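A hedged sketch of single-prompt RAG under the same assumptions (the collection name, the llama3 model, the title property, and the endpoint are all illustrative), where one output is generated per retrieved object:

```python
import weaviate
from weaviate.classes.config import Configure

client = weaviate.connect_to_local()

# Configure the collection with both an Ollama embedding model (for retrieval)
# and an Ollama generative model (for the generation step of RAG).
client.collections.create(
    name="Article",
    vectorizer_config=Configure.Vectorizer.text2vec_ollama(
        api_endpoint="http://localhost:11434",
        model="nomic-embed-text",
    ),
    generative_config=Configure.Generative.ollama(
        api_endpoint="http://localhost:11434",
        model="llama3",  # stands in for any generative model pulled into Ollama
    ),
)

articles = client.collections.get("Article")

# Single-prompt RAG: one generated output per search result.
# {title} interpolates a property of each object (hypothetical property name).
response = articles.generate.near_text(
    query="local AI models",
    limit=2,
    single_prompt="Summarize this in one sentence: {title}",
)
for obj in response.objects:
    print(obj.generated)

client.close()
```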

See the Ollama generative AI integration page for configuration details.


These integrations enable developers to leverage powerful Ollama models directly within Weaviate.

In turn, they simplify and speed up the process of building AI-driven applications, so that you can focus on creating innovative solutions.

Get started

These integrations require a locally hosted Weaviate instance that can reach your Ollama API endpoint, since Ollama serves its models on your own device.
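For example, assuming Weaviate is running locally on its default ports, a quick readiness check with the Python client might look like this (the connection defaults are assumptions, not fixed requirements):

```python
import weaviate

# Connect to a locally hosted Weaviate instance
# (defaults to HTTP on localhost:8080 and gRPC on 50051).
client = weaviate.connect_to_local()
print(client.is_ready())  # True once the instance is up
client.close()
```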

Go to the relevant integration page to learn how to configure Weaviate with the Ollama models and start using them in your applications.

If you have any questions or feedback, let us know in the user forum.