ResNet Image Vectorizer
For new projects, we recommend using the Transformers multi-modal integration module instead of img2vec-neural. It uses CLIP models, which have a more modern architecture than the resnet models used in img2vec-neural. CLIP models are also multi-modal, meaning they can handle both images and text, making them applicable to a wider range of use cases.
The img2vec-neural module enables Weaviate to obtain vectors locally from images using a resnet50 model.
img2vec-neural encapsulates the model in a Docker container, which allows independent scaling on GPU-enabled hardware while keeping Weaviate on CPU-only hardware, as Weaviate is CPU-optimized.
Key notes:
- This module is not available on Weaviate Cloud (WCD).
- Enabling this module will enable the nearImage search operator.
- The model is encapsulated in a Docker container.
- This module is not compatible with Auto-schema. You must define your classes manually as shown below.
Weaviate instance configuration
This module is not available on Weaviate Cloud.
Docker Compose file
To use img2vec-neural
, you must enable it in your Docker Compose file (e.g. docker-compose.yml
).
While you can do so manually, we recommend using the Weaviate configuration tool to generate the Docker Compose
file.
Parameters
Weaviate:
- ENABLE_MODULES (Required): The modules to enable. Include img2vec-neural to enable the module.
- DEFAULT_VECTORIZER_MODULE (Optional): The default vectorizer module. You can set this to img2vec-neural to make it the default for all classes.
- IMAGE_INFERENCE_API (Required): The URL of the inference container.
Inference container:
- image (Required): The image name of the inference container (e.g. semitechnologies/img2vec-pytorch:resnet50 or semitechnologies/img2vec-keras:resnet50).
Example
This configuration enables img2vec-neural, sets it as the default vectorizer, and sets the parameters for the Docker container, including using the img2vec-pytorch:resnet50 image.
services:
  weaviate:
    image: cr.weaviate.io/semitechnologies/weaviate:1.28.0
    restart: on-failure:0
    ports:
      - 8080:8080
      - 50051:50051
    environment:
      QUERY_DEFAULTS_LIMIT: 20
      AUTHENTICATION_ANONYMOUS_ACCESS_ENABLED: 'true'
      PERSISTENCE_DATA_PATH: "./data"
      ENABLE_MODULES: 'img2vec-neural'
      DEFAULT_VECTORIZER_MODULE: 'img2vec-neural'
      IMAGE_INFERENCE_API: "http://i2v-neural:8080"
      CLUSTER_HOSTNAME: 'node1'
  i2v-neural:
    image: cr.weaviate.io/semitechnologies/img2vec-pytorch:resnet50
...
Alternative: Run a separate container
As an alternative, you can run the inference container independently from Weaviate. To do so, you can:
- Enable img2vec-neural in your Docker Compose file,
- Omit the img2vec-neural inference container parameters,
- Run the inference container separately, e.g. using Docker, and
- Set IMAGE_INFERENCE_API to the URL of the inference container.
Then, if Weaviate is running outside of Docker for example, set IMAGE_INFERENCE_API="http://localhost:8000". Alternatively, if Weaviate is part of the same Docker network, e.g. because both are defined in the same docker-compose.yml file, you can use Docker networking/DNS, such as IMAGE_INFERENCE_API=http://i2v-neural:8080.
For example, you can spin up an inference container with the following command:
docker run -itp "8000:8080" semitechnologies/img2vec-neural:resnet50-61dcbf8
Class configuration
You can configure how the module behaves in each class through the Weaviate schema.
Vectorization settings
You can set vectorizer behavior using the moduleConfig
section under each class and property:
Class-level
- vectorizer - what module to use to vectorize the data.
- imageFields - property names for images to be vectorized.

Property-level
- dataType - the data type of the property. For use in imageFields, it must be set to blob.
Example
The following example class definition sets the img2vec-neural module as the vectorizer for the class FashionItem. It also sets the image property as a blob data type and as the image field.
{
  "classes": [
    {
      "class": "FashionItem",
      "description": "Each example is a 28x28 grayscale image, associated with a label from 10 classes.",
      "vectorizer": "img2vec-neural",
      "moduleConfig": {
        "img2vec-neural": {
          "imageFields": [
            "image"
          ]
        }
      },
      "properties": [
        {
          "dataType": [
            "blob"
          ],
          "description": "Grayscale image",
          "name": "image"
        },
        {
          "dataType": [
            "number"
          ],
          "description": "Label number for the given image.",
          "name": "labelNumber"
        },
        {
          "dataType": [
            "text"
          ],
          "description": "Label name (description) of the given image.",
          "name": "labelName"
        }
      ]
    }
  ]
}
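As a sketch of how this definition could be applied programmatically, the class object can be created with the Python client (v3-style syntax, matching the query example later on this page). The URL and class body below are assumptions based on the example above:

```python
# Class definition mirroring the JSON example above.
fashion_item_class = {
    "class": "FashionItem",
    "vectorizer": "img2vec-neural",
    "moduleConfig": {
        "img2vec-neural": {"imageFields": ["image"]}
    },
    "properties": [
        {"name": "image", "dataType": ["blob"], "description": "Grayscale image"},
        {"name": "labelNumber", "dataType": ["number"]},
        {"name": "labelName", "dataType": ["text"]},
    ],
}

def create_fashion_item_class(url: str = "http://localhost:8080") -> None:
    """Create the class on a running Weaviate instance with img2vec-neural enabled."""
    import weaviate  # pip install "weaviate-client>=3,<4"
    client = weaviate.Client(url)
    client.schema.create_class(fashion_item_class)
```

Note that auto-schema is not compatible with this module, so defining the class explicitly like this is required.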
blob properties must contain base64-encoded data.

Adding blob data objects
Any blob property type data must be base64 encoded. To obtain the base64-encoded value of an image, you can use the helper methods in the Weaviate clients or run the following command:
cat my_image.png | base64
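The same encoding can be done in Python with the standard library. The file path, property names, and client call below are illustrative placeholders:

```python
import base64

def image_to_base64(path: str) -> str:
    """Read an image file and return its contents as a base64 string."""
    with open(path, "rb") as f:
        return base64.b64encode(f.read()).decode("utf-8")

# Hypothetical insertion with the Python client (v3 syntax); requires
# a running Weaviate instance with img2vec-neural enabled:
# import weaviate
# client = weaviate.Client("http://localhost:8080")
# client.data_object.create(
#     {"image": image_to_base64("my_image.png"), "labelName": "T-shirt"},
#     "FashionItem",
# )
```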
Additional search operator
The img2vec-neural
vectorizer module will enable the nearImage
search operator.
Usage example
NearImage
- Python
- JS/TS Client v2
- Go
- Java
- Curl
- GraphQL
import weaviate
client = weaviate.Client("http://localhost:8080")
nearImage = {"image": "/9j/4AAQSkZJRgABAgE..."}
result = (
client.query
.get("FashionItem", "image")
.with_near_image(nearImage)
.do()
)
print(result)
import weaviate from 'weaviate-ts-client';
const client = weaviate.client({
scheme: 'http',
host: 'localhost:8080',
});
const response = await client.graphql
.get()
.withClassName('FashionItem')
.withFields('image')
.withNearImage({ image: '/9j/4AAQSkZJRgABAgE...' })
.do();
console.log(JSON.stringify(response, null, 2));
package main
import (
"context"
"fmt"
"github.com/weaviate/weaviate-go-client/v4/weaviate"
"github.com/weaviate/weaviate-go-client/v4/weaviate/graphql"
)
func main() {
cfg := weaviate.Config{
Host: "localhost:8080",
Scheme: "http",
}
client, err := weaviate.NewClient(cfg)
if err != nil {
panic(err)
}
className := "FashionItem"
image := graphql.Field{Name: "image"}
nearImage := client.GraphQL().NearImageArgBuilder().WithImage("/9j/4AAQSkZJRgABAgE...")
ctx := context.Background()
result, err := client.GraphQL().Get().
WithClassName(className).
WithFields(image).
WithNearImage(nearImage).
Do(ctx)
if err != nil {
panic(err)
}
fmt.Printf("%v", result)
}
package io.weaviate;
import io.weaviate.client.Config;
import io.weaviate.client.WeaviateClient;
import io.weaviate.client.base.Result;
import io.weaviate.client.v1.graphql.model.GraphQLResponse;
import io.weaviate.client.v1.graphql.query.argument.NearImageArgument;
import io.weaviate.client.v1.graphql.query.fields.Field;
public class App {
public static void main(String[] args) {
Config config = new Config("http", "localhost:8080");
WeaviateClient client = new WeaviateClient(config);
String className = "FashionItem";
Field image = Field.builder().name("image").build();
NearImageArgument nearImage = client.graphQL().arguments().nearImageArgBuilder()
.image("/9j/4AAQSkZJRgABAgE...")
.build();
Result<GraphQLResponse> result = client.graphQL().get()
.withClassName(className)
.withFields(image)
.withNearImage(nearImage)
.run();
if (result.hasErrors()) {
System.out.println(result.getError());
return;
}
System.out.println(result.getResult());
}
}
echo '{
  "query": "{
    Get {
      FashionItem(nearImage: {
        image: \"/9j/4AAQSkZJRgABAgE...\"
      }) {
        image
      }
    }
  }"
}' | curl \
  -X POST \
  -H 'Content-Type: application/json' \
  -d @- \
  http://localhost:8080/v1/graphql
{
Get {
FashionItem(nearImage: {
image: "/9j/4AAQSkZJRgABAgE..."
}) {
image
}
}
}
About the model
resnet50
is a residual convolutional neural network with 25.5 million parameters trained on more than a million images from the ImageNet database. As the name suggests, it has a total of 50 layers: 48 convolution layers, 1 MaxPool layer and 1 Average Pool layer.
Available img2vec-neural models
There are two different inference models you can choose from. Depending on your machine (arm64 or other) and whether you prefer multi-threading to extract feature vectors, you can choose between keras and pytorch. There are no other differences between the two models.
resnet50 (keras):
- Supports amd64, but not arm64.
- Does not currently support CUDA.
- Supports multi-threaded inference.

resnet50 (pytorch):
- Supports both amd64 and arm64.
- Supports CUDA.
- Does not support multi-threaded inference.
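In the Docker Compose file, the choice between the two comes down to which inference image the i2v-neural service uses. The ENABLE_CUDA variable shown here follows the convention used by Weaviate's inference containers for toggling GPU use and applies only to the pytorch variant; treat it as an assumption to verify against your container's documentation:

```yaml
  i2v-neural:
    # Swap to img2vec-keras:resnet50 for the keras variant (amd64 only).
    image: cr.weaviate.io/semitechnologies/img2vec-pytorch:resnet50
    environment:
      ENABLE_CUDA: '1'  # requires a GPU-enabled host; omit for CPU-only
```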
Model license(s)
The img2vec-neural
module uses the resnet50
model.
It is your responsibility to evaluate whether the terms of its license(s), if any, are appropriate for your intended use.
Questions and feedback
If you have any questions or feedback, let us know in the user forum.