
Embedding Providers

Embedding providers convert text into vector representations for semantic search.

OpenAI

Industry-standard embeddings.

Installation

pip install remina-memory[openai]

Configuration

"embedder": {
    "provider": "openai",
    "config": {
        "api_key": None,  # Uses OPENAI_API_KEY env
        "model": "text-embedding-3-small",
        "dimensions": 1536,
        "base_url": None,
    }
}
| Option | Type | Default | Description |
|---|---|---|---|
| api_key | str | env var | OpenAI API key |
| model | str | text-embedding-3-small | Model name |
| dimensions | int | 1536 | Output dimensions |
| base_url | str | None | Custom endpoint |

Models

| Model | Dimensions | Cost |
|---|---|---|
| text-embedding-3-small | 1536 | $0.02/1M tokens |
| text-embedding-3-large | 3072 | $0.13/1M tokens |
| text-embedding-ada-002 | 1536 | $0.10/1M tokens |
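Cost scales linearly with token count, so a quick estimate is tokens divided by one million, times the per-million rate from the table above. A small sketch (the function name is illustrative, not part of remina-memory):

```python
# Price per 1M tokens in USD, taken from the table above.
PRICE_PER_M = {
    "text-embedding-3-small": 0.02,
    "text-embedding-3-large": 0.13,
    "text-embedding-ada-002": 0.10,
}

def embedding_cost(model: str, tokens: int) -> float:
    """Estimated USD cost of embedding `tokens` tokens with `model`."""
    return tokens / 1_000_000 * PRICE_PER_M[model]

# Embedding a 10M-token corpus with text-embedding-3-small:
print(f"${embedding_cost('text-embedding-3-small', 10_000_000):.2f}")  # $0.20
```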

Google Gemini

High-quality embeddings with a free tier.

Installation

pip install remina-memory[gemini]

Configuration

"embedder": {
    "provider": "gemini",
    "config": {
        "api_key": None,  # Uses GOOGLE_API_KEY env
        "model": "models/text-embedding-004",
        "dimensions": 768,
    }
}
| Option | Type | Default | Description |
|---|---|---|---|
| api_key | str | env var | Google API key |
| model | str | models/text-embedding-004 | Model name |
| dimensions | int | 768 | Output dimensions |

Cohere

Enterprise-grade embeddings with multilingual support.

Installation

pip install remina-memory[cohere]

Configuration

"embedder": {
    "provider": "cohere",
    "config": {
        "api_key": None,  # Uses COHERE_API_KEY env
        "model": "embed-english-v3.0",
    }
}

Models

| Model | Dimensions | Languages |
|---|---|---|
| embed-english-v3.0 | 1024 | English |
| embed-multilingual-v3.0 | 1024 | 100+ |

Ollama (Local)

Local embeddings without API costs.

Installation

pip install remina-memory[ollama]

Configuration

"embedder": {
    "provider": "ollama",
    "config": {
        "base_url": "http://localhost:11434",
        "model": "nomic-embed-text",
    }
}

Setup

curl -fsSL https://ollama.com/install.sh | sh
ollama pull nomic-embed-text

Models

| Model | Dimensions | Size |
|---|---|---|
| nomic-embed-text | 768 | 274 MB |
| mxbai-embed-large | 1024 | 670 MB |
| all-minilm | 384 | 45 MB |
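Under the hood the provider talks to Ollama's HTTP API. A standalone sketch using only the standard library, assuming the legacy /api/embeddings route (which takes "model" and "prompt") and the server from the Setup step running on the default port:

```python
import json
import urllib.request

# Request body for Ollama's embeddings endpoint.
payload = {"model": "nomic-embed-text", "prompt": "hello world"}
req = urllib.request.Request(
    "http://localhost:11434/api/embeddings",
    data=json.dumps(payload).encode(),
    headers={"Content-Type": "application/json"},
)
try:
    with urllib.request.urlopen(req, timeout=5) as resp:
        embedding = json.loads(resp.read())["embedding"]
        print(len(embedding))  # 768 for nomic-embed-text
except OSError:
    print("Ollama server not reachable; run the Setup step first")
```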

HuggingFace (Local)

Run any HuggingFace embedding model locally.

Installation

pip install remina-memory[huggingface]

Configuration

"embedder": {
    "provider": "huggingface",
    "config": {
        "model": "sentence-transformers/all-MiniLM-L6-v2",
        "device": "cpu",
    }
}

Models

| Model | Dimensions | Size |
|---|---|---|
| all-MiniLM-L6-v2 | 384 | 80 MB |
| all-mpnet-base-v2 | 768 | 420 MB |
| e5-large-v2 | 1024 | 1.3 GB |

Embedding Interface

All embedding providers implement:

from abc import ABC, abstractmethod
from typing import List

class EmbeddingBase(ABC):
    @abstractmethod
    def embed(self, text: str) -> List[float]: ...

    @abstractmethod
    def embed_batch(self, texts: List[str]) -> List[List[float]]: ...

    @property
    @abstractmethod
    def dimensions(self) -> int: ...

    @property
    @abstractmethod
    def model_name(self) -> str: ...
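A custom provider only needs to satisfy this contract. As a self-contained sketch, here is a deterministic hash-based embedder: useful for tests, but its vectors carry no semantic meaning, and the class name and dimension count are illustrative, not part of remina-memory:

```python
import hashlib
from abc import ABC, abstractmethod
from typing import List

class EmbeddingBase(ABC):  # the interface shown above
    @abstractmethod
    def embed(self, text: str) -> List[float]: ...
    @abstractmethod
    def embed_batch(self, texts: List[str]) -> List[List[float]]: ...
    @property
    @abstractmethod
    def dimensions(self) -> int: ...
    @property
    @abstractmethod
    def model_name(self) -> str: ...

class HashEmbedder(EmbeddingBase):
    """Deterministic toy embedder for tests; NOT semantically meaningful."""

    def __init__(self, dims: int = 8):
        self._dims = dims

    def embed(self, text: str) -> List[float]:
        digest = hashlib.sha256(text.encode()).digest()
        # Map the first `dims` digest bytes to floats in [0, 1).
        return [b / 256 for b in digest[: self._dims]]

    def embed_batch(self, texts: List[str]) -> List[List[float]]:
        return [self.embed(t) for t in texts]

    @property
    def dimensions(self) -> int:
        return self._dims

    @property
    def model_name(self) -> str:
        return "hash-toy"

emb = HashEmbedder()
print(len(emb.embed("hello")))  # 8
```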

Selection Guide

| Requirement | Recommendation |
|---|---|
| Best quality | OpenAI text-embedding-3-large |
| Quality + free tier | Gemini |
| Multilingual | Cohere embed-multilingual-v3.0 |
| No API costs | Ollama or HuggingFace |
| Privacy/on-premise | Ollama or HuggingFace |