Prerequisites: Python 3.11+, pip or uv available.

Install

pip install mellea
uv add mellea

Optional extras

Install extras for specific backends and features:
pip install "mellea[litellm]"    # LiteLLM multi-provider (Anthropic, Bedrock, etc.)
pip install "mellea[hf]"         # HuggingFace transformers for local inference
pip install "mellea[watsonx]"    # IBM WatsonX
pip install "mellea[tools]"      # Tool and agent dependencies (LangChain, smolagents)
pip install "mellea[telemetry]"  # OpenTelemetry tracing and metrics
uv add "mellea[litellm]"        # LiteLLM multi-provider (Anthropic, Bedrock, etc.)
uv add "mellea[hf]"             # HuggingFace transformers for local inference
uv add "mellea[watsonx]"        # IBM WatsonX
uv add "mellea[tools]"          # Tool and agent dependencies (LangChain, smolagents)
uv add "mellea[telemetry]"      # OpenTelemetry tracing and metrics
You can combine extras:
pip install "mellea[litellm,tools,telemetry]"
uv add "mellea[litellm,tools,telemetry]"
To install everything at once, use the all extra: pip install "mellea[all]". For the full list of available extras, see pyproject.toml.
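Because each extra only pulls in optional dependencies, you can probe which ones are present in your environment by checking importability. The module names below are assumptions about each extra's main dependency, not an official mapping:

```python
import importlib.util

# Assumed import name of each extra's primary dependency.
EXTRAS = {
    "litellm": "litellm",
    "hf": "transformers",
    "watsonx": "ibm_watsonx_ai",
    "tools": "langchain",
    "telemetry": "opentelemetry",
}

# An extra counts as available if its main dependency is importable.
available = {extra for extra, mod in EXTRAS.items()
             if importlib.util.find_spec(mod) is not None}
print("available extras:", sorted(available))
```

This kind of check is handy in code that should degrade gracefully when an optional backend is not installed.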

Default backend: Ollama

The default session connects to an Ollama instance running locally. Install and start Ollama, then pull the default model before running any examples:
ollama pull granite4:micro
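Before running examples, you can verify that Ollama is reachable and that the model has been pulled. This sketch assumes Ollama's default local API port (11434) and its /api/tags endpoint, which lists pulled models:

```python
import json
import urllib.error
import urllib.request

# Ollama serves a local HTTP API on port 11434 by default.
OLLAMA_URL = "http://localhost:11434/api/tags"

def ollama_models(url: str = OLLAMA_URL) -> list[str]:
    """Return the names of models pulled into a local Ollama, or [] if unreachable."""
    try:
        with urllib.request.urlopen(url, timeout=2) as resp:
            data = json.load(resp)
    except (urllib.error.URLError, OSError):
        return []  # Ollama is not running or not reachable
    return [m["name"] for m in data.get("models", [])]

models = ollama_models()
print("granite4 pulled:", any(m.startswith("granite4") for m in models))
```

If this prints False, start Ollama and run the ollama pull command above before trying the examples.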