**Deprecated:** The native WatsonX backend is deprecated since v0.4. Use the LiteLLM or OpenAI backend with a WatsonX-compatible endpoint instead.

The WatsonX backend connects to IBM's managed AI platform. It requires an API key, a project ID, and a service URL.

Prerequisites: `pip install 'mellea[watsonx]'` and IBM Cloud credentials.
## Credentials

- **API key:** IBM Cloud IAM
- **Project ID:** your Watson Studio project settings
- **URL:** region-specific endpoint (e.g., `https://us-south.ml.cloud.ibm.com`)
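One common way to supply these values is through environment variables. The variable names below are illustrative assumptions, not names defined by this page; check your configuration for what your setup actually reads.

```shell
# Assumed variable names -- adjust to match your configuration.
export WATSONX_API_KEY="<your IBM Cloud IAM API key>"
export WATSONX_PROJECT_ID="<your Watson Studio project ID>"
export WATSONX_URL="https://us-south.ml.cloud.ibm.com"
```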
## Connecting
The quickest path is `start_session()` with `backend_name="watsonx"`:
Alternatively, construct the backend directly for full control:
## Available models

| `model_ids` constant | WatsonX model name | Notes |
|---|---|---|
| `IBM_GRANITE_4_HYBRID_SMALL` | `ibm/granite-4-h-small` | Default WatsonX model |
| `IBM_GRANITE_3_3_8B` | `ibm/granite-3-3-8b-instruct` | |
| `IBM_GRANITE_3_2_8B` | `ibm/granite-3-2b-instruct` | |

Additional constants are defined in `model_ids`.
## Troubleshooting

- **Missing credentials:** verify that the API key, project ID, and URL are all provided.
- **`pip install "mellea[watsonx]"` required:** the WatsonX backend depends on the `ibm-watson-machine-learning` package, which is not installed by default.
## Vision support

Note: `WatsonxAIBackend` does not currently support image input. Passing `images=[...]` to `instruct()` or `chat()` will raise an error. Use the OpenAI backend or Ollama for vision tasks.
See also: Backends and Configuration