- Tool bridging — wrap existing LangChain tools as `MelleaTool` objects and pass them to any `MelleaSession` call.
- Message history — seed a Mellea `ChatContext` with conversation history from a LangChain session.
## Using LangChain tools
Prerequisites: `pip install langchain-core` (or `pip install langchain-community` for community tools).
`MelleaTool.from_langchain()` wraps any LangChain `BaseTool` so it can be passed to `instruct()` or `chat()` via `ModelOption.TOOLS`:
`from_langchain()` reads the tool's name and schema directly from the `BaseTool` instance, so any tool that follows the LangChain `BaseTool` interface works without further configuration.
Backend note: Tool calling requires a backend and model that support function calling (e.g., Ollama with `granite4:micro`, OpenAI with `gpt-4o`). The default Ollama setup supports this.
## Seeding a session with LangChain message history
When migrating from LangChain or building a system that spans both libraries, you may want to start a Mellea session from an existing LangChain conversation. Mellea uses explicit `ChatContext` objects; the bridge is to convert LangChain messages to OpenAI format first, then build the context:
`convert_to_openai_messages` normalises all LangChain message subtypes (system, human, AI, tool) into `{"role": ..., "content": ...}` dicts. Any library that exports to OpenAI chat format — LlamaIndex, Haystack, Semantic Kernel — works with the same pattern.
Full example: `docs/examples/library_interop/langchain_messages.py`
## Which approach to use
| Scenario | Use |
|---|---|
| Your tool exists as a LangChain `BaseTool` | `MelleaTool.from_langchain(tool)` |
| Your tool exists as a smolagents `Tool` | `MelleaTool.from_smolagents(tool)` |
| You have a plain Python function to expose | `@tool` decorator |
| You have LangChain message history to continue | `convert_to_openai_messages` → `ChatContext` |
| You want Mellea as an OpenAI endpoint for another framework | `m serve` |
See also: Tools and Agents | Context and Sessions