`m serve` runs any Mellea program as an OpenAI-compatible chat endpoint. This lets
any LLM client — LangChain, the OpenAI SDK, `curl` — call your Mellea program as if
it were a model.
Prerequisites: `pip install mellea`.
## The `serve()` function
Your program must define a `serve()` function.
`m serve` loads your file, finds `serve()`, and routes incoming requests to it.
`ChatMessage` has `role` and `content` fields matching the OpenAI chat format.
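The exact signature is not reproduced here, but based on the `ChatMessage` description above, a minimal `serve()` plausibly looks like the sketch below. The `ChatMessage` class in the sketch is a local stand-in (the real one comes from the mellea library) so the example is self-contained:

```python
from dataclasses import dataclass


# Stand-in for mellea's ChatMessage: role + content, per the OpenAI chat format.
@dataclass
class ChatMessage:
    role: str  # e.g. "system", "user", or "assistant"
    content: str


def serve(input: ChatMessage) -> ChatMessage:
    """Handle one chat turn; m serve routes each incoming request here."""
    return ChatMessage(role="assistant", content=f"echo: {input.content}")
```

Whatever the reply logic, `m serve` packages the returned message back into an OpenAI-style chat completion response.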
## Example serve program
`ChatContext` maintains conversation history across turns.
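A sketch of a stateful serve program follows, again with self-contained stand-ins for `ChatMessage` and `ChatContext` (the real classes come from mellea, and a real program would generate the reply with a mellea session and backend rather than string formatting):

```python
from dataclasses import dataclass, field


@dataclass
class ChatMessage:
    role: str
    content: str


# Stand-in for mellea's ChatContext: accumulates the turn history.
@dataclass
class ChatContext:
    messages: list[ChatMessage] = field(default_factory=list)


ctx = ChatContext()  # module-level, so history survives across requests


def serve(input: ChatMessage) -> ChatMessage:
    """Record the incoming message, reply, and record the reply too."""
    ctx.messages.append(input)
    reply = ChatMessage(
        role="assistant",
        content=f"(turn {len(ctx.messages)}) you said: {input.content}",
    )
    ctx.messages.append(reply)
    return reply
```

Keeping the context at module level is what lets consecutive requests see the same history while the server process runs.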
## Starting `m serve`
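Point `m serve` at your program file, for instance the repository example referenced at the end of this page. The `--port` flag shown is an assumption, not a confirmed option; check `m serve --help` for the actual flags:

```shell
# Serve the example program from the Mellea repo.
# --port is an assumption -- consult `m serve --help` for real options.
m serve docs/examples/m_serve/m_serve_example_simple.py --port 8000
```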
The server exposes:

- `POST /v1/chat/completions` — OpenAI-compatible chat completions endpoint
- `GET /health` — health check
## Calling the served endpoint
Any OpenAI-compatible client can call the endpoint, `curl` included.
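A `curl` call might look like this; the port and the `model` value are assumptions, so match them to however you started your server:

```shell
curl -s http://localhost:8000/v1/chat/completions \
  -H "Content-Type: application/json" \
  -d '{
        "model": "mellea",
        "messages": [{"role": "user", "content": "Hello!"}]
      }'
```

The response follows the standard OpenAI chat completion shape, so existing client code can parse it unchanged.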
Full example source: `docs/examples/m_serve/m_serve_example_simple.py`.
See also: Context and Sessions | Backends and Configuration