Integrating Odoo ERP Data with Natural Language Chat via a Modular MCP Agentic Framework

Modern businesses often struggle to get quick answers from their ERP systems using natural language. Odoo ERP, while powerful, doesn’t easily answer ad-hoc questions like “What is our billable utilization rate this week?” or “How much AWS funding is remaining for client XYZ?” using its native UI. These queries typically span multiple Odoo models (timesheets, projects, budgets, etc.) and require custom logic to join data. As a result, answering such questions usually involves manual report building or custom development. It’s a pain point for developers and analysts who want insights fast, in plain English, without stitching together spreadsheets or SQL queries.
Fortunately, recent advancements in agentic AI and the Model Context Protocol (MCP) offer a solution. By combining Odoo’s centralized data with a natural language chatbot powered by an LLM, we can let AI do the heavy lifting of data retrieval and computation. Generative AI integrated with Odoo can analyze real-time and historical data and answer complex business questions, enabling teams to “ask, explore, and act – all within a single ERP ecosystem,” reducing the need for multiple tools. In this post, we’ll explore a developer-centric approach to building such a solution using a modular MCP-based agent framework. We’ll break down the architecture, the key components (FastMCP, Odoo RPC, Open WebUI, AWS Bedrock, Claude 4), and illustrate how they work together to answer those tough questions in seconds.
The Challenge: Complex Questions Across Odoo Models
Odoo is an all-in-one platform with apps for everything from Projects to Accounting. However, its native reporting tools are siloed by module. A question like “What is our billable utilization rate this week?” isn’t answered by a single Odoo report – it requires combining data from the timesheets (hours logged by consultants), maybe a list of billable projects or tasks, and the working hours capacity of your team for the week. Similarly, “How much AWS funding is remaining for client XYZ?” might involve checking a custom model or project record where you track that client’s budget and subtracting the sum of AWS-related expenses or timesheets. In Odoo’s UI, you’d likely have to pull data from multiple screens or build a custom query to get these insights. This is time-consuming and often out of reach for a non-technical user.
The core difficulty is that cross-model queries and business-specific logic aren’t readily accessible via Odoo’s default interface. You can certainly develop custom Python methods or use Odoo’s reporting engine to calculate such metrics, but that requires developer time. Wouldn’t it be nicer if you could just ask Odoo in natural language and get the answer? That’s exactly what we aim to achieve by integrating Odoo with a chat-based AI assistant. By leveraging an LLM that knows how to use tools, we can dynamically fetch and join data from Odoo’s database on the fly. Next, let’s see how our solution is architected to enable this.
Solution Overview: An Agentic Chatbot for Odoo via MCP
Our approach uses an agentic AI framework – meaning the AI (a large language model) can act as an “agent,” invoking tools and code to gather information before responding. The integration is built on the Model Context Protocol (MCP), which “provides a standardized way for applications to connect AI models with data sources and tools — think of it like a USB-C port for AI applications”. In simple terms, MCP defines how an AI can discover and call external tools (like an Odoo data fetcher) in a structured way. We use FastMCP 2.0 as the orchestration layer to set up these tools and handle communication between the LLM and Odoo. FastMCP is a Pythonic framework for building MCP servers (agents) and clients – it makes it easy to define tools and expose them to an AI in a standardized format.
Here’s a high-level look at the components involved in our system:
FastMCP 2.0 – Orchestrating AI Agents and Tools
FastMCP 2.0 is the glue of our architecture. We create an MCP server with FastMCP that acts as the Odoo Data Agent. Within this server, we define a set of tools (functions) that the AI can use – for example, a tool to query timesheet records, another to sum up hours, another to fetch a client’s budget, etc. FastMCP handles packaging these functions with JSON schema metadata so that the LLM (Claude) knows how to invoke them correctly. The MCP server connects to Odoo (more on that next) and ensures each request from the AI is executed securely and returns a result. Essentially, FastMCP lets us modularize the logic – we could have one agent server for Odoo, and potentially others for different services, all accessible to the LLM. This modular design follows MCP’s philosophy of focused, purpose-built servers for each domain.
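To make this concrete, here is a minimal sketch of what such an agent server can look like with FastMCP 2.0. The connection details, the get_timesheet_hours tool, and the assumption that billable timesheet lines are the ones linked to a sale order line are all illustrative, not our exact production code:

```python
# Minimal sketch of an Odoo Data Agent built with FastMCP 2.0.
# Tool name, connection details, and the billable filter are illustrative assumptions.
from fastmcp import FastMCP
from odoo_rpc_client import Client

mcp = FastMCP("Odoo Data Agent")

# Connection details would come from configuration or environment variables in practice.
odoo = Client("odoo.example.com", dbname="production", user="api_user", pwd="secret")

@mcp.tool()
def get_timesheet_hours(start_date: str, end_date: str, billable_only: bool = False) -> float:
    """Total timesheet hours logged between start_date and end_date (YYYY-MM-DD)."""
    domain = [("date", ">=", start_date), ("date", "<=", end_date)]
    if billable_only:
        # Assumption: billable lines are the ones linked to a sale order line.
        domain.append(("so_line", "!=", False))
    lines = odoo["account.analytic.line"].search_records(domain)
    return sum(line.unit_amount for line in lines)

if __name__ == "__main__":
    mcp.run()  # stdio transport by default; HTTP transports are also available
```

Running the script starts an MCP server, and FastMCP derives the tool’s JSON schema from the function signature and docstring – which is exactly what Claude sees when deciding whether and how to call it.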
Odoo-RPC-Client – Programmatic Access to Odoo Data
To pull real data from Odoo, our agent uses odoo-rpc-client under the hood. Odoo exposes a remote API (XML-RPC/JSON-RPC) for its models, and odoo-rpc-client is a Python library that makes calling this API convenient. Through this client, our FastMCP tools can authenticate to Odoo and read or write data just as an Odoo module would. The integration is comprehensive – we have full access to Odoo models, records, and even the ability to call server-side methods, all over a secure XML-RPC connection. For example, a tool might use odoo_rpc_client to search all account.analytic.line (timesheet) records for this week, or to call a custom method that returns remaining budget for a project. By wrapping these calls in easy-to-use functions, the LLM doesn’t need to know Odoo specifics; it just needs to decide which tool to use and with what parameters.
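As a rough illustration of the kind of calls our tools make, here is a hedged sketch of odoo-rpc-client usage. The custom fields (x_aws_budget_allocated, x_aws_spent) and the commented-out helper method are hypothetical stand-ins for our business-specific Odoo customizations:

```python
# Hedged sketch of direct odoo-rpc-client usage inside a tool.
from odoo_rpc_client import Client

# Authenticate over XML-RPC (JSON-RPC is also supported via the protocol= argument).
odoo = Client("odoo.example.com", dbname="production", user="api_user", pwd="secret")

# Look up a client's projects and read custom budget fields
# (x_aws_budget_allocated and x_aws_spent are hypothetical custom fields).
projects = odoo["project.project"].search_records(
    [("partner_id.name", "=", "Client XYZ")]
)
for project in projects:
    remaining = project.x_aws_budget_allocated - project.x_aws_spent
    print(project.name, remaining)

# Custom server-side methods can be called through the same proxy, e.g. a
# hypothetical helper defined in our Odoo customizations:
# remaining = odoo["project.project"].compute_remaining_aws_funding(project.id)
```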
Open WebUI – The User-Facing Chat Interface
For the user interface, we leverage Open WebUI, an open-source chat frontend. Open WebUI provides a ChatGPT-like experience in the browser – the user can type questions and see the AI’s answers with rich text, code, or even images if needed. We chose Open WebUI because it’s feature-rich and easily extensible, saving us from building a UI from scratch. Importantly, Open WebUI can connect to any backend that speaks the OpenAI API schema (it was designed to work with OpenAI-style chat completions). We’ll use this capability to hook Open WebUI up to our LLM (Claude 4) via AWS Bedrock. The end result is that a user at our company can go to a local webpage (Open WebUI), ask an Odoo-related question in natural language, and get an answer with cited data – all without leaving the chat interface.
Bedrock Access Gateway – Securely Routing to Claude 4
To connect Open WebUI to Claude 4 (Anthropic’s LLM) hosted on AWS, we utilize the Amazon Bedrock Access Gateway (BAG). The Bedrock Access Gateway is essentially a proxy/middleware that AWS provides to make Bedrock’s API look like OpenAI’s API. Open WebUI “requires OpenAI-compatible endpoints,” and BAG wraps the Bedrock endpoints to match that schema. In practice, we run the Bedrock Access Gateway as a service; Open WebUI sends the user’s chat request to BAG, which then relays it to the actual Bedrock service where Claude 4 lives. This setup also adds a layer of security and auditability – corporate data stays within our AWS environment and calls to the LLM go through our controlled gateway. With BAG, we can use our AWS IAM credentials and policies to manage access to Claude, ensuring that prompts (which might include some Odoo data or query details) are transmitted securely.
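For reference, the gateway’s OpenAI-compatible endpoint can be exercised directly with the standard openai Python client; this is the same base URL and API key that Open WebUI is configured with. The URL path and model id below are placeholders for your own BAG deployment:

```python
# Hedged sketch: calling the Bedrock Access Gateway's OpenAI-compatible endpoint.
# The base URL path, API key, and model id are placeholders, not real values.
from openai import OpenAI

client = OpenAI(
    base_url="https://bedrock-gateway.internal.example.com/api/v1",  # assumed BAG endpoint
    api_key="YOUR_BAG_API_KEY",
)

response = client.chat.completions.create(
    model="<bedrock-claude-4-model-id>",  # the Bedrock model id you enabled for Claude
    messages=[
        {"role": "user", "content": "What is our billable utilization rate this week?"},
    ],
)
print(response.choices[0].message.content)
```

Open WebUI needs no custom frontend code for this – it is simply pointed at the same OpenAI-compatible base URL and key.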
Claude 4 – The LLM “Brain” Coordinating Everything
At the heart of the system is Claude 4, Anthropic’s latest large language model. Claude 4 is an ideal choice for this agentic scenario for several reasons: it has an enormous context window (up to 200K tokens) for handling lots of data and extended reasoning, and it’s designed to excel at complex, multi-step tasks. In fact, Claude 4 is described as functioning like an “expert virtual collaborator” that can maintain focus on complex tasks, use tools, and deliver complete solutions without constant guidance. This makes it well-suited to interpret a user’s request, break it down into sub-tasks, and utilize the MCP tools to fetch data from Odoo. Through the Bedrock integration, Claude 4 receives our prompt and a listing of available MCP tools (the Odoo agent’s capabilities). It then autonomously decides how to answer the question – often by generating a plan to call one or more tools, executing them via the MCP server, gathering the results, and formulating a final answer. All of this happens in a single interactive session with the user, typically within a few seconds, thanks to Claude’s advanced reasoning and the fast tool API.
How It Works: From User Query to Answer
Let’s walk through the process step by step, using our architecture and components:
User asks a question in Open WebUI: The user types a query in natural language (for example, “What is our billable utilization rate this week?”) into the Open WebUI chat interface and hits enter. The frontend captures this message and sends it as a request to the configured backend (our AI model endpoint).
Open WebUI forwards the query via Bedrock Access Gateway: Open WebUI is configured to use an OpenAI-compatible endpoint provided by the Bedrock Access Gateway. The user’s question (along with conversation context) is routed to BAG, which then calls the Claude 4 model on Amazon Bedrock. From the perspective of Open WebUI, it’s just like talking to an OpenAI API – BAG handles the translation under the hood. The question now reaches Claude 4 in the cloud, along with system instructions that include the MCP tool interface.
Claude 4 interprets the question and plans tool usage: Given the query and knowing what tools are available (the Odoo agent’s functions exposed via MCP), Claude 4 decides how to find the answer. For a complex question, Claude can break it into steps. For example, to get “billable utilization rate this week,” Claude might plan: (a) use the Odoo tool to get total hours logged by the consulting team this week, (b) use another tool or filter to get only billable hours, (c) calculate the ratio between billable and total hours. This planning happens internally (Claude’s chain-of-thought), leveraging its training and our prompt instructions. The key here is that Claude is acting as an agent, orchestrating calls to external functions as needed.
FastMCP agent executes Odoo data queries: When Claude decides to use a tool, it sends a JSON-RPC call (per the MCP protocol) to our FastMCP Odoo server. For instance, it might call a tool like get_timesheet_hours(user_group="consultants", start_date=..., end_date=...). FastMCP receives this request, uses the odoo_rpc_client to perform the query on the Odoo database (over XML-RPC), and then returns the result back to Claude in JSON format. Thanks to FastMCP and MCP, all this is standardized – the LLM doesn’t need to worry about API auth or formats. It might make multiple tool calls in sequence: e.g., first retrieve data, then perhaps call a small Python tool to do a calculation (we could even have a calculation tool if we didn’t want Claude to do math itself). Each call’s result is fed back into Claude’s context. (A sketch of this tool-call round trip appears after this walkthrough.)
Claude 4 synthesizes the answer: With the necessary data in hand (e.g. total hours = 160, billable hours = 120), Claude performs any final computation or analysis. In our example, it calculates 120/160 = 75% and formulates a response like: “Our billable utilization for the week is 75%.” The answer is phrased in a user-friendly way, potentially with an explanation of how it was derived or a breakdown if the user might want details. Claude’s large context window and reasoning ability allow it to incorporate all relevant info and ensure the answer is coherent. It may also cite sources or refer to data points (if we instruct it to do so for transparency). The completed answer is then sent back through the Bedrock gateway to Open WebUI.
Open WebUI displays the answer: The user sees the final answer appear in the chat interface, as if they were chatting with a knowledgeable assistant. For instance, the assistant might reply: “Your consulting team’s billable utilization for the current week is 75%. (Out of 160 total hours logged, 120 hours were billable.)” The conversation can continue, with the user asking follow-ups, since Claude 4 (through MCP) can keep using the Odoo tools to fetch more information as needed in context.
Throughout this flow, the user never had to specify which database or table to look at – the AI agent figures that out. The combination of an LLM that can use tools and direct programmatic access to Odoo’s data enables a natural Q&A experience on top of a complex ERP.
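To exercise steps 3 and 4 in isolation during development, the agent’s MCP interface can be called directly with FastMCP’s own client, mirroring the tool calls Claude makes in production. The script filename and arguments below assume the server sketch from earlier:

```python
# Hedged sketch: exercising the Odoo agent's MCP interface directly, the same
# way Claude does via tool calls. "odoo_agent.py" is an assumed filename for
# the FastMCP server sketched earlier.
import asyncio
from fastmcp import Client

async def main():
    async with Client("odoo_agent.py") as client:  # spawns the server over stdio
        tools = await client.list_tools()           # the tool listing Claude sees
        print([tool.name for tool in tools])
        result = await client.call_tool(
            "get_timesheet_hours",
            {"start_date": "2025-06-02", "end_date": "2025-06-08", "billable_only": True},
        )
        print(result)  # the JSON result that gets fed back into Claude's context

asyncio.run(main())
```

A local harness like this is handy for testing new tools before exposing them to the LLM.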
Example Use Cases in Action
To make this concrete, let’s revisit our two example questions and how the system handles them:
Q: “What is our billable utilization rate this week?” – The AI understands this as a request for a specific KPI. It breaks it down: “billable utilization rate” usually means the percentage of total available hours that were billed to clients. Using the Odoo agent, Claude pulls the total hours logged by employees for the week (from the account.analytic.line model that the Timesheets app uses) and the subset of those hours that are tagged as billable (maybe by filtering on project or task attributes). It then calculates the percentage. For example, if 5 consultants each have a 40-hour work week (200 hours total) and together they billed 150 hours to client projects, the utilization is 150/200 = 75%. The answer is given as 75%, with a brief explanation. The beauty is that if the user wants to drill down (say, by asking “Can you show me the breakdown by consultant?”), the same agent can perform that follow-up by querying hours per employee.
Q: “How much AWS funding is remaining for client XYZ?” – Suppose Client XYZ was given a certain budget or credit for AWS spend, tracked in Odoo. This question implies: what’s the unused portion of that budget? The agent might first fetch the allocated funding for client XYZ (maybe from a custom Odoo model or a field on the client’s record). Then it fetches the actual AWS expenses or usage recorded for that client (perhaps from vendor bills, project expenses, or a timesheet tagged as AWS usage). Subtracting the two gives the remaining funding. For instance, if Client XYZ had $50,000 allocated for AWS and so far $35,000 has been utilized, the agent would return “$15,000 remaining.” If additional logic is needed (like considering invoices in draft vs. posted), the agent can be coded to handle that, or Claude can be prompted to account for it. The key is that what normally might require writing a custom SQL query or running multiple Odoo reports is now achieved by a single question in the chat.
These examples highlight how the system can navigate Odoo’s data model in a domain-aware way. The AI agent understands terms like “billable” or “AWS funding” because we give it the context (either through prompt hints or the design of tools). It’s not magic – behind the scenes, developers define what each tool does (e.g., one tool might know how to compute utilization given timesheet data), but once defined, the AI uses them fluidly to answer new questions. This drastically improves the accessibility of ERP data for non-technical users, and even for developers, it speeds up the retrieval of information during debugging or analysis.
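As an example of what such a tool definition can look like, here is a hedged sketch of a utilization tool added to the FastMCP server sketched earlier (it reuses that sketch’s mcp and odoo objects). The 40-hour weekly capacity, the job-title filter for consultants, and the sale-order-line test for billable lines are assumptions that real business rules would replace:

```python
# Illustrative utilization tool for the Odoo Data Agent sketched above.
# Assumes the `mcp` and `odoo` objects from the earlier FastMCP server sketch.
HOURS_PER_WEEK = 40.0  # assumed standard weekly capacity per consultant

@mcp.tool()
def billable_utilization(start_date: str, end_date: str) -> dict:
    """Billable utilization for the period: billable hours / available capacity."""
    # Assumption: consultants are identified by their job title.
    consultants = odoo["hr.employee"].search_records(
        [("active", "=", True), ("job_title", "ilike", "consultant")]
    )
    capacity = len(consultants) * HOURS_PER_WEEK

    # Assumption: billable timesheet lines are the ones linked to a sale order line.
    lines = odoo["account.analytic.line"].search_records(
        [("date", ">=", start_date), ("date", "<=", end_date), ("so_line", "!=", False)]
    )
    billable = sum(line.unit_amount for line in lines)

    return {
        "billable_hours": billable,
        "capacity_hours": capacity,
        "utilization_pct": round(100 * billable / capacity, 1) if capacity else 0.0,
    }
```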
Architecture Diagram
User → Open WebUI (chat UI) → Bedrock Access Gateway → Claude 4 (LLM) → FastMCP Odoo Agent → Odoo database.
Claude 4 is central, orchestrating between the front-end and back-end: it receives the question from Open WebUI (via Bedrock), uses the Odoo MCP server (FastMCP + odoo-rpc-client) as a tool to get data from the Odoo ERP, and then responds back through the gateway. The entire process is seamless to the user.
Closing Thoughts and Future Outlook
Implementing a natural language chat interface for Odoo data using this agentic framework has proven to be a powerful approach. Developers get to define robust, reusable tools for data access, and the AI handles the logic of calling those tools and composing answers. The use of MCP and FastMCP 2.0 means our solution is modular – we could plug in additional agents (for example, an agent for a third-party API or a different database) and Claude can coordinate among them. And because MCP is standardized, we have flexibility to swap out the LLM or integrate with other platforms without rewriting the core logic.
It’s worth noting that our current implementation is not (yet) open-sourced. The main reason is that it relies on a lot of custom Odoo model logic specific to our business (custom fields, specific calculations, etc.). Open-sourcing it would require stubbing those out or generalizing them, which is non-trivial. However, we recognize that many companies face similar questions, so we’re exploring how to generalize this approach. In the future, we might extract a generic “Odoo Chat Agent” template that others can adapt to their own data models. The promise is exciting: an AI assistant that truly knows your ERP.
For developers and GenAI enthusiasts, this project showcases what’s possible at the intersection of enterprise software and AI. By combining Odoo’s rich data with an LLM like Claude 4 (with its huge context and reasoning ability), and by using a framework like FastMCP to bridge the two, we unlock a new level of interaction with business data. Complex questions become conversational. As model and framework capabilities grow, we can expect even deeper integration – imagine the AI not only answering questions, but also taking actions in Odoo on behalf of users, or proactively alerting you when metrics deviate from the norm (with your approval, of course).
In summary, integrating Odoo ERP with a natural language chat interface via a modular MCP-based agent framework allows us to ask business questions in plain English and get answers in real-time. It overcomes the limitations of Odoo’s native tools for cross-domain queries and demonstrates a practical use of AI agents in an enterprise context. We’re excited about the results so far, and even though our solution is custom today, we see broad potential for similar setups in the Odoo community and beyond. The era of chatting with your ERP – and getting meaningful answers – has arrived.
References
Odoo and Generative AI integration benefits
Model Context Protocol (MCP) overview
Odoo MCP integration features (odoo-rpc, XML-RPC access)
Open WebUI with Bedrock Access Gateway (OpenAI-compatible endpoints)
Anthropic Claude 4 on AWS Bedrock (200K context, agentic capabilities)