AI-Friendly Decision Services

LLMs aren’t decision-makers — at least not yet. But they’ve proven to be remarkably effective translators between natural language and domain-specific languages, unlocking access to decision services built on traditional rule engines, optimization solvers, and machine learning tools. Naturally, decision intelligence platforms have followed suit, providing mechanisms to turn decision models into AI-friendly services compatible with leading LLMs like ChatGPT, Claude, and Gemini.

At OpenRules, we set out to make this process as seamless as possible — no coding, no manual configuration. With the latest release 12.0.0, every OpenRules decision service is AI-ready out of the box. Here’s how we got there.

We started where many platforms did: deploying OpenRules decision models as MCP Servers. While effective, this approach requires a certain degree of decision service configuration as well as the manual creation of project-specific prompts.

Then we asked ourselves: if OpenRules glossaries already contain everything an LLM needs to communicate with a deployed decision service — why not go one step further and generate the LLM input automatically?

We enabled OpenRules decision services to expose their own descriptions — including inputs and outputs — so that an LLM can analyze this information, automatically invoke the appropriate service, and guide the entire user interaction in plain English.

Every OpenRules decision service, whether deployed locally or remotely, now automatically generates two descriptor files:

  • description.md
  • schema.json

These files are produced directly from the decision model’s glossaries, primarily describing its inputs and outputs. To connect an LLM to an OpenRules decision service, simply enter the service’s endpoint URL — appended with “/description.md” — directly into the LLM dialog. For example, to configure an LLM to act as a loan officer using the deployed constraint-based decision service “decision-loan” (implemented in the decision model “Loan”), you might enter something like the following:
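For illustration only, with a hypothetical host and port (your deployment’s actual URL will differ):

```
Act as a loan officer using the decision service described at
http://localhost:8080/decision-loan/description.md
```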

The file “description.md” includes a reference to the corresponding JSON schema defined in “schema.json.”
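Purely as an illustration of the idea (the real schema is generated from the model’s glossary, and these property names are assumptions, not the actual “Loan” model’s fields), a “schema.json” for a loan service might resemble:

```json
{
  "title": "decision-loan",
  "type": "object",
  "properties": {
    "Loan Amount":   { "type": "number" },
    "Credit Score":  { "type": "integer" },
    "Annual Income": { "type": "number" }
  },
  "required": ["Loan Amount", "Credit Score"]
}
```

Because the schema names every input and output with its type, the LLM can ask the user for exactly the missing values before invoking the service.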

Users may also optionally include a DecisionModelDescription table to provide the LLM with additional plain-English context about the model. As an example, the “Inside/Outside Production” decision model leverages this table to direct its underlying optimization-based service to apply a problem-specific solution method:
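The actual table lives in the decision model’s workbook; purely as an illustrative sketch (the wording below is hypothetical, not taken from the real model), it might read:

```
DecisionModelDescription
This model decides whether each item should be produced in-house or
outsourced at minimal total cost. Prefer the problem-specific solution
method over the default search strategy.
```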

OpenRules customers can continue using their existing decision services independently of any LLM. At the same time, without any modifications, they can connect those services to an LLM — gaining a natural language interface or, at a minimum, a powerful QA tool.
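“Without any modifications” here means the service remains an ordinary HTTP endpoint that existing clients can keep calling directly. A minimal sketch of such a direct call in Python, assuming a hypothetical URL and input field names (not the actual OpenRules request format):

```python
import json
import urllib.request

def build_request(url: str, payload: dict) -> urllib.request.Request:
    """Prepare a JSON POST request for a deployed decision service."""
    body = json.dumps(payload).encode("utf-8")
    return urllib.request.Request(
        url,
        data=body,
        headers={"Content-Type": "application/json"},
        method="POST",
    )

# Hypothetical endpoint and input fields -- the real names come from the
# decision model's glossary and your deployment.
req = build_request(
    "http://localhost:8080/decision-loan",
    {"Loan Amount": 250000, "Credit Score": 710},
)
# decision = json.loads(urllib.request.urlopen(req).read())  # against a live service
```

An LLM-driven dialog and a conventional client like this one can invoke the very same deployed service side by side.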

You may analyze various examples of OpenRules AI agents by selecting them from the bar on the right.

Rules-based AI Agents. The “PatientTherapy” and “VacationDays” AI agents are standard decision models that rely exclusively on business rules. When deployed as decision services, however, they automatically expose their own descriptions — enabling an LLM to invoke the appropriate services and guide the entire user interaction through a plain-English conversation. The same approach applies to any other OpenRules decision model, including those in the standard “openrules.samples” installation or your own custom models.

Optimization-based AI Agents. The “Loan” and “Inside/Outside Production” AI agents showcase optimization-based decision models implemented with OpenRules RuleSolver. They demonstrate how LLM-driven dialogues can guide end users toward the most appropriate decisions — even allowing them to introduce additional constraints on the fly that fall outside the original decision model. Together, they empower non-technical users to confidently engage with complex optimization problems.

Third-party AI Agents. It is worth noting that OpenRules can wrap any external decision service — not just those created within OpenRules. Such services can be built with any rule-based tool, in Python, in Java, by an LLM, or by any other means, and deployed locally or remotely. By defining a table of type “DecisionService” within an OpenRules project, any such service can be seamlessly integrated.

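Purely as an illustrative sketch (the column names and values below are assumptions, not the exact OpenRules table syntax; consult the OpenRules documentation for the actual layout), such a table might look like:

```
DecisionService
Service Name         Endpoint
ExternalLoanScorer   http://localhost:9000/score
```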

With the addition of an appropriate glossary, third-party decision services become just as AI-ready as any native OpenRules decision service.

LLMs are fundamentally transforming the orchestration of decision services. They enable end users to interact with existing decision services in a natural, flexible way — without any custom-built interfaces or rigid workflows. By combining business rules, machine learning, optimization, and AI agents, OpenRules empowers customers to build smarter, more adaptive decision-making systems, all within a unified decision intelligence platform.
