The new OpenRules Release 6.3.1 enhances its Test Harness with automatic comparison of expected and actual decision execution results. OpenRules has always considered Decision Testing a highly important component of our Decision Management framework. From the very beginning we have provided test creation, execution, and maintenance facilities oriented toward subject matter experts (not programmers) – the same people who create and manage business rules. They can create test cases directly in Excel using tables of the type “Data”. For example, the decision model “Clinical Guidelines”, described in this Tutorial, recommends different medications and doses during a patient’s visit to a doctor. Here are two related test cases for the business concepts “Patient” and “DoctorVisit”:
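A sketch of what such Data tables may look like is shown below, assuming the standard OpenRules Data table layout (a header row with the table keyword, business concept, and array name, then a row of attribute names, a row of display labels, and one row per test instance). The attribute names and values here are illustrative, not the actual tutorial data:

    Data Patient patients
    name         age   weight   creatinineLevel
    Name         Age   Weight   Creatinine Level
    John Smith   62    78.0     2.0
    Mary Hill    71    59.0     1.8

    Data DoctorVisit visits
    patientName    encounterDiagnosis
    Patient Name   Encounter Diagnosis
    John Smith     Acute Sinusitis
    Mary Hill      Acute Sinusitis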
In real life our large customers create thousands of test cases and maintain them with the same degree of precision as their business rules. In contrast to these simple test cases, in which the results (Recommended Medication and Dose) are not known in advance, our customers frequently associate expected results with their test data. Then they write simple code to compare the produced and the expected decision execution results and to find possible mismatches.
Now the OpenRules Test Harness includes generic facilities for associating test cases with expected results using a new type of decision table called “DecisionTableTest”. Here is an example for the above test data:
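The sketch below shows what such a table may look like; the object-reference syntax in the “ActionUseObject” columns and the concrete values are illustrative (the expected value 48.04 in Test 2 corresponds to the sample log shown later):

    DecisionTableTest testCases
                ActionUseObject   ActionUseObject   ActionExpect                   ActionExpect
    Test ID     Patient           Doctor Visit      Patient Creatinine Clearance   Recommended Medication
    Test 1      patients[0]       visits[0]         53.67                          Amoxicillin
    Test 2      patients[1]       visits[1]         48.04                          Levofloxacin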
Here the first column specifies test IDs, the next two columns specify the test objects defined in the above Data tables, and the last two columns specify the expected results for selected decision variables.
We also extended the Decision API with a new method
test(String nameOfDecisionTableTest)
So you no longer need to read and put your test objects into the decision and then call decision.execute() in your Java launcher. Instead, you may simply call decision.test(“testCases”). This method executes all tests from the Excel table “testCases” as separate decision runs. After the execution of each test case, the actually produced results are compared with the expected results (defined in the columns of the type “ActionExpect”). All mismatches are reported in the execution log. For instance, the above example produces the following records:
    Validating results for the test <Test 1>
    Test 1 was successful
    Validating results for the test <Test 2>
    MISMATCH: variable <Patient Creatinine Clearance> has value <48.03> while <48.04> was expected
    Test 2 was unsuccessful
    1 test(s) out of 2 failed!
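For completeness, here is a minimal sketch of a Java launcher that uses the new method. The Decision class and its constructor follow the usual OpenRules launcher pattern; the decision name and the Excel file path are illustrative assumptions, not taken from the release itself:

    import com.openrules.ruleengine.Decision;

    public class Main {
        public static void main(String[] args) {
            // Main Excel file of the decision model (path is illustrative)
            String fileName = "file:rules/main/Decision.xls";
            // Decision name is illustrative
            Decision decision = new Decision("DeterminePatientTherapy", fileName);
            // Run every test case from the Excel table "testCases" as a
            // separate decision run, comparing actual results against the
            // expected values defined in the ActionExpect columns
            decision.test("testCases");
        }
    }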
Detailed explanations of which rules were actually executed for each test may be found in the automatically generated HTML execution reports. A complete example of the test harness is provided in the project “DecisionPatientTherapyTest”, which has been added to the standard installation.
A user may build custom tables of the type “DecisionTableTest” using as many columns of the types “ActionUseObject” and “ActionExpect” as needed. We expect OpenRules customers will appreciate these new testing facilities.
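For instance, a custom test table covering three business objects and three expected decision variables could be laid out as follows (the table name, object names, and variable names are purely schematic):

    DecisionTableTest myTestCases
              ActionUseObject   ActionUseObject   ActionUseObject   ActionExpect   ActionExpect   ActionExpect
    Test ID   Object1           Object2           Object3           Variable1      Variable2      Variable3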