A tool for converting Semantics of Business Vocabulary and Business Rules (SBVR) specifications into Answer Set Programming (ASP).
The translation of the main SBVR constructs is reported below:
SBVR | ASP |
---|---|
it is prohibited that p | ASP constraint with p |
it is obligatory that p | ASP constraint with p negated |
it is necessary that p | ASP constraint with p negated |
it is impossible that p | ASP constraint with p |
only if (combined with another modal operator) | inverts the modality of the accompanying modal operator (positive becomes negative) |
each | generate the corresponding atom |
at least n | #count{...} >= n / arithmetic operation when combined with before or after |
at most n | #count{...} <= n / arithmetic operation when combined with before or after |
exactly n | #count{...} = n / arithmetic operation when combined with before or after |
at least n and at most m | n <= #count{...} <= m / arithmetic operation when combined with before or after |
if p then q | generate a constraint with p and q; depending on the modal operator, p and q might be negated |
q if p | generate a constraint with p and q; depending on the modal operator, p and q might be negated |
p of q | generate a relation between p and q by adding the corresponding atom |
p before q | p < q |
p after q | p > q |
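As an illustration, here is a rough sketch of how some of these constructs can map to ASP. The SBVR rules and predicate names below are hypothetical (in the spirit of the eu_rent example) and are not taken verbatim from the encodings generated by SBVR2ASP:

```
% Hypothetical predicates, for illustration only.

% "It is prohibited that the drop-off date of a rental is before the
%  pick-up date of the rental."
% Prohibition: constraint with p; "of" adds the corresponding atoms and
% "before" becomes "<".
:- rental(R), drop_off_date(R,D1), pick_up_date(R,D2), D1 < D2.

% "It is obligatory that each rental has at most 3 additional drivers."
% Obligation: constraint with p negated, i.e. the constraint fires when
% the count exceeds 3.
:- rental(R), #count{ D : additional_driver(R,D) } > 3.
```

In both cases the modal operator determines whether the constraint body encodes the condition itself (prohibition, impossibility) or its negation (obligation, necessity).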
Install the required Python package:
```
pip install lark
```
You can run `sbvr2asp` using the following command:

```
python3 src/main.py VOCABULARY RULES
```
where

- `VOCABULARY` is the path to the text file containing the vocabulary;
- `RULES` is the path to the text file containing the business rules.
Example datasets are available in the `examples` directory. These include:

- `eu_rent`
- `loan`
- `photo_equipment`
Each dataset has its own folder containing:

- `vocabulary.txt`: the SBVR vocabulary
- `rules.txt`: the SBVR business rules
- `encoding.lp`: the encoding generated by SBVR2ASP
You can generate the same encoding by running:

```
python3 src/main.py examples/DATASET_NAME/vocabulary.txt examples/DATASET_NAME/rules.txt
```
where `DATASET_NAME` is one of: `eu_rent`, `loan`, `photo_equipment`.
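For example, for the `loan` dataset:

```
python3 src/main.py examples/loan/vocabulary.txt examples/loan/rules.txt
```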
Each dataset also includes benchmark instances. To run them, you need clingo. To install it, you can follow the instructions available here: https://github.com/potassco/clingo .
Then, you can run an instance using:

```
clingo examples/DATASET_NAME/encoding.lp examples/DATASET_NAME/data/CONSISTENCY/INSTANCE
```
where `DATASET_NAME` is one of: `eu_rent`, `loan`, `photo_equipment`, and `CONSISTENCY` is one of: `sat` (instances without conflicts) or `unsat` (instances with conflicts).
A working example is:

```
clingo examples/loan/encoding.lp examples/loan/data/sat/instance_5_100.lp
```
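For intuition on the `sat`/`unsat` distinction, here is a minimal, self-contained sketch (hypothetical constraint and facts, not taken from the shipped encodings): an instance whose facts violate a constraint makes clingo report UNSATISFIABLE, while an instance satisfying all constraints yields an answer set.

```
% Hypothetical constraint: a rental may have at most 3 additional drivers.
:- rental(R), #count{ D : additional_driver(R,D) } > 3.

% Facts of an "unsat"-style instance: they violate the constraint,
% so clingo reports UNSATISFIABLE.
rental(r1).
additional_driver(r1,d1). additional_driver(r1,d2).
additional_driver(r1,d3). additional_driver(r1,d4).
```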
Note that, due to GitHub's file size limitations, some `eu_rent` input files are split into multiple parts and stored in a folder. You need to pass all parts to clingo:

```
clingo examples/eu_rent/encoding.lp examples/eu_rent/data/sat/instance_500_400_5_5/*
```