Executing Rules

Execute IoI rules against your own forensic datasets in five steps.

Testing with published datasets

To reproduce one of the published anti-forensic cases directly, start on the Artifacts & Datasets page, download a per-case reproducibility bundle from the Reproducibility bundle column, then follow the steps below using the included raw artifacts and parser outputs.

The framework operates in three layers:

  • Instantiators — Python scripts that map artifact parser CSV output to CASE/UCO JSON-LD knowledge graphs.
  • Knowledge Graph — named graphs loaded into a SPARQL 1.1 triplestore (Virtuoso), one per artifact source.
  • IoI Rules — SPARQL signatures that query across graphs to surface cross-artifact contradictions.

Prerequisites

You need Python 3.9+ with rdflib and pandas (for the instantiators and JSON-LD conversion) and a SPARQL 1.1 triplestore. Validation used OpenLink Virtuoso Open-Source Edition running in Docker. Install the Python dependencies:

pip install rdflib pandas

Pull the Virtuoso Docker image:

docker pull openlink/virtuoso-opensource-7:latest

Step-by-step

Step 1: Clone the repository

The repository contains all instantiator scripts, JSON-LD templates, ground-truth documents, and IoI SPARQL rules.

git clone https://github.com/ioi-framework/ioi-framework.git
cd ioi-framework

Step 2: Parse your artifacts

If you downloaded one of the published case bundles from Artifacts & Datasets, use the bundled raw artifacts and CSV parser outputs here instead of exporting your own.

Use artifact-specific parsers to produce structured CSV output. The framework was validated with the following tools:

# NTFS $MFT and $UsnJrnl:$J
MFTECmd.exe -f "$MFT" --csv ./output --csvf mft.csv
MFTECmd.exe -f "$J"   --csv ./output --csvf usn.csv

# Windows Event Logs
EvtxECmd.exe -f Security.evtx --csv ./output

# LNK files
LECmd.exe -f target.lnk --csv ./output

# Manual Chrome History export (if you have a copied SQLite History DB)
python3 SCRIPTS/export_chrome_history.py "/path/to/History" ./output/history.json
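
For context on what an export script like SCRIPTS/export_chrome_history.py has to do, here is a minimal hypothetical equivalent (function names and JSON shape are ours, not the repository's). The key detail is real: Chrome's History DB stores visit times in the urls table as microseconds since 1601-01-01 UTC.

```python
import json
import sqlite3
from datetime import datetime, timedelta, timezone

# Chrome/WebKit epoch: timestamps are microseconds since 1601-01-01 UTC
WEBKIT_EPOCH = datetime(1601, 1, 1, tzinfo=timezone.utc)

def webkit_to_iso(us: int) -> str:
    """Convert a WebKit microsecond timestamp to an ISO-8601 string."""
    return (WEBKIT_EPOCH + timedelta(microseconds=us)).isoformat()

def export_history(db_path: str, out_path: str) -> None:
    # Query the copied DB (never the live one, which Chrome keeps locked)
    con = sqlite3.connect(db_path)
    rows = con.execute(
        "SELECT url, title, last_visit_time FROM urls ORDER BY last_visit_time"
    ).fetchall()
    con.close()
    records = [
        {"url": u, "title": t, "last_visit": webkit_to_iso(ts)}
        for u, t, ts in rows
    ]
    with open(out_path, "w") as f:
        json.dump(records, f, indent=2)
```
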

Step 3: Instantiate CASE/UCO knowledge graphs

Run the instantiator for each artifact. Each script maps parser CSV fields to the CASE/UCO ontology and serializes the result as JSON-LD.

mkdir -p outputs

python instantiators/mft_instantiator.py "<MFT_CSV>" outputs/mft_filled.jsonld
python instantiators/usn_instantiator.py "<USN_CSV>" outputs/usn_filled.jsonld
python3 instantiators/history_instantiator.py ./output/history.json outputs/history_filled.jsonld

Convert JSON-LD to N-Triples for bulk loading. For browser-history work outside Autopsy, the flow is: copied Chrome History SQLite DB → SCRIPTS/export_chrome_history.py → history_instantiator.py → JSON-LD/N-Triples:

python3 SCRIPTS/convert_to_ntriples.py outputs/mft_filled.jsonld outputs/mft_case.nt
python3 SCRIPTS/convert_to_ntriples.py outputs/usn_filled.jsonld outputs/usn_case.nt

Note: For large MFT/USN files, use --chunk-size 5000 to split the output into multiple JSON-LD chunks. Each chunk is then converted to N-Triples separately.

python instantiators/mft_instantiator.py "<MFT_CSV>" outputs/mft_filled.jsonld --chunk-size 5000
# Converts each chunk: outputs/mft_filled_chunk0.jsonld, _chunk1.jsonld, ...
for f in outputs/mft_filled_chunk*.jsonld; do
  python3 SCRIPTS/convert_to_ntriples.py "$f" "${f%.jsonld}.nt"
done
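
How such chunking can be implemented (a sketch of the technique, not the actual instantiator code): pandas can stream a large CSV in fixed-size chunks instead of loading it all at once, with one output file per chunk.

```python
import io

import pandas as pd

# Synthetic 12-row CSV standing in for a large MFT export
csv_text = "EntryNumber,FileName\n" + "\n".join(
    f"{i},file{i}.txt" for i in range(12)
)

# chunksize=5 for illustration; the docs above use --chunk-size 5000
chunks = []
for i, chunk in enumerate(pd.read_csv(io.StringIO(csv_text), chunksize=5)):
    # In a real instantiator, each chunk would be mapped to triples and
    # serialized to its own outputs/..._chunk{i}.jsonld file here
    chunks.append((f"mft_filled_chunk{i}.jsonld", len(chunk)))
```
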

Step 4: Start Virtuoso and verify your setup

Start Virtuoso and wait for it to be ready before loading data:

docker run --name vos -d -e DBA_PASSWORD=dba \
  -p 8890:8890 -p 1111:1111 \
  openlink/virtuoso-opensource-7:latest

# Wait ~10 seconds, then confirm it is ready
docker exec vos isql 1111 dba dba "exec=select 1;"

You should see 1 returned. If the command fails, wait a few more seconds and retry.
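
If you prefer not to guess at the wait, the fixed sleep can be replaced by a retry loop. A sketch, assuming the container is named vos as above:

```shell
# Generic readiness loop: retry a probe command until it succeeds
wait_for() {
  local tries=$1
  shift
  for _ in $(seq 1 "$tries"); do
    "$@" >/dev/null 2>&1 && return 0
    sleep 1
  done
  return 1
}

# Usage against the container started above:
# wait_for 30 docker exec vos isql 1111 dba dba "exec=select 1;"
```
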

Verify your environment using the AF-004 test graphs (no real data needed). These synthetic graphs are included in the repository and produce a known result:

# Copy test graphs into the container
docker cp CASES/AF-004/test/mft_test.nt vos:/usr/share/proj/mft_test.nt
docker cp CASES/AF-004/test/usn_test.nt vos:/usr/share/proj/usn_test.nt

# Load named graphs
docker exec -i vos isql 1111 dba dba <<'EOF'
DB.DBA.TTLP_MT(file_to_string_output('/usr/share/proj/mft_test.nt'), '', 'https://ioi-framework.github.io/cases/AF-004/graphs/mft', 512);
DB.DBA.TTLP_MT(file_to_string_output('/usr/share/proj/usn_test.nt'), '', 'https://ioi-framework.github.io/cases/AF-004/graphs/usn', 512);
EOF

# Run IOI-004 — expect 1 row
docker cp RULES/structural/IOI-004_vss_traces_missing.rq vos:/database/rule.rq
docker exec vos bash -lc "printf 'SPARQL\n'; sed '/^#/d' /database/rule.rq; printf '\n;'" \
  | docker exec -i vos isql 1111 dba dba

A result with 1 row confirms that Virtuoso is loaded correctly and that rules execute as expected. The row corresponds to bare {GUID} VSS deletion evidence; Apps_{GUID} names are ignored to avoid Windows Search false positives, and IOI-004 additionally requires the full VSS infrastructure triad (tracking.log, IndexerVolumeGuid, _OnDiskSnapshotProp) to be present before correlating to GUID deletions.

The commands above use /usr/share/proj because Virtuoso's default DirsAllowed configuration permits reads from that directory; once the files are there, TTLP_MT loads the triples directly. See CASES/AF-004/ground_truth.md for what this result means forensically.
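
The bare-{GUID} versus Apps_{GUID} distinction comes down to anchored pattern matching. A minimal Python illustration (the pattern below is ours for demonstration, not the exact expression used in IOI-004):

```python
import re

# Anchored match: a bare {GUID} name, with nothing before the opening brace,
# so prefixed names like Apps_{GUID} (Windows Search) do not match
BARE_GUID = re.compile(
    r"^\{[0-9A-Fa-f]{8}-[0-9A-Fa-f]{4}-[0-9A-Fa-f]{4}"
    r"-[0-9A-Fa-f]{4}-[0-9A-Fa-f]{12}\}$"
)

def is_bare_guid(name: str) -> bool:
    return BARE_GUID.match(name) is not None
```
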

Step 5: Load your own graphs and execute IoI rules

Copy your N-Triples into the container, load them as named graphs from Virtuoso's allowed directory, then run any rule from RULES/:

# Copy N-Triples into the container
docker cp outputs/mft_case.nt vos:/usr/share/proj/mft_case.nt
docker cp outputs/usn_case.nt vos:/usr/share/proj/usn_case.nt

# Load named graphs
docker exec -i vos isql 1111 dba dba <<'EOF'
DB.DBA.TTLP_MT(file_to_string_output('/usr/share/proj/mft_case.nt'), '', 'https://ioi-framework.github.io/cases/{your_case_id}/graphs/mft', 512);
DB.DBA.TTLP_MT(file_to_string_output('/usr/share/proj/usn_case.nt'), '', 'https://ioi-framework.github.io/cases/{your_case_id}/graphs/usn', 512);
EOF

# Execute an IoI rule
docker cp RULES/temporal/IOI-007_usn_clear_before_event.rq vos:/database/rule.rq
docker exec vos bash -lc "printf 'SPARQL\n'; sed '/^#/d' /database/rule.rq; printf '\n;'" \
  | docker exec -i vos isql 1111 dba dba

A non-empty result set indicates a detected inconsistency. Read the corresponding CASES/AF-NNN/ground_truth.md to interpret the result.

Need help?

See the Artifacts & Datasets page for published reproducibility bundles and downloadable case datasets, the scenarios for ground-truth specifications, and the IoI rules library for all SPARQL signatures. Open an issue on GitHub if you encounter problems.