Case study
Service as Software, in motion
Journal Entry Testing
The problem
A first-year auditor's tour through a client's general ledger looks like this: download the trial balance, pull the journal-entry file, run the same dozen tests every audit shop teaches — round-dollar postings, weekend entries, manual reversals to revenue, period-end adjustments touched by management. The rules are public. The work is procedural. The judgment is whether any flagged entry is actually a problem.
Every step but the last is well-defined. The rules are written down in PCAOB AS 2401. The output format is fixed. The evidence trail is reproducible. The only step that takes professional judgment is the call — this one warrants follow-up — and even that's bounded by precedent.
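To make "procedural" concrete, here is a minimal sketch of two of those textbook tests in Python. The column names (amount, posted_at) and the file name are assumptions for illustration, not JET's schema:

```python
import pandas as pd

def flag_round_dollar(entries: pd.DataFrame, floor: float = 10_000) -> pd.Series:
    """Round-dollar postings: large amounts that land exactly on a thousand."""
    return (entries["amount"].abs() >= floor) & (entries["amount"] % 1_000 == 0)

def flag_weekend(entries: pd.DataFrame) -> pd.Series:
    """Weekend entries: postings dated Saturday or Sunday."""
    return pd.to_datetime(entries["posted_at"]).dt.dayofweek >= 5

# Hypothetical extract; every flagged row still needs the auditor's call.
entries = pd.read_csv("journal_entries.csv")
flagged = entries[flag_round_dollar(entries) | flag_weekend(entries)]
```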
The insight
That's a shape that became cheap to build. Well-defined inputs, producible evidence, a single judgment step at the end. The procedural part of the workflow is buildable now — not as a replacement for the auditor, but as the part of the work that doesn't need them.
Journal Entry Testing is a working version of that pattern. The page below is one real run.
What was flagged
Each rule below cites the PCAOB auditing standard it implements. The match count is the number of postings flagged by that rule; the percentage is taken against the rule-evaluation target set — the in-scope population the rules ran against, not the dataset total (sketched in code below the table). An auditor works the table top-down — high-severity, high-share rules first — and the evidence trail behind each finding is reproducible from the run ID below.
| Rule | Severity | Matches |
|---|---|---|
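The denominator matters. Here is a sketch of how the share could be taken against the in-scope population rather than the raw file; the rule's boolean matcher and the scoping filter are illustrative assumptions, not JET's interface:

```python
import pandas as pd

def match_share(rule, in_scope: pd.DataFrame) -> tuple[int, float]:
    """Match count and share, taken against the rule-evaluation
    target set (the in-scope population), not the raw file."""
    hits = in_scope[rule.matches(in_scope)]
    return len(hits), 100.0 * len(hits) / max(len(in_scope), 1)

# Hypothetical scoping: the in-scope population the rules ran against.
entries = pd.read_csv("journal_entries.csv")
in_scope = entries[entries["period"] == "FY2024"]
```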
How it was built
23 sprints. Single curator. The discipline JET applies to the audit — every claim linked to a rule, every rule cited to a standard, every run hash-stamped — is the same discipline that made the build inspectable. Each sprint had to clear its own contracts and evidence gates before the next one started.
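"Hash-stamped" is doing real work in that sentence: if the run ID is derived from the inputs and the rule catalog, the same evidence always reproduces the same stamp. A plausible sketch, not JET's actual scheme:

```python
import hashlib
import json

def run_id(dataset_bytes: bytes, catalog: dict) -> str:
    """Derive the run ID from the data and the rule catalog, so a
    finding can be traced back to exactly what produced it."""
    h = hashlib.sha256()
    h.update(dataset_bytes)
    h.update(json.dumps(catalog, sort_keys=True).encode())
    return h.hexdigest()[:12]
```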
The rule catalog is the spine. Each rule declares what it matches, what standard it cites, and what evidence it produces. New rules slot into the same contract without touching the engine — an audit team can ship firm-specific tests alongside the PCAOB pack, or a different industry can swap the whole catalog and keep the engine.
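What that contract might look like, sketched in Python. The field names and the engine are illustrative assumptions, not JET's actual interface:

```python
from dataclasses import dataclass
from typing import Callable
import pandas as pd

@dataclass(frozen=True)
class Rule:
    """One catalog entry: what it matches, what it cites, what it produces."""
    rule_id: str
    citation: str                                  # e.g. "PCAOB AS 2401"
    severity: str                                  # "high" | "medium" | "low"
    matches: Callable[[pd.DataFrame], pd.Series]   # boolean mask over postings
    evidence: Callable[[pd.DataFrame], dict]       # what each finding must carry

def run_catalog(catalog: list[Rule], in_scope: pd.DataFrame) -> list[dict]:
    """The engine never changes: it folds each rule over the in-scope data."""
    findings = []
    for rule in catalog:
        hits = in_scope[rule.matches(in_scope)]
        if len(hits):
            findings.append({"rule": rule.rule_id,
                             "citation": rule.citation,
                             "severity": rule.severity,
                             "evidence": rule.evidence(hits)})
    return findings
```

Under that shape, swapping the PCAOB pack for a firm-specific or industry-specific catalog is a data change, not an engine change.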
What's interesting isn't this system. It's that this class of system is now cheap to build. The question is which other workflows have the same shape: rules you can write down, evidence you can produce, a judgment step you'd want a human to make.