# Reading Reports
## Markdown report

The primary human-readable output. Written to `{output}/latest.md`.
### Summary table

Shows aggregate counts at a glance:
| Metric | Count |
|---|---|
| Total Tests | 12 |
| Passed | 10 |
| Failed | 2 |
Errored, skipped, and invalid rows only appear if their count is greater than zero.
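The zero-count hiding rule can be sketched in a few lines of Python. The counts and the set of always-shown metrics here are illustrative, not the tool's internals:

```python
# Sketch of the rule: the three core metrics always render,
# optional metrics render only when their count is non-zero.
counts = {
    "Total Tests": 12,
    "Passed": 10,
    "Failed": 2,
    "Errored": 0,
    "Skipped": 0,
    "Invalid": 0,
}
always_shown = {"Total Tests", "Passed", "Failed"}

rows = [
    (name, n)
    for name, n in counts.items()
    if name in always_shown or n > 0
]
for name, n in rows:
    print(f"| {name} | {n} |")
```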
### Test results

Results are grouped by directory. Each failing test gets a detailed section:

```markdown
### FAIL auth-check

Source: auth-middleware.spec.md
Status: FAIL

Expectation:
The auth middleware should validate JWT tokens on protected routes.

Observed:
The middleware only checks for cookie-based sessions; no JWT validation exists.

Location: src/middleware/auth.ts

Resolution:
Add JWT token validation logic to the auth middleware.
```

Passing tests are hidden by default. Use `--include-passing` to show them.
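For scripting on top of the markdown report, the `### FAIL <id>` headings are easy to scan. A minimal sketch, assuming the heading format shown above (the sample report text is invented):

```python
import re

# Hypothetical: collect the IDs of failing tests from latest.md text.
# Relies only on the "### FAIL <id>" heading convention.
report = """\
### FAIL auth-check

Source: auth-middleware.spec.md
Status: FAIL

### FAIL config-check

Source: config-schema.spec.md
Status: FAIL
"""

failed = re.findall(r"^### FAIL (\S+)", report, flags=re.MULTILINE)
print(failed)  # ['auth-check', 'config-check']
```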
### Validation issues

If validation finds problems, they appear at the bottom:

```markdown
## Validation Issues

- **duplicate-id**: Duplicate test ID "auth-check" found in: auth.spec.md, session.spec.md
```

## JSON report

Machine-readable output for CI. Written to `{output}/ci-results.json`.
```json
{
  "status": "fail",
  "summary": { "total": 12, "passed": 10, "failed": 2 },
  "tests": [
    {
      "id": "auth-check",
      "sourceFile": "auth-middleware.spec.md",
      "status": "fail",
      "location": "src/middleware/auth.ts"
    },
    {
      "id": "project-structure",
      "sourceFile": "project-structure.spec.md",
      "status": "pass",
      "group": "infra"
    }
  ]
}
```

Key fields:
| Field | Type | Description |
|---|---|---|
| `status` | `"pass" \| "fail" \| "error"` | Overall run status |
| `summary.total` | `number` | Total test scenario count |
| `tests[].id` | `string` | LLM-extracted test ID |
| `tests[].status` | `TestStatus` | Per-test result |
| `tests[].group` | `string?` | Directory-based group (e.g. `"api"`) |
| `tests[].location` | `string?` | Relevant file path (fail only) |
| `tests[].error` | `string?` | Error message (error only) |
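A CI step might gate on these fields. A minimal Python sketch using the field names from the example above; the inline JSON string stands in for reading `{output}/ci-results.json` from disk:

```python
import json

# Sketch of a CI gate on ci-results.json. Field names match the
# example above; in a real pipeline, read the file instead.
raw = """
{
  "status": "fail",
  "summary": { "total": 12, "passed": 10, "failed": 2 },
  "tests": [
    { "id": "auth-check", "sourceFile": "auth-middleware.spec.md",
      "status": "fail", "location": "src/middleware/auth.ts" }
  ]
}
"""
report = json.loads(raw)

# Print each failing test with its location, if one was recorded.
for test in report["tests"]:
    if test["status"] == "fail":
        print(f"{test['id']}: {test.get('location', 'n/a')}")
        # auth-check: src/middleware/auth.ts

# Non-zero exit fails the CI job when the run did not pass.
exit_code = 0 if report["status"] == "pass" else 1
```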
## JUnit XML report

Standard JUnit format for integration with CI platforms and test frameworks. Written to `{output}/junit-results.xml` when `--junit` is passed.
```xml
<?xml version="1.0" encoding="UTF-8"?>
<testsuites tests="12" failures="2" errors="0" skipped="0" timestamp="...">
  <testsuite name="semantic-tests" tests="10" ...>
    <testcase name="auth-check" classname="auth-middleware.spec.md" />
    <testcase name="config-check" classname="config-schema.spec.md">
      <failure message="...">...</failure>
    </testcase>
  </testsuite>
</testsuites>
```

Tests are grouped into `<testsuite>` elements by directory group. The `classname` attribute contains the source file name.
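Because the format is standard JUnit, any XML library can read it. A sketch with Python's `xml.etree.ElementTree`, using an inline copy of the example (attribute values are illustrative; a real script would parse `junit-results.xml`):

```python
import xml.etree.ElementTree as ET

# Sketch: pull failure counts and failing test names out of JUnit XML.
xml_text = """<?xml version="1.0" encoding="UTF-8"?>
<testsuites tests="12" failures="2" errors="0" skipped="0">
  <testsuite name="semantic-tests" tests="10">
    <testcase name="auth-check" classname="auth-middleware.spec.md" />
    <testcase name="config-check" classname="config-schema.spec.md">
      <failure message="schema mismatch">details</failure>
    </testcase>
  </testsuite>
</testsuites>"""

root = ET.fromstring(xml_text)
failures = int(root.get("failures", "0"))
failed_cases = [
    case.get("name")
    for case in root.iter("testcase")
    if case.find("failure") is not None  # a <failure> child marks a fail
]
print(failures, failed_cases)  # 2 ['config-check']
```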
## Terminal output

### During execution

Live progress with spinner animation (TTY) or static lines (non-TTY):
| Icon | Colour | Meaning |
|---|---|---|
| ✔ | Green | All scenarios in the file passed |
| ✗ | Red | At least one scenario failed |
| ⚠ | Yellow | At least one scenario errored |
| ⟳ | Yellow | Retrying after empty response |
### Final summary

```text
Semantic tests completed

Report:    semantic-test-results/latest.md
CI Output: semantic-test-results/ci-results.json
JUnit:     semantic-test-results/junit-results.xml
Debug:     semantic-test-results/debug/

Passed: 10
Failed: 2

FAILED
```

The JUnit and Debug lines only appear when those options are enabled. The final verdict is PASSED (green bold) or FAILED (red bold).