Autonomous AI research runs produce scattered evidence: notes written during execution, intermediate metrics, result summaries, and claim records. Without a system to collect and preserve that evidence before artifact generation begins, generated reports have no stable grounding — they become prose detached from the work that produced them. Enoch treats evidence collection as a required pipeline stage, not an optional export. Evidence sync is available before paper generation and rewrite work, but it is configuration-gated:
paper_evidence_sync_enabled defaults to false in config.example.json. When enabled, the control plane attempts to copy high-signal run evidence from the worker before rewriting artifacts; when disabled or when no local evidence is present, generated drafts must be treated as review-required and bounded by the available run record, evidence bundle, or claim ledger.
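The gating rule above can be summarized in a short sketch. This is illustrative only: the helper name and return values are assumptions, not Enoch's actual code; the only facts taken from the docs are the `paper_evidence_sync_enabled` field, its `false` default, and the review-required rule.

```python
def draft_review_policy(config: dict, local_evidence_files: list[str]) -> str:
    """Hypothetical helper illustrating the gating rule: evidence sync is
    off by default, and a draft without local evidence is review-required."""
    sync_enabled = config.get("paper_evidence_sync_enabled", False)  # defaults to false
    if not sync_enabled or not local_evidence_files:
        return "review_required"
    return "evidence_grounded"
```

With the default config (sync disabled), every draft starts review-required; only an enabled sync plus actual local evidence files changes that.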
Enoch separates generated prose from the evidence that grounded it. That separation is the main reason the generated corpus is reviewable.
Evidence files
The corpus snapshot contains per-artifact folders with files such as:

- paper.md
- metadata.json
- evidence_bundle.json
- claim_ledger.json
- paper_manifest.json
The snapshot also includes data/artifacts.jsonl, whose per-paper records carry fields including paper_markdown, metadata, evidence_bundle, claim_ledger, paper_manifest, github_url, ai_generated, human_authorship_claimed, and review_status.
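Because artifacts.jsonl is plain JSON Lines, it can be inspected with a few lines of standard-library Python. A minimal reading sketch (the `needs_attention` heuristic is an illustration, not part of Enoch):

```python
import json

def load_artifacts(path="data/artifacts.jsonl"):
    """Iterate artifact records from the JSONL snapshot, one dict per line."""
    with open(path, encoding="utf-8") as fh:
        for line in fh:
            line = line.strip()
            if line:  # skip blank lines
                yield json.loads(line)

def needs_attention(record: dict) -> bool:
    """Illustrative filter: flag records claiming human authorship
    or still awaiting review."""
    return bool(record.get("human_authorship_claimed")) or \
           record.get("review_status") == "unreviewed"
```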
Before the artifact writer runs, Enoch syncs evidence from the worker project workspace to the control VM. When evidence sync is enabled, the primary sync method is HTTP — the control plane calls the worker wake gate API to retrieve evidence files. If the HTTP sync is unavailable or returns an error, an optional SSH fallback uses paper_evidence_sync_ssh_host and paper_evidence_sync_remote_root from your config to copy files directly.
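The HTTP-first, SSH-fallback ordering can be sketched with transport calls injected as callables. Everything here is illustrative: `http_fetch` and `ssh_copy` stand in for the real wake gate API call and SSH copy, whose interfaces are not documented.

```python
def sync_evidence(http_fetch, ssh_copy=None):
    """Try the worker wake gate HTTP API first; fall back to SSH if
    configured. Returns synced files, or [] if no method succeeded."""
    try:
        return http_fetch()        # primary: wake gate API over HTTP
    except Exception:
        pass                       # unavailable or returned an error
    if ssh_copy is not None:
        try:
            return ssh_copy()      # optional fallback: direct copy over SSH
        except Exception:
            pass
    return []                      # nothing synced; draft stays review-required
```

The design point is that SSH is never attempted unless the HTTP path has already failed and an SSH host is configured.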
When paper_evidence_sync_enabled is true, the control plane can sync evidence from a worker project root before paper generation. Config fields include paper_evidence_sync_ssh_host, paper_evidence_sync_remote_root, and paper_evidence_sync_timeout_sec.
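Putting those fields together, the relevant block of config.example.json might look like the following. Only the field names come from the docs; every value here is an illustrative placeholder.

```json
{
  "paper_evidence_sync_enabled": false,
  "paper_evidence_sync_ssh_host": "worker.example.internal",
  "paper_evidence_sync_remote_root": "/srv/enoch/project",
  "paper_evidence_sync_timeout_sec": 120
}
```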
Docs should not imply evidence exists for a run unless artifact files or metadata show it.
Artifact writing
The default writer is deterministic, which avoids an external model call. The code also supports aliases for a synthetic.new OpenAI-compatible provider. Configure provider URL, model, API key, timeout, temperature, max tokens, and fallback behavior through paper_writer_* fields.
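A sketch of that selection logic, assuming hypothetical field names in the paper_writer_* family (`paper_writer_provider`, `paper_writer_timeout_sec`, `paper_writer_fallback_to_deterministic` are guesses; the docs only confirm the paper_writer_* prefix and that timeout and fallback behavior are configurable):

```python
def write_paper(record: dict, config: dict, call_provider=None) -> str:
    """Illustrative writer dispatch: deterministic by default, optional
    provider call with fallback on error."""
    provider = config.get("paper_writer_provider", "deterministic")
    if provider == "deterministic" or call_provider is None:
        # Default path: deterministic writer, no external model call.
        return f"# {record.get('title', 'Untitled run')}\n\n(deterministic draft)"
    try:
        return call_provider(record,
                             timeout=config.get("paper_writer_timeout_sec", 60))
    except Exception:
        if config.get("paper_writer_fallback_to_deterministic", True):
            return write_paper(record, {"paper_writer_provider": "deterministic"})
        raise
```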
Provider support does not mean generated papers are validated or peer reviewed.
Corpus gates
The corpus packaging/provenance policy requires generated artifacts to avoid fake citations, placeholder markers, implied human authorship, and peer-review claims. It also expects provenance metadata plus evidence-bundle and claim-ledger files. The stricter claim/evidence audit separately checks whether ledgers contain evidence-linked claims and whether referenced result files are public or explicitly unavailable.
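The packaging checks can be pictured as a function that returns a list of problems. This is a minimal sketch of the policy as described above; the real gate is stricter, and the marker strings and metadata keys used here are assumptions.

```python
PLACEHOLDER_MARKERS = ("TODO", "XXX", "[citation needed]", "lorem ipsum")

def packaging_gate(paper_text: str, metadata: dict) -> list[str]:
    """Illustrative packaging/provenance checks; empty list means the
    artifact passes this sketch of the gate."""
    problems = []
    lowered = paper_text.lower()
    if any(m.lower() in lowered for m in PLACEHOLDER_MARKERS):
        problems.append("placeholder markers present")
    if metadata.get("human_authorship_claimed"):
        problems.append("implied human authorship")
    if "peer-reviewed" in lowered:
        problems.append("peer-review claim")
    for required in ("evidence_bundle", "claim_ledger"):
        if not metadata.get(required):
            problems.append(f"missing {required}")
    return problems
```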
Paper statuses
Papers move through the following PaperStatus values as they progress from generation to release:
| Status | Meaning |
|---|---|
| eligible | Run complete; paper row eligible for generation or review |
| draft_generating | Artifact writer is generating the draft |
| draft_review | Draft generated; awaiting operator packaging/provenance review |
| publication_generating | Publication-targeted rewrite in progress |
| publication_draft | Publication draft ready for finalization review |
| human_review_required | Packaging/provenance check flagged issues requiring human judgment |
| archived | Paper archived; removed from active review queue |
A separate field, ReviewStatus, tracks the operator’s progress through the checklist: unreviewed → triage_ready → in_review → changes_requested or approved_for_finalization → finalized. Papers can also be rejected or placed in blocked status if a blocker is recorded.
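One way to read that checklist is as a transition map. This is an illustrative interpretation, not the system's actual code: in particular, the loop from changes_requested back to in_review, and the treatment of rejected/blocked as reachable from any non-finalized state, are assumptions.

```python
# Illustrative ReviewStatus transition map (assumed from the checklist).
REVIEW_TRANSITIONS = {
    "unreviewed": {"triage_ready"},
    "triage_ready": {"in_review"},
    "in_review": {"changes_requested", "approved_for_finalization"},
    "changes_requested": {"in_review"},       # assumption: rework loops back
    "approved_for_finalization": {"finalized"},
    "finalized": set(),                        # assumption: terminal
}

def can_transition(current: str, new: str) -> bool:
    """Check whether a ReviewStatus change is legal under this sketch."""
    if new in ("rejected", "blocked"):
        return current != "finalized"          # assumption: only finalized is closed
    return new in REVIEW_TRANSITIONS.get(current, set())
```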
Provenance framing
The reports Enoch produces are AI-generated research artifacts, not human-authored or peer-reviewed papers. They are built from run notes, evidence bundles, claim ledgers, and reproducibility traces produced during autonomous agent runs. The maintainer releases the corpus for inspection and critique but does not claim personal authorship of the generated papers, arguments, or prose.

When citing a paper produced by an Enoch run, credit the system operator as the release maintainer, not as the human author of the generated content. See the authorship and provenance reference for recommended citation language.