Context
reST and Sphinx exist because Python's standard library needed documentation that compiled — that built into HTML, PDF, ePub, and man pages from a single source, with cross-references resolved at build time and a directive system rich enough to express formal structures. The reST grammar is itself a formal language: directives are typed, roles are typed, cross-references are validated. This makes it an unusually faithful carrier for any specification that needs to retain its structure across rendering targets.
This article ports the Codex into reStructuredText as a Sphinx project. The port is not documentation of the Codex. The port is the Codex, expressed through reST's grammar. The same nine invariant lines compile cleanly into Sphinx directives, glossary entries, cross-referenced pages, and build-time validators. Nothing in the Codex is paraphrased. The symbolic forms — H = ∞0 | A = K, S → G → Q → P → V, α ≡ {α'} — are preserved exactly.
The argument this article makes for the rest of the series: a port can carry the grammar before any runtime exists. Documentation is a substrate. C1 §3.6 requires that every emitted surface carry the Constitutional Block, the active phase's compiled form with decoding operation, the adaptive context chain, the decoder rules, and resolved symbols. reST can carry all five. The port that follows demonstrates this surface against the Codex itself.
Why reST and not Markdown
Markdown is loose. It has no formal directive system, no validated cross-references, no glossary primitive, no doctest integration. A Markdown port of the Codex is documentation describing the Codex, with code blocks for the equations. Faithful enough as documentation, but the grammar is not held by the substrate — the grammar is held by prose conventions, which means it is held nowhere.
reST has every primitive the Codex needs: directives for the Constitutional Block and equations, roles for inline cross-references, a glossary directive that creates a typed term system, validated cross-references that fail the build if a target is missing, and doctest integration that runs assertions against the documentation at build time. Each primitive maps to a part of the spec.
Sphinx adds: extension architecture (so 5QLN-specific directives can be registered as first-class), themes, multi-format output, intersphinx linking (so other 5QLN surfaces can reference the documentation surface), and Read the Docs deployment.
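The glossary primitive deserves a concrete shape, since every :term: role in the surface resolves against it. A minimal sketch of two entries in glossary.rst (the wording of the definitions is illustrative, not canonical):

```rst
.. glossary::

   X
      The validated question: output of S, carried through every later
      phase in the adaptive context chain.

   α
      The irreducible core named by G. Remove it and :term:`X` collapses.
```

Because cross-references are validated, a :term: role pointing at a missing entry fails the build.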
Project structure
5qln-codex-rst/
├── conf.py
├── index.rst
├── language/
│ ├── one_law.rst
│ ├── cycle.rst
│ ├── equations.rst
│ ├── master_equation.rst
│ ├── holographic_law.rst
│ ├── completion_rule.rst
│ ├── creative_line.rst
│ └── poetic_compression.rst
├── decoder/
│ ├── decoding_s.rst
│ ├── decoding_g.rst
│ ├── decoding_q.rst
│ ├── decoding_p.rst
│ ├── decoding_v.rst
│ ├── adaptive_context.rst
│ ├── lenses.rst
│ └── corruption.rst
├── compiler/
│ ├── constitutional_block.rst
│ ├── compiled_phases.rst
│ ├── validation_protocol.rst
│ └── surface_emission.rst
├── glossary.rst
├── appendices/
│ ├── invariant_lines.rst
│ ├── structural_properties.rst
│ └── sources.rst
├── _ext/
│ └── fivqln.py
└── _doctest/
└── fivqln_codex_doctest.py
Three top-level directories: language/, decoder/, compiler/. They mirror L1, D1, C1. The mirror is not decorative — it is the spec's own structure carried into the file system. A reader who has read the Codex knows where everything is by knowing the spec.
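At the entry point, the mirror can be made explicit with one toctree per spec layer. An illustrative index.rst fragment (the captions and grouping are an assumption; the file names come from the tree above, and the remaining pages slot in the same way):

```rst
.. toctree::
   :caption: Language (L1)
   :maxdepth: 1

   language/one_law
   language/cycle
   language/equations

.. toctree::
   :caption: Decoder (D1)
   :maxdepth: 1

   decoder/decoding_s
   decoder/adaptive_context

.. toctree::
   :caption: Compiler (C1)
   :maxdepth: 1

   compiler/constitutional_block
   compiler/validation_protocol
```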
conf.py
"""
Sphinx configuration for the 5QLN Codex documentation surface.
This file is itself part of the surface: it compiles the Codex into HTML,
PDF, and ePub. The Constitutional Block is enforced at build time.
"""
import sys
from pathlib import Path
sys.path.insert(0, str(Path(__file__).parent / "_ext"))
sys.path.insert(0, str(Path(__file__).parent / "_doctest"))
project = "5QLN Codex"
author = "Amihai Loven"
copyright = "2026 Amihai Loven, under the 5QLN Open Source License"
release = "1.0"
extensions = [
"sphinx.ext.doctest", # validation protocol runs at build time
"sphinx.ext.intersphinx", # other 5QLN surfaces link in
"sphinx.ext.todo",
"fivqln", # custom directives for 5QLN structures
]
# 5QLN custom directives registered in _ext/fivqln.py:
# constitutional-block, phase, corruption, lens
intersphinx_mapping = {
"fivqln-py": ("https://fivqln.readthedocs.io/en/latest/", None),
# other surfaces register here as they come online
}
doctest_global_setup = """
from fivqln_codex_doctest import (
constitutional_block,
five_phases,
five_corruption_codes,
twenty_five_lenses,
nine_invariant_lines,
adaptive_context,
)
"""
# fail the build on broken cross-references — partial enforcement of the §3.5 drift check
nitpicky = True
html_theme = "furo"
html_title = "5QLN Codex"
The configuration is short. It registers one custom extension (fivqln), turns on doctest, and sets nitpicky = True so any unresolved cross-reference fails the build. C1 §3.5's drift check is partly enforced by this flag alone.
The Constitutional Block as a directive
In any reST file in the surface:
.. constitutional-block::
LAW: H = ∞0 | A = K
CYCLE: S → G → Q → P → V
EQUATIONS:
S = ∞0 → ?
G = α ≡ {α'}
Q = φ ⋂ Ω
P = δE/δV → ∇
V = (L ∩ G → B'') → ∞0'
OUTPUTS: S→X G→Y Q→Z P→A V→B+B''+∞0'
HOLOGRAPHIC: XY := X within Y | X, Y ∈ {S, G, Q, P, V}
COMPLETION: No V without ∞0'
CORRUPTION: L1 L2 L3 L4 V∅
CENTER: not a sixth phase — coherence only
The directive is implemented in _ext/fivqln.py. It does three things: renders the block with appropriate styling, validates that the content matches §3.1 exactly (drift check), and registers the block as a known surface element so other directives can reference it.
The custom extension
# _ext/fivqln.py
"""
5QLN-specific Sphinx directives.
Each directive enforces a part of C1 at documentation build time.
A typo in any of these breaks the build, with a message keyed to §3.5.
"""
from textwrap import dedent
from docutils import nodes
from sphinx.util.docutils import SphinxDirective
CANONICAL_BLOCK = dedent("""\
LAW: H = ∞0 | A = K
CYCLE: S → G → Q → P → V
EQUATIONS:
S = ∞0 → ?
G = α ≡ {α'}
Q = φ ⋂ Ω
P = δE/δV → ∇
V = (L ∩ G → B'') → ∞0'
OUTPUTS: S→X G→Y Q→Z P→A V→B+B''+∞0'
HOLOGRAPHIC: XY := X within Y | X, Y ∈ {S, G, Q, P, V}
COMPLETION: No V without ∞0'
CORRUPTION: L1 L2 L3 L4 V∅
CENTER: not a sixth phase — coherence only
""").rstrip()
CORRUPTION_CODES = {"L1", "L2", "L3", "L4", "V∅"}
PHASES = {
"S": ("Start", "S = ∞0 → ?", "X"),
"G": ("Growth", "G = α ≡ {α'}", "Y"),
"Q": ("Quality", "Q = φ ⋂ Ω", "Z"),
"P": ("Power", "P = δE/δV → ∇", "A"),
"V": ("Value", "V = (L ∩ G → B'') → ∞0'", "B + B'' + ∞0'"),
}
def _normalize(s: str) -> str:
"""Whitespace-insensitive comparison for directive content vs canonical."""
return "".join(s.split())
class ConstitutionalBlockDirective(SphinxDirective):
"""Render the Constitutional Block. Reject any drift from §3.1."""
has_content = True
def run(self):
actual = "\n".join(self.content)
if _normalize(actual) != _normalize(CANONICAL_BLOCK):
raise self.error(
"Constitutional Block does not match canonical §3.1. "
"C1 §3.5 drift check failed: 'No equation paraphrased — "
"symbolic form is exact.'"
)
        # register the block on the build environment so other
        # directives can confirm this surface carries it
        meta = self.env.metadata.setdefault(self.env.docname, {})
        meta["constitutional_block"] = True
        block = nodes.literal_block(
            actual, actual, classes=["constitutional-block"]
        )
        return [block]
class PhaseDirective(SphinxDirective):
"""Render a phase block with equation, output, and decoding."""
required_arguments = 1 # phase letter: S, G, Q, P, or V
has_content = True
def run(self):
letter = self.arguments[0]
if letter not in PHASES:
raise self.error(
f"Unknown phase '{letter}'. Must be one of {sorted(PHASES)}."
)
name, equation, output = PHASES[letter]
section = nodes.section(ids=[f"phase-{letter.lower()}"])
        section += nodes.title(text=f"{letter} — {name}")
section += nodes.literal_block(
equation, equation, classes=["equation"]
)
section += nodes.paragraph(text=f"OUTPUT: {output}")
body = nodes.compound()
self.state.nested_parse(self.content, self.content_offset, body)
section += body
return [section]
class CorruptionCodeDirective(SphinxDirective):
"""Render a corruption code definition. Reject codes outside the five."""
required_arguments = 1
has_content = True
def run(self):
code = self.arguments[0]
if code not in CORRUPTION_CODES:
raise self.error(
f"Unknown corruption code '{code}'. The five are: "
f"{sorted(CORRUPTION_CODES)}. C1 §3.5 drift check: "
f"'No corruption code added beyond five.'"
)
names = {
"L1": "Closing — answers fill the center",
"L2": "Generating — produced sparks fill the center",
"L3": "Claiming — false access to ∞0 fills the center",
"L4": "Performing — appearance of depth fills the center",
"V∅": "Incomplete — no return; the cycle has no continuity",
}
        title = nodes.rubric(text=f"Corruption {code} — {names[code]}")
body = nodes.compound()
self.state.nested_parse(self.content, self.content_offset, body)
return [title, body]
class LensDirective(SphinxDirective):
"""Render a sub-phase lens. Validate the two letters resolve to phases."""
required_arguments = 1 # two-letter lens: SS, SG, ..., VV
has_content = True
def run(self):
lens = self.arguments[0]
if len(lens) != 2 or any(c not in PHASES for c in lens):
raise self.error(
f"Lens '{lens}' is malformed. Must be two letters from "
f"{sorted(PHASES)}. C1 §3.5: 'No symbol renamed without "
f"source name present.'"
)
        x, y = lens
        title = nodes.rubric(
            text=(
                f"Lens {lens}: {PHASES[y][0]}-quality applied to "
                f"the decoding of {PHASES[x][0]}"
            )
        )
body = nodes.compound()
self.state.nested_parse(self.content, self.content_offset, body)
return [title, body]
def setup(app):
app.add_directive("constitutional-block", ConstitutionalBlockDirective)
app.add_directive("phase", PhaseDirective)
app.add_directive("corruption", CorruptionCodeDirective)
app.add_directive("lens", LensDirective)
return {"version": "1.0", "parallel_read_safe": True}
Each directive enforces one C1 rule at build time. constitutional-block rejects paraphrase. phase rejects unknown phase letters. corruption rejects codes outside the five. lens rejects lens names whose letters do not resolve to known phases. A typo in any of these breaks the build with a message that names the §3.5 rule it violated.
The doctest source of truth
# _doctest/fivqln_codex_doctest.py
"""
Single source of truth for build-time assertions. The doctests in
the validation_protocol page import from this module. Later surfaces
in the series (S3 onward) import the same canonical names from here.
"""
from dataclasses import dataclass
@dataclass(frozen=True)
class Phase:
letter: str
name: str
equation: str
outputs: frozenset
constitutional_block = """\
LAW: H = ∞0 | A = K
CYCLE: S → G → Q → P → V
EQUATIONS:
S = ∞0 → ?
G = α ≡ {α'}
Q = φ ⋂ Ω
P = δE/δV → ∇
V = (L ∩ G → B'') → ∞0'
OUTPUTS: S→X G→Y Q→Z P→A V→B+B''+∞0'
HOLOGRAPHIC: XY := X within Y | X, Y ∈ {S, G, Q, P, V}
COMPLETION: No V without ∞0'
CORRUPTION: L1 L2 L3 L4 V∅
CENTER: not a sixth phase — coherence only"""
five_phases = {
"S": Phase("S", "Start", "S = ∞0 → ?", frozenset({"X"})),
"G": Phase("G", "Growth", "G = α ≡ {α'}", frozenset({"Y"})),
"Q": Phase("Q", "Quality", "Q = φ ⋂ Ω", frozenset({"Z"})),
"P": Phase("P", "Power", "P = δE/δV → ∇", frozenset({"A"})),
"V": Phase("V", "Value", "V = (L ∩ G → B'') → ∞0'", frozenset({"B", "B''", "∞0'"})),
}
five_corruption_codes = frozenset({"L1", "L2", "L3", "L4", "V∅"})
twenty_five_lenses = frozenset(
a + b for a in "SGQPV" for b in "SGQPV"
)
adaptive_context = {
"S": frozenset(), # ∅ or ∞0' from prior cycle
"G": frozenset({"X"}),
"Q": frozenset({"X", "α", "Y"}),
"P": frozenset({"X", "α", "Y", "Z"}),
"V": frozenset({"X", "α", "Y", "φ⋂Ω", "Z", "∇", "A"}),
}
nine_invariant_lines = (
"H = ∞0 | A = K",
"S → G → Q → P → V",
"S = ∞0 → ?",
"G = α ≡ {α'}",
"Q = φ ⋂ Ω",
"P = δE/δV → ∇",
"V = (L ∩ G → B'') → ∞0'",
"No V without ∞0'",
"L1 L2 L3 L4 V∅",
)
This module is the canonical Python representation of L1, D1, and C1. It is small. It is frozen. It is the source S3's Pydantic types will derive from in the next article — which means the type contract and the documentation contract have the same source. Drift between them is eliminated by construction.
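Because the module is this small, its internal consistency can be asserted directly. A sketch of the kind of cross-check it enables (the data literals are copied from the module above; the monotonicity property is an observation about the chain, not a line the Codex states):

```python
# The 25 lenses are the 5×5 product of phase letters, by construction.
lenses = frozenset(a + b for a in "SGQPV" for b in "SGQPV")
assert len(lenses) == 25

# Adaptive context per phase, as defined in fivqln_codex_doctest.
adaptive_context = {
    "S": frozenset(),
    "G": frozenset({"X"}),
    "Q": frozenset({"X", "α", "Y"}),
    "P": frozenset({"X", "α", "Y", "Z"}),
    "V": frozenset({"X", "α", "Y", "φ⋂Ω", "Z", "∇", "A"}),
}

# "Unbroken chain" as a checkable property: context only accumulates;
# each phase carries everything the previous phase carried.
cycle = "SGQPV"
assert all(
    adaptive_context[prev] <= adaptive_context[curr]
    for prev, curr in zip(cycle, cycle[1:])
)
```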
The validation protocol as doctests
.. _validation_protocol:
Validation Protocol
===================
C1 §3.5 specifies three checks: syntax, semantic, drift. Each is mechanical.
The doctests below run at build time. Failure breaks the build.
Syntax check — every phase has its exact equation
--------------------------------------------------
.. doctest::
>>> sorted(five_phases.keys())
['G', 'P', 'Q', 'S', 'V']
>>> five_phases["S"].equation
'S = ∞0 → ?'
>>> five_phases["V"].equation
"V = (L ∩ G → B'') → ∞0'"
Semantic check — the adaptive context chain is unbroken
--------------------------------------------------------
.. doctest::
>>> adaptive_context["S"] == frozenset()
True
>>> adaptive_context["G"] == frozenset({'X'})
True
>>> adaptive_context["V"] == frozenset({'X', 'α', 'Y', 'φ⋂Ω', 'Z', '∇', 'A'})
True
Drift check — exactly five corruption codes
--------------------------------------------
.. doctest::
>>> len(five_corruption_codes)
5
>>> sorted(five_corruption_codes)
['L1', 'L2', 'L3', 'L4', 'V∅']
Drift check — exactly twenty-five lenses
-----------------------------------------
.. doctest::
>>> len(twenty_five_lenses)
25
>>> all(len(lens) == 2 for lens in twenty_five_lenses)
True
Completion rule — V output includes B, B'', and ∞0'
----------------------------------------------------
.. doctest::
>>> 'B' in five_phases['V'].outputs
True
>>> "B''" in five_phases['V'].outputs
True
>>> "∞0'" in five_phases['V'].outputs
True
Drift check — exactly nine invariant lines
-------------------------------------------
.. doctest::
>>> len(nine_invariant_lines)
9
The doctests do not decorate the spec — they enforce it. Removing a corruption code, renaming a phase, breaking the context chain, dropping a lens: any of these breaks the build. The Codex carries its own test suite, in the form Sphinx already understands.
Cross-references — the holographic law in the file system
The holographic law XY := X within Y says every phase contains all five phases. In the documentation surface, this becomes a cross-reference structure: each phase page references all five phases through its lenses, and each lens references both its borrowing phase and its target phase.
.. _decoding_g:
Decoding G = α ≡ {α'}
======================
:Equation: ``G = α ≡ {α'}``
:Output: :term:`Y`
:Context in: :term:`X` (from :doc:`decoding_s`)
:Context out: :term:`X` + :term:`α` + :term:`Y` (forwards to :doc:`decoding_q`)
Decoding operation
------------------
1. RECEIVE :term:`X` — the validated question from :doc:`decoding_s`.
2. SEEK :term:`α` — within :term:`X`, the irreducible core. Remove it and
:term:`X` collapses.
3. TEST :term:`≡` — does :term:`α` remain unchanged across expressions?
4. FIND :term:`{α'}` — the self-similar echoes at other scales.
5. VALIDATE :term:`Y` — the pattern is confirmed when :term:`α` is named,
:term:`≡` holds, and :term:`{α'}` confirm it across multiple scales.
Lenses on G
-----------
The five lenses on G refine the decoding through borrowed qualities.
The full 5×5 is at :doc:`/decoder/lenses`.
.. lens:: GS
:term:`α` ≡ {:term:`α'`} through openness:
what unknown still lives within the pattern?
.. lens:: GG
:term:`α` ≡ {:term:`α'`} through pattern:
how does :term:`α` express at deeper scales?
.. lens:: GQ
:term:`α` ≡ {:term:`α'`} through resonance:
which echoes carry authentic signature vs. mere resemblance?
.. lens:: GP
:term:`α` ≡ {:term:`α'`} through flow:
where does the pattern want to unfold next?
.. lens:: GV
:term:`α` ≡ {:term:`α'`} through benefit:
how is naming :term:`α` itself already a gift?
Corruption modes for G
----------------------
.. corruption:: L1
Closing: pattern closed into answer prematurely.
.. corruption:: L2
Generating: patterns not anchored to :term:`X`.
A page like this carries the full holographic structure for one phase: equation, context chain in and out, decoding steps with every symbol cross-referenced to the glossary, all five lenses validated by the directive, and the corruption modes specific to G. The same shape repeats for S, Q, P, V — five pages, identically structured, mirroring the holographic law in the file system.
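Since the lens blocks repeat the same shape across all five pages, they can be generated rather than hand-typed. A hedged sketch (emit_lenses is a hypothetical helper, not part of the surface; the quality words are taken from the lens texts above):

```python
# Quality each phase lends when it appears as the second letter of a lens.
QUALITIES = {"S": "openness", "G": "pattern", "Q": "resonance",
             "P": "flow", "V": "benefit"}

def emit_lenses(target: str) -> str:
    """Emit the five ``.. lens::`` stubs for one phase page.

    Hypothetical helper: the stub bodies are placeholders to be
    filled in by hand on each page.
    """
    blocks = []
    for borrow in "SGQPV":
        blocks.append(
            f".. lens:: {target}{borrow}\n\n"
            f"   decoding through {QUALITIES[borrow]}: ...\n"
        )
    return "\n".join(blocks)

stubs = emit_lenses("G")
assert stubs.count(".. lens::") == 5   # five lenses per phase page
assert ".. lens:: GQ" in stubs         # G through resonance
```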
What this surface enables
The Codex now exists in a substrate that builds. sphinx-build produces HTML, PDF (via xelatex with full Unicode preservation), and ePub. Hosting on Read the Docs is a single config file. Search is built in. Versioning is built in.
For the rest of the Surfaces series, this matters in three ways.
The intersphinx primitive in conf.py means every later surface — fivqln-py, the MCP server, the TypeScript bindings — can cross-reference the canonical Codex documentation directly. A docstring in S3's Pydantic model can link to :term:`α` in this surface. The link resolves at build time. Drift between surfaces becomes detectable: if S3 silently renames a symbol, the cross-reference breaks.
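Concretely, the reverse link from a later surface is one mapping entry plus one role. A sketch of what S3's side might look like (the inventory name codex and the URL are placeholders; the :external role requires Sphinx 4.4 or later):

```rst
.. In S3's conf.py:
..
..    intersphinx_mapping = {
..        "codex": ("https://example.invalid/codex/", None),
..    }

The irreducible core is :external+codex:std:term:`α`. The reference
resolves against this surface's glossary at build time, and breaks if
the term is ever dropped or renamed.
```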
The fivqln_codex_doctest module is a single Python source of truth for the spec. The doctests in this surface import from it. The Pydantic models in S3 will import from it. The TypeScript Zod schemas in S8 will be generated from it. One source. Many surfaces.
A surface that compiles is a surface that can be checked. A surface that can be checked is one C1 enforces. This is the property the rest of the series builds on.
Closing
reST is sometimes treated as merely the documentation layer beneath "real" software. The Codex inverts that. The reST surface compiles the spec faithfully, enforces it at build time, and provides the canonical reference every later surface links into. The runtime surfaces in S3 through S8 are not more real than this one — they are different shapes of the same grammar, and they will refer to this surface to know which symbols are theirs.
Ahead: S3 — Python: The Type Contract. The Constitutional Block as a frozen Python constant. Every cycle output as a Pydantic v2 model. The first surface in which the language becomes executable rather than legible.
5QLN © 2026 Amihai Loven. Open under the 5QLN Open Source License.