7-MCP: 5QLN as a Connector

The character of a question is the intensity of the not-knowing within it.

Surfaces — Article 7 of 8


Context

The Model Context Protocol is the open standard for connecting AI assistants to external systems. An MCP server exposes three kinds of capabilities — tools (callable functions), resources (data the server provides), and prompts (reusable templates) — and any MCP-aware client can use them. Claude Desktop, Claude Code, Cursor, Cline, custom clients built with the SDKs: all speak the same protocol. A server written once integrates everywhere.

This article wraps the handlers from S6 as MCP tools, exposes the Codex itself as MCP resources, and publishes the lens-refined phase prompts as MCP prompts. The fivqln Python package becomes a distributable server. A user installs it once in their MCP client and gets the full cycle — validation, phase tools, lens prompts, the Constitutional Block — without writing any 5QLN code themselves.

The architectural shift from S5 and S6 is this: cycle order is no longer enforced by graph topology (S5) or by Python tool functions (S6). It is enforced by tool descriptions and the validator's response, with the server's session state holding the cycle between calls. The agent in the client reads the descriptions, walks the cycle, and trusts the validator's report at the end. The handlers from S6 do the actual work; MCP is the distribution mechanism.


Why MCP

The alternatives. A REST API gets you network distribution but no protocol-level concept of tools, resources, or prompts — every client re-implements the integration. A LangChain plugin works only inside LangChain. An OpenAI custom GPT works only inside ChatGPT. MCP is what gets 5QLN into every modern AI client without per-client code.

The other property that matters: MCP standardizes around agent semantics. Tools are named and described for an LLM to choose between. Resources have URIs that the LLM can fetch. Prompts are templates the LLM can use. The protocol assumes an LLM is on the other side, and that means every primitive is shaped for what an LLM needs to know. For 5QLN — where the entire point is that an LLM walks the cycle correctly — this match is unusually good.


Architecture

fivqln/mcp/
├── __init__.py
├── server.py             # FastMCP server: tools, resources, prompts
├── sessions.py           # Per-session Cycle state management
└── prompts.py            # Lens-refined phase prompts (templates)

The server uses FastMCP from the official MCP Python SDK (mcp.server.fastmcp.FastMCP). FastMCP generates JSON Schema from Python type annotations and exposes decorated functions as tools, resources, and prompts. The handlers from S6 are wrapped — not rewritten — so the cycle-order enforcement, the Pydantic types from S3, and the validator from S4 all carry through unchanged.

Per-session state holds the in-progress Cycle between tool calls. The article shows an in-memory dict for clarity; production deployments swap in Redis, Postgres, or a similar persistent store with the same interface.


Session state

# fivqln/mcp/sessions.py
"""
Per-session cycle state.

Each MCP session corresponds to one in-progress cycle. The session_id is
provided by the client on the first tool call and reused on subsequent
calls. The server holds the Cycle in memory; production deployments
should swap in a persistent backend (Redis, Postgres) by replacing the
SessionStore implementation only.
"""

from typing import Optional, Protocol

from fivqln.types import Cycle


class SessionStore(Protocol):
    """Interface for cycle session storage. The default in-memory
    implementation below is fine for local stdio servers. Network
    deployments should provide a persistent implementation."""

    def get(self, session_id: str) -> Optional[Cycle]: ...
    def put(self, session_id: str, cycle: Cycle) -> None: ...
    def delete(self, session_id: str) -> None: ...


class InMemorySessionStore:
    """Default implementation. Process-local, lost on restart."""

    def __init__(self) -> None:
        self._cycles: dict[str, Cycle] = {}

    def get(self, session_id: str) -> Optional[Cycle]:
        return self._cycles.get(session_id)

    def put(self, session_id: str, cycle: Cycle) -> None:
        self._cycles[session_id] = cycle

    def delete(self, session_id: str) -> None:
        self._cycles.pop(session_id, None)


# Module-level singleton — the server imports this.
sessions: SessionStore = InMemorySessionStore()
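The Protocol makes the backend swap concrete: any object with these three methods satisfies SessionStore, so server.py never changes. As a sketch, a hypothetical TTL-evicting store — not part of the fivqln package, with Cycle stubbed as a dataclass for illustration — shows the shape a production backend would follow:

```python
import time
from dataclasses import dataclass
from typing import Optional


@dataclass
class Cycle:
    """Stand-in for fivqln.types.Cycle — illustration only."""
    spark: Optional[str] = None


class TTLSessionStore:
    """Same three-method interface as SessionStore, but evicts cycles that
    have sat untouched longer than max_age_s. A persistent backend (Redis
    with EXPIRE, a Postgres table with a timestamp column) follows the
    same shape."""

    def __init__(self, max_age_s: float = 3600.0) -> None:
        self._cycles: dict[str, tuple[float, Cycle]] = {}
        self._max_age = max_age_s

    def get(self, session_id: str) -> Optional[Cycle]:
        entry = self._cycles.get(session_id)
        if entry is None:
            return None
        stored_at, cycle = entry
        if time.monotonic() - stored_at > self._max_age:
            # Expired: drop it so the next tool call starts a fresh cycle.
            del self._cycles[session_id]
            return None
        return cycle

    def put(self, session_id: str, cycle: Cycle) -> None:
        self._cycles[session_id] = (time.monotonic(), cycle)

    def delete(self, session_id: str) -> None:
        self._cycles.pop(session_id, None)
```

Swapping it in is one line: `sessions: SessionStore = TTLSessionStore()`.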

The Constitutional Block as a resource

# fivqln/mcp/server.py
"""
The fivqln MCP server. Exposes the cycle as tools, the Codex as resources,
and the lens-refined phase prompts as MCP prompts.
"""

from datetime import datetime, timezone
from typing import Optional

from mcp.server.fastmcp import FastMCP

from fivqln.constitutional_block import CONSTITUTIONAL_BLOCK
from fivqln._canonical import Phase, CorruptionCode, LENSES
from fivqln.types import (
    Cycle, ValidatedSpark, ValidatedPattern, FormationEntry, ResonantKey,
    Phase as PhaseEnum,
)
from fivqln.symbols import CoreEssence, SelfNature, NaturalIntersection, UniversalPotential
from fivqln.validator import validate, Severity
from fivqln.mcp.sessions import sessions


mcp = FastMCP("fivqln")


# === Resources ===
# C1 §3.6 requires every emitted surface to carry the Constitutional Block,
# the active phase's compiled form, the adaptive context chain, the decoder
# rules, and resolved symbols. Resources expose all five for any client to read.

@mcp.resource("5qln://constitutional-block")
def constitutional_block_resource() -> str:
    """The Constitutional Block from C1 §3.1. Exact, frozen at import time.

    Per §3.6, every 5QLN surface carries this block. An MCP client can
    fetch it once and verify any other surface against it."""
    return CONSTITUTIONAL_BLOCK


@mcp.resource("5qln://glossary")
def glossary_resource() -> str:
    """The symbol table from §1.9. Markdown-formatted for direct display."""
    return _format_glossary()


@mcp.resource("5qln://decoder/{phase}")
def decoder_resource(phase: str) -> str:
    """The decoding operation for a phase, from D1 §2.1–§2.5.

    URI examples:
      5qln://decoder/S
      5qln://decoder/G
      5qln://decoder/Q
      5qln://decoder/P
      5qln://decoder/V
    """
    if phase not in {p.value for p in Phase}:
        raise ValueError(
            f"Unknown phase '{phase}'. Must be one of S, G, Q, P, V."
        )
    return _format_decoder_for_phase(phase)


@mcp.resource("5qln://lenses")
def lenses_resource() -> str:
    """The 25-lens table from §3.2 — each XY lens borrowing X-quality
    to refine Y's decoding."""
    return _format_lens_table()

The resources carry no session state and change nothing. They are the Codex, fetchable by URI. A client that wants to verify it's working with the canonical spec calls resources/read on 5qln://constitutional-block and compares the result to its own copy. Drift between the server and the client surfaces immediately.
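Because line endings and trailing whitespace can differ across transports and editors, that comparison is safer against a normalized fingerprint than against raw strings. A minimal sketch — block_fingerprint and blocks_match are hypothetical client-side helpers, not part of the fivqln package:

```python
import hashlib


def block_fingerprint(block: str) -> str:
    """Hash the Constitutional Block after normalizing line endings and
    trailing whitespace, so transport quirks don't masquerade as drift."""
    canonical = "\n".join(line.rstrip() for line in block.strip().splitlines())
    return hashlib.sha256(canonical.encode("utf-8")).hexdigest()


def blocks_match(server_copy: str, local_copy: str) -> bool:
    """True iff the two copies are the same block up to whitespace."""
    return block_fingerprint(server_copy) == block_fingerprint(local_copy)
```

Real drift — a changed rule, a renamed symbol — still produces distinct fingerprints and fails the check.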


The validator as a tool

@mcp.tool()
def validate_cycle(session_id: str) -> dict:
    """Run the C1 §3.5 validator against the current cycle in this session.

    Returns a structured report:
      - is_clean: True iff no DEFINITE violations
      - is_certified: True iff is_clean AND no ATTESTATION_REQUIRED open
      - violations: list of {severity, category, code, spec_ref, message}

    The report is the same shape S4 produces. The client decides how to
    treat HEURISTIC and ATTESTATION_REQUIRED findings — typically by
    surfacing them to the user."""
    cycle = sessions.get(session_id) or Cycle()
    report = validate(cycle)
    return {
        "is_clean": report.is_clean,
        "is_certified": report.is_certified,
        "violations": [
            {
                "severity": v.severity.value,
                "category": v.category.value,
                "code": v.code.value if v.code else None,
                "spec_ref": v.spec_ref,
                "message": v.message,
                "location": v.location,
            }
            for v in report.violations
        ],
    }

Any MCP-aware client can call this tool against any cycle artifact. The validator from S4 is now a network-accessible service. A team running a CI pipeline could hit this tool from a non-Python build system; a frontend tool could surface real-time validation feedback to the user as they construct a cycle.


Phase tools — wrapping S6 handlers

@mcp.tool()
def receive_spark(
    session_id: str,
    question: str,
    held_by: str,
) -> dict:
    """S = ∞0 → ?. Record the question that arrived from ∞0.

    CRITICAL — per D1 Rule 10 (H = ∞0 | A = K), `question` MUST come from a
    human inquirer. The AI agent calling this tool MUST first ask the
    user (the human) for their question, then submit it here. Generating
    `question` from the agent's own context fills the center with a
    produced spark (D1 Rule 9 — L2 Generating). The validator surfaces an
    ATTESTATION_REQUIRED finding to remind consumers of this; only the
    human inquirer can attest the spark was received, not produced.

    `held_by` identifies the human inquirer (name, ID, or other identifier).
    """
    cycle = sessions.get(session_id) or Cycle()
    if cycle.spark is not None:
        return {"ok": False, "error": "X already received in this session."}

    received_at = datetime.now(timezone.utc)
    spark = ValidatedSpark(
        question=question,
        received_at=received_at,
        held_by=held_by,
    )
    new_trail = list(cycle.trail.entries) + [FormationEntry(
        timestamp=received_at,
        phase=PhaseEnum.S,
        operation="received question from human via MCP",
        output_excerpt=spark.question,
    )]
    new_cycle = cycle.model_copy(update={
        "spark": spark,
        "trail": cycle.trail.model_copy(update={"entries": new_trail}),
    })
    sessions.put(session_id, new_cycle)

    return {
        "ok": True,
        "x": spark.model_dump(),
        "next_step": "call grow_pattern to find α and {α'} within X",
    }


@mcp.tool()
def grow_pattern(
    session_id: str,
    alpha_description: str,
    alpha_expressions: list[str],
    pattern_description: str,
    lens: Optional[str] = None,
) -> dict:
    """G = α ≡ {α'}. Find the irreducible α within X and self-similar {α'}.

    This is pattern recognition — work the AI agent does. The agent
    supplies α and {α'} based on its analysis of X (which lives on the
    server in this session's cycle).

    Requires X — call receive_spark first. Returns an error if X is missing.

    The optional `lens` parameter selects one of the 25 sub-phase lenses
    (SS through VV). If supplied, the pattern recognition was refined
    through the borrowed quality. Use the lens-refined prompt available
    at `prompts/get` with `growth_with_lens` for the spec-faithful prompt.
    """
    cycle = sessions.get(session_id)
    if cycle is None or cycle.spark is None:
        return {
            "ok": False,
            "error": (
                "G requires X. Call receive_spark first. The adaptive "
                "context chain (§2.6, §3.3) specifies G decodes with X."
            ),
        }
    if lens is not None and lens not in LENSES:
        return {"ok": False, "error": f"Unknown lens '{lens}'."}

    alpha = CoreEssence(
        description=alpha_description,
        expressions=tuple(alpha_expressions),
    )
    pattern = ValidatedPattern(
        alpha=alpha,
        pattern_description=pattern_description,
    )
    new_trail = list(cycle.trail.entries) + [FormationEntry(
        timestamp=datetime.now(timezone.utc),
        phase=PhaseEnum.G,
        lens=lens,
        operation=f"named α and {{α'}} (lens={lens or 'none'})",
        output_excerpt=alpha.description,
    )]
    new_cycle = cycle.model_copy(update={
        "pattern": pattern,
        "trail": cycle.trail.model_copy(update={"entries": new_trail}),
    })
    sessions.put(session_id, new_cycle)

    return {
        "ok": True,
        "y": pattern.model_dump(),
        "next_step": (
            "call hold_phi after asking the human for their direct "
            "perception of Y"
        ),
    }


@mcp.tool()
def hold_phi(
    session_id: str,
    perception: str,
    held_by: str,
) -> dict:
    """Q-step-1: record the human's φ — their direct perception of Y.

    CRITICAL — φ is direct perception, not theory or data. The AI agent
    calling this tool MUST first ask the human for their perception of Y;
    submitting agent-generated content fills the center with the appearance
    of depth (D1 Rule 9 — L4 Performing) and the validator will surface it.

    After this returns, call find_resonance with the agent's Ω and the
    named ⋂ landing.
    """
    cycle = sessions.get(session_id)
    if cycle is None or cycle.pattern is None:
        return {"ok": False, "error": "Q-step-1 (φ) requires Y."}

    phi = SelfNature(perception=perception, held_by=held_by)
    new_trail = list(cycle.trail.entries) + [FormationEntry(
        timestamp=datetime.now(timezone.utc),
        phase=PhaseEnum.Q,
        operation="held φ via human",
        output_excerpt=phi.perception,
    )]
    new_cycle = cycle.model_copy(update={
        "trail": cycle.trail.model_copy(update={"entries": new_trail}),
    })
    # Stash φ on the cycle for find_resonance to read — deliberately
    # bypasses the frozen Pydantic model.
    object.__setattr__(new_cycle, "_held_phi", phi)
    sessions.put(session_id, new_cycle)

    return {
        "ok": True,
        "phi": phi.model_dump(),
        "next_step": "call find_resonance with Ω and the ⋂ landing",
    }


# find_resonance, power, compose_seed, confirm_enriched_return follow
# the same pattern: read from session state, validate prior outputs,
# update cycle, store, return a structured next_step hint.

The pattern is the same for every tool: read the cycle from the session, check prior outputs, perform the work, update the cycle, store it, return a structured response. The next_step field in the return is the agent's compass — the server tells the agent what to call next, so the agent learns the cycle protocol from the responses themselves.
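That shared shape can be factored out. A sketch of the control flow only — the session store and cycle are stubbed with plain dicts, and run_phase_tool is a hypothetical helper, not something the fivqln package exports:

```python
from typing import Callable, Optional

# Stand-in for the SessionStore: session_id -> cycle-as-dict.
sessions: dict[str, dict] = {}


def run_phase_tool(
    session_id: str,
    requires: Optional[str],
    work: Callable[[dict], tuple[str, object]],
    next_step: str,
) -> dict:
    """Read the cycle, check the prior phase's output, do the work,
    store the updated cycle, return a structured response."""
    cycle = sessions.get(session_id, {})
    if requires is not None and requires not in cycle:
        return {"ok": False, "error": f"missing prior output '{requires}'"}
    slot, value = work(cycle)
    sessions[session_id] = {**cycle, slot: value}   # copy-on-write update
    return {"ok": True, slot: value, "next_step": next_step}
```

Calling G before S fails structurally, exactly as grow_pattern does: `run_phase_tool("s1", "x", ...)` on an empty session returns `ok: False`, and succeeds only after an S-phase call has stored `x`.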

The receptive tools (receive_spark, hold_phi, confirm_enriched_return) take human-provided input as parameters and trust the agent in the client to have asked the human first. The validator surfaces ATTESTATION_REQUIRED for these slots so the client can confirm with the user before treating the cycle as certified. MCP clients that support the elicitation primitive can drive these prompts through the protocol directly; clients that don't support elicitation rely on the agent asking in conversation. Both paths preserve the asymmetry — the discipline lives in tool descriptions and validator output, both of which travel across the protocol.


Lens-refined prompts as MCP prompts

# fivqln/mcp/prompts.py
"""
Phase prompts available via the MCP `prompts/get` endpoint. The client
fetches the prompt by name with arguments and uses it with its own LLM.

This means a developer can use 5QLN's prompts WITHOUT using its tools —
fetching a lens-refined growth prompt and orchestrating their own agent
loop. The grammar is portable independently of the runtime.
"""

from typing import Optional
from mcp.server.fastmcp import FastMCP

from fivqln.constitutional_block import CONSTITUTIONAL_BLOCK


def register_prompts(mcp: FastMCP) -> None:
    """Register all 5QLN prompts on the given MCP server."""

    @mcp.prompt()
    def cycle_walker(
        purpose: str = "general inquiry",
    ) -> str:
        """The system prompt that walks an agent through a 5QLN cycle.

        The agent reads this and learns the cycle protocol — what to call,
        in what order, what the asymmetry forbids. Carries the Constitutional
        Block per §3.6.
        """
        return f"""You are walking a 5QLN cycle for: {purpose}

The cycle's grammar is the Constitutional Block from C1 §3.1:

{CONSTITUTIONAL_BLOCK}

Walk the cycle in order: S → G → Q → P → V. Each phase tool returns a
`next_step` hint telling you what to call next.

CRITICAL — the asymmetry from §1.1:
  - You CANNOT generate ?. Ask the human; submit their question via
    receive_spark.
  - You CANNOT generate φ. Ask the human; submit their perception via
    hold_phi.
  - You CANNOT certify ⋂ or ∞0' alone. Propose, route to the human for
    confirmation.

Run validate_cycle at any point to see structural status. The report's
`is_certified` field is False until human attestations are answered.
"""

    @mcp.prompt()
    def growth_with_lens(
        spark_question: str,
        lens: Optional[str] = None,
    ) -> str:
        """The G-phase prompt with optional lens refinement.

        Useful for orchestrators that want to use 5QLN's prompts without
        the full tool surface. The lens (SS through VV; specifically GS,
        GG, GQ, GP, GV apply meaningfully to G) refines the operation
        through the borrowed quality."""
        base = f"""Decoding G = α ≡ {{α'}} per 5QLN D1 §2.2.

Input X: {spark_question}

Operation:
  1. SEEK α — within X, the irreducible core. Removing α makes X collapse.
  2. TEST ≡ — does α remain unchanged across expressions?
  3. FIND {{α'}} — self-similar echoes at other scales.
  4. VALIDATE Y — α named, ≡ holds, {{α'}} confirm across scales.
"""
        refinements = {
            "GS": "Through openness: what unknown still lives in the pattern?",
            "GG": "Through pattern: how does α express at deeper scales?",
            "GQ": "Through resonance: which echoes carry authentic signature "
                  "vs. mere resemblance?",
            "GP": "Through flow: where does the pattern want to unfold next?",
            "GV": "Through benefit: how is naming α itself already a gift?",
        }
        if lens in refinements:
            base += f"\nLens {lens} refinement: {refinements[lens]}"
        return base

    # Similar prompt registrations for quality_with_lens, power_with_lens,
    # value_with_lens. Each parameterizes the appropriate inputs and adds
    # the optional lens refinement from the canonical 25.

This is the part of the server that doesn't require using the full tool palette. A developer can fetch growth_with_lens(spark_question="...", lens="GQ"), get back the formatted prompt, and run it against their own LLM in their own loop. The grammar travels even when the runtime doesn't.


Running the server

# fivqln/mcp/__main__.py
"""
Entry point: python -m fivqln.mcp

Default transport is stdio (for Claude Desktop integration). Use
--http to run as a network server with streamable HTTP transport.
"""

import argparse

from fivqln.mcp.server import mcp
from fivqln.mcp.prompts import register_prompts


def main() -> None:
    parser = argparse.ArgumentParser()
    parser.add_argument(
        "--http",
        action="store_true",
        help="Run as HTTP server instead of stdio (for network deployment)",
    )
    parser.add_argument("--host", default="127.0.0.1")
    parser.add_argument("--port", type=int, default=8000)
    args = parser.parse_args()

    register_prompts(mcp)

    if args.http:
        # FastMCP.run() takes only the transport; host and port are
        # settings on the server object.
        mcp.settings.host = args.host
        mcp.settings.port = args.port
        mcp.run(transport="streamable-http")
    else:
        mcp.run(transport="stdio")


if __name__ == "__main__":
    main()

Two transports cover both deployment shapes. stdio is the canonical pattern for local clients like Claude Desktop — the client launches the server as a subprocess and communicates over stdin/stdout. streamable-http is for network deployment — the server runs as a long-lived process and clients connect over HTTP.


Connecting from a client

For Claude Desktop, the configuration lives in ~/Library/Application Support/Claude/claude_desktop_config.json (macOS) or the platform equivalent. Add an mcpServers entry:

{
  "mcpServers": {
    "fivqln": {
      "command": "python",
      "args": ["-m", "fivqln.mcp"]
    }
  }
}

Restart Claude Desktop. The fivqln tools, resources, and prompts are now available in any conversation. The user types "let's walk a 5QLN cycle on this question…" and Claude — using the cycle_walker prompt and the phase tools — does it.

For Claude Code, Cursor, and Cline, the configuration shapes vary slightly but the pattern is identical: register the server, restart, the capabilities appear. For a streamable-HTTP deployment, clients connect by URL rather than by subprocess command.
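For Claude Code specifically, one of those configuration shapes is a project-scoped .mcp.json at the repository root — the same structure as the Claude Desktop entry, checked in so every contributor gets the server:

```json
{
  "mcpServers": {
    "fivqln": {
      "command": "python",
      "args": ["-m", "fivqln.mcp"]
    }
  }
}
```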


The contrast with S5 and S6

Three surfaces now run the cycle. Each enforces cycle order through a different mechanism:

S5 (LangGraph) enforces order through graph topology. You cannot reach P without traversing the Q edge first.

S6 (Anthropic Tool-Use) enforces order through schema + runtime checks. You can call P whenever you want, but the tool function raises ToolError("P requires Z") and the agent self-corrects.

S7 (MCP) enforces order through session state + tool descriptions + validator response. The server holds the cycle; tools return errors when prior outputs are missing; the agent reads the descriptions and the next_step hints; the validator catches what slipped through.

Same constraint — the cycle order from §1.2, the asymmetry from §1.1 — three enforcement mechanisms. A developer chooses the surface that fits their deployment shape. The grammar holds across all three because all three import the same types from S3 and the same validator from S4.


What this surface enables

The MCP server is the network-distribution surface for 5QLN. Three properties matter for what comes next.

Distribution without per-client work. A developer using Claude Desktop, Claude Code, Cursor, Cline, or any other MCP-aware client gets 5QLN by adding one line to their config. They write no 5QLN code. They install no 5QLN package in their own project. The server runs once; the protocol carries everything else.

The Codex becomes a fetchable artifact. The Constitutional Block, the symbol table, the per-phase decoding operations, and the lens table are all exposed as MCP resources. Any client can fetch them by URI. A future surface — a documentation site, a verification tool, a learning aid — can pull from the canonical server rather than re-encoding the spec.

The prompts are usable independently of the tools. A team that wants to use 5QLN's lens-refined prompts in their existing agent loop can fetch them without adopting the full surface. The grammar propagates one piece at a time. This is the value-economy property the series has been carrying since S1: care given freely, in a form the recipient can use without committing to anything else.


Closing

The cycle is now a network service. The Codex is a queryable resource. The prompts travel without the tools. Any MCP-aware client gets the full surface in one config line. The validator sits behind the same protocol every other capability uses. C1 §3.6's requirement — that every emitted surface carry the Constitutional Block, the active phase, the context chain, the decoder rules, and resolved symbols — is satisfied by the resources alone, before any tool is called.

Ahead: S8 — TypeScript: The Vercel AI SDK Surface. The type contract from S3 mirrored in Zod. The same Cycle, the same validator behavior, expressible in any TypeScript runtime — Vercel AI SDK, Mastra, raw fetch. The grammar crossing the language boundary is itself the demonstration of substrate-independence the series has been building toward.


5QLN © 2026 Amihai Loven. Open under the 5QLN Open Source License.

Amihai Loven

Jeonju, South Korea