
API Contracts

1. API Contracts

1.1. Purpose

The API layer is the contractual glue that ties together:

  • The GDO (Global Disclosure Ontology) for standards resolution.
  • The ZADIF dispatcher for routing queries across agents.
  • The Computation Hub for numeric governance.
  • The Go/No-Go engine for lifecycle enforcement.

APIs must be:

  • Deterministic → predictable schemas, no hidden behavior.
  • Auditable → every call logged with provenance IDs.
  • Composable → extensible to new frameworks without code rewrites.

1.2. Core Endpoints

Endpoint | Purpose | Example Payload | Notes
/gdo/resolve | Resolve disclosure requirement → GDO node | {"query": "ESRS E1-5 GHG disclosure"} | Returns node ID, metadata, citations.
/gdo/compare | Crosswalk disclosures between frameworks | {"source":"ESRS.E1", "target":"ISSB.S2"} | Returns mapping confidence + overlap score.
/zadif/dispatch | Dispatch query to the correct agent (RAG, Compute, Validator) | {"query":"What is Scope 2 intensity?", "mode":"auto"} | Uses routing policy from ZADIF.
/compute/factor | Call Computation Hub for numeric calculation | {"method":"GHG.intensity", "inputs":{"scope1":100, "scope2":200, "revenue":50}} | Returns validated number + dataset provenance.
/eval/gate | Lifecycle gating check | {"response_id":"abc123", "tests":["citation","refusal","numeric"]} | Returns pass/fail per test.

1.3. Example: GDO API Contract

Request (POST /gdo/resolve):

gdo-api-contract.json
{
  "query": "List disclosure requirements for ESRS E1",
  "options": {
    "jurisdiction": "EU",
    "materiality": "environmental"
  }
}

Response:

gdo-api-contract-response.json
{
  "status": "ok",
  "nodes": [
    {
      "id": "ESRS.E1.5",
      "title": "GHG Emissions – Scope 1-3",
      "doc_ref": "Amended_ESRS_E1.pdf:L50-L85",
      "jurisdiction": "EU",
      "effective_date": "2025-01-01"
    }
  ],
  "provenance_id": "px-23d9d"
}

1.4. ZADIF Dispatcher (Routing Contract)

The ZADIF dispatcher routes user queries to the correct AI agent.

Routing Logic:

  • Standards Query → RAG agent.
  • Numeric Query → Computation Hub agent.
  • Validation/Check → Eval agent.
  • Multi-step → Composed pipeline.

Request (POST /zadif/dispatch):

routing-contract.json
{
  "query": "What is Scope 2 intensity per revenue?",
  "context": "ESRS E1",
  "mode": "auto"
}

Response:

routing-contract-response.json
{
  "route": "compute",
  "agent": "ComputationHub.GHG",
  "inputs": {
    "scope2": "from_context",
    "revenue": "from_context"
  },
  "provenance_id": "zadif-7f2e"
}

1.5. Computation Hub API Contract

Request (POST /compute/factor):

computation-api-contract.json
{
  "method": "GHG.intensity",
  "inputs": {
    "scope1": 100,
    "scope2": 200,
    "revenue": 50
  },
  "dataset_ref": "IPCC-EFDB.xlsx"
}

Response:

computation-api-contract-response.json
{
  "status": "ok",
  "result": 6.0,
  "unit": "tCO2e/€m",
  "provenance": {
    "dataset": "IPCC-EFDB.xlsx",
    "method": "EFDB Tier 1",
    "hash": "sha256-98ad..."
  }
}

Explanation:

The /compute/factor request is asking the Computation Hub to calculate a numeric KPI (here: GHG intensity = total GHG emissions ÷ revenue).

  • method → tells the hub what computation to run. Here "GHG.intensity" = GHG emissions per € revenue.
  • inputs → the raw values needed to perform that calculation. In the example:
    • "scope1": 100 → 100 tCO₂e direct emissions (from company data).
    • "scope2": 200 → 200 tCO₂e indirect energy emissions.
    • "revenue": 50 → 50 €m company revenue.
  • dataset_ref → which dataset / emission factor library should be used to validate or contextualize. Here, "IPCC-EFDB.xlsx" means the EFDB emission factor reference is applied.

\text{GHG intensity} = \frac{\text{scope1} + \text{scope2}}{\text{revenue}} = \frac{100 + 200}{50} = 6.0\ \text{tCO₂e/€m}

These values are illustrative inputs, not defaults. In practice:

  • Scope 1 & 2 values would come from disclosures or extracted supplier/company reports.
  • Revenue would come from the financial dataset integrated in ZAYAZ.
  • The dataset_ref (IPCC-EFDB.xlsx) ensures the computation is aligned with the right factor library and auditable.

Contracted Functions

The Computation Hub exposes methods as contracted functions. Each method is versioned and auditable.

Supported methods (v1.0)

Method | Description | Inputs | Output | Notes
GHG.intensity | GHG emissions per € revenue (Scope 1, 2, or 3) | scope1, scope2, scope3_catX, revenue | tCO₂e/€m | Category-specific optional inputs (Cat1, Cat4, etc.).
GHG.abs | Absolute GHG emissions (summed across scopes) | scope1, scope2, scope3_catX | tCO₂e | Aligns with ESRS E1 and GHG Protocol.
Energy.intensity | Energy consumption per € revenue | energy_total, revenue | MWh/€m | Energy includes fuel + purchased electricity.
Energy.abs | Absolute energy consumption | energy_total | MWh | Validated via GDO mappings.
Water.intensity | Water use per € revenue | water_withdrawal, revenue | m³/€m | Uses ESRS E3 categories (withdrawal, consumption).
Water.abs | Absolute water withdrawal/consumption | water_withdrawal, water_consumption | m³ | Split by category (surface, groundwater, etc.).
Waste.intensity | Waste generated per € revenue | waste_total, revenue | tonnes/€m | Maps to ESRS E5 disclosures.
Waste.abs | Absolute waste generated | waste_total | tonnes | Supports hazardous/non-hazardous split.
GHG.target_gap | Distance-to-target emissions gap | current_emissions, target_emissions | % gap | Used for scenario pathways.
GHG.scenario_align | Alignment with 1.5°C / 2°C scenarios | emissions, scenario_curve_ref | % alignment | Requires scenario dataset (e.g., IEA, IPCC SSPs).

Example Call: Scenario Alignment

Request (POST /compute/factor):

scenario-alignment.json
{
  "method": "GHG.scenario_align",
  "inputs": {
    "emissions": [[2025, 12500], [2026, 12000], [2027, 11000]],
    "scenario_curve_ref": "IPCC_SSP1_1.9"
  },
  "options": {
    "jurisdiction": "EU",
    "sector": "Manufacturing"
  }
}

Response:

scenario-alignment-response.json
{
  "status": "ok",
  "result": 0.87,
  "unit": "alignment_score",
  "provenance": {
    "dataset": "IPCC Scenario SSP1-1.9",
    "method": "Scenario alignment check",
    "hash": "sha256-45c1..."
  }
}

Interpretation: The company’s emissions trajectory yields an alignment score of 0.87 against the 1.5°C-compatible curve.
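
A minimal sketch of how such an alignment score might be computed. This is illustrative only: the actual scoring method behind GHG.scenario_align is not specified here, and it assumes the emissions series is given as (year, value) pairs and the scenario curve resolves to a per-year budget map.

scenario-alignment-sketch.py
# Illustrative only: the real scoring method and curve format are assumptions.
def scenario_alignment_score(emissions: list[list[float]], scenario_budget: dict[int, float]) -> float:
    """Share of overlapping years where actual emissions stay within the scenario budget."""
    actual = {int(year): value for year, value in emissions}
    years = sorted(set(actual) & set(scenario_budget))
    if not years:
        return 0.0
    within = sum(1 for y in years if actual[y] <= scenario_budget[y])
    return within / len(years)

print(scenario_alignment_score(
    [[2025, 12500], [2026, 12000], [2027, 11000]],
    {2025: 13000, 2026: 11900, 2027: 11500},  # hypothetical SSP1-1.9 budget values
))  # 0.67 on this toy data; the 0.87 above comes from the real method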

1.6. Compute Method Registry - Schema Reference

Compute Method Registry

Source file: compute_method_registry.xlsx
Signal | Type | Example | Description
method_id | TEXT (PK) | MID-00001 | Surrogate ID for the compute method, e.g. MID-00001. Used as a stable contract ID across ZAYAZ.
method_name | TEXT | GHG.intensity | Canonical method name, e.g. GHG.intensity, Water.abs. Human/machine-friendly semantic identifier.
version | TEXT | 1.0.0 | Semantic version of the method implementation and schema (e.g., 1.0.0). Enables side-by-side versions.
status | TEXT | supported | Lifecycle state: supported, beta, or deprecated. Used by discovery and allowlists.
description | TEXT | Computes GHG intensity per revenue unit | Human-readable summary of what the method computes and when to use it.
inputs_schema_json | TEXT | zar://json-schema/ghg_intensity_inputs_v1 | ZAR artifact ID for the JSON Schema describing required/optional input fields and constraints (units, mins, enums). Enforced at runtime.
options_schema_json | TEXT | zar://json-schema/ghg_intensity_options_v1 | ZAR artifact ID for the JSON Schema describing optional method options (e.g., jurisdiction, scope3_category). May point to a trivial {} schema.
output_schema_json | TEXT | zar://json-schema/ghg_intensity_output_v1 | ZAR artifact ID for the JSON Schema describing the method output payload (e.g., result, unit, provenance). Enforced post-compute.
implementation_ref | TEXT | python://zayaz.compute.ghg:intensity_v1 | Resolvable entrypoint, e.g. python://zayaz.compute.ghg:intensity_v1 or a container/image ref. The loader imports/calls this.
dataset_requirements | JSONB | ["IPCC-EFDB","IEA_SSP"] | Array/object describing required datasets/refs. Validated against the dataset_key registry.
acl_tags | JSONB | ["EU_ONLY","NO_PHI"] | Array of policy tags for enforcement. Validated against the ACL Tag Registry (acl_tag_registry).
created_at | TIMESTAMPTZ | 2025-01-12T09:14:22Z | Row creation timestamp (UTC).
updated_at | TIMESTAMPTZ | 2025-03-03T16:41:09Z | Row last update timestamp (UTC).

Family: id, code, classification, text, audit
Used by engines: SEM, MICE, ZARA, ZAAM, ZHIF, ZSSR, ALTD
Used by modules: CHUB, ZARA, RIHUB, SIS

Compute Method Latest

Source file: compute_method_latest.xlsx
Signal | Type | Example | Description
method_id | TEXT | MID-00001, MID-00042 | Method ID (e.g., MID-00001). Primary key; one “latest” row per method.
version | TEXT | 1.1.2, 1.0.0 | Version currently designated as “latest” for this method (e.g., 1.1.0). FK to compute_method_registry(method_id, version).
updated_at | TIMESTAMPTZ | 2025-01-15T08:22:12Z, 2025-01-16T10:45:09Z | Timestamp (UTC) when the “latest” pointer was last changed.
note | TEXT | Promoted after validation and drift checks., Hotfix release validated and approved by platform ops. | Admin/operator note explaining why this version is latest (e.g., “post-gate promotion”, “hotfix”).

Family: id, code, audit, text
Used by engines: ZSSR, ZARA, MICE, ALTD
Used by modules: CHUB, ZARA, SIS

Compute Executions

Source file: compute_executions.xlsx
Signal | Example | Description
exec_id | 01J9Z9K4M2F7W7N4Q7ZJ9V9XGF, 01J9Z9KCN4AMJ5T2W6S5XYF4E9 | Globally unique execution ID (ULID). Primary key.
method_id | MID-00001, MID-00042 | Method ID (e.g., MID-00001). FK part 1 → ref_compute_method_registry.method_id.
version | 1.1.2, 1.0.0 | Exact version used (e.g., 1.0.0). FK part 2 → ref_compute_method_registry.version.
tenant_id | eco-197-123-456-789, eco-044-001-882-331 | Tenant/customer identifier (used for multi-tenant isolation and audit).
inputs_hash | sha256:3fa42eab87cbdd29d8fd0c9a8bd1cb6a543920d8e16c08c28a1c78c9231e4b92 | Cryptographic hash (e.g., sha256:...) of the normalized input payload.
options_hash | sha256:92f8aa769e12b2a3bfb2f2c91afbd5311a37d4a0e366620eefb4e32976c029d2 | Cryptographic hash of the normalized options payload.
output_hash | sha256:4cd920fa8912a4627bb9eab32c5fdd314ff2a119e80cc291eabe927caae123ef | Cryptographic hash of the normalized output payload. Optional if execution failed pre-output.
dataset_hashes | IPCC-EFDB:sha256-a123bc9f1d92f12398fabc77bb38d99f2c91afbd5311a37d, IEA_SSP:sha256-98231acd19fa0287fb29c8ff093a772cd44aa2bc1faff391 | Array/map of dataset artifact hashes used (e.g., ["IPCC-EFDB:sha256-...","IEA_SSP:sha256-..."]).
provenance_id | prov-7f892b1a3cdd4e8f8fba2e, prov-c102aa39efdd4694948df2 | Correlation ID returned to the caller; links UI/API to this audit row.
latency_ms | 143, 212 | End-to-end compute latency in milliseconds.
status | ok, error | Final outcome: ok or error.
error_code | null, SCHEMA_VALIDATION_FAILED | Machine-readable error code if status='error' (e.g., SCHEMA_VALIDATION_FAILED).
created_at | 2025-02-01T10:14:33Z, 2025-02-01T10:14:34Z | Execution timestamp (UTC).
region | eu-north-1, eu-central-1 | Region tag where the compute ran (e.g., eu-north-1).
storage_ref | s3://zayaz-computations/exec/01J9Z9K4M2F7W7N4Q7ZJ9V9XGF/input_output.json | Pointer to a sealed blob with full input/output snapshots, if retained.
caller_ip | 192.168.14.52, 15.188.110.23 | Source IP for security / anomaly analysis.

Family: id, code, metric, classification, audit, geo
Used by engines: MICE, ALTD, ZSSR, ZARA, DICE, DAVE, SEM, POSTH
Used by modules: CHUB, SIS, ZARA, RIHUB
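
The inputs_hash, options_hash, and output_hash columns assume payloads are normalized before hashing, so that semantically identical payloads hash identically. A minimal sketch of that canonicalization; the exact rules (sorted keys, compact separators) are an assumption:

payload-hash-sketch.py
import hashlib
import json

def payload_hash(payload: dict) -> str:
    # Canonicalize: sorted keys, no insignificant whitespace, UTF-8 bytes.
    canonical = json.dumps(payload, sort_keys=True, separators=(",", ":"), ensure_ascii=False)
    return "sha256:" + hashlib.sha256(canonical.encode("utf-8")).hexdigest()

# Same inputs in a different key order produce the same hash:
assert payload_hash({"scope1": 100, "scope2": 200}) == payload_hash({"scope2": 200, "scope1": 100})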

Example row:

compute_method_registry.sql
INSERT INTO compute_method_registry (
  method_id, version, status, description,
  inputs_schema_json, options_schema_json, output_schema_json,
  implementation_ref, dataset_requirements, acl_tags
) VALUES (
  'GHG.intensity', '1.0.0', 'supported',
  'GHG emissions per € revenue (Scopes 1/2/3 or category subsets)',
  '{
    "$schema":"https://json-schema.org/draft/2020-12/schema",
    "type":"object",
    "properties":{
      "scope1":{"type":"number","minimum":0},
      "scope2":{"type":"number","minimum":0},
      "scope3_cat1":{"type":"number","minimum":0},
      "revenue":{"type":"number","exclusiveMinimum":0}
    },
    "required":["revenue"],
    "additionalProperties":false
  }'::jsonb,
  '{
    "type":"object",
    "properties":{"scope3_category":{"type":"string"}, "jurisdiction":{"type":"string"}},
    "additionalProperties":false
  }'::jsonb,
  '{
    "type":"object",
    "properties":{"result":{"type":"number"},"unit":{"type":"string"},"provenance":{"type":"object"}},
    "required":["result","unit","provenance"]
  }'::jsonb,
  'python://zayaz.compute.ghg:intensity_v1',
  '["IPCC-EFDB"]'::jsonb,
  '["EU_ONLY"]'::jsonb
);

Service startup: dynamic loader

compute_registry.py
# compute_registry.py
from importlib import import_module
from jsonschema import Draft202012Validator


class MethodSpec:
    def __init__(self, row):
        self.id = row["method_id"]
        self.version = row["version"]
        self.status = row["status"]
        self.inputs_schema = row["inputs_schema_json"]
        self.options_schema = row["options_schema_json"]
        self.output_schema = row["output_schema_json"]
        self.impl_ref = row["implementation_ref"]
        self.datasets = row["dataset_requirements"] or []
        self.acl = row["acl_tags"] or []


def load_methods(db):
    specs = {}
    for row in db.query("SELECT * FROM compute_method_registry WHERE status <> 'deprecated'"):
        spec = MethodSpec(row)
        # Resolve entrypoint like "python://package.module:function"
        module_path, fn_name = spec.impl_ref.replace("python://", "").split(":")
        fn = getattr(import_module(module_path), fn_name)
        # Precompile validators
        spec.validate_in = Draft202012Validator(spec.inputs_schema)
        spec.validate_opt = Draft202012Validator(spec.options_schema or {"type": "object"})
        spec.validate_out = Draft202012Validator(spec.output_schema)
        specs[(spec.id, spec.version)] = (spec, fn)
    return specs


METHODS = load_methods(db)  # `db` is the service's database handle, wired at startup

On request:

compute_registry-or.py
def execute(method_id, version, inputs, options, ctx):
    # Exact-version lookup in the registry loaded at startup
    spec, fn = METHODS[(method_id, version)]

    # --- contract validation ---
    spec.validate_in.validate(inputs)
    spec.validate_opt.validate(options or {})

    # --- governance / trust gates ---
    if not ctx.datasets_ok(spec.datasets):
        raise PermissionError("Dataset provenance or availability check failed")
    if not ctx.acl_ok(spec.acl):
        raise PermissionError("ACL / residency check failed")

    # --- execution (pure function) ---
    result = fn(inputs, options or {}, ctx)

    # --- output contract ---
    spec.validate_out.validate(result)

    # --- audit trail ---
    ctx.audit_log(
        method_id=method_id,
        version=version,
        inputs=inputs,
        options=options,
        result=result,
        datasets=spec.datasets,
    )

    return result

Method implementation skeleton

ghg.py
# zayaz/compute/ghg.py
def intensity_v1(inputs, options, ctx):
    s1 = inputs.get("scope1", 0.0)
    s2 = inputs.get("scope2", 0.0)
    s3 = sum(v for k, v in inputs.items() if k.startswith("scope3_"))
    rev = inputs["revenue"]  # schema guarantees present and > 0
    val = (s1 + s2 + s3) / rev
    return {
        "result": val,
        "unit": "tCO2e/€m",
        "provenance": {
            "dataset": "IPCC-EFDB.xlsx",
            "method": "EFDB Tier 1",
            "hash": ctx.dataset_hash("IPCC-EFDB.xlsx"),
        },
    }

Discovery endpoint

Expose a discovery API so methods are never hard-coded:

compute-methods.http
GET /compute/methods
Accept: application/json
compute-methods.json
{
  "methods": [
    {
      "method_id": "GHG.intensity",
      "version": "1.0.0",
      "status": "supported",
      "description": "GHG emissions per € revenue…",
      "inputs_schema": { "$ref": "https://schemas.zayaz.io/compute/GHG.intensity/inputs/1.0.0" },
      "options_schema": { "$ref": "https://schemas.zayaz.io/compute/GHG.intensity/options/1.0.0" },
      "output_schema": { "$ref": "https://schemas.zayaz.io/compute/GHG.intensity/output/1.0.0" }
    },
    {
      "method_id": "GHG.scenario_align",
      "version": "1.0.0",
      "status": "beta",
      "description": "Scenario alignment against pathways",
      "inputs_schema": { "$ref": "https://schemas.zayaz.io/compute/GHG.scenario_align/inputs/1.0.0" },
      "options_schema": { "$ref": "https://schemas.zayaz.io/compute/GHG.scenario_align/options/1.0.0" },
      "output_schema": { "$ref": "https://schemas.zayaz.io/compute/GHG.scenario_align/output/1.0.0" }
    }
  ]
}

CI/CD & Governance

Contract tests (CI) - bash:

# 1) Lint schemas
jsonschema-lint schemas/**/*.json

# 2) Validate sample payloads against registry schemas
pytest tests/compute_contracts/ --maxfail=1 -q

# 3) Golden tests (determinism within tolerances)
pytest tests/compute_golden/ --maxfail=1 -q

Promotion policy:

  • New method → status = beta, requires:
    • Passing Evaluation Harness numeric gates (0% numeric hallucination).
    • SLO checks (latency P95, error rate).
    • Data residency & ACL checks.
  • After approval → flip to supported via DB migration/PR.
  • Deprecation: mark deprecated, keep available for 12 months; discovery API returns "status":"deprecated" and a "replacement":"<method,version>".
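
For example, a deprecated entry in the discovery response might look like this (the field layout is illustrative; the status and replacement fields follow the policy above):

deprecated-method-example.json
{
  "method_id": "GHG.intensity",
  "version": "1.0.0",
  "status": "deprecated",
  "replacement": "GHG.intensity,1.1.0"
}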

Helm values / feature flagging:

feature-flagging.yaml
compute:
  allow_methods:
    - method_id: GHG.intensity
      version: "1.0.0"
    - method_id: GHG.scenario_align
      version: "1.0.0"

The service enforces an allowlist on top of the registry for staged rollout.
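
A minimal sketch of that allowlist gate, assuming the Helm values above are parsed into a set at startup (names are illustrative):

allowlist-check-sketch.py
# Illustrative allowlist gate layered on top of the registry lookup.
ALLOWED = {("GHG.intensity", "1.0.0"), ("GHG.scenario_align", "1.0.0")}  # parsed from Helm values

def check_allowlisted(method_id: str, version: str) -> None:
    if (method_id, version) not in ALLOWED:
        raise PermissionError(f"{method_id}@{version} is registered but not allowlisted for this environment")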

1.7. Standards Frameworks — Dynamic Policy Gate (Production Pack)

  • Postgres DDL: frameworks & criteria

Input Table Structure

Source file: standards_frameworks.xlsx
Signal | Type | Example | Description
framework_id | string (pk) | ESRS | Canonical identifier of a standards framework (e.g. ESRS, ISSB, SEC). Globally stable and later usable as a USO concept anchor, a policy routing key in TR/PG, and a compliance scope discriminator in ZADIF.
name | string | European Sustainability Reporting Standards | Human-readable name of the standards framework.
status | enum | supported | Lifecycle status of the framework (supported, beta, deprecated).
description | string | ESRS set of standards under CSRD for sustainability disclosures | Textual description of the scope and purpose of the framework.
created_at | timestamptz | 2025-01-10T08:22:14Z | Timestamp when the framework record was created.
updated_at | timestamptz | 2025-06-02T11:05:37Z | Timestamp when the framework record was last updated.

Family: code, text, classification, audit
Used by engines: TRPG, ZADIF, ZARA, ZSSR, ALTD
Used by modules: SIS, ZARA, IHUB, RIHUB
standards_frameworks.sql
-- 1. Frameworks master
CREATE TABLE IF NOT EXISTS standards_frameworks (
  framework_id TEXT PRIMARY KEY,                    -- e.g., 'ESRS', 'ISSB', 'SEC'
  name         TEXT NOT NULL,                       -- human label
  status       TEXT NOT NULL DEFAULT 'supported',   -- supported|beta|deprecated
  description  TEXT NOT NULL DEFAULT '',
  created_at   TIMESTAMPTZ NOT NULL DEFAULT now(),
  updated_at   TIMESTAMPTZ NOT NULL DEFAULT now()
);

Input Table Structure

Source file: standards_criteria.xlsx
Signal | Type | Example | Description
criterion_id | string (pk) | ESRS.data_residency.eu_only | Canonical identifier of a policy criterion (e.g. ESRS.data_residency.eu_only).
framework_id | string (fk) | ESRS | Reference to the owning standards framework.
version | string | 1.0.0 | Semantic version of the criterion rule.
engine | enum | jsonlogic | Rule evaluation engine (jsonlogic, cel).
rule_json | jsonb | {"==":[{"var":"data_region"},"EU"]} | Serialized rule definition executed by the policy engine.
severity | enum | block | Enforcement level (block, warn, info).
description | string | Requires all regulated data to remain within EU regions | Human-readable explanation of the criterion.
created_at | timestamptz | 2025-02-14T09:31:07Z | Timestamp when the criterion was created.
updated_at | timestamptz | 2025-05-21T16:44:52Z | Timestamp when the criterion was last updated.

Family: code, classification, text, audit
Used by engines: TRPG, ZADIF, ZARA, ALTD
Used by modules: SIS, ZARA, RIHUB
standards_criteria.sql
-- 2. Criteria catalog (per framework)
-- Each row is a boolean rule expressed as JSONLogic or CEL (your choice).
CREATE TABLE IF NOT EXISTS standards_criteria (
  criterion_id TEXT PRIMARY KEY,                 -- e.g., 'ESRS.data_residency.eu_only'
  framework_id TEXT NOT NULL REFERENCES standards_frameworks(framework_id) ON DELETE CASCADE,
  version      TEXT NOT NULL,                    -- semantic version for the rule
  engine       TEXT NOT NULL,                    -- 'jsonlogic' | 'cel'
  rule_json    JSONB NOT NULL,                   -- JSONLogic expr or CEL serialized config
  severity     TEXT NOT NULL DEFAULT 'block',    -- block|warn|info
  description  TEXT NOT NULL DEFAULT '',
  created_at   TIMESTAMPTZ NOT NULL DEFAULT now(),
  updated_at   TIMESTAMPTZ NOT NULL DEFAULT now(),
  UNIQUE (framework_id, criterion_id, version)
);

Input Table Structure

Source file: tenant_frameworks.xlsx
Signal | Type | Example | Description
tenant_id | string (pk, fk) | eco-196-123-456-789 | Tenant identifier (E-C-O™ Number or internal tenant key).
framework_id | string (pk, fk) | ESRS | Framework enabled or disabled for the tenant.
status | enum | enabled | Enablement state (enabled, disabled).
notes | string | Enabled for CSRD FY2025 reporting | Tenant-specific annotations or governance notes.
created_at | timestamptz | 2025-01-22T10:18:05Z | Timestamp when the tenant-framework link was created.
updated_at | timestamptz | 2025-06-01T15:42:11Z | Timestamp when the tenant-framework link was last updated.

Family: id, code, classification, text, audit
Used by engines: ZADIF, ZARA, ALTD, TRPG
Used by modules: SIS, ZARA
tenant_frameworks.sql
-- 3. Framework enablement per tenant (allowlist)
CREATE TABLE IF NOT EXISTS tenant_frameworks (
  tenant_id    TEXT NOT NULL,
  framework_id TEXT NOT NULL REFERENCES standards_frameworks(framework_id) ON DELETE CASCADE,
  status       TEXT NOT NULL DEFAULT 'enabled',  -- enabled|disabled
  notes        TEXT NOT NULL DEFAULT '',
  created_at   TIMESTAMPTZ NOT NULL DEFAULT now(),
  updated_at   TIMESTAMPTZ NOT NULL DEFAULT now(),
  PRIMARY KEY (tenant_id, framework_id)
);

Input Table Structure

Source file: v_framework_criteria_latest.xlsx
Signal | Type | Example | Description
framework_id | string | ESRS | Framework identifier.
criterion_id | string | ESRS.data_residency.eu_only | Criterion identifier.
version | string | 1.2.0 | Latest version of the criterion.
engine | enum | jsonlogic | Rule engine used.
rule_json | jsonb | {"==":[{"var":"data_region"},"EU"]} | Latest executable rule definition.
severity | enum | block | Enforcement severity.
description | string | Enforces EU-only data residency for regulated datasets | Criterion description.
updated_at | timestamptz | 2025-06-10T09:55:31Z | Timestamp of the latest criterion update.

Family: code, classification, text, audit
Used by engines: TRPG, ZADIF, ZARA, ALTD
Used by modules: SIS, ZARA, RIHUB
v_framework_criteria_latest.sql
-- 4. Optional: "latest" view for criteria per framework
CREATE VIEW v_framework_criteria_latest AS
SELECT DISTINCT ON (framework_id, criterion_id)
  framework_id, criterion_id, version, engine, rule_json, severity, description, updated_at
FROM standards_criteria
ORDER BY framework_id, criterion_id, updated_at DESC;

Notes:

  • standards_frameworks

    • framework_id should be globally stable and later usable as:
      • a USO concept anchor
      • a policy routing key in TR/PG
      • a compliance scope discriminator in ZADIF
  • standards_criteria table

    • ZAYAZ best-practice extension (recommended):
      • Add applicable_scope later (reporting, data_ingest, export, ai_action)
      • Add trust_impact_weight to feed VTE / DaVE scoring
  • tenant_frameworks

    • Important design intent:
      • This table is what lets ZARA ask:
      • “Which frameworks am I allowed to reason under for this tenant?”
  • v_framework_criteria_latest

    • v_framework_criteria_latest is a read-optimized derived view, not a persistence table. It provides a deterministic projection of the most recent criterion version per framework and criterion ID, intended for:
      • policy evaluation engines
      • runtime compliance checks
      • API consumers requiring “current truth” without version negotiation
  • Seed data: ESRS, ISSB, SEC (minimum viable)

-- Frameworks
INSERT INTO standards_frameworks (framework_id, name, status, description)
VALUES
  ('ESRS','EU ESRS','supported','European Sustainability Reporting Standards'),
  ('ISSB','ISSB IFRS S1/S2','beta','IFRS Sustainability & Climate disclosures'),
  ('SEC','US SEC Climate','beta','US SEC climate-related disclosures')
ON CONFLICT (framework_id) DO UPDATE
SET name=EXCLUDED.name, status=EXCLUDED.status, description=EXCLUDED.description, updated_at=now();

-- Criteria (examples)
-- ESRS: require EU data residency for compute + storage
INSERT INTO standards_criteria (criterion_id, framework_id, version, engine, rule_json, severity, description)
VALUES
  ('ESRS.data_residency.eu_only','ESRS','1.0.0','jsonlogic',
   '{"==":[{"var":"tenant.region"}, "EU"]}', 'block',
   'Tenant region must be EU for ESRS processing/export'),
  ('ESRS.pack.enabled','ESRS','1.0.0','jsonlogic',
   '{"in":[ "ESRS", {"var":"tenant.framework_allowlist"} ]}', 'block',
   'ESRS must be allowlisted for this tenant');

-- ISSB: allow global residency, but require ISSB pack enablement
INSERT INTO standards_criteria (criterion_id, framework_id, version, engine, rule_json, severity, description)
VALUES
  ('ISSB.pack.enabled','ISSB','1.0.0','jsonlogic',
   '{"in":[ "ISSB", {"var":"tenant.framework_allowlist"} ]}', 'block',
   'ISSB must be allowlisted for this tenant');

-- SEC: US residency or approved exception; pack enabled
INSERT INTO standards_criteria (criterion_id, framework_id, version, engine, rule_json, severity, description)
VALUES
  ('SEC.residency.us_or_exception','SEC','1.0.0','jsonlogic',
   '{"or":[ {"==":[{"var":"tenant.region"}, "US"]}, {"==":[{"var":"tenant.policy_exception"}, true]} ]}', 'block',
   'SEC requires US residency or approved exception'),
  ('SEC.pack.enabled','SEC','1.0.0','jsonlogic',
   '{"in":[ "SEC", {"var":"tenant.framework_allowlist"} ]}', 'block',
   'SEC must be allowlisted for this tenant')
ON CONFLICT (criterion_id) DO NOTHING;

More granular rules can be added later (e.g., “export destinations”, “dataset white/blacklists”) by upserting new criteria rows — no app code change required.
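
For instance, an export-destination rule could be added as a plain upsert. The criterion ID, rule shape, and context variables below are illustrative:

criteria-upsert-example.sql
-- Hypothetical example: restrict ESRS exports to an approved destination list.
INSERT INTO standards_criteria (criterion_id, framework_id, version, engine, rule_json, severity, description)
VALUES
  ('ESRS.export.approved_destinations','ESRS','1.0.0','jsonlogic',
   '{"in":[{"var":"request.export_target"}, {"var":"tenant.approved_export_targets"}]}', 'block',
   'Exports under ESRS must target an approved destination')
ON CONFLICT (criterion_id) DO NOTHING;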

  • FastAPI runtime: load criteria and evaluate dynamically. Key idea: requests declare a framework (e.g., "ESRS"), and we evaluate all latest criteria for that framework (from the DB) against a context object (tenant profile + request metadata), blocking or warning based on rule outcomes.
runtime.py
# app/standards/runtime.py
from typing import Dict, List
import json

import asyncpg
from fastapi import HTTPException
from json_logic import jsonLogic  # pip install json-logic-qubit (or equivalent)


class CriteriaRepo:
    def __init__(self, pool: asyncpg.Pool):
        self.pool = pool

    async def get_latest_rules(self, framework_id: str) -> List[Dict]:
        sql = """
            SELECT framework_id, criterion_id, version, engine, rule_json, severity
            FROM v_framework_criteria_latest
            WHERE framework_id = $1
        """
        async with self.pool.acquire() as conn:
            rows = await conn.fetch(sql, framework_id)
        return [dict(r) for r in rows]


async def evaluate_framework(pool: asyncpg.Pool, framework_id: str, context: Dict):
    repo = CriteriaRepo(pool)
    rules = await repo.get_latest_rules(framework_id)
    violations = []
    for r in rules:
        engine = r["engine"]
        # asyncpg returns JSONB as text unless a codec is registered
        rule = json.loads(r["rule_json"]) if isinstance(r["rule_json"], str) else r["rule_json"]
        if engine == "jsonlogic":
            ok = bool(jsonLogic(rule, context))
        else:
            # Extend with a CEL engine if used (compile/eval CEL expr)
            # ok = eval_cel(rule, context)
            raise HTTPException(status_code=500, detail=f"Unsupported engine: {engine}")
        if not ok and r["severity"] == "block":
            violations.append({"criterion_id": r["criterion_id"], "severity": r["severity"]})
    if violations:
        raise HTTPException(
            status_code=403,
            detail={"message": "Framework policy blocked", "framework": framework_id, "violations": violations},
        )

How to use: In the router or adapter path, call evaluate_framework(pool, framework_id, context) before retrieval/compute/export. The context object is table-driven (e.g., tenant.region, tenant.framework_allowlist, request.export_target, etc.).
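
A sketch of that wiring. The endpoint name, DisclosureRequest model, and context fields are illustrative; evaluate_framework and get_pool are the functions defined above:

usage_example.py
# Illustrative wiring: gate a disclosure endpoint on framework policy first.
from fastapi import APIRouter, Depends
from pydantic import BaseModel
import asyncpg

from .runtime import evaluate_framework
from .api import get_pool

router = APIRouter()

class DisclosureRequest(BaseModel):
    framework: str
    tenant_region: str
    framework_allowlist: list[str]

@router.post("/disclosure/answer")
async def answer_disclosure(req: DisclosureRequest, pool: asyncpg.Pool = Depends(get_pool)):
    context = {
        "tenant": {"region": req.tenant_region, "framework_allowlist": req.framework_allowlist},
        "request": {"export_target": None},
    }
    await evaluate_framework(pool, req.framework, context)  # raises 403 on blocking violations
    return {"ok": True}  # retrieval/compute/export would proceed here after the gate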

Adaptive JSON Schema (driven by frameworks table)

Expose a disclosure request schema that validates framework against the DB and adapts optional/required fields based on what’s enabled. (In practice, you can serve a prebuilt static schema per framework or build at runtime.)

Minimal dynamic generator (Python)

schema.py
# app/standards/schema.py
from typing import Dict, List


def schema_for_frameworks(framework_ids: List[str]) -> Dict:
    return {
        "$id": "https://zayaz.io/schemas/disclosure.request.json",
        "type": "object",
        "required": ["framework", "datapoint_ref"],
        "properties": {
            "framework": {"type": "string", "enum": framework_ids},
            "datapoint_ref": {"type": "string"},
            "inputs": {"type": "object"},
            "options": {"type": "object"},
        },
        "allOf": [
            # Example: ESRS adds required 'citations'
            {
                "if": {"properties": {"framework": {"const": "ESRS"}}},
                "then": {
                    "properties": {
                        "citations": {
                            "type": "array",
                            "items": {"type": "string", "pattern": r"\[.+?:L\d+-L\d+\]"},
                        }
                    },
                    "required": ["citations"],
                },
            },
            # Extend with ISSB/SEC specific clauses...
        ],
    }

Endpoint that serves the current schema (reads frameworks table):

api.py
# app/standards/api.py
from fastapi import APIRouter, Depends
import asyncpg

from .schema import schema_for_frameworks

router = APIRouter(prefix="/standards", tags=["standards"])


async def get_pool() -> asyncpg.Pool:
    # wire your pool creation elsewhere and inject here
    ...


@router.get("/schema")
async def get_dynamic_schema(pool: asyncpg.Pool = Depends(get_pool)):
    rows = await pool.fetch(
        "SELECT framework_id FROM standards_frameworks WHERE status IN ('supported','beta')"
    )
    framework_ids = [r["framework_id"] for r in rows]
    return schema_for_frameworks(framework_ids)

This guarantees clients always see a schema aligned to whatever frameworks are present in the table (and their policies baked into allOf clauses).
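
Client side, the served schema can be used for validation directly. A sketch; the host URL and payload values are illustrative:

client-validation-sketch.py
import requests
from jsonschema import validate

schema = requests.get("https://api.zayaz.example/standards/schema").json()  # illustrative host
payload = {
    "framework": "ESRS",
    "datapoint_ref": "ESRS.E1.5",
    "citations": ["[Amended_ESRS_E1.pdf:L50-L85]"],
}
validate(instance=payload, schema=schema)  # raises ValidationError if ESRS requirements are unmet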

  • Admin endpoints (upsert frameworks/criteria)

Use idempotent upserts so admins can add/modify frameworks & criteria without redeploys. Wrap in RBAC (“standards_admin”) and audit every change.

admin_api.py
# app/standards/admin_api.py
import json
from typing import Any, Dict, Literal

import asyncpg
from fastapi import APIRouter, Depends
from pydantic import BaseModel, Field

router = APIRouter(prefix="/admin/standards", tags=["admin-standards"])


class FrameworkUpsert(BaseModel):
    framework_id: str = Field(pattern=r"^[A-Z]{2,8}$")
    name: str
    status: Literal["supported", "beta", "deprecated"] = "supported"
    description: str = ""


class CriterionUpsert(BaseModel):
    criterion_id: str
    framework_id: str
    version: str
    engine: Literal["jsonlogic", "cel"]
    rule_json: Dict[str, Any]
    severity: Literal["block", "warn", "info"] = "block"
    description: str = ""


async def get_pool() -> asyncpg.Pool: ...
def require_admin(): ...  # the RBAC guard


@router.post("/frameworks/upsert")
async def upsert_framework(payload: FrameworkUpsert, pool: asyncpg.Pool = Depends(get_pool), _=Depends(require_admin)):
    sql = """
        INSERT INTO standards_frameworks (framework_id, name, status, description)
        VALUES ($1,$2,$3,$4)
        ON CONFLICT (framework_id) DO UPDATE
        SET name=$2, status=$3, description=$4, updated_at=now()
        RETURNING framework_id;
    """
    async with pool.acquire() as conn:
        row = await conn.fetchrow(sql, payload.framework_id, payload.name, payload.status, payload.description)
    return {"ok": True, "framework_id": row["framework_id"]}


@router.post("/criteria/upsert")
async def upsert_criterion(payload: CriterionUpsert, pool: asyncpg.Pool = Depends(get_pool), _=Depends(require_admin)):
    sql = """
        INSERT INTO standards_criteria (criterion_id, framework_id, version, engine, rule_json, severity, description)
        VALUES ($1,$2,$3,$4,$5,$6,$7)
        ON CONFLICT (criterion_id) DO UPDATE
        SET framework_id=$2, version=$3, engine=$4, rule_json=$5, severity=$6, description=$7, updated_at=now()
        RETURNING criterion_id, version;
    """
    async with pool.acquire() as conn:
        row = await conn.fetchrow(sql, payload.criterion_id, payload.framework_id, payload.version,
                                  payload.engine, json.dumps(payload.rule_json), payload.severity, payload.description)
    return {"ok": True, "criterion_id": row["criterion_id"], "version": row["version"]}
  • Admin safety
    • Require standards_admin role (JWT claim) and log every change with a decision_id in the audit trail.
    • Optionally add dry-run endpoints that evaluate a proposed rule against sample tenant contexts before upsert.

How it ties to AI & compliance (summary)

  • Behavior adapters and routers consult the tables to know which frameworks are enabled and what rules to enforce.
  • RAG prompts and Packaging (JSON/XBRL) apply framework-specific requirements (e.g., ESRS citations, export ACLs) without changing code — admin updates the table.
  • Go/No-Go gates read the same source of truth (criteria) to determine if a call should proceed (block/warn).
  • Auditors/Verifiers can fetch a snapshot of active frameworks & criteria to see exactly what policies governed an AI response.

1.8. How the “Standards Registry” relates to the “Dataset Registry”

  • “Standards Registry” (frameworks/criteria/tenant enablement)
  • “Dataset Registry” (artifacts/schema/provenance/read policies)

The “Standards Registry” and the “Dataset Registry” are two different registries at two different abstraction layers:

What standards_frameworks is

Purpose: A compliance framework registry (policy scope + obligations layer).

It answers:

  • “Which reporting/governance frameworks exist in this tenant’s universe?” (ESRS, ISSB, SEC…)
  • “Is this framework supported/beta/deprecated?”
  • “What policy criteria (rules) apply under that framework?” (via standards_criteria)
  • “Is this framework enabled for this tenant?” (via tenant_frameworks)

Nature of data: Normative / governance metadata (framework identity + policy gates).

Primary consumers: TR/PG policy gating, ZARA compliance reasoning, FOGE form selection by framework, validation requirement selection.

What dataset_registry is

Purpose: A data artifact registry (datasets as inputs to computation).

It answers:

  • “What dataset artifact is this?” (dataset_id, provider, version)
  • “Where is it stored and what is its integrity lineage?” (checksum, validity window, docs)
  • “What is its structural contract?” (column_schema, schema_version, primary_keys, partitions)
  • “How may it be read safely?” (read_policy)
  • “How do we validate quality and freshness?” (integrity_rules, update_frequency)

Nature of data: Empirical / computational input metadata (data contracts, provenance, reproducibility).

Primary consumers: ingestion pipelines, validation engines (DaVE/DICE), compute hub / micro-engines, reproducibility/audit (ALTD), query planning/performance.

The clean distinction:

  • standards_frameworks = “What rules/obligations apply?” (policy universe)
  • dataset_registry = “What data is allowed/available to compute with, and under what contract?” (data universe)

How they relate

A framework often depends on datasets (e.g., ESRS calculations might rely on IPCC EFDB factors). That dependency should be expressed explicitly, e.g.:

How to bridge tables

framework_dataset_bindings (or framework_dataset_requirements)

  • framework_id
  • dataset_id
  • usage_purpose (e.g., emission_factors, scenario_analysis, taxonomy_mapping)
  • required_for (e.g., E1, E2, ESRS_E1-6)
  • min_dataset_version / allowed_versions
  • tenant_override_allowed (bool)
  • policy_guard (optional: link to a criterion_id that enforces use/ban)
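
A possible DDL sketch for that bridge table; the column types and constraints are assumptions based on the field list above, not a normative schema:

framework_dataset_bindings.sql
-- Sketch: approved framework→dataset dependencies.
CREATE TABLE IF NOT EXISTS framework_dataset_bindings (
  framework_id            TEXT NOT NULL REFERENCES standards_frameworks(framework_id),
  dataset_id              TEXT NOT NULL,            -- key into the Dataset Registry
  usage_purpose           TEXT NOT NULL,            -- e.g., 'emission_factors', 'scenario_analysis'
  required_for            TEXT NOT NULL DEFAULT '', -- e.g., 'E1', 'ESRS_E1-6'
  min_dataset_version     TEXT,
  allowed_versions        JSONB,
  tenant_override_allowed BOOLEAN NOT NULL DEFAULT false,
  policy_guard            TEXT,                     -- optional criterion_id enforcing use/ban
  PRIMARY KEY (framework_id, dataset_id, usage_purpose)
);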

That keeps the system auditable:

  • frameworks define obligations
  • datasets define evidence + computation inputs
  • bindings define approved dependencies

Practical example

  • standards_frameworks: ESRS (supported)
  • standards_criteria: ESRS.data_residency.eu_only (block)
  • dataset_registry: IPCC-EFDB-2023 with checksum, schema, integrity rules
  • binding: ESRS -> IPCC-EFDB-2023 used for E1 emissions factors

So when ZARA or a micro-engine runs a calculation:

  • policy gate confirms ESRS allowed + criteria satisfied
  • compute engine loads dataset by registry key/version + enforces schema/read_policy/integrity_rules

2. Versioning & Lifecycle with Composite Keys

  • Parallel versions: Run GHG.intensity v1.0.0 and v1.1.0 side-by-side for canaries/rollouts.
  • Determinism: Every compute call is reproducible: method + version uniquely identifies code, schemas, and outputs.
  • Non-breaking evolution: You can add fields/tweaks in v1.1.0 without risking live users on v1.0.0.
  • Clean deprecation: Mark old versions deprecated without touching newer ones.

Service behavior with composite key

2.1. Discovery endpoint

  • GET /compute/methods → returns all (method_id, version, status, schemas).
  • Optional: GET /compute/methods/{method_id} to list versions, with a "latest" hint from compute_method_latest.

Invocation

  • Client must pass version for determinism:
    • POST /compute/factor { "method":"GHG.intensity", "version":"1.0.0", ... }
  • Server rejects calls without version (or uses a clear policy: permit only when compute_method_latest has an entry and the caller is on an allowlist).

Promotion

  • Add new row (method_id='GHG.intensity', version='1.1.0', status='beta').
  • Run Eval Harness → flip status to supported.
  • Update compute_method_latest to '1.1.0' when ready (doesn’t affect users who still request 1.0.0 explicitly).

Deprecation

  • Set status='deprecated' for 1.0.0; keep discoverable for a grace window.
  • Optionally enforce an allowlist that blocks deprecated versions in prod tenants.

2.2. Operational Controls & RBAC

Purpose: Ensure only authorized tenants, users, or services can invoke compute methods, aligned with jurisdictional and contractual constraints.

  • Role-Based Access Control (RBAC):
    • Roles:

      • tenant_admin → full access to methods, can manage users.
      • compute_client → invoke methods approved for their tenant.
      • auditor → list methods, replay past jobs, no new execution.
      • regulator_view → read-only access to results and logs.
    • Scope:

      • RBAC applies at method level with ACL tags enforcing jurisdictional limits.
      • Example: A tenant in the US may be restricted from methods tagged EU_ONLY.

2.3. ACL Tags Integration

Each method in compute_method_registry carries acl_tags (JSON array):

tags-integration.json
{
  "method_id": "GHG.intensity",
  "version": "1.0.0",
  "acl_tags": ["EU_ONLY", "NO_PHI"]
}

Runtime enforcement checks:

  • Tenant jurisdiction (tenant.country) vs. EU_ONLY.
  • Input dataset classification (contains_PHI) vs. NO_PHI.

If constraints fail → request rejected (403 Forbidden).
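
A minimal sketch of that runtime check. The tag semantics are hard-coded here for illustration; a real implementation would read them from the ACL Tag Registry:

acl-check-sketch.py
EU_REGIONS = {"EU", "NO", "IS", "LI"}  # mirrors the acl_overrides example below

def enforce_acl_tags(acl_tags: list[str], tenant_region: str, dataset_classifications: set[str]) -> None:
    if "EU_ONLY" in acl_tags and tenant_region not in EU_REGIONS:
        raise PermissionError("403: method is EU_ONLY; tenant region is not permitted")
    if "NO_PHI" in acl_tags and "PHI" in dataset_classifications:
        raise PermissionError("403: method forbids PHI-classified input datasets")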

Example RBAC Policy (YAML)

rbac-policy.yaml
roles:
  tenant_admin:
    allow:
      - "*"
  compute_client:
    allow:
      - "GHG.intensity:*"
      - "Energy.intensity:*"
    deny:
      - "Water.abs:*"   # not licensed
  auditor:
    allow:
      - "GET:/compute/methods"
      - "GET:/compute/jobs/*"
    deny:
      - "POST:/compute/*"
  regulator_view:
    allow:
      - "GET:/audit/logs/*"
      - "GET:/compute/jobs/*"
    deny:
      - "POST:/compute/*"

acl_overrides:
  EU_ONLY:
    allow_regions: ["EU", "NO", "IS", "LI"]
  NO_PHI:
    deny_if_dataset_contains: ["PHI"]

Runtime Enforcement

  • Pre-check: Verify user role & permissions.
  • ACL evaluation: Compare request context (tenant region, dataset tags).
  • Decision log: Write decision_id with outcome (ALLOW or DENY).

Audit Coupling: Every decision is recorded with full context → reproducible RBAC behavior for verifiers and regulators.

2.4. Audit Log Example

Every RBAC and ACL decision is logged. Logs are immutable, signed, and queryable by verifiers.

audit-log-example.json
{
"timestamp": "2025-09-14T10:22:35Z",
"decision_id": "dec-8f92c6a7",
"tenant_id": "tenant-123",
"user_id": "user-456",
"role": "compute_client",
"method_id": "GHG.intensity",
"version": "1.0.0",
"request_context": {
"region": "NO",
"datasets": ["IPCC-EFDB.xlsx"],
"input_hash": "sha256:7f8a1e..."
},
"acl_evaluation": {
"EU_ONLY": "ALLOW",
"NO_PHI": "ALLOW"
},
"outcome": "ALLOW",
"latency_ms": 43,
"provenance_id": "prov-a17c22bc"
}

Key Points

  • decision_id: Unique ID for this RBAC evaluation; can be cross-referenced in other logs.
  • role: Role of the caller (compute_client in this case).
  • acl_evaluation: Shows tag-by-tag decision. Helpful when debugging jurisdictional denials.
  • outcome: Either ALLOW or DENY.
  • provenance_id: Links back to the compute job execution record for full reproducibility.

This gives us a closed loop:

  • RBAC Policy (YAML) defines what should happen.
  • Runtime Enforcement applies the rules.
  • Audit Log (JSON) proves what actually happened.

Security & Audit:

Purpose: Guarantee reproducibility, provenance, and traceability for every compute execution.

  • ACL tags (e.g., EU_ONLY) enforced per-tenant/jurisdiction.
  • Dataset requirements verified at runtime; include dataset hashes in output.
  • Audit log per call: tenant, method/version, input hash, output hash, dataset hashes, latency, decision IDs.
  • Reproducibility: every response includes a provenance_id and the method/version used.

Example: adding a new method:

  • PR adds row to compute_method_registry with schemas + implementation_ref.
  • Add implementation function under zayaz.compute.<domain>.
  • Add tests: contract validation + golden outputs.
  • CI passes → staged rollout (Helm allowlist enables for pilot tenants).
  • Promote to supported after Evaluation Harness gates pass.

2.5. Audit Log Query API

Endpoints

GET /audit/logs

Query RBAC/ACL and compute audit entries. Query params (all optional):

  • tenant_id (string)
  • user_id (string)
  • method_id (string) — e.g., GHG.intensity
  • version (string) — e.g., 1.0.0
  • decision (string) — ALLOW | DENY
  • acl_tag (string) — e.g., EU_ONLY
  • from (RFC3339) — start time
  • to (RFC3339) — end time
  • limit (int, default 100, max 1000)
  • cursor (string) — pagination token

Response (200)

audit-log-resp.json
{
  "items": [
    {
      "timestamp": "2025-09-14T10:22:35Z",
      "decision_id": "dec-8f92c6a7",
      "tenant_id": "tenant-123",
      "user_id": "user-456",
      "role": "compute_client",
      "method_id": "GHG.intensity",
      "version": "1.0.0",
      "acl_evaluation": { "EU_ONLY": "ALLOW", "NO_PHI": "ALLOW" },
      "outcome": "ALLOW",
      "provenance_id": "prov-a17c22bc",
      "region": "NO",
      "latency_ms": 43
    }
  ],
  "next_cursor": "eyJwYWdlIjoyfQ=="
}

GET /audit/logs/{decision_id}

Fetch a single audit record by decision ID. Response (200):

audit-rec.json
{
  "timestamp": "2025-09-14T10:22:35Z",
  "decision_id": "dec-8f92c6a7",
  "tenant_id": "tenant-123",
  "user_id": "user-456",
  "role": "compute_client",
  "request": {
    "path": "/compute/factor",
    "method": "POST",
    "ip": "203.0.113.10"
  },
  "method_id": "GHG.intensity",
  "version": "1.0.0",
  "request_context": {
    "region": "NO",
    "datasets": ["IPCC-EFDB.xlsx"],
    "input_hash": "sha256:7f8a1e..."
  },
  "acl_evaluation": { "EU_ONLY": "ALLOW", "NO_PHI": "ALLOW" },
  "outcome": "ALLOW",
  "provenance_id": "prov-a17c22bc",
  "links": {
    "execution": "/compute/jobs/prov-a17c22bc"
  }
}

OpenAPI Excerpt

openapi-excerpt.yaml
paths:
  /audit/logs:
    get:
      summary: Query audit logs (RBAC/ACL + compute)
      parameters:
        - { in: query, name: tenant_id, schema: { type: string } }
        - { in: query, name: user_id, schema: { type: string } }
        - { in: query, name: method_id, schema: { type: string } }
        - { in: query, name: version, schema: { type: string } }
        - { in: query, name: decision, schema: { type: string, enum: [ALLOW, DENY] } }
        - { in: query, name: acl_tag, schema: { type: string } }
        - { in: query, name: from, schema: { type: string, format: date-time } }
        - { in: query, name: to, schema: { type: string, format: date-time } }
        - { in: query, name: limit, schema: { type: integer, default: 100, maximum: 1000 } }
        - { in: query, name: cursor, schema: { type: string } }
      responses:
        '200':
          description: Audit log page
          content:
            application/json:
              schema:
                type: object
                properties:
                  items:
                    type: array
                    items: { $ref: '#/components/schemas/AuditLog' }
                  next_cursor:
                    type: string
  /audit/logs/{decision_id}:
    get:
      summary: Get a single audit log entry
      parameters:
        - in: path
          name: decision_id
          required: true
          schema: { type: string }
      responses:
        '200':
          description: The audit record
          content:
            application/json:
              schema: { $ref: '#/components/schemas/AuditLog' }

components:
  schemas:
    AuditLog:
      type: object
      properties:
        timestamp: { type: string, format: date-time }
        decision_id: { type: string }
        tenant_id: { type: string }
        user_id: { type: string }
        role: { type: string }
        method_id: { type: string }
        version: { type: string }
        request:
          type: object
          properties:
            path: { type: string }
            method: { type: string }
            ip: { type: string }
        request_context:
          type: object
          properties:
            region: { type: string }
            datasets:
              type: array
              items: { type: string }
            input_hash: { type: string }
        acl_evaluation:
          type: object
          additionalProperties:
            type: string
            enum: [ALLOW, DENY]
        outcome: { type: string, enum: [ALLOW, DENY] }
        provenance_id: { type: string }
        region: { type: string }
        latency_ms: { type: integer }
        links:
          type: object
          properties:
            execution: { type: string }

Security & Access

  • Roles allowed:

    • auditor, tenant_admin, regulator_view: read access.
    • compute_client: only own-tenant logs (enforced by tenancy scope).
  • Headers:

    • X-Tenant-ID, X-User-Role (or JWT claims) required for scoping.
  • PII:

    • IP/identifiers configurable; default redact or minimize for non-admin roles.
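
A sketch of role-aware redaction for log responses; the field choices and role names are illustrative:

redaction-sketch.py
SENSITIVE_FIELDS = ("caller_ip", "user_id")  # illustrative redaction set

def redact_for_role(entry: dict, role: str) -> dict:
    # Admin-class roles see everything; other roles get minimized records.
    if role in ("tenant_admin", "regulator_view"):
        return entry
    return {k: ("<redacted>" if k in SENSITIVE_FIELDS else v) for k, v in entry.items()}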

2.6. Index & Retention (DB)

  • Indexes:

    • (tenant_id, timestamp DESC)
    • (decision_id) unique
    • (method_id, version, timestamp DESC)
    • GIN on acl_evaluation JSON if tag filtering is heavy
  • Retention:

    • Hot (90 days) online; warm (12–24 months) in object storage.
    • Export signing + hash chains for tamper-evidence.
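
A minimal sketch of the hash-chain idea for tamper-evident exports; the chaining scheme (hash of previous chain hash plus canonical entry body) is an assumption:

hash-chain-sketch.py
import hashlib
import json

def chain_entries(entries: list[dict]) -> list[dict]:
    """Append a chain_hash to each entry so any later mutation breaks the chain."""
    prev = "sha256:" + hashlib.sha256(b"genesis").hexdigest()
    out = []
    for e in entries:
        body = json.dumps(e, sort_keys=True, separators=(",", ":"))
        prev = "sha256:" + hashlib.sha256((prev + body).encode("utf-8")).hexdigest()
        out.append({**e, "chain_hash": prev})
    return out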

2.6.1. Loader & Runtime Enforcement (Code Recipes)

Minimal loader changes (pseudocode)

load_methods.py
def load_methods(db):
    specs = {}
    rows = db.query("""
        SELECT * FROM compute_method_registry
        WHERE status <> 'deprecated'
    """)
    for r in rows:
        key = (r["method_id"], r["version"])
        specs[key] = compile_spec_and_fn(r)  # imports impl, compiles JSON Schemas
    return specs


def execute(method_id, version, inputs, options, ctx):
    spec = METHODS[(method_id, version)]  # exact version lookup
    validate_inputs(spec, inputs, options)
    enforce_acl_and_datasets(spec, ctx)
    out = spec.fn(inputs, options, ctx)
    validate_output(spec, out)
    audit(spec, inputs, out, ctx)
    return out

2.6.2. Admin Flows (SQL Playbooks)

Register new version

compute_method_registry-insert.sql
INSERT INTO compute_method_registry (method_id, version, status, description,
                                     inputs_schema_json, options_schema_json, output_schema_json,
                                     implementation_ref, dataset_requirements, acl_tags)
VALUES
  ('GHG.intensity','1.1.0','beta','Improved handling of Scope 3 categories',
   :inputs_schema, :options_schema, :output_schema,
   'python://zayaz.compute.ghg:intensity_v110', '["IPCC-EFDB"]', '["EU_ONLY"]');

Promote to “latest”

compute_method_registry-update.sql
UPDATE compute_method_registry
SET status='supported', updated_at=now()
WHERE method_id='GHG.intensity' AND version='1.1.0';

INSERT INTO compute_method_latest (method_id, version)
VALUES ('GHG.intensity','1.1.0')
ON CONFLICT (method_id) DO UPDATE SET version=EXCLUDED.version;

Deprecate old

compute_method_registry-deprecate.sql
UPDATE compute_method_registry
SET status='deprecated', updated_at=now()
WHERE method_id='GHG.intensity' AND version='1.0.0';

OpenAPI excerpt

Here’s a short OpenAPI excerpt showing how to expose both GET /compute/methods and POST /compute/factor, with the requirement that execution is deterministic only when the (method_id, version) pair is provided.

openapi-excerpt.yaml
openapi: 3.0.3
info:
  title: ZAYAZ Compute API
  version: 1.0.0

paths:
  /compute/methods:
    get:
      summary: List registered compute methods
      description: >
        Returns all registered compute methods. Clients must select both
        `method_id` and `version` to guarantee deterministic execution.
      responses:
        '200':
          description: List of methods
          content:
            application/json:
              schema:
                type: array
                items:
                  type: object
                  properties:
                    method_id:
                      type: string
                      example: "GHG.intensity"
                    version:
                      type: string
                      example: "1.0.0"
                    status:
                      type: string
                      enum: [supported, beta, deprecated]
                    description:
                      type: string
                      example: "GHG emissions per € revenue"
                    x-internal-tags:
                      type: object
                      properties:
                        acl_tags:
                          type: array
                          items:
                            type: string
                          example: ["EU_ONLY", "NO_PHI"]
                        dataset_requirements:
                          type: array
                          items:
                            type: string
                          example: ["IPCC-EFDB", "IEA_SSP"]

  /compute/factor:
    post:
      summary: Execute a compute method
      description: >
        Executes a registered compute method with inputs.
        Both `method_id` and `version` are **required** for deterministic results.
      requestBody:
        required: true
        content:
          application/json:
            schema:
              type: object
              required:
                - method_id
                - version
                - inputs
              properties:
                method_id:
                  type: string
                  example: "GHG.intensity"
                version:
                  type: string
                  example: "1.0.0"
                inputs:
                  type: object
                  additionalProperties: true
                  example:
                    scope1: 100
                    scope2: 200
                    revenue: 50
                dataset_ref:
                  type: string
                  example: "IPCC-EFDB.xlsx"
      responses:
        '200':
          description: Compute result
          content:
            application/json:
              schema:
                type: object
                properties:
                  result:
                    type: number
                    example: 6.0
                  unit:
                    type: string
                    example: "tCO2e/€m"
                  provenance:
                    type: string
                    example: "IPCC-EFDB v2023"
                  x-internal-tags:
                    type: object
                    properties:
                      acl_tags:
                        type: array
                        items:
                          type: string
                        example: ["EU_ONLY"]
                      dataset_requirements:
                        type: array
                        items:
                          type: string
                        example: ["IPCC-EFDB"]

2.6.3. Eval/Gate API Contract

Request:

eval-gate-request.json
POST /eval/gate
{
  "response_id": "resp-99c2a",
  "tests": ["citation", "numeric", "refusal"]
}

Response:

eval-gate-response.json
{
  "status": "pass",
  "results": {
    "citation": true,
    "numeric": true,
    "refusal": true
  },
  "provenance_id": "gate-5541f"
}

2.6.4. Governance

  • Schema Registry → All API schemas versioned in registry (JSON Schema + OpenAPI).
  • Immutable IDs → Every response includes provenance_id for audit.
  • Backward Compatibility → Old schema versions supported for 12 months.
  • Audit Trail → API logs exportable for assurance providers.

