Cryptographic proof of human oversight. Built for auditors enforcing the EU AI Act and LGPD.
The XASE Evidence Bundle is a self-contained proof package. Verify integrity without touching your systems.
$ ./verify.sh
Checking signature... ✓ Valid (signed by KMS key arn:aws:kms:...)
Checking hash chain... ✓ Intact (block 847 of 12,847)
Checking timestamps... ✓ Consistent (decision: 14:31:02, intervention: 14:32:47)
Checking model registry... ✓ Hash matches credit-scoring-v4.2.1
RESULT: Bundle is authentic and unmodified.
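Under the hood, those checks need nothing but standard cryptography. A minimal sketch of an offline verifier, assuming a bundle layout of manifest.json, signature.bin, and public_key.pem (layout and field names are illustrative, not the actual bundle format):

import { createHash, createVerify } from "node:crypto";
import { readFileSync } from "node:fs";

// Illustrative record shape: each entry commits to its predecessor.
interface ChainedRecord {
  payload: string;   // canonical JSON of the decision or intervention
  prevHash: string;  // hash of the preceding record ("GENESIS" for the first)
  hash: string;      // sha256(prevHash + payload), hex-encoded
}

function verifyBundle(dir: string): boolean {
  const manifest = readFileSync(`${dir}/manifest.json`);

  // 1. Signature: the manifest must verify against the registered public key.
  const sigOk = createVerify("sha256")
    .update(manifest)
    .verify(
      readFileSync(`${dir}/public_key.pem`),
      readFileSync(`${dir}/signature.bin`)
    );

  // 2. Hash chain: recompute every link; changing one byte breaks the chain.
  let prev = "GENESIS";
  const records: ChainedRecord[] = JSON.parse(manifest.toString());
  const chainOk = records.every((r) => {
    const expected = createHash("sha256")
      .update(r.prevHash + r.payload)
      .digest("hex");
    const ok = r.prevHash === prev && r.hash === expected;
    prev = r.hash;
    return ok;
  });

  return sigOk && chainOk;
}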
XASE maps directly to regulatory requirements. No interpretation needed.

The EU AI Act, LGPD, and central banks now require proof, not promises.
Screenshots and emails don't survive forensic examination.
"Why did the model reject?" → "It said so." → €35M fine.
Prove the human reviewed it. Cryptographically. Offline.
XASE sits between your model and your regulator. We handle the proof.
// 3 lines. That's the integration.
const record = await xase.capture({
  model: "credit-scoring-v4",
  input: { income: 85000, debt_ratio: 0.32 },
  output: { decision: "APPROVED", confidence: 0.94 },
  explain: "shap" // Auto-generates explanation
});
// record.id → "rec_8a7f3b2c..."
// record.hash → "sha256:9f86d08..."
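The explain: "shap" flag means the record carries feature attributions alongside the decision. A plausible shape for that field, assumed here rather than taken from the SDK's types:

// Assumed shape of the auto-generated explanation; field names are
// illustrative, so check the SDK's actual types before relying on them.
interface ShapExplanation {
  method: "shap";
  baseValue: number;                     // model output with no feature information
  attributions: Record<string, number>;  // signed per-feature contributions
}

// e.g. record.explanation?.attributions → { income: 0.21, debt_ratio: -0.08 }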
// When a human overrides or approves
await xase.intervene({
  recordId: "rec_8a7f3b2c",
  actor: "joao.silva@company.com",
  action: "OVERRIDE",
  newOutcome: "APPROVED",
  reason: "Collateral documentation verified manually",
  evidence: ["doc_upload_id_1", "doc_upload_id_2"]
});
// Immutable. Signed. Identity-linked. Court-ready.
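What do those four words mean in practice? A sketch of the fields such a record plausibly carries (an illustration, not XASE's actual schema):

// Illustrative shape only, not the actual XASE wire format.
interface SignedIntervention {
  recordId: string;        // identity-linked to the original model decision
  actor: string;           // authenticated reviewer identity
  action: "OVERRIDE" | "APPROVED";
  newOutcome?: string;     // present when the action changes the decision
  reason: string;
  evidence: string[];      // references to uploaded supporting documents
  timestamp: string;       // ISO 8601, server-assigned
  prevHash: string;        // chains to the prior record, so edits are detectable
  signature: string;       // KMS signature over all of the above
}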
"We believe that as AI moves from 'Chat' to 'Action', the barrier to adoption won't be intelligence—it will be trust. We are building the infrastructure that allows companies to delegate authority to machines without carrying unlimited liability."

Built by engineers who've had to integrate terrible compliance tools. We built the opposite.
import { Xase } from '@xase/sdk';
const xase = new Xase({ apiKey: process.env.XASE_API_KEY });
// 1. Register your model once
await xase.models.register({
  id: "credit-scoring-v4",
  hash: "sha256:9f86d081884c...",
  metrics: { accuracy: 0.94, auc_roc: 0.97 },
  intendedUse: "Consumer credit decisions under $100K"
});
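Where does that hash come from? One common way to produce it, assuming your model is a serialized artifact on disk (the path here is illustrative):

import { createHash } from "node:crypto";
import { readFileSync } from "node:fs";

// sha256 of the serialized model artifact; this is what the auditor's
// model-registry check compares against.
const modelHash =
  "sha256:" +
  createHash("sha256")
    .update(readFileSync("models/credit-scoring-v4.onnx"))
    .digest("hex");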
// 2. Capture every decision (async, non-blocking)
const record = await xase.capture({
  model: "credit-scoring-v4",
  input: { customerId: "cust_123", income: 85000 },
  output: { decision: "APPROVED", limit: 25000 },
  explain: "shap",
  idempotencyKey: "req_abc123"
});
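"Non-blocking" matters on a hot decision path. One pattern is to fire the capture without awaiting it, assuming the SDK deduplicates retried requests via idempotencyKey (that retry behavior is an assumption, not documented here):

// Don't stall the credit decision on evidence capture; surface failures
// to your logs instead. Assumes idempotencyKey makes retries safe.
const pending = xase.capture({
  model: "credit-scoring-v4",
  input: { customerId: "cust_123", income: 85000 },
  output: { decision: "APPROVED", limit: 25000 },
  explain: "shap",
  idempotencyKey: "req_abc123"
});
pending.catch((err) => console.error("xase capture failed:", err));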
// 3. Record human intervention when it happens
await xase.intervene({
  recordId: record.id,
  actor: "analyst@company.com",
  action: "APPROVED",
  reason: "Manual document verification completed"
});
// 4. Export for audit (or let the auditor do it)
const bundle = await xase.export({ recordId: record.id });
// Returns a signed ZIP, verifiable offline

Trust no one. Including us. That's why bundles are verifiable without calling our API.
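A plausible end of the pipeline: persist the ZIP for an air-gapped handoff. The zipBytes field name is an assumption, not the documented return type:

import { writeFileSync } from "node:fs";

// Write the signed ZIP to disk; the auditor verifies it with no network
// access and no XASE account. `bundle.zipBytes` is an assumed field name.
writeFileSync(`evidence/${record.id}.zip`, Buffer.from(bundle.zipBytes));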
Stop building internal audit tools. Use infrastructure designed for the age of AI accountability.