The Requirement
Deterministic inference is not “close enough.” It is exact, bit-perfect reproducibility. Given identical inputs, the runtime must produce identical outputs on any hardware, at any time.
The Challenge
Standard AI runtimes introduce multiple sources of variance:
Floating-Point Drift
IEEE 754 floating-point arithmetic is not associative. Different evaluation orders produce different results:
(a + b) + c ≠ a + (b + c)
Even small rounding differences accumulate across deep networks, breaking reproducibility.
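The non-associativity is easy to demonstrate directly in Python:

```python
# IEEE 754 addition is not associative: grouping changes how rounding occurs.
a, b, c = 0.1, 0.2, 0.3
left = (a + b) + c   # 0.1 + 0.2 rounds up slightly before 0.3 is added
right = a + (b + c)  # 0.2 + 0.3 happens to be exactly 0.5
print(left, right)   # 0.6000000000000001 0.6
print(left == right) # False
```

A parallel reduction that sums in whichever order threads finish will hit exactly this effect, which is why evaluation order must be pinned down (see below).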
Non-Deterministic RNG
Standard random number generators are seeded from system entropy. Each run draws a different sequence, so inference is not reproducible.
Undefined Evaluation Order
Parallel operations without explicit ordering create race conditions. Thread scheduling variance changes results.
Hardware Variance
Different GPUs, CPUs, and accelerators produce different outputs for the same operation due to implementation differences.
The Solution
A deterministic runtime eliminates all sources of variance:
Fixed-Point Arithmetic
Replace floating-point with fixed-point (Q15-style) representations where precision allows:
- Stable: No rounding variance across platforms
- Reproducible: Identical inputs → identical outputs
- Debuggable: Easier to trace numeric behavior
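A minimal sketch of Q15-style arithmetic (the names `to_q15` and `q15_mul` are illustrative, not AdapterOS APIs): values are quantized once at the boundary, after which every operation is exact integer math and therefore identical on any platform.

```python
Q = 15
SCALE = 1 << Q  # 32768: one unit in Q15

def to_q15(x: float) -> int:
    # Quantize once at the boundary; all subsequent math is integer-exact.
    return round(x * SCALE)

def q15_mul(a: int, b: int) -> int:
    # Widening multiply, then shift back to Q15. The truncation here is
    # deterministic, unlike floating-point rounding under reordering.
    return (a * b) >> Q

half = to_q15(0.5)            # 16384
quarter = q15_mul(half, half) # 8192 == to_q15(0.25)
print(quarter == to_q15(0.25))  # True
```

Because integer addition and shifting are associative and exact, reordering these operations can never change the result, unlike the floating-point case above.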
Deterministic Seeding
HKDF-SHA256 derives execution seeds from input digests:
seed = HKDF-SHA256(input_digest, model_version, context)
Same input → same seed → same “random” sequence. Reproducibility without stored state.
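HKDF-SHA256 is small enough to sketch over the standard library. The argument mapping below (ikm = input_digest, salt = model_version, info = context) is an assumption about how the formula above is meant to be read; the note does not spell it out.

```python
import hmac
import hashlib

def hkdf_sha256(ikm: bytes, salt: bytes, info: bytes, length: int = 32) -> bytes:
    # RFC 5869: extract a pseudorandom key, then expand it to `length` bytes.
    prk = hmac.new(salt, ikm, hashlib.sha256).digest()
    okm, t = b"", b""
    for i in range((length + 31) // 32):
        t = hmac.new(prk, t + info + bytes([i + 1]), hashlib.sha256).digest()
        okm += t
    return okm[:length]

# Hypothetical inputs, for illustration only:
input_digest = hashlib.sha256(b"canonical-input-bytes").digest()
seed = hkdf_sha256(input_digest, salt=b"model-v1", info=b"inference-context")
```

Deriving the seed from the input digest, rather than storing it, means any verifier holding the same inputs can reconstruct the exact "random" sequence.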
Canonical Serialization
Stable input representation:
- No whitespace variance
- Defined key ordering
- Version-locked schema
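In Python, an encoding with these properties can be produced with `json.dumps` options; this is a sketch of the idea, not necessarily AdapterOS's exact canonical form.

```python
import json

def canonical_json(obj) -> bytes:
    # Sorted keys, minimal separators, ASCII-only escapes:
    # the same logical object always yields the same bytes.
    return json.dumps(obj, sort_keys=True, separators=(",", ":"),
                      ensure_ascii=True).encode("utf-8")

a = canonical_json({"b": 1, "a": 2})
b = canonical_json({"a": 2, "b": 1})
print(a)        # b'{"a":2,"b":1}'
print(a == b)   # True: source key order no longer matters
```

This matters because the input digest is computed over bytes; without a canonical form, two semantically identical requests could hash differently and break the evidence chain.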
Enforced Evaluation Order
Explicit operation sequencing:
- No race conditions
- No scheduler variance
- Reproducible across cores
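One way to sketch the idea: let workers compute partial results concurrently, but always combine them in a fixed index order, so thread completion order cannot leak into the result. (Integers stand in for fixed-point values here; the combine order is what matters once values carry rounding.)

```python
from concurrent.futures import ThreadPoolExecutor

def deterministic_sum(chunks):
    """Sum chunks with a pinned combine order.

    Partials may finish in any order across threads, but pool.map
    returns them in input order, and the final reduction is strictly
    left-to-right - so scheduling variance cannot change the result.
    """
    with ThreadPoolExecutor(max_workers=4) as pool:
        partials = list(pool.map(sum, chunks))
    total = 0
    for p in partials:  # fixed left-to-right combine
        total += p
    return total

print(deterministic_sum([[1, 2, 3], [4, 5], [6]]))  # 21
```

The contrast is with "reduce in whatever order results arrive," which is faster but makes floating-point results depend on the scheduler.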
Verification
After execution, cryptographic hashes verify bit-perfect parity:
output_hash_a = BLAKE3(output_a)
output_hash_b = BLAKE3(output_b)
assert output_hash_a == output_hash_b
If hashes match, outputs are identical down to the bit. This is the foundation of verifiable AI.
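The check is a one-liner in practice. BLAKE3 ships as a third-party `blake3` package; the standard library's `blake2b` stands in below, and the verification logic is unchanged.

```python
import hashlib

def digest(data: bytes) -> bytes:
    # Stand-in for BLAKE3: any collision-resistant hash gives the same
    # property - equal digests imply bit-identical outputs (in practice).
    return hashlib.blake2b(data, digest_size=32).digest()

# Illustrative outputs from two independent runs:
output_a = b"run-1 output bytes"
output_b = b"run-1 output bytes"
assert digest(output_a) == digest(output_b)  # bit-perfect parity
```

Comparing 32-byte digests instead of full outputs also lets a remote verifier check parity without shipping the outputs themselves.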
The Evidence Chain
Determinism enables cryptographic binding:
- Input digest (BLAKE3)
- Execution seed (HKDF)
- Output digest (BLAKE3)
- Execution receipt (signed proof)
Each inference leaves a tamper-evident trail. Replay is guaranteed.
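A minimal sketch of such a receipt. The field names are illustrative, and an HMAC stands in for the signature; a real deployment would use an asymmetric scheme (e.g. Ed25519) so verifiers do not need the signing secret.

```python
import hashlib
import hmac
import json

def make_receipt(input_digest: bytes, seed: bytes, output_digest: bytes,
                 signing_key: bytes) -> dict:
    # Bind input, seed, and output together, then sign the canonical bytes.
    body = {
        "input_digest": input_digest.hex(),
        "seed": seed.hex(),
        "output_digest": output_digest.hex(),
    }
    payload = json.dumps(body, sort_keys=True, separators=(",", ":")).encode()
    body["signature"] = hmac.new(signing_key, payload, hashlib.sha256).hexdigest()
    return body
```

Tampering with any field invalidates the signature, and because execution is deterministic, a verifier can replay the run and confirm the output digest independently.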
Use Cases
Debugging
Perfect replay for root-cause analysis. No “works on my machine” ambiguity.
Auditing
Third parties can verify claims by replaying inference with the same inputs.
Compliance
Regulated environments require reproducible evidence. Determinism provides it.
Security
Backdoor detection requires bit-perfect comparison. Non-deterministic systems cannot be verified.
AdapterOS Implementation
AdapterOS enforces determinism at the runtime level:
- Fixed-point constraints for numeric operations
- HKDF seed derivation for reproducible randomness
- Canonical JSON for stable inputs
- Explicit evaluation ordering for parallel ops
- BLAKE3 hashing for output verification
A patent application covering this approach has been filed and is under review; it is not an issued patent.
Performance Trade-offs
Determinism has costs:
- Fixed-point reduces precision (acceptable for many inference tasks)
- Explicit ordering limits parallelism (mitigated by careful op scheduling)
- Canonical serialization adds overhead (negligible for most prompts)
The benefit: perfect reproducibility. For regulated deployments, this is non-negotiable.
References
- IEEE 754: Floating-Point Arithmetic
- RFC 5869: HKDF
- BLAKE3: https://github.com/BLAKE3-team/BLAKE3
This is a canonical research note. For an interactive visualization, see ai.jkca.me/determinism.