
We present a method for injecting hardware-sourced entropy into LLM inference to produce provable behavioral divergence, using the IBM POWER8 mftb (Move From Time Base) instruction.

Key results:
- Provable divergence: 3 runs with identical seeds produce 3 distinct MD5 output hashes
- 0.2% overhead: the burst strategy (inject every 4th token, top-512 logits only) is nearly free
- 8.81x combined speedup (16.74 to 147.54 t/s) with the full PSE stack
- 4 behavioral metrics defined (NOI, DR, ACS, MCI) for entropy-mediated quality
- Grounded in Hebbian learning theory and biological stochastic resonance

Part of the Proto-Sentient Emergence (PSE) framework.
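The burst strategy above can be sketched as follows. This is a minimal illustration, not the paper's implementation: it substitutes Python's monotonic clock for the POWER8 mftb time-base read (which requires inline PowerPC assembly), and the logit stream, perturbation scale, and function names are all hypothetical. It shows the two ingredients the abstract describes: entropy injected only every 4th token and only into the top-512 logits, with an MD5 hash over the decoded token sequence as the divergence witness.

```python
import hashlib
import time
import numpy as np

def timebase_entropy():
    # Stand-in for the POWER8 mftb instruction: the low bits of a
    # high-resolution monotonic clock act as hardware-sourced jitter.
    return time.monotonic_ns() & 0xFFFF

def perturb_logits(logits, step, interval=4, top_k=512, scale=0.05):
    # Burst strategy: inject entropy only on every `interval`-th token,
    # and only into the top-`top_k` logits, keeping overhead small.
    # `scale` is a hypothetical tuning parameter, not from the paper.
    if step % interval != 0:
        return logits
    k = min(top_k, logits.size)
    idx = np.argpartition(logits, -k)[-k:]   # indices of the top-k logits
    rng = np.random.default_rng(timebase_entropy())
    out = logits.copy()
    out[idx] += scale * rng.standard_normal(k)
    return out

def decode_hash(steps=32, vocab=1024, seed=0):
    # Greedy decode over a toy logit stream with a fixed model "seed".
    # Runs can still diverge because the perturbation draws its entropy
    # from the clock, not from the seed. Returns an MD5 digest of the
    # token sequence, mirroring the divergence check in the abstract.
    model = np.random.default_rng(seed)
    tokens = []
    for t in range(steps):
        logits = model.standard_normal(vocab)
        tokens.append(int(np.argmax(perturb_logits(logits, t))))
    return hashlib.md5(str(tokens).encode("utf-8")).hexdigest()
```

In the real system the entropy read is a single mftb register move, which is why the per-token cost stays near zero; here the clock read plays that role only for illustration.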
Priority: December 2025 (PSE framework implementation on POWER8); predates DeepSeek Engram (arXiv:2601.07372) by 27+ days.
Source code: https://github.com/Scottcjn/ram-coffers
Video evidence: https://youtu.be/T_o39s7r0iE (Dec 17, 2025)
Keywords: POWER8, non-deterministic inference, neural computation, proto-sentient emergence, hardware entropy, LLM inference, stochastic resonance, behavioral divergence, timebase register, Hebbian learning, attention mechanism, PSE framework
