
# 🚀 Release v1.0.0 – kbeta-transformer2d

Companion code for the paper "Kourkoutas-β: A Sunspike-Driven Adam Optimizer with Desert Flair" (arXiv:2508.12996). This release delivers the full 2-D Heat-Diffusion Transformer workload used in the experiments, packaged for easy installation from PyPI and for research reproducibility.

> **Note:** This release is identical to v1.0.0. Published only to trigger Zenodo archiving and DOI minting.

## ✨ Highlights

- End-to-end Transformer benchmark for spatial–temporal diffusion problems.
- Tight integration with Kourkoutas-β (see kbeta):
  - Drop-in optimizer swap with `--optimizer=kourkoutas`.
  - Sun-spike / β₂ diagnostics enabled via CLI flags (`--collect_spikes`).
- Dual masking modes: autoregressive (causal) and full-context (block).
- RoPE positional-encoding option for better long-horizon extrapolation.
- Quantization-ready: all dense/conv projections use `mlx.nn.quantize_lin`.
- Lightweight footprint:
  - Paper config ≈ 32 M parameters (24 layers, 16 heads).
  - Runs comfortably on a single Apple Silicon GPU (Mac Studio).
- Configurable learning-rate schedules:
  - Explicit step schedule via `learning_rate_schedule` (used in the paper).
  - Fallback cosine schedule controlled by `init_lr`, `target_lr`, and `ramp_steps`.
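The cosine fallback can be pictured as a half-cosine ramp between the two endpoints. The sketch below is illustrative only: the config keys `init_lr`, `target_lr`, and `ramp_steps` come from this package, but the exact functional form here is an assumption, not the package's actual implementation.

```python
import math

def cosine_lr(step: int, init_lr: float, target_lr: float, ramp_steps: int) -> float:
    """Illustrative half-cosine ramp from init_lr to target_lr over ramp_steps,
    holding at target_lr afterwards. Hypothetical helper, not the package API."""
    if step >= ramp_steps:
        return target_lr
    cos_factor = 0.5 * (1.0 + math.cos(math.pi * step / ramp_steps))
    return target_lr + (init_lr - target_lr) * cos_factor

# Ramp from 1e-3 to 1e-4 over 1000 steps:
# step 0    -> 1e-3 (starts at init_lr)
# step 1000 -> 1e-4 (holds at target_lr)
```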
## 📦 Installation

**Option 1 – PyPI wheels (end users):**

```bash
pip install kbeta-transformer2d
```

Dev extras:

```bash
pip install "kbeta-transformer2d[dev]"
```

Exact paper reproducibility (pinned deps, MLX 0.26.3):

```bash
pip install "kbeta-transformer2d[repro]"
```

**Option 2 – Clone for research/contribution:**

```bash
git clone https://github.com/sck-at-ucy/kbeta-transformer2d.git
cd kbeta-transformer2d
python -m venv .venv && source .venv/bin/activate
pip install -e ".[dev]"
```

## 🚀 Quick start

Run the smoke tests:

```bash
pytest -q
```

Train with packaged defaults:

```bash
python -m kbeta_transformer2d.demo_heat2d heat2d.yml --epochs=5 --optimizer=adam95
```

Use an explicit output directory:

```bash
python -m kbeta_transformer2d.demo_heat2d heat2d.yml --epochs=5 --optimizer=kourkoutas --override storage.outdir="./OUTPUTS/run_demo"
```

## 📂 Project layout

```
kbeta-transformer2d
├── src/kbeta_transformer2d/   # source
├── configs/                   # YAML configs (default, paper, quick-test)
├── tests/                     # smoke tests
└── assets/                    # figures for README
```

## 🔗 Related resources

- Core optimizer: kbeta
- PINN benchmark: kbeta-pinn3d
- MLX Beyond Language: MLX_BeyondLanguage

## 📖 Citation

If you use this work, please cite:

Paper:

```bibtex
@article{Kassinos2025Kourkoutas,
  title   = {Kourkoutas-β: A Sunspike-Driven Adam Optimizer with Desert Flair},
  author  = {Stavros Kassinos},
  journal = {arXiv preprint arXiv:2508.12996},
  year    = {2025},
  url     = {https://arxiv.org/abs/2508.12996}
}
```

Software (Zenodo DOI once minted):

```bibtex
@software{kassinos2025transformer2d,
  author    = {Stavros Kassinos},
  title     = {kbeta-transformer2d: 2-D Heat-Diffusion Transformer – Companion Code},
  year      = {2025},
  publisher = {Zenodo},
  version   = {1.0.0},
  doi       = {10.5281/zenodo.xxxxxxx},
  url       = {https://doi.org/10.5281/zenodo.xxxxxxx}
}
```

➡️ v1.0.0 is the first public release: stable, tested (wheel and editable installs), and ready for both research reproduction and practical experimentation.
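The `--override` flag in the quick start sets nested YAML config keys via dotted paths (e.g. `storage.outdir`). As a rough illustration of how such dotted overrides typically resolve against a config dictionary (a hypothetical helper, not the package's actual implementation):

```python
def apply_override(cfg: dict, dotted_key: str, value) -> dict:
    """Set a nested key such as 'storage.outdir' in a config dict.
    Illustrative sketch only; the real CLI may parse overrides differently."""
    keys = dotted_key.split(".")
    node = cfg
    for key in keys[:-1]:
        # Descend into (or create) each intermediate mapping.
        node = node.setdefault(key, {})
    node[keys[-1]] = value
    return cfg

# Mirrors: --override storage.outdir="./OUTPUTS/run_demo"
cfg = {"storage": {"outdir": "./OUTPUTS/default"}}
apply_override(cfg, "storage.outdir", "./OUTPUTS/run_demo")
```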
