Powered by OpenAIRE graph

Ultralytics YOLO

Authors: Jocher, Glenn; Qiu, Jing; Chaurasia, Ayush;


Abstract

🌟 Summary (single-line synopsis)

Ultralytics v8.4.13 makes training more resilient by automatically recovering from CUDA out-of-memory (OOM) errors during the first epoch, retrying with a smaller batch size. 🔁🧠🔥

📊 Key Changes

• Auto-retry on CUDA OOM during training (major change) 🔥🛠️
  If a CUDA OOM happens in the first epoch on a single GPU, Ultralytics will retry up to 3 times, halving the batch size each time (down to 1). The training pipeline is rebuilt after each batch reduction (dataloaders + optimizer + scheduler) to continue cleanly.
• New internal training helper 🧩
  Adds a _build_train_pipeline() method to rebuild loaders/optimizer/scheduler when the batch size changes (used by the new OOM recovery flow).
• More reliable ONNX export for OBB + NMS 📦✅
  When exporting OBB (oriented bounding box) models to ONNX with NMS enabled, simplify=True is now forced to avoid a known runtime issue (a TopK-related error in some ONNX Runtime versions).
• DGX system detection + TensorRT handling 🖥️⚙️
  Adds is_dgx() detection and uses it (along with Jetson JetPack 7) to trigger a TensorRT version check/reinstall path for better export reliability on those systems.
• Packaging stability fix: pin setuptools 🧰🔒
  Pins build requirements to setuptools<=81.0.0 to avoid breakages introduced by newer setuptools versions (notably affecting tensorflow.js export tooling).
• Docs & examples refresh (YOLO26 messaging + tracking content) 📚🎥
  Tracking docs now embed a newer multi-object tracking video featuring YOLO26 + BoT-SORT/ByteTrack. Exporter docs/examples are updated to show YOLO26 (yolo26n.pt) and mention ExecuTorch/Axelera export options (documentation signposting).
• Example dependency update 🔄
  Updates protobuf in the RT-DETR ONNX Runtime Python example.
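The OOM recovery flow above can be sketched roughly as follows. This is an illustrative sketch, not the actual Ultralytics internals: the names `run_first_epoch`, `build_train_pipeline`, and `MAX_OOM_RETRIES` are assumptions for demonstration.

```python
# Sketch of first-epoch OOM recovery: on a CUDA out-of-memory error,
# halve the batch size (down to 1), rebuild the training pipeline
# (dataloaders + optimizer + scheduler), and retry up to 3 times.

MAX_OOM_RETRIES = 3  # assumed retry budget, per the release notes


def recovery_batch_sizes(batch, retries=MAX_OOM_RETRIES):
    """Return the sequence of batch sizes tried after successive OOMs."""
    sizes = []
    for _ in range(retries):
        batch = max(1, batch // 2)  # halve, but never below 1
        sizes.append(batch)
        if batch == 1:
            break
    return sizes


def train_with_oom_recovery(trainer, batch):
    """Run the first epoch, shrinking the batch on CUDA OOM errors."""
    for attempt in range(MAX_OOM_RETRIES + 1):
        try:
            return trainer.run_first_epoch(batch)
        except RuntimeError as e:
            # Re-raise anything that is not an OOM, or if retries are exhausted.
            if "out of memory" not in str(e) or attempt == MAX_OOM_RETRIES:
                raise
            batch = max(1, batch // 2)
            # Rebuild dataloaders/optimizer/scheduler for the new batch size.
            trainer.build_train_pipeline(batch)
```

For example, starting from batch 16, successive OOMs would try 8, then 4, then 2 before giving up.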
🎯 Purpose & Impact

• Fewer training crashes for everyday users 🙌🔥
  If you start training with a batch size that's slightly too large for your GPU, Ultralytics can now self-correct and continue instead of failing immediately, which is especially helpful for beginners and for "first-epoch spikes" in memory use.
• Less manual trial-and-error 🎯
  Reduces the common loop of "OOM → lower batch → restart training," saving time and frustration.
• More dependable deployment exports 🚀
  ONNX exports for OBB models with embedded NMS should work more reliably out of the box, with fewer runtime surprises.
• More predictable builds/CI 🧱
  Pinning setuptools helps prevent sudden packaging/tooling failures across environments.
• Clearer guidance aligned with YOLO26 🧭
  Docs and examples increasingly steer users toward YOLO26 as the recommended model for training, tracking, and export workflows.

What's Changed

• feat: 🚀 NVIDIA DGX device variants check by @onuralpszr in https://github.com/ultralytics/ultralytics/pull/23573
• Add https://youtu.be/qQkzKISt5GE to docs by @RizwanMunawar in https://github.com/ultralytics/ultralytics/pull/23582
• Bump protobuf from 6.31.1 to 6.33.5 in /examples/RTDETR-ONNXRuntime-Python in the pip group across 1 directory by @dependabot[bot] in https://github.com/ultralytics/ultralytics/pull/23572
• docs: 📝 exporter documentation for new model formats and examples updated by @onuralpszr in https://github.com/ultralytics/ultralytics/pull/23585
• Force simplify=True for OBB export with NMS by @Y-T-G in https://github.com/ultralytics/ultralytics/pull/23580
• Pin setuptools version by @Burhan-Q in https://github.com/ultralytics/ultralytics/pull/23589
• ultralytics 8.4.13 Retry smaller batch on training CUDA OOM by @glenn-jocher in https://github.com/ultralytics/ultralytics/pull/23590

Full Changelog: https://github.com/ultralytics/ultralytics/compare/v8.4.12...v8.4.13
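A build-requirement pin like the one described above normally lives in the project's pyproject.toml. A minimal sketch, assuming a standard setuptools build backend (the exact table in the Ultralytics repository may differ):

```toml
[build-system]
# Pin setuptools to <=81.0.0 to avoid breakages from newer releases
# (notably affecting the tensorflow.js export tooling).
requires = ["setuptools<=81.0.0"]
build-backend = "setuptools.build_meta"
```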

If you use this software, please cite it using the metadata from this file.

  • BIP! impact indicators (citations provided by BIP!):
    • selected citations (derived from selected sources; an alternative to "Influence" that also reflects total impact based on the underlying citation network, diachronically): 0
    • popularity (the "current" impact/attention, the "hype", of an article in the research community at large, based on the underlying citation network): Average
    • influence (the overall/total impact of an article in the research community at large, based on the underlying citation network, diachronically): Average
    • impulse (the initial momentum of an article directly after its publication, based on the underlying citation network): Average