
Today, the overwhelming majority of Internet of Things (IoT) and mobile edge devices are severely resource-constrained in terms of computing, memory, and energy. As a result, training Deep Neural Networks (DNNs) with the complex Backpropagation (BP) algorithm on such devices presents a major challenge. Forward-only algorithms have emerged as more computation- and memory-efficient alternatives that do not require backward passes. In this paper, we investigate binarizing state-of-the-art forward-only algorithms, applying binarization to the forward passes of PEPITA, FF, and CwComp. We evaluate these binarized forward-only algorithms and demonstrate that weight-only binarization can be up to ~31× more memory-efficient, with only minor degradation in classification performance. Furthermore, we compare BP and forward-only algorithms under binarization, finding that PEPITA and FF are more vulnerable to binary activations. The code is available at https://github.com/whubaichuan/BinaryFO.
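To make the memory argument concrete, the sketch below illustrates generic weight-only binarization (sign quantization with a per-tensor scaling factor, in the style of XNOR-Net); it is an assumption-laden illustration of the general technique, not the paper's exact scheme, and the ~32× storage ratio it computes is the idealized bound behind the reported ~31× saving.

```python
import numpy as np

def binarize_weights(w: np.ndarray) -> tuple[np.ndarray, float]:
    """Generic weight-only binarization sketch (not the paper's exact
    method): map real weights to {-1, +1} and keep a single per-tensor
    scale alpha = mean(|w|) to preserve magnitude."""
    alpha = float(np.abs(w).mean())
    wb = np.sign(w)
    wb[wb == 0] = 1.0  # map exact zeros to +1 so values stay in {-1, +1}
    return wb, alpha

# Storage comparison: 32-bit floats vs. 1 bit per weight plus one
# fp32 scale. The ratio approaches 32x; overhead from scale factors
# and any non-binarized layers explains the gap to the reported ~31x.
w = np.random.randn(128, 256).astype(np.float32)
wb, alpha = binarize_weights(w)
dense_bits = w.size * 32
binary_bits = w.size * 1 + 32
print(dense_bits / binary_bits)
```

During inference or a forward pass, the effective weight is recovered as `alpha * wb`, so matrix products reduce to sign-based accumulations scaled once per tensor.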
Computer Sciences
