
When implementing a super-resolution (SR) model on an edge device, it is common to train the model in the cloud on pre-determined training images, because the edge device lacks both large-scale training data and the computation power needed for training. However, such frameworks may encounter a domain gap issue because the input images seen by these devices often have different characteristics than those used in training. Therefore, it is essential to continually update the model parameters through on-device learning, which accounts for the limited computation power of edge devices and makes use of on-site input images. In this paper, we present a fast and efficient on-device learning framework for an SR model that aims to overcome the challenges posed by restricted computation and the domain gap. Specifically, we propose an architecture for training the SR model in the quantized domain, which reduces the quantization errors that accumulate during training. Additionally, we propose cost-constrained gradient pruning and a meta-learning-based fast training scheme to improve restoration performance within a smaller number of iterations. Experimental results show that our approach maintains restoration performance on unseen inputs while running on the lightweight model produced by our quantization scheme.
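The cost-constrained gradient pruning mentioned above can be illustrated with a simple magnitude-based sketch. This is an assumption-laden illustration, not the paper's exact scheme: the function name, the `keep_ratio` budget parameter, and the top-k magnitude threshold rule are all hypothetical choices made here for clarity.

```python
import numpy as np

def prune_gradients(grads, keep_ratio=0.1):
    """Zero out all but the largest-magnitude fraction of gradient entries.

    keep_ratio stands in for a compute-cost budget (illustrative): only the
    surviving entries would need to be applied as weight updates on-device,
    so the per-iteration update cost shrinks with the budget.
    """
    flat = np.abs(grads).ravel()
    k = max(1, int(len(flat) * keep_ratio))
    # Threshold at the k-th largest magnitude
    thresh = np.partition(flat, -k)[-k]
    mask = np.abs(grads) >= thresh
    return grads * mask

g = np.array([[0.5, -0.01], [0.002, -2.0]])
pruned = prune_gradients(g, keep_ratio=0.5)
# With a 50% budget, only the two largest-magnitude entries (0.5 and -2.0) survive
```

In practice the budget would be chosen from the device's cost constraint rather than a fixed ratio, and the selection could be done per layer; the sketch only shows the core keep-the-largest-gradients idea.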
Keywords: neural network acceleration, on-device learning, meta-learning, gradient pruning, neural network quantization, neural network compression
