
Deep learning has been widely adopted in compressed sensing (CS) to achieve superior reconstruction quality, but existing networks are limited by black-box architectures and a lack of interpretability. In this paper, we propose a novel deep network-based CS framework obtained by unfolding the $\ell_{0}$-constrained convolutional sparse coding (CSC) problem. The proposed method incorporates deep neural networks (DNNs) into the alternating direction method of multipliers (ADMM), a state-of-the-art optimization paradigm. CS reconstruction benefits from the compact and shift-invariant sparse representation afforded by CSC together with the learnable parameters of DNNs. A convergence analysis shows that the proposed method is guaranteed to converge when the network parameters are shared across all building blocks. Experimental results demonstrate that the proposed method is competitive with or outperforms state-of-the-art DNN-based methods in CS of natural images and magnetic resonance imaging (MRI).
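As a rough illustration of the iteration being unfolded, the sketch below implements classical (un-learned) ADMM for an $\ell_0$-regularized least-squares problem, whose sparsity proximal step is hard thresholding. It uses a generic sensing matrix in place of the paper's convolutional dictionary, and the penalized (rather than constrained) $\ell_0$ form; all variable names and parameter values are illustrative assumptions, not the proposed network.

```python
import numpy as np

def hard_threshold(v, lam):
    # Proximal operator of lam*||.||_0: zero out entries below sqrt(2*lam).
    out = v.copy()
    out[np.abs(out) < np.sqrt(2.0 * lam)] = 0.0
    return out

def admm_l0_cs(y, Phi, lam=0.1, rho=1.0, n_iter=30):
    """Plain ADMM for min_x 0.5*||y - Phi x||^2 + lam*||x||_0.

    In a deep-unfolded version, n_iter becomes the number of network
    blocks, and lam, rho (plus learned filtering/denoising steps)
    become trainable per-block or shared parameters.
    """
    m, n = Phi.shape
    z = np.zeros(n)
    u = np.zeros(n)
    A = Phi.T @ Phi + rho * np.eye(n)   # precompute normal-equation matrix
    Aty = Phi.T @ y
    for _ in range(n_iter):
        x = np.linalg.solve(A, Aty + rho * (z - u))  # data-fidelity step
        z = hard_threshold(x + u, lam / rho)         # l0 proximal step
        u = u + x - z                                # dual (multiplier) update
    return z
```

Sharing the same learned parameters across all unfolded blocks, as the convergence analysis in the abstract assumes, corresponds to reusing one `(lam, rho)` pair (and one learned module) in every loop iteration above.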
