
pmid: 22055747
SENSitivity Encoding (SENSE) is a mathematically optimal parallel magnetic resonance imaging (MRI) technique when the coil sensitivities are known. Recently, compressed sensing (CS)-based techniques have been incorporated into the SENSE reconstruction framework to recover the underlying MR image. CS-based techniques exploit the fact that MR images are sparse in a transform domain (e.g., wavelets); mathematically, this leads to an l1-norm-regularized SENSE reconstruction. In this work, we show that instead of reconstructing the image by exploiting its transform-domain sparsity, we can exploit its rank deficiency. This leads to a nuclear-norm-regularized SENSE problem. The reconstruction accuracy of our proposed method is the same as that of l1-norm-regularized SENSE, but our method is about an order of magnitude faster.
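Nuclear-norm-regularized problems of this kind are commonly solved with a singular value soft-thresholding step, the proximal operator of the nuclear norm. The sketch below is illustrative only, not the authors' implementation: it shows the thresholding operation on a small rank-deficient matrix, which is the core update that makes a low-rank prior computationally cheap.

```python
import numpy as np

def svt(X, tau):
    """Singular value thresholding: the proximal operator of the
    nuclear norm. Each singular value of X is shrunk by tau and
    clipped at zero, so small singular values are removed entirely."""
    U, s, Vt = np.linalg.svd(X, full_matrices=False)
    s_shrunk = np.maximum(s - tau, 0.0)
    return (U * s_shrunk) @ Vt

# A rank-1 test matrix: the outer product of two vectors.
rng = np.random.default_rng(0)
X = np.outer(rng.standard_normal(8), rng.standard_normal(8))

# Thresholding preserves the (already low) rank and shrinks the
# leading singular value by tau.
Y = svt(X, 0.1)
```

In an iterative reconstruction scheme, a step like this would alternate with a data-consistency update against the SENSE encoding operator; the speed advantage reported in the abstract comes from the low per-iteration cost of such updates.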
Data Interpretation, Statistical, Image Interpretation, Computer-Assisted, Brain, Humans, Reproducibility of Results, Image Enhancement, Magnetic Resonance Imaging, Sensitivity and Specificity, Algorithms
| Indicator | Description | Value |
|---|---|---|
| selected citations | Citations derived from selected sources; an alternative to the "influence" indicator. | 13 |
| popularity | Current impact/attention (the "hype") of the article in the research community, based on the underlying citation network. | Average |
| influence | Overall/total impact of the article in the research community, based on the underlying citation network (diachronically). | Top 10% |
| impulse | Initial momentum of the article directly after its publication, based on the underlying citation network. | Top 10% |
