
Multi-view data arise naturally in many scenarios, including medical diagnosis, webpage classification, and multimedia analysis. A key challenge in learning from multi-view data is that not all instances are fully represented in every view, yielding missing-view data. In this paper, we focus on feature-level completion of missing views in multi-view data. To capture both the semantic complementarity and the identical distribution among different views, we propose an Isomorphic Linear Correlation Analysis (ILCA) method that linearly maps multi-view data into a feature-isomorphic subspace by learning a set of high-quality isomorphic features, thereby uncovering the information shared across views. Meanwhile, we assume that the missing view follows a normal distribution, so the missing-view data matrix can be modeled as a low-rank component plus a sparse contribution. To accomplish missing-view completion, we then propose an Identical Distribution Pursuit Completion (IDPC) model based on the learned features, which fully exploits the constraint that the missing view is identically distributed with the available one in the feature-isomorphic subspace. Comprehensive experiments on several multi-view datasets demonstrate that the proposed framework yields promising results.
