
handle: 11693/49889
Abstract: Analysis of kinship from facial images or videos is an important problem. Prior machine learning and computer vision studies approach kinship analysis as a verification or recognition task. In this paper, for the first time in the literature, we propose a kinship synthesis framework that generates smile and disgust expression videos of (probable) children from the corresponding expression videos of their parents. While the appearance of a child's expression is learned using a convolutional encoder–decoder network, another neural network models the dynamics of the corresponding expression. The expression video of the estimated child is synthesized by combining the appearance and dynamics models. To validate our results, we perform kinship verification experiments using videos of real parents and of estimated children generated by our framework. The results show that the generated videos of children achieve higher correct verification rates than those of real children. Our results also indicate that using generated videos together with real ones when training kinship verification models increases accuracy, suggesting that such videos can serve as a synthetic dataset. Furthermore, we evaluate the expression similarity between input and output frames and show that the proposed method retains the expression of the input faces reasonably well while transforming the facial identity.
Keywords: Facial action units, Facial dynamics, Temporal analysis, Kinship verification, Kinship synthesis
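The abstract describes a two-part design: a convolutional encoder–decoder that handles the appearance of the child's expression, and a separate neural network that models the temporal dynamics of that expression, with the two combined to synthesize the output video. The sketch below is not the authors' implementation; it is a minimal illustration, under assumed choices (layer sizes, a GRU as the dynamics model, and fusion in the encoder's feature space), of how such an appearance model and dynamics model could be composed frame by frame.

```python
# Minimal sketch of an appearance encoder-decoder combined with a recurrent
# dynamics model, as described at a high level in the abstract. All names,
# layer sizes, and the feature-space fusion are illustrative assumptions,
# not details taken from the paper.

import torch
import torch.nn as nn


class AppearanceEncoderDecoder(nn.Module):
    """Convolutional encoder-decoder: parent face frame -> child-like frame."""

    def __init__(self):
        super().__init__()
        self.encoder = nn.Sequential(
            nn.Conv2d(3, 32, 4, stride=2, padding=1),   # 64x64 -> 32x32
            nn.ReLU(),
            nn.Conv2d(32, 64, 4, stride=2, padding=1),  # 32x32 -> 16x16
            nn.ReLU(),
        )
        self.decoder = nn.Sequential(
            nn.ConvTranspose2d(64, 32, 4, stride=2, padding=1),  # 16 -> 32
            nn.ReLU(),
            nn.ConvTranspose2d(32, 3, 4, stride=2, padding=1),   # 32 -> 64
            nn.Sigmoid(),
        )

    def forward(self, x):
        return self.decoder(self.encoder(x))


class DynamicsModel(nn.Module):
    """Recurrent model over per-frame features, standing in for the
    expression-dynamics network mentioned in the abstract."""

    def __init__(self, feat_dim=64 * 16 * 16, hidden=256):
        super().__init__()
        self.rnn = nn.GRU(feat_dim, hidden, batch_first=True)
        self.to_feat = nn.Linear(hidden, feat_dim)

    def forward(self, frame_feats):            # (B, T, feat_dim)
        h, _ = self.rnn(frame_feats)
        return self.to_feat(h)                 # temporally smoothed features


def synthesize_child_video(parent_video, appearance, dynamics):
    """parent_video: (B, T, 3, 64, 64) -> child-like video of the same shape."""
    b, t, c, h, w = parent_video.shape
    frames = parent_video.reshape(b * t, c, h, w)
    feats = appearance.encoder(frames).reshape(b, t, -1)   # per-frame appearance
    feats = dynamics(feats).reshape(b * t, 64, 16, 16)     # add temporal dynamics
    out = appearance.decoder(feats)                        # decode child frames
    return out.reshape(b, t, c, h, w)


if __name__ == "__main__":
    clip = torch.rand(1, 8, 3, 64, 64)  # one 8-frame parent expression clip
    child = synthesize_child_video(clip, AppearanceEncoderDecoder(), DynamicsModel())
    print(child.shape)  # torch.Size([1, 8, 3, 64, 64])
```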
