
Dexterous robots have emerged in the last decade in response to the need for fine-motor-control assistance in applications as diverse as surgery, undersea welding, and mechanical manipulation in space. Crucial to fine operation and the perception of contact with the environment are the tactile sensors mounted on the robotic fingertips, which can be used to distinguish material texture, roughness, spatial features, compliance, and friction. In this paper, we regard the tactile data under investigation as time sequences whose dissimilarity can be evaluated by the popular dynamic time warping (DTW) method. We therefore develop a kernel sparse coding method to address the tactile data representation and classification problem. However, the naive use of sparse coding neglects the intrinsic relation between the individual fingers, which contact the object simultaneously. To tackle this problem, we develop a joint kernel sparse coding model for the multifinger tactile sequence classification problem. In this model, the intrinsic relations between fingers are explicitly taken into account through joint sparse coding, which encourages all of the coding vectors to share the same sparsity support pattern. Experimental results show that joint sparse coding achieves better classification performance than conventional sparse coding.
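The two technical ingredients named above can be made concrete in a few lines of code. The sketch below is a minimal illustration, not the paper's implementation: `dtw_distance` is the standard dynamic-programming DTW, `dtw_kernel` maps the dissimilarity to a similarity via a Gaussian-style transform (one common choice; the abstract does not specify the paper's exact kernel), and `prox_l21` is the row-wise shrinkage step that joint sparse coders typically use to force all fingers' coding vectors onto a shared support. The function names, the `sigma` parameter, and the layout of the coefficient matrix are assumptions made for illustration.

```python
import numpy as np

def dtw_distance(x, y):
    """Standard dynamic-programming DTW between two 1-D tactile sequences."""
    n, m = len(x), len(y)
    D = np.full((n + 1, m + 1), np.inf)  # D[i, j]: best cost aligning x[:i] with y[:j]
    D[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            cost = abs(x[i - 1] - y[j - 1])
            D[i, j] = cost + min(D[i - 1, j],      # insertion
                                 D[i, j - 1],      # deletion
                                 D[i - 1, j - 1])  # match
    return D[n, m]

def dtw_kernel(x, y, sigma=1.0):
    # Assumption: a Gaussian-style map from DTW dissimilarity to similarity;
    # the exact kernel construction used in the paper is not given in the abstract.
    return np.exp(-dtw_distance(x, y) / sigma)

def prox_l21(A, tau):
    """Row-wise soft thresholding (proximal operator of the l2,1 mixed norm).

    Assumed layout: A[k, f] is the coefficient of dictionary atom k in the
    coding vector of finger f. Shrinking whole rows at once either keeps an
    atom for every finger or discards it for all of them.
    """
    row_norms = np.linalg.norm(A, axis=1, keepdims=True)
    scale = np.maximum(0.0, 1.0 - tau / np.maximum(row_norms, 1e-12))
    return A * scale
```

Zeroing entire rows of the coefficient matrix is exactly what "sharing the same sparsity support pattern" amounts to: each dictionary atom is selected jointly by all fingers or by none, which is how the model encodes the intrinsic relation between fingers that contact the object simultaneously.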
