Learning the signatures of the human grasp
using a scalable tactile glove

Abstract

Humans can feel, weigh and grasp diverse objects, and simultaneously infer their material properties while applying the right amount of force – a challenging set of tasks for a modern robot. The mechanoreceptor networks that provide sensory feedback and enable the dexterity of the human grasp remain difficult to replicate in man-made robots. While computer-vision-based robot grasping strategies have progressed significantly with the abundance of visual data and emerging machine learning tools, there are as yet no equivalent sensing platforms or large-scale datasets for exploiting the tactile information that humans rely on when grasping objects. Importantly, the inability to record and analyse tactile signals currently limits our understanding of the role of tactile information in the human grasp itself – for example, how tactile maps are used to identify objects and infer their properties is unknown. Here we demonstrate – using a scalable tactile glove (STAG) and deep convolutional neural networks (CNNs) – that sensors uniformly distributed over the hand can be used to identify individual objects, estimate their weights and explore the typical tactile patterns that emerge while grasping objects. The sensor array (548 sensors) is assembled on a knitted glove and consists of a piezoresistive film connected by a network of conductive-thread electrodes that are passively probed. Using this low-cost (~$10) STAG sensor array, we record a large-scale tactile dataset with 135,000 frames, each covering the full hand, while interacting with 26 different objects. This collective set of interactions reveals the key correspondences between different regions of the hand during object manipulation. Insights from the tactile signatures of the human grasp – obtained through the lens of a man-made analogue of the natural mechanoreceptor network – can aid the future design of new prosthetics, robot grasping tools and human-robot interactions.
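
The abstract describes CNNs that identify objects from full-hand pressure maps. The code below is a minimal, illustrative PyTorch sketch of such a classifier – not the architecture used in the paper. The 32 × 32 input resolution is our assumption about how the 548-sensor readout is rasterized; only the 26-class output comes from the abstract.

import torch
import torch.nn as nn

# Illustrative tactile-frame classifier (NOT the paper's architecture).
# Input: pressure frames rasterized to a 32 x 32 grid (an assumption);
# output: logits over the 26 object classes named in the abstract.
class TactileCNN(nn.Module):
    def __init__(self, num_classes=26):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(1, 32, kernel_size=3, padding=1), nn.ReLU(),
            nn.MaxPool2d(2),  # 32x32 -> 16x16
            nn.Conv2d(32, 64, kernel_size=3, padding=1), nn.ReLU(),
            nn.MaxPool2d(2),  # 16x16 -> 8x8
        )
        self.classifier = nn.Linear(64 * 8 * 8, num_classes)

    def forward(self, x):
        # x: (batch, 1, 32, 32) normalized pressure frames
        return self.classifier(self.features(x).flatten(start_dim=1))

model = TactileCNN()
logits = model(torch.randn(4, 1, 32, 32))  # dummy batch of 4 frames
print(logits.shape)  # torch.Size([4, 26])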

Paper

Learning the signatures of the human grasp using a scalable tactile glove
Subramanian Sundaram, Petr Kellnhofer, Yunzhu Li, Jun-Yan Zhu, Antonio Torralba, Wojciech Matusik
Nature, 569 (7758), 2019
@article{SSundaram:2019:STAG,
	author    = {Sundaram, Subramanian and Kellnhofer, Petr and Li, Yunzhu and Zhu, Jun-Yan and Torralba, Antonio and Matusik, Wojciech},
	title     = {Learning the signatures of the human grasp using a scalable tactile glove},
	journal   = {Nature},
	volume    = {569},
	number    = {7758},
	year      = {2019},
	publisher = {Nature Publishing Group},
	doi       = {10.1038/s41586-019-1234-z}
}

Videos

A video showing the flexibility of the STAG, including a demonstration of folding a paper plane while wearing the glove.
Auxetic version of the sensor array with 10 × 10 elements (3× speed).
Interaction from the STAG dataset – Mug (3× speed).
Interaction from the STAG dataset – Cat [stone] (3× speed).
Interaction from the STAG dataset – Safety glasses (3× speed).
Example sequence from the dataset used for weight estimation – Multimeter (3× speed).

Design and Dataset files

All data and code are freely available under a non-commercial-use license. Contact us with inquiries about commercial use.

The schematic and Gerber files for the printed circuit board used to read out data from our sensor array are available for download.
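
As background for readers examining the board design: the abstract describes a piezoresistive film probed passively through a grid of conductive-thread electrodes. The following is a purely schematic Python sketch of the row/column scan such a readout might perform; select_row and read_column_adc are hypothetical placeholders, not functions from the released design files, and the 32 × 32 grid is an assumption about how the 548 sensors are addressed.

# Hypothetical scanning loop for a passively probed resistive sensor grid.
# select_row() and read_column_adc() are placeholder names standing in for
# real driver calls; the 32 x 32 grid is an assumption.
ROWS, COLS = 32, 32

def select_row(r):
    """Drive row r; leave all other rows high-impedance (placeholder)."""

def read_column_adc(c):
    """Sample the ADC on column c and return raw counts (placeholder)."""
    return 0

def scan_frame():
    # One full pressure frame: energize each row in turn, read every column.
    frame = []
    for r in range(ROWS):
        select_row(r)
        frame.append([read_column_adc(c) for c in range(COLS)])
    return frame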


The Scalable Tactile Glove (STAG) datasets are used in our paper for object classification, weight estimation and hand-pose discrimination. These datasets are needed to run our code and reproduce the results in our paper. Note that the pressure data alone (i.e., without images) is sufficient for running our code.
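
As a generic illustration of preparing such data for the models, the sketch below normalizes a stack of pressure frames and converts it to a PyTorch tensor. The file name pressure_frames.npy and the (N, 32, 32) array layout are hypothetical stand-ins, not the released datasets' actual schema; consult the dataset documentation for the real format.

import numpy as np
import torch

# Hypothetical preprocessing of STAG-style pressure frames. The file name
# and the (N, 32, 32) layout are illustrative assumptions only.
raw = np.load("pressure_frames.npy").astype(np.float32)

# Rescale raw sensor counts to [0, 1]; real data may instead need
# per-sensor calibration or subtraction of the resting-pressure baseline.
lo, hi = raw.min(), raw.max()
frames = (raw - lo) / max(hi - lo, 1e-6)

tensor = torch.from_numpy(frames).unsqueeze(1)  # (N, 1, 32, 32)
print(tensor.shape)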

Source code

The source code for our machine-learning-based object classification and weight estimation methods is available on GitHub. The algorithms are implemented in Python and require the PyTorch machine learning framework. Note that, in addition to the code, the datasets above are required.
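
For orientation, weight estimation differs from classification mainly in the output head and the loss. The sketch below is a hedged illustration of that difference – not the method implemented in the repository: a small convolutional trunk whose head regresses a single scalar weight trained with a mean-squared-error loss.

import torch
import torch.nn as nn

# Illustrative weight-estimation network (NOT the repository's method):
# a classifier-style convolutional trunk, but the head regresses one
# scalar weight and is trained with an MSE loss.
model = nn.Sequential(
    nn.Conv2d(1, 32, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
    nn.Conv2d(32, 64, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
    nn.Flatten(),
    nn.Linear(64 * 8 * 8, 1),  # one scalar: predicted weight
)

frames = torch.randn(8, 1, 32, 32)  # dummy batch of pressure frames
targets = torch.rand(8, 1)          # dummy weight labels
loss = nn.functional.mse_loss(model(frames), targets)
loss.backward()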

Contact

Please contact the corresponding author, Subramanian Sundaram, with inquiries about the work, or e-mail us at info@humangrasp.io with any other questions.




© 2019 The Authors. The authors' version of the work is posted here for your personal use. Not for redistribution.

File icons made by Smashicons from www.flaticon.com are licensed under CC BY 3.0.