[robotics-worldwide] [news] Yale Human Grasping Dataset

Aaron Dollar aaron.dollar at yale.edu
Tue Dec 2 13:07:47 PST 2014


We are happy to announce the public availability of a large dataset
consisting of tagged video and image data of human grasping movements in
unstructured environments. Wide-angle head-mounted camera video was recorded
from two housekeepers and two machinists during their regular work
activities, and the grasp types, objects, and tasks were analyzed and coded
by study staff. The full dataset contains 27.7 hours of tagged video and
represents a wide range of manipulative behaviors spanning much of typical
human hand usage. We provide the original videos, a spreadsheet containing
the tagged grasp type, object, and task parameters, timing information for
each successive grasp, and video screenshots for each instance. Example code
is provided for MATLAB and R demonstrating how to load the dataset and
produce simple plots.
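
For orientation, below is a minimal sketch in R of the kind of loading and
plotting the example code demonstrates. The CSV file name and column name
used here are placeholders for illustration, not the dataset's actual file
layout:

    # Minimal sketch: read the tagging spreadsheet (exported to CSV) and
    # plot how often each grasp type occurs.
    # NOTE: "human_grasping_tags.csv" and the "GraspType" column are assumed
    # names, not the dataset's actual schema.
    grasps <- read.csv("human_grasping_tags.csv", stringsAsFactors = FALSE)
    counts <- sort(table(grasps$GraspType), decreasing = TRUE)
    barplot(counts, las = 2, cex.names = 0.7,
            main = "Grasp type frequency",
            ylab = "Number of grasp instances")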


The complete dataset is provided here:


http://www.eng.yale.edu/grablab/humangrasping/


----------------------------------------------

Aaron M. Dollar

John J. Lee Associate Professor of Mechanical Engineering and Materials
Science

Yale University

office: (203) 436-9122

aaron.dollar at yale.edu

http://www.eng.yale.edu/grablab/
