The dataset consists of 14,444 RGB-D image pairs of open hands and 16,728 RGB-D image pairs of closed hands. We recorded nine different subjects (seven males and two females), who were instructed to move their arms with their palms facing the camera. RGB-D pairs are synchronized and captured at distances between 1 and 3 meters from the sensor. Subjects were also asked to record sequences both with sleeves rolled up and with sleeves rolled down, to avoid bias in the dataset. A sample of processed hands taken from our dataset is shown in the figure below.
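Since the RGB and depth frames come as synchronized pairs, a typical first step is matching each RGB image to its depth counterpart. A minimal sketch of such pairing is shown below; note that the `rgb_<id>.png` / `depth_<id>.png` naming scheme is an assumption for illustration only, not the dataset's actual file layout.

```python
# Hypothetical sketch: pair synchronized RGB and depth frames by a shared
# identifier in the filename. The naming scheme "rgb_<id>.png" /
# "depth_<id>.png" is an assumption, not the dataset's actual layout.
import os


def pair_rgbd(rgb_files, depth_files):
    """Match RGB and depth filenames that share the same identifier token."""
    def key(name):
        # "rgb_0001.png" -> "0001"
        stem = os.path.splitext(name)[0]
        return stem.split("_", 1)[1]

    depth_by_id = {key(f): f for f in depth_files}
    pairs = []
    for f in rgb_files:
        frame_id = key(f)
        if frame_id in depth_by_id:
            pairs.append((f, depth_by_id[frame_id]))
    return pairs


rgb = ["rgb_0001.png", "rgb_0002.png", "rgb_0003.png"]
depth = ["depth_0002.png", "depth_0001.png"]
print(pair_rgbd(rgb, depth))
# -> [('rgb_0001.png', 'depth_0001.png'), ('rgb_0002.png', 'depth_0002.png')]
```

Unmatched frames (here, `rgb_0003.png` has no depth counterpart) are simply dropped, which keeps only fully synchronized pairs.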
If you use the dataset, please cite our paper as follows:
@inproceedings{icpr12,
  author    = {Andrew D. Bagdanov and Alberto Del Bimbo and Lorenzo Seidenari and Lorenzo Usai},
  title     = {Real-time hand status recognition from RGB-D imagery},
  booktitle = {Proc. of International Conference on Pattern Recognition (ICPR)},
  address   = {Tsukuba, Japan},
  month     = {November},
  note      = {(Poster)},
  year      = {2012}
}