Update README.md
README.md CHANGED
@@ -3,24 +3,26 @@ annotations_creators: []
 language: en
 size_categories:
 - n<1K
-task_categories: []
 task_ids: []
 pretty_name: hand_keypoints
 tags:
 - fiftyone
 - image
-dataset_summary: '
+- keypoints
+- pose-estimation
+dataset_summary: >
 
 
 
 
-  This is a [FiftyOne](https://github.com/voxel51/fiftyone) dataset with 846
+  This is a [FiftyOne](https://github.com/voxel51/fiftyone) dataset with 846
+  samples.
 
 
   ## Installation
 
 
-  If you haven''t already, install FiftyOne:
+  If you haven't already, install FiftyOne:
 
 
   ```bash
@@ -42,7 +44,7 @@ dataset_summary: '
 
   # Load the dataset
 
-  # Note: other available arguments include ''max_samples'', etc
+  # Note: other available arguments include 'max_samples', etc
 
   dataset = load_from_hub("voxel51/hand-keypoints")
 
@@ -52,8 +54,6 @@ dataset_summary: '
   session = fo.launch_app(dataset)
 
   ```
-
-  '
 ---
 
 # Dataset Card for Image Hand Keypoint Detection
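For context on the snippet this hunk edits, here is a minimal sketch of loading a slice of the dataset with the `max_samples` argument that the added comment mentions; `load_from_hub` is FiftyOne's Hugging Face utility, and the value 100 is arbitrary:

```python
import fiftyone as fo
from fiftyone.utils.huggingface import load_from_hub

# Load only the first 100 samples for a quick look; `max_samples` is the
# argument referenced by the comment added in this diff
dataset = load_from_hub("voxel51/hand-keypoints", max_samples=100)

# Browse the samples in the FiftyOne App
session = fo.launch_app(dataset)
```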
@@ -94,14 +94,15 @@ As part of their research, the authors created a dataset by manually annotating
 ### Dataset Description
 
 The dataset created in this research is a collection of manually annotated RGB images of hands sourced from the MPII Human Pose dataset and the New Zealand Sign Language (NZSL) Exercises. It contains 2D locations for 21 keypoints on 2800 hands, split into a training set of 2000 hands and a testing set of 800 hands. This dataset was used to train and evaluate their hand keypoint detection methods for single images.
-- **Curated by:** Tomas Simon, Hanbyul Joo, Iain Matthews, Yaser Sheikh
-- **Funded by:** Carnegie Mellon University
-- **Shared by:** [Harpreet Sahota](https://huggingface.co/harpreetsahota), Hacker-in-Residence at Voxel51
-- **License:** [More Information Needed]
 
-### Dataset Sources
 - **Paper:** https://arxiv.org/abs/1704.07809
 - **Demo:** http://domedb.perception.cs.cmu.edu/handdb.html
+- **Curated by:** Tomas Simon, Hanbyul Joo, Iain Matthews, Yaser Sheikh
+- **Funded by:** Carnegie Mellon University
+- **Shared by:** [Harpreet Sahota](https://huggingface.co/harpreetsahota), Hacker-in-Residence at Voxel51
+- **License:** This dataset contains images from:
+  1. the MPII Human Pose dataset, which is under the [BSD License](https://github.com/YuliangXiu/MobilePose/blob/master/pose_dataset/mpii/mpii_human_pose_v1_u12_2/bsd.txt)
+  2. the New Zealand Sign Language Dictionary dataset, which is under the [CC BY-NC-SA 3.0 License](https://creativecommons.org/licenses/by-nc-sa/3.0/)
 
 ## Uses
 
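Since the card describes 21 keypoints per hand, defining a keypoint skeleton makes the annotations render as connected joints in the App. Below is a hedged sketch: the wrist-plus-four-joints-per-finger layout and the keypoint ordering are assumptions (an OpenPose-style convention), not something this card confirms, so verify against the data first:

```python
import fiftyone as fo
from fiftyone.utils.huggingface import load_from_hub

dataset = load_from_hub("voxel51/hand-keypoints")

# Hypothetical 21-point hand layout: wrist + 4 joints per finger.
# The actual keypoint ordering in this dataset is an assumption here.
labels = ["wrist"] + [
    f"{finger}_{i}"
    for finger in ("thumb", "index", "middle", "ring", "pinky")
    for i in range(1, 5)
]

# Connect the wrist to each finger base, then chain each finger's joints
edges = []
for f in range(5):
    base = 1 + 4 * f
    edges += [[0, base], [base, base + 1], [base + 1, base + 2], [base + 2, base + 3]]

dataset.skeletons = {"left_hand": fo.KeypointSkeleton(labels=labels, edges=edges)}
```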
@@ -128,8 +129,13 @@ Sample fields:
 body: fiftyone.core.fields.EmbeddedDocumentField(fiftyone.core.labels.Keypoints)
 left_hand: fiftyone.core.fields.EmbeddedDocumentField(fiftyone.core.labels.Keypoints)
 ```
+### Dataset Sources
+
+#### Source Data
 
-
+◦ [**MPII Human Pose dataset:**](https://github.com/YuliangXiu/MobilePose/blob/master/pose_dataset/mpii/mpii_human_pose_v1_u12_2/README.md) Contains images from YouTube videos depicting a wide range of everyday human activities. These images vary in quality, resolution, and hand appearance, and include various types of occlusions and hand-object/hand-hand interactions.
+
+◦ [**New Zealand Sign Language (NZSL) Exercises:**](https://github.com/Bluebie/NZSL-Dictionary) Features images of people making visible hand gestures for communication. This subset provides a variety of hand poses commonly found in conversational contexts.
 
 #### Data Collection and Processing
 
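To make the `Keypoints` fields above concrete, here is a minimal sketch of reading the annotations back; in FiftyOne, each `Keypoint` stores its points as `(x, y)` pairs in relative `[0, 1]` coordinates:

```python
from fiftyone.utils.huggingface import load_from_hub

dataset = load_from_hub("voxel51/hand-keypoints")
sample = dataset.first()

# `left_hand` is a fo.Keypoints label whose `keypoints` attribute holds
# individual fo.Keypoint objects with normalized (x, y) points
left_hand = sample["left_hand"]
if left_hand is not None:
    for kp in left_hand.keypoints:
        print(kp.label, kp.points)
```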
@@ -139,11 +145,6 @@ The dataset is composed of manually annotated RGB images of hands sourced from t
 
 • **Splits:** The combined dataset of 2800 annotated hands was divided into a training set of 2000 hands and a testing set of 800 hands. The criteria for this split are not explicitly detailed in the provided excerpts.
 
-Source Datasets:
-
-◦ **MPII Human Pose dataset:** Contains images from YouTube videos depicting a wide range of everyday human activities. These images vary in quality, resolution, and hand appearance, and include various types of occlusions and hand-object/hand-hand interactions.
-◦ **New Zealand Sign Language (NZSL) Exercises:** Features images of people making visible hand gestures for communication. This subset provides a variety of hand poses commonly found in conversational contexts
-
 ### Annotations
 
 #### Annotation process
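The card notes a 2000/800 train/test split but not how (or whether) it is encoded in this Hub export. If the split ships as sample tags, which is an assumption to check first, per-split views could be built like this:

```python
from fiftyone.utils.huggingface import load_from_hub

dataset = load_from_hub("voxel51/hand-keypoints")

# Check which tags actually exist before relying on them
print(dataset.distinct("tags"))

# Assuming "train"/"test" tags are present (not confirmed by the card)
train_view = dataset.match_tags("train")
test_view = dataset.match_tags("test")
print(len(train_view), len(test_view))
```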