codymlewis committed 1923dfe (parent 1d53516): Create README.md

Files changed (1): README.md (+86 -0)

---
dataset_info:
  features:
  - name: features
    sequence: float32
    length: 561
  - name: labels
    dtype:
      class_label:
        names:
          '0': WALKING
          '1': WALKING_UPSTAIRS
          '2': WALKING_DOWNSTAIRS
          '3': SITTING
          '4': STANDING
          '5': LAYING
          '6': STAND_TO_SIT
          '7': SIT_TO_STAND
          '8': SIT_TO_LIE
          '9': LIE_TO_SIT
          '10': STAND_TO_LIE
          '11': LIE_TO_STAND
  - name: subject id
    dtype: uint8
  splits:
  - name: train
    num_bytes: 17499051
    num_examples: 7767
  - name: test
    num_bytes: 7123986
    num_examples: 3162
  download_size: 79596192
  dataset_size: 24623037
license: cc-by-4.0
language:
- en
pretty_name: HAR
size_categories:
- 10K<n<100K
---

# Dataset Card for HAR

A tabular dataset that poses the task of predicting human activity from smartphone sensor signals (accelerometer and gyroscope).
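
A minimal loading sketch with the 🤗 `datasets` library; the repository id below is an assumption based on this repo's owner, so substitute the actual id:

```python
from datasets import load_dataset

# "codymlewis/HAR" is a hypothetical repository id; replace it with the real one.
ds = load_dataset("codymlewis/HAR")

example = ds["train"][0]
print(len(example["features"]))  # 561 float32 values per window
print(example["labels"])         # integer class id, 0 (WALKING) .. 11 (LIE_TO_STAND)
print(example["subject id"])     # uint8 volunteer identifier
```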

## Dataset Details

### Dataset Description

*Summary from https://archive.ics.uci.edu/dataset/240/human+activity+recognition+using+smartphones:*
The experiments were carried out with a group of 30 volunteers within an age bracket of 19-48 years. They performed a protocol composed of six basic activities: three static postures (standing, sitting, lying) and three dynamic activities (walking, walking downstairs, and walking upstairs). The experiment also included the postural transitions that occurred between the static postures: stand-to-sit, sit-to-stand, sit-to-lie, lie-to-sit, stand-to-lie, and lie-to-stand. All participants wore a smartphone (Samsung Galaxy S II) on the waist during the experiment. We captured 3-axial linear acceleration and 3-axial angular velocity at a constant rate of 50 Hz using the device's embedded accelerometer and gyroscope. The experiments were video-recorded so that the data could be labelled manually. The obtained dataset was randomly partitioned into two sets: 70% of the volunteers were selected for generating the training data and 30% for the test data.

The sensor signals (accelerometer and gyroscope) were pre-processed by applying noise filters and then sampled in fixed-width sliding windows of 2.56 s with 50% overlap (128 readings/window). The sensor acceleration signal, which has gravitational and body-motion components, was separated into body acceleration and gravity using a Butterworth low-pass filter. The gravitational force is assumed to have only low-frequency components, so a filter with a 0.3 Hz cutoff frequency was used. From each window, a vector of 561 features was obtained by calculating variables from the time and frequency domains. See 'features_info.txt' for more details.
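
To make the gravity/body separation and windowing concrete, here is a rough sketch, not the authors' exact pipeline; the filter order and the use of `scipy.signal` are assumptions:

```python
import numpy as np
from scipy.signal import butter, filtfilt

FS = 50       # sampling rate in Hz
WINDOW = 128  # 2.56 s per window at 50 Hz
STEP = 64     # 50% overlap between consecutive windows

def separate_gravity(acc, cutoff=0.3, fs=FS, order=3):
    """Split total acceleration into gravity (low-pass) and body motion.

    The 0.3 Hz cutoff matches the description above; the filter order
    is an assumption for illustration.
    """
    b, a = butter(order, cutoff / (fs / 2), btype="low")
    gravity = filtfilt(b, a, acc, axis=0)
    return acc - gravity, gravity

def sliding_windows(signal, window=WINDOW, step=STEP):
    """Yield fixed-width windows with 50% overlap."""
    for start in range(0, len(signal) - window + 1, step):
        yield signal[start:start + window]

# Toy run on 10 s of synthetic 3-axial accelerometer data.
acc = np.random.randn(10 * FS, 3)
body, gravity = separate_gravity(acc)
windows = list(sliding_windows(body))
print(len(windows), windows[0].shape)  # 6 windows of shape (128, 3)
```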

This dataset is an updated version of the UCI Human Activity Recognition Using Smartphones dataset, which can be found at: https://archive.ics.uci.edu/ml/datasets/Human+Activity+Recognition+Using+Smartphones

This version provides the original raw inertial signals from the smartphone sensors, instead of the window-level pre-processed signals provided in version 1. This change was made to enable online tests with the raw data. Moreover, the activity labels were updated to include the postural transitions that were not part of the previous version of the dataset.
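
To make the prediction task concrete, a minimal baseline sketch over the 561 pre-computed features, reusing the hypothetical repository id from the loading example and assuming scikit-learn as the classifier library:

```python
import numpy as np
from datasets import load_dataset
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import accuracy_score

# Hypothetical repository id, as in the loading sketch above.
ds = load_dataset("codymlewis/HAR")

X_train = np.array(ds["train"]["features"])
y_train = np.array(ds["train"]["labels"])
X_test = np.array(ds["test"]["features"])
y_test = np.array(ds["test"]["labels"])

# A simple linear baseline; the released features are already scaled to [-1, 1].
clf = LogisticRegression(max_iter=1000).fit(X_train, y_train)
print("test accuracy:", accuracy_score(y_test, clf.predict(X_test)))
```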

- **Curated by:** Reyes-Ortiz, Jorge; Anguita, Davide; Ghio, Alessandro; Oneto, Luca; and Parra, Xavier
- **License:** This dataset is licensed under a [Creative Commons Attribution 4.0 International (CC BY 4.0)](https://creativecommons.org/licenses/by/4.0/legalcode) license.

### Dataset Sources

- **Repository:** http://archive.ics.uci.edu/dataset/341/smartphone+based+recognition+of+human+activities+and+postural+transitions
- **Paper:** https://www.sciencedirect.com/science/article/abs/pii/S0925231215010930
- **Experiment Demo:** http://www.youtube.com/watch?v=XOEN9W05_4A

## Citation

**BibTeX:**

    @misc{misc_smartphone-based_recognition_of_human_activities_and_postural_transitions_341,
      author       = {Reyes-Ortiz, Jorge and Anguita, Davide and Oneto, Luca and Parra, Xavier},
      title        = {{Smartphone-Based Recognition of Human Activities and Postural Transitions}},
      year         = {2015},
      howpublished = {UCI Machine Learning Repository},
      note         = {{DOI}: https://doi.org/10.24432/C54G7M}
    }

**APA:**

Reyes-Ortiz, Jorge, Anguita, Davide, Oneto, Luca, & Parra, Xavier. (2015). Smartphone-Based Recognition of Human Activities and Postural Transitions. UCI Machine Learning Repository. https://doi.org/10.24432/C54G7M