Various groups work on gesture recognition using IMUs and EMG. Comparing their results is often very hard due to the multitude of different tasks and gesture sets. In addition, session- and person-independent recognition performance is frequently not evaluated, although both are of special importance for practical and usable interfaces.
We publish the data corpus we collected during our experiments on gesture recognition using IMUs and EMG. It contains both IMU and EMG readings from 5 different subjects, each recorded in 5 different sessions.
If you want to use this data, please cite:
Recognizing Hand and Finger Gestures with IMU based Motion and EMG based Muscle Activity Sensing (Marcus Georgi, Christoph Amma, Tanja Schultz), In International Conference on Bio-inspired Systems and Signal Processing, 2015. (BIOSIGNALS 2015)
The complete data can be found here.
This is the csl-hdemg dataset containing high-density EMG recordings of finger motions. The dataset is described together with a baseline recognition system in the paper:
Advancing Muscle-Computer Interfaces with High-Density Electromyography (Christoph Amma, Thomas Krings, Jonas Böer, Tanja Schultz), In Proceedings of the 33rd Annual ACM Conference on Human Factors in Computing Systems, ACM, 2015.
Please cite this paper if you publish any work based on this dataset. The data is contained in a zip file over 2 GB in size. Within the archive, there is a readme.txt that describes how the data is structured and how to access it from Matlab or Python.
You can download the dataset here.