Real-Time Macro Gesture Recognition using Efficient Empirical Feature
Extraction with Millimeter-Wave Technology
Abstract
Human-machine interaction based on air gestures is finding an increasing
number of applications in consumer electronics. The availability of
mmWave technology, combined with machine learning, allows the detection
and classification of gestures, avoiding high-resolution LIDAR or video
sensors. Nevertheless, in most existing studies, the processing
takes place offline, considers only the velocity and distance
of the moving arm, and handles only gestures performed very
close to the sensor device, which limits the range of possible
applications. Here, we use an experimental multi-channel mmWave-based
system that can detect small targets, such as a moving arm, up to a few
meters away from the sensor. Because our pipeline estimates and exploits
the angle of arrival in both azimuth and elevation, it can classify a
greater variety of dynamic gestures. Furthermore, the digital signal
processing chain presented here runs in real time,
incorporating an event detector. Whenever an event is detected, a novel
empirical feature extraction takes place and a Multi-Layer Perceptron is
deployed to infer the gesture type. To evaluate our setup and
signal processing pipeline, we recorded a dataset of ten subjects
performing nine gestures. Our method yielded 94.3% accuracy on the test
set, indicating a successful combination of the proposed sensor and
signal processing pipeline for real-time applications.