Multimodal Sensor Technology for Human Behavior Classification
Human Behavior Classification (HBC) has attracted considerable interest due to its ability to facilitate automation in application areas such as smart homes, active assisted living, and security. At present, optical modalities such as RGB, depth, and thermal imaging dominate the field, owing to the effectiveness of deep learning algorithms like Convolutional Neural Networks (CNNs) and the abundance of publicly available image data. From a purely technical perspective, the preference for cameras is justified: the high information density of images eases the implementation of fundamental vision tasks. However, this same characteristic makes the success of camera-based HBC systems highly context-dependent. For many people, a camera-based HBC system at home is unacceptable regardless of the application, as the mere presence of a camera-like object can induce the feeling of "being watched"; cameras carry a connotation of surveillance and are often perceived as intrusive. As a result, the perceived usefulness of camera-based HBC systems is frequently outweighed by privacy concerns, leading to rejection. Recently, unconventional modalities such as radar, WiFi, seismic, and environmental sensors have been recognized as potential alternatives due to their capacity for contactless long-range sensing in spatially constrained environments and their preservation of visual privacy. In the Blindsight project, we develop new multimodal computer vision systems based on these emerging modalities and explore their potential in person-centric sensing applications.
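To give a feel for the kind of data a radio-based modality produces, the sketch below shows how a window of raw WiFi Channel State Information (CSI), one complex value per subcarrier per received packet, can be reduced to amplitude features and matched against activity-class centroids. All concrete numbers and class names here (100 packets, 64 subcarriers, "walking"/"idle") are illustrative assumptions, not details of the cited system.

```python
import numpy as np

def csi_amplitude_features(csi_window: np.ndarray) -> np.ndarray:
    """Summarize a window of complex CSI samples (packets x subcarriers)
    into per-subcarrier amplitude statistics (mean and std)."""
    amplitude = np.abs(csi_window)  # |H| for each packet/subcarrier pair
    return np.concatenate([amplitude.mean(axis=0), amplitude.std(axis=0)])

def nearest_centroid(features: np.ndarray, centroids: dict) -> str:
    """Assign the feature vector to the closest class centroid."""
    labels = list(centroids)
    dists = [np.linalg.norm(features - centroids[l]) for l in labels]
    return labels[int(np.argmin(dists))]

# Synthetic example: 100 packets x 64 subcarriers of complex CSI.
rng = np.random.default_rng(0)
window = rng.normal(size=(100, 64)) + 1j * rng.normal(size=(100, 64))
feats = csi_amplitude_features(window)  # shape (128,)

# Hypothetical class centroids learned from labeled recordings.
centroids = {"walking": np.zeros(128), "idle": np.ones(128)}
label = nearest_centroid(feats, centroids)
```

In a deployed system, the centroid matcher would typically be replaced by a learned classifier (e.g., a CNN over CSI spectrograms), but the amplitude-feature step reflects how such signals preserve visual privacy: no image is ever formed.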
Strohmayer J. and Kampel M., "WiFi CSI-Based Long-Range Through-Wall Human Activity Recognition with the ESP32", accepted at the 14th International Conference on Computer Vision Systems (ICVS), September 2023, Vienna, Austria.