Multimodal Sensor Technology for Human Behavior Classification
Human Behavior Classification (HBC) has attracted considerable interest due to its ability to facilitate automation in application areas such as smart homes, active assisted living, and security. At present, optical modalities such as RGB, depth, and thermal imaging dominate the field, owing to the effectiveness of deep learning algorithms like Convolutional Neural Networks (CNNs) and the abundance of publicly available image data. From a purely technical perspective, the preference for cameras is justified by the high information density of images, which eases the implementation of fundamental vision tasks. However, this same characteristic makes the success of camera-based HBC systems highly context-dependent. Regardless of the application, many people reject the idea of a camera-based HBC system in their home: the mere presence of a camera-like object can induce the negative feeling of being watched, and cameras carry a connotation of intrusion and surveillance. As a result, the perceived usefulness of camera-based HBC systems is often outweighed by privacy concerns, leading to rejection.

Recently, unconventional modalities such as radar, WiFi, seismic, and environmental sensors have been recognized as potential alternatives, as they enable contactless long-range sensing in spatially constrained environments while preserving visual privacy. In the Blindsight project, we develop new multimodal computer vision systems based on these emerging modalities and explore their potential in person-centric sensing applications.
Project Partners
Cogvis Software und Consulting GmbH
Funding
This project is partly funded by the Vienna Business Agency (grant 4829418) and the Austrian security research program KIRAS of the Austrian Research Promotion Agency FFG (grant 879744).
Publications
Strohmayer, J., and Kampel, M. (2024). “WiFi CSI-based Long-Range Person Localization Using Directional Antennas”, Accepted in The Second Tiny Papers Track at ICLR 2024, May 2024, Vienna, Austria.
Strohmayer, J., Sterzinger, S., Stippel, C., and Kampel, M. (2024). “Through-Wall Imaging based on WiFi Channel State Information”, arXiv preprint, doi: https://doi.org/10.48550/arXiv.2401.17417
Strohmayer, J., and Kampel, M. (2024). “Directional Antenna Systems for Long-Range Through-Wall Human Activity Recognition”, arXiv preprint, doi: https://doi.org/10.48550/arXiv.2401.01388
Strohmayer, J., and Kampel, M. (2024). “Data Augmentation Techniques for Cross-Domain WiFi CSI-based Human Activity Recognition”, Accepted at The 20th International Conference on Artificial Intelligence Applications and Innovations (AIAI), June 2024, Corfu, Greece, arXiv preprint, doi: https://doi.org/10.48550/arXiv.2401.00964
Strohmayer, J., and Kampel, M. (2023). “WiFi CSI-Based Long-Range Through-Wall Human Activity Recognition with the ESP32”, in Computer Vision Systems, ICVS 2023, Lecture Notes in Computer Science, vol. 14253, Springer, Cham, doi: https://doi.org/10.1007/978-3-031-44137-0_4
Strohmayer, J., and Kampel, M. (2023). “Blind Modalities for Human Activity Recognition”, in Assistive Technology: Shaping a Sustainable and Inclusive World, pp. 89–96, IOS Press, August 2023, doi: https://doi.org/10.3233/SHTI230601