Multimodal Sensor Technology for Human Behavior Classification
Human Behavior Classification (HBC) has attracted considerable interest due to its ability to facilitate automation in application areas such as smart homes, active assisted living, and security. At present, optical modalities such as RGB, depth, and thermal imaging dominate the field, owing to the effectiveness of deep learning algorithms such as Convolutional Neural Networks (CNNs) and the abundance of publicly available image data. From a purely technical perspective, the preference for cameras is justified by the high information density of images, which eases the implementation of fundamental vision tasks. However, this characteristic also makes the success of camera-based HBC systems highly context-dependent. For many people, a camera-based HBC system at home is unacceptable regardless of the application, as the mere presence of a camera-like object can induce the feeling of "being watched"; cameras carry a negative connotation and are often perceived as intrusive or surveilling. As a result, the perceived usefulness of cameras in HBC systems is frequently outweighed by privacy concerns, leading to rejection. Recently, unconventional modalities such as radar, WiFi, seismic, and environmental sensors have been recognized as potential alternatives, as they enable contactless long-range sensing in spatially constrained environments while preserving visual privacy. In the Blindsight project, we develop new multimodal computer vision systems based on these emerging modalities and explore their potential in person-centric sensing applications.
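To illustrate how CNN-based classification carries over from images to such emerging modalities, the following is a minimal, hypothetical sketch (not the project's actual pipeline): a small PyTorch CNN that maps a window of WiFi CSI amplitudes (subcarriers × time) to activity logits. The class name, window dimensions (52 subcarriers × 256 packets), and number of classes are illustrative assumptions only.

```python
# Minimal illustrative sketch: CNN over a WiFi CSI amplitude window.
# All shapes and names are hypothetical assumptions, not the project's implementation.
import torch
import torch.nn as nn


class CSIActivityCNN(nn.Module):
    """Toy CNN mapping a CSI amplitude window (subcarriers x time) to activity logits."""

    def __init__(self, num_classes: int = 5):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(1, 16, kernel_size=3, padding=1), nn.ReLU(),
            nn.MaxPool2d(2),
            nn.Conv2d(16, 32, kernel_size=3, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1),          # pool to one value per channel
        )
        self.classifier = nn.Linear(32, num_classes)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, 1, subcarriers, time steps), e.g. 52 subcarriers x 256 packets
        return self.classifier(self.features(x).flatten(1))


# Example usage: a batch of 8 CSI windows with random values.
model = CSIActivityCNN(num_classes=5)
logits = model(torch.randn(8, 1, 52, 256))
print(logits.shape)  # torch.Size([8, 5])
```

The point of the sketch is that, once CSI amplitudes are arranged as a 2D subcarrier-by-time array, the same convolutional machinery used for images applies, which is why such modalities can substitute for cameras in HBC while avoiding visual data altogether.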
Project Partners
Cogvis Software und Consulting GmbH
Funding
This project is partly funded by the Vienna Business Agency (grant 4829418) and the Austrian security research program KIRAS of the Austrian Research Promotion Agency FFG (grant 49450173).
Publications
Strohmayer, J. and Kampel, M. (2025). "On the Generalization of WiFi-Based Person-Centric Sensing in Through-Wall Scenarios." In: Pattern Recognition. ICPR 2024. Lecture Notes in Computer Science, vol. 15315. Springer, Cham. doi: https://doi.org/10.1007/978-3-031-78354-8_13
Strohmayer, J., Sterzinger, R., Stippel, C. and Kampel, M. (2024). "Through-Wall Imaging Based On WiFi Channel State Information." 2024 IEEE International Conference on Image Processing (ICIP), Abu Dhabi, United Arab Emirates, pp. 4000-4006. doi: https://doi.org/10.1109/ICIP51287.2024.10647775
Strohmayer, J. and Kampel, M. (2024). "Directional Antenna Systems for Long-Range Through-Wall Human Activity Recognition." 2024 IEEE International Conference on Image Processing (ICIP), Abu Dhabi, United Arab Emirates, pp. 3594-3599. doi: https://doi.org/10.1109/ICIP51287.2024.10647666
Strohmayer, J., Lumetzberger, J., Heitzinger, T. and Kampel, M. (2024). "Person-Centric Sensing in Indoor Environments." In: Scanning Technologies for Autonomous Systems. Springer, Cham. doi: https://doi.org/10.1007/978-3-031-59531-8_11
Strohmayer, J. and Kampel, M. (2024). "Data Augmentation Techniques for Cross-Domain WiFi CSI-Based Human Activity Recognition." In: IFIP International Conference on Artificial Intelligence Applications and Innovations, pp. 42-56. Springer Nature Switzerland, Cham. doi: https://doi.org/10.1007/978-3-031-63211-2_4
Strohmayer, J. and Kampel, M. (2024). "WiFi CSI-based Long-Range Person Localization Using Directional Antennas." The Second Tiny Papers Track at ICLR 2024, May 2024, Vienna, Austria. https://openreview.net/forum?id=AOJFcEh5Eb
Strohmayer, J. and Kampel, M. (2023). "WiFi CSI-Based Long-Range Through-Wall Human Activity Recognition with the ESP32." In: Computer Vision Systems. ICVS 2023. Lecture Notes in Computer Science, vol. 14253. Springer, Cham. doi: https://doi.org/10.1007/978-3-031-44137-0_4
Strohmayer, J. and Kampel, M. (2023). "Blind Modalities for Human Activity Recognition." In: Assistive Technology: Shaping a Sustainable and Inclusive World, pp. 89-96. IOS Press. doi: https://doi.org/10.3233/SHTI230601