Gan Pei1 · Junhao Ning1 · Chenrui Niu1 · Siqiong Yao2# · Menghan Hu1# · Guangtao Zhai2
1East China Normal University 2Shanghai Jiao Tong University
For non-contact respiratory rate (RR) measurement, effectively suppressing the interference of continuous motion artifacts remains a significant challenge. Most existing research focuses on removing weak motion artifacts in a two-dimensional plane, and the fixed spatial scale of the scenes limits the generalization of these methods to real-world scenarios, especially real walking scenarios. To tackle this issue, we propose an RR measurement framework based on a multi-strategy fusion motion artifact suppression algorithm and construct a real-world walking dataset. Specifically, the framework consists of three core modules: an automatic ROI selection and adaptive enhancement module that guides the selection of high-quality corner points; a signal quality evaluation module that adaptively assesses whether the signal is noisy, preventing blind denoising; and a multi-strategy fusion motion artifact removal module that dynamically selects the appropriate strategy to suppress motion interference. To the best of our knowledge, this is the first study to investigate video-based RR measurement in real walking scenarios. Experimental results demonstrate that the method achieves state-of-the-art performance across multiple datasets, with a mean absolute error (MAE) of 1.04 breaths per minute (bpm) on COHFACE, 3.17 bpm on the OVRM-Walking dataset, and an average MAE of just 2.41 bpm on the in-house real-world walking dataset, which covers both indoor and outdoor scenarios. This study broadens the applicability of camera-based non-contact RR detection technology.
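The final step of any such pipeline, turning a 1-D respiration-related motion trace into an RR estimate, can be sketched as a band-limited spectral peak search. This is a generic illustration only, not the paper's implementation; the function name and the 0.1–0.7 Hz breathing band are assumed values.

```python
import numpy as np

def estimate_rr_fft(signal, fs, lo=0.1, hi=0.7):
    """Estimate respiratory rate (bpm) as the dominant spectral peak of a
    1-D respiration-related motion signal, restricted to a plausible
    breathing band (lo..hi Hz). Band limits here are illustrative."""
    sig = np.asarray(signal, dtype=float)
    sig = sig - sig.mean()                      # remove the DC offset
    spectrum = np.abs(np.fft.rfft(sig))
    freqs = np.fft.rfftfreq(len(sig), d=1.0 / fs)
    band = (freqs >= lo) & (freqs <= hi)        # keep the breathing band only
    peak_freq = freqs[band][np.argmax(spectrum[band])]
    return peak_freq * 60.0                     # Hz -> breaths per minute
```

For example, a clean 0.3 Hz breathing waveform sampled at 30 fps maps to 18 bpm.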
[1] An adaptive edge enhancement module that integrates RGB three-channel features is proposed, enabling high-quality corner point selection in distant, low-light, and low-resolution ROI regions.
[2] A time-domain feature-based waveform quality assessment module is proposed, enabling on-demand activation of motion artifact removal to prevent performance degradation caused by excessive denoising.
[3] A multi-strategy fusion-based adaptive noise removal method is proposed, which adaptively chooses among the SCR, ASS, and TF-FastICA denoising algorithms based on the signal's spectral characteristics, effectively removing motion artifacts. The proposed method outperforms state-of-the-art (SOTA) methods on the in-house Walking Breathing dataset, the OVRM-Walking dataset, and the COHFACE dataset.
[4] A real-world walking dataset has been constructed, consisting of 600 video samples collected from both indoor and outdoor natural lighting environments, filling the gap of missing real-world walking datasets.
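The waveform quality gate in [2] can be sketched with a simple time-domain heuristic: a clean respiratory waveform is quasi-periodic, so its autocorrelation shows a strong secondary peak at a plausible breathing period. The snippet below is an illustrative stand-in, not the paper's module; the 0.5 periodicity threshold and the 1.5–10 s lag range are assumptions.

```python
import numpy as np

def is_noisy(signal, fs, min_periodicity=0.5):
    """Crude time-domain quality gate: return True when the signal looks
    noisy and motion-artifact removal should run. The threshold and lag
    range are illustrative choices, not calibrated values."""
    sig = np.asarray(signal, dtype=float)
    sig = (sig - sig.mean()) / (sig.std() + 1e-8)
    ac = np.correlate(sig, sig, mode="full")[len(sig) - 1:]
    ac = ac / ac[0]                               # normalise so ac[0] == 1
    # search lags covering plausible breathing periods (~1.5 s .. 10 s)
    lo, hi = int(1.5 * fs), min(int(10 * fs), len(ac) - 1)
    if hi <= lo:
        return True                               # window too short to judge
    periodicity = ac[lo:hi].max()
    return bool(periodicity < min_periodicity)
```

Gating denoising on this kind of check avoids degrading signals that are already clean, which is the motivation stated in [2].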
The dataset includes two scenes, indoor and outdoor, with 300 samples for each scene. For dataset requests, please contact the author via email.
conda create -n Walking-Breath python=3.9
pip install numpy pandas scipy scikit-learn matplotlib tqdm opencv-python h5py
pip install torch torchvision

The dataset should be organized in the following structure. Taking the COHFACE dataset as an example, it is recommended to obtain COHFACE from official sources.
./COHFACE/
subject1/
1.avi
1.hdf5
2.avi
2.hdf5
...
subject2/
1.avi
1.hdf5
2.avi
2.hdf5
...
...

A checkpoint directory should be created under the lib folder to store the HRNet weights used in this project. The weights can be downloaded from Google Drive.
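Given the layout above, each video can be paired with its ground-truth file (`subjectN/M.avi` with `subjectN/M.hdf5`) by a small indexing helper. `index_cohface` is a hypothetical name introduced here for illustration; the pairing rule simply mirrors the tree shown.

```python
from pathlib import Path

def index_cohface(root):
    """Walk a COHFACE-style layout and pair each video with its
    ground-truth file. Returns a list of (video_path, label_path)
    tuples, skipping any video that lacks a matching .hdf5."""
    pairs = []
    for subject in sorted(Path(root).glob("subject*")):
        for video in sorted(subject.glob("*.avi")):
            label = video.with_suffix(".hdf5")
            if label.exists():
                pairs.append((video, label))
    return pairs
```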
If you find our research useful for your project, please consider citing our paper:
@article{pei2026video,
author={Pei, Gan and Ning, Junhao and Niu, Chenrui and Yao, Siqiong and Hu, Menghan and Zhai, Guangtao},
journal={IEEE Transactions on Circuits and Systems for Video Technology},
title={Video Respiratory Rate Measurement in Walking Scenarios Using Multi-strategy Adaptive Denoising},
year={2026},
volume={Early Access},
doi={10.1109/TCSVT.2026.3679396}}

Contact: 52295904023@stu.ecnu.edu.cn

