Biomedical Signal Processing and Control 29 (2016) 1–10
Remote respiratory monitoring system based on developing motion magnification technique

Ali Al-Naji a,∗, Javaan Chahl b

a Electrical & Information Engineering, School of Engineering, University of South Australia, SA 5095, Australia
b Joint and Operations Analysis Division, Defence Science and Technology Group, SA 5095, Australia
Article history: Received 31 July 2015; Received in revised form 19 March 2016; Accepted 7 May 2016

Keywords: Video processing; Motion magnification; Wavelet pyramid decomposition; Motion detection; Frame subtraction
Abstract

The aim of this study is to detect and measure the rate and timing parameters of the respiratory cycle at a distance, from different sleeping positions of a baby, based on video imagery. The study relies on amplifying the motion of the chest caused by inhalation and exhalation. A motion magnification technique based on a wavelet decomposition and an elliptic filter was used to magnify breathing movement that is difficult to see with the naked eye. A novel measuring method based on motion detection was used to measure the respiratory rate and its timing parameters by detecting the fastest moving areas in the magnified video frame sequences; the video frames were converted into a corresponding logical matrix. Experimental results on several videos of a baby in different sleeping positions show that the remote respiratory monitoring system has an accuracy of 99%. The proposed system has very low computational complexity, is feasible and safe, making it suitable for the design of next-generation non-contact vital sign monitoring systems.

© 2016 Elsevier Ltd. All rights reserved.
1. Introduction

Remote, non-invasive detection of vital signs has attracted increasing research interest in clinical and biomedical applications. Respiratory rate is one of the most important vital parameters of interest in clinical diagnostic and monitoring systems. Generally, respiration measurement is achieved using contact methods, including nasal thermocouples, respiratory-effort belt transducers, piezoelectric transducers, oximetry probes and electrocardiography (ECG). However, all of these methods are inconvenient and constrain the patient [1–3]. Therefore, researchers have developed a variety of non-contact methods to extract respiratory rate, including approaches based on radar sensors [4–10]. However, using Doppler radar to measure vital signs requires specialized hardware to generate radar frequencies and receive a reliable return signal. Current systems suffer significant decreases in signal-to-noise ratio (SNR) at distances greater than 1 m between the radar and the patient, due to increased free-space loss at the frequencies employed [4]. In addition, the radar antenna must face the chest wall, and any movement of the body during sampling will corrupt the measurements [4–10]. Furthermore, focused radar energy may have harmful side
effects on biological tissue [11]. Some studies [2,12,13] analysed image sequences captured by a video camera to detect the optical flow of body-surface movements on a bed resulting from respiration, as a non-contact method of measuring respiratory rate. Because these studies relied on optical flow calculations, the results were affected by motion artefacts, ambient light and computational complexity. Thermal cameras have been used in several studies [1,14–19] to measure respiratory rate based on the skin temperature differences associated with inspiration and expiration. Although non-contact systems based on thermal cameras have succeeded as a way to monitor respiratory rate, their measurements are also corrupted by body movements, head rotation and, particularly, any apparatus that covers the face; these systems are unable to detect respiratory rate when the nasal region is not visible. Recently, several researchers have utilized photoplethysmography imaging (PPGI) signals [20–23] to measure variations in skin blood volume resulting from respiratory rhythms. Though PPGI is attractive in principle, previous studies were affected by illumination conditions, skin colour and distance [21], which cause background noise to fall within the frequency band of interest. These methods cannot reveal physiological signs in unclear regions of interest (ROIs); therefore, PPGI cannot work on a baby that moves through many poses. The aim of this study is to develop a vision-based remote respiratory monitoring system that is able to
Fig. 1. Proposed system diagram of the remote respiratory monitoring system.
measure respiratory parameters at different baby positions, while being safe, cost effective, reliable and easy to use. This study uses a video camera as a non-contact sensor to measure respiratory parameters by amplifying and monitoring chest or blanket movement, based on two processing systems. The first processing system is the developed video magnification technique: the video magnification technique of [24] is extended by using a wavelet pyramid decomposition instead of a Laplacian pyramid decomposition and an elliptic band-pass filter instead of a Butterworth band-pass filter. The developed motion magnification technique was used to magnify respiratory chest movements of the baby at different positions, and it achieves higher peak signal-to-noise ratio (PSNR) values than the Eulerian video magnification used in [24]. The second processing system performs motion detection based on frame subtraction. This algorithm was applied to the magnified video to monitor respiratory parameters by detecting white areas in the frame sequences that correspond to respiratory motion. A new measuring method was then used to convert the video frames into a logical matrix and calculate the respiratory rate from the average distance between ones in the logical matrix. The paper is organised as follows. Section 2 presents the methodology of the proposed system. Section 3 presents the experimental setup and proposed flowchart. Results obtained in this work are given in Section 4. Finally, the discussion and conclusions are given in Sections 5 and 6 respectively.
2. Methodology

There are two main processing systems in the remote respiratory monitoring system. The first is the developed motion magnification system. The second is the respiratory rate measuring system. A full system diagram for this study is shown in Fig. 1.

2.1. Developed motion magnification system

We enhanced the video magnification system to suit the proposed application, using wavelet pyramid decomposition and an elliptic band-pass filter to improve noise removal and video quality. Wavelet pyramid decomposition was first applied to decompose the input video into different spatial frequency bands and to observe the differences in video quality and performance. Both spatial and temporal processing were implemented to observe small motions in the input video. We used one colour channel, converting RGB frames to YIQ colour space because it allows straightforward magnification of the image intensity function. The time series of pixel values at all spatial levels of the pyramid were converted to the frequency domain through the FFT for fast computation. The spatial bands obtained by decomposition were then passed through temporal elliptic band-pass filters with selected frequencies of 0.4–0.8 Hz, corresponding to 24–48 breaths/min, to extract the frequency band of interest and attenuate noise frequencies, thus increasing the SNR. The bands extracted by temporal processing were then magnified by multiplying them by an amplification factor (α), and these magnified signals were added back to the original input video signals.

To explain the relationship between motion magnification and temporal processing, and how the video magnification technique operates, let I(x, t) denote the image intensity at position x and time t. After a translational motion, the image intensity function can be written as:

I(x, t) = f(x + δ(t))    (1)

where δ(t) is a motion (displacement) function and I(x, 0) = f(x) is the initial image function without any motion. Based on a first-order Taylor series expansion [25,26], the image intensity function can be approximated at time t as

I(x, t) ≈ f(x) + δ(t) ∂f(x)/∂x    (2)

The Taylor series expansion can also be applied to a 2D image. Assume that the intensity variation function B(x, t) is the result of applying a temporal band-pass filter to I(x, t). With the motion δ(t) falling within the passband of the filter, B(x, t) can be expressed as:

B(x, t) = δ(t) ∂f(x)/∂x    (3)

To get the magnified intensity function Î(x, t), the function B(x, t), amplified by the magnification factor (α), is added back to the original intensity function I(x, t) as follows:

Î(x, t) ≈ I(x, t) + αB(x, t)    (4)
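Eqs. (1)–(4) can be illustrated on a single pixel's time series. The sketch below is an illustrative Python re-implementation, not the authors' MATLAB code; the 40 dB stopband attenuation, 0.5 dB passband ripple, the synthetic signal, and the function name `magnify_motion` are assumptions for the example.

```python
import numpy as np
from scipy.signal import ellip, sosfiltfilt

def magnify_motion(pixel_series, fs=30.0, alpha=20.0, band=(0.4, 0.8)):
    """Temporal band-pass + amplification for one pixel's time series.

    Sketch of Eq. (4): I_hat = I + alpha * B, where B is the band-passed
    intensity variation. A 5th-order elliptic filter with 0.5 dB passband
    ripple and 40 dB stopband attenuation is assumed here.
    """
    sos = ellip(5, 0.5, 40.0, band, btype="bandpass", fs=fs, output="sos")
    b = sosfiltfilt(sos, pixel_series)  # B(x, t): in-band intensity variation
    return pixel_series + alpha * b     # I_hat(x, t)

# Synthetic pixel: constant baseline + small 0.6 Hz "breathing" component
# + a faster out-of-band disturbance that the filter should reject.
fs = 30.0
t = np.arange(0, 15, 1 / fs)
pixel = (0.5
         + 0.01 * np.sin(2 * np.pi * 0.6 * t)
         + 0.005 * np.sin(2 * np.pi * 5.0 * t))
magnified = magnify_motion(pixel, fs=fs, alpha=20.0)
```

The in-band 0.6 Hz component is amplified roughly (1 + α)-fold, while the baseline and the 5 Hz disturbance pass through almost unchanged, which is the behaviour Eq. (6) predicts.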
Table 1. Result comparison of the developed magnification based method and the Eulerian magnification based method (respiratory rate, breaths/min).

Position | Ground truth | Developed method | Error % | Eulerian method | Error %
P1       | 25           | 25.13            | 0.52    | 26.37           | 5.48
P2       | 25           | 24.69            | 1.24    | 23.58           | 5.68
P3       | 23           | 23.71            | 3.08    | 24.85           | 8.04
P4       | 25           | 25.75            | 3.00    | 26.91           | 7.64
P5       | 23           | 23.43            | 1.87    | 25.61           | 11.35
P6       | 26           | 26.10            | 0.38    | 27.22           | 4.69
P7       | 25           | 25.23            | 0.92    | 26.27           | 5.08
P8       | 23           | 23.12            | 0.52    | 24.76           | 7.13
P9       | 22           | 21.80            | 0.90    | 20.60           | 5.45
P10      | 24           | 24.17            | 0.71    | 25.50           | 5.54
MAE      |              |                  | 1.314%  |                 | 6.608%
After substituting Eqs. (2)–(4), the magnified intensity function Î(x, t) can be expressed as:

Î(x, t) ≈ f(x) + (1 + α) δ(t) ∂f(x)/∂x    (5)

Finally, assuming the first-order Taylor expansion of the temporally band-pass filtered image intensity is a good approximation of Î(x, t), the magnified motion will be

Î(x, t) = f(x + (1 + α) δ(t))    (6)

Clearly, this shows that the motion function δ(t) of the initial image function f(x) has been amplified by a factor of (1 + α).

To measure the noise level and observe the respiratory signal, the PSNR values of the magnified video frames are calculated and compared with those of the input video frames using the following equations [27,28]:

PSNR = 10 log10 (255² / MSE(Iᵢ, Î))    (7)

where Iᵢ and Î are the input and magnified video frames respectively. The mean square error (MSE) can be defined as:

MSE = (1/MN) Σ_{m=1}^{M} Σ_{n=1}^{N} [Iᵢ − Î]²    (8)

where [Iᵢ − Î]² is the squared error between the input video frames and the magnified video frames. The respiratory signal and the time parameters of the respiratory cycle can be observed from the PSNR values of the magnified video frames (see Fig. 5).

2.2. Respiratory rate measuring system

The respiratory rate measuring system is a processing system used to measure respiratory parameters from the magnified video using a new measuring method based on motion detection. Motion detection based on frame subtraction was used in this study to measure the respiratory rate at different baby poses. The system comprises several processing steps. First, frame subtraction was applied to the magnified video to detect motion of the chest. Second, image processing methods were applied to the video frames, including local contrast enhancement, frame binarization, morphological filtering and ROI mask selection. Third, the white (above-threshold) areas in the ROI caused by the respiratory rhythm were detected in all frames. A new measuring method was then used to convert the video frames into a logical matrix and calculate the respiratory rate from the average distance between ones in the logical matrix. We converted the logical matrix to a binary vector of 0s and 1s, where 0 represents a dark area and 1 represents a white area in the image. Let A be a vector of length N,

A = [A(1), A(2), A(3), . . ., A(N)]    (9)

Then the number of zeros (Z) and the number of ones (O) in A can be calculated as follows:

Z = N − Σ_{i=1}^{N} A(i)    (10)

O = N − Z    (11)

To calculate the differences between adjacent elements of A, let B = diff(A) return a vector of length N − 1. The elements of B are the differences between adjacent elements of A:

B = [A(2) − A(1), A(3) − A(2), . . ., A(N) − A(N − 1)]    (12)

Let C = find(B) return a vector containing the index of each nonzero element in the array B:

C = [C(1), C(2), C(3), . . ., C(j)]    (13)

where j is the number of nonzero elements in the array B. The differences between these indices are obtained using D = diff(C):

D = [C(2) − C(1), C(3) − C(2), . . ., C(j) − C(j − 1)]    (14)

where D is a vector of length j − 1. The average distance (Dmean) between ones can be expressed as:

Dmean = (1/(j − 1)) Σ_{i=1}^{j−1} D(i)    (15)

Now, we can measure R, the respiratory rate over the 15 s video, as follows:

R = N / Dmean    (16)

To measure the respiratory rate (RR) per minute, we apply Eq. (17):

RR = R × 60 / 15    (17)

We can also measure the timing parameters of the respiration cycle and the inhalation-exhalation ratio using the following equations:

Rc = I + E + P    (18)

I:E = I / E    (19)

where Rc is the respiration cycle, I is the inhalation time in seconds, E is the exhalation time in seconds, P is the pause period and I:E is the inhalation-exhalation ratio. According to some studies [29–31], P for healthy subjects is approximately 1 s, followed by resumption of inhalation.

3. Experimental setup & flowchart

Experiments were done using a Nikon D5300 DSLR camera with a CMOS focal plane. Each input video from this camera was 15 s long at 30 frames per second (fps), with a pixel resolution of 1080 × 720, and was saved in AVI format on the computer. The camera lens was located approximately 1 m away from the baby. The input videos
Fig. 2. Proposed flow chart of the remote respiratory monitoring system.
were recorded for the 8-month-old baby at 10 sleeping positions (5 positions fully clothed and 5 positions half clothed) on consecutive days. The processing was carried out in the MATLAB environment following the flowchart shown in Fig. 2. The first step of the proposed flowchart was to load 15 s of input video of the baby at each of the 10 positions independently. The second step was to magnify the input video 20 times using the developed video magnification technique, with an eight-level wavelet pyramid and a 5th-order elliptic band-pass filter with 0.5% ripple and selected frequencies of 0.4–0.8 Hz, corresponding to 24–48 breaths/min. The magnified video was then converted into frame sequences; the number of video frames (N) is found by multiplying the 15 s movie length by the frame rate (30 fps). Next, motion detection based on frame subtraction was applied to the video frames to detect respiratory motion. Contrast-limited adaptive histogram equalization (CLAHE) [32] was then used to enhance the local contrast of the frame sequences before converting the frames to binary images. Region filling was used to fill holes and gaps in all binary images, and morphological filtering was used for noise removal in the image sequences. Next, ROIs were determined by placing a mask filter on the area of interest, and the white areas in the ROI representing respiratory rhythms were detected and counted in all images, giving a vector of dimension 1 × N. The respiratory rate was measured by applying Eqs. (16) and (17), and the timing parameters of the respiratory cycle, including Rc, I, E and the I:E ratio, were measured using Eqs. (18) and (19).
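The frame-subtraction and measurement stages of this pipeline can be sketched in Python. This is an illustrative re-implementation, not the authors' MATLAB code: MATLAB's diff/find map to numpy.diff/numpy.nonzero; CLAHE, region filling and morphological filtering are omitted; and only rising (0→1) transitions are kept in C, on the assumption that each breath produces one white burst in the ROI so that Dmean equals the breath period in frames.

```python
import numpy as np

def binary_activity_vector(frames, roi, thresh=10):
    """Frame subtraction + thresholding inside the chest ROI.

    frames: (N, H, W) uint8 grayscale magnified frames;
    roi: (row_slice, col_slice). Returns a length-N 0/1 vector A,
    with 1 where the ROI contains an above-threshold (white) area.
    """
    rows, cols = roi
    A = np.zeros(len(frames), dtype=int)
    for i in range(1, len(frames)):
        diff = np.abs(frames[i][rows, cols].astype(int)
                      - frames[i - 1][rows, cols].astype(int))
        A[i] = int(np.count_nonzero(diff > thresh) > 0)
    return A

def respiratory_rate(A, duration_s=15.0):
    """Breaths per minute from the binary vector A, following Eqs. (12)-(17)."""
    A = np.asarray(A, dtype=int)
    N = len(A)
    B = np.diff(A)                # Eq. (12): adjacent differences
    C = np.nonzero(B == 1)[0]     # Eq. (13): rising 0 -> 1 transitions
    D = np.diff(C)                # Eq. (14): spacing between breath onsets
    d_mean = D.mean()             # Eq. (15): mean spacing, frames per breath
    R = N / d_mean                # Eq. (16): breaths in the clip
    return R * 60.0 / duration_s  # Eq. (17): breaths per minute
```

For a synthetic 450-frame clip with one 15-frame white burst every 75 frames, `respiratory_rate` returns 24 breaths/min, the lower edge of the filter's 24–48 breaths/min passband.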
4. Results

Fifteen seconds of video of an eight-month-old female baby was recorded at 30 fps while she was sleeping in different positions. The chest area was selected as the ROI for respiration measurement. Inhaling movements were detected as the white area in the frame sequences, and exhaling movements (including pause periods) as the dark area. The results for the proposed method and the mean absolute error (MAE) values for both magnification techniques against ground-truth data are shown in Table 1. The respiratory parameters were detected by counting the white areas in the frame sequences and the number of frames in each breath. To obtain the frame interval, the length of the video in seconds was divided by the number of frames (15 s/450 = 0.0333 s). Our system, based on the developed magnification technique, detected approximately 6 breaths in 15 s for the baby at position 1, as shown in Fig. 3. Using Eqs. (16) and (17), we found a respiratory rate of 25.13 breaths/min with an absolute percentage error of 0.52%, as shown in Table 1. To calculate the time parameters of the respiratory cycle, we divided 60 s/min by the measured respiratory rate to obtain the respiratory cycle (Rc = 60/25.13 = 2.38 s); that is, each breath occurs every 2.38 s. Rc can also be measured by multiplying the number of frames in each breath by the frame interval (0.0333 s). Our system detected 71 frames for the Rc length (inhalation, exhalation and pause period), corresponding to 71 × 0.0333 = 2.36 s for each breath. The inhalation time (I) was 14 frames (14 × 0.0333 = 0.46 s) and the exhalation time (E) was 57 frames including the pause period (P) (57 × 0.0333 = 1.89 s). To calculate the I:E ratio, we removed 1 s from E as the pause period, giving an I:E ratio of about 1:2 (0.46 s/0.89 s). The respiratory rate and its timing parameters for the baby at position 1 are shown in Fig. 4.
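The arithmetic in this position-1 walkthrough can be checked directly; the snippet below is a sketch using the reported frame counts and the 0.0333 s frame interval at 30 fps.

```python
frame_interval = 0.0333                  # s per frame at 30 fps (~1/30 s)

rr = 25.13                               # measured respiratory rate, breaths/min
rc_from_rr = 60.0 / rr                   # respiratory cycle from RR: ~2.38 s
rc_from_frames = 71 * frame_interval     # same cycle from frame counting: ~2.36 s
inhale = 14 * frame_interval             # I: inhalation time, ~0.46 s
exhale_plus_pause = 57 * frame_interval  # E + P: exhalation incl. pause, ~1.89 s
# Remove the ~1 s pause from E before forming I:E, giving ~0.52, i.e. about 1:2
ie_ratio = inhale / (exhale_plus_pause - 1.0)
```

The two estimates of the cycle length (2.38 s from the rate, 2.36 s from frame counting) agree to within one frame interval.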
In contrast, the measured respiratory rate was 26.37 breaths/min, with an absolute percentage error of 5.48%, when the Eulerian video magnification technique was used. This is because the Eulerian magnification technique is accompanied by noise and artifacts that may be detected as white areas (ones) in the binary vector, changing Dmean. To observe the difference between the video magnification techniques for position 1, we calculated the PSNR values of the video frames from the developed video magnification technique (based on wavelet pyramid decomposition and a temporal elliptic band-pass filter) with respect to the input video frames, and compared them with the Eulerian magnification technique, as shown in Fig. 5. It is clear from Fig. 5 that the PSNR values for our magnification technique are higher than those for the Eulerian video magnification technique. It can also be seen that six respiratory cycles were observed in the PSNR values for the 15 s of magnified video sequences, corresponding to 24 breaths/min. For position 2, shown in Fig. 6, our system detected 24.69 breaths/min, with each breath spanning 73 frames, corresponding to 2.43 s. The inhalation time was 15 frames (0.49 s) and the exhalation time was 58 frames including P (1.93 s), as shown in Fig. 7. The I:E ratio was ≈1:2 (0.49 s/0.93 s) after removing 1 s for P. For position 3, shown in Fig. 8, 23.71 breaths/min was detected with an Rc of 2.53 s (76 frames per breath). The inhalation time was 0.56 s (17 frames) and the exhalation plus pause time was 1.96 s (59 frames), with a ratio of ≈1:2, as shown in Fig. 9. For position 4, with the blanket, shown in Fig. 10, 25.75 breaths/min was detected with an Rc of 2.33 s (70 frames per breath), 0.46 s for I (14 frames), 1.86 s for E (56 frames) and a ratio of ≈1:2, as shown in Fig. 11. From Fig. 12 (position 5), we detected 23.43 breaths/min, with an Rc of 2.56 s (77 frames), 0.56 s for I (17 frames) and 1.99 s for E + P (60 frames)
and a ratio of ≈1:2. The respiratory rate and its timing parameters for the baby at position 5 are shown in Fig. 13. The Bland-Altman method [35,36] was used to assess agreement between the proposed methods and ground-truth data. The 95% limits of agreement were defined as the mean difference ± 1.96 SD, where SD is the standard deviation of the differences in paired measurements. The Bland-Altman plot of agreement between actual measurements (ground-truth data) and readings estimated by the proposed system based on the developed video magnification system is shown in Fig. 14a. As can be seen from Fig. 14a, the reproducibility coefficient (RPC) was 0.67 breaths/min (2.8%) and the mean difference (bias) was 0.21, with limits of agreement of +0.88 and −0.46. Correlation analysis between actual and measured readings was also carried out using the Pearson correlation coefficient (PCC) and Spearman's rho coefficient (SRC) (PCC = 0.966, SRC = 0.9566). When the results from the Eulerian magnification technique were compared with the ground-truth data, the Bland-Altman analysis gave an RPC of 2.7 breaths/min (11%) and a mean difference (bias) of 1.1, with +3.7 and −1.6 as the upper
and lower limits of agreement, as shown in Fig. 14b, with a PCC of 0.7115 and an SRC of 0.7286.

5. Discussion

Our system succeeded in remotely measuring the respiratory rate of a sleeping baby in different poses, even with a blanket on the baby or an unclear ROI. Our experimental results were in close agreement (≈99%) with the actual readings obtained by visual counting directly from the input videos. The accuracy of the system decreased slightly to about 98.5% at positions 4 and 5, when the baby was covered with the blanket, and improved slightly to about 99.3% at positions 6–10, when the baby was half clothed (see Table 1). The number of breaths and the timing parameters of the respiratory cycle were computed from each 15 s video by calculating the time interval of the frame sequences during inhalation and exhalation. The ratio of inhalation to exhalation was about 1:2 for all input videos. All results based on the developed magnification technique fall within <±1 breaths/min, with an MAE of 1.314%, which is likely to be acceptable for remote measurements, while they fall within
Fig. 3. Respiratory rate detection system for position 1 based on the proposed system.
Fig. 4. Number of breaths and timing parameters in 15 s frame sequences at position 1.
Fig. 5. Respiratory signal based on PSNR values of the 15 s magnified video frames for the baby at position 1 using both video magnification techniques (our magnification technique and Eulerian magnification technique).
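The per-frame PSNR curves plotted in Fig. 5 follow Eqs. (7) and (8); a minimal NumPy sketch (function names are illustrative):

```python
import numpy as np

def mse(frame_in, frame_mag):
    """Mean square error between an input and a magnified frame, Eq. (8)."""
    diff = frame_in.astype(np.float64) - frame_mag.astype(np.float64)
    return np.mean(diff ** 2)

def psnr(frame_in, frame_mag, peak=255.0):
    """Peak signal-to-noise ratio in dB for 8-bit frames, Eq. (7)."""
    return 10.0 * np.log10(peak ** 2 / mse(frame_in, frame_mag))
```

Computing `psnr(input_frame, magnified_frame)` for every frame of the 15 s clip yields a PSNR time series in which the breathing cycles appear as periodic variations, as in Fig. 5.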
Fig. 6. Respiratory rate detection system for position 2.
Fig. 7. Number of breaths and timing parameters in 15 s frame sequences at position 2.
Fig. 8. Respiratory rate detection system for position 3.
Fig. 9. Number of breaths and timing parameters in 15 s frame sequences at position 3.
Fig. 10. Respiratory rate detection system for position 4.
Fig. 11. Number of breaths and timing parameters in 15 s frame sequences at position 4.
Fig. 12. Respiratory rate detection system for position 5.
<±2 breaths/min, with an MAE of 6.608%, when the Eulerian magnification technique was used instead. Fig. 5 also shows that our magnification system based on wavelet decomposition has advantages over the Eulerian video magnification technique in terms of PSNR values and video quality, and that the respiratory signal and breath-to-breath peaks can be detected by considering the PSNR values of the frame sequences of the magnified video. The statistical analysis showed that the correlation coefficients (PCC and SRC) and MAE were 0.966, 0.9566 and 1.314% for the developed magnification based method, versus 0.7115, 0.7286 and 6.608% respectively for the Eulerian magnification based method.

6. Conclusion

We have presented a remote respiratory monitoring system to measure the respiratory rate and the timing parameters of the respiratory cycle. Our study demonstrates that respiratory rate can be assessed successfully using a DSLR video camera imaging the baby's chest at different positions, even in the presence of a blanket on the baby or an unclear ROI. Our system was effective despite changing light conditions because it relied on motion magnification rather than skin colour analysis. We developed the video magnification technique to suit our measurements by using wavelet decomposition and an elliptic filter, and then used motion detection based on a frame subtraction method to detect the respiratory movements of the baby's chest in the magnified videos. The respiratory rates and their timing cycle parameters were successfully measured at different baby positions with an accuracy of approximately 99% and an MAE of 1.314%, falling within <±1 breaths/min, while the MAE was 6.608%, falling within <±2 breaths/min, when the Eulerian magnification technique was used instead. The system used in this study may be a suitable candidate to replace the conventional contact sensors used for breathing measurements, and it is expected to create options for further non-contact information extraction. In the future, real-time measurement, an extended number of babies, and further ground-truth comparisons will be undertaken to enable more quantitative analysis and further improvements.
Fig. 13. Number of breaths and timing parameters in 15 s frame sequences at position 5.
Fig. 14. Statistical results based on (a) the developed motion magnification system, (b) the Eulerian video magnification technique.
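The Bland-Altman quantities reported for Fig. 14 (bias and ±1.96 SD limits of agreement) can be reproduced from the Table 1 readings; the sketch below uses the developed-method column and the sample standard deviation.

```python
import numpy as np

def bland_altman(actual, measured):
    """Bias and 95% limits of agreement (mean difference ± 1.96 SD)
    between paired measurements, as used for Fig. 14."""
    actual = np.asarray(actual, dtype=float)
    measured = np.asarray(measured, dtype=float)
    d = measured - actual
    bias = d.mean()
    sd = d.std(ddof=1)  # sample standard deviation of the paired differences
    return bias, bias - 1.96 * sd, bias + 1.96 * sd

# Ground-truth and developed-method readings from Table 1 (breaths/min)
actual = [25, 25, 23, 25, 23, 26, 25, 23, 22, 24]
measured = [25.13, 24.69, 23.71, 25.75, 23.43, 26.1, 25.23, 23.12, 21.8, 24.17]
bias, lo, hi = bland_altman(actual, measured)
# bias ~ 0.21, limits of agreement ~ -0.46 and +0.88, matching Fig. 14a
```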
References [1] A.K. Abbas, K. Heimann, K. Jergus, T. Orlikowsky, S. Leonhardt, Neonatal non-contact respiratory monitoring based on real-time infrared thermography, Biomed. Eng. Online 10 (2011) 93. [2] K. Nakajima, Y. Matsumoto, T. Tamura, Development of real-time image sequence analysis for evaluating posture change and respiratory rate of a subject in bed, Physiol. Meas. 22 (2001) 21–28. [3] F. Zhao, M. Li, Y. Qian, J.Z. Tsien, Remote measurements of heart and respiration rates for telemedicine, PLoS One 8 (2013) e71384. [4] A. Droitcour, V. Lubecke, J. Lin, O. Boric-Lubecke, A microwave radio for Doppler radar sensing of vital signs, Microwave Symposium Digest, 2001 IEEE MTT-S International 1 (2001) 175–178. [5] A.D. Droitcour, T.B. Seto, B.K. Park, S. Yamada, A. Vergara, C. El Hourani, T. Shing, A. Yuen, V.M. Lubecke, O. Boric-Lubecke, Non-contact respiratory rate measurement validation for hospitalized patients, Engineering in Medicine and Biology Society, 2009. EMBC 2009. Annual International Conference of the IEEE (2009) 4812–4815.
[6] O.B. Lubecke, P.-W. Ong, V. Lubecke, 10 GHz Doppler radar sensing of respiration and heart movement, in: Bioengineering Conference, 2002. Proceedings of the IEEE 28th Annual Northeast, IEEE, 2002, pp. 55–56. [7] P. Marchionni, L. Scalise, I. Ercoli, E. Tomasini, An optical measurement method for the simultaneous assessment of respiration and heart rates in preterm infants, Rev. Sci. Instrum. 84 (2013) 121705. [8] G. Matthews, B. Sudduth, M. Burrow, A non-contact vital signs monitor, Crit. Rev. Biomed. Eng. 28 (2000) 173–178. [9] S.D. Min, J.K. Kim, H.S. Shin, Y.H. Yun, C.K. Lee, J.-H. Lee, Noncontact respiration rate measurement system using an ultrasonic proximity sensor, IEEE Sens. J. 10 (2010) 1732–1739. [10] L. Scalise, I. Ercoli, P. Marchionni, E.P. Tomasini, Measurement of respiration rate in preterm infants by laser Doppler vibrometry, in: 2011 IEEE International Workshop on Medical Measurements and Applications Proceedings (MeMeA), IEEE, 2011, pp. 657–661. [11] G.C. Holst, Common Sense Approach to Thermal Imaging, SPIE Optical Engineering Press, Washington, DC, USA, 2000. [12] M. Frigola, J. Amat, J. Pagès, Vision Based Respiratory Monitoring System, in: Proceedings of the 10th Mediterranean Conference on Control and Automation–MED2002, Lisbon, Portugal, 2002, pp. 9–12.
[13] K. Nakajima, A. Osa, H. Miike, A method for measuring respiration and physical activity in bed by optical flow analysis, in: Engineering in Medicine and Biology Society, 1997. Proceedings of the 19th Annual International Conference of the IEEE, IEEE, 1997, pp. 2054–2057. [14] A.K. Abbas, K. Heiman, T. Orlikowsky, S. Leonhardt, Non-contact respiratory monitoring based on real-time IR-thermography, in: World Congress on Medical Physics and Biomedical Engineering, September 7–12, 2009, Springer, Munich, Germany, 2010, pp. 1306–1309. [15] S.Y. Chekmenev, H. Rara, A.A. Farag, Non-contact, wavelet-based measurement of vital signs using thermal imaging, in: The First International Conference on Graphics, Vision, and Image Processing (GVIP), Cairo, Egypt, 2005, pp. 107–112. [16] J. Fei, I. Pavlidis, Analysis of breathing air flow patterns in thermal imaging, in: Engineering in Medicine and Biology Society, 2006. EMBS’06. 28th Annual International Conference of the IEEE, IEEE, 2006, pp. 946–952. [17] R. Murthy, I. Pavlidis, P. Tsiamyrtzis, Touchless monitoring of breathing function, in: Engineering in Medicine and Biology Society, 2004. IEMBS’04. 26th Annual International Conference of the IEEE, IEEE, 2004, pp. 1196–1199. [18] I. Pavlidis, J. Dowdall, N. Sun, C. Puri, J. Fei, M. Garbey, Interacting with human physiology, Comput. Vis. Image Underst. 108 (2007) 150–170. [19] M. Yang, Q. Liu, T. Turner, Y. Wu, Vital sign estimation from passive thermal video, in: IEEE Conference on Computer Vision and Pattern Recognition, 2008. CVPR 2008, IEEE, 2008, pp. 1–8. [20] N. Blanik, A.K. Abbas, B. Venema, V. Blazek, S. Leonhardt, Hybrid optical imaging technology for long-term remote monitoring of skin perfusion and temperature behavior, J. Biomed. Opt. 19 (2014) 016012. [21] F. Bousefsaf, C. Maaoui, A. Pruski, Continuous wavelet filtering on webcam photoplethysmographic signals to remotely assess the instantaneous heart rate, Biomed. Signal Process. Control 8 (2013) 568–574.
[22] L. Nilsson, A. Johansson, S. Kalman, Monitoring of respiratory rate in postoperative care using a new photoplethysmographic technique, J. Clin. Monit. Comput. 16 (2000) 309–315. [23] W. Verkruysse, L.O. Svaasand, J.S. Nelson, Remote plethysmographic imaging using ambient light, Opt. Express 16 (2008) 21434–21445. [24] H.Y. Wu, M. Rubinstein, E. Shih, J.V. Guttag, F. Durand, W.T. Freeman, Eulerian video magnification for revealing subtle changes in the world, ACM Trans. Graph. 31 (2012) 65. [25] B.K. Horn, B.G. Schunck, Determining optical flow, 1981 Technical Symposium East, International Society for Optics and Photonics (1981) 319–331. [26] B.D. Lucas, T. Kanade, An iterative image registration technique with an application to stereo vision, IJCAI (1981) 674–679. [27] Y.A. Al-Najjar, D.C. Soong, Comparison of image quality assessment: PSNR HVS, SSIM, UIQI, Int. J. Sci. Eng. Res. 3 (2012) 1. [28] A. Hore, D. Ziou, Image quality metrics: PSNR vs. SSIM, ICPR (2010) 2366–2369. [29] B. Martin, J. Logemann, R. Shaker, W. Dodds, Coordination between respiration and swallowing: respiratory phase relationships and temporal integration, J. Appl. Physiol. 76 (1994) 714–723. [30] L. Scelza, C.S. Greco, A.J. Lopes, P.L. de Melo, Dysphagia in Chronic Obstructive Pulmonary Disease, InTech, 2015, pp. 201–227. [31] K. Svartengren, P.-Å. Lindestad, M. Svartengren, G. Bylin, K. Philipson, P. Camner, Deposition of inhaled particles in the mouth and throat of asthmatic subjects, Eur. Respir. J. 7 (1994) 1467–1473. [32] K. Zuiderveld, Contrast limited adaptive histogram equalization, in: Graphics Gems IV, Academic Press Professional, Inc., 1994, pp. 474–485.