Development of a method for determining the dominant type of human breathing pattern based on computer vision technologies, motion capture systems and machine learning
UDC 004.588
DOI: 10.26102/2310-6018/2022.39.4.016
The study addresses the lack of methods for determining the dominant breathing pattern that could be used in software products supporting patients with respiratory insufficiency and their rehabilitation during inpatient and outpatient treatment. Existing methods are either too labor-intensive to implement because of the excessive number of markers required by motion capture systems, economically unviable because of the cost of the equipment itself, or developed only for research purposes and therefore not applicable in clinical practice. In this regard, the article is aimed at developing a method for determining the breathing type that could later be employed for the automated rehabilitation of patients with respiratory insufficiency. The study applied computer vision and machine learning methods, as well as methods based on motion capture technologies. The article presents methods for determining the position of markers in space and for analyzing the type of human breathing (thoracic, abdominal, or mixed) in real time from the data obtained from motion capture system markers. The materials of the article are of practical value for the medical rehabilitation of patients with respiratory insufficiency: they make it possible to optimize rehabilitation workflows, i.e., to reduce the labor and time costs of rehabilitation specialists.
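As a purely illustrative aside (this is not the method described in the article itself), the dominant breathing type could in principle be estimated by comparing displacement amplitudes of thoracic and abdominal markers. The marker grouping, threshold, and function names in the sketch below are assumptions introduced only for illustration:

```python
# Hypothetical sketch: classify the dominant breathing pattern by comparing
# peak-to-peak displacement of chest vs. abdominal markers over a recording
# window. The threshold and grouping are illustrative assumptions, not the
# parameters published in the article.
import numpy as np

def dominant_breathing_type(chest_z: np.ndarray,
                            abdomen_z: np.ndarray,
                            ratio_threshold: float = 1.5) -> str:
    """chest_z, abdomen_z: 1-D arrays of marker displacement (e.g., mm) over time."""
    chest_amp = np.ptp(chest_z)      # peak-to-peak excursion of thoracic markers
    abdomen_amp = np.ptp(abdomen_z)  # peak-to-peak excursion of abdominal markers
    if chest_amp > ratio_threshold * abdomen_amp:
        return "thoracic"
    if abdomen_amp > ratio_threshold * chest_amp:
        return "abdominal"
    return "mixed"

# Example with synthetic signals: abdominal motion dominates.
t = np.linspace(0, 10, 300)                      # 10 s at 30 frames per second
chest = 2.0 * np.sin(2 * np.pi * 0.25 * t)       # ~15 breaths/min, small chest excursion
abdomen = 6.0 * np.sin(2 * np.pi * 0.25 * t)     # larger abdominal excursion
print(dominant_breathing_type(chest, abdomen))   # -> "abdominal"
```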
Keywords: computer vision, neural networks, motion capture systems, patient rehabilitation systems, detection of breathing patterns
For citation: Zubkov A.V., Donskaya A.R., Busheneva S.N., Orlova Y.A., Rybchits G.M. Development of a method for determining the dominant type of human breathing pattern based on computer vision technologies, motion capture systems and machine learning. Modeling, Optimization and Information Technology. 2022;10(4). URL: https://moitvivt.ru/ru/journal/pdf?id=1200 DOI: 10.26102/2310-6018/2022.39.4.016 (In Russ.).
Received 02.06.2022
Revised 14.12.2022
Accepted 27.12.2022
Published 31.12.2022