A fused-image-based approach to detect obstructive sleep apnea using a single-lead ECG and a 2D convolutional neural network.

Citation metadata

Date: Apr. 26, 2021
From: PLoS ONE(Vol. 16, Issue 4)
Publisher: Public Library of Science
Document Type: Report
Length: 8,067 words
Lexile Measure: 1540L


Abstract:

Obstructive sleep apnea (OSA) is a common chronic sleep disorder that disrupts breathing during sleep and is associated with many other medical conditions, including hypertension, coronary heart disease, and depression. Clinically, the standard for diagnosing OSA is nocturnal polysomnography (PSG). However, this requires expert human intervention and considerable time, which limits the availability of OSA diagnosis in public health sectors. Therefore, electrocardiogram (ECG)-based methods for OSA detection have been proposed to automate the polysomnography procedure and reduce its discomfort. So far, most of the proposed approaches rely on feature engineering, which calls for advanced expert knowledge and experience. This paper proposes a novel fused-image-based technique that detects OSA using only a single-lead ECG signal. In the proposed approach, a convolutional neural network extracts features automatically from images created from one-minute ECG segments. The proposed network comprises 37 layers, including four residual blocks, a dense layer, a dropout layer, and a softmax layer. In this study, three time-frequency representations, namely the scalogram, the spectrogram, and the Wigner-Ville distribution, were used to investigate the effectiveness of the fused-image-based approach. We found that blending scalogram and spectrogram images further improved the system's discriminative characteristics. Seventy ECG recordings from the PhysioNet Apnea-ECG database were used to train and evaluate the proposed model using 10-fold cross-validation. The results of this study demonstrated that the proposed classifier can perform OSA detection with an average accuracy, recall, and specificity of 92.4%, 92.3%, and 92.6%, respectively, for the fused spectral images.
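The abstract's pipeline (one-minute ECG segment → time-frequency image → fused image) can be sketched as follows. This is a minimal illustration, not the authors' implementation: the window length, overlap, log scaling, and equal-weight pixel averaging are all assumptions, and the second image is a placeholder for a scalogram (which in practice would come from a continuous wavelet transform, e.g. via PyWavelets). Apnea-ECG recordings are sampled at 100 Hz, so a one-minute segment has 6,000 samples.

```python
import numpy as np
from scipy import signal

fs = 100                                  # Apnea-ECG sampling rate (Hz)
segment = np.random.randn(60 * fs)        # stand-in for a 1-minute ECG segment

# Spectrogram: short-time Fourier magnitude; nperseg/noverlap are
# illustrative choices, not values from the paper
f, t, Sxx = signal.spectrogram(segment, fs=fs, nperseg=256, noverlap=128)
spec_img = np.log1p(Sxx)                  # log scaling compresses dynamic range

# Normalize each image to [0, 1] so different transforms are comparable
spec_img = (spec_img - spec_img.min()) / (spec_img.max() - spec_img.min())

# Placeholder for the scalogram image (would be a CWT of the same segment,
# resized to the spectrogram's shape); reused here only to show the fusion step
scal_img = spec_img.copy()

# Fuse by pixel-wise averaging (equal weights are an assumption)
fused = 0.5 * spec_img + 0.5 * scal_img
```

The fused array would then be treated as a single-channel input image for the 2D CNN described in the paper.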

Source Citation

Gale Document Number: GALE|A659723389