
Discrimination of multiple sclerosis using OCT images from two different centers

Khodabandeh, Zahra; Rabbani, Hossein; Ashtari, Fereshteh; Zimmermann, Hanna G.; Motamedi, Seyedamirhosein; Brandt, Alexander U.; Paul, Friedemann; Kafieh, Rahele



Abstract

Background: Multiple sclerosis (MS) is one of the most prevalent chronic inflammatory diseases, caused by demyelination and axonal damage in the central nervous system. Structural retinal imaging via optical coherence tomography (OCT) shows promise as a noninvasive biomarker for monitoring MS. There are successful reports of applying artificial intelligence (AI) to cross-sectional OCTs in ophthalmologic diseases. However, the alteration of sub-retinal thicknesses in MS is subtle compared with other ophthalmologic diseases. Therefore, raw cross-sectional OCTs are replaced with multilayer segmented OCTs for discriminating MS patients from healthy controls (HCs).

Methods: To conform to the principles of trustworthy AI, interpretability is provided by visualizing the regional contribution of each layer to classification performance with the proposed occlusion sensitivity approach. Robustness of the classification is demonstrated by testing the algorithm on a new, independent dataset. The most discriminative features from different topologies of the multilayer segmented OCTs are selected by a dimension reduction method. A support vector machine (SVM), random forest (RF), and artificial neural network (ANN) are used for classification. Patient-wise cross-validation (CV) is used to evaluate performance, with training and test folds containing records from different subjects.

Results: The most discriminative topology is a square of 40 pixels, and the most influential sub-retinal layers are the combined ganglion cell and inner plexiform layer (GCIPL) and the inner nuclear layer (INL). The linear SVM achieved 88% accuracy (standard deviation (std) = 0.49 over 10 runs, indicating repeatability), 78% precision (std = 1.48), and 63% recall (std = 1.35) in discriminating MS patients from HCs using macular multilayer segmented OCTs.

Conclusion: The proposed classification algorithm is expected to help neurologists in the early diagnosis of MS. This paper distinguishes itself from other studies by employing two distinct datasets, which enhances the robustness of its findings compared with previous studies that lacked external validation. The study deliberately avoids deep learning methods owing to the limited quantity of available data and demonstrates that favorable outcomes can be achieved without them.
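The sketch below illustrates the kind of evaluation pipeline the abstract describes: patient-wise cross-validation with a linear SVM on thickness features from multilayer segmented OCT, with a generic dimension reduction step. It is a minimal, hypothetical example; the feature layout, the choice of PCA, and all array shapes are assumptions for illustration, not the authors' exact implementation.

# Minimal sketch (assumed, not the paper's code): patient-wise CV with a
# linear SVM on macular layer-thickness features.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.model_selection import GroupKFold
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC
from sklearn.metrics import accuracy_score, precision_score, recall_score

rng = np.random.default_rng(0)

# Hypothetical data: one feature vector per OCT record, e.g. mean GCIPL and
# INL thickness within 40x40-pixel macular patches, flattened into a vector.
n_records, n_features = 300, 200
X = rng.normal(size=(n_records, n_features))
y = rng.integers(0, 2, size=n_records)               # 1 = MS, 0 = healthy control
subject_ids = rng.integers(0, 100, size=n_records)   # several records per subject

# Patient-wise CV: GroupKFold keeps every record of a subject in one fold,
# so training and test folds never share subjects.
cv = GroupKFold(n_splits=5)
model = make_pipeline(StandardScaler(), PCA(n_components=20), SVC(kernel="linear"))

accs, precs, recs = [], [], []
for train_idx, test_idx in cv.split(X, y, groups=subject_ids):
    model.fit(X[train_idx], y[train_idx])
    pred = model.predict(X[test_idx])
    accs.append(accuracy_score(y[test_idx], pred))
    precs.append(precision_score(y[test_idx], pred, zero_division=0))
    recs.append(recall_score(y[test_idx], pred, zero_division=0))

print(f"accuracy  {np.mean(accs):.2f} +/- {np.std(accs):.2f}")
print(f"precision {np.mean(precs):.2f} +/- {np.std(precs):.2f}")
print(f"recall    {np.mean(recs):.2f} +/- {np.std(recs):.2f}")

Grouping the splits by subject is the design choice emphasized in the abstract: it prevents records from the same patient appearing in both training and test folds, which would otherwise inflate the reported accuracy.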

Citation

Khodabandeh, Z., Rabbani, H., Ashtari, F., Zimmermann, H. G., Motamedi, S., Brandt, A. U., Paul, F., & Kafieh, R. (2023). Discrimination of multiple sclerosis using OCT images from two different centers. Multiple Sclerosis and Related Disorders, 77(September), Article 104846. https://doi.org/10.1016/j.msard.2023.104846

Journal Article Type Article
Acceptance Date Jun 19, 2023
Online Publication Date Jun 24, 2023
Publication Date Jul 5, 2023
Deposit Date Jun 30, 2023
Publicly Available Date Jun 30, 2023
Journal Multiple Sclerosis and Related Disorders
Print ISSN 2211-0348
Electronic ISSN 2211-0356
Publisher Elsevier
Peer Reviewed Peer Reviewed
Volume 77
Issue September
Article Number 104846
DOI https://doi.org/10.1016/j.msard.2023.104846
Public URL https://durham-repository.worktribe.com/output/1169053

Files


Journal Article (In Press, Corrected Pre-proof) (1.2 Mb)
PDF

Publisher Licence URL
http://creativecommons.org/licenses/by-nc-nd/4.0/

Copyright Statement
In Press, Corrected Pre-proof © 2023 The Author(s). Published by Elsevier B.V.
This is an open access article under the CC BY-NC-ND license
(http://creativecommons.org/licenses/by-nc-nd/4.0/)




