An Artificial Intelligent System for Prostate Cancer Diagnosis in Whole Slide Images

Bibliographic details
Authors: Saha Sajib
Vignarajan Janardhan
Flesch Adam
Jelinko Patrik
Gorog Petra
Szep Eniko
Tóth Csaba
Gombas Peter
Schvarcz Tibor
Mihaly Orsolya
Kapin Marianna
Zub Alexandra
Kuthi Levente
Tiszlavicz László
Glasz Tibor
Frost Shaun
Document type: Article
Published: 2024
Series: JOURNAL OF MEDICAL SYSTEMS 48 No. 1
Keywords:
doi:10.1007/s10916-024-02118-3

mtmt:35492454
Online Access:http://publicatio.bibl.u-szeged.hu/39147
Descriptive data
Abstract: In recent years there has been significant demand for computer-assisted diagnostic tools to assess prostate cancer using whole slide images (WSIs). In this study we develop and validate a machine learning system for cancer assessment, including detection of perineural invasion and measurement of cancer proportion, to meet clinical reporting needs. The system analyses the whole slide image in three consecutive stages: tissue detection, classification, and slide-level analysis. The whole slide image is divided into smaller regions (patches). The tissue detection stage relies on traditional machine learning to identify WSI patches containing tissue, which are then assessed at the classification stage, where deep learning algorithms detect and classify cancer tissue. At the slide-level analysis stage, slide-level information is generated by aggregating all the patch-level information for the slide. A total of 2340 haematoxylin and eosin stained slides were used to train and validate the system. A medical team of 11 board-certified pathologists with prostatic pathology subspeciality competences, working independently in 4 different medical centres, performed the annotations. The team created pixel-level annotations based on an agreed set of 10 annotation terms, determined by medical relevance and prevalence. The system achieved an accuracy of 99.53% in tissue detection, with sensitivity and specificity of 99.78% and 99.12% respectively. In classifying tissue terms at 5x magnification, the system achieved an accuracy of 92.80%, with sensitivity and specificity of 92.61% and 99.25% respectively. For 10x magnification these values were 91.04%, 90.49%, and 99.07% respectively; for 20x magnification they were 84.71%, 83.95%, and 90.13%.
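The three-stage flow described in the abstract can be sketched as follows. This is a minimal illustrative sketch only: the patch feature, the background threshold, the intensity-band rule standing in for the deep-learning classifier, and the `cancer_fraction` output name are all assumptions for illustration, not the paper's actual models or parameters.

```python
from dataclasses import dataclass

@dataclass
class Patch:
    mean_intensity: float  # toy scalar feature; the real system operates on image pixels

def detect_tissue(patch: Patch, background_threshold: float = 0.9) -> bool:
    """Stage 1 (tissue detection): stand-in for the traditional ML step.
    Bright patches are treated as glass background, darker ones as tissue."""
    return patch.mean_intensity < background_threshold

def classify_patch(patch: Patch) -> str:
    """Stage 2 (classification): placeholder for the deep-learning classifier.
    A toy intensity rule maps each tissue patch to an annotation term."""
    return "cancer" if patch.mean_intensity < 0.3 else "benign"

def analyse_slide(patches: list) -> dict:
    """Stage 3 (slide-level analysis): aggregate patch-level calls into
    slide-level information, e.g. the proportion of cancerous tissue."""
    tissue = [p for p in patches if detect_tissue(p)]
    labels = [classify_patch(p) for p in tissue]
    n_cancer = sum(lab == "cancer" for lab in labels)
    return {
        "n_patches": len(patches),
        "n_tissue": len(tissue),
        "cancer_fraction": n_cancer / len(tissue) if tissue else 0.0,
    }

# Four toy patches: one background, three tissue (two of them "cancer").
slide = [Patch(0.95), Patch(0.2), Patch(0.5), Patch(0.1)]
report = analyse_slide(slide)
```

The slide-level report is what supports clinical reporting needs such as the cancer-proportion measurement mentioned in the abstract; any real implementation would replace the toy rules with the trained models.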
Extent/Physical description: 16
ISSN:0148-5598