Toronto Metropolitan University

Interpreting Uncertainty in Model Predictions in Bayesian Neural Networks for COVID-19 Diagnosis

thesis
posted on 2023-08-29, 16:07 authored by Gayathiri Murugamoorthy

COVID-19, due to its accelerated spread, has created the need for assistive tools that enable faster diagnosis alongside RT-PCR. Chest X-rays of COVID cases tend to show changes in the lungs, such as ground-glass opacities and peripheral consolidations, which can be detected by deep neural networks. However, their point-estimate nature and failure to capture uncertainty in predictions make them less reliable for healthcare adoption. There have been several works on the interpretability of point-estimate deep neural networks, but very limited work has been found on interpreting uncertainty in a COVID prediction and decomposing it into model or data uncertainty. To address this, we compute uncertainty in predictions with a Bayesian Convolutional Neural Network and develop a visualization framework to address interpretability. The framework aims to quantify the contribution of individual features in the chest X-ray images to predictive uncertainty. Provided as an assistive tool, it can help the radiologist understand why the model produced a prediction and whether the regions of interest the model captured for that prediction are significant for diagnosis.
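The decomposition the abstract refers to is commonly done by collecting multiple stochastic forward passes (e.g. Monte Carlo dropout) from the Bayesian network and splitting the predictive entropy into an expected-entropy term (data/aleatoric uncertainty) and a mutual-information term (model/epistemic uncertainty). A minimal NumPy sketch of that split, assuming the per-pass softmax outputs are already available (the thesis's actual network and sampling procedure are not shown here):

```python
import numpy as np

def decompose_uncertainty(mc_probs):
    """Split predictive uncertainty from T stochastic forward passes.

    mc_probs: array of shape (T, C) holding softmax outputs from T
    Monte Carlo passes of a Bayesian CNN (hypothetical inputs here).
    Returns (total, aleatoric, epistemic) in nats.
    """
    eps = 1e-12  # guard against log(0)
    mean_p = mc_probs.mean(axis=0)
    # Total uncertainty: entropy of the mean predictive distribution.
    total = -np.sum(mean_p * np.log(mean_p + eps))
    # Data (aleatoric) uncertainty: expected entropy across passes.
    aleatoric = -np.mean(np.sum(mc_probs * np.log(mc_probs + eps), axis=1))
    # Model (epistemic) uncertainty: mutual information = total - aleatoric.
    epistemic = total - aleatoric
    return total, aleatoric, epistemic

# Example: four MC-dropout samples for a binary COVID / non-COVID output.
samples = np.array([[0.9, 0.1], [0.6, 0.4], [0.8, 0.2], [0.7, 0.3]])
total, aleatoric, epistemic = decompose_uncertainty(samples)
```

High epistemic uncertainty (the passes disagree) suggests the model itself is unsure and more data or a second opinion is warranted; high aleatoric uncertainty suggests the image itself is ambiguous.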

History

Language

English

Degree

  • Master of Engineering

Program

  • Electrical and Computer Engineering

Granting Institution

Ryerson University

LAC Thesis Type

  • MRP

Thesis Advisor

Dr. Naimul Khan

Year

2021
