Interpreting Uncertainty in Model Predictions in Bayesian Neural Networks for COVID-19 Diagnosis
COVID-19, due to its accelerated spread, has created a need for assistive tools that enable faster diagnosis alongside RT-PCR. Chest X-rays of COVID-19 cases tend to show changes in the lungs, such as ground-glass opacities and peripheral consolidations, which can be detected by deep neural networks. However, their point-estimate nature and failure to capture the uncertainty in a prediction make them less reliable for healthcare adoption. There have been several works on the interpretability of point-estimate deep neural networks, but very limited work on interpreting the uncertainty in a COVID-19 prediction and decomposing it into model and data uncertainty. To address this, we compute uncertainty in predictions with a Bayesian Convolutional Neural Network and develop a visualization framework for interpretability. The framework aims to explain the contribution of individual features in the chest X-ray images to predictive uncertainty. Provided as an assistive tool, it can help the radiologist understand why the model arrived at a prediction and whether the regions of interest the model relied on for that prediction are significant for diagnosis.
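As a rough illustration of the kind of computation the abstract describes (not the thesis's exact implementation), the sketch below estimates predictive uncertainty for a classifier using Monte Carlo dropout, a common approximation to Bayesian inference in neural networks, and decomposes it into data (aleatoric) and model (epistemic) components. The model, number of samples, and function names are illustrative assumptions.

```python
# Minimal sketch, assuming a PyTorch classifier that contains dropout layers.
import torch
import torch.nn.functional as F

def mc_dropout_uncertainty(model, x, num_samples=30):
    """Return mean prediction plus total, aleatoric, and epistemic uncertainty."""
    # Keep stochastic layers (dropout) active at inference time. Note that
    # train() also affects batch norm; in practice one may enable only the
    # dropout modules.
    model.train()
    with torch.no_grad():
        # num_samples stochastic forward passes -> (T, batch, classes)
        probs = torch.stack(
            [F.softmax(model(x), dim=-1) for _ in range(num_samples)]
        )
    mean_probs = probs.mean(dim=0)

    # Total uncertainty: entropy of the mean predictive distribution.
    total = -(mean_probs * torch.log(mean_probs + 1e-12)).sum(dim=-1)
    # Aleatoric (data) uncertainty: expected entropy across samples.
    aleatoric = -(probs * torch.log(probs + 1e-12)).sum(dim=-1).mean(dim=0)
    # Epistemic (model) uncertainty: mutual information = total - aleatoric.
    epistemic = total - aleatoric
    return mean_probs, total, aleatoric, epistemic
```

Under this decomposition, high epistemic uncertainty flags inputs the model has seen too little of (and could improve on with more data), while high aleatoric uncertainty flags inherently ambiguous images, which is the distinction a radiologist-facing tool would surface.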
Language
- English
Degree
- Master of Engineering
Program
- Electrical and Computer Engineering
Granting Institution
- Ryerson University
LAC Thesis Type
- MRP