Quality assurance for GPU-accelerated bio-image analysis

Abstract number
39
DOI
10.22443/rms.mmc2023.39
Corresponding Email
[email protected]
Session
Reproducibility of Data Analysis at Scale
Authors
Dr. Robert Haase (1)
Affiliations
1. DFG Cluster of Excellence "Physics of Life", TU Dresden
Keywords


Abstract text

Modern computing hardware such as high-performance computing systems and graphics processing units (GPUs) enables scientists to process image data at scales that were unthinkable a decade ago. Modern algorithmic techniques such as machine learning, deep learning and large language models also contribute to the need to rethink how we analyze image data on a daily basis. Common procedures such as ImageJ Macro programming are increasingly being replaced by Python programming, and the task of programming itself is being taken over by conventional and artificial-intelligence-based code generators. When new algorithmic techniques are published, their authors typically demonstrate the advantages of the new technique by measuring higher performance, better quality or improvements in other criteria. However, when searching for a modern drop-in replacement for a common procedure such as nuclei segmentation or shape quantification, users may rather want to know whether the new approach delivers results equivalent to those of the old one. In this talk I will present established techniques for this task, such as equivalence testing and Bland-Altman analysis, in the context of replacing common image analysis workflows with GPU-accelerated counterparts. This talk is loosely related to this publication: https://link.springer.com/chapter/10.1007/978-3-030-76394-7_5
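
To make the two comparison techniques mentioned above concrete, the following is a minimal sketch in Python (the language referenced in the abstract) of a Bland-Altman analysis and a paired TOST equivalence test applied to hypothetical paired measurements from an established workflow and a GPU-accelerated replacement. The data, the sample size and the ±2 µm² equivalence margin are illustrative assumptions and not part of the talk.

```python
# Sketch: comparing object-size measurements from a classical workflow and a
# GPU-accelerated replacement via Bland-Altman statistics and a paired TOST
# equivalence test. All data and the equivalence margin are illustrative.
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)

# Hypothetical paired measurements (e.g. nuclei areas in µm²) of the same objects,
# once produced by the established workflow and once by the GPU-accelerated one.
cpu_measurements = rng.normal(loc=100.0, scale=15.0, size=50)
gpu_measurements = cpu_measurements + rng.normal(loc=0.2, scale=1.0, size=50)

# --- Bland-Altman analysis ---------------------------------------------------
# Report the difference of each pair against its mean; the bias and the 95 %
# limits of agreement describe how far the two methods typically disagree.
diffs = gpu_measurements - cpu_measurements
bias = diffs.mean()
spread = 1.96 * diffs.std(ddof=1)
print(f"bias = {bias:.3f}, limits of agreement = [{bias - spread:.3f}, {bias + spread:.3f}]")

# --- Equivalence testing (paired TOST) ---------------------------------------
# Two one-sided t-tests on the paired differences against a chosen equivalence
# margin. If both p-values are below alpha, the methods are considered
# equivalent within ±margin. The margin of 2 µm² is an assumed tolerance.
margin = 2.0
p_lower = stats.ttest_1samp(diffs, -margin, alternative="greater").pvalue
p_upper = stats.ttest_1samp(diffs, margin, alternative="less").pvalue
p_tost = max(p_lower, p_upper)
verdict = "equivalent" if p_tost < 0.05 else "not shown equivalent"
print(f"TOST p-value = {p_tost:.4f} ({verdict} within ±{margin})")
```

In practice, the differences and means would typically also be visualized as a Bland-Altman plot, and the critical subject-matter decision is the choice of the equivalence margin, i.e. how large a deviation between the old and the new workflow is still acceptable.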