Veridical Data Science and PCS Uncertainty Quantification
Bin Yu
Statistics, EECS, and Center for Computational Biology, UC Berkeley
Data science is central to AI and has driven many of the recent advances in biomedicine and beyond. Human judgment calls are ubiquitous at every step of a data science life cycle (DSLC): problem formulation, data cleaning, exploratory data analysis (EDA), modeling, and reporting. Such judgment calls are often responsible for the "dangers" of AI, because they create a universe of hidden uncertainties well beyond sample-to-sample uncertainty.
To mitigate these dangers, veridical (truthful) data science is introduced based on three principles: Predictability, Computability, and Stability (PCS). The PCS framework and its documentation unify, streamline, and expand on the ideas and best practices of statistics and machine learning. At every step of a DSLC, PCS emphasizes reality checks through predictability, considers computability up front, and accounts for expanded sources of uncertainty, including those arising from data curation/cleaning and algorithm choice, to build more trust in data results. PCS will be showcased through collaborative research in prostate cancer detection. We will end with ongoing research on PCS uncertainty quantification (UQ).
PCS-UQ addresses two additional prominent sources of uncertainty in a DSLC: the reasonable choices practitioners make at the data cleaning and modeling stages, in addition to the uncertainty arising from data collection.
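As a rough illustration of this idea (not the implementation presented in the talk), the hypothetical Python sketch below enumerates a few reasonable data-cleaning choices and algorithm choices, fits every combination, and reports the spread of test-set predictions as a perturbation-based uncertainty interval. The data are synthetic and the names cleaning_choices and model_choices are invented for this example; scikit-learn is assumed to be available.

```python
# A minimal, hypothetical sketch of PCS-style perturbation intervals:
# enumerate reasonable data-cleaning and modeling choices, fit each
# combination, and summarize the spread of predictions.
import numpy as np
from sklearn.datasets import make_regression
from sklearn.ensemble import RandomForestRegressor
from sklearn.linear_model import Ridge
from sklearn.model_selection import train_test_split

# Synthetic data standing in for a real DSLC problem.
X, y = make_regression(n_samples=400, n_features=10, noise=5.0, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# Hypothetical data-cleaning perturbations: different outlier-clipping rules.
cleaning_choices = {
    "no_clipping": lambda X: X,
    "clip_3sd": lambda X: np.clip(X, -3.0, 3.0),
    "clip_2sd": lambda X: np.clip(X, -2.0, 2.0),
}

# Modeling perturbations: two reasonable algorithm choices.
model_choices = {
    "ridge": lambda: Ridge(alpha=1.0),
    "random_forest": lambda: RandomForestRegressor(n_estimators=200, random_state=0),
}

# Fit every (cleaning, model) combination and stack the test-set predictions.
all_preds = []
for clean in cleaning_choices.values():
    for make_model in model_choices.values():
        model = make_model().fit(clean(X_train), y_train)
        all_preds.append(model.predict(clean(X_test)))
all_preds = np.vstack(all_preds)

# Per-test-point spread across all reasonable choices as an uncertainty interval.
lower, upper = np.percentile(all_preds, [5, 95], axis=0)
print("mean interval width across test points:", np.mean(upper - lower))
```

In this sketch, the interval width reflects how much the prediction for each test point moves as the cleaning and modeling judgment calls vary, which is the kind of hidden uncertainty, beyond sampling variability, that PCS-UQ aims to surface.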