Biomarkers for Traumatic Brain Injury: Data Standards and Statistical Considerations

Abstract
Recent biomarker innovations hold potential for transforming the diagnosis, prognostic modeling, and precision therapeutic targeting of traumatic brain injury (TBI). However, many biomarkers, including brain imaging, genomics, and proteomics, involve vast quantities of high-throughput and high-content data. Management, curation, analysis, and evidence synthesis of these data are not trivial tasks. In this review we discuss data management concepts and statistical and data-sharing strategies for biomarker data in the context of TBI research. We propose that the application of biomarkers involves three distinct steps: discovery, evaluation, and evidence synthesis. First, complex, high-volume data must be reduced to useful data elements at the biomarker discovery stage. Second, inferential statistical approaches must be applied to these biomarker data elements to assess their validity and clinical utility. Lastly, the relevant research must be synthesized to support practice guidelines and to enable health decisions informed by the highest-quality, most up-to-date evidence available. We frame our discussion around recent experiences from the International Initiative for Traumatic Brain Injury Research (InTBIR), with a specific focus on four major clinical projects (TRACK-TBI, CENTER-TBI, CREACTIVE, and ADAPT) currently enrolling subjects in North America and Europe. We discuss common data elements, data collection efforts, and data-sharing opportunities and challenges, and we examine the statistical techniques required for successful adoption and use of biomarkers in the clinic as a foundation for precision medicine in TBI.