Abstract
Techniques that assume linear, time-invariant systems have been used to characterize pairs of indicator dilution curves. As a basis for fully describing the relation between left ventricular (LV) and myocardial (MYC) time-density curves, produced by an intravenous contrast medium and measured by ultrafast CT, the assumption of time invariance was tested using recursive least squares regression and CUSUM, a test for time variability of regression parameters. Using data from anesthetized dogs with concomitant microsphere measurements, constant and time-varying regression models, MYC(t) = b(t)LV(t-1), were generated from time-density curves in two flow groups: Group 1 (myocardial blood flow (MBF) < 2 ml/min/gm, n = 11) and Group 2 (MBF > 2 ml/min/gm, n = 10). The time-varying regression models had lower root mean square errors: 0.6 ± 1.1 and 0.5 ± 0.8, versus 7.3 ± 3.5 and 4.1 ± 1.6 for the constant models, in Groups 1 and 2, respectively. Significant time variability (p < 0.05) by CUSUM was found in 9 of 11 Group 1 models and 7 of 10 Group 2 models. Myocardial blood volume was estimated as the average value of b(t) over the rising portion of the LV curve. Myocardial blood flow was then calculated as myocardial blood volume divided by coronary transit time, determined from gamma variate fits of the LV curve and of a scaled, shifted LV curve, with excellent results over a wide range of flows (r = 0.93, y = 0.92x + 0.28, range 0.4 to 6.7 ml/min/gm). These results show that measurement of increased myocardial blood flow is possible with an intravenous contrast medium, and that contrast medium moves from the intravascular to the extravascular space during the course of its first pass.
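As an illustrative sketch of the time-invariance test, the following Python fragment (a hypothetical implementation, not the authors' code) applies single-coefficient recursive least squares to the model MYC(t) = b(t)LV(t-1) and forms the cumulative sum of standardized recursive residuals against the conventional Brown-Durbin-Evans 5% boundary constant of 0.948; the input arrays lv and myc, the forgetting factor lam, and the boundary details are assumptions.

```python
import numpy as np

def rls_cusum(lv, myc, lam=1.0):
    """Recursive least squares for myc[t] = b * lv[t-1] plus a CUSUM test.

    With lam = 1 the fit is the constant-coefficient model, and the
    recursive residuals feed the CUSUM test of parameter constancy;
    lam < 1 yields a time-varying estimate b(t).
    """
    x = lv[:-1]            # regressor: LV density one frame earlier
    y = myc[1:]            # response: myocardial density
    n = len(y)
    b, P = 0.0, 1e6        # coefficient estimate and its scaled variance
    b_t = np.empty(n)      # trajectory of b(t)
    w = np.empty(n)        # standardized recursive residuals

    for t in range(n):
        f = lam + x[t] * P * x[t]                 # innovation variance factor
        w[t] = (y[t] - b * x[t]) / np.sqrt(f)     # residual before updating b
        k = P * x[t] / f                          # gain
        b += k * (y[t] - b * x[t])
        P = (P - k * x[t] * P) / lam
        b_t[t] = b

    # CUSUM of recursive residuals: crossing the 5% boundary indicates
    # time-varying regression parameters.
    w = w[1:]                                     # drop the start-up residual
    W = np.cumsum(w) / w.std(ddof=1)
    r = np.arange(1, len(w) + 1)
    bound = 0.948 * (np.sqrt(len(w)) + 2.0 * r / np.sqrt(len(w)))
    time_varying = np.any(np.abs(W) > bound)
    return b_t, W, bound, time_varying
```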
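The flow computation can be sketched in the same spirit. In the fragment below (again hypothetical; SciPy's curve_fit stands in for whatever fitting procedure the authors used), a gamma variate is fitted to the LV curve, the myocardial curve is modeled as a scaled, shifted copy of that fit so that the shift tau serves as the coronary transit time, and flow is the average b(t) over the rising portion of the LV curve divided by tau; the variable names, initial guesses, and seconds-to-minutes conversion are assumptions.

```python
import numpy as np
from scipy.optimize import curve_fit

def gamma_variate(t, A, t0, alpha, beta):
    # A * (t - t0)^alpha * exp(-(t - t0)/beta), zero before arrival time t0
    dt = np.clip(t - t0, 0.0, None)
    return A * dt**alpha * np.exp(-dt / np.maximum(beta, 1e-9))

def myocardial_blood_flow(t, lv, myc, b):
    """Estimate MBF (ml/min/gm); t in seconds, b(t) aligned with t."""
    # Fit a gamma variate to the LV time-density curve.
    p0 = [lv.max(), t[np.argmax(lv)] / 2.0, 2.0, 2.0]
    lv_p, _ = curve_fit(gamma_variate, t, lv, p0=p0, maxfev=20000)

    # Model MYC as a scaled, shifted copy of the fitted LV curve; the
    # shift tau (seconds) is taken as the coronary transit time.
    def shifted(tt, scale, tau):
        return scale * gamma_variate(tt - tau, *lv_p)
    (scale, tau), _ = curve_fit(shifted, t, myc, p0=[0.1, 2.0], maxfev=20000)

    # Myocardial blood volume: average b(t) over the rising portion of
    # the LV curve, from fitted onset t0 to the peak at t0 + alpha*beta.
    t_peak = lv_p[1] + lv_p[2] * lv_p[3]
    rising = (t >= lv_p[1]) & (t <= t_peak)
    mbv = b[rising].mean()

    # Flow = volume / transit time; convert tau from seconds to minutes.
    return mbv / (tau / 60.0)
```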