@wrwlumpy
You probably know the great scientific secret of the last 50 years: that science has largely left parametric statistical inference behind and embraced the algorithm, without telling the ordinary folks.
It is vaguely analogous to the precedent of the Vatican accepting Charlemagne's bestowal of temporal powers on its spiritual authorities, making the wetware technology of that particular religion and its leader useful as a means of governance. The flock was not really told much about the gravity of what it meant for their religion and its leadership to be instituted as a temporal bureaucracy to rule them. There was, after all, a rather profound difference between confessing to your spiritual guide, who has the power to save your soul with a blessing, and confessing to your governor, who has the power and right to hang you, burn you, crucify you, imprison you, or otherwise pass that information to police authorities without your knowledge or recourse.
(Note: I like Catholicism, as I like all religions, when some minority of its adherents, as with some minority of every religion's, is not behaving vilely.)
In science, in the good old days of induction, we studied things with a philosophical and logical foundation that made our theoretical explanations of phenomena valid and meaningful, with stated probabilities of confidence (i.e., hypotheses tested and found not refuted).
But we found over time that there were too many things we wanted to study that we could not study by induction without violating the assumptions of induction.
There were too many discoveries that led us into too many realms where induction just didn't work very well.
We basically decided, what the heck, who cares if we violate the assumptions. We can at least learn something quantitatively by using models with violated parametric assumptions and accurately measured variable values.
And once we got used to that, we decided, what the heck, why don't we move beyond the parametric models whose assumptions we are violating and just build algorithms, i.e., models that we "believe" will give useful answers. And let's gauge their usefulness by testing how well they predict the past, i.e., the historical data points we used to build the algorithm in the first place. And let's gauge how well they predict the future, too. And if they do both pretty well, then let's use those algorithms as explanations of phenomena, the way we used to use empirically verified inductive theories.
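That "predict the past / predict the future" gauge is, in today's jargon, in-sample versus out-of-sample error. A minimal sketch of the idea, with invented data (the noisy line, the 80/20 split, and all variable names are mine, purely for illustration):

```python
import random

random.seed(0)

# Invented "historical data": a noisy linear process.
xs = [i / 10 for i in range(100)]
ys = [2.0 * x + 1.0 + random.gauss(0, 0.5) for x in xs]

# Split the record: "the past" we build on, vs. "the future" we hold out.
train_x, train_y = xs[:80], ys[:80]
test_x, test_y = xs[80:], ys[80:]

# The "algorithm": an ordinary least-squares line fit on the past.
n = len(train_x)
mx = sum(train_x) / n
my = sum(train_y) / n
slope = (sum((x - mx) * (y - my) for x, y in zip(train_x, train_y))
         / sum((x - mx) ** 2 for x in train_x))
intercept = my - slope * mx

def mse(xs_, ys_):
    """Mean squared error of the fitted line on a set of points."""
    return sum((slope * x + intercept - y) ** 2
               for x, y in zip(xs_, ys_)) / len(xs_)

# "Predicting the past" (in-sample) is always flattering, since these are
# the very points the fit was built from; "predicting the future"
# (out-of-sample) is the honest gauge.
print("in-sample MSE: ", mse(train_x, train_y))
print("out-of-sample MSE:", mse(test_x, test_y))
```

Note that the held-out points here lie beyond the training range, so the out-of-sample check is also an extrapolation test, which is exactly where an algorithm that merely memorized the past falls apart.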
And then we decided, who the hell cares whether those algorithms are right or not? If they will attract grants, let's work with them, and try to wring at least some useful meaning out of them while we keep the lab open and the overhead covered.
Today, in the age of the algorithm, induction is used ad hoc to create statistically significant confidence in the reliability of certain "parameters" and certain "variables" used in an algorithm. Alas, doing so can get rather like a car salesman who points to the brand spanking new, highly tested, and trustworthy Michelin tires and says, "See, this fine, previously owned car has good tires, so you can trust that it will be a good car."
The good tires may be an empirically verifiable fact, but they may be mounted on the rusted rims and broken lug nuts of a rusted-out hulk whose rust is covered over by a cheap paint job.
The validity of our science increasingly depends not on the replicability of measurable findings, but on the character of the scientists extrapolating via algorithms beyond the statistically verifiable realms of empirical reality.
And that sort of reliance was exactly what drove us to inductive science in the first place way back when.
KENPOM's system, much as I appreciate the insights he has achieved and how much more knowledge there is to gain down the path he walks, is nonetheless an algorithm-based analysis.
So say your Hail Marys and hope he is a high priest of QA with good character. For when the inductive is mixed with the algorithmic, it is not unlike the religious being mixed with the temporal.
Nothing is ever quite the same afterwards.