
Through the Heart of Every Man

Deduction as Efficient Induction: Why Physical Science, and Nothing Else, Works

17 February 2024 | Philosophy, Philosophy of Science, Statistics

A problem I've been obsessed with is as follows:

Logic, mathematics, and the physical sciences (physics, astronomy, chemistry) appear qualitatively (and, to some degree, empirically) to produce much more reliable results, stronger consensus, and something of an immunity to replication crises.

Why?

I've been reading a lot on the philosophy of Charles Sanders Peirce recently, and have been absolutely dumbfounded by the degree to which the man's opinions qualitatively match my own. His idea of "economy of research" has in particular really crystallized some things: the means to accomplish research are scarce, and we should, as with all scarce means, economize our use of them. I posit that the failure to implement methodology satisfying this principle is the primary reason for the observed discrepancy between the disciplines described above.

To the eyes of this external, physically-minded observer, the biological and social sciences appear to have been caught in the deadly, naïvely-empirical cult of the null ritual. The fatal feature of this approach, which may indeed befall even undisciplined attempts at principled empiricism, is that it treats every uncertain proposition as justifiable only by direct experiment on that proposition's truth value. It does not seek to build marvelous deductive edifices, as constitutes (say) theoretical physics; it simply applies statistical inference (at best, some form of causal inference) to each proposition in isolation.

A fundamental premise of (Bayesian) statistical inference is that Aristotelian logic is the probability-1-and-0 subset of its real-valued logic (cf. Jaynes). Sound logical deductions made on uncertain propositions are therefore lossless: the conclusion can't be any "falser" than the premise (in inductive language, P(B | ¬A ∨ B) ≥ P(A | ¬A ∨ B), which clearly holds on destructing the disjunction). Moreover, this means that tests at the "leaves" of the tree of deductively established formulae propagate certainty back up the implications: confirming a consequence B raises P(A) for the axiom A that entailed it (by Bayes' theorem, since P(B | A) = 1), ultimately verifying the axioms and consequently strengthening all propositions deduced from them. Since the number of leaves can grow exponentially with the depth of the deductive tree, the number of direct experiments required scales (to first approximation) only logarithmically with the number of propositions established, a dramatic improvement in the asymptotic performance of the economy of research.
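Spelling this out as a minimal worked sketch (with A an axiom and B, B′ two leaves deduced from it, so that P(B | A) = P(B′ | A) = 1):

```latex
% Assume A => B and A => B', i.e. P(B|A) = P(B'|A) = 1.

% Deduction is lossless: conditioned on the implication (\neg A \vee B),
% the conclusion is at least as probable as the premise.
P(B \mid \neg A \vee B) \;\ge\; P(A \wedge B \mid \neg A \vee B) \;=\; P(A \mid \neg A \vee B)

% Confirmation flows upward: observing the leaf B strengthens the axiom A.
P(A \mid B) \;=\; \frac{P(B \mid A)\, P(A)}{P(B)} \;=\; \frac{P(A)}{P(B)} \;\ge\; P(A)

% ...and back down to any sibling leaf B': its guaranteed floor rises
% from P(A) to P(A \mid B).
P(B' \mid B) \;\ge\; P(A \wedge B' \mid B) \;=\; P(A \mid B) \;\ge\; P(A)
```

A single experiment confirming one leaf thus lifts the axiom and, with it, the floor under every other proposition deduced from that axiom; this is the economy the two figures below are meant to contrast.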


Figure 1: Naïve empiricism: no propagation of inductive knowledge.


Figure 2: Model-based empiricism: the entire deductive content is in the base propositions.

The reason, then, why the other sciences lag behind the physical ones might be attributed simply to the fact that they could not mature before positivist-empiricist ideas took hold and were misinterpreted; the closest precursors available to them would have been the 18th-century British empiricists, who bear hardly any methodological resemblance to their later, analytic counterparts. The physical sciences already had sufficiently well-developed deductive models that such empiricism either could not take root or could not cause harm (presuming that deductive edifice has not itself long been in need of an overhaul; given the known foundational problems in theoretical physics, this is not obvious).

The methodological solution, then, I think is pretty clear: more mathematical modeling in biology, psychology, and social science! And I don't mean mathematical modeling in the sense remembered by those who would rejoin, "tried that; didn't work." Newton had to invent calculus to cajole physics out of Aristotelian squalor; it will likely be necessary for people to know real, deductive mathematics and to generate real, deductive insights actually applicable to their disciplines. There's no need to regress to such crude devices as "numbers" and "partial differential equations." Get in the trenches: learn lattice-based concept analysis, build programming languages out of implicit topoi, construct categorical ontology logs.
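To make the first of those suggestions concrete, here is a minimal sketch of lattice-based (formal) concept analysis: a brute-force enumeration of the formal concepts of a tiny object/attribute context. The context data below are purely illustrative, and a real analysis would use a dedicated FCA library rather than this naïve loop.

```python
# Toy formal concept analysis: enumerate the formal concepts (extent, intent)
# of a small object/attribute context by brute force.
from itertools import combinations

# Hypothetical context: which objects have which attributes.
context = {
    "sparrow": {"flies", "has_feathers", "lays_eggs"},
    "penguin": {"has_feathers", "lays_eggs", "swims"},
    "bat":     {"flies", "nurses_young"},
}
attributes = set().union(*context.values())

def intent(objs):
    """Attributes shared by every object in objs (all attributes if objs is empty)."""
    return set.intersection(*(context[o] for o in objs)) if objs else set(attributes)

def extent(attrs):
    """Objects possessing every attribute in attrs."""
    return {o for o, has in context.items() if attrs <= has}

# A formal concept is a pair (E, I) with intent(E) == I and extent(I) == E.
# Every concept arises as (extent(intent(S)), intent(S)) for some object set S.
concepts = set()
for r in range(len(context) + 1):
    for objs in combinations(context, r):
        I = intent(set(objs))
        concepts.add((frozenset(extent(I)), frozenset(I)))

# Ordered by inclusion of extents, the concepts form a complete lattice.
for E, I in sorted(concepts, key=lambda c: len(c[0])):
    print(sorted(E), "<->", sorted(I))
```

Brute force is fine for a toy like this; for contexts of real size one would reach for a purpose-built algorithm such as Ganter's NextClosure. The point is that the resulting lattice is a deductive structure: every ordering relation in it is entailed by the data, with no further experiment required.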

Forget the nightmarish scars inflicted by the government schools' approach to mathematics, and do good science!