Abstract
We look into the ontology of quantum theory as distinct from that of the classical theory in the sciences.
Theories carry with them their own ontology while the metaphysics may remain the same in the background.
We follow a broadly Kantian tradition, distinguishing between the noumenal and phenomenal realities, where
the former is independent of our perception while the latter is assembled from the former by means of fragmentary
bits of interpretation. Theories do not tell us how the noumenal world is constituted but are conceptual constructs applying to models generated in the phenomenal world within limited contexts.
The ontology of quantum theory principally rests on the view that entities in the world
are pervasively correlated with one another, not by means of probabilities as in the case of classical
theory, but by means of probability amplitudes involving finely tuned phases that characterise the
oscillatory behaviour of quantum mechanical states.
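As a schematic illustration of this contrast (the amplitudes $a_1, a_2$ and phases $\varphi_1, \varphi_2$ below are generic symbols, not tied to any particular system discussed here), classical probabilities for two alternatives simply add, whereas quantum amplitudes add before being squared:
\[
P_{\mathrm{cl}} = p_1 + p_2, \qquad
P_{\mathrm{qm}} = \bigl|a_1 e^{i\varphi_1} + a_2 e^{i\varphi_2}\bigr|^2
= |a_1|^2 + |a_2|^2 + 2\,|a_1||a_2|\cos(\varphi_1 - \varphi_2),
\]
the interference term $2|a_1||a_2|\cos(\varphi_1 - \varphi_2)$ carrying the phase dependence that has no classical counterpart.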
These amplitude-dependent correlations (quantum entanglement) exist over and above the
classical ones expressed in terms of probabilities. While classical correlations are
essentially local in nature, quantum correlations are shared globally in the process of
environment-induced decoherence. Decoherence is an effectively random process
that removes local correlations in the course of this global sharing of entanglement, the
removal being especially manifest in systems that appear as classical ones.
It is this aspect of the decoherence process that makes the so-called measurement
postulate (the one relating to wave function collapse) cohere with the rest of the principles of quantum
theory, the latter implying a Schrödinger-type time evolution of quantum mechanical systems.
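Schematically, and assuming for illustration an exponential decay of coherence (a common idealisation rather than a result derived here), the reduced state of a system $S$ coupled to an environment $E$ loses its off-diagonal terms in the pointer basis:
\[
\rho_S(t) = \operatorname{Tr}_E\,\rho_{SE}(t), \qquad
\langle i|\rho_S(t)|j\rangle \sim \langle i|\rho_S(0)|j\rangle\, e^{-t/\tau_D} \quad (i \neq j),
\]
where $\tau_D$ is the decoherence time; for macroscopic objects $\tau_D$ is so short that the interference terms are, for all practical purposes, erased.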
The mathematical basis of the quantum correlations consists of the description of pure
states of a system in terms of vectors in a linear vector space (a mixed state appears
as an admixture of pure states with some probability distribution associated with it)
and, in addition, the description of (pure) states of composite systems as vectors in
the tensor product of the spaces of the component sub-systems.
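In compact notation (the two-component example is purely illustrative), a pure state, a mixed state, and a composite state read
\[
|\psi\rangle = \sum_i c_i\,|i\rangle, \qquad
\rho = \sum_k p_k\,|\psi_k\rangle\langle\psi_k| \quad \bigl(p_k \ge 0,\ \textstyle\sum_k p_k = 1\bigr),
\]
\[
|\Psi\rangle \in \mathcal{H}_A \otimes \mathcal{H}_B, \qquad
\text{for instance}\quad |\Psi\rangle = \tfrac{1}{\sqrt{2}}\bigl(|0\rangle_A|0\rangle_B + |1\rangle_A|1\rangle_B\bigr),
\]
the last state being entangled, i.e. not expressible as a single product $|\phi\rangle_A \otimes |\chi\rangle_B$, its correlations being of the amplitude (phase-sensitive) kind referred to above.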
The crucial aspect of the decoherence process, of significance in the context of the
apparent incompatibility of the measurement postulate with the unitary Schrödinger
evolution, is that it is almost instantaneous in the case of a classical
object (one that is nevertheless amenable to a quantum description).
decoherence time is, in all likelihood, of the order of the Planck scale, being driven by
field fluctuations in the Planck regime. This points to factors of an unknown nature
determining the finest details of the decoherence process since Planck scale physics
remains an obscure terrain.
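For orientation, the Planck time invoked here is
\[
t_P = \sqrt{\hbar G / c^5} \approx 5.4 \times 10^{-44}\ \mathrm{s},
\]
so the suggestion is that the finest-grained stages of decoherence unfold on a scale for which no established theory is currently available.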
In other words, quantum theory is, to all intents and purposes, in need of a radical
revision, in keeping with the fact that all theories are defeasible and need revision as
our domains of experience expand and get realigned in a complex manner. The context
within which quantum theory (and quantum field theory too) is defined is set precisely
by the Planck scale, across which a novel theoretical framework is likely to emerge.
However, as in the case of theory revisions in general, that emerging theory will stand
in an asymmetric relation of incommensurability with the present-day quantum theory,
where the concepts of the latter will be comprehensible in terms of those of the former,
but the converse will not hold.