That’s some deep hinting at thought experimentation, Aethelwulf, and delightfully devoid of the detailed math that’s so often required by the subject, and with which I so struggle and usually fail.

I gather that what you’re saying is connected to the observation that some physical laws, including quantum mechanical laws, are time-reversible, working in either a forward or backward time direction – present state data can be used to calculate future or past states, which in quantum mechanics are ensembles of wavefunction values (conventionally written [imath]\psi[/imath], and related to real-number valued probabilities by [imath]\psi \psi^*[/imath], “psi times the complex conjugate of psi” – Aethelwulf, is this what you meant by [math] \psi \psi^{\dagger} [/math]?), with the available present state data constrained by the uncertainty principle. In principle, then, just as the uncertainty principle makes the results of future, predicted measurements uncertain, it makes the *imagined* results of past, “postdicted” measurements (necessarily imagined, because unlike a predicted future measurement result, we can’t actually wait and make the postdicted measurement) uncertain also.
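To make the [imath]\psi \psi^*[/imath] relationship concrete, here’s a minimal sketch in plain Python – the amplitude values are made up, not from any real system; the point is only that multiplying each complex amplitude by its conjugate yields real, non-negative numbers that behave as probabilities:

```python
# Toy discrete "wavefunction": complex amplitudes (made-up values) for a
# particle being found at each of four positions.
amps = [0.5 + 0.5j, 0.5 - 0.5j, 0.5j, 0.5]

# Normalize so the total probability comes out to 1.
norm = sum(abs(a) ** 2 for a in amps) ** 0.5
psi = [a / norm for a in amps]

# psi * conjugate(psi) -- "psi psi-star" -- is real and non-negative:
# the probability of detecting the particle at each position.
probs = [(a * a.conjugate()).real for a in psi]

print(probs)        # four real numbers in [0, 1]
print(sum(probs))   # ≈ 1.0
```

The imaginary parts cancel exactly in each product, which is why [imath]\psi \psi^*[/imath] is guaranteed real even though [imath]\psi[/imath] itself is complex.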

Where this, and similar time-reversed thought experiments, get paradoxical for me is where they collide with my intuitive certainty that we *know* the results of past measurements – we remember them, from recordings and documents and our brain-held memories – but don’t yet know the results of future ones. Practically, postdiction is usually used not to calculate the probability of a given past state of a system, but to test the goodness of the calculation against a certainly known past state. Some laws of physics work in both directions of time, but our intuitive perception of reality runs forward in time only.

I don’t believe this is a true, unresolved paradox, but its resolution isn’t intuitive.

We have evidence that this is what happens in an experiment called ''The Delayed Choice Experiment.'' Because of this, there is strong evidence that the past is shaped by our actions in the present, statistically speaking.

I don’t think the delayed choice experiment shows that our actions shape the past.

Key to this conclusion is consideration of exactly what “the past” is in the domain of the DCE. It’s the passing of a single particle – in real DCE experiments, a photon of visible light – through one or the other of 2 pathways. Per quantum mechanics, if it’s certain which path the particle followed – for example, if a “gate” of some kind is closed so that only one of the 2 paths is possible – the particle doesn’t exhibit wavelike behavior. The DCE adds a wrinkle to this basic consequence of QM: rather than making certain which of the 2 pathways the particle “chose” (or, if you prefer, reality chose for it) by closing the gate on one, it detects, after the particle has “made its choice”, which path it followed – or declines to detect it. Words begin to fail me in this description, so here’s a picture (from this 2007 AAAS ScienceNOW article) of the famous (with this science fan, at least) 2007 Aspect DCE:
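The quantum-mechanical rule the DCE exploits can be sketched numerically: when the two paths are indistinguishable, amplitudes add *before* squaring, and an interference term appears; when which-path information exists, the probabilities of the two paths add separately and the interference vanishes. A toy Python sketch – equal path amplitudes and an arbitrary phase difference standing in for the path-length difference, so the numbers are relative detection rates, not normalized probabilities over a whole screen:

```python
import cmath

def detection_rate(phase, which_path_known):
    """Relative detection rate at a point where the two path amplitudes
    differ by 'phase' radians (toy model, equal-amplitude paths)."""
    a1 = 1 / 2 ** 0.5                      # amplitude via path 1
    a2 = cmath.exp(1j * phase) / 2 ** 0.5  # amplitude via path 2
    if which_path_known:
        # Which-path information exists: probabilities add, no interference.
        return abs(a1) ** 2 + abs(a2) ** 2
    # Paths indistinguishable: amplitudes add, then square -> interference.
    return abs(a1 + a2) ** 2

# Sweep the phase: with interference the rate oscillates (1 + cos(phase));
# with which-path information it is flat.
for phase in (0.0, cmath.pi / 2, cmath.pi):
    print(phase, detection_rate(phase, False), detection_rate(phase, True))
```

The “delayed” part of the experiment is that the `which_path_known` choice can be made after the photon passes the first beam splitter – and the statistics still follow this rule.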

The original DC thought experiment, and actual ones like that 2007 one, use elementary particles. The validity of its verified prediction depends on this. Consider, for example, redoing the DCE using large composite particles – for dramatic effect, let’s say tour buses full of people, replacing the beam splitters in the Aspect experiment with forks in underground tunnels, which the driver either chooses with a fair coin toss or is forced to take by barricades set by the experimenter. It wouldn’t work – regardless of what the experimenter did to have or not have certainty of which path the buses took, the driver and passengers would know, because unlike photons, they can record and recall such information.

Aficionados of quantum physics thought experiments might be reminded by this of the Wigner’s friend thought experiment/paradox. My intuitive “it wouldn’t work” conclusion is essentially an affirmation of the objective collapse theory resolution of the paradox mentioned in the linked article.

The point John Wheeler had in mind when he proposed the DC thought experiment is not to suggest that the later choice of the experimenter to detect the path chosen earlier by the particle influenced (“shaped”) the particle’s choice, but that the view that such choices actually happen, in a usefully meaningful sense, is a wrong one. Metaphorically put, [imath]\psi \psi^*[/imath] is not a choice, the record of the roll of some cosmic dice, but rather a collection of probabilities (probability density functions, for the metaphor-phobic math purist), a description of the exact nature of the dice.
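The dice metaphor can be made concrete in a few lines of Python – the weights below are made-up, standing in for [imath]\psi \psi^*[/imath] values; the point is the distinction between the distribution (the “nature of the dice”) and any one roll (a single measurement outcome):

```python
import random

# The "nature of the dice": a probability distribution over outcomes
# (a loaded six-sided die with made-up weights, analogous to psi psi*).
weights = [0.1, 0.1, 0.2, 0.2, 0.2, 0.2]
assert abs(sum(weights) - 1.0) < 1e-12  # a valid distribution sums to 1

# A single measurement is one roll: a sample drawn from the distribution.
roll = random.choices(range(1, 7), weights=weights)[0]
print(roll)  # one outcome; the weights, not the roll, are the full description
```

Quantum mechanics hands us `weights`, never `roll` in advance – which is the sense in which [imath]\psi \psi^*[/imath] is a description of the dice rather than a record of a throw.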

There are other, deeper reasons to conclude that the “forward from the past to the future” direction of the “arrow of time” is not arbitrary, the ones I know best rooted in considerations of the underlying mechanics that give rise to the emergent property described by the second law of thermodynamics. All of these explanations seem to me to emerge from underlying mechanics common to both fundamentally deterministic (classical) and probabilistic (modern, quantum) mechanics.

It is real science that we could be ''smears'' throughout every moment in existence, nothing more than a ''probability'' that has emerged. How insignificant, then, is a single solid object in reality. It seems to be a collection of statistical averages and nothing more. And worse yet, statistical averages for which no future can be ascertained, fundamentally speaking, if we attempt to measure that object!

Yep - [imath]\psi \psi^*[/imath] is well-imagined as a smear.

Per objective collapse theory, though, big solid objects are very significant, because they’re big bundles of wavefunction-collapsing particle interactions. This, not mere statistical averaging of large numbers of individual probability density functions, is why the everyday macroscopic world we measure looks certain, not smeared out.

In principle (though not in practice, because the number of arithmetic operations required is too great), we can predict the future using quantum mechanics. That prediction, however, is not “certain” – it does not consist of wavefunctions whose probabilities [imath]\psi \psi^*[/imath] take values of only 1 and 0 – but is a collection of wavefunctions [imath]\psi[/imath]. Practically, the classical approximation approximates the likely outcome of these wavefunctions.

The uncertainty principle rules out a classical Laplace’s demon. It doesn’t rule out our ability to determine the future of ensembles of particles for practical purposes, or of single fundamental particles, if we are content to represent it probabilistically, with a wavefunction.

The reason why the past and future exist will therefore be reliant on where these ''waves'' coalesce.

Assuming by “waves” you mean [imath]\psi[/imath], Aethelwulf, I don’t get your meaning here. As I understand it, the value of [imath]\psi[/imath] changes with time, and represents the addition of many individual wavefunctions, so it can be thought to “coalesce” for a given particle by describing a high probability of that particle being detected in a particular volume at a particular time. A wavefunction does not, however, give a probability density function for a particle being detected only during a specific interval of time. Put another way, particles are distinct in space – they have high probabilities of being where we detect them to be, low probabilities of being elsewhere – but not in time – the probability of a particle existing in some volume at *every* instant of time is 1. The wavefunction of a particle permits it to interfere with itself spatially, not temporally, so I can’t understand what you mean by them “coalescing” at some instant in time distinguishing past from present. Can you elaborate?
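One way to picture this spatial-but-not-temporal distinctness: a free-particle Gaussian wavepacket spreads out in space as time passes, yet at *every* instant its spatial probability density still integrates to 1. A numerical sketch using the textbook free-Gaussian width formula, in natural units ([imath]\hbar = m = 1[/imath]):

```python
import math

def free_gaussian_density(x, t, sigma0=1.0):
    """|psi(x, t)|^2 for a free-particle Gaussian wavepacket initially at
    rest at the origin (natural units hbar = m = 1). The width grows as
    sigma(t) = sigma0 * sqrt(1 + (t / (2 * sigma0**2))**2)."""
    sigma_t = sigma0 * math.sqrt(1 + (t / (2 * sigma0 ** 2)) ** 2)
    return math.exp(-x ** 2 / (2 * sigma_t ** 2)) / (sigma_t * math.sqrt(2 * math.pi))

def total_probability(t, lo=-50.0, hi=50.0, n=20000):
    """Crude midpoint-rule integral of the density over x at time t."""
    dx = (hi - lo) / n
    return sum(free_gaussian_density(lo + (i + 0.5) * dx, t) * dx for i in range(n))

# The packet smears out in space as t grows, but the total probability of
# finding the particle *somewhere* stays 1 at every instant.
for t in (0.0, 1.0, 5.0):
    print(t, total_probability(t))   # ≈ 1.0 each time
```

The density at any fixed point falls as the packet spreads, but normalization over all space holds at each time – there is no sense in which the particle’s existence “coalesces” into one instant.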