29 October, 2018

Heisenberg’s Uncertainty Principle and an Undetectable Substrate




This short piece does not stand alone!

It represents a very late, yet crucial, stage in a major philosophical and physical critical assault upon the Copenhagen Interpretation of Quantum Theory, but from a steadfastly materialist standpoint, involving a very different philosophical position, and also the inclusion of a currently undetectable, yet fully-defined and explained Universal Substrate. All the technical questions have been dealt with elsewhere, but there still remains one last piece of the jigsaw -

Heisenberg's Uncertainty Principle!

The purpose of this essay is to debunk Heisenberg's excuse for the Copenhagen stance, which dispenses with the Classical assumptions about Reality, but only within the special Sub Atomic Realm, where he insists determinism no longer applies, and only an assumption of indeterminism allows Mankind to deal with the phenomena we find there.

And, consequently, in such circumstances, NO Causal Explanations were possible, and the only methods capable of delivering anything useable were Statistics and Probability.
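For reference only, Heisenberg's claim is conventionally expressed in the well-known relation between the uncertainties in a particle's position and momentum:

```latex
\Delta x \, \Delta p \;\ge\; \frac{\hbar}{2}
```

where ℏ is the reduced Planck constant. The Copenhagen reading takes this not as a mere limit upon measurement, but as an in-principle indeterminacy within Nature itself at that Level.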

But, as an opponent of Copenhagen, I had already managed to theoretically explain many currently "physically-inexplicable phenomena", by assuming the universal presence of an existing, yet passively-undetectable Substrate - one which can be both affected-by, and affecting-of, any encountered physical entities - and had thereafter widened that body of explanations, both extensively and successfully. And it suddenly struck me how-and-why the Copenhagenists get away with their entirely formal descriptions.

The reason is that Heisenberg's Uncertainty Principle covers for the incorrect omission of the Universal Substrate as an absolutely crucial premise, within the Sub Atomic Realm in Physics!

What does the Copenhagen Interpretation smuggle in to even make a formal description possible?

It is, of course, a Wave Equation!

And, where do such equations usually apply?

They apply to phenomena in Media!
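The classical case makes the point concretely: the standard one-dimensional wave equation, long used for vibrations travelling through strings, air and water - that is, through media - reads:

```latex
\frac{\partial^2 u}{\partial t^2} \;=\; c^2 \, \frac{\partial^2 u}{\partial x^2}
```

where u is the displacement of the medium, and c the speed of propagation within it: the very form of the equation presupposes something there to be displaced.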






Certainly, the presence of such a Universal Substrate cannot currently be detected, and so, in spite of its omission causing innumerable problems, it was permanently dropped as a necessary premise.

Now, local incidents can cause non-local (extended) effects in such substrates! And such effects can then affect not only local entities but also, by propagating to wide areas of the substrate, more distant ones.

They can even affect the very entities which originally caused them - in reflected-and-recursive interactions, as in the Double Slit Experiments, and in various kinds of resonance.

There is new evidence to consider from current Very Low Temperature Physics - soon to be extended to Gravity-Free conditions in the Space Laboratory in orbit around the Earth. And it is also abundantly clear from my own researches into Substrates (composed of undetectable joint-particles) that these are not only several in number, but also diverse in their achievable aggregate Phases, presenting very different conditions and possible phenomena to traversing interlopers.

How on earth do you deal with such influences with NO detectable substrate?

Physically, you can't!

So what did they do?

By using forms derived from Mathematics, and previously used with phenomena in observable substrates, you can, with difficulty, FIT-UP just such formulae to real data, even with no detectable Substrate - BUT never deterministically!

All sorts of workarounds are necessary, both formally and philosophically, to achieve, and then use, these formulae. The formal tricks are no problem, as scientists have been using such rigs throughout their History. But, the philosophical contrivances are more difficult, so the New View would have to take on Philosophy - a very well-established discipline! They had to remove, "physically", the bases assumed by the philosophers. And, this was achieved via the Heisenberg Uncertainty Principle - for "At the bottommost levels determinism no longer holds; indeed, a kind of indeterminism holds instead."

Plurality strikes again!

The very principle, which, along with Pragmatism, allowed all the many contradictorily-established disciplines to exist simultaneously, was again brought to bear in this incredible anachronism.

Now, quite apart from the assertions being made here, the whole Philosophical Basis for the usual range of intellectual disciplines has already been established by this philosopher-physicist, in his analysis and description of the whole trajectory of Mankind's intellectual development, from its Hunter/Gatherer beginnings to the present time. That trajectory has made various past and present Amalgams of contradictory premises appear "legitimate" via the usual banker premise of Pragmatism - "If it works, it is right!" - which, of course, does no such thing, though it can deliver a workable basis for technological gains and productions.

Indeed, Philosophy itself is also one of these disciplines, whereas it should be the measure of them all, and provide the means for dismantling such false separations of disciplines, intellectually at least.

But, the Amalgam of contradictory premises underpinning all the Sciences was unavoidably adopted, historically, as the only way Mankind had discovered to Control-and-Use many contradictory aspects of Reality to their benefit. It was suggested, initially, by Abstraction, which allowed some sort of discussion of things by both simplifying and naming them. And, with the increasing socialisation of Human beings via the Neolithic Revolution, and the change in their mode of Life, primarily to staying in one place and Farming, Abstraction began to be used in Descriptions, by simplifying observed shapes into Perfect Forms, and studying those in place of their naturally occurring sources: they began to idealise as well as simplify, and both of these greatly increased what could be done in studying them, and what might be done with them.

From this we arrive at Euclidean Geometry, and the developable power of its Theorems and Proofs. This became a kind of standard for all other intellectual disciplines - in particular, for Formal Logic, and therefore for all the others in which Reasoning was applied too.

But the Mathematics into which Euclidean Geometry grew was also entirely pluralistic - in that all entities involved were assumed to be separable, and always exactly the same - that is, totally unchanging qualitatively - indeed, they were considered to be eternal!

All of these disciplines were hamstrung by this totally false limitation.

Now, this imposition of Plurality onto all of these disciplines, including their common Lingua Franca - Formal Reasoning - made absolutely certain that they would always be limited to situations in which nothing ever changed in any profound or qualitative way. So, when applied to anything real, it would necessarily only apply to stable situations involving such things - and anything involving real development would necessarily be totally excluded.

This was a crippling restriction, so when Science began to be developed upon the exact same basis, such a Principle implied that no Natural Law (which would necessarily be eternal) would be affected by any changed context. And this tenet both severely handicapped, and yet also enabled a warped-version of Science for many centuries!

It hamstrung it by banning all Qualitative Change to any extracted Laws.

And, it enabled a version of it, so long as the severely-constrained Context, necessarily arranged in order to extract such Laws in the first place, was identically replicated for their subsequent effective use.

In addition, this also meant that though such "eternal Laws" could be effectively and productively used, they were not those acting in all circumstances, but only those in the single contexts that alone validated their use. What is generally termed Classical Physics was actually so entirely crippled that it should have been termed Pluralist Physics, usable only in very limited, constrained circumstances - so that any supposedly General Theory based upon such a Law would always be wrong. And, thus, all findings would be both simplified and idealised, by taking a pluralist mathematical formula and fitting it up to the data collected from that pluralist single situation.

Theoretically, as in generating an explanation, that formula would also be wrong: it could legitimately be used pragmatically, but never theoretically. Indeed, a thorough-going analysis of such a "Law" would reveal it as an illegitimate Amalgam of a Materialist Stance along with an Idealist stance, and one crippled by Plurality - so it would be useless for both explanation and use within any normal natural situation.

And crucially, this was the Physics that failed to cope with Quantized Phenomena: it neither would, nor ever could, cope with such phenomena adequately in any method of experiment in a real world - which also included an undetectable Universal Substrate.

The perpetrators of the Copenhagen Interpretation did not even know of its built-in disabilities - so they kept all the errors of Plurality, and decided instead to throw out Explanation as totally impossible, due to the Sub Atomic Realm being a different world, changed by the Principle of Uncertainty formulated by Werner Heisenberg.


Werner Heisenberg at the blackboard

Clearly, the usual assumptions were indeed adequate above a certain size of the participating components being studied. But, according to Heisenberg, once that size was left behind, and a World of the extremely small was entered, the rules changed dramatically! We had entered the World of Quantum Physics, where things just behaved very differently. Below that level, things became indeterminate - acting within a range of possibilities, and the old determinate Physics could no longer be used.

Indeed, a particular Wave Equation actually delivered that range, but in a very odd way! It delivered only the probabilities of a particle being in each of the whole range of locations covered by that Equation. BUT, we already have detailed knowledge of such phenomena! Long ago, scientists conquered similar situations when they were happening within an affected and affecting visible Substrate. Some material interloper could both disturb it, and, in special circumstances, be recursively affected by that disturbance.
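As a minimal, purely illustrative sketch (the function names and parameter values here are my own, not drawn from the text), the standard probability reading of such a Wave Equation - the Born rule, where the detection probability is the squared magnitude of the wave function - can be shown in a few lines: summing the amplitudes from two coherent sources, exactly as one would for waves in an ordinary visible medium, yields probabilities that interfere.

```python
import math

def two_source_probability(x, separation=1.0, distance=100.0, wavelength=0.05):
    """Unnormalised detection probability at screen position x for two
    coherent point sources (an idealised double slit)."""
    # Path lengths from each source to the screen point x.
    r1 = math.hypot(distance, x - separation / 2)
    r2 = math.hypot(distance, x + separation / 2)
    # One complex amplitude per source; k is the wave number.
    k = 2 * math.pi / wavelength
    psi = complex(math.cos(k * r1), math.sin(k * r1)) \
        + complex(math.cos(k * r2), math.sin(k * r2))
    # Born rule: probability is the squared magnitude of the summed amplitude.
    return abs(psi) ** 2

# Sample the screen from -5 to +5: the pattern oscillates between
# constructive and destructive values - interference in the
# probabilities themselves, just as for waves in a medium.
samples = [two_source_probability(x / 10.0) for x in range(-50, 51)]
print(max(samples), min(samples))
```

The point of the sketch is only that the mathematical machinery is identical to that of waves in a medium: nothing in the formalism itself distinguishes a disturbed Substrate from "pure probability".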

Could such methods be appropriate in this area too?

The assumption of a currently undetectable Universal Substrate was included, theoretically, in every single one of the Double Slit Experiments, and every single anomaly was physically explained without any recourse to Heisenberg or the Copenhagen Interpretation whatsoever. It seems that, as with so many of the strange anomalies of the Quantum world, and the subsequent ‘idealisation’ of Physics, this crucial missing premise is to blame.

This paper has been recently published in my new book The Real Philosophy of Science



