*Image: Jackson Pollock*

**A Quantum Tangle**

Bell's Inequalities are not about what is claimed. They concern formal descriptions of reality, not the material world itself.

In so-called *Quantum Entanglement*, the assertion is that measuring one of an entangled pair of particles influences the quantum state of the other, even if they are a million light years apart, and that it does this literally instantaneously (much faster than the speed of light, at any rate).
For any ordinary mortals reading this, I must point out that the believers in this "magic" are subatomic physicists - and this kind of drivel is pretty well par for the course in those quarters.

However, when it comes to actually "confirming" this phenomenon, they must measure one entity and then the other, or even both simultaneously, to prove their case. My concern is: how do the experimenters know a change has been made to the other member of the pair? For, if you measured the first, it would immediately influence the other member of the pair. Clearly there is a problem here.

**Quantum weirdness proved real in first loophole-free experiment**, by Jacob Aron, *New Scientist*, issue 3037

Do they regularly measure them alternately or simultaneously to attempt to establish a pattern? Questions arise even for those who support the theory. How could you ever know what the undisturbed state of either one was?

You can't of course! So what do the "believers" say?

They insist that prior to measurement they are both simultaneously in "all possible states at once" until you actually measure one of them, which then forces it into a particular one of those possible states.

Such ideas recur throughout this theoretical stance: it is the basic myth of *superposition* once again! This concept states that a particle (before measurement) is simultaneously in all possible positions (like a wave), but with a fixed probability of being in each and every one. And this remains the case until we measure its position, and by doing so fix it into a single possible position.

*Image: Ryoji Ikeda*
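The scheme being described - amplitudes over all possible states, converted to probabilities, with measurement forcing a single definite outcome - can be sketched in a few lines of Python. The amplitude values here are invented purely for illustration:

```python
import random

# A toy "superposition": complex amplitudes over three possible positions.
# (Hypothetical values, chosen only so the probabilities sum to 1.)
amplitudes = {"left": 0.6 + 0.0j, "centre": 0.0 + 0.6928j, "right": 0.4 + 0.0j}

# Born rule: the probability of each outcome is |amplitude| squared.
probs = {k: abs(a) ** 2 for k, a in amplitudes.items()}
total = sum(probs.values())
probs = {k: p / total for k, p in probs.items()}  # normalise

def measure(probabilities):
    """Simulate a measurement: pick one outcome at random, weighted by probability."""
    r = random.random()
    cumulative = 0.0
    for outcome, p in probabilities.items():
        cumulative += p
        if r < cumulative:
            return outcome
    return outcome  # float-rounding fallback: last outcome

outcome = measure(probs)
# After "measurement" the state is definite: all probability on one outcome.
collapsed = {k: (1.0 if k == outcome else 0.0) for k in probs}
```

Note that before the `measure` call the code holds only a spread of probabilities; a single definite position appears nowhere until the random draw "collapses" it.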

Now, though this is totally counter-intuitive (and most probably wrong), it does allow statistics to be used over a number of cases, and the statistically arrived-at answers do indeed match certain observations in reality.

The mathematicians make it all work by taking a wave equation and associating probabilities with all possible points on the wave, which are interpreted as the probabilities that the particle is in each possible position.

Notice that this method **cannot** deal with the position of a single particle, but can give overall estimates for a whole group!
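That limitation - nothing definite about one particle, reliable statistics over many - is easy to demonstrate numerically. A sketch using a made-up Gaussian wave packet (the grid and packet shape are my own illustrative choices):

```python
import math
import random

# A made-up one-dimensional wave packet, sampled on a grid from -5 to +5.
xs = [i * 0.1 for i in range(-50, 51)]
psi = [math.exp(-x * x / 2.0) for x in xs]   # unnormalised amplitudes
norm = sum(a * a for a in psi)
probs = [a * a / norm for a in psi]          # Born-rule position probabilities

# One "measurement" says almost nothing about where the next will land,
# but the statistics of many measurements reproduce the distribution.
positions = random.choices(xs, weights=probs, k=100_000)
mean = sum(positions) / len(positions)       # converges to the packet's centre, 0.0
```

Any single entry in `positions` is unpredictable; only the aggregate `mean` (and histogram) matches the wave-derived distribution.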
As a physicist myself (and one who was originally a mathematician), I have a name for such methods - I call them *frigs*! They are mere tricks. Such techniques are often used in Mathematics as clever ways of arriving at hard-to-find solutions to purely abstract equations.
So maybe you can see how such tricks came about.

With this in mind we return to Quantum Entanglement. This totally counter-intuitive standpoint describes a before-measurement particle as existing only "*as a fuzzy cloud of probabilities of all its possible states*". And this is how they avoid the otherwise necessary infinite regress! Instead of an oscillation with each and every measurement, we are expected to believe that before measurement such quantum entities are not in any particular state at all, but that when measured an entity will suddenly be in a specific state - and its remote entangled partner will somehow be affected by this intervention too!

In other words, we can conceive of such things as particles, yet treat even something as particulate as a particle's position as a quantum property, as if controlled by a wave. The trick involved, for it can be nothing else, is to represent every possible position in the wave by the probability of the particle being there. And this is, of course, complete nonsense, both in the way it is presented and in the way it is used by these scientists.

Unless, that is, you consider there to be an actual substrate, filling all of space, which is both affected by, and can in turn itself affect, the enclosed particle.

In previous work undertaken by this researcher, all the various anomalies of the infamous Double Slit experiments were completely explained away by the assumption of the presence of such a universal substrate - at the time called *Empty Photons*.
The idea of a substrate was far from a new supposition; it had, at one time, been the consensus view. But a substrate was never detected, so the prior theoretical idea, known as The Ether, was permanently dumped as insupportable - despite the fact that James Clerk Maxwell, using his theoretical model of The Ether, derived a correct set of Electromagnetic Equations which are still used to this day.


Clearly, all the points made here must be addressed. In fact, this theorist suggests that the whole myth of superposition and multiple simultaneous states, was invented to explain the results of things such as Quantum Entanglement.

Now, the reader might wonder how scientists could be so influenced: for it runs counter to the basic materialist conceptions that are the key premises of Science. The actual reason for this is clear. They have abandoned Physical Explanation for Purely Formal Description. They are no longer physicists, for they have rejected the physical world - they are merely mathematicians!

Einstein's dismissal of Quantum Entanglement is encapsulated perfectly in his phrase:

*"The Universe is real - observing it doesn't bring it into existence by crystallising vague probabilities."*

For such are most certainly idealistic notions.

There can, however, without recourse to idealism, exist a hidden universal substrate with wave-like characteristics, and a sort of symbiotic relation between that substrate and physical particles moving through it.

It is the impossibility of answering my question about the measurement of "entangled particles" that precipitates this monstrosity of a theory! The counter to that position by de Broglie, and later by David Bohm, involving so-called "hidden variables", did not solve it, as these features were never found, no matter how detailed the study of the particles involved.

What was really needed to attempt to explain the subatomic world was a separate substrate, involving a reciprocal and recursive relationship between a particle and its context. For then, and only then, can there be a passage of *time* between the initial influence and the subsequent recursive effect. The assumption of an intrinsic "Pilot Wave" meant simultaneous effects, but the role of a substrate as intermediary allowed this crucial delay.
It is the formal, and even stilted nature of the mathematical physicists' thinking, that draws them inexorably towards the Copenhagen myths, and unfortunately away from reality.

Niels Bohr's insistence that the Quantum States "explained" things that classical physics could not was false in the first part, while true in the latter condemnation: in fact neither approach could explain our observations. Bohr's position was *descriptive* of certain forms, but not in the least bit explanatory. Forms do not explain! They can describe reality, but they don't even do that perfectly. All equations are descriptions of idealised forms; they are never accurate descriptions of any natural part of reality - they are always approximations, simplifications. Those forms can then only be applied back onto areas of reality that we have carefully prepared, or farmed, into experimental domains. Here lies the role of technology in all our investigations. The form's validity is then confirmed by successful use in these domains.
The battle between the two standpoints embedded in Science was never resolved, because both sides of the argument subscribed to the same belief - that equations represent reality *as it is* - an obvious fallacy when you stop to think about it. Both the classicists (such as Einstein and de Broglie) and the new school of mathematical physicists (Bohr, Heisenberg et al.) were completely wedded to *form*.
Even Einstein's Relativity and Space-Time Continuum were crucially *formal* ideas.
So, in spite of a small section of physicists refusing to embrace the Copenhagen Interpretation of Quantum Theory, these remnants (in the 1960s), after 30 years of argument, required a final means of settling the dispute. And for the Copenhageners John Bell's suggestion was a godsend.

But he did this using only the purest basic forms of Mathematics, to which both sides mistakenly subscribed: Bell used Set Theory, and its embodiment in Venn Diagrams, to do it.

Now, this had to be the most inappropriate "proof" concerning anything in concrete reality, for it dealt only in idealistic laws - and yet it was to establish what reality really was, and to do so by this means alone!

Bell used this method to construct a set of inequalities which could be clearly demonstrated in Venn diagrams, and as such he had to be handling *fixed* things: no qualitative modifications or evolution of those forms could take place, as that is impossible by such means. It would be accurate to describe such a basis as the premises of Mathematics and Formal Logic only.
Bell used these as a basis for tests about reality. He used his Inequalities to set limits, and if in concrete reality they were clearly exceeded, then the claims of the opposing realists were "proved to be wrong", and Quantum Entanglement was proved correct.
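A Bell-type set inequality of this kind can be checked mechanically. One standard form (my choice of illustration): for any fixed population in which every member definitely has or lacks each of three properties A, B, C, the count n(A and not-B) + n(B and not-C) can never fall below n(A and not-C). A brute-force sketch over a random fixed population:

```python
import random

def bell_counts(population):
    """Count the three regions of a Bell-type set inequality."""
    a_not_b = sum(1 for (a, b, c) in population if a and not b)
    b_not_c = sum(1 for (a, b, c) in population if b and not c)
    a_not_c = sum(1 for (a, b, c) in population if a and not c)
    return a_not_b, b_not_c, a_not_c

# Each member carries three *fixed* yes/no properties (A, B, C),
# exactly the kind of static entity a Venn diagram can handle.
population = [(random.random() < 0.5,
               random.random() < 0.5,
               random.random() < 0.5) for _ in range(10_000)]

x, y, z = bell_counts(population)
# For fixed properties the inequality always holds: x + y >= z.
```

The inequality holds for *any* such population, because every member counted in (A and not-C) is also counted in either (A and not-B) or (B and not-C). Note what the code demands: properties fixed in advance - precisely the premise at issue.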

Many will have noticed it was a proof which only really convinced the faithful! This kind of "proof" *was* reality to them; it was their everyday modus operandi. But it gave the Copenhageners the result they required. The vast majority of physicists now claimed Quantum Mechanics victorious, and Realism finally defeated.
Bell had generated his required test using Formal Mathematics, and, as that was the "Essence of Reality", it simply had to deliver a valid and correct test. But the actual conclusion of this method should be: no, you cannot prove the nature of concrete reality solely by resorting to Formal Logic. Only other forms are provable solely by Mathematics, and only phenomena consistent with Formal Logic are provable by the methods of Formal Logic! Nothing else is possible in either case.

Nevertheless, though all experiments seemed to support the idea that Bell's Inequalities proved the Quantum Mechanical position to be true, the objection that it wasn't correct refused to go away. However, the recent Dutch experiment mentioned in New Scientist was supposed to settle the dispute forever...

The test was run over many productions of entangled pairs, and it was the overall statistics of the many runs that delivered the required answer to Bell's Inequalities.
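What such runs accumulate can be sketched from the textbook formulas alone (this is a calculation from the standard quantum-mechanical prediction, not a simulation of the Delft apparatus): quantum mechanics predicts a correlation E = -cos(θa - θb) between measurements on an entangled singlet pair, and at the usual CHSH angles the combined statistic S reaches 2√2 ≈ 2.83, exceeding the limit of 2 that Bell-type reasoning sets for fixed pre-existing properties.

```python
import math

def correlation(theta_a, theta_b):
    """Textbook quantum prediction for a singlet pair: E = -cos(angle difference)."""
    return -math.cos(theta_a - theta_b)

# Standard CHSH measurement angles (radians): 0, 90, 45, 135 degrees.
a, a2 = 0.0, math.pi / 2
b, b2 = math.pi / 4, 3 * math.pi / 4

S = abs(correlation(a, b) - correlation(a, b2)
        + correlation(a2, b) + correlation(a2, b2))
# S = 2 * sqrt(2), comfortably beyond the Bell-type limit of 2.
```

It is this excess over 2, averaged over many measured pairs, that the experimenters report as the "violation" of the inequalities.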

So, what had actually been achieved?

It was his formalisms that were proved correct!

He had suggested a test for his Formal reasoning, not for any feature of concrete reality.

Lots of so-called "loopholes" - all put forward by scientists who actually agreed with their opponents on the mathematics involved - turned out to be not only wrong, but entirely inapplicable. And as they came from the same camp in their attitude to the primacy of form, proving things via these loopholes was inappropriate anyway. Closing them merely corrected formal mistakes - absolutely nothing to do with concrete reality at all! All the Dutch group achieved was the defeat of their opponents on Formal Reasoning alone.

*Image: Hanson lab at Delft University of Technology*

However, it is easily proven that by the means he used, Bell's Inequalities can only be used to address Ideality - the world of purely formal relations. They don't actually mean anything in concrete reality at all!

I concede that this condemnation of the precedence of Form over Content is still not enough to debunk these ideas. The most crucial principle in all such experimental investigations, both classical and Copenhagen school, is the cornerstone of all formalism and all analysis - the Principle of Plurality. This view sees the world as composed of many simultaneous natural laws, with a different mix of them acting in each and every observed situation. It stands in contrast to Holism, which sees all things as inter-connected and inter-dependent; Plurality sees only inherently separable component parts, which can always be deconstructed and analysed. Any given situation, however complex, can be analysed through isolation, simplification and control of those components, extracting the laws from the mix. Such methods are the basis of literally all scientific experiments.

This is all fine (and Science certainly wouldn't exist without such analytical methods), until erroneous assumptions are made about what this means - Plurality assumes that any law extracted in this way, is identical to that when acting in totally unfettered reality. And this is not true. In unfettered reality all "laws" are modified by their context. Realising this is the first step towards a Holist stance on Science. The objectivity of this position is confirmed by the fact that any law extracted by the usual method of farming and controlling a context for the experiment, can only be reliably used in that same context. The Pluralist view survives (and indeed thrives and dominates) because we are extremely adept at arranging for it, at controlling our environment, and this makes both prediction and production possible.
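A simple classical illustration of this point (my own choice of example): the school "law" for a pendulum's period, T = 2π√(L/g), is extracted under a deliberately controlled small-angle restriction, and outside that farmed context it quietly fails. The exact period can be computed via the arithmetic-geometric mean:

```python
import math

def period_small_angle(length, g=9.81):
    """The 'law' as extracted in the controlled (small-angle) context."""
    return 2 * math.pi * math.sqrt(length / g)

def period_exact(length, amplitude, g=9.81):
    """Exact period at any swing amplitude, via the arithmetic-geometric mean:
    T = T_small / AGM(1, cos(amplitude / 2))."""
    a, b = 1.0, math.cos(amplitude / 2)
    while abs(a - b) > 1e-12:
        a, b = (a + b) / 2, math.sqrt(a * b)
    return period_small_angle(length, g) / a

# At a 1-radian swing the real period is already about 7% longer
# than the extracted "law" predicts.
ratio = period_exact(1.0, 1.0) / period_small_angle(1.0)
```

The extracted formula is not wrong within the context it was farmed for; it is only when it is treated as the unconditioned behaviour of every pendulum that the error appears.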

But, in theory, all reasoning using such laws as the actual components of reality, is bound to be wrong. Pluralist laws and techniques are pragmatically extremely powerful, but theoretically greatly misleading.

It isn't just specialisation that leads to scientific teams consisting of experimenters, theoreticians and technologists - all of these roles are actually differing standpoints, and all are essential to the Scientific process. But they will contradict one another! Disagreements are unavoidable, and dead ends are likely in many scenarios.

**Postscript**

This paper concentrates upon the underlying bases of the methods and reasoning used in postulating Quantum Entanglement. Despite the fact that I think this torpedoes Quantum Entanglement from the word go, QE forms the last line of defence for the regressive Copenhagen Interpretation of Quantum Theory, which must be defeated, so a job must be done on it!

I am currently working on a new body of work entitled **Quantum Disentanglement**, which I hope to publish as an extended issue of the Shape Journal in coming weeks...