The Entangled Universe article by Anil Ananthaswamy in New Scientist (3046) tackles a range of supposedly connected ideas in current Sub Atomic Theory. But, as with that overall stance itself, he joins the ever-accelerating rush into the mixture of facts, "Laws" and speculation that has become the norm in this confusing area.
Every suggested solution begets yet another "rule of thumb", designed to enable some sort of regular path through a limited area, and the overall picture is of a plethora of such meta-rules, which alone define what can and cannot be done.
Clearly, we are being guided through an alien land without the necessary signposts of Physical Ground to resolve anomalies; we are forced to travel in dependence upon local maps. With no single theoretical stance, you have to keep them all, and decide when and how to jump from one islet to the next!
It is an almighty mess – very like the proliferation of epicycles in the Ptolemaic version of the Solar System. It can give you usable answers, but no coherent, consistent and overall Theory.
The Gordian Knot of invention must be severed with a goodly dose of Reality – but how?
Clearly this is easier said than done, and after a couple of re-reads and copious notes, I realised that attempting to follow Ananthaswamy's stepping stones between the various positions would not clarify, but only confuse! I decided instead to write a series of separate papers, each one tackling a different part of this messy article.
But it soon became a large response: I have written 13 short, coherent papers, each on a different topic, with a total length of some 6,000 words. Still, I think this is the best way to deal with the New Scientist article, as a helpful review.
Understanding Reality does not merely involve the ascending of a simple staircase of key discoveries. It is clear that there are innumerable floors (or levels) in this "Mansion of Truth". Nevertheless, it is always very difficult to find the way up to the next floor, for no staircases are immediately evident, and to ever find the presumed Grand Processional Stairway just isn't possible – for it doesn't really exist!
Of course, it is easy to convince yourself that everything you need to know and understand will be available on the current landing (Level), especially as the number of available doors (study areas) seems infinite, and, crucially, if one door turns out to be useless, you can always switch to the door opposite (the alternative arm of the Dichotomy?), and try that!
Mankind’s basic, and still universal, Pragmatism ensures the maximum possible is extracted from each level, but cannot go any further in a coherent and comprehensive way, without transcending that current level’s impasses.
However, new "hidden staircases" are becoming apparent with real potential for revealing a breakthrough.
One is the investigation into the inexplicable Quantum Entanglement, while another is certainly possible with Nuclear Decay. Finally, a third possibility could well be in the genetic role in Evolution – particularly involving the so-called Junk DNA. And, there are many more such quandaries that are beginning to reveal possible ways up!
The following papers, mostly from December 2015, are perhaps the most fruitful concerning the required General Holistic Approach.
This edition is a collection of essays and reviews on Quantum Entanglement, and why the concept is almost certainly nonsense. I have been working on picking apart these ideas for several months now, and a Special Issue is in the pipeline for early 2016, with the working title Entangled Universe, which will aim to debunk this area of “Physics” once and for all.
Until then I thought it would be a good idea to collate here my earlier attempts to tackle this mess, many of which have already been published on the blog and in previous issues, as an introduction to this work, and for many, an introduction to the concept of Quantum Entanglement itself.
This new issue is a kind of review of how far we have got in describing and assessing Man’s struggle to understand his world.
It has not been a straightforward history, for Man has had to literally change the game in order to make any progress in understanding both his own context and, indeed, himself. But in making that significant progress, his has undoubtedly been a heroic trajectory!
It is very important at this stage that a difference between Knowledge and Understanding be established. For, the latter was never an automatic development from the former.
To use the common description “Man has had to pull himself up by his own bootlaces” - or to use V. Gordon Childe’s appropriate title Man Makes Himself. Attempting to understand the world has not been at all easy, and perhaps surprisingly, has been predicated upon just how successful Man has been in more everyday tasks of survival and even prosperity. For, his basic general method was initially to grasp whatever was to his advantage, whatever that entailed, and gain himself both a measure of leisure and repose.
The brilliant ideas did not come first! For, it proved almost impossible to solve all the many problems of Mankind’s usual hunter/gatherer existence, including the many seemingly unavoidable and unbridgeable impasses in his contradictory development.
For well over 90% of human history, Homo sapiens roamed the Earth in small family groups, his most sophisticated tool being a sliver of brilliantly knapped flint. Clearly, significant developments in his mode of life were impossible without large gains in that sphere. And while there were brief interludes during that long "childhood" when he was able for a time to achieve remarkable things, such as the cave paintings at Lascaux, they were brief and exceptional events. Something permanent in his means of life had to occur, to enable real and persisting gains.
It wasn’t until the invention and spread of agriculture and animal husbandry in the Neolithic Revolution that the developments in human understanding really took off. For instead of constantly living on the edge of survival, Man could then settle and gather in growing aggregations of people.
Even then the trajectory of development was never smooth or incremental. Indeed, it was characterised by a series of "false leads", which enabled progress to be made, but which always, in the end, ground to a halt in yet another impasse.
So this brief foray attempts to trace out the subsequent paths, dead-ends, and hopefully the way forward, from where we have finally reached.
In the first instalment of Jim Al-Khalili's series on BBC 4 entitled "Let There be Light! The Secrets of Quantum Physics", he tackles the long-standing argument between the position of the Copenhagenists and that supported by Albert Einstein, on what is termed Quantum Entanglement.
As is usual in this area of Physics, analogies are used to attempt to “solve” (though really only describe) crucial anomalies in Reality.
So, here Al-Khalili uses playing cards to represent what is supposed to happen with Quantum Entanglement at the sub atomic level. And, by a series of modifications, he ends up with an experiment using two simultaneously created quanta of light, which are "therefore entangled", and he looks at their polarisations to see if the idea of entanglement is "correct". But he defines the test as a dispute between the two contradictory explanations of a certain case of the phenomenon – one by the Copenhagenists, and the other by Einstein.
So, Al-Khalili asserts that one must be right, and the other must be wrong! But I have to insist: "Why should these be the only possibilities considered?"
The way Al-Khalili puts it, one answer proves that the Copenhagenists are right, and that the phenomenon is totally inexplicable physically, while the other (Einstein's) proves that the two photons' properties were fixed when they were created, and that no inexplicable link between the two would be necessary.
Al-Khalili uses his described Laser Set Up with photons, but insists that we see it in terms of his analogy with the playing cards. So how might he be misleading us? Can the playing cards change, or are they fixed? Clearly, we are persuaded that they cannot change all by themselves (that would be magic), especially if the change was due to a measurement made elsewhere, upon the other card. But this is misleading us even more!
Our quantum entities are not playing cards, fixed forever. They were created, in the more usually used example of Pair Production (modelled here by light split into two photons, and considered to behave in exactly the same sort of way with regard to quantum properties), and the assumption that this creative process is the production of two massive particles from pure energy alone is made without any admission that it could be mistaken.
NOTE: We cannot continue such a discussion without questioning Al-Khalili's many, quite definitely questionable, assumptions. He refers to a photon, which we are to accept as a disembodied quantum of pure energy. Then, in the alternative argument, two particles can be created out of just such a high-energy photon. No possible substrate is assumed to be involved in these phenomena. And finally, in conclusion, separated entities can still be instantaneously linked, no matter how far apart they get. These are not to be questioned. They are assumed to be totally unassailable. What do you think?
But, this theorist (Jim Schofield) sees the area very differently. The phenomenon of Pair Production is due to the dissociation of a known-to-be-physically-existing unit (not pure energy) within a universal substrate made up of large numbers of such units, each consisting of two mutually orbiting particles, one electron and one positron, which can also hold and transfer internal quanta of energy by the promotion of that orbit. It has been observed in colliders as the Positronium; in its stable state we call it a Neutritron. (By the way, this assumption also solves electromagnetic propagation through space, and all the anomalies of the full set of Double Slit Experiments – a supposed cornerstone of Al-Khalili's set of assumptions embodied in the Copenhagen Interpretation of Quantum Theory.) Finally, this alternative also stands upon very different holistic grounds, which mean that "Everything affects everything else!" and also that "Nothing is eternal!"
Theoretical particle - the Neutritron
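Whichever interpretation is preferred, the energy bookkeeping of Pair Production itself is not in dispute: a photon can only yield an electron-positron pair if it carries at least the rest energy of both particles,

$$E_\gamma \;\ge\; 2\,m_e c^2 \;\approx\; 2 \times 0.511\ \text{MeV} \;=\; 1.022\ \text{MeV}.$$

On the reading offered above, that same quantum is what dissociates an already-existing electron-positron unit of the substrate, rather than conjuring the pair out of "pure energy" alone.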
So, if our Pair were linked (synchronised) at their joint point of creation, and thereafter were evolving in step from there on, then both Einstein's and the Copenhagenists' assumptions (both of which are entirely pluralist, requiring both eternal laws and eternal entities) must be wrong.
Al-Khalili's presented alternatives are not intrinsically opposite, such that one or the other must be the truth!
It is, on the contrary, an example of a classical Dichotomous Pair of concepts, due entirely to common, yet wrong, premises held by both sides of the argument (basically, the Principle of Plurality, embraced by both).
And, as the philosopher Hegel clearly demonstrated, a sound critique, and then a necessary replacement of those erroneous premises, would remove the seeming contradiction, and allow the impasse to be transcended, and a consistent and better theory to be possible, while opening the door to further developments too.
Effectively, both sides of the argument were determined by the same errors, and hence no resolution would be possible without those common and false premises being removed and replaced with something closer to the truth.
Now, such alternative reasoning may sound like something of a circuitous route, but it is far superior to the thing it replaces. Let's face it: the premises of the Copenhagenists mean that certain things just cannot be explained physically, and we must not even try! And, as long as we have an overall, formal means of getting what we want in a given situation, then we must be satisfied with that.
NOTE: The final part in the experiment to test Bell's supposedly final proof was this: if there was NO built-in relation between the pair, then the overall results, in his analysis, would be "more than 2", whereas if there was an in-built relation (as Einstein insisted) the overall results would be "less than 2". But this really only tests between the two options proposed by the Copenhagenists and Einstein, both consistent with Formal Logic. Yet, with a non-pluralist, changing situation, that test would not be appropriate: the tenets of Formal Logic would NOT apply! The thinking is entirely pluralist, hence it must assume totally underlying laws – independent of context, the same in all circumstances – in fact, laws that are FIXED! Whereas that will certainly not be the case at all, and the holist stance is bound to be much closer to the truth than the pluralist.
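For readers who want to see where that "2" comes from, here is a minimal, illustrative Python sketch of the CHSH form of Bell's test. The hidden-variable model and the measurement angles below are standard textbook choices of my own, not a reconstruction of any actual experiment: a "fixed at creation" model respects the bound of 2, while the quantum prediction for the same angles reaches 2√2 ≈ 2.83.

```python
import numpy as np

rng = np.random.default_rng(0)

def lhv_correlation(a, b, n=100_000):
    """Correlation E(a, b) for a toy local hidden-variable model: each pair
    carries a shared 'polarisation' lambda, fixed at creation, and each side's
    +/-1 outcome depends only on its own angle and that lambda."""
    lam = rng.uniform(0.0, 2.0 * np.pi, n)
    A = np.sign(np.cos(2.0 * (a - lam)))
    B = np.sign(np.cos(2.0 * (b - lam)))
    return np.mean(A * B)

def qm_correlation(a, b):
    """Quantum-mechanical prediction for polarisation-entangled photons."""
    return np.cos(2.0 * (a - b))

def chsh(E):
    """The CHSH combination S, for the standard angle choices."""
    a, a2 = 0.0, np.pi / 4
    b, b2 = np.pi / 8, 3 * np.pi / 8
    return abs(E(a, b) - E(a, b2) + E(a2, b) + E(a2, b2))

print("fixed-at-creation model:", chsh(lhv_correlation))  # about 2, never more (up to sampling noise)
print("quantum prediction:", chsh(qm_correlation))        # 2*sqrt(2) ~ 2.83, i.e. "more than 2"
```

Nothing here settles the deeper objection raised above, of course; it only shows what the two numbers being compared actually are.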
As with all pluralist experiments, these are set up specifically to reveal a given pluralist law, one which, to the holist, is anything but that. It is, in fact, totally determined by the context of the experimental set-up. It isn't a fixed Natural Law at all.
Finally, the whole World of Formal Logic, of the Principle of Plurality, and Form as Cause is certainly mistaken. In the end, its laws are those of the World of Pure Form alone – Ideality, and NOT of the real subject of Physics – Reality.
And, to cap it all, the assumptions used were, at the time of their establishment, historically unavoidable.
Mankind didn't come into the World already ideally equipped for such problems. They have had to develop their means from scratch over millennia, and posing the problem in that way was the only thing they could do at the time.
What Bell's Inequalities are about is not what is claimed: they concern formal descriptions of reality, and not the material world itself.
In so-called Quantum Entanglement, the assertion is that measuring one of an entangled pair of particles influences the quantum state of the other, even if they are a million light years apart, and does so literally instantaneously (obviously much faster than the speed of light, at any rate).
For any ordinary mortals reading this, I must point out that the believers of this "magic" are sub atomic physicists - and this kind of drivel is pretty well par-for-the-course in those quarters.
However, when it comes to actually "confirming" this phenomenon, they must measure one entity and then the other, or even both simultaneously, to prove their case. My concern is, "How do the experimenters know a change has been made to the other one of the pair?" For, if you measured the first, it would immediately influence the other member of the pair. Clearly there is a problem here.
Do they regularly measure them alternately or simultaneously to attempt to establish a pattern? Questions arise even for those who support the theory. How could you ever know what the undisturbed state of either one was?
You can't of course! So what do the "believers" say?
They insist that prior to measurement they are both simultaneously in "all possible states at once" until you actually measure one of them, which then forces it into a particular one of those possible states.
Such ideas recur throughout this theoretical stance: it is the basic myth of superposition once again! This concept states that a particle (before measurement) is simultaneously in all possible positions (like a wave), but with a fixed probability of being in each and every one. And, this remains the case until we measure its position, and by doing so, fix it into a single possible position.
Ryoji Ikeda
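In its standard textbook form, for the simplest two-state system, the claim being criticised here reads:

$$|\psi\rangle = \alpha\,|0\rangle + \beta\,|1\rangle, \qquad |\alpha|^2 + |\beta|^2 = 1,$$

where $|\alpha|^2$ and $|\beta|^2$ are read as the probabilities of obtaining state $|0\rangle$ or $|1\rangle$ on measurement. Before that measurement, the formalism assigns the system no single definite state at all.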
Now, though this is totally counter-intuitive (and most probably wrong), it does allow statistics to be used over a number of cases, and the statistically arrived-at answers do indeed match certain observations in reality.
The mathematicians make it all work by taking a Wave Equation and associating probabilities with all possible points on the wave, which are interpreted as the probabilities that the particle is in each of those possible positions.
Notice that this method cannot deal with the position of a single particle, but can give overall estimates of a whole group!
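As a minimal sketch of that recipe (the wave function below is an arbitrary, purely illustrative choice), notice that every number it produces is a statistic over many imagined repetitions, never the position of any single particle:

```python
import numpy as np

# A wave function sampled on a grid of positions (arbitrary, illustrative).
x = np.linspace(-5.0, 5.0, 1001)
psi = np.exp(-(x - 1.0) ** 2 / 2.0)   # a simple Gaussian "wave"

# The Born-rule step: square the amplitude at every point and normalise,
# then read each value as the probability of finding the particle there.
p = np.abs(psi) ** 2
p /= p.sum()

# Ensemble quantities only: averages over many repeated measurements.
mean_x = (x * p).sum()
spread = np.sqrt((((x - mean_x) ** 2) * p).sum())
print(f"average position over many runs: {mean_x:.3f} +/- {spread:.3f}")
```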
As a physicist myself (and one who was originally a mathematician), I have a name for such methods - I call them frigs! They are mere tricks. Such techniques are often used in Mathematics as clever ways of arriving at hard-to-find solutions to purely abstract equations.
So maybe you can see how they happened.
With this in mind we return to Quantum Entanglement. This totally counter-intuitive standpoint describes a before-measurement particle as existing only "as a fuzzy cloud of probabilities of all its possible states". And this is how they avoid the otherwise necessary infinite regress! Instead of an oscillation with each and every measurement, we are expected to believe that before measurement such quantum entities are not in any particular state at all, but that when measured an entity will suddenly be in a specific state, and its remote entangled partner will somehow be affected by this intervention too!
In other words, more generally, we can conceive of such things as particles, but nevertheless treat even so particulate a property as position as a quantum property, as if controlled by a wave. The trick involved, for it can be nothing else, is to represent all possible positions by a wave, with each assigned the probability of the particle being there. And this is, of course, complete nonsense, both in the way it is presented and in the way it is used by these scientists.
Unless, that is, you consider there to be an actual substrate, filling all of space, which is both affected by, and can in turn itself affect, the enclosed particle.
In previous work undertaken by this researcher all the various anomalies of the infamous Double Slit experiments were completely explained away by the assumption of the presence of such a universal substrate - at the time called Empty Photons.
The idea of a substrate was far from a new supposition: it had, at one time, been the consensus view. But a substrate was never detected, so the prior theoretical idea, known as The Ether, was permanently dumped as insupportable, despite the fact that James Clerk Maxwell, using his theoretical model of The Ether, derived a correct set of Electromagnetic Equations which are still used to this day.
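It is worth recalling just how concrete that result was: in vacuum, Maxwell's equations yield wave solutions propagating at

$$c = \frac{1}{\sqrt{\mu_0\,\varepsilon_0}} \approx 3 \times 10^{8}\ \mathrm{m/s},$$

which matched the already-measured speed of light. The prediction survived even though the Ether model used to derive it did not.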
Clearly, all the points made here must be addressed. In fact, this theorist suggests that the whole myth of superposition and multiple simultaneous states was invented precisely to explain the results of things such as Quantum Entanglement.
Now, the reader might wonder how scientists could be so influenced: for it runs counter to the basic materialist conceptions that are the key premises of Science. The actual reason for this is clear. They have abandoned Physical Explanation for Purely Formal Description. They are no longer physicists, for they have rejected the physical world - they are merely mathematicians!
Einstein's dismissal of Quantum Entanglement is encapsulated perfectly in his phrase:
"The Universe is real - observing it doesn't bring it into existence by crystallising vague probabilities"
For such are most certainly idealistic notions.
There can, however, without recourse to idealism, exist a hidden universal substrate with wave-like characteristics, and a sort of symbiotic relation between that substrate and physical particles moving through it.
It is the impossibility of answering my question about the measurement of "entangled particles" that precipitates this monstrosity of a theory! The counter-position of de Broglie, and later of David Bohm, involving so-called "hidden variables", did not solve it, for these features were never found, no matter how detailed the study of the particles involved.
What was really needed to attempt to explain the sub atomic world was a separate substrate, involving a reciprocal and recursive relationship between a particle and its context. For then, and only then, can we have a passage of time between the initial influence, and then the recursive effect. The assumption of an intrinsic "Pilot Wave" meant simultaneous effects, but the role of a substrate as intermediary allowed this crucial delay.
It is the formal, and even stilted nature of the mathematical physicists' thinking, that draws them inexorably towards the Copenhagen myths, and unfortunately away from reality.
Niels Bohr's insistence that the Quantum States "explained" things that classical physics could not was false in the first part, while true in the latter condemnation. In fact, neither approach could explain our observations. Bohr's position was descriptive of certain forms, but not in the least bit explanatory. Forms do not explain! They can describe reality, but they don't even do that perfectly. All equations are descriptions of idealised forms; they are not accurate descriptions of any natural part of reality, for they are always approximations, simplifications. Those forms can then only be applied back onto areas of reality that we have carefully prepared, or farmed, into experimental domains. Here lies the role of technology in all our investigations. The form's validity is then confirmed by successful use in these domains.
The battle between the two standpoints embedded in Science was never resolved, because both sides of the argument subscribed to the same belief - that equations represent reality as it is - an obvious fallacy when you stop to think about it. Both the classicists (such as Einstein and de Broglie) and the new school mathematical-physicists (Bohr, Heisenberg et al) were completely wedded to form.
Even Einstein's Relativity, and his Space-Time Continuum, were crucially formal ideas.
So, a small section of physicists still refused to embrace the Copenhagen Interpretation of Quantum Theory, and by the 1960s, after 30 years of argument, a final means of settling the dispute with these remnants was required. For the Copenhageners, John Bell's suggestion was a godsend.
But he did this using only the purest basic forms of Mathematics, to which both sides mistakenly subscribed. Bell used Set Theory, and its embodiment in Venn Diagrams to "do it".
Now here had to be the most inappropriate "proof" concerning anything in concrete reality, for it only dealt in idealistic laws, and this was to prove what reality really was, and to do it by this means alone!
Bell used this method to construct a set of inequalities which could be clearly demonstrated in Venn diagrams, and as such he had to be handling fixed things: no qualitative modifications or evolution of those forms could take place, for it was impossible by such means. It would be accurate to describe such a basis as comprising the premises of Mathematics and Formal Logic only.
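To see concretely what "handling fixed things" means here, consider the set-theoretic inequality popularised by Bernard d'Espagnat as the Venn-diagram core of Bell's argument: for any population whose members either do or do not possess each of three fixed properties A, B and C, N(A and not B) + N(B and not C) >= N(A and not C). The brute-force check below (an illustrative sketch, not Bell's own derivation) confirms that no assignment of fixed yes/no properties can ever violate it:

```python
from itertools import product

# Every member of (A and not C) is either in B (so counted in "B and not C")
# or not in B (so counted in "A and not B"); hence the inequality holds
# member by member, provided the three properties really are FIXED.
for a, b, c in product([False, True], repeat=3):
    lhs = int(a and not b) + int(b and not c)
    rhs = int(a and not c)
    assert lhs >= rhs
print("the inequality holds for all 8 possible fixed-property patterns")
```

Whether entities whose states are not fixed in this way can be bound by such an inequality is, of course, exactly the point in dispute.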
Bell used these as a basis for tests about reality. He used his Inequalities to set limits, and if in concrete reality they were clearly exceeded, then the claims of the opposing realists were "proved to be wrong", and Quantum Entanglement was proved correct.
Many will have noticed that it was a proof which only really convinced the faithful! This kind of "proof" was reality to them; it was their everyday modus operandi. But it gave the Copenhageners the result they required. The vast majority of physicists now claimed Quantum Mechanics victorious, and Realism finally defeated.
Bell had generated his required test using Formal Mathematics, and, as that was the "Essence of Reality", it simply had to deliver a valid and correct test. But the actual conclusion should be: no, you cannot prove the nature of concrete reality solely by resorting to Formal Logic. Only other forms are provable solely by Mathematics, and only phenomena consistent with Formal Logic are provable by the methods of Formal Logic! Nothing else is possible in either case.
Nevertheless, though all experiments seemed to support the idea that Bell's Inequalities proved the Quantum Mechanical position to be true, the conviction that it wasn't correct simply refused to go away. However, the recent Dutch experiment mentioned in New Scientist was supposed to settle the dispute forever...
The test was run over many productions of entangled pairs, and it was the overall statistics of those many runs that delivered the required answer to Bell's Inequalities.
So, what had actually been achieved?
It was his formalisms that were proved correct!
He had suggested a test for his Formal reasoning, not for any feature of concrete reality.
Lots of so-called "loopholes", all put forward by scientists who actually agreed with their opponents on the mathematics involved, turned out to be not only wrong but entirely inapplicable. And as these objectors came from the same camp in their attitude to the primacy of form, closing those loopholes settled nothing anyway: it merely corrected formal mistakes, with absolutely nothing to do with concrete reality at all! All the Dutch group achieved was the defeat of their opponents in Formal Reasoning only.
Hanson lab at Delft University of Technology
However, it is easily proven that by the means he used, Bell's Inequalities can only be used to address Ideality - the world of purely formal relations. They don't actually mean anything in concrete reality at all!
I concur that this condemnation of the precedence of Form over Content is still not enough to debunk these ideas. The most crucial principle in all such experimental investigations, both classical and Copenhagen school, is the cornerstone of all formalism and all analysis: the Principle of Plurality. This view sees the world as composed of many simultaneous natural laws, with a different mix of them acting in each and every observed situation. It stands in contrast to Holism, which sees all things as inter-connected and inter-dependent; Plurality sees only inherently separable component parts, which can always be deconstructed and analysed. Any given situation, however complex, can then be analysed through the isolation, simplification and control of those components, extracting the laws from the mix. Such methods are the basis of literally all scientific experiments.
This is all fine (and Science certainly wouldn't exist without such analytical methods), until erroneous assumptions are made about what it means. Plurality assumes that any law extracted in this way is identical to that law acting in totally unfettered reality. And this is not true: in unfettered reality, all "laws" are modified by their context. Realising this is the first step towards a Holist stance on Science. The objectivity of this position is confirmed by the fact that any law extracted by the usual method of farming and controlling a context for the experiment can only be reliably used in that same context. The Pluralist view survives (and indeed thrives and dominates) because we are extremely adept at arranging for it, at controlling our environment, and this makes both prediction and production possible.
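As a toy illustration of that claim (the system and numbers here are entirely hypothetical), consider a behaviour that depends on its context as well as on the variable under study. An experiment that clamps the context extracts a clean "law", and that law then fails the moment the context is left unfettered:

```python
import numpy as np

def response(x, context):
    # Imagined system: behaviour depends on the context as well as on x.
    return 2.0 * x + context * np.sin(x)

x = np.linspace(0.0, 10.0, 50)

# "Farmed" experiment: the context is clamped to zero, and a clean
# linear law, y = 2x, is duly extracted by fitting the data.
slope = np.polyfit(x, response(x, context=0.0), 1)[0]
print(f"law extracted in the controlled context: y ~ {slope:.2f} x")

# The same "law" applied in unfettered conditions now misses badly.
y_wild = response(x, context=5.0)
err = np.max(np.abs(slope * x - y_wild))
print(f"worst-case error once the context varies: {err:.2f}")
```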
But, in theory, all reasoning using such laws as the actual components of reality, is bound to be wrong. Pluralist laws and techniques are pragmatically extremely powerful, but theoretically greatly misleading.
It isn't just specialisation that leads to scientific teams consisting of experimenters, theoreticians and technologists - all of these roles are actually differing standpoints, and all are essential to the Scientific process. But they will contradict one another! Disagreements are unavoidable, and dead ends are likely in many scenarios.
Postscript
This paper concentrates upon the underlying bases of the methods and reasoning used in postulating Quantum Entanglement. Despite the fact that I think this torpedoes Quantum Entanglement from the word go, QE forms the last line of defence for the regressive Copenhagen Interpretation of Quantum Theory, which must be defeated, so a job must be done on it!
I am currently working on a new body of work entitled Quantum Disentanglement, which I hope to publish as an extended issue of the Shape Journal in coming weeks...