Showing posts with label Copenhagen Interpretation of Quantum Theory.

02 February, 2020

Issue 68 of SHAPE: Susskind's Universe






This latest edition of SHAPE Journal tackles Cosmology, the philosophy of Mathematics and its deleterious effects on modern Physics. It does so through a critical response by this author to several lectures by leading physicist Leonard Susskind - but why single him out in particular?

Susskind is a professor of theoretical physics at Stanford University in California. Stanford is a private university, regularly ranked among the top three universities on earth and employing the very top academics in their fields. For this reason alone Susskind is a key physicist to tackle - he is also considered one of the fathers of String Theory.

As well as this key contribution to Sub Atomic Physics, he brings in many other areas of interest, such as Cosmology - and presents himself as something of an all-round science expert. His vast series of lectures on YouTube is a vital outlet for the latest ideas in contemporary physics, based on the flawed assumptions of the Copenhagen Interpretation of Quantum Theory. As part of my continuing attack on the latter, I felt the need to take down one of the leaders of this field, and Susskind fit the bill perfectly.

In the infamous Smolin–Susskind debate, Susskind’s argument and support for the “anthropic principle” tell you everything you need to know about his quasi-religious idealism - encapsulated in the words of Brandon Carter: “The universe must be such as to admit the creation of observers within it at some stage. To paraphrase Descartes, I think, therefore the world is such.”





Susskind, for me, epitomises all that is wrong with science today. Susskind and his like are responsible for the ruination of the subject via their Pluralism and rampant Idealism. In his unapologetic support for Mathematics as the language of the Universe, Susskind entered my sights as a key target in the war against Pluralist science.

26 April, 2019

Copenhagen is Wrong!





The house of cards that is Quantum Theory is really starting to fall...

And now Lee Smolin is on-side.

In a recent Perimeter Institute Lecture, Smolin delivered the trenchant view that the Copenhagen Interpretation of Quantum Theory is in fact wrong, and I agree with him!




However, while his Realist arguments were indeed correct, and Copenhagen Theory is totally Idealist, that alone, I'm afraid, will never be enough.

For, in spite of Hegel's profound and transforming criticisms of the Plurality of both Formal Logic and its use in Science, almost 200 years later his improvements have still rarely been applied to either. The Copenhagenists will never relinquish their theory: they don't even know why it arose - from profound errors in its premises - and, in all the circumstances in which they use it, it certainly does work, pragmatically! But even then it never explains why.

For Explanation, as the primary purpose of Physics, has now been totally abandoned. Instead, this so-called Theory gives the right quantitative answers, in highly-constrained circumstances only. And, in doing that, it is entirely consistent with what Modern Physics has now become - an extension of Mathematics.

For, henceforth, it can never explain why things happen the way that they do, but can only "match" how it works, purely pragmatically. It is content to be only technologically-useful, and hence explains absolutely nothing!

Now, of course, its supporters would all totally disagree with this, because of the pseudo-philosophical inventions that the originators inserted, in order to make it look like an "Explanatory Theory" - namely, via their cobbled-together Equation, which imports illegitimate probabilities into a basic Wave Equation. All Particles, at what they term "The Quantum Level", are then replaced with Wave/Particle Duality - the wave-like Equation is used until the entity suddenly becomes, once more, a discrete Particle, at which point the transition is termed "The Collapse of the Wave Function".
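
For reference, and only to fix terms, the formal machinery being criticised here is the standard one:

    iħ ∂ψ/∂t = Ĥψ,    with P(x) = |ψ(x)|²

the first being the wave-like evolution of ψ, and the second the Born rule, by which the probabilities are imported into that Wave Equation.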

It is, of course, NO kind of Legitimate Scientific Theory: it is a clever fix to get around contradictory behaviours that their prior premises just could not cope with. And that was indeed the case: the prior theories just didn't work at these levels - they were inadequate, it is true!

But literally all past theories were also inadequate at some stage, and were re-investigated to review their assumed premises, usually arriving, in the end, at something better! By the early 20th Century, however, a whole series of dramatic and debilitating results had begun to be revealed that undermined the prior assumptions.




The Michelson/Morley Experiment had dismissed the existence of the Aether - the Universal Substrate assumed to fill all of Empty Space.

And Henri Poincaré and Ernst Mach, finding ever more physical premises that seemed to be wrong, proposed Empirio-Criticism, a form of Positivism, as an Amalgam of Explanations and Mathematical Equations, which together, they believed, could deliver what was needed.

The Piste was set: and it was all downhill from there to finally dumping Explanations altogether!

For, nobody demurred at the conclusions from the Michelson/Morley Experiment, so Wave-like phenomena, at the Sub Atomic Level, could no longer be ascribed to that now dispensed-with medium: instead, somehow, such effects had to be embedded into what Particles actually were themselves!

The key experiments were those involving Double Slits, which became the touchstones for many of the new discoveries. But the Wave/Particle tricks instituted to explain the many anomalies are very easily removed by assuming an undetectable Universal Substrate in those experiments, and Substrate Theory immediately dispensed with every single one of them.

Now, though obviously insufficient by itself, this theoretical exercise could not be ignored! For it demonstrated that classical Waves were somehow involved: the Wave/Particle Duality invention was a frig! And distinct Waves and Particles were, somehow, still intrinsically involved.

Two parallel lines of theoretical research were undertaken.

The First was to investigate the possibility of such an Undetectable Universal Substrate - composed entirely of pairs of mutually-orbiting Leptons of diametrically opposite properties - thus delivering the required passive-undetectability, while at the same time allowing that Substrate to be affected-by those interlopers, while also delivering the subsequent affecting-of those very same entities, in differing, later circumstances.

It must be stressed that the objective here was a theory-first investigation: just as James Clerk Maxwell had used in his Analogistic Model of the Aether, which was the Basis for his still universally renowned Electromagnetic Equations!

[Indeed, there is the very sound point also, that theory-first investigations actually avoid the inevitable pluralistic aberrations of all data-first investigations, using the now Standard Scientific Experimental Method of directly constrained and maintained contexts]

And, as Frank Wilczek has recently insisted upon with "The Materiality of the Vacuum", the above theoretical Analogistic Model is not without foundation, even if his composition of that "undetectable Universal Substrate" differs from that used here.





James Clerk Maxwell's Analogistic Model has long gone, BUT its use was justified by its delivery of the Electromagnetic Equations!

Now, the Second line of theoretical research was somewhat akin to Maxwell's - by assuming such a substrate with the New Model's composition, could all the anomalies that precipitated Copenhagen be physically explained instead?

Now, to achieve this objective, the initial simple definition in terms of mutually-orbiting Electron/Positron pairs had to be extended to also involve Units composed of Taus and Muons, and even Neutrinos - but a rich and successful Substrate of Leptons was devised, and the inventions of Copenhagen physically replaced!

BUT, of course, just as with Maxwell's Aether Model, the detailed content of this Analogistic Model of the Universal Substrate will, indeed, one day be wholly replaced too!

This researcher, like Maxwell, knows very well that all our theories are never the Absolute Truth, but at best, contain sufficient Objective Content to deliver more than what they replace.

Criticisms of the New Model are not only legitimate, but absolutely necessary. Yet, criticisms that exist only in order to re-establish Copenhagen are most certainly NOT! Copenhagen is a dead-end, and new ideas and models are now required to replace it and push Physics into new territory.

Implicit in the new approach was a root and branch critique of the premises, and even the basic amalgamated philosophical stances, underlying modern physics. Succinctly, Copenhagen is Pluralist, while Reality is Holist!

And Science should be materialist, while Copenhagen is certainly idealist!

By all means improve upon the New Physics, but leave Copenhagen where it deserves to be - Dead and Buried!


 



For more on burying the Copenhagen Interpretation, please read the Special Issues of SHAPE Journal above, and look out for our forthcoming 10 Year Anniversary Issues on Substrate Theory.

20 January, 2019

The Ghosts in a Ghost Substrate


A Hauntograph by Michael C Coldwell

The following is a quote from an article in New Scientist (3205) called How a ghostly, forgotten particle could be the saviour of physics, and I extract it here long before the writer gets round to revealing which particular particle she is telling us about - for such particles are indeed legion in the standard Copenhagen approach, and because there is an alternative theoretical stance which has proved that it can cope very well indeed with such problems, but which is currently anathema in consensus Sub Atomic Physics.


"THIS is the story of a particle that has refused to die. For 50 years, it has haunted particle physics, with hints of its presence appearing in maddeningly ambiguous ways. Some believe they have seen it. Others think it is a figment of our imagination. But every time we think it is definitely not there, a sudden gust of wind knocks over the furniture and once more there is confusion." 


Ghosts in New Scientist 3205

Now, the most fleeting of particles actually occur at the very heart of the Copenhagen Interpretation of Quantum Theory: indeed, they involve absolutely all of those included in the concept of Wave/Particle Duality, wherein such entities can sometimes act as classical Particles, while at other times acting as if subject to their own intrinsic Wave, which somehow determines, but does not reveal, where they are!

The reason for such remarkable behaviour is never explained, but instead is "put down to" Heisenberg's Uncertainty Principle, which makes the Sub Atomic Realm very different from the rest of Reality, by being totally indeterminate.
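
The relation itself, stated here only for reference, is the standard inequality

    Δx · Δp ≥ ħ/2

between the statistical spreads of position and momentum - which the Copenhagen reading then promotes into a claim of intrinsic indeterminacy for the Sub Atomic Realm.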

Yet the suggested physical alternative to Copenhagen achieves a general resolution of all its founding anomalies, present in the ill-famed Double Slit Experiments, merely by including a Universal Substrate in the situations.

Though, it has to also be undetectable!

Now, I bring it up here, because just such an Undetectable Universal Substrate has been theoretically-defined, solely in terms of mutually-orbiting pairs of diametrically-opposite Leptons. And, several crucial Units of that Substrate involve the very Neutrinos that are considered in this article.

So, let us investigate exactly how the presence of just such a Substrate affects phenomena occurring within it.

For then, with such a Substrate, the Wave/Particle Duality construct of Copenhagen dissolves instead into a classical Particle interacting with the Substrate to produce Waves, which are then transformed by passing through the Slits and, thereafter, interfere with each other - to then affect the slower-moving Particle that caused them, when it finally arrives at the resulting Interference Pattern in that Substrate!





And thus such delayed interlopers, having been affected by their differing passages through that changed Substrate, finally produce, as Particles, the overall pattern on the detection screen.

Waves are rendered, once more, properties of an extended, connected Substrate when disturbed by a Particle: they scamper ahead and ultimately produce an interference pattern that then affects its own much slower-moving cause.
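
For reference, the pattern itself is not in dispute: in the standard far-field treatment (ignoring the single-slit envelope), two slits a distance d apart, illuminated at wavelength λ and observed on a screen at distance D, give an intensity

    I(x) ∝ cos²(π d x / (λ D))

whatever one takes to be doing the waving - an intrinsic wavefunction, or, as argued here, a disturbed Substrate.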

And the required undetectable Units for such a Substrate are indeed possible, via mutually-orbiting pairs of diametrically-opposite sub-units.

You only have to consider Pair Production and Pair Annihilation, each of which involves one electron and one positron! For we are informed that such processes convert Energy-to-Matter and Matter-to-Energy! How??

How about changing from and to mutually-orbiting-pairs instead? The pairs cannot be detected and so the particles appear to vanish or appear, as if from nowhere!
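
For reference, the standard energy accounting behind those processes is not in dispute, only its interpretation:

    e⁺ + e⁻ → 2γ,    each photon carrying mₑc² ≈ 511 keV (for the pair at rest)

with Pair Production running the same ledger in reverse, requiring at least 2mₑc² ≈ 1.022 MeV. The question raised here is what that "conversion" physically is.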


Ghost Particles by Michael C Coldwell


For such pairs have actually been observed - at Fermilab, in the Tevatron, among other places - and are known as Positronium. And, as it happens, all the other Leptons could also be so linked too!

Aren't such ghost particles usually called Photons - meaning disembodied Energy gobbets?

And it makes more sense than electrical and magnetic sinusoidal oscillations, acting at right angles to one another and carrying Energy, supposedly in totally Empty Space - not to mention turning into physical matter and antimatter particles spontaneously! How all this is meant to happen has never been explained.

In this alternative model it is. The full nature of electromagnetism is indeed encapsulated in orbiting charged Particles, including the involved Energy, which occur in just such mutually-orbiting pairs!

Now, I am well aware of the consensus stance, but reject it both philosophically and physically, for I consider such a purely Maths-based stance to be merely an idealist formal construct: indeed a wholly pluralist complexity, embodying all the premise-errors due to an insistence upon Eternal Natural Laws, and also to the total omission of all qualitative changes from all of Mathematics, as well as all consequent non-dialectical Reasoning too.

Though, the complexity of their multi-dimensional Mathematics effectively hides it, their involved philosophical stance is an illegitimate amalgam of several wholly contradictory stances, apparently justified only by the immature Pragmatism of "If it works, it is right!"

The problems outlined in this piece can be transformed once a Universal Substrate of the kind outlined above is involved - not only due to the actions of its units as parts of that Substrate, but also as occasional, temporarily free-moving units dissociated from their usual Substrate roles.

For example, the theoretical Substrate research has revealed several different modes of those units, both as transformed Substrates, and as free-moving Streams and Vortices - the latter allowing a totally non-Copenhagen explanation for quantised electron orbits in atoms.
 

"The bad news is that the latest round of experiments set up to look for it claim that it can’t possibly be there."


But consider again the theoretical research - wherein particles can exist in three different modes:

  1. Free-moving as mutually-orbiting pairs
  2. Existing as a part of the Universal Substrate
  3. Free-moving as parts of a dissociated Substrate Unit

For then the described anomalies will be due to the transfers in-and-out of being Substrate Units, and in-and-out of being mutually orbiting Pairs!

But then:

"As long ago as the 1960s, however, physicists measuring the quantity of electron neutrinos reaching Earth found a major shortfall, with one experiment detecting only 25 per cent of the expected number."


Clearly, with a totally space-filling Universal Substrate composed, in part, of just such Particles, the capture of some of that Solar Stream into that all-pervading Substrate seems more likely than not!

The difference between a Totally Empty Space, and one filled with a complex Universal Substrate, surely has to be colossal. Indeed, not only would such a Substrate allow the explanation of many phenomena, but it would turn a vacuum peopled with only colliding particles, into a maelstrom of turbulences, propagations, transfers and Energy.

But,

"Rather than being massless, each neutrino did in fact have a tiny amount of mass, no more than a millionth that of an electron. This mass gives neutrinos a remarkable ability to switch between flavours"


Now, all known Neutrinos have been integrated, via mutually-orbiting Pairs, into my Universal Substrate, and designated as potential Gravitons, or alternatively as a similar though much-tinier Photon. So clearly, in addition to the usual releases and captures to-and-from the Substrate, and even the regular dissociations and re-associations of pairs to-and-from their individual components, such mini-photons will also be literally everywhere.

Thus, such a Substrate area, with its multiple processes, could very easily be misinterpreted as described in the above quote!

"That meant electron neutrinos produced in the sun’s core could transform into either muon or tau neutrinos, evading our searches on Earth."

OR, as I hope has become clear, a Universal Substrate, containing all these kinds of neutrinos, in abundance, will undergo dissociations due to the surges from the Sun, so enabling their detection as part of that Solar Stream. Not an oscillation between different types at all!

And the article goes on to mention many other anomalies - all succumbing to the same explanation as I have outlined above. I will not list them all!

But clearly, they all assume that these happen in totally Empty Space - either occurring naturally "in Space", or artificially maintained as in all pluralist experiments. But, clearly in neither situation is the undetectable Universal Substrate actually removed! It is always there, but in different states, perhaps.

However, once you have embraced the infinite variety of Ideality, and causal explanations are no longer open to you, you instead get this type of thing -

"the idea is to invent a fourth, “sterile” flavour of neutrino capable of shape-shifting into any of the other three."


Need I say more!

Now, to those who demand full explanations from me, as the dissenter, may I say that I am just a theorist, and the task, if it is considered to be worthwhile by other Physicists, is surely to find ways of experimentally investigating my suggested undetectable Universal Substrate.

But, I'm afraid this will be impossible pluralistically!

The whole stance of current Physics prohibits such a task, which involves a complete revolution both philosophically and experimentally.

Finally, having read this article more than once, it has to be said that it reeks of Modern Mathematics, as it must, of course, if the only place to look for "reasons" is Form, therefore redirecting the focus totally from concrete Reality's Properties & Causes to Ideality's infinite Forms.

Notice how even experiment has become subordinated to conforming to the Mathematics!

09 November, 2018

A Crucial Turning Point


Turning Point by Philip Johnson


When major redefinitions are both necessary and difficult


After almost a decade as a full-time writer upon Science and Philosophy, it was becoming increasingly clear that the wherewithal to fully address all the regularly-occurring and clearly-evident problems was still not sufficiently defined to enable me to "Make the Necessary Turn!". I had published almost 1,000 papers, and written a further 1,700, and most were sound in reasoning. But the core objective, which had gradually become clear, was still far from having been coherently presented as an integrated whole.

It would have to include a total demolishing of the Copenhagen Interpretation of Quantum Theory, in Sub Atomic Physics, and also a much wider-ranging definition of a fundamentally different philosophical Stance, which would have to be Holistic.

The difficulty was, and still is, that literally all intellectual disciplines since the Ancient Greeks have been entirely Pluralistic, and that Plurality has been intrinsically imbued in all the available and acceptable methodologies throughout my professional career in Universities in three different countries. Indeed, I had usually succeeded in academia largely because I could do the pluralistic stuff, and would still unavoidably commence any new problem via the old ways first.

But so many attempts have come to a stop in classic Hegelian contradictions, and their unavoidable impasses, that I had to attempt to take his Dialectical Logic much further than Hegel had ever managed to do - and even Marx had not developed his Dialectical Materialism deeply and profoundly enough to tackle many Scientific Conceptions and modes of Reasoning, such as occur in Mathematics and every single one of the Sciences.




As a competent mathematician, I naturally read Marx's Mathematical Manuscripts, but they did not help, so I had to begin to address the problems myself.

I soon began to unearth holistic truths within my usual professional researches into providing Computing assistance to a wide range of professionals studying in many very different disciplines. The Key Revelations finally occurred in addressing real, complex and expressive movements in Dance, in order to enable both the teaching of accurate Performance, as well as that of Choreography too. And, ultimately, I devised the first effective way of successfully delivering what the Dance expert required, with appropriate Access-and-Control, using computer controlled Multimedia of recorded resources.

Surprisingly, the main problems were caused in the very same way that both Zeno and Hegel himself had noted in dealing with Movement - problems the latter had even begun to understand and then concentrated upon. But, Hegel being an idealist, the sort of impasses I came up against were not ones he would ever have encountered, never mind tackled.

Though this work certainly greatly enabled my necessary assault upon the required Philosophy to address Reality-as-is, there was still another major problem to do with the premises underlying any Coherent, consistent and comprehensive methodology in pursuing such a path.

The problem resided in the omission of an absolutely vital premise in Physics!

Towards the end of the 19th century, Michelson and Morley had conducted an experiment which "proved" that there was NO Universal Substrate involved in the propagation of light - no Ether, as it was then called, and a whole significant branch of Physics based upon the assumption that such a Substrate existed was dropped as untenable.

It most certainly couldn't be detected, but the formulae based upon assuming that it existed still worked.
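
For reference, the expected effect was not small by the standards of the apparatus: with arm length L, an assumed aether "wind" of speed v, and light speed c, the predicted fringe shift on rotating the interferometer is roughly

    N ≈ 2Lv² / (λc²)

which, for L ≈ 11 m, λ ≈ 500 nm and v ≈ 30 km/s (the Earth's orbital speed), gives N ≈ 0.4 of a fringe - whereas the observed shift was, at most, a few hundredths of a fringe.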


Maxwell's theoretical aether


Indeed, James Clerk Maxwell had devised his brilliant set of Electromagnetic Equations upon his now-discredited Model of the Ether. So I had to re-institute the Universal Substrate as an existing, but currently undetectable, entity - and applying that concept alone to all the Double Slit Experiments enabled me to dispense entirely with all the anomalies of those experiments, and with all the assumptions of Copenhagen that had been devised to deal with them.

But that wasn't all that was required! There just wasn't a usable Holistic Methodology in Science at all. There was, however, a kind of holistic stance, but applied with a pluralistic Logic.

Clearly, I should explain.

Science, since the Ancient Greeks, had been an Amalgam of several contradictory stances, but, always using the still-agreed basic tenet of Pragmatism - "If it works, it is right!" - those involved simply switched between stances until they found one which worked.

So, a "holist-component", within this approach, attempted to explain phenomena entirely in terms of the known properties of the components involved (it was that aspect which persuaded me to become a physicist). But, also severely modifying that sound stance, was the universally adopted Principle of Plurality. which insisted that all causative Laws were wholly independent of one another: they were fixed- everything could be explained merely in terms of the summations of eternal Natural Laws.

So, clearly, this severely disabled that stance from effectively coping with a clearly holistic and developing Reality. And, finally, those laws could be "correctly-encapsulated" in the Perfect Formal Equations of Mathematics - which is, of course, Idealism!

Now, believe it or not, this Amalgam was considered to be "The Scientific Method". It was neither consistent nor coherent, but it appeared to be "Comprehensive" due solely to the ubiquitous Pragmatic Tenet.

And Holism without Plurality and Pragmatism seemed to be totally impossible to apply, as its tenet was "Everything affects everything else!": and the key unanswered question was "How?" Just how, and in what ways, did the many affecting factors change one another?

Hegel called his solution to a complex changing reality, Dialectics, but he was only concerned with Thinking. Marx recast it as Dialectical Materialism, but never comprehensively addressed Science.

And, here's the rub, not only did Science need Dialectical Materialism, BUT Dialectical Materialism also needed Science.

And all this still had to be systematically addressed. Literally nothing had been done!

So here goes.......


29 October, 2018

Heisenberg’s Uncertainty Principle and an Undetectable Substrate




This short piece does not stand alone!

It represents a very late, yet crucial, stage in a major philosophical and physical critical assault upon the Copenhagen Interpretation of Quantum Theory, but from a steadfastly materialist standpoint, involving a very different philosophical position, and also the inclusion of a currently undetectable, yet fully-defined and explained Universal Substrate. All the technical questions have been dealt with elsewhere, but there still remains one last piece of the jigsaw -

Heisenberg's Uncertainty Principle!

The purpose of this essay is to debunk Heisenberg's excuse for the Copenhagen stance, which dispenses with the Classical assumptions about Reality, but only within the special Sub Atomic Realm, where he insists determinism no longer applies, and only an assumption of indeterminism allows Mankind to deal with the phenomena we find there.

And, consequently, in such circumstances, NO Causal Explanations were possible, and the only methods capable of delivering anything usable were Statistics and Probability.

But, having already managed - as this opponent of Copenhagen - to theoretically explain many currently "physically-inexplicable phenomena", by assuming the universal presence of a currently existing, yet passively-undetectable Substrate, which can be both affected-by and affecting-of any encountered physical entities, and having thereafter widened that body of explanations both extensively and successfully, it suddenly struck me how-and-why the Copenhagenists get away with their entirely formal descriptions.

The reason is that Heisenberg's Uncertainty Principle covers for the incorrect omission of the Universal Substrate as an absolutely crucial premise within the Sub Atomic Realm in Physics!

What does the Copenhagen Interpretation smuggle in to even make a formal description possible?

It is, of course a Wave Equation!

And, where do such equations usually apply?

They apply to phenomena in Media!
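
For reference, the classical form in question is the textbook wave equation for a disturbance u propagating in a medium,

    ∂²u/∂t² = c² ∇²u

where c, the propagation speed, is fixed by the properties of that medium - which is exactly what the Copenhagen usage quietly borrows while denying that any medium exists.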






Certainly, the presence of such a Universal Substrate cannot currently be detected, so, in spite of its omission causing innumerable problems, it was still dropped permanently as a necessary premise.

Now, local incidents can cause non-local (extended) effects in such substrates! And, in addition, such effects can then affect not only local entities, but also by propagating to wide areas of the substrate, hence affect more distant entities.

They can even affect the very entities which originally caused them - in reflected-and-recursive interactions, as in the Double Slit Experiments, and in various kinds of resonance.

There is new evidence to consider from current Very Low Temperature Physics - soon to be extended to Gravity-Free conditions in the Space Laboratory in orbit around the Earth - plus it is also abundantly clear from my own researches into Substrates (composed of undetectable joint-particles) that these are not only several in number, but also diverse in their achievable aggregate Phases, presenting very different conditions and possible phenomena to traversing interlopers.

How on earth do you deal with such influences with NO detectable substrate?

Physically, you can't!

So what did they do?

By using forms derived from Mathematics, and previously used with phenomena in observable substrates, you can, with difficulty, also fit just such formulae up to real data, even with no detectable Substrate - BUT never deterministically!
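
As a small aside, the kind of "fitting-up" described here is an everyday operation. The sketch below - hypothetical data and an arbitrarily chosen form, written in Python purely for illustration - shows how a purely formal relation is matched to measured values, delivering usable numbers without any causal story whatsoever:

    import numpy as np
    from scipy.optimize import curve_fit

    def formal_relation(x, amplitude, wavelength, phase):
        # A chosen mathematical form - here a simple sinusoid - carrying no
        # claim about what physically produces the pattern.
        return amplitude * np.sin(2 * np.pi * x / wavelength + phase)

    # Hypothetical "measured" values standing in for real experimental data.
    x_data = np.linspace(0.0, 10.0, 50)
    y_data = 1.5 * np.sin(2 * np.pi * x_data / 3.0 + 0.4) + 0.1 * np.random.randn(50)

    # The fit delivers parameters that reproduce the data pragmatically...
    params, _ = curve_fit(formal_relation, x_data, y_data, p0=[1.0, 2.5, 0.0])
    print("fitted amplitude, wavelength, phase:", params)
    # ...but nothing in those numbers says WHY the pattern occurs.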

All sorts of workarounds are necessary, both formally and philosophically, to achieve, and then use, these formulae. The formal tricks are no problem, as scientists have been using such rigs throughout their History. But, the philosophical contrivances are more difficult, so the New View would have to take on Philosophy - a very well established discipline! They had to remove, "physically", the bases assumed by the philosophers. And, this was achieved via the Heisenberg Uncertainty Principle - for "At the bottommost levels determinism no longer holds, indeed, a kind of indeterminism holds instead."

Plurality strikes again!

The very principle, which, along with Pragmatism, allowed all the many contradictorily-established disciplines to exist simultaneously, was again brought to bear in this incredible anachronism.

Now, quite apart from the assertions being made here, the whole Philosophical Basis for the usual range of intellectual disciplines has already been established by this philosopher-physicist, in his analysis and description of the whole trajectory of Mankind's intellectual development, from their Hunter/Gatherer beginnings to the present time. Various past and present Amalgams of contradictory premises have been made to appear "legitimate" via that usual banker premise of Pragmatism - "If it works, it is right!" - which, of course, does no such thing, though it can deliver a workable basis for technological gains and productions.

Indeed, Philosophy itself is also one of these disciplines, whereas it should be the measure of them all, and provide the means for dismantling such false separations of disciplines, intellectually at least.

But, the Amalgam of contradictory premises underpinning all the Sciences was unavoidably adopted, historically, as the only way Mankind had discovered to Control-and-Use many contradictory aspects of Reality to their benefit. It was suggested, initially, by Abstraction, which allowed some sort of discussion of things, achieved by both simplifying and naming them; and with the increasing socialisation of Human beings via the Neolithic Revolution, and the change in their mode of Life - primarily to staying in one place and Farming - Abstraction began to be used in Descriptions, by simplifying observed shapes into Perfect Forms, and studying those in place of their naturally occurring sources. They began to idealise as well as simplify, and both of these greatly increased what could be done in studying them, and what might be done with them.

From this we arrive at Euclidean Geometry, and the developable power of its Theorems and Proofs. This became a kind of standard for all other intellectual disciplines, and in particular for Formal Logic, and therefore for all the others in which Reasoning was applied too.

But the Mathematics into which Euclidean Geometry grew was also entirely pluralistic - in that all entities involved were assumed to be separable, and always exactly the same - that is, totally unchanging qualitatively - indeed, they were considered to be eternal!

All of these disciplines were hamstrung by this totally false limitation.

Now, this imposition of Plurality onto all of these disciplines, including their common Lingua Franca - Formal Reasoning - made absolutely certain that they would always be limited to situations in which nothing ever changed in any profound or qualitative way: so, when applied to anything real, it would necessarily only apply to stable situations involving such things. So anything involving real development would necessarily be totally excluded.

This was a crippling restriction, so when Science began to be developed upon the exact same basis, such a Principle implied that no Natural Law (which would necessarily be eternal) would be affected by any changed context. And this tenet both severely handicapped, and yet also enabled a warped-version of Science for many centuries!

It hamstrung it by banning all Qualitative Change to any extracted Laws.

And, it enabled a version of it, as long as the severely-constrained Context, necessarily arranged for to get such Laws extracted in the first place, was identically replicated for its subsequent effective use.

In addition, this also meant that though such "eternal Laws" could be effectively and productively used, they were not those acting in all circumstances, but only those in the single contexts that alone validated their use. What is generally termed Classical Physics was actually so crippled that it should have been termed Pluralist Physics, usable only in very limited, constrained circumstances, so that any supposedly General Theory based upon such a Law would always be wrong. And thus all findings would be both simplified and idealised, by taking a pluralist mathematical formula and fitting it up to the data collected from that pluralist single situation.

Theoretically, as in generating an explanation, that formula would also be wrong: it could legitimately be used pragmatically, but never theoretically. Indeed, a thorough-going analysis of such a "Law" would reveal it as an illegitimate Amalgam of a Materialist Stance and an Idealist stance, crippled by Plurality, and so useless for both explanation and use within any normal natural situation.

And crucially, this was the Physics that failed to cope with Quantized Phenomena: it neither would, nor ever could, cope with such phenomena adequately in any method of experiment in a real world - which also included an undetectable Universal Substrate.

The perpetrators of the Copenhagen Interpretation did not even know of its built-in disabilities - so they kept all the errors of Plurality, and decided instead to throw out Explanation as totally impossible, due to the Sub Atomic Realm being a different world, changed by the Principle of Uncertainty formulated by Werner Heisenberg.


Werner Heisenberg at the blackboard

Clearly, the usual assumptions were indeed adequate above a certain size of the participating components being studied. But, according to Heisenberg, once that size was left behind, and a World of the extremely small was entered, the rules changed dramatically! We had entered the World of Quantum Physics, where things just behaved very differently. Below that level, things became indeterminate - acting within a range of possibilities, and the old determinate Physics could no longer be used.

Indeed, a particular Wave Equation actually delivered that range, but in a very odd way! It delivered only the probabilities of a particle being in each of the whole range of locations covered by that Equation. BUT, we already have detailed knowledge of such phenomena! Long ago, scientists conquered similar situations when they were happening within an affected and effecting visible Substrate. Some material interloper could both disturb, and, in special circumstances, be recursively affected by that disturbance.

Could such methods be appropriate in this area too?

The assumption of a currently undetectable Universal Substrate was included, theoretically, in every single one of the Double Slit Experiments, and every single anomaly was physically explained without any recourse to Heisenberg or the Copenhagen Interpretation whatsoever. It seems that, as with so many of the strange anomalies of the Quantum world, and the subsequent ‘idealisation’ of Physics, this crucial missing premise is to blame.

This paper has recently been published in my new book The Real Philosophy of Science.




28 October, 2018

The Real Philosophy of Science




A complete rethinking of the Philosophy of Science is now vital. As climate change accelerates and capitalism slowly dies around us, it is no longer hyperbolic to state that human civilization now hangs in the balance. For the positivist consensus, salvation must come from science and its greater capacity to understand these problems and proffer vital solutions. Unfortunately for us all, and unbeknownst to most, science is in dire straits too. As it stands, contemporary science is not equipped to deal with these profound qualitative changes, or even its own shortcomings and failings. Physics, for example, has been undergoing a secret existential crisis for an entire century.

For the last ten years, physicist and philosopher Jim Schofield has been publishing new theories and damning critiques of the scientific consensus in SHAPE Journal. His polemical writing largely rejects the epistemology of science as it is usually conceived, and instead poses a dialectical view of scientific history, its impasses and mistakes, and our flawed methods and assumptions.

The Real Philosophy of Science is Jim Schofield’s first full-length book on the subject, but it is long overdue. Many hundreds of papers have been published by the author. Included in this vast body of work is a final refutation of the Copenhagen Interpretation of Quantum Theory, alternative explanations for all of the anomalies of the Double Slit experiments, an extension of Charles Darwin’s theories to look at how they might apply to the evolution of reality in general, and the Theory of Emergences, which shows how Marx’s ideas apply beyond the social realm – to the revolutions which happen in nature. For Jim Schofield, it is only through real Marxist intervention in science, and scientific rethinking of Marx, that we can transcend the hidden impasses that now plague human thinking, and set the course of civilisation back on track.

This is my new book The Real Philosophy of Science. It's currently available from Smashwords and it should soon hit all major eBook distribution platforms.


16 September, 2018

Jim Al-Khalili and the "Two-Slit" experiment


Al-Khalili has learned nothing on the Copenhagen Interpretation of The Double Slit Experiments

In a recent lecture (see the YouTube clip below), Jim Al-Khalili repeats the usual Copenhagen Interpretation of the ill-famed Double Slit Experiments, and his arguments have not changed one iota.




The video is captioned: "If you can explain this using common sense and logic, do let me know, because there is a Nobel Prize for you.." 

Well Jim! Have you seen our video?







Many years ago I listened to an In Our Time radio programme presented by Melvyn Bragg, in which a gaggle of prestigious supporters of the Copenhagen Interpretation of that same experiment put forward an identical account. But neither version could transcend the contradictory accounts of particles "sometimes acting like Particles, but at other times acting like Waves".

Ever since Zeno's Paradoxes in Ancient Greece, applications of Formal Logic to certain puzzling scenarios would always end in such contradictory endpoints - entirely inexplicable in Formal Logical terms.

The problems were not trivial!

They were caused by a founding principle of both Mathematics and Formal Logic termed Plurality.

And, the reason that this crucial flaw was never addressed was because the Greeks purposely limited their intellectual disciplines to concepts and things that did not change - they remained the same qualitatively.

And, perhaps surprisingly, it proved to be an extremely empowering stance to take! For, assuming, or even ensuring, such stability in situations, certainly made them predictable.

First, this was the essential Foundation Stone of Mathematics - and legitimately established a whole new and extendable discipline, absolutely-valid for things that remained the same qualitatively: it enabled an effective Discipline of Forms and their Quantities.

But, its powerful methods of Extension and Proof, persuaded the Greeks to carry them over to a system of reasoning, later termed Formal Logic: so the new discipline could only be applied to concepts that did not change.


The logical contradiction of the Double Slit can be traced back to Ancient Greek Philosophy

This same supposition was also embedded in the initial approach to Science too.

In fact, NO real attention was given to this important disability for about 2,300 years, until Hegel finally realised that Qualitative Changes were just NOT addressable within Formal Logic as-it-then-was, and determined to unearth as many of these Dichotomous Pairs of contradictory concepts as he could, in order to find out their disabling bases.

He was successful in a whole number of cases, where he found that both concepts in such a Dichotomous Pair arose from the same inadequate premises - which would have to be changed to turn the usual non-transcendable impasse into a transcendable Logical Fork.

Now, following this initial success, a great deal more has followed, enabling a major transformation of Formal Reasoning to include the tempo, processes and even the causes of truly qualitative changes.

Yet merely the application of Hegel's initial discoveries to the Double Slit Experiments would be sufficient to address every single anomaly of that confusing evidence. For the mistaken premises can be either incorrect or actually missing: and the inclusion of a currently undetectable Substrate within those experiments did indeed physically explain everything!


Michelson-Morley disproved Luminiferous Aether

Now, of course, no such Substrate has ever been found, and the Michelson-Morley Experiments had seemingly banished that assumed-to-be-present Ether as non-existent. But that did not banish any currently undetectable Substrate that was, nevertheless, both affected-by and affecting-of interloping entities.

Every single anomaly was easily explained by assuming such an intermediary.

But it had to be established that such a Universal, yet-undetectable, Substrate could both exist and deliver physical explanations of everything involved. The initial task was to establish, theoretically, just such a Substrate - composed entirely of various mutually-orbiting pairs of Leptons with diametrically opposite properties.

The first of these was a stable version of the Positronium, which was re-labelled as a Neutritron. This was a remarkable entity, neutral in every way, which could exist in three different modes and associations, and also be dissociated back into its separate components - one Electron and one Positron.

A very loosely-associated medium, termed a 'Paving', was possible, which could propagate Electromagnetic Energy at the Speed of Light, but could also be dissociated into its individual Neutritrons - identical to Photons, which could behave like a random Gas, or be driven into energetic Streams and even into Vortices.


A Substrate of particle-pairs like the Positronium, could be an invisible medium for EM radiation


Now, every kind of Substrate Unit was, because of its mutually-orbiting nature, also capable of carrying quanta of electromagnetic Energy via the promotion of that orbit, and could deliver such quanta by the demotion of that orbit.
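
For reference, the quanta in question obey the standard Planck relation

    E = hν

so any proposed carrier - a promoted Substrate orbit here, or a free photon in the consensus account - must hand energy over in exactly these frequency-dependent packets.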

And, elsewhere, it has been possible to explain the quantized orbits of Electrons within Atoms by the dissociation of the Paving and the driving of Neutritrons into accompanying Streams and Vortices, which, because of the constant return of the driving electrons, can settle into stable orbits wherein the net transfer of energy between electron orbit and maintained vortices arrives at a balance at particular maintained radii.

I could go on, but my sole purpose here is to counterpose the above Physics to what Al-Khalili peddles.

What do you think?

10 September, 2018

Issue 61: The Implicate Order






This series of papers attempts to draw a definitive line between the philosophical stance of physicist David Bohm, The Implicate Order, and his rejection of the Copenhagen Interpretation of Quantum Theory, on the one hand, and the seemingly-convergent Idealist philosophy of Errol Harris on the other, and to do so from an emerging Marxist standpoint in that crucial area.

The connections between these two thinkers were brought to my attention by my namesake James Schofield, and his thesis on the “Dialectical Holism” of the latter. His PhD deals with some interesting Physics and this introduced me to concepts such as Ontic Structural Realism, and set me thinking about Bohm seriously again. So thanks, James!

It is, and has been, very important because of the total lack of a Marxist critique of Copenhagen, via a clearly explained and explanatory superior alternative stance in Sub Atomic Physics. Indeed, this key absence has been so important that it has even frequently disabled the Marxist stance too, even politically!

For, these seemingly obscure questions always were of paramount importance from the time of Karl Marx’s split from Hegelian Idealism. For, without the conquest of Science by this new Materialist stance, it would be crucially disabled in everything else that it dealt with. Marx knew it, and intended to deal with it, but he was a philosopher and an academic historian, by training, and ill-equipped to tackle such a wide-ranging discipline as Science-in-general.

In addition, his historical studies with the new standpoint immediately required the conquest of Economics, as the touchstone for the tumultuous, indeed, revolutionary changes that were so important in the developments of Human Societies throughout History. His initial task, therefore, just had to be a very different treatment of Capitalist Economics, and the change turned out to present him with an enormous undertaking, recasting the whole of that subject from an entirely different and wholly new basis. This took him the rest of his life, and Science in general was never addressed by Marx.


Materialism and Empirio-Criticism by Lenin


The dangers of this crucial omission were realised by V. I. Lenin, when key members of the leadership of the Bolshevik Party had begun to show great interest in the positivist Physics of Henri Poincaré and Ernst Mach, which was understandable because of this evident hole in the Marxist stance.

Lenin knew immediately that this was serious, and set about a refutation of the Positivists in his book Materialism and Empirio-Criticism - which successfully pulled Lunacharsky and others back into the fold, philosophically! But he too was no physicist, so the hole was still not filled, and hasn't been ever since.

Clearly, to this Physicist and Marxist that vital task is the most important in contemporary Philosophy; and this is already well underway. But the long standing historical omission of this undertaking could not but encourage committed Marxists to seek a world-class physicist who strongly rejected the Copenhagen stance, and the increasingly dominant candidate was David Bohm.

Indeed, in my youth I too sought answers with Bohm’s alternatives, but the problems in Science were extremely well entrenched and surprisingly old.

In spite of Bohm’s Materialism, there was with him, as with all scientists, a very long-standing Idealism, imported via Mathematics, and a Plurality via Abstraction - yet also, and surprisingly, Holism via Explanation - all amalgamated via the crucial glue of Pragmatism. This uneasy mix actually underpinned the whole of Science, and Pragmatism alone allowed a switching between different areas of study, where different assumptions “could work”.

And, of course the co-existence of these directly contradictory stances was not realised by those involved: they considered a “seemingly-contradictory-appearance” as being due to as yet not-fully-understood-areas, which would, later, be removed by new Knowledge.

But that would never be the case, as long as this unrecognised amalgam prevailed.

And, the differences between Einstein and the Copenhageners, and between Bohm and the rest were all due to this congenitally-contradictory, assumed Amalgam as Basis.

So clearly, Bohm has to be dealt with, as vitally as Lenin had to deal with the Positivists, but this time fundamentally.


Wholeness and the Implicate Order by Bohm

01 August, 2018

A Muse upon Halton Arp's Intrinsic Redshift


Halton Arp

Halton Arp was a brilliant Astronomer, whose observations increasingly challenged the consensus theories in Cosmology. But he more or less stood alone, and the bans on his continued access to the world's greatest telescopes, and the difficulty in getting his interpretations of published observations accepted for publication, have severely constrained the propagation of those conclusions. And, as the means to obtain the best data are now almost totally restricted to multi-million dollar devices such as the Hubble Space Telescope and the Large Hadron Collider, such exclusions are ever easier to institute.

The defence of past positions becomes ever easier, and genuine counter-proposals are easily shut out by peer review, and don't get effectively aired.

Now, neither I, nor anyone else, is in a position to gainsay or agree with Arp, for that would certainly at least involve a directed observational undertaking to prove or disprove his conclusions. But, Arp has found such undertakings impossible to arrange, as have many others! Yet, if only some of Arp's conclusions are true, they would be revolutionary.

Arp made his name with his bestseller, the Atlas of Peculiar Galaxies (1966), which led to him noticing a whole series of cases which seemed to suggest intimate associations between mature galaxies and what seemed to be "nearby pairs of dwarf galaxies", whose spectral redshifts made nonsense of such an association. For those redshifts alone placed the pairs of galaxies at vastly different distances from the observer than the supposed "Parent Galaxy" - but only if the usual cause of redshifting was the correct one, due entirely to the speed of movement away from us.
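
For reference, the conventional reading that Arp questioned runs as follows: the redshift is defined as

    z = (λ_observed − λ_emitted) / λ_emitted

and, read kinematically, z ≈ v/c for modest recession speeds, so that, via the Hubble relation v = H₀d, a larger redshift is taken to mean a proportionally greater distance. Only on that reading are the companion objects flung to vastly different distances from their apparent parent.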

Arp could suggest an alternative cause, which he termed the Intrinsic Redshift - due not to recession but to the age of the dwarf galaxies or quasars, and varying in a quantized way with time and distance from their birthplaces! His evidence, as he has presented it, is persuasive, mainly because of the seeming associations with a "parent Galaxy". For these pairs appeared to be equidistant on either side of the Parent, and always positioned upon its minor axis!





Having noticed this many times, Arp began to purposely seek them out, and remarkably found more than single pairs involved. In fact, further pairs on the same minor axes were found, and their Red Shifts decreased with distance from the parent - NOT continuously, but in a quantised pattern. Clearly though, directed searches for certain configurations among billions of Galaxies, might turn out to present a "selected-out false generality".

But clearly, if the correct survey scheme were devised and undertaken, whether such remarkable conformities were merely chance would be easily revealed!

Of course, if Arp were right, the whole of the current Cosmological Theory, including the Big Bang and the age of the universe, would be undermined, and new answers required literally everywhere. And Arp was aware of these difficulties, and attempted his own alternatives to the usual Big Bang scenario. Yet, literally thousands of reputations and multitudinous published papers have been produced, all over the World, by prestigious and eminent scientists. Many would have a great deal to lose if he were right!

Now, the writer of this paper, a physicist, also has an axe to grind, concerned with the necessary presence throughout the Universe of a totally undetectable Substrate, which is both affecting-of, and affected-by entities and processes taking place within it.

This too seems a very way-out suggestion, except that, unlike James Clerk Maxwell's description of the then universally believed-in Ether, the undetectable units of such a substrate have been theoretically devised involving only known Elementary Particles - and, with only this single inclusion, every single one of the anomalies of the Double Slit Experiments has been physically explained. In addition, the Propagation of Electromagnetic Energy through so-called "Empty Space" has also been cracked, as has the physical extension of active fields in the same situations.

Now this research is still ongoing, but it too affects everything currently supposed to occur in Empty Space, due to the differing densities of the Universal Substrate in different circumstances - as do Arp's theories. And another potential resonance occurs with the Origin of Matter: in both of the new theories it comes from the pre-existing Substrate.

Clearly, both theories might be buried by the necessary research, but if they are wrong, so be it!

However, there is obviously a great deal wrong with the Copenhagen Interpretation of Quantum Theory, as well as the consequent theories of the Cosmos, and such research might well rid us of those too!

07 March, 2018

New Special Issue: Death to Copenhagen!






The demise has been noted, and the last rites said, so the time has come to bury the beast!

The Copenhagen Interpretation of Quantum Theory must be laid to rest. Such a final necessary conclusion has definitely arrived, as its remains are beginning to stink!

Three things had to be done to bring things to this long desired state:

First, an alternative coherent, consistent and comprehensive Physical Explanation had to be found for all of the many anomalies of the Copenhagen Interpretation. And, it has been achieved here, by the inclusion of a currently-undetectable, but definitely-existing, Universal Substrate, which has been theorised using only known particles.

Second, the illegitimate amalgam of contradictory philosophic stances, underpinned by the Principle of Plurality, and excused by the regular recourse to Pragmatism, had to be totally dismantled and replaced with a new approach to Science.

Third, there has to be an explanation as to why the equations used in Quantum Theory give answers that match the Reality. And, this turns out to be nothing more complex than fitting-up pure formal relations to a situation, by using that situation’s measured data. It is a useful oft-employed pragmatic trick!

But, the real cause of this great impasse was the abandonment of theories which actually delivered the most Objective Content, and replacing such a policy with one which allocated all cause to Pure Forms, so jettisoning things with unsubstantiated features for a “pure and whole description”, that delivered reliable predictions without a trace of the actual causality involved.

Presented here is the long-delayed interment of a long dead idea. To work out where we go next, we must return Physics to a genuine materialism.



This terminating instalment is just the latest in a series of issues of SHAPE Journal to debunk Copenhagen - the rest can be found here:


 


The Significance of Oscillation





The Nature of Reality and the Significance of Oscillation



Our first conception of Reality was always bound to be of its human-scale entities - eminently stable and persisting forms, processes and products. For what actually exists at greatly smaller levels is entirely unavailable to our senses, and knowledge of it did not inform our prosperity and survival in the way that is implicit in the "our-scale" things we need and use to survive, or even to prosper.

But, what are those larger-scale seeming permanencies? Are they really as permanent as they appear to be? No, they are not! Indeed, the more we investigate them, the more frequently do we encounter much smaller-scale components, which are all in persisting oscillation.

Clearly, opposing effects are simultaneously affecting things, which find some sort of persisting balance in oscillation! We can see less permanent versions of similar oscillations, even at our own macro level, in the vibrations of strings, and even in those of many rigid things - but they soon lose energy to the surrounding atmosphere, and fade away.






The clearly evident vibrations, at our level, are always transient...

What are far more important are the oscillations at the micro level, which do seem to persist indefinitely, without ever fading away! And, by far the most important form of such oscillations is the orbiting of one body around another, which can be "permanent", and actually exists at every level from the Sub Atomic to the Cosmic.

Various explanations of these orbitings, some of which can last for billions of years, have been suggested. The Cosmic persistences are said to be due to their happening in a vacuum, so there is nothing to carry away the involved energy; while, at the other extreme, the oscillations of atoms in solids are said to be actually maintained by the ambient-temperature energy of the surroundings, in intimate contact with the oscillating atoms.

But, what about those orbiting electrons within atoms?

They can be both promoted and demoted, by transfers to-and-from external energy sources, BUT NOT in a continuously-changing way: the orbits only exist at particular radii - all others are prohibited (the famed quantisation of orbits).
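For comparison only - these are the standard textbook Bohr-model values for hydrogen, quoted simply to show what "only particular radii" looks like numerically, and not as part of the substrate account developed elsewhere in this journal:

```python
# Textbook Bohr-model values for hydrogen, shown only to illustrate
# "orbits only exist at particular radii - all others are prohibited".
BOHR_RADIUS_NM = 0.0529   # approximate Bohr radius in nanometres
RYDBERG_EV = 13.6         # approximate ground-state binding energy in eV

for n in range(1, 5):
    r_n = BOHR_RADIUS_NM * n**2      # allowed radius grows as n squared
    e_n = -RYDBERG_EV / n**2         # allowed energy scales as 1/n squared
    print(f"n={n}: r = {r_n:.3f} nm, E = {e_n:.2f} eV")
```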

But, these same properties also persist in the supposed vacuum of Space!

Now, similarly "quantized" orbitings have been created in the laboratory, and at the macro level, when deliberately made to occur in a liquid Substrate, under surprisingly simple conditions. And, before anyone gets too excited, may I inform any doubters that ALL that was involved in these experiments was a single silicone oil substrate, and absolutely nothing else!

They were carried out by French physicist Yves Couder, and the writer of this paper has concluded that some of the remarkable phenomena produced there - in particular the orbitings of the produced "Walker" entities - were made possible entirely by the presence of Vortices generated within the Substrate, and were both maintained and even quantized by the two-way transfers of energy between such a causing orbit and its produced vortices: transfers due entirely to the constantly repeated orbits, which only balance-out to stability at particular orbit radii!




The significance of this, if true, will be enormous! For this theorist has already applied these ideas to the quantized orbits in atoms, and has also explained all of the anomalies of the Double Slit Experiments, primarily by the presence of a currently undetectable Universal Substrate.

The Copenhagen Interpretation of Quantum Theory is now already mortally wounded! And, many of the inexplicable properties of "Empty Space" are also receiving physical explanations, in terms of this same currently undetectable Substrate.

And, that Universal Substrate isn't just an unsupported speculation, lacking any detailed explanation of its composition: that explanation too is well under way, and currently holding up very well.

Though this may be dismissed as unsupported speculation, I have to strongly demur - especially when the alternative, the Copenhagen stance, isn't even a Theory, but merely a formal description, bolstered by a remarkable series of speculations!

No indeed. I take my methods from those of James Clerk Maxwell who, seeking an explanation of the then-undetectable Universal Substrate - The Ether - devised a purely theoretical Analogistic Model, composed of Vortices and "electrical particles", to explain the functions of that substrate, and from it managed to develop his world-famous Electromagnetic Equations! He was using the only means available to an investigator deprived of sufficient access to what he required to address a given problem. He knew, very well, that our theories, at best, only ever reflect the amount of Objective Content in our explanations, and that they would inevitably be replaced by others, containing more of that Objective Content, thereafter.

So, theories, which explained more than those they replaced, were entirely legitimate in maintaining the necessary ongoing ascent. But, wild speculation was still prohibited - no Gods here!

Instead, the sincere investigator would seek analogues elsewhere in Reality that were already, to some valid extent, understood, and attempt to weave an effective explanation out of those - and that is what is being done here: the postulation of a currently undetectable Universal Substrate, using everything we know about "Empty Space", and as many analogues from known and detectable substrates as possible.
Clearly, if validly established, the universal presence of a material, yet hidden Substrate, will undermine many assumptions in current physical theories.

And, the first giant Sequoia to fall will be the Copenhagen Interpretation of Quantum Theory.

While the next must be all of those that insist upon totally Empty Space, anywhere!

For, with the kind of Universal Substrate that has been developed to demolish Copenhagen, a rich background for literally all phenomena has been established - and, crucially, the supposed cause of Quantization becomes a generally possible physical ground, occurring everywhere.

Add to this the increasing role of both oscillations and orbitings, and their intrinsic participation in interactions with that Substrate, and a whole new approach at all scales becomes crucial.



20 February, 2018

The Quantum





THE QUANTUM

Is it intrinsic or caused?


The whole basis of the Quantum Revolution in modern Sub Atomic Physics is that the Quantum of electromagnetic energy is intrinsic to the Nature of Reality, so that all theories have to be modified to build this requirement into them.

But, could it be that there is a purely physical explanation for energy being cast into, and thereafter maintained in, such discrete forms?

This might seem almost sacrilegious to literally all modern physicists, but a purely theoretical investigation into the effect of a totally undetectable, but both affected-and-affecting Universal Substrate, has questioned that assumption, by finding physical ways that such energy might have a Quantum Nature imposed upon it - especially as the very same nature would, thereafter, in its normal means of propagation, be rigidly maintained.

There was, of course, another good reason for embarking upon such an investigation: this theorist was unhappy with the anomalous results of the New Theory, especially in the whole series of Double Slit Experiments, which had become cornerstones of the Quantum Revolution, and had, in addition, also led to the dumping of Physical Explanations entirely, in favour of a wholly formal mathematical description deemed "more reliable", while still allowing prediction and reproducible phenomena.

As far as this physicist was concerned, however, the real heart of Physics was in its explanatory power, rather than its reliable formal predictability, so he embarked upon this exercise to also investigate all possible avenues for such an alternative to be available. And the Double Slit Experiments with their supposed "Wave/Particle Duality" cried out for a substrate-based solution.

But, there was no discernible Substrate!

Clearly, if one existed, it would have to be totally undetectable, while also effectively functioning as a propagator of Electromagnetic Radiation. Whatever it was composed of would have to be self-hiding, while entirely capable of carrying quanta of electromagnetic energy.

It could only be a joint particle, built on the same basic model as the atom, but composed of sub-units opposite in every possible way - yet capable of carrying energy, also hidden within it. The obvious theoretical candidate had to be a mutually-orbiting pair consisting of:-

An Electron of ordinary matter and a negative charge

+

A Positron of antimatter and a positive charge.

These, being of the same size, would share the same orbit, while always occupying exactly opposite positions within it.

Such a joint particle would be invisible, but could carry a quantum of energy by the promotion of its internal orbit!
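A minimal numerical sketch of one reason such a unit could go unnoticed at a distance (a toy check only, under these stated assumptions, and no more than that): the pair's net charge is exactly zero, and, because the two charges always occupy opposite points of the same orbit, its dipole moment averages to zero over each internal orbit.

```python
import numpy as np

# Toy model: a positron (+1) and an electron (-1) sharing one circular
# internal orbit of radius r, always at exactly opposite positions.
r = 1.0
charges = np.array([+1.0, -1.0])

samples = 1000
angles = np.linspace(0.0, 2.0 * np.pi, samples, endpoint=False)

net_charge = charges.sum()                    # exactly zero

# Instantaneous dipole moment p = sum(q_i * position_i); with opposite
# positions it simply rotates with the orbit, so its average over a
# full internal orbit vanishes.
dipoles = []
for a in angles:
    pos_positron = r * np.array([np.cos(a), np.sin(a)])
    pos_electron = -pos_positron              # exactly opposite point
    dipoles.append(charges[0] * pos_positron + charges[1] * pos_electron)

mean_dipole = np.mean(dipoles, axis=0)

print("net charge:", net_charge)              # 0.0
print("orbit-averaged dipole:", mean_dipole)  # ~ [0, 0]
```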

But, why should that carried energy exist only as a Quantum?

That question, too, is answered by this new particle, but only revealed in its crucial role as part of a Universal Substrate. So, before we go any further, we should establish just how this joint, neutral particle could possibly form an affected and affecting Substrate.

It doesn't seem possible for such a neutral particle to be capable of forming any kind of "connected Substrate", but investigation of these particles, in very close proximity to one another, has shown - theoretically at least - that they can.

For, when very close together, the sub-particles of one such unit will transiently be able to affect those of another. Indeed, as the sub-particles orbit, they will cause an oscillating attraction-and-repulsion effect upon the two involved overall Units. And the consequence would be that units getting close enough would be captured into a loosely-connected form, termed a Paving! All participating units would then oscillate about mean positions, with fixed mean separations between all the units involved. This would make a bucket-brigade propagation of quanta possible at a fixed speed - the Speed of Light, "c".
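A minimal sketch of the bucket-brigade idea (the spacing and hand-off time below are arbitrary placeholders, not derived values): if every hand-off between neighbouring Paving units covers a fixed spacing in a fixed time, the propagation speed comes out identical over any distance - a single fixed "c".

```python
# Toy "bucket-brigade": a quantum is handed from unit to unit along a
# one-dimensional chain of Paving units with fixed spacing and fixed
# hand-off time. Both values are hypothetical placeholders; the point is
# only that fixed values force a single fixed propagation speed.
SPACING = 1.0        # arbitrary distance between neighbouring units
HANDOFF_TIME = 1.0   # arbitrary time per hand-off

def propagate(n_units: int):
    """Return (distance, time, speed) after the quantum crosses n_units."""
    distance = n_units * SPACING
    time = n_units * HANDOFF_TIME
    return distance, time, distance / time

for n in (10, 100, 1000):
    d, t, speed = propagate(n)
    print(f"{n} hand-offs: distance={d}, time={t}, speed={speed}")
# The speed is identical in every case - the chain enforces one fixed "c".
```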

Now, for those impatiently awaiting the Creation of the Quanta, we are finally approaching the point where an explanation can ultimately be delivered.

It concerns the various possible modes of the Paving! For, though the form described above will be the default mode of the Substrate Units in quiescent regions, it can very easily be dissociated into its component Units, which, in the consequent free-moving form, are usually termed Photons.

And the same material interlopers that precipitated such a dissociation will also tend to drive them into Streams and even Vortices. And though these modes will usually be temporary, there is one set of circumstances in which that will not be the case.

It is the situation inside all atoms! For there, the dissociating cause is an orbiting electron, which will constantly return, and re-affect the photons and the caused Vortices on every single orbit. The energy will pass from the orbiting electron into the photons and Vortices, and the orbit will consequently decline somewhat, until, at a certain orbit, the flow reverses: the energy will then pass FROM the photons and Vortices BACK into the orbiting electron.

And there will be only a small number of optimum orbits, which display a persisting balance and so become Stable: these are the Quantised Orbits for that atom!
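The balance can be sketched with a toy model (the oscillatory transfer function below is purely hypothetical, chosen only to exhibit the logic): if the net energy gained by the electron per orbit varies with radius, and - as for an attractively bound orbit - gaining energy enlarges the orbit while losing it shrinks the orbit, then only the radii where the net transfer crosses zero from positive to negative are self-correcting, and they form a small, discrete set.

```python
import numpy as np

# Hypothetical net energy gained by the orbiting electron per orbit,
# as a function of orbit radius r. The oscillatory form is an assumption
# made purely to illustrate "balancing-out to stability at particular radii".
def net_transfer(r):
    return np.sin(2.0 * np.pi * r) * np.exp(-0.1 * r)

radii = np.linspace(0.1, 5.0, 5000)
values = net_transfer(radii)

stable = []
for i in range(len(radii) - 1):
    # A stable orbit: the net transfer crosses zero going from + to -.
    # Assuming gaining energy enlarges the orbit, a slightly smaller orbit
    # then gains energy and grows, a slightly larger one loses energy and
    # shrinks - both are pushed back toward the balance radius.
    if values[i] > 0.0 and values[i + 1] <= 0.0:
        stable.append(0.5 * (radii[i] + radii[i + 1]))

print("discrete stable radii (toy model):", [round(r, 2) for r in stable])
# Only a small, discrete set of radii satisfy the balance - the toy
# analogue of the Quantised Orbits described above.
```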





And hence, all transfers of energy changing these orbits will always be between that atom's stable orbits!

It is only in atoms that the specific quanta of energy are determined!