17 June, 2019

Substrate Theory continues... soon!

Image from Alternating Current by Michael C Coldwell

To celebrate its 10-year anniversary, SHAPE Journal's epic series on Jim Schofield's Substrate Theory continues, with the next instalment due to be published in the coming days...

Part 1 - The Lepton Substrates - Issue 65 was published last month: a bumper issue which collects all the relevant papers documenting the evolution of this new physical theory, from the concept of a new aether of "empty photons", to a whole new look at the Standard Model with magnetons made of Taus and Muons, to the possibility of Neutrino-based gravity fields.

Part 2 - Towards the New Physics - Special Issue 66 presents the latest research on this theory, examining how the Universal Substrate can help tackle all the biggest questions in physics today, from the Spacetime Continuum to the Uncertainty Principle, from the Casimir Effect, Redshift, Time Crystals, Superfluids and Dark Matter, to Virtual Particles and the work of Frank Wilczek.

Watch this not-so-empty space...

26 May, 2019

Issue 65: The Lepton Substrates

10 Years of SHAPE Journal presents Substrate Theory vol. 1

SUBSTRATE THEORY I: This issue of SHAPE Journal is the first in a two-part bumper edition on Jim Schofield’s Substrate Theory, curated to mark the 10-year anniversary of this publication, and to finally bring together all of the crucial materials for this burgeoning physics.

Both issues feature photography series Alternating Current by Michael C Coldwell

This is a completely new approach to sub-atomic physics that hypothesizes the existence of an as-yet undetectable heterogeneous material substance, filling all of known space. Unlike its distant cousin, James Clerk Maxwell’s Aether Theory, this new Substrate conception can explain all Quantum phenomena, the anomalies of the Double Slit experiments, Wave/Particle Duality and its own strange elusiveness.

Without wishing to sound hyperbolic, the ideas contained within these issues are nothing less than a revolution in science - a complete rethinking of contemporary physics from the ground up, committing to the scrapheap of scientific progress much of the last century’s detour into a realm which we might call Quantum Ideality.

The near-total dominance of Mathematics in the field has led us further and further away from the material reality we purport to study. The more advanced our technological solutions become, the more convoluted our route to truth, the more self-fulfilling our prophecies.

In Substrate Theory we see a genuine attempt to return physics to materialism, but also to explain materially the many weird and wonderful phenomena we have observed at the Quantum Level. This is no simple return to the halcyon days of simpler classical physics, not a retrograde movement at all, but instead a new gesture towards a truly holistic study of the material universe - a methodology that eschews virtual particles, Quantum Entanglement and all manner of Mathematical constructions, which uniformly fail to explain the material causes of the most basic physical phenomena - gravity, the propagation of light across space, and disembodied magnetic and electrical fields.

Saturation (2012) by Michael C Coldwell

While Substrate Theory certainly has the potential to upturn the entire apple cart of modern physics, the pieces of the puzzle still need assembling, even if all the elements have been devised.

This issue begins that process, collecting together various prior publications on the theory, and weaving them into a coherent narrative. This is a somewhat difficult undertaking, as the ideas are always evolving - the goalposts always moving.

Our editorial solution to this problem is to split the task into two parts.

In the first instalment, The Lepton Substrates, past papers on Substrate Theory are collected together in chronological order, with some new editorial pointers mindful of when certain ideas were first published, at what point in the theory’s evolution certain assertions were made, and of the fact that some of the original ideas have been necessarily superseded, but are still required here to understand the theory in general.

All the foundational arguments and elements of Substrate Theory are explained and discussed here. From the original notion of ‘Empty Photons’ pervading all space, we start to see what these entities might actually be. How pairs of mutually orbiting Lepton particles must be undetectable, but could easily pave ‘empty’ space. What those different Leptons are, and how they can carry quanta of energy in their orbits, passing them bucket-brigade-style across the universe. How quantised orbits may have a material explanation in a dialectical relationship between levels of the Substrate.

Over the course of The Lepton Substrates the full picture begins to form and the unexplained contradictions of Quantum Theory start to fall like dominoes.

Imaginary Units (2012) by Michael C Coldwell

In the second part of this series, Towards the New Physics, we look beyond the nuts and bolts to the potentials, publishing Jim Schofield’s latest writing on Substrate Theory, which examines how the Universal Substrate can help tackle all the biggest questions in physics, from the Spacetime Continuum to the Uncertainty Principle, from the Casimir Effect, Redshift, Time Crystals, Superfluids and Dark Matter, to Virtual Particles and the work of Frank Wilczek.

What becomes increasingly apparent in all this research is that even if this particular model of a Substrate is wrong, the basic premise that one exists is right. It has been the missing premise since 1927 - the invisible elephant in the room of an ever-more esoteric Physics. Substrate Theory explains far too much to be ignored any longer, in a discipline which has all but abandoned explanation for the most obscure of Mathematical games.

23 May, 2019

Substrate Theory!

Jim Schofield's Substrate Theory will be published in the coming days!

Part 1 - The Lepton Substrates is now available here

Dominances and Opposites

Natural laws are perhaps better thought of as natural dominances

How the Holistic nature of reality both maintains Stabilities 
yet also enables Real Qualitative Change, via Emergences

The initial choice, historically, by a Mankind finally beginning to emerge as an intellectual interpreter of both its World, and of its own existence, was naturally to assume the absolute generality of the Principle of Plurality, rather than the seemingly totally opposite, and much more difficult to implement Principle of Holism. It was, at that crucially emergent moment, a completely unavoidable step, if any sort of progress was then to be made.

But Plurality also simplifyingly insisted upon the fixed, or qualitatively-unchanging-nature of all elemental things and laws, whilst Holism instead rested upon the inevitability of precisely such Qualitative Changes in absolutely everything!

NOTE: Of course, Mankind's initial, primitive Reasoning had also, long before that emergent moment, subscribed to the very same premise, as a basis for beginning to cope with Reality.

And, of course, most things certainly didn't seem to change in their apparently-evident, essential natures "all-the-time", but, on the contrary, only in their contributing amounts (their quantities): so the assumption of qualitative constancy was the only place for an initial intellectual study to even begin. Even so, this could account for some "qualitative changes" in certain circumstances, assumed to be caused by many different contributory things (that were themselves merely changing purely-quantitatively), yet still enough for the balance of simultaneous contributions to be changed sufficiently that a different involved component could move into so-called Dominance, and flip the situation over into a different mode.

But, it now finally seems likely that the assumption of a Pluralist World may need to be wholly dispensed with, and the alternative assumption of a Holist World, be imported into Science to replace it, in order to tackle its increasingly numerous anomalies and contradictions, generated by that initial premise.

Now, both of these Basic Principles are clearly enabling simplifications, so a mere switch between them cannot alone solve the many epistemological difficulties that have accrued over many millennia.
Some sort of detailed review of what we mean by these apparent alternatives must be instituted, to reveal just in what ways the real differences arise, and how profound they are.

Plurality, the philosophical assumption underpinning Science, deals solely with qualitatively-fixed relations, which are usually achieved by extensive and detailed farming of given situations, to remove as many simultaneously-acting factors as possible, so as to ideally leave only one, and, thereafter, to continue to hold things "still" - so rigorously, in fact, that only a single relational Law would ideally remain, making its extraction, and subsequent effective use, possible.

Holism rejects this conception for its unavoidable distorting of any thus-found Law, always artificially made into an eternally fixed abstraction, which never, as such, occurs naturally in Reality-as-is! The Holist alternative considers that, in every natural situation, there will always be a reasonably extensive set of different contributory factors, all acting together, and, in spite of this seemingly indefinable complexity, something does end up being delivered as an overall end-result, due to all of them doing what they do, BUT together producing a single outcome. And such results can turn out to be very different in totally unconstrained conditions.

The commonest case, in all school science experiments, was always an "overall law", which normally varied along with its multiple component factors. 

Another, and different case, could be a kind of naturally self-maintaining Stability, within which any variation tending to disturb that Stability would be immediately countered by an opposing reaction of another, having the very same cause - but always returning the situation back to Stability.

And, in any overall resultant Law, considered holistically, factor changes could flip it from its previous Dominant result to the opposite Sub-Dominant outcome - which now becomes the New Dominant.

NOTE: this last point is referred to somewhat early... A long and necessary diversion in establishing a wholly New Stance will, later in this paper, reveal in full how and why this occurs!

So, the usual result, for Mankind, was that conditions would be increasingly controlled, enabling a reasonable approximation to an artificial wholesale Stability to be implemented, to purposely facilitate a given required use: such careful farming of conditions rapidly became the norm, when both revealing and then applying assumed-to-be-eternal Laws: but NOT, it must be emphasized, to stop the Law varying - rather, and mistakenly, to supposedly eliminate the effects of other simultaneously-acting laws cumulatively affecting the result by mere quantitative changes in their various contributions.

And, of course the age-old pragmatic tenet still ruled the roost ultimately, with, "If it works, it is right!"

Enabling Technological Use rather than Increased Understanding totally dominated Science.

So, Science developed with that always-distorting premise - that of Plurality - and, therefore, a necessary Use-Methodology had to be involved, to initially rigidly farm the conditions, to ensure that the targeted Law would be artificially fixed, and couldn't be "variously-affected", as it would be when occurring in Reality-as-is.

[In addition, it was also wrongly assumed that this fixed version also behaved in exactly the same way in Reality-as-is, and that the differences there were simply due to the summation of all the other, also present and similarly fixed, Laws! Whereas, holistically, all simultaneously present Laws actually modified one another: it wasn't just a simple summed Complication!]

It meant, of course, that the Pragmatic Objectives could always be the sole, aimed-for intention, so the consequent Explanatory Theory, derived from the farmed data so-obtained, was therefore wrong - and not just because of the farmed conditions (as explained above), but also, and crucially, because a Pure Form, from strictly Pluralist Mathematics, was then carefully tailored-to-fit that same already distorted data, so that the achieved formal law, as an Equation, was doubly modified: for it was, in addition, idealist, rather than truly materialist!

And, while it was only pragmatic results that were required, the Technology involved could indeed deliver those results, and so the ubiquitous method largely went unquestioned.

Nevertheless, any consequent Theory associated with that equation, would be incorrectly taken as being the required general, natural relation: the Theory, and hence the Explanatory Science so derived would necessarily be incorrect!

And, of course, as physicists delved ever deeper into Reality, the consequent inaccuracies became ever more unavoidable, and increasingly delivered debilitating anomalies in Theory via the assumed premises of Science, which inevitably began to deliver ever more contradictory results.

In addition, the devised solutions, all of which are unavoidably constrained by their basic assumptions, were forced ever deeper into the idealistic hinterlands of Pure Mathematics, in order to seek for the Perfect Forms that could be persuaded to fit!

[Isn't that exactly what Einstein did with Relativity?]

And, that could only lead away from Reality and ever deeper into Ideality, a domain of Pure Form alone, and hence the only true realm of Mathematics - and NOT in the domain of Concrete Reality, which should be the realm of Science!

NOTE: The problem was also that Mathematics had soundly started by merely being a direct Reflection of Reality: so it did contain something of that source. But, it was a distorting mirror because of its in-built and generally-and-intrinsically assumed Plurality.

NOTE: Reality has only three dimensions we can legitimately abstract, while Mathematics can have as many as you like, as long as you maintain between them all the exact "geometric relations" that had previously held for the original three Spatial Dimensions alone - relations which could even be "philosophically" adjusted to allow in the admixture of Probabilities and Wave Theory, as in Copenhagen... it appears!

And, that implies that the extra Dimensions must concretely exist, for the relations carried over from the Spatial Dimensions have to still hold in exactly the same way: and, paradoxically, to also NOT exist in our real space, but unseeably elsewhere, yet maintaining the very same inter-Dimensional geometrical relations, both with the usual real three, and with each other!

Now, as is well known in some areas of academia, ever since Zeno of Elea certain Dichotomous Pairs of contradictory concepts - namely those of Continuity and Discreteness - could lead to nonsense when applied to certain Real World Movements. But his revelations were largely ignored, via the pragmatic excuse of "If it works, it is right!", and it wasn't until Hegel found a solution, 2,300 years later, in the necessary modification of their defining premises, that many of these contradictions were removed.

But, of course, the ongoing problem was Plurality! For, it prohibited qualitative changes from ever occurring in such concepts. Hegel realised that such change had to be included in Reasoning, involving Dialectics as a new Science of Logic.

Now that, of course, is much easier said than done. And it was being attempted by an idealist, exclusively concerned with philosophy as Human Thinking and arguing! Whereas the required changes were even more important - essential in rescuing all the sciences from the very same affliction.

Indeed, how would multi-factor situations actually behave, when affected by all their simultaneous and varying contributions, and even more confusingly, mutually modifying one another, as is the main premise of Holism? Just how far would the actual modifications precipitate one another, into ever more changes? Would it not be an infinite process? NO, it would not!

In Plurality, though properties would be fixed, quantitative changes would indeed be potentially infinite! But, changes, in a Holist World, would NOT be merely quantitative, but also qualitative, and such could be in any direction, so they could be either conducive or contending with respect to others - and in a multi-factor system, delivering both kinds of variation - balances at some points would become inevitable... And, never as a fixed and final result, either!

Indeed, the contending and conducive factors would be delivering a Dynamic Balance, wherein constant changes would still be involved, but changes in one could, in such a balance, be countered by opposing changes in other factors! The same disturbing factor could affect different internal factors in opposite ways.

In other words, self-maintaining Stabilities would be the most likely outcomes, some persisting so long as to appear eternal on our timescale - Stabilities which would require very significant disturbances to be fatally undermined, and which would, in such rare circumstances, ultimately dissociate completely.
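Such a dynamic balance can be caricatured numerically. The sketch below is purely my own illustration, not anything from the theory itself: the balance point, rates and factor names are all made up. It shows the bare mechanism described above - two contending processes, sharing the same cause, where one acts on any shortfall and the other on any excess, so a disturbance is automatically countered and the Stability restores itself.

```python
# A deliberately minimal toy (illustration only - made-up rates and balance
# point) of a "self-maintaining Stability": two contending processes respond
# to the same level, one pushing it up, the other pulling it down, so any
# disturbance is automatically countered.

def step(level, balance_point=10.0, up_rate=0.3, down_rate=0.3):
    """One time-step: the 'up' process acts on any shortfall below the
    balance point, the 'down' process on any excess above it."""
    push_up = up_rate * max(balance_point - level, 0.0)
    push_down = down_rate * max(level - balance_point, 0.0)
    return level + push_up - push_down

level = 14.0  # a sudden disturbance away from the stable balance at 10.0
for _ in range(50):
    level = step(level)

print(round(level, 3))  # the dynamic balance has restored the level
```

Note that nothing here is static: both processes keep acting at every step, and it is precisely their opposition that produces the apparent fixity.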

NOTICE how far we have been transported in this endeavour, away from Pluralist Formal Logic into the dynamism of Holist Reality!

And, even at this most basic level the importance of "opposites" begins to become ever clearer.


The significance of the above ideas was demonstrated to this researcher, when theoretically investigating Pre-Life simultaneous Chemical Reactions, in particularly conducive circumstances on the early Earth.
Interestingly, I did not initially identify any individual chemical processes, with a view to defining-in-detail the ascent to the Origin of Life, via what qualities were involved at each and every step. For, I knew that such a journey would be strewn with unpredictable Emergences, and hence well beyond any theoretical means I could muster to re-construct that long trajectory.

So, I was much more interested in the dynamics involved in a proliferation of different processes, affecting one another, either conducively or contendingly, to produce ever higher levels of sub-systems, systems and super-systems, via such creative events always delivering the entirely New - which would never be delivered by the usual pluralist Chemistry, as no chemical process is ever conceived-of in that way: they are all straightforward processes with logical outcomes confined within Pluralist Science.

So, to consider such situations holistically would therefore involve dumping the usual pluralist methods, along with their severely constrained circumstances, producing only the revelation of single pluralist laws as the required outcomes - and, instead, considering whole collections of many simultaneous processes, all acting together, and producing a variety of products, which then, in turn, immediately become actively involved in the overall system of processes, as a consequently constantly changing mix - changing all the contributing processes, and so, importantly, changing the Context in which all the involved processes are performing, with all of them changing somewhat along with such a varying Context.

Now, you can see why the outcomes of individual processes - all previously carried out separately in individually farmed pluralistic experiments - would certainly never suffice in determining what happens in these holist and multi-factor circumstances! So, the required focus must consider things very differently: primarily as delivering a Classical Dynamic Holistic Interlude (see The Theory of Emergences (2010) by this author), which will usually end up in a temporary, if long-lasting, Balanced Stability, with measurable overall characteristics, which can all be determined experimentally (in the old-fashioned "Equilibrium"-type experiments of the School Lab), and which will involve ongoing inter-process effects, in process-chains or even process-cycles, and with self-selected and evident Dominances, usually caused by the presence of certain resource abundances, due to current local conditions.

But, the most likely change, thereafter, without too much of a variation in the overall Stability, is usually a seemingly dramatic Switch to deliver an Opposite to the prevailing Dominance. And, believe it or not, such a switch can be explained, and with it the significance of Opposites in Holistic Dialectical situations generally.
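The Dominance-and-Opposite idea can also be sketched in code. This is my own toy illustration, assuming nothing about the actual chemistry - the process names, resource labels and abundance numbers are all hypothetical. Two sets of processes run in opposite directions and compete for the same resources; whichever is best supplied by the current local abundances becomes Dominant, and shifting those abundances flips the result to the previously Sub-Dominant Opposite.

```python
# Toy sketch (illustration only - hypothetical processes and resources):
# processes running "forward" (>>) and in the opposite direction (<<)
# compete for shared resources; current local abundances select the Dominant.

def dominant(processes, abundances):
    """Score each process by the scarcest resource it consumes, and
    return the best-supplied one - the current Dominance."""
    def score(p):
        return min(abundances[r] for r in p["needs"])
    return max(processes, key=score)["name"]

processes = [
    {"name": "forward >>", "needs": ["X", "Y"]},
    {"name": "reverse <<", "needs": ["Y", "Z"]},
]

# Abundant X: the forward direction dominates...
print(dominant(processes, {"X": 9, "Y": 5, "Z": 1}))
# ...shift the local abundance to Z, and the Opposite takes over.
print(dominant(processes, {"X": 1, "Y": 5, "Z": 9}))
```

The point of the toy is only that no internal rewiring is needed for the switch: a mere change in the surrounding resource context is enough to hand Dominance to the Opposite.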

Very different this Holist Science malarkey, isn't it!?

Now, the consideration of such a situation and its development will be qualitative rather than quantitative, and systemic rather than productive. For, in such a set of simultaneous processes, they will be not only changing each other's contexts, but also using-up or supplying each other's Resources too.

Connected Worlds - interactive ecosystem installation, New York

So, processes related by the product of one being the resource for another will form Chains, and even occasionally Cycles, while the preponderances of available resources, due to the current location, will also determine which Systems are most successful, or Dominant!

Indeed, there will actually be two sets of processes benefiting from the same abundance.

The first will be those going generally in one particular direction >>

The second will be those generally going in the opposite direction <<

And, with universal competition for the very-same resource, most of both sets will lose out - all except the Dominant one in its direction, and its Opposite (sub dominant) in its contrary direction! 

(Notice there is a seemingly contradictory hierarchy of opposites)


The consequences of this Holistic Approach for current research into the Origin of Life culminated in the Theory of Emergences (2010), which demonstrated once-and-for-all that research to re-create Life from scratch is impossible: and, even much worse, that absolutely NONE of the essential interludes in that development will ever be reproducible either - as they will all be the results of natural Emergences, the outcomes of which will always be impossible to predict!

A bit of a problem for science, where falsifiable predictions are so vital for proofs!

Capitalism Hits Home: A Movement for Women

Good stuff from Harriet Fraad on the women's movement and capitalism. 

Accumulation by Dispossession

How Modern Capitalism Works: David Harvey strikes again!

27 April, 2019

10 Years of SHAPE Journal

Substrate Theory of Physics

Coming May and June - two new Special Issues of SHAPE Journal

a definitive guide to Jim Schofield's Substrate Theory of Physics

To mark 10 years of the journal, SHAPE will be publishing two Special Issues on Substrate Theory, a definitive collection of papers on this new model of physics.

Also in the pipeline for this summer is a new documentary film on Revolution!

Watch this SHAPE.

Here is the first part!

26 April, 2019

Copenhagen is Wrong!

The house of cards that is Quantum Theory is really starting to fall...

And now Lee Smolin is on-side.

In a recent Perimeter Institute Lecture, Smolin delivered the trenchant view that the Copenhagen Interpretation of Quantum Theory is in fact wrong, and I agree with him!

However, while his Realist arguments were indeed correct, and Copenhagen Theory is totally Idealist, that, I'm afraid, will never be enough.

For, in spite of Hegel's profound and transforming criticisms of the Plurality of both Formal Logic and its use in Science, almost 200 years later his improvements have still rarely been applied to either. The Copenhagenists will never relinquish their theory, for they don't even know why it arose, due to profound errors in its premises, and also because in all the circumstances in which they use it - it certainly does work - pragmatically! But, even then, it also never explains why.

For Explanation, as the primary purpose of Physics, has now been totally abandoned. Instead, this so-called Theory gives the right quantitative answers, in highly-constrained circumstances only. And, in doing that, it is entirely consistent with what Modern Physics has now become - an extension of Mathematics.

For, henceforth, it can never explain why things happen the way that they do, but can only "match" how they work, purely pragmatically. It is content to be only technologically-useful, and hence explains absolutely nothing!

Now, of course, its supporters would all totally disagree with this, because of the pseudo-philosophical inventions that the originators inserted, in order to make it look like an "Explanatory Theory" - namely, via their cobbled-together Equation, which imports illegitimate probabilities into a basic Wave Equation. And then replaces all Particles, at what they term "The Quantum Level", with Wave/Particle Duality - delivering the alternative of using the wave-like Equation until the entity suddenly became, once more, a discrete Particle, so that they could term the transition "The Collapse of the Wave Equation", when it resumed its Particle Form.

It is, of course, NO kind of legitimate Scientific Theory: it is a clever fix to get around contradictory behaviours that their prior premises just could not cope with. And that was indeed the case: the prior theories just didn't work at these levels: they were inadequate, it is true!

But, literally all the theories of the past were also inadequate, and they were re-investigated to review their assumed premises, usually, in the end, finally arriving at something better! By the early 20th Century, however, a whole series of dramatic and debilitating results had begun to be revealed that undermined the prior assumptions.

The Michelson/Morley Experiment had dismissed the existence of the Aether - the Universal Substrate supposedly filling all of Empty Space.

And, Henri Poincaré and Ernst Mach, finding ever more physical premises that seemed to be wrong, proposed Empirio Criticism, a form of Positivism, as an Amalgam of Explanations and Mathematical Equations, which together, they believed, could deliver what was needed.

The Piste was set: and it was all downhill from there to finally dumping Explanations altogether!

For, nobody demurred at the conclusions from the Michelson/Morley Experiment, so Wave-like phenomena, at the Sub Atomic Level, could no longer be ascribed to that now dispensed-with medium: instead, somehow, such effects had to be embedded into what Particles actually were themselves!

The key experiments were those involving Double Slits, which became the touchstones for many of the new discoveries. But, the Wave/Particle tricks which were instituted to explain the many anomalies are very easily removed by assuming an undetectable Universal Substrate in those experiments, and Substrate Theory immediately dispenses with every single one of them.

Now, though obviously insufficient by itself, this theoretical exercise could not be ignored! For, it demonstrated that classical Waves were somehow involved: the Wave/Particle Duality invention was a frig! And distinct Waves and Particles were, somehow, still intrinsically involved.

Two parallel lines of theoretical research were undertaken.

The First was to investigate the possibility of such an Undetectable Universal Substrate - composed entirely of pairs of mutually-orbiting Leptons of diametrically opposite properties - thus delivering the required passive-undetectability, while at the same time allowing that Substrate to be affected-by those interlopers, while also delivering the subsequent affecting-of those very same entities, in differing, later circumstances.

It must be stressed that the objective here was a theory-first investigation: just as James Clerk Maxwell had used in his Analogistic Model of the Aether, which was the Basis for his still universally renowned Electromagnetic Equations!

[Indeed, there is the very sound point, also, that theory-first investigations actually avoid the inevitable pluralistic aberrations of all data-first investigations, which use the now-standard Scientific Experimental Method of directly constrained and maintained contexts.]

And, as Frank Wilczek has recently insisted upon with "The Materiality of the Vacuum", the above theoretical Analogistic Model is not without foundation, even if his composition of that "undetectable Universal Substrate" differs from that used here.

James Clerk Maxwell's Analogistic Model, has long gone, BUT its use was justified by its delivery of the Electromagnetic Equations!

Now, the Second line of theoretical research was somewhat akin to Maxwell's - by assuming such a substrate with the New Model's composition, could all the anomalies that precipitated Copenhagen be physically explained instead?

Now, to achieve this objective, the initial simple definition in terms of mutually-orbiting Electron/Positron pairs had to be extended to also involve Units composed of Taus and Muons, and even Neutrinos; but a rich and successful Substrate of Leptons was devised, and the inventions of Copenhagen physically replaced!

BUT, of course, just as with Maxwell's Aether Model, the detailed content of this Analogistic Model of the Universal Substrate will, indeed, one day be wholly replaced too!

This researcher, like Maxwell, knows very well that all our theories are never the Absolute Truth, but at best, contain sufficient Objective Content to deliver more than what they replace.

Criticisms of the New Model are not only legitimate, but absolutely necessary. Yet, criticisms that exist only in order to re-establish Copenhagen are most certainly NOT! Copenhagen is a dead-end, and new ideas and models are now required to replace it and push Physics into new territory.

Implicit in the new approach was a root and branch critique of the premises, and even basic amalgamated philosophical stances, underlying modern physics. Succinctly, Copenhagen is Pluralist, while Reality is Holist!

And Science should be materialist, while Copenhagen is certainly idealist!

By all means improve upon the New Physics, but leave Copenhagen where it deserves to be - Dead and Buried!


For more on burying the Copenhagen Interpretation, please read the Special Issues of SHAPE Journal above, and look out for our forthcoming 10 Year Anniversary Issues on Substrate Theory.

24 April, 2019

Current Praxis: The huge gap between theory and practice

Theory and Practice: To Serve and to Organise

As a long-time active Socialist, and latterly Marxist Theorist, I am acutely aware of the gap between my extensive efforts on the web and their almost total lack of connection with the on-the-street organisations of the disadvantaged - and this is clearly a general problem.

Now, David Harvey is surely not only the leading Marxist theorist living today, but one whose Internet offerings have become extremely widely read. He is nevertheless concerned about his own lack of connection with organisers on-the-street, and has put up two interviews with Chris Caruso, in his excellent Anti-Capitalist Chronicles (out of Democracy at Work), which address these precise questions.

As everyone can see, naturally emerging protest demonstrations and even loose organisations are arising with crucial agendas all the time - the most important one currently being the Yellow Jacket Movement in France.

But too many of such occurrences don't last: single issues, no matter how important, cannot survive if they don't link up with others to enrich the content, capabilities, understanding and fraternal, social strength of their efforts.

The interviews carried out by David Harvey are crucial, and one, which is about this very problem - the linking of internet-based propagation and help with the exciting developments in the streets and localities - is included here to introduce them to you.

Perhaps we can help too - either here on SHAPE, or by delivering your questions and concerns to places like the Anti-Capitalist Chronicles.

Contact us privately via email:

or leave a comment under this post.

14 April, 2019

Special Issue 64: The Limits of Mathematics

This edition deals with the various limitations of Mathematics from a variety of different scientific and philosophic angles, and features a fantastic guest paper by Abdul Malek, a Theoretical Physicist and Dialectician from Montreal, Canada.

It has taken me many decades to realise quite how limited Mathematics really is. I have the advantage of having been a gifted mathematician long before I switched to Physics. I made that significant change because Mathematics is a purely descriptive abstract discipline, of a very special type, and I wanted to really understand things rather than merely describe them in abstract form.

Unfortunately, as we shall see, Physics has become little more than an extension of Idealist Mathematics. Physics was converted into a Pluralist Science of Stabilities: and one driven idealistically by Purely Formal Laws.

No wonder it is in an untranscendable terminal impasse as a Science! Indeed, we can legitimately go a great deal further, and insist that it no longer investigates Reality-as-is, but can only deliver a distorted formal reflection of that World: it has become an investigation of Ideality - the infinite World of Pure Forms alone: the Abstract Realm of Mathematics.

In short, Physics can only be saved via a wholesale rethinking of Mathematics and how we use it.


13 April, 2019

Frank Wilczek and the Universal Substrate

Artwork from Michael C Coldwell's Alternating Current series

Coming May and June - two new Special Issues of SHAPE Journal
a definitive guide to Jim Schofield's Substrate Theory of Physics

In an Origins Project lecture at Arizona State University, Frank Wilczek gave a contribution on the Materiality of Space (see below for video).

What was remarkable was that much of what he had to say resonated, very markedly indeed, with my own ideas based upon the concept of an undetectable Universal Substrate (the hidden materiality of the vacuum) - while nevertheless coming from a very different place, namely the more usually accepted consensus positions of today's Sub Atomic Physics.

Indeed, the last paper I wrote was also concerning Wilczek's work, and his supportive ambitions for the Large Hadron Collider in 2010, which I'm afraid I dismissed as a total myth.

However, this lecture has dramatically altered my assessment of him, both as a scientist and, indeed, as a philosopher. By alternate, indeed diametrically different, means he has arrived at conclusions very similar to those I postulate, and this delivers a very different slant upon valid pathways towards the Truth that we, as physicists, always seek!

Indeed, the situation delivered far more than that: for Wilczek was introduced by, and afterwards disagreed with, Lawrence Krauss, who, though seemingly arguing from the same theoretical stance, demonstrated that this apparently identical basis was in fact diametrically opposite in several extremely important premises.

For Wilczek is a physicist, while Krauss is, at heart, a mathematician!

And, as became clear, Wilczek and I, though arriving at very similar positions on Empty Space (he even mentions the word "substrate"), got there by very different means and from very different sources for our theories. And the subsequent presence and disagreements of Krauss also confirmed that his differences, in spite of his working in the very same areas as Wilczek, put him in a very different position indeed.

Krauss is an idealist, whereas Wilczek is actually a materialist.

Now, by far the more important revelation for me was the possibility of arriving at similar conclusions from very different experimental evidence and theoretical bases. It clearly confirmed both for myself, and for him, that we, as scientists, did not either seek or expect to find Absolute Truth, but, on the contrary, what I term Objective Content - that is aspects or parts of that never-to-be-reached Absolute Truth, but which supply the best view of Reality we currently have: and which would always be open to improvement by new Objective Content, if it proved to be closer to that unobtainable objective.

In addition, Wilczek made absolutely clear what counts as a legitimate theory in terms of such Objective Content, citing, as I often do, James Clerk Maxwell's Aether - a fictional Analogistic Model composed of Vortices and Electrical Particles, from which Maxwell directly derived his Electromagnetic Equations - forms with enough Objective Content that we still successfully use them today.

And, this also says something quite profound, and generally not understood, about how equations are derived.

For, most equations are what I term Pluralistic Equations, derived initially from intensely pluralistically-farmed experiments, and thereafter wedded to Pure Equations from Mathematics by adjusting the Equation's constants to make them fit. And, that is very different indeed from Maxwell's Holistic derivation of an equation direct from a Physical Explanatory Theory.

Indeed, elsewhere, and at another time, working with the mathematician Jagan Gomatam, I was able to use equations he had developed directly from theory, concerning the beating of the Human Heart, which, in contrast to equations derived merely from experimental data, were actually able to demonstrate both Fibrillations and Heart Attacks.

But how many modern-day physicists do things that way round, and thereby actually know why it gets closer to the Truth?

Now, Wilczek certainly doesn't define Empty Space as I do - filled with an undetectable Universal Substrate of Leptons. But, he does insist that Empty Space is filled with something material.

His current model uses Quantum Fluctuations, but both theories are identical functionally in how they explain both Pair Productions and Pair Annihilations: and crucially Wilczek clearly admits to having the same stance upon the necessity of such currently-valid Analogistic Models!

Now, as to where Wilczek and this theorist differ: it is certainly in exactly what materiality actually fills the vacuum - a materiality both affected by what happens to it, and in turn affecting, through those effects, the things contained within it. With literally nothing but directly undetectable Quantum Fluctuations, we can commend any attempt for such a theory to directly determine subsequently arrived-at formulae, but at the same time it is almost impossible to theorise as to what that form is likely to be.

In contrast, with this theorist's known Universal Substrate Units, both aspects can be adequately and correctly carried through to completion - that is, to a full-detail Analogistic Model (à la Maxwell) from which to generate the necessary Equations, as Maxwell did from his Model of the Aether.

There is much more in Wilczek's lecture than I have dealt with here. Some of his philosophical points are particularly powerful...

Clearly, the replacement of Quantum Fluctuations by my Analogistic Model of the Universal Substrate has yet to be achieved.

But the stance is right!

11 April, 2019

A Mirror of Reality

Reflection photograph by Michael C Coldwell

A Mirror of Reality at the Quantum Level?

Throughout the history of science, attempts at explaining things correctly have been unavoidably stymied by what we, the human interpreters, actually have access to, and by how we interpret that knowledge.

For example, there isn't, nor could there be, any intrinsic human capability for addressing such questions - for Mankind was, initially at least, merely a clever ape, which for over 97% of its existence as Homo sapiens never got beyond the purely pragmatic tenet of "If it works, it is right!" as its only "intellectual" tool. Indeed, all of Mankind's congenital capabilities were selected for by Evolution alone, and therefore determined solely by Darwinian Natural Selection - involving just those capabilities enabling the species' overall survival and effective reproduction. Everything else has been only very recently attained - entirely socially - within the last 3% of Mankind's total existence, and could never be based upon the Full and Real determining Truth of the situation, for that wasn't available then, and still isn't now!

How on earth could this species of ape actually access such things? Only very slowly did they invent even a subset of the necessary words, and that only over the last 1% of their existence; and, as the History of Human Thinking since then has shown, every single gain has been, at its very best, approximate, and certainly never wholly sufficient. Nevertheless, though the bulk of their socially-created language has always been exclusively descriptive, attempts at Explanation have been gradually improving, especially since the advent of Science.

But, the engine of Explanation has, unavoidably, always been Description. They could only start with Analogy!

For, though it does NOT deliver why things behave the way that they do, it does deliver how things behave - and in very different contexts, which can at least begin to move the task towards common or similar causes.

Even thereafter, they could only proceed with natural and evidently-connected sequences of events. But, the actual reasons, or causes, for those connections were not usually evident.

So, in the early stages, such conceived-of causes were initially invented! And, it was only with the advent of a scientific search for actual, physical causes, that the process could be improved beyond the supernatural and the purely speculative.

Now, this contribution is evidently NOT an adequate treatise upon such questions, though they have been, and will continue to be, addressed fully elsewhere.

But, the above few points were clearly going to be indispensable here, if only to demolish the myth, that we already have all we need to Understand Reality: we are still a long, long way from that!

After all, it took almost 2,300 years for the more significant of the errors initiated by the Ancient Greeks, to at last be addressed by the German Philosopher Hegel. And, we still have, a further 200 years later, to comprehensively extend those crucial contributions to materialist Science - for they were in Hegel's hands entirely idealist!

So, in this paper, I will limit my objectives to a celebration, as well as a critique, of a certain PBS Space Time release on YouTube, which, I believe, shows where we are at in Modern Sub Atomic Physics at the present time!

Its topic is Virtual Particles.

And, it is remarkable how both that idea, and the alternative one that I have been pursuing (an undetectable Universal Substrate), perhaps surprisingly, actually appear to resonate-analogistically with each other, as attempted explanations of Reality at The Quantum Level!

First, the presenter tells of phantom particles appearing and disappearing in Space "literally in-and-out of nowhere" - the famous cases of Pair Productions and Pair Annihilations, involving one Electron and one Positron, present perhaps the best examples.

Now, elsewhere, similar virtual matter and antimatter pairs are also said to be created out of nothing, by "cheating the Universe", achieved by borrowing sufficient energy to do this, and paying it back by their almost immediate annihilation! And the Source for the energy required?

"It is the invisible Quantum Field!"

And also, near Black Holes, virtual matter-and-antimatter pairs of units are said to be split by the surrounding Event Horizon, leaving one IN and the other OUT, and consequently, over time, delivering appreciable Hawking Radiation.

But my own alternative explanation for the former case, assuming an undetectable Universal Substrate, involves, as a crucial part of that Substrate, an undetectable joint-Unit, produced by the mutual orbiting of the very same two sub-particles as are considered above. Though these can absorb energy by the promotion of their inner orbit, too much energy will dissociate the union, delivering the two particles free once again. Also, as part of that same stance, an appropriate encounter between two such free-moving potential partners - of those same kinds - could cause their joint capture into a mutually-orbiting pair, which would therefore become undetectable - apart, of course, from its effect as an energy-supplying Photon.

Indeed, all the Units of the undetectable Universal Substrate are conceived-of in that same, mutually-orbiting-pairs form, so energy can be internally held, and so will be generally available throughout the Substrate, from the promoted orbits of all such Units.

With such ideas, many problems consequently vanish!
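The dissociation-and-recapture mechanism described above can be caricatured numerically. The following is only my toy sketch, not the author's formalism: the threshold value and class names are assumptions for illustration (1.022 MeV is simply twice the electron rest energy, the familiar pair-production threshold).

```python
THRESHOLD = 1.022  # MeV - roughly two electron rest masses; purely illustrative

class SubstrateUnit:
    """A bound electron-positron pair: undetectable while bound (hypothetical model)."""

    def __init__(self):
        self.internal_energy = 0.0  # energy held in the promoted inner orbit
        self.bound = True

    def absorb(self, energy):
        """Promote the inner orbit; dissociate the pair if the threshold is crossed."""
        self.internal_energy += energy
        if self.internal_energy >= THRESHOLD:
            self.bound = False  # "pair production": two free, detectable particles
        return self.bound

    def recombine(self):
        """Re-bind a free pair, releasing its stored energy as a photon-like quantum."""
        released = self.internal_energy
        self.internal_energy = 0.0
        self.bound = True  # "pair annihilation": the unit vanishes back into the Substrate
        return released

unit = SubstrateUnit()
unit.absorb(0.5)   # sub-threshold: the energy is merely stored, nothing is seen
unit.absorb(0.6)   # cumulative energy crosses the threshold - the pair appears
print("bound after 1.1 MeV:", unit.bound)           # dissociated
print("released on re-binding:", unit.recombine())  # ~1.1 MeV handed back
```

The point of the sketch is only that "creation from nothing" becomes, in this picture, an ordinary threshold process on an already-present (if undetectable) unit.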

And, with regard to the latter case, the suggested undetectable Universal Substrate will be absolutely Everywhere, and will both be affected by, and itself-affect the situations it encounters, including majorly transforming ones, where Substrate perturbations will cause all sorts of very different structural Phases, along with their differing consequent Effects.

E C Stoner Building reflected by Michael C Coldwell

Now, the main purpose of this paper is to compare Virtual Particles (particularly as described in the video above) with the Units of a suggested undetectable Universal Substrate. 

For, the video's presenter describes Virtual Particles as - not being physical, but, instead, being our simplified and idealised mathematical representation of the quantum mechanical behaviour of Fields.

This is clearly the crux!

For, as physicists, we always have to explain things physically. The clue is in the name!

And the Universal Substrate as defined by this theoretical physicist is entirely physical. The natures of its Units are such as to actually, physically supply Fields as useable energy, both held within, and delivered from, various structural re-organisations of the Substrate's mutually-orbiting-pair type Units. Though these Units - all of them such mutually-orbiting pairs of exactly opposite matter and antimatter Lepton sub-units - offer, either individually or over local populations, no obvious means of passive detection, they nevertheless both affect, and are affected by, conducive interlopers within their various different physical Phases or "Fields".


Now, the problem for consensus physicists has always been the clear existence of Wave-like effects when no Substrate capable of producing them is considered to be present.

The infamous Double Slit phenomena caused by, say, moving particles seems to be totally inexplicable.

So, particles were given Wave/Particle Duality to explain such phenomena.

But clearly, another alternative could be to re-instate a Substrate, like the Aether, but for it to be wholly undetectable due to its unique, though still entirely material, composition. 

And, such a Thought Experiment was conducted, and surprisingly solved all the various anomalies of the full set of Double Slit Experiments. Undetectable or not, it would still both affect situations, and itself be affected by occurring phenomena within it.

But, physicists rather liked Totally Empty Space! It greatly simplified, and even made possible, all kinds of experiments - for attaining a vacuum, which was eminently possible, seemingly "delivered" Totally Empty Space too. The presence of such a Substrate, especially as it wasn't detectable, would greatly complicate ALL experiments! For all the usual perturbations, as with other, detectable substrates, would occur here too.

And, in addition, the initial assumption of Plurality, at the very beginning of Mankind's intellectual concepts, had forced the absolutely essential, pragmatic farming of experimental situations: greatly simplifying, as well as selecting for, a particular targeted context with a single dominant factor, which would both clearly display, and then allow the extraction of, the sought-for relation. And this was best achieved by pragmatists, who had learned how to do it effectively over a couple of millennia.

The theoretical physicists thus left it to their experimental colleagues to achieve the appropriate conditions, and sometimes even to extract the necessary data! Only then did the theoreticians move in, armed increasingly with their "solve-all" discipline, Mathematics, to find a form which they could fit up to the acquired data.

So, with generations of such processes of simplification and idealisation, no-one wanted to reverse direction, and have to holistically juggle with multiple simultaneous varying factors, which had prevented development so completely in the distant past.

And finally, this technique had been justified by the assumption of the Principle of Plurality, which made the so-extracted relation into an eternal Natural Law - which isn't ever true!

Plurality may hold in Ideality, but never in unfettered Reality.

There are also many fundamental areas of Reality, which are still totally unexplained, particularly to do with Charge, Direction and Energy in Fields!

Now, the ever-present, yet never-explained properties of Attraction and Repulsion (usually linked to Charge) are clearly the major problem, for both my alternative explanations, and those based upon Virtual Particles.

They must attempt to provide the bases for a substitute to those non-physical, entirely-formal descriptions, at the very heart of the whole Copenhagen Interpretation of Quantum Theory.

For, that is a very old trick, indeed, and uses not a single causal explanation, but, instead, a whole range of probabilities, including counter-intuitive cases, to smuggle-in outcomes as selections from that range.

NOTE: A related argument is often proffered to counter supposed direction in the Evolution of Living Things - purely random damage to Genes, certain cases of which, counter-intuitively and by chance, lead to development.

NOTE 2: To counter such "fixes" requires a philosophical discourse upon the opposing Principles of Plurality and Holism, which has been exhaustively pursued elsewhere, but would deflect us here from a more reachable and understandable, yet important objective for this paper.

Now, I will not pretend to be able to fully explain Attraction and Repulsion. But, once given an evident Force and its clear Direction (made evident by its affecting of a given entity), I will deliver a full, detailed Field, composed of physical particles, every single one containing exactly the correct amount of energy, along with its direction, sufficient to power the Field Effect at that point upon the affected interloper. And absolutely nothing will be taken from either the usually-supposed cause, or from the affected recipient: they will both be totally unaffected, in their prior properties, by the actions of the Field! So the active agent in establishing the Field, and in supplying all the requisite energy and its necessary direction, will be the Units of the Universal Substrate alone.

Now, we must compare this with the Quantum Mechanical "explanation" supplied here as the consensus alternative, by this video.

Let us also attempt to deliver that alternative.

It is very different!

It involves an infinite number of possible amounts and directions, present literally everywhere in the assumed Field - even simultaneously present at every single individual position. This set includes every single possible option, including both Directions; but, unlike this alternative Substrate version, the Copenhagen versions have no physical container, nor are they specific: they are instead an immaterial infinite set - present everywhere!

And this appears to be an underlying vibrational(?) set of possibilities throughout the Quantum Field.

BUT, a real Physical Explanation can never really be even attempted: the best that can be delivered is a description of a kind of parallel universe, in purely mathematical forms!

In abandoning Explanation, these theoreticians are also abandoning Reality, for a parallel, merely-reflected world of Ideality - the realm of Pure Forms and absolutely nothing else.

Reflections and photography Michael C Coldwell 2019
Reflected World of Pure Forms by Michael C Coldwell

They can use their Mathematics, along with pragmatism - based upon experience - to deliver usable predictions, without any idea of what is actually going on, and why!

This is termed Technology! Science must attempt to actually explain phenomena.

In working with Mathematics, they are exploring the truly infinite world of Forms available in Ideality, hoping to find appropriate patterns for everything that occurs in Concrete Reality. But, of course, that is impossible, as Reality is holist and consists of many sets of simultaneous factors all acting together, and influencing one another, in many different situations.

But, Physical investigations of these can be, at least partially, uncovered - that is what real investigative experiments are for!

In Ideality, you can't possibly know which of them: so you substitute, mathematically, all possibilities and hope, by a very different kind of experiment, to get enough multi-possible sets to pragmatically confirm, in each case, a particular probabilistic formal model.

But it will deliver useable Predictions ONLY.

It is, of course, an admission of Defeat for their chosen version of "Physics", and will only be ousted by the Creation of a Holist Physics to replace the dead-theoretical-end of current Pluralist Physics.

This article has now been published in SHAPE Journal, Special Issue 64

06 April, 2019

21st Century Marxism: The New Philosophy of Science

With the major Financial Crisis of 2008, its still-present consequences, and the clearly evident incapability of the current system to address them, the still-remaining inadequacies of current Marxism simply have to be addressed and resolved too - if this slump, like the last one, is not also to lead inexorably into another World War.

For the evident crises in the USA, the UK, and even a once seemingly buoyant China, along with the continuing Middle East wars, all appear irresolvable - merely replacing one conflict with another, while continuing to concentrate ever more Wealth and Power into the hands of extremely small Capitalist Elites.

Yet the essential theoretical re-equipping of the World Working Class currently falls far short of what is necessary to address these situations. For, in spite of the recent, long-delayed extensions of Marxism to cover the now enormous role of Debt worldwide, the absolutely crucial further development of Marxism to deal effectively with Science in general has still not even been adequately addressed, never mind achieved!

The proof of this is very clearly demonstrated in the still-undefeated Copenhagen Interpretation of Quantum Theory in Sub Atomic Physics; in the ever-wilder speculations in Cosmology; and even in the drifting of the Life Sciences away from the standard established by Charles Darwin - importing the wrong turnings of Physics into Genetics and many other areas, away from any possibility of a true Dialectical Approach, and into the common dead ends of the current, totally pluralist approaches.

Indeed, the most debilitating decline has established itself, most damagingly, in Marxist Philosophy, wherein, not only have theorists abandoned applying Marx's methods to the Sciences, but have also even rejected that task conclusively, with a conscious, and openly-admitted return to Hegelian Idealist Dialectics.

It is yet another repetition of an oft-resorted-to retreat, wherein the still-unconquered areas in Science are assumed to be impossible in their current state, so that a return to Idealism is considered the only way to the absolutely necessary re-equipment of The Method. That, most certainly, is NOT the way to do it!

But consider how long it took Marx himself to deal with Capitalist Economics. How much more enormous a task do you think it is to completely recast the full range of Sciences - after over two millennia of Greek Plurality - into an as-yet still far from complete Dialectical Materialist Revolution in scientific method?

Indeed, what Marxism has always required, in order to deal effectively with that enormous range, has been the successful dedication-to, and adequate developments-of that approach, to also re-equip it generally for the problems we face today.

I have been a professional physicist for almost 60 years, but it took extensive excursions into Mathematics and Computing, a long period of inter-disciplinary research into subjects as far apart as Dance and Engineering, Biology and computer-controlled test-rigs, followed by a decade of intensive study of Philosophy, to finally be in a position to deal effectively with Copenhagen.

It wasn't a return to Idealist Dialectics that was needed, but a real Revolution in Materialist Dialectics.

This vital turn is now complete.

ResearchGate now features my book on this subject: The Real Philosophy of Science

Read it!

18 March, 2019

Issue 64: The Holistic Universe

In this bumper edition we collect together the most important cosmological writings of Marxist philosopher and physicist, Jim Schofield: his work on the nature of the Universe.

In his ongoing application of Dialectical Materialism to the many disciplines of science, Jim has increasingly turned to Holism as the answer to the persistent crisis in Physics. But this ancient philosophical stance isn’t what most people think of when they hear the term “holistic” science.

Reclaiming Holism

Much like the rampant misuse of the word "quantum" by quacks and snake-oil salesmen the world over, the word "holistic" has been dragged through the dirt for several decades, becoming synonymous with the worst kind of pseudoscientific drivel - in the minds of many scientists, and in the popular consciousness too - particularly when it is applied to the field of medicine.

For the team at SHAPE Journal, it is high time this vital word was reclaimed for those who use it in deadly earnest. While holism is often used by some as an excuse to abandon analysis and scientific rigour in favour of some questionable belief system - the rational always subtended by the spiritual - the philosophical concept itself implies no such thing.

The dictionary definition of the term doesn’t suggest this either.

Holistic is posited as the antonym of 'atomistic': the study of wholes rather than parts, or an acknowledgment that parts cannot be understood without reference to the whole (and vice versa) - that contexts, and the changing relations between entities, are as important as the entities themselves. That holism is oppositional to reductionism doesn't entail an abandonment of analysis, but a crucial acknowledgment of what analysis actually is: the limitations of all analyses, and the necessity of examining the real material contexts in which any findings occur.

To really understand what Holism is, it is important to understand it in terms of its opposite, the currently dominant Principle of Plurality, and Jim spends much time explicating the differences between these two philosophical approaches. As he states in What is Holism:

“Plurality saw Reality as being determined by a set of eternal Natural Laws, which simply summed in various mixes and proportions to deliver everything that there is. The task of studying Reality (in all its diverse forms), therefore, had to be to reveal what these Laws were, and any means that could be used to reveal them more clearly was considered a legitimate method for finding such clearly defining things. For, as they were eternal, they could not be changed by context. So, if the context was significantly adjusted to most clearly display a given Law, that would in no way change the sought-for Law. Context would still determine what was seen normally, but merely due to the summing of a set of eternal laws in a given set of proportions.”

This is contrasted with Holism in the same paper:

“This was most carefully defined by The Buddha in India, about the same time as Plurality was being revealed in Greece. And, in a nutshell, it was defined as, “Everything affects everything else” or alternatively as, “Everything is always in constant change!” You can see how very different this premise made the process of understanding Reality. Instead of the pluralist assumption of the addition of FIXED things, there was instead the holist assumption of the mutually-affecting combination of easily changeable and hence constantly CHANGING components.”

Holism is most vitally different in how it sees time rather than space - it's not just about looking at wholes rather than parts, but about studying changing properties, rather than the assumption and manipulation of fixed laws that we see in all the sciences. Hopefully you can begin to see the relevance of this to Dialectical Materialism and to our understanding of the evolution of the Universe.

The tendency in Physics is to assume that the laws controlling the Universe have always been the same, but there is no evidential reason to assume this - the flaw is an unspoken philosophical assumption - and it has led to a very skewed view of Cosmology.

This set of essays begins the task of looking at the Universe and its history holistically - the Universe as an interconnected and evolving Everything.

Mick Schofield
SHAPE Editor

16 March, 2019

The Casimir Effect and Substrate Theory

Explaining "vacuum fluctuations" without quantum field theory

"Any medium supporting oscillations has an analogue of the Casimir effect. For example, beads on a string[3][4] as well as plates submerged in noisy water[5] or gas[6] illustrate the Casimir force" 

(my italics)

The quotation above is significant, even if it is just from Wikipedia! It allows us to consider a very different explanation to the consensus one usually adopted for the actual Casimir Effect, and it allows us to compare them.

The Casimir Effect (between two conducting plates in a vacuum) presents an excellent phenomenon for contrasting the Copenhagen Interpretation of Quantum Theory with a new alternative account - one suggesting the effects of an undetectable Universal Substrate composed of units consisting of mutually-orbiting pairs of Leptons (Substrate Theory) - which replicates the idea in the quote in the exact circumstances of the actual Casimir Effect.

Clearly, that quote makes such a comparison absolutely necessary, for it immediately suggests an undetectable medium (though perhaps an extremely fine-structured one) as potentially delivering exactly what we observe, rather than QT's disembodied "vacuum fluctuations".

If composed of appropriate Leptons, these joint-units could be completely undetectable (cancelling-out all observable effects), while delivering the necessary properties of such a medium, and possibly also being capable of the propagation of electromagnetic energy, and fluctuations required to deliver the observed Casimir Force.

Such an invisible and connected medium has been fully theorised by this researcher. Termed a Paving, and formed from Neutritrons (units composed of the mutual orbiting of two Leptons - one Electron and one Positron), it delivered significant suggestions: in spite of the overall neutrality of such joint-units, on very close approach they could produce an oscillating effect of alternating attractions and repulsions, created entirely by cross-influences between the sub-units in adjacent Neutritrons. These would loosely link the joint-units together to form that Paving, with the involved units constantly oscillating about equally-spaced positions - thus enabling a means of EM propagation, via the demotion of energy from the orbit of one unit and its promotion to the orbit of the next, immediately adjacent unit. This delivers a bucket-brigade transfer, consequently propagating a quantum of energy at a fixed speed - giving us c.

A Neutritron

Now, if such an undetectable Substrate permeated the universe, especially as it is composed of oscillating units, it could also be a real alternative to the so-called Quantum Field of empty space. It would, for example, be capable not only of propagating energy, but also of holding and delivering it in appropriately conducive contexts. And the point about the Paving also shows how, at tiny separations, similar linkages with the orbiting Electrons and relatively static nuclei in the atoms of a sheet of conducting material would also be possible, in the same sort of way.

Now before the vast majority of Physics academics succumb to damaging heart attacks, may I inform them of the alternative explanation of Quantized Electron orbits in atoms?

As soon as even the remote possibility of an undetectable Universal Substrate was suggested, its necessary composition and consequent properties were required. Particularly as the sole composition by Neutritrons had already been able to remove every single one of the anomalies of the full set of Double Slit Experiments, without any recourse whatsoever to the Copenhagen Interpretation of Quantum Theory.

And an extension of the theory of a Universal Substrate composed only of Neutritrons immediately revealed that the devised Paving was by no means a stable form. Fairly low applied energies would dissociate it into individual units, which could thereafter act like a released random gas, or be driven by moving energetic interlopers into streams, or even into vortices. Though such vortices would usually be temporary, that would not be the case when caused by orbiting Electrons: the orbits would cause the Electrons to constantly traverse the very same route, so the vortices could be maintained by the returning Electrons. And, remarkably, energy could also be passed back to the orbiting Electrons by these vortices. For the overall energy available, only certain orbits would be possible: a physical explanation for quantization.
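As a hedged analogy only (this is the standard Bohr-de Broglie standing-wave argument, not the Substrate vortex mechanism itself), the sketch below shows how a physical constraint selects a discrete ladder of orbits: an orbit is allowed only when a whole number of electron wavelengths fits around it, which for hydrogen gives radii r_n = n^2 * a0.

```python
# Standard physics analogy: a continuous family of orbits is reduced to a
# discrete set by a physical fitting condition. For the Bohr model of
# hydrogen, 2*pi*r = n*lambda leads to r_n = n^2 * a0.

A0 = 5.29177210903e-11  # Bohr radius, m (CODATA value)

def allowed_radii(n_max):
    """Radii of the first n_max orbits permitted by the standing-wave rule."""
    return [n**2 * A0 for n in range(1, n_max + 1)]

for n, r in enumerate(allowed_radii(3), start=1):
    print(n, r)  # discrete ladder: a0, 4*a0, 9*a0 - nothing in between
```

The Substrate account replaces the standing wave with a maintained vortex, but the logic is the same: a self-reinforcing physical condition, rather than an axiom, picks out the permitted orbits.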

It soon became clear that, if appropriate additional Substrate Units were available, Electrical, Magnetic and even Gravitational Fields could all be features of a heterogeneous Substrate. After all, this would explain why the supposed causes of those Fields are never diminished by the energetic actions of the Fields themselves.

Magnetons theorised as part of the Substrate

The required new units also appeared to be possible as mutually-orbiting pairs of Leptons, but now with differently sized components, so that Magnetic Dipole Moments would be unavoidable. And the involved Units could both propagate and indeed subtend actual Fields, due to energy retained in the Units' internal orbits.

Even the required undetectability could be achieved by equal numbers of mirror-image joint units, which as a "random gas" would be undetectable, but which, as statically formed regions associated with their initiators, could easily subtend the appropriate Fields.

Read the rest of this paper on ResearchGate