Showing posts with label Substrate Theory.

26 July, 2022

Special Issue 77: The Systems Theory of Everything Part III

 





Special Issue 77 contains the third instalment of The Systems Theory of Everything.

This series of issues attempts to set out the first definitive account of Jim Schofield’s new Systems Approach to Science. The various papers collected here, and over the next few editions of this journal, explore the proposed theory and explain why it is such a radical departure from the current universally applied scientific method. 

The series continues by examining how Systems evolve over multiple Levels, and how this fact affects the reductionist discipline of Physics.



Contents:

Introducing Schofield’s Systems Theory

Top-down and Bottom-up Development within Evolution and Physics

Natural Active Stabilities

Assumed Restricted Scenarios and their consequent Man-Made Laws

Levels and Tempos

Mankind’s Greatest Mistake

How the Mistake Affected Theory

03 December, 2021

Invisible Media


Substrate Theory proposes that an invisible medium of Leptons permeates all known space - a Materialist and Holistic explanation for all of the strange phenomena at the Quantum level that somehow seem to lack accessible Causality. Here we begin to consider whether such Invisible Media could be operating at other Levels of Reality too.

Recent work designed to reveal the full, yet often hidden, motive forces within Reality-as-is has begun to founder upon certain important Effects, seemingly without evident Cause. In Sub Atomic Physics, they were the latest excuse for theorists to drift ever deeper into an out-and-out Positivism, by substituting purely-descriptive Mathematical Formulae for Causal Explanations!

Indeed, actual Causal Chains have been traced downwards through increasingly evident, yet relatively independent, Hierarchical Layers involving multiple, diverse Effects and their often unknown or unavailable Causes, until, finally, an Invisible Wall appears to halt that process completely - effectively terminating these investigations for good, and cutting off the supply of both the culprits and their actual physical Effects. The investigations seem to encounter a Final Limit: an evident inability to reveal exactly what needs to be changed further, in our descriptions and explanations, to deliver an adequate theory of the event.

Of course, much of this can be put down to loudly trumpeted misdirections by those who benefit most from such terminating advice to any campaigning oppositionists to the current Status Quo. But, not everything is down solely to such imposed difficulties!

Indeed, many of them seem to boil down to incomplete, and hence wrong and misleading, generally-accepted definitions, and consequent “Wrong Theories”, produced by supposedly independent, or even seemingly fully-committed and serious, prior researchers. Even the “experts” on our side can be the unwitting culprits.

It is always worth remembering that we do not have totally unhindered access to Reality-as-is! Our access is always insufficient, and structured in the easiest-to-grasp way, frequently due to historically effective precedents. And these established ideas, as well as having their clearly useful side, are also, inevitably, misleading when confronted with the Wholly New!

But recent extensive investigations along entirely new approaches (by this researcher) have begun to uncover wholly new definitions of previously-unknown, active causal elements, which seem - quite wrongly - to prematurely terminate investigations as incapable of further necessary revelations, because the currently involved elements appear to have exhausted all possibility for further change. And this is not the usual kind of added contribution, for it is subtractive rather than additive.

And the reasons for these premature terminations seem to reside elsewhere, in an extended overall set of limitations only now making themselves felt, additional to the factors previously supposed to be the only important causes.

And these are NOT, as you might think, due to purposely omitted factors built into the usual inadequate definitions, but actually due to a running-out-of-possibilities caused by wholly Invisible factors. These have become invisible because of the wholly Pluralistic, and hence truncated, definitions employed, instead of the Real and much more widely-defined Holistic definitions - which alone allow certain factors, entirely from outside the usual set, to change, and even, sometimes, to switch to different, or even Wholly Opposite, properties, when conditions elsewhere pass certain significant thresholds.

The actual Hierarchy of related Levels is not considered!





These occur because, apart from embedded effects due to the presence of certain agents, in a truly Holistic World collections of mutually-affecting factors can be simultaneously present. In most circumstances they allow only a limited selection of possibilities, but, if an unusual mix is arrived at, they can implement the exact opposite of what had previously seemed to be always the case, delivering a very different outcome.

Finally, and perhaps most importantly, the Multi-Layered Hierarchy of semi-independent layers of causality is usually NEVER considered - and that is precisely when the causality originates in a usually correctly-ignored Layer, so that the New Cause seems to come from Nowhere!

The Invisible Walls can be breached, and Explanations from without validly brought in!

NOTE: Inklings of such events were seen millennia ago by Zeno of Elea, and formalised very much later within Hegel’s Dialectics, but a systematic revelation required the revolutionary changes implemented by Karl Marx in his own Dialectical Materialism, which he predicated on his new Holistic and Materialist approach to History!

Indeed, the very essence of Holist Situations involving a number of interacting and mutually-affecting factors is, first, that they don’t merely add up, but actually affect outcomes to widely-varying degrees - and, in rare cases, even flip to the exact opposite of what pertained in seemingly all other cases!

NOTE: In the classic “if-then” clause in Programming, not only is no Causal Reason ever given for the usual outcomes, which are considered without any full explanation, but a third option (evident if the elsewhere factors are considered) adds “neither” as another outcome - one which can also terminate a causal sequence, with no-one being the wiser!
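A minimal sketch, in Python, of that third outcome (my own illustration, not from the paper - the function name and the "elsewhere_factor" parameter are purely hypothetical):

    # Purely illustrative: the familiar binary if-then hides a third outcome.
    # 'elsewhere_factor' stands in for a cause originating in another, usually ignored, Level.
    def next_step(threshold_crossed, elsewhere_factor=None):
        """Return the next state of a causal sequence, or None if it simply halts."""
        if elsewhere_factor is None:
            return None          # "neither": the sequence terminates, with no-one the wiser
        if threshold_crossed:
            return "flipped"     # the property switches to its exact opposite
        return "unchanged"       # the usual, seemingly permanent outcome

    print(next_step(True, elsewhere_factor=1.0))   # flipped
    print(next_step(False, elsewhere_factor=1.0))  # unchanged
    print(next_step(True))                         # None - the hidden third case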

And this was previously taken to be totally unchangeable, but, in fact, in this precise context at least, it could only flip to the opposite.

And this takes Holist Studies into a wholly new and usually totally unanticipated realm, which I have termed Systems Approach Processes, opening up vastly extended, if also rare, ranges of options that are initially always deemed impossible!

This paper is taken from the latest issue of SHAPE Journal on Holism and Subatomic Physics (Special 75)
 



02 December, 2021

Special Issue 75: Holism and Subatomic Physics


Read Special Issue 75



This edition continues this journal's exploration of a nascent Holist Science.

Though I have been approaching the mysteries of the Subatomic world for a very long time (see Substrate Theory and The Theory of the Double Slit), I must now tackle the many anomalies of this area of Physics philosophically, to have any chance of establishing a new Holist approach before Physics effectively destroys itself with the contradictions it increasingly generates.

These papers bring together findings from Substrate Theory, The Theory of Emergences and the new Systems Approach to Science, to make the case for using Dialectics in Physics and the search for further "Invisible Media" across the Sciences.


20 February, 2021

Issue 72 of SHAPE Journal: Is the Universe Electric?

 




What is the Electric Universe? 


This edition examines the controversial Electric Universe group of physicists and their ideas, comparing them to the consensus position in Cosmology, and attempting to reveal the inadequacies of both, regarding their shared basis in Pluralist thinking.

If we were to judge Electric Universe solely by their representation in the mainstream media and in popular science writing, we would quickly discard their contribution entirely as pseudo-scientific conspiracy theory lacking any evidence, or even brand the group a dubious cult. Have a little read of this article from Vice magazine for a flavour of the discourse in question. In that writing, Electric Universe adherents are called believers rather than investigators, and the substance of their ideas is written off as total crackpottery.

SHAPE Journal takes a more nuanced position on these matters. There are certainly major problems with much of what falls under the Electric Universe banner - but the project seems to be a surprisingly broad church, and one that welcomes many outsider scientists and non-conformist thinkers in Physics who actually have something worthwhile to say. Some of the research undertaken by people affiliated or associated with the Electric Universe is actually rather good indeed - but doesn’t seem to benefit, in terms of credibility, from its link with EU.

The Electric Universe was established by Wallace Thornhill in 1994, and now has a fervent worldwide following and an annual conference. In 2007, Thornhill published a book with David Talbott under the same name, and this became something of a bible for the movement, alongside the film they made, Thunderbolts of the Gods. The guiding principle is that electricity is the most important force in the Universe and Plasma is the dominant form of matter.

Jim Schofield first became aware of Thornhill on YouTube, during the early stages of his research into Substrate Theory, as Thornhill also seemed to insinuate the presence of some hidden substrate - a sort of reformulated Aether theory being necessary to explain the propagation of Electromagnetic radiation across the Universe. He also seemed to reject the mathematical reductionism in Physics that Schofield was pushing against in his own research. Unfortunately, Thornhill went no further down this road - and it quickly became apparent that the leader of this movement had no coherent integrated basis for these ideas, no over-arching theory at all in fact - and that he and his closest followers were worryingly prone to fishy mythological references and conjecture, relying on rhetoric rather than evidence to support their arguments. Despite all of the gaps, the absence of evidence for many of their ideas and the lack of quality control on the research that falls under the EU umbrella, there is some interesting stuff to be found there - Gareth Samuel’s “See the Pattern” videos being one such example.

Plasma research and Plasma Cosmology theories seem to be the source of the best contributions the extended Electric Universe family has to offer. Work on plasma filaments, the Structured Atomic Model and various hints at some electrical medium pervading space, all have potential with verifiable ideas being postulated. 

Both Plasma Cosmology and Jim Schofield’s Substrate Universe attempt somewhat similar things - re-explaining physical phenomena in space using only known particles of matter (Leptons) in various different arrangements and states, linking up the Universe in various ways, allowing the propagation of EM radiation and the construction of vast electrical and magnetic fields, if not gravitational ones too. Both offer materialist solutions without recourse to the mathematical idealism we see in the mainstream - spooky action at a distance, Quantum Entanglement or Uncertainty, for example.

It certainly seems possible that these nascent sciences could end up supporting one another, or even combining, to construct a new view of the physical Universe grounded solidly in material reality, and its observable electromagnetic properties. For this reason, and others, Jim Schofield has given the Electric Universe gang a little more time than most theoretical physicists would, with particular interest in the front line of Plasma research. 

Digging deeper reveals other important connections. For Schofield, Eric Lerner’s research into Plasma and Fusion Reactions is some of the most exciting happening today - certainly pointing towards a much more holistic way of conducting Physics research, and scientific experiments more broadly.

Although not directly associated with Electric Universe, in the video below we see Lerner talking with Gareth Samuel about Fusion Energy, Plasma and Cosmology. The unacknowledged role that Plasma plays in the Universe is of key concern to both Lerner and the wider Electric Universe crowd, who see plasma filaments as vital to linking up their electrical stars and galaxies. 




Whether or not all these ideas have much merit, Lerner has certainly shown both the importance of Plasma in understanding the Universe, and that much of the received wisdom in Cosmology is not settled at all - the Electric nature of the Universe is still open to question.


24 January, 2021

Special Issue 71: Enter the Microverse!

 

Enter the Microverse by Jim Schofield

Special Issue 71 of SHAPE Journal


This edition looks at our latest Physics research and continues this publication’s ongoing critique of the Copenhagen Interpretation of Quantum Theory - and the general consensus within this Science, that the dynamic physical Universe we inhabit can be reduced to Mathematics for study. This issue argues for a re-examination of the materialist Microverse, rather than the idealist Multiverse. Philosopher Jim Schofield argues for the use of Analogistic Models, rather than Mathematical Abstractions, in a new Holistic Physics, influenced by the research of Yves Couder, Eric Lerner and Karl Marx - to stop stabilising reality in order to study its forms. Instability is the real key to understanding...



The Pluralist Limitations


in Modern Sub Atomic Physics



Though the limitations imposed by the Pluralist Stance are a recurring feature of my criticisms of Current Science, it isn't always realised just how profoundly damaging that stance is.

And the main reason is, that there is so much of a many-layered superstructure overlying and effectively hiding these mistakes, that they are easily omitted from the general foundations upon which discussions are based.

By far the most insulating of these is the undoubted power of the "enriching" role of Mathematics upon the Reality it is assumed to accurately represent - which is significantly mistaken, not primarily because it is LESS than true Reality, but because, initially at least, it appears to contain vastly MORE: and of a coherent, consistent, and frequently very beautiful nature, within what we term Ideality - particularly those aspects of the World which conform only to forever Fixed Laws.

And, what underpins that mistake, is the fact that within always-temporary interludes within Reality (both temporally and spatially) those relations DO indeed fit the actual circumstances perfectly - but ONLY while the necessary fixing conditions are maintained.

It certainly didn't help that Mankind soon learned to artificially achieve and maintain such stabilities for himself, both in investigative experiments and, necessarily, in directed Production - for Mankind wrongly assumed that these were revealing the actual underlying relations, hidden beneath a collection of other simultaneous and non-mutually-interfering relations.

But in doing so we also assumed the total independence of all those relations - NOT affecting one another, but merely summing somehow, and thereby hiding the assumed pristine fixed relations underneath.

Unfortunately for Physics, this is certainly NOT the case!

It would be so, if-and-only-if, Reality were Pluralist in nature, but it isn't: it is Holist.

And the proof of this is that "all development" can be shown to happen via holistic inter-relations, and NEVER due to mere collections of Fixed Laws.

But what is it that Ideality has to offer that can't be derived from holistic Reality? It is true for everything which involves Pure Forms, and the complexities possible thereby - and that "richness" again intervenes, because though certain complexities are Real and due to holistic interactions, they can also occasionally be approximately approached by wholly pluralist complexities - BUT never causally and interpretably, as is possible only in Reality.

Now, this distortion of the Sciences, and ever more generally of Logic, is still universally dealt with on the assumption of Plurality throughout: so there is absolutely NO general realisation of the distortions so unavoidably produced. The misconceptions are simply guaranteed, because they are what is being attempted to be understood, and so the reasoning drifts further and further away from everyday Reasoning. This is consequently seen as the norm in a specialist area like Sub Atomic Theory, which cannot but be dominated by the Plurality always employed throughout that specialism. And, being so distantly separated from everyday Common Sense Logic, it cannot but be thought about in terms of Mathematical (pluralist) Equations, so that further investigation proceeds ONLY via mathematical means - well hidden behind Equations and the usually allowable Pluralist Manipulations, as the easiest and most reliable means of delving any deeper into that invisible World.

But, of course, it can only be carried out pluralistically, using only easy-to-achieve manipulations of their Algebraic Forms, and consequently assuming that what comes out of such manipulations readily reflects Reality-as-is - when that is NOT the case at all!

If the totally dominating developments are exclusively Pluralist and Mathematical, they will NOT deliver the correct Physical Truth involved, but only a purely formal, rationalist development of those incorrectly attached Forms: and, by the addition of more of the same kind of assumptions, they will inevitably lead the search ever further away from Reality-as-is - though occasionally, and always for the wrong reasons, arriving in a place where we can convince ourselves that it describes an exist-able situation in the Real World, when that isn't the case at all.

So, let us begin to demolish Plurality from top to bottom!

First, as all pluralist relations are eternally Fixed Laws, they, at best, will be viable only within a Discrete, artificially organised Range of Applicability. Outside of that Range, each will be totally wrong!

Neither could such a law self-transform into whatever describes the situation outside that range: for its variables are only changeable quantitatively, and, beyond its limits, the relation simply no longer holds as that Fixed Law. It may vanish altogether, to be replaced by something else! For even if a variable continues to exist within a New Relation, it will be differently related to wholly New other variables.

Indeed, pluralist-dedicated investigators, often use the passing of a Threshold Value to signal the demise of the relation, and the consequent dominance of another different, but as yet unknown one.
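A hedged, everyday illustration of such a Range of Applicability and its Threshold (my example, not the author's, with the numbers chosen arbitrarily): Hooke's law for a spring, which holds only below the elastic limit.

    # Illustrative only: a "fixed law" (Hooke's law, F = -k*x) valid solely within
    # its Range of Applicability; past the elastic limit the relation ceases to
    # describe the material at all, and something qualitatively different takes over.
    def restoring_force(x, k=200.0, elastic_limit=0.05):
        """Spring force (N) for extension x (m), valid only inside the elastic range."""
        if abs(x) > elastic_limit:
            raise ValueError("Threshold crossed: Hooke's law no longer applies here")
        return -k * x

    print(restoring_force(0.02))    # -4.0 N, inside the Range of Applicability
    # restoring_force(0.10)         # would raise: the fixed law has hit its Threshold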

And, one such cannot transform itself into any following Forms: for they are eternal Forms only.

Indeed, though we associate them with situations and processes, they ONLY deliver the performance of, and relations between, fixed Forms, and absolutely nothing else.

Indeed, such a mathematical approach cannot be called "a Theory", because, though it directly relates variables in discrete interludes of purely quantitative change, it says NOTHING about the crucial transitions of qualitative change involved.

Unless, that is, you are an idealist and believe the numeric laws magically drive Reality, and that merely knowing what will happen, and when, is Understanding Reality! It isn't!

You also have to know "Why?"

23 April, 2020

Issue 69 of SHAPE: Waves and Fields






Waves and Fields in Media
This new issue of SHAPE Journal tackles some of the most important problems in Physics from a Marxist perspective - revealing the science’s overlooked assumptions and disingenuous methods, in comparison to the new materialist Substrate Theory. This new kind of Physics assumes a hidden medium of Lepton particles that permeates the known Universe, propagates light, and explains some of Physics’ darkest corners - from pair production to quantised orbits to Dark Matter to Heisenberg’s Uncertainty Principle.

As scientists dug ever deeper into Reality, even approaching the mythical “Fundamental Particles” of the Universe, which were supposed to be the original causal sources of absolutely everything, the proliferation of impasses became so abundant that “Explanation” itself was abandoned as a myth, and replaced by Mathematical Forms alone.

Unfortunately, Mathematics is a Pluralist Discipline, requiring only fixed determinations of disembodied entities and forms, so it was always congenitally incapable of addressing what was required. Instead of seeking the engines of Qualitative Change, which might explain the evolution of matter and the rules which seemingly govern its behaviour, the discipline remained permanently orientated towards seeking only eternal Natural Laws. Sub Atomic Physics, experimentally, was primarily restricted to seeking new discoveries in ever-higher-energy Colliders, or ever more powerful Telescopes - as if somehow technological progress would prove a magical salve for these shortcomings. All the while, Physical Theory was relegated to the mere exotic manipulation of Mathematical Equations. Physics had been totally emasculated!

As a result, such an important scientific discipline has been effectively disabled from being analysed in the usual classical way, and, in spite of the sophistication of the Mathematics involved, it has been returned to the inadequate means of the past - explaining very little, yet delivering, over long periods of time, a stream of seemingly constant states, each of which may be “Wholly New”, giving the false impression of real descriptive Progress. But that old simplifying Phase has long passed.

What is clearly required today is a Revolution in Explanatory means, which can no longer avoid the crucial mechanisms of Real Qualitative Change - something the old single causes can never deliver.

Now, such a requirement cannot be solved by the usual kind of Causality: it is not a specific result requiring a single cause at all, but, on the contrary, more like a Complex, multi-factor System, that in exceptional circumstances changes into a very different one, or even a wholly new, never-before-experienced Emergent State.

It is a problem which most certainly involves many simultaneous factors - factors which have NOT led to Chaos, but, on the contrary, have arrived at an overall interacting mix of processes that usually delivers a reliably stable-and-unchanging overall state (in fact appearing as a permanent, natural and unchanging arrangement, allowing that assumption of the eternal laws of physics which dominates the field), but which, given particular and unusual sets of internal changes, can flip-as-a-whole into a new and different, but temporary, stability - once again appearing to be another “permanent” result!





Clearly such Qualitative Changes are never the result of a particular single cause with a known outcome, but are, on the contrary, the temporarily-balanced outcome of a complex system, which will find a balance in one of its many possible outcomes - yet is most of the time held there by internal consequent changes that automatically oppose any externally imposed changes, in various corrective ways. Exceptionally, however, it can instead wholly dissociate, and thereafter re-organise into something entirely NEW. And, as the whole thing involves many different factors and levels, each and every one accompanied by one or more balancing opposites, the overall results are not easily diagnosed.

Now, unlike the usual Pluralist kind of Causality, the Dialectical form is never so directly predictable: indeed, though the changes often take place within relatively short Emergent Interludes, each initially involves the trajectory of a whole sequence of dissolutions, followed by an often entirely new sequence of constructive associations, to finally deliver one or another from a whole range of possible final outcomes. Indeed, these Emergences are so quick and diverse that the detailed trajectories seem impossible to reveal theoretically!

But, nevertheless, the same pattern was surprisingly discovered happening within Social Revolutions, at much slower tempos, where it could indeed be effectively interpreted dialectically. Indeed, it was in just such solvable situations that Karl Marx finally got a general handle upon holistic processes of change, enabling successful outcomes within a whole series of such Revolutions occurring in the 20th century.

Now, historically, the total absence of such a Dialectical means of analysis meant the continuance of the prior Pluralistic Approach, which could never cope with such Changes. So the usual method was to successively and greatly simplify the real situation until it finally DID conform to Plurality - for each and every recognised Law, applied separately, a separate, tailor-made series of set-ups would be necessary. And, to compound the felony, the other defining aspect of Plurality - that such acting laws are independent of one another, so that their combined results would be given by mere addition - would also, and quite incorrectly, be included in the analysis.

Naturally, in spite of such multiple, separately carried-out experiments to model the Natural Combined Event, the results would NOT conform sufficiently, so further restrictions were included to remove any possibly-transmitting intermediaries, such as any Media which could intervene - with the excuse, seemingly verified by the Michelson/Morley Experiments, that Space itself was totally empty of any Universal Substrate!

But, of course it was the role of such substrates, which enabled many phenomena to be adequately physically explained by Waves within them.

The Wave or Particle Nature of Light, exemplified by the disagreements between Newton (for Particles) and Huygens (for Waves), was resolved by a belief in Waves - but somehow without a Medium.

But this, in the 20th century, was removed from physical explanation altogether: via a construct known as Wave/Particle Duality, it was “unified” by a wholly illegitimate mixture of Wave Theory and Probabilities, which gave usable results, purely pragmatically, via mathematical formulae alone.

But there is a much more important aberration of Understanding, now dominant across all the Intellectual Disciplines, which does not merely distort our view, but damagingly emasculates our ability to travel, even slowly, towards the objective of revealing the Truth of Reality. For Plurality, as is totally unavoidable in Mathematical Reasoning, has long been uncritically extended to both Formal Logic and literally all the Sciences. So Qualitative Change, which is the only engine of all the varying qualities of Reality, and of all development too, has been totally excluded from our view of the Universe.

Yet there has been a tenuous, and all too frequently almost invisible, link to a solution - one literally as old as the influence of the current consensus established by the Greek Intellectual Revolution, some 2,500 years ago.

It consisted of a totally opposite stance, developed by the mystics of India simultaneously with the Greek Revolution, and greatly influenced by The Buddha, in which the determining essence of Reality was NOT Stability, as it was with the Greeks, but Qualitative Change - as evidenced by The Whole Living World and the Consciousness of Man, both of which were available all the time and everywhere, though far more difficult to extract than the simplifications of Plurality.

It became known as Holism!





But these two alternatives appeared mutually exclusive, so once one of them had been decided upon, consideration of its direct opposite appeared not only impossible but actually incomprehensible! Yet even within the Greek-dominated arena, a dissenter called Zeno of Elea appeared immediately after the Intellectual Revolution. In his Paradoxes, by applying legitimately contradictory concepts to Movement, he was able to expose many rationally untranscendable impasses - pointing, though he wasn’t aware of it, to the falsity of the Pluralistic stance the Greeks had imposed upon General Reasoning from Mathematics.

And it wasn’t until some 2,300 years later, when the German Idealist philosopher Hegel resurrected Zeno’s crucial exemplars and considerably extended them to a much larger number of Dichotomous Pairs of Contradictory Concepts, that a possible reason for them began to appear revealable.

And it boiled down to the impossibility of Plurality in situations where certain things could, and indeed did, change over time - in these particular cases because they were caused by two simultaneously-present opposites, in which the current relative proportions of each could change, naturally causing a resultant switch in dominance between them.

It was a simplified explanation, but was then extended to a more complex System wherein everything varied, arriving at complex Systems that settle into self-correcting, apparently permanent Stable States (such as those considered primary in Plurality and Physics).

The keys were still natural opposites, but “interpenetrating” in various ways to deliver an important System of corrections to Pluralist Formal Logic, which Hegel termed Dialectics.

NOTE:

An important contribution to this theory was recently developed by the writer of this paper in his researches into the pre-Life Chemical processes which had to have preceded the first appearance of Life itself - Systems of Natural Organic Chemical Reactions, and their overall trajectories of change, which were the prerequisites of Life, and which I called Truly Natural Selection.

Of course, Hegel’s version, as that of an idealist philosopher, could to his mind only be about Human Thinking: but the transforming step was completed by the Young Hegelian follower of Hegel - namely Marx himself - who transferred all of Hegel’s gains over to a wholly Materialist stance, which only then could be correctly applied to both Reasoning and Science: initially via his own Key Intermediary of History, within the most profound Qualitative Changes ever - those occurring within Social Revolutions, with their necessarily slow-tempo revelations - and via his ever-burgeoning Critique of Capitalist Economics, as the detailed coherent definition of all such Qualitative Changes in an on-going, constantly developing, yet constantly contradictory System.

Sadly, the final steps in this new Intellectual Revolution, which just had to be the detailed application of Dialectical Materialism to Science, took another 149 years to be addressed by this particular 21st century Marxist, and only completed in the latter part of 2019.

In this new issue, these dialectical studies are taken further, looking at the role waves and fields play in physical holistic systems, and how their study might change Physics forever.

08 March, 2020

Special Issue 68: Redefining Philosophy





Redefining Philosophy? 


You would think after two and a half millennia that a Universally-Agreed-Basis for Philosophy would by now be well established, but that is not only far from being the case, it is also inevitably so!

So, let us set the unavoidable trajectory of Mankind’s Intellectual Development into a real perspective. Rational Thinking of any developable kind is at most 2,500 years old, within an overall hominid historical Trajectory of several million years. Man began to try to think rationally only in roughly the last 0.05% of that time, leaving some 99.95% when they didn’t, and indeed couldn’t, think rationally at all.
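Taking "several million" as roughly five million years, the arithmetic behind that figure is simply:

    \[ \frac{2{,}500}{5{,}000{,}000} \;=\; 0.0005 \;=\; 0.05\% \]

so rational thinking occupies only about one two-thousandth of the hominid trajectory.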

And, of course, the actually-occurring tempos of that development have certainly not been embodied in a constant upwards climb: for sometimes progress was at zero for long periods. Sometimes things went backwards.

For the 2,300 years after the Greek Intellectual Revolution, that development was fatally damaged by an assumption few philosophers recognise - the hidden assumption of Plurality. This assumed that all relations, properties and Laws are fixed qualitatively and separable from one another.

Only in the early 19th century did Hegel, the German Idealist Philosopher, attempt for the first time to integrate Qualitative Change into General Reasoning.

But even that was not universally accepted.

Indeed, it couldn’t be, while ever Philosophy remained idealist: for the solution could not come from Thinking itself, but from our understanding of Concrete Reality. Only with the extension and vast further development of those ideas, which Hegel termed Dialectics, was a breakthrough even possible.

And, when it was attempted by Marx in the limited area of Capitalist Economics, it took him the rest of his life to address that single discipline. And in doing so, he was developing the stance as much as applying it.

Qualitative development was in everything, and every significant area of study, such as Science, would have to not only receive the same sort of attention as Economics, but would also be as much another voyage of discovery, very much more complex and unknown than Economics had been for Marx.

And in the nearly 140 years since Marx’s death, this task wasn’t even attempted. It has taken this Theorist and Philosopher over 10 years to lay the most basic of foundations.

But they have been remarkable!

To even begin the process, a wholly new approach had to be researched which produced the wholly new. For all Qualitative Change must produce the wholly new.

In all reasoning previously established using Fixed Laws and Pluralist Logic, the rationality involved, when it could be used, produced actual results - and the same ones every time it was used, and whoever used it! But Qualitative Changes are Dialectical, produced in what used to be seen as impossible developments, for which they were termed Emergences.

To grasp what an Emergence actually is, we must compare it to one of the previous pluralistic Laws, all of which have predictable outcomes.

The outcome of an Emergence, on the other hand, is NEVER predictable prior to its commencement. Indeed, you have to be an exceptional Dialectician to even predict the next phase of such a transformation, and only when the final result is imminent can the culmination of a completed Emergence be guessed at.

So clearly the revolution in Premises and Bases required here will be very different from the prior Pluralist Methods.

The classical Qualitative Changes involved in an Emergence start with a Stability, the destruction of which originally appears to be totally impossible, but which is then threatened by a whole series of crises. These would ultimately cascade down into a total dissolution of that Stability, towards what seemed to be impending doom - but could, and often did, then begin, via a further series of crises, to build towards a new, and finally achieved, self-sustaining Stability!

The new philosophical approach would have to reflect all of that too, in order to deliver an understanding of Real Development.

07 August, 2019

New SHAPE bookshop



Substrate Theory - Special print edition of SHAPE Journal



There is a fundamental flaw in Physics. Space is not empty.

Substrate Theory can help tackle all the biggest questions in physics, from the Spacetime Continuum to the Uncertainty Principle, from Casimir Effect, Redshift, Time Crystals, Superfluids and Dark Matter, to Virtual Particles and the work of Frank Wilczek.

This special print edition of SHAPE Journal has been produced to mark 10 years of the publication. Collated here in print are two issues (65 and Special 65), originally published in May and June 2019, which collect together key papers on Jim Schofield’s ground-breaking new theory of physics.








20 July, 2019

Vortices




The Vortex seems to be a very important feature of turbulence within the Substrate - particularly in on-going, constantly repeating situations, such as when associated with orbits. For, in the more usual transient situations, such as linear translational movements, the energy involved soon gets dissipated and lost. But, in those constantly repeating situations, as within the orbits of electrons in atoms, there is a chance, because of the regular return of the electron, that any so-caused Vortex could be regularly maintained by those many returns, and, via energy transfers in both directions, finally even arrive at a balanced situation, and a constant, stable state achieved.

However, in such situations, the vortices only occur if a prior Substrate has been dissociated, all along the orbital path, into individual Units, which are then further affected by the constantly returning electron, to be driven into separate vortices, as a kind of imposed Spin! It matters, of course, what the composing units are made of, and their properties, and finally, just how their imposed Spin will affect them.





Will, for example, a unit with an electrical charge, in such circumstances, produce a magnetic effect, due to the Spin, and deliver that effect along its axis? For, it is said that a spinning charged particle, like an Electron will do exactly that!

Now, also, a translationally moving charged particle, as well as carrying an electrical field along with it, will also produce a magnetic line of force perpendicular to its direction of motion that also moves along with it.
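For reference, and as standard textbook physics rather than anything specific to Substrate Theory: in the low-velocity limit, the magnetic field produced by a point charge q moving with velocity v is

    \[ \mathbf{B} \;=\; \frac{\mu_0}{4\pi}\,\frac{q\,\mathbf{v}\times\hat{\mathbf{r}}}{r^{2}} \]

which is everywhere perpendicular to the direction of motion, circling the particle's line of travel.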

It will depend, of course, upon whether it is spinning or not! If it is spinning, the magnetic line of force will be in a single direction, determined by the axis of that spin. Whereas, if the magnetic lines of force emanate in every possible direction perpendicular to the direction of motion of the particle, then the effect could NOT possibly be due to the spinning of the particle, but to something else.

NOTE: Indeed, an alternative was developed in earlier Universal Substrate Research by this theorist, involving Magnetons, which certainly delivers the all-directions-perpendicular-to-the-motion case. These are Magnetons detached from the charged particle's accompanying Electric Field by impacts with the Substrate it is passing through, and then ejected off, in all sideways directions, as linked magnetic lines of force.

BUT - as the magneton itself has an intrinsic Magnetic Dipole Moment, it too requires an explanation of its properties, which cannot involve itself!





POSTSCRIPT:

These ideas are consequences of the Holistic Stance in Philosophy, wherein many features are no longer mere sums of eternal Natural Laws (as with the Pluralist Stance), but, on the contrary, due to the simultaneous actions of many different factors, both affecting one another, and even naturally developing into stable balances of contentions, which though long-lasting, are never permanent.

And these are naturally of paramount importance in Holism, as they are the equivalents of the Permanent Stabilities of Plurality; but, unlike in that stance, in Holism they are the means by which Qualitative Change occurs in Natural Development, within short but crucial Emergent Interludes of Change.

Indeed, in Plurality, not only can Qualitative Change never be truly Explained: it is actually totally prohibited.

Reality is supposedly created solely out of eternal Natural Laws, which simply SUM, delivering only Complication, never the entirely new. Plurality's sleight-of-hand in pretending to deliver Change by mere Quantity-into-Quality totally omits any real explanation of how that actually works.

21 June, 2019

Special Issue 65: Towards the New Physics





by Jim Schofield


Part 2 of our special anniversary series on Substrate Theory is finally here!

This selection of papers constitutes more recent additions to this burgeoning new Physics, and many of them have never been published before.

Increasingly, I no longer feel like a lone voice in this. Other physicists are starting to move in this direction - Lee Smolin and Frank Wilczek are joining a growing group of dissenters in mainstream Physics, unhappy with its infinite descent into the Idealist wormhole, away from materialism and realism.

This series is a significant celebration of both the Journal’s (and its principal theorist’s) 10 years spent theoretically addressing the current ever-deepening crisis in Modern Physics. That crisis is represented by the now-consensus position embodied in the premises of the subject, as brought together in The Copenhagen Interpretation of Quantum Theory, which has steadfastly taken Physics away from physical Explanation of reality, and instead towards a wholly idealist stance that assigns full causality only to a set of formal equations, primarily derived from High Speed Accelerator Experiments, chiefly conducted at the Large Hadron Collider at CERN.

My hypothesis is that Copenhagen was almost universally instituted throughout Sub Atomic Physics, as a set of formal tricks for dealing with a missing / hidden Substrate - papering over the cracks of the waves in nothing.

Elsewhere, in my book The Real Philosophy of Science, these philosophical problems have been tackled, but here we must also tackle physically the very real possibility of an undetectable Universal Substrate - look at why it might have escaped detection and how we might finally prove its existence.

17 June, 2019

Substrate Theory continues... soon!


Image from Alternating Current by Michael C Coldwell


To celebrate its 10-year anniversary, SHAPE Journal's epic series on Jim Schofield's Substrate Theory continues, with the next instalment due to be published in the coming days...

Part 1 - The Lepton Substrates - Issue 65 was published last month: a bumper issue which collects all the relevant papers documenting the evolution of this new physical theory, from the concept of a new aether of "empty photons", to a whole new look at the standard model with magnetons made of Taus and Muons, to the possibility of Neutrino-based gravity fields.

Part 2 - Towards the New Physics - Special Issue 65 presents the latest research on this theory, examining how the Universal Substrate can help tackle all the biggest questions in physics today, from the Spacetime Continuum to the Uncertainty Principle, from Casimir Effect, Redshift, Time Crystals, Superfluids and Dark Matter, to Virtual Particles and the work of Frank Wilczek.

Watch this not-so-empty space...

26 May, 2019

Issue 65: The Lepton Substrates


10 years SHAPE Journal presents Substrate Theory vol. 1



SUBSTRATE THEORY I: This issue of SHAPE Journal is the first in a two part bumper edition on Jim Schofield’s Substrate Theory, curated to mark the 10 year anniversary of this publication, and to finally bring together all of the crucial materials for this burgeoning physics.

Both issues feature photography series Alternating Current by Michael C Coldwell

This is a completely new approach to sub-atomic physics that hypothesizes the existence of an as-yet undetectable heterogeneous material substance filling all of known space. Unlike its distant cousin, James Clerk Maxwell’s Aether Theory, this new Substrate conception can explain all Quantum phenomena, the anomalies of the Double Slit experiments, Wave/Particle Duality and its own strange elusiveness.

Without wishing to sound hyperbolic, the ideas contained within these issues are nothing less than a revolution in science - a complete rethinking of contemporary physics from the ground up, committing to the scrapheap of scientific progress much of the last century’s detour into a realm which we might call Quantum Ideality.

The near-total dominance of Mathematics in the field has led us further and further away from the material reality we purport to study. The more advanced our technological solutions become, the more convoluted our route to truth, the more self-fulfilling our prophecies.

In Substrate Theory we see a genuine attempt to return physics to materialism, but also to try and explain, materially, the many weird and wonderful phenomena we have observed at the Quantum Level. This is no simple return to the halcyon days of simpler classical physics, not a retrograde movement at all, but instead a new gesture towards a truly holistic study of the material universe - a methodology that eschews virtual particles, Quantum Entanglement and all manner of Mathematical constructions, which uniformly fail to explain the material causes of the most basic physical phenomena - gravity, the propagation of light across space, disembodied magnetic and electrical fields.


Saturation (2012) by Michael C Coldwell


While Substrate Theory certainly has the potential to upturn the entire apple cart of modern physics, the pieces of the puzzle still need assembling, even if all the elements have been devised.

This issue begins that process, collecting together various prior publications on the theory, and weaving them together into a coherent narrative. This is a somewhat difficult undertaking as the ideas are always evolving - the goal posts moving.

Our editorial solution to this problem is to split the task into two parts.

In the first instalment, The Lepton Substrates, past papers on Substrate Theory are collected together in historical order, with some new editorial pointers noting when certain ideas were first published, at what point in the theory’s evolution certain assertions were made, and which of the original ideas have since been necessarily superseded, but are still required here to understand the theory in general.

All the foundational arguments and elements of Substrate Theory are explained and discussed here. From the original notion of ‘Empty Photons’ pervading all space, we start to see what these entities might actually be. How pairs of mutually orbiting Lepton particles must be undetectable, yet could easily pave ‘empty’ space. What those different Leptons are, and how they can carry quanta of energy in orbits, passing them bucket-brigade across the universe. How quantised orbits may have a material explanation in a dialectical relationship between levels of the Substrate.

Over the course of The Lepton Substrates the full picture begins to form and the unexplained contradictions of Quantum Theory start to fall like dominoes.


Imaginary Units (2012) by Michael C Coldwell


In the second part of this series, Towards the New Physics, we look beyond the nuts and bolts to the potentials, publishing Jim Schofield’s latest writing on Substrate Theory, which sees how the Universal Substrate can help tackle all the biggest questions in physics, from the Spacetime Continuum to the Uncertainty Principle, from Casimir Effect, Redshift, Time Crystals, Superfluids and Dark Matter, to Virtual Particles and the work of Frank Wilczek.

What becomes increasingly apparent in all this research is that even if this particular model of a Substrate is wrong, the basic premise that one exists is right. It has been the missing premise since 1927 - the invisible elephant in the room of an ever-more esoteric Physics. Substrate Theory explains far too much to be ignored any longer, in a discipline which has all-but abandoned explanation for the most obscure of Mathematical games.




23 May, 2019

Substrate Theory!



Jim Schofield's Substrate Theory will be published in the coming days!

Part 1 - The Lepton Substrates is now available here

27 April, 2019

10 Years of SHAPE Journal


Substrate Theory of Physics

Coming May and June - two new Special Issues of SHAPE Journal

a definitive guide to Jim Schofield's Substrate Theory of Physics


To mark 10 years of the journal, SHAPE will be publishing two Special Issues on Substrate Theory, a definitive collection of papers on this new model of physics.

Also in the pipeline for this summer is a new documentary film on Revolution!

Watch this SHAPE.

Here is the first part!


26 April, 2019

Copenhagen is Wrong!





The house of cards that is Quantum Theory is really starting to fall...

And now Lee Smolin is on-side.

In a recent Perimeter Institute Lecture, Smolin delivered the trenchant view that the Copenhagen Interpretation of Quantum Theory is in fact wrong, and I agree with him!




However, while his Realist arguments were indeed correct, and Copenhagen Theory is totally Idealist, that, I'm afraid, will never be enough.

For, in spite of Hegel's profound and transforming criticisms of the Plurality of both Formal Logic and its use in Science, almost 200 years later his improvements have still rarely been applied to either. The Copenhagenists will never relinquish their theory, for they don't even know why it arose, due to profound errors in its premises, and also because in all the circumstances in which they use it - it certainly does work - pragmatically! But, even then, it also never explains why.

For Explanation, as the primary purpose of Physics, has now been totally abandoned. Instead, this so-called Theory gives the right quantitative answers, in highly-constrained circumstances only. And, in doing that, it is entirely consistent with what Modern Physics has now become - an extension of Mathematics.

For, henceforth, it can never explain why things happen the way that they do, but can only "match" how things work, purely pragmatically. It is content to be merely technologically useful, and hence explains absolutely nothing!

Now, of course, its supporters would all totally disagree with this, because of the pseudo-philosophical inventions that the originators inserted in order to make it look like an "Explanatory Theory" - namely, their cobbled-together Equation, which imports illegitimate probabilities into a basic Wave Equation. It then replaces all Particles, at what they term "The Quantum Level", with Wave/Particle Duality - delivering the alternative of using the wave-like Equation until the entity suddenly becomes, once more, a discrete Particle, at which point they term the transition "The Collapse of the Wave Equation", as it resumes its Particle Form.
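For reference, the standard formalism presumably being referred to here is the Schrödinger wave equation, with probabilities attached to the wave function via the Born rule:

    \[ i\hbar\,\frac{\partial \psi}{\partial t} \;=\; \hat{H}\,\psi, \qquad P \;=\; |\psi|^{2} \]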

It is, of course, NO kind of Legitimate Scientific Theory: it is a clever fix to get around contradictory behaviours, that their prior premises just could not cope with. And, that was indeed the case: the prior theories just didn't work at these levels: they were inadequate, it is true!

But, literally all the theories in the past were also inadequate, but were re-investigated to review the premises assumed, and usually, they, in the end, would finally arrive at something better! By the early 20th Century, however, a whole series of dramatic and debilitating results had begun to be revealed that undermined the prior assumptions.




The Michelson/Morley Experiment had dismissed the existence of the Aether as filling all of Empty Space with a Universal Substrate.

And, Henri Poincaré and Ernst Mach, finding ever more physical premises that seemed to be wrong, proposed Empirio-Criticism, a form of Positivism - an amalgam of Explanations and Mathematical Equations which together, they believed, could deliver what was needed.

The Piste was set: and it was all downhill from there to finally dumping Explanations altogether!

For, nobody demurred at the conclusions from the Michelson/Morley Experiment, so Wave-like phenomena, at the Sub Atomic Level, could no longer be ascribed to that now dispensed-with medium: instead, somehow, such effects had to be embedded into what Particles actually were themselves!

The key experiments were those involving Double Slits, which became the touchstones for many of the new discoveries. But, the Wave/Particle tricks which were instituted to explain the many anomalies, are very easily removed by assuming an undetectable Universal Substrate in those experiments, and Substrate Theory immediately dispensed-with every single one of them.

Now, though obviously insufficient by itself, this theoretical exercise could not be ignored! For it demonstrated that classical Waves were somehow involved: the Wave/Particle Duality invention was a frig! Distinct Waves and Particles were, somehow, still intrinsically involved.

Two parallel lines of theoretical research were undertaken.

The First was to investigate the possibility of such an Undetectable Universal Substrate - composed entirely of pairs of mutually-orbiting Leptons with diametrically opposite properties - thus delivering the required passive undetectability, while at the same time allowing that Substrate to be affected by those interlopers, and in turn to affect those very same entities in differing, later circumstances.

It must be stressed that the objective here was a theory-first investigation: just as James Clerk Maxwell had used in his Analogistic Model of the Aether, which was the Basis for his still universally renowned Electromagnetic Equations!

[Indeed, there is the very sound point also, that theory-first investigations actually avoid the inevitable pluralistic aberrations of all data-first investigations, using the now Standard Scientific Experimental Method of directly constrained and maintained contexts]

And, as Frank Wilczek has recently insisted with "The Materiality of the Vacuum", the above theoretical Analogistic Model is not without foundation, even if his composition of that "undetectable Universal Substrate" differs from that used here.





James Clerk Maxwell's Analogistic Model, has long gone, BUT its use was justified by its delivery of the Electromagnetic Equations!

Now, the Second line of theoretical research was somewhat akin to Maxwell's - by assuming such a substrate with the New Model's composition, could all the anomalies that precipitated Copenhagen be physically explained instead?

Now, to achieve this objective, the initial simple definition in terms of mutually-orbiting Electron/Positron pairs had to be extended to also involve Units composed of Taus and Muons, and even Neutrinos; but a rich and successful Substrate of Leptons was devised, and the inventions of Copenhagen physically replaced!
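As a purely illustrative sketch (my own, in Python, and not the author's formalism), the basic idea that a bound pair of opposite Leptons presents no net charge for a detector to register can be put as simply as:

    from dataclasses import dataclass

    @dataclass
    class Lepton:
        name: str
        charge: int       # in units of the elementary charge e
        mass_mev: float   # rest mass in MeV/c^2

    @dataclass
    class SubstrateUnit:
        a: Lepton
        b: Lepton
        def net_charge(self) -> int:
            return self.a.charge + self.b.charge

    # An electron/positron pair: opposite charges cancel, so the unit as a whole
    # shows no net charge - one (illustrative) sense of passive undetectability.
    electron = Lepton("electron", -1, 0.511)
    positron = Lepton("positron", +1, 0.511)
    print(SubstrateUnit(electron, positron).net_charge())   # 0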

BUT, of course, just as with Maxwell's Aether Model, the detailed content of this Analogistic Model of the Universal Substrate will, in time, be wholly replaced too!

This researcher, like Maxwell, knows very well that all our theories are never the Absolute Truth, but at best, contain sufficient Objective Content to deliver more than what they replace.

Criticisms of the New Model are not only legitimate, but absolutely necessary. Yet, criticisms that exist only in order to re-establish Copenhagen are most certainly NOT! Copenhagen is a dead-end, and new ideas and models are now required to replace it and push Physics into new territory.

Implicit in the new approach was a root-and-branch critique of the premises, and even of the basic amalgamated philosophical stances, underlying modern physics. Succinctly: Copenhagen is Pluralist, while Reality is Holist!

And Science should be materialist, while Copenhagen is certainly idealist!

By all means improve upon the New Physics, but leave Copenhagen where it deserves to be - Dead and Buried!


 



For more on burying the Copenhagen Interpretation, please read the Special Issues of SHAPE Journal above, and look out for our forthcoming 10 Year Anniversary Issues on Substrate Theory.

14 April, 2019

Special Issue 64: The Limits of Mathematics





This edition deals with the various limitations of Mathematics from a variety of different scientific and philosophic angles, and features a fantastic guest paper by Abdul Malek, a Theoretical Physicist and Dialectician from Montreal, Canada.

It has taken me many decades to realise quite how limited Mathematics really is. I have the advantage of having been a gifted mathematician long before I switched to Physics. I made that significant change because Mathematics is a purely descriptive abstract discipline, of a very special type, and I wanted to really understand things rather than merely describe them in abstract form.

Unfortunately, as we shall see, Physics has become little more than an extension of Idealist Mathematics. Physics was converted into a Pluralist Science of Stabilities: and one driven idealistically by Purely Formal Laws.

No wonder it is in an untranscendable, terminal impasse as a Science! Indeed, we can legitimately go a great deal further, and insist that it no longer investigates Reality-as-is, but instead can only deliver a distorted formal reflection of that World: it is an investigation of Ideality - the infinite World of Pure Forms alone: the Abstract Realm of Mathematics.

In short, Physics can only be saved via a wholesale rethinking of Mathematics and how we use it.






 

13 April, 2019

Frank Wilczek and the Universal Substrate


Artwork from Michael C Coldwell's Alternating Current series

Coming May and June - two new Special Issues of SHAPE Journal
a definitive guide to Jim Schofield's Substrate Theory of Physics



In an Origins Project lecture, at Arizona State University, Frank Wilczek gave a contribution upon the Materiality of Space (see below for video).

What was remarkable was that much of what he had to say resonated, very markedly indeed, with my own ideas based upon the concept of an undetectable Universal Substrate (the hidden materiality of the vacuum) but, nevertheless, coming from a very different place; namely the more usually accepted consensus positions of today's Sub Atomic Physics.

Indeed, the last paper I wrote also concerned Wilczek's work, and his supportive ambitions for the Large Hadron Collider in 2010, which I'm afraid I dismissed as a total myth.

However, this lecture has dramatically altered my assessment of him, as both a scientist and indeed, a philosopher. By alternate, indeed diametrically different means, he has arrived at very similar conclusions, to those I postulate, and this delivers a very different slant upon valid pathways towards the Truth that we, as physicists, always seek!

Indeed, the situation delivered far more than that: for he was introduced by, and afterwards disagreed with, Lawrence Krauss, who, though seemingly arguing from the same theoretical stance as Wilczek, demonstrated how that apparently identical basis was in fact diametrically opposite in various extremely important premises.

For Wilczek is a physicist: while Krauss is, at heart, a mathematician!

And, as it became clear, Wilczek and I, though arriving at very similar positions on Empty Space (he even mentions the word "substrate"), got there by conforming to the same basic premises, in spite of using very different means and sources for our theories. The subsequent presence, and disagreements, of Krauss also confirmed that his differences, in spite of his working in the very same areas as Wilczek, put him in a very different position indeed.

Krauss is an idealist, whereas Wilczek is actually a materialist.





Now, by far the more important revelation for me was the possibility of arriving at similar conclusions from very different experimental evidence and theoretical bases. It clearly confirmed, both for myself and for him, that we, as scientists, neither seek nor expect to find Absolute Truth, but, on the contrary, what I term Objective Content - that is, aspects or parts of that never-to-be-reached Absolute Truth which supply the best view of Reality we currently have, and which will always be open to improvement by new Objective Content, if that proves to be closer to the unobtainable objective.

In addition, Wilczek made absolutely clear what legitimate theories look like in terms of such Objective Content, citing, as I often do, James Clerk Maxwell's Aether - a fictional Analogistic Model composed of Vortices and Electrical Particles, from which he directly derived his Electromagnetic Equations - forms with enough Objective Content that we still successfully use them today.

And, this also says something quite profound, and generally not understood, about how equations are derived.

For, most equations are what I term Pluralistic Equations, derived initially from intensely pluralistically-farmed experiments, and thereafter wedded to Pure Equations from Mathematics by adjusting the Equation's constants to make them fit. And, that is very different indeed from Maxwell's Holistic derivation of an equation direct from a Physical Explanatory Theory.

Indeed, elsewhere, and at another time, working with the mathematician Jagan Gomatam, I was able to use equations he had developed directly from theory to do with the beating of the Human Heart, which in contrast to equations as a consequence of experimental data, actually were able to demonstrate both Fibrillations and Heart Attacks.

But how many modern-day physicists do things that way round, and thereby actually know why it gets closer to the Truth?

Now, Wilczek certainly doesn't define Empty Space as I do - filled with an undetectable Universal Substrate of Leptons. But, he does insist that Empty Space is filled with something material.

His current model uses Quantum Fluctuations, but both theories are identical functionally in how they explain both Pair Productions and Pair Annihilations: and crucially Wilczek clearly admits to having the same stance upon the necessity of such currently-valid Analogistic Models!

Now, as to where Wilczek and this theorist differ, it is certainly in exactly what materiality actually fills the vacuum - what is both affected by what happens to it, and in turn affects the things contained within it. With literally only directly undetectable Quantum Fluctuations, we can commend any attempt for the Theory to directly determine any subsequently arrived-at formulae, but at the same time it is almost impossible to theorise as to what that form is likely to be.

In contrast, with this theorist's known Universal Substrate Units, both aspects can be adequately and correctly carried through to completion - that is, to a full-detail Analogistic Model (à la Maxwell) from which to generate the necessary Equations, as Maxwell did from his Model of the Aether.

There is much more in Wilczek's lecture than I have dealt with here. Some of his philosophical points are particularly powerful...





Clearly, the replacement of Quantum Fluctuations, and, of course, my Analogistic Model of the Universal Substrate, has yet to be achieved.

But the stance is right!