29 April, 2013

Cosmic Dust Clouds

Just how do the truly vast cosmic dust clouds form?

You would think that with a Big Bang initial starting point, followed by a series of supernovae a little later, everything would be flying apart, along roughly radial paths from their “explosion points”. And it is not at all clear how such a “dispersive” set of circumstances could erect such clouds of something or other, light years wide, so that they end up as relatively stable, long-lasting structures.

There is, of course, the usually proffered “explanation”, which has “quantum fluctuations” present from the very first instant, which would therefore “build-in” an unavoidable unevenness, and hence lead to many local concentrations. But, let’s face it, such is a groundless dream, totally lacking in any concrete evidence – some sort of backwards extrapolation to explain the inexplicable in terms of the current, established Copenhagen prejudices. But that certainly isn’t it!

Now, the very fact that the clouds are opaque to light indicates that they must include particles of Matter (i.e. dust of solid elements) that are much larger than Hydrogen or Helium atoms, and hence could only have been produced by matter-building stars ending in supernovae. For current theories have all elements above H and He produced by fusion in a series of sequential star forms, and finally dispersed, far and wide, by Supernovae.

Yet though that may explain our clouds’ contents, it really doesn’t explain their seemingly static state – a state seemingly confirmed by that other consensus theory, which puts down the demise of such stability to the shockwaves of subsequent supernovae, breaking the “balancing stability” and starting a gravity-based concentration around local centres. For, with that theory, it is admitted that Gravity-caused aggregation alone is insufficient to end that state.

So, how did these clouds come to be in such a state originally?

Let us initially take some of the usual ideas and, where it seems appropriate, add a few more.

The concept of a Big Bang of Pure Energy alone surely has to bite the dust? It is an internally contradictory idea that has been patched up with various speculative add-ons – the most significant one being that it didn’t expand into a pre-existing and totally Empty Space, but actually created Space itself as part of the same Big Bang process.

So, from a vanishingly tiny dot (the Physical Singularity) we have Energy, sufficient to construct a whole Universe, which, nevertheless, was full(?) of “quantum fluctuations”, and created Matter as it also created its own, required Space.

NOTE: In a nutshell this has Energy from Nothing making all Space also out of Nothing, and producing Matter as it went: an interesting Origin, don’t you think?

It was certainly NOT, we are assured, any sort of explosion, for that would require both pre-existing Matter and Space. Instead, we are told, it was a kind of emanation(?) of Pure Energy, creating Space and Matter as it went: a whole Universe, its actual Space and absolutely all of its contents, had spewed out of Nothing! But, surely creating-Matter is NOT merely the reverse of creating-Energy-from-Matter as in Atomic and Hydrogen Bombs? The pre-requisites for both of these are both Matter and Space.

So, just how does disembodied, Pure Energy condense(?) into Matter? And what is the Form of that Matter when it is first produced?

The problem is that Fusion needs Matter to work!

Indeed, it fuses smaller units together to make larger units, but with a necessary loss of some of the matter involved into Energy. You cannot reverse that to get Matter out of Energy alone! And what do they mean by “concentrate” or “condense”? How do you squeeze Energy until it produces Matter? Or is it somehow self-squeezing? Do they mean that after a certain, threshold density of Energy it becomes stable Matter?
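The mass-to-energy loss the paragraph above describes can be made concrete with standard textbook figures (a numerical sketch of my own, not part of the author’s argument): in D–T fusion, a small fraction of the input mass disappears, reappearing as released energy.

```python
# Mass defect in D + T -> He-4 + n fusion, using standard atomic mass values (in u).
# Illustrative sketch only: it shows matter being lost *to* Energy -- never the reverse.
U_TO_MEV = 931.494  # energy equivalent of one atomic mass unit, in MeV

m_deuterium = 2.014102
m_tritium   = 3.016049
m_helium4   = 4.002602
m_neutron   = 1.008665

mass_defect = (m_deuterium + m_tritium) - (m_helium4 + m_neutron)
energy_mev = mass_defect * U_TO_MEV

print(f"mass lost: {mass_defect:.6f} u -> energy released: {energy_mev:.1f} MeV")
# roughly 17.6 MeV per reaction -- the familiar D-T fusion yield
```

The arithmetic only ever runs one way here, which is exactly the author’s point: fusion converts a little Matter into Energy, and nothing in it suggests how the reverse could proceed.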

But surely as all this energy was originally in “zero” Space, and then continuously had more and more Space as the process progressed, doesn’t that mean that the energy-density MUST be getting less and less, and will never again reach those earliest levels?
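The dilution argument above can be put in one line of scaling: for a fixed total energy spread through a growing volume, the mean density falls as the cube of the radius. The following is a toy illustration of that scaling (my own sketch, not a cosmological model).

```python
# Energy density dilution as Space itself grows: a toy scaling relation.
# For fixed total energy E in a sphere of radius r, density = E / volume ~ 1/r^3.
import math

def energy_density(total_energy, radius):
    """Mean energy density of a fixed total energy filling a sphere of given radius."""
    volume = (4.0 / 3.0) * math.pi * radius ** 3
    return total_energy / volume

E = 1.0
# Doubling the radius cuts the mean density by a factor of 8 -- and as the
# radius only ever grows, the density can never climb back to earlier levels.
ratio = energy_density(E, 1.0) / energy_density(E, 2.0)
print(ratio)  # 8.0
```

This is the quantitative face of the rhetorical question in the text: once the volume has grown, the earliest energy densities are unreachable forever after.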

The received wisdom (if we allow a temporary slipping into “explosive” ideas) is that Hydrogen and maybe Helium were produced by the Big Bang, though clearly some even smaller “bits” would have had to be produced first, as both Hydrogen and Helium are combined entities.

So, in spite of these clear anomalies, let us initially stick with the consensus scenario, and have unevenness from the start: as soon as Matter was created, it started to pull together in the denser parts of the density spread. The idea is that these would very rapidly grow into stars of truly immense size, and, as their volume increased, accelerate through each star’s history, so that they would very quickly (in cosmological terms) exhaust all the matter-growing phases producing Helium, Carbon, Oxygen, and so on all the way to Iron, before really exploding in a truly giant Supernova.

Of course, several of these would go through these stages more or less simultaneously, so even the explosion-less Big Bang soon produces massive actual explosions, which, according to current theory, are the ONLY situations in which all the higher elements are produced (remember, our Cosmic Clouds are of dust).

We now have, therefore, a cataclysm of Supernovae – all exploding outwards, but as yet still in a relatively tiny Universe (compared to now). Clearly, these explosions would “bump into” one another, and in certain areas “cancel out”. Perhaps this was the actual source of our vast, light-years-across, dust clouds, which, in consequence, become relatively stable: they might, as whole “clouds”, still be moving, but internally the individual movements of the components from both involved Supernovae may approach a “random mix”, and thus produce a stable overall state within the cloud [both gravity pulls and collisions could in time achieve such a state].

This, I’m afraid, is the best that I can do with current theories, though I must admit that I cannot really agree with the majority of their “standing-ground” – their founding assumptions. For, they are clearly purely formal and abstract, indeed mathematical bases.

By abandoning “old-fashioned” “Physical Explanations”, and instead relying exclusively upon Equations, as the true essences driving Reality, they have abandoned a materialist standpoint for a completely idealist one.

All their bases are purely formal abstractions, which they develop in solely mathematical ways.

The trick of turning multi-dimensions, as used to cope with multi-variable relations, into a many-dimensional Universe, and thereafter developing, from their Equations and formal extensions, purely formal “explanations” for everything, places them squarely in Ideality – where mathematicians dwell, turning their backs upon Reality, which is the only land for real physicists.

The turning point was, without any doubt, the victory of the Copenhagen standpoint of Bohr and Heisenberg at Solvay in 1927, as an almost inevitable development of the mathematical achievements of Planck with his Quantum, and Einstein with his purely formal Relativity.

The slope became so steep it was impossible to stop the slide, without directly questioning the enormous formal (mathematical) inroads into Physics, which the vast majority depended so vitally upon, and the whole Sub Atomic Community began the slide, headlong down to Idealism.

To those who disagree with this standpoint, may I mention String Theory, the Higgs Boson, and Theories of Everything involving 11 (or more) dimensions, Branes, and Parallel Universes, to name only a few...

Are these not purely formal speculations without any real Physics whatsoever? Of course, they are! 

Mathematics, as a discipline itself, deals in the purest of Forms, which they get from glimpses in Reality, and which are increasingly “nailed down” by the most careful construction and maintenance of Domains to eliminate almost everything but a final, formal relation. What crucifies such methods is that these “farmed” results are then believed to be the eternal, underlying truths of Reality.

They aren’t!

So, the current theoretical position is an amalgam of constantly new facts, due entirely to mammothly developed technology, and the farming of the Domains studied, and the purely formal relations thus extracted, though, of course, always related to those supplied by the mathematicians, who have been studying such Forms, in their own pure terms alone, for millennia.

Indeed, the new legitimacy inverts the established Scientific Method, by expecting to find essences and even new entities, hidden in their beloved Equations, rather than in Reality. While, at the same time, constraining experimental work into the ever narrower, and higher energy area of forced collisions as THE only experiments worth pursuing. And all that is founded upon the assumption (which has become a Principle) of Plurality – where found relations are presumed to be independent of their contexts, and hence actually eternal, additive components, capable of producing any complex situation.

15 April, 2013

Dialectical Reasoning

The revolutionary methodology of reasoning, handed on from Hegel to Marx, was of a very unusual type compared with what had been universally employed previously.

For, being holistic, rather than pluralistic (as literally all prior reasoning had been, and, of course, all of Science certainly had become) the new approach started from the total inter-relatedness of all things, and hence fulfilled the credo, “Everything affects everything else!”.

But, such a stance does seem to totally exclude the possibility of Analysis, which is surely the central plank of the scientific method, and has to look beyond individual (and separable) contributions to integrated and mutually transforming effects at a higher level, to get any sort of handle upon how Reality actually behaves.

But, in spite of these major difficulties, it alone can cope with both Change and Transition as caused processes, and that has to be its critical contribution to human reasoning.

One vital feature was that the multiplicity of contributing factors meant that in any situation both complementary, and even totally contradictory, factors would certainly be present and making a contribution. And any observed overall effect would be the result of the increasing dominance of certain mutually conducive factors over other less effective sets. And even that situation would never be permanent, but would have the ever-present possibility of such a current “solution” being overturned as the general situation changed, and even a directly contrary dominance could come into overall hegemony.

To address qualitative change is very different from purely quantitative changes within a stable situation.

The conceptual model adopted, therefore, became one of contradictory pairs of overall outcomes, and as the most important aspect of the studied situation, its development into something entirely different. It could be dealt with (to an extent) by the activities of these Opposites – the Dialectic of the situation!

Now the validity of this rather surprising approach has been confirmed innumerable times, but only in developments: it is not about stable, quantitative and slowly-changing situations, but about transforming and qualitative changes.

Perhaps before this discussion gets out of hand, the crucial evidence of The Impasse should be brought in?

Most conceptions of situations are far from being the “absolute truth” of it, but are usually an acceptable and useable approximation: the assumptions, processes and even entities involved do get reasonably close to what is going on, and in most stable circumstances “do the job”: conclusions and even predictions can be relied upon. But, such is never the case forever. No matter how clever (or even wise) were our suppositions, there will always be situations where the conceptions and assumption fail. Now, our fallback practice is to have a second-string theory, which also works in some very closely related cases, and we switch to this to see if it does the job here too. And sometimes it does!

We then have two mostly workable alternatives, and we pragmatically switch between them to be in the position to carry on with our objective.

NOTE: But, we must not confuse this with pure unprincipled Pragmatism, as displayed in the current models of the Nuclei of atoms. For there is, at the present count, an unrelated set of some twelve alternatives to juggle between. This two-model alternative is not only much more tightly constrained, but, as it will turn out, much sounder philosophically.

But, there are cases when even these dichotomous pairs fail to deliver anything at all. And this indicates a true impasse, where the possibilities of the current situation have been left behind completely. No matter what we do using these alternatives, they still always lead to a contradiction: they are both wrong! The situation seems to defeat our usually applicable pair of alternatives, and we seem to be able to go no further.

But, as you may already have guessed, our two alternatives can never be wholly arbitrary, or unrelated to one another: they both will have a measure of the Objective Content of the situation within them, and it was that which caused them to become our pair of alternatives.

But the occurrence of The Impasse, instead of being a dead-end, is perhaps the much more productive situation - for it is only here that the necessary transcending solution can actually be addressed.

NOTE: In his book Zen and the Art of Motorcycle Maintenance Robert Pirsig called such situations the vital periods of “Stuckness”: situations to be sought out and welcomed as the places to make real progress in our understanding.

Dialectics takes the naturally emerging pairs of such dichotomies as temporary truths in the short term, but also as anvils on which to beat out a transcending alternative.

One obvious area, totally unintelligible to today’s physicists, involves the alternatives of Wave and Particle in Sub Atomic Physics. And, perhaps, the most famous is the perennial Discreteness and Continuity dichotomy, as Zeno was clever enough to demonstrate so superbly in his Paradoxes.

But surely such a method is not, repeat NOT, predictive, as are most quantitative equations in Science. On the other hand, it does give the person facing a changing situation something to work with: what factors are involved, and which way a transition is likely to occur. In contrast to the usual method in Science, which can ONLY predict within its appropriate and defining Domain (set of producing conditions), the holistic approach is much more general and unconstrained.

The holist alternative does attempt to juggle all the involved factors as changes occur and by defining them into pairs of opposite-yet-possible outcomes, points strongly to a particularly limited pair of possibilities.

It is by no means as crude as it at first sounds, for whatever “wins” in a competition of many contributing and distortable factors, will always get there due to its cooperating, and even integrated and mutually modified, set of conducive factors to ensure dominance. While, when a transition does occur, it will again be to one with a similar set of conducive factors, which are likely to be the opposite of what pertained before.

The natural marshalling of simultaneous factors will always take such a form, for such groupings ensure proliferation best. It is about multiple factors with different directions interacting to lead to a particular overall dominance.

A pluralist equation doesn’t even include what factors are present. It is merely a quantitative relation within a static, non-changing situation: it is incapable of saying why it behaves as it does, and the nearest it can get to suggesting what might replace it, is for it to “blow up” into one of its terminating singularities.

But, though this contribution is only a beginning, Dialectics did reflect the true dynamics of multiple interacting factors in real systems. The seemingly arbitrary concentration upon opposites is NOT what was being inflicted upon the situation by Mankind: it was NOT simply another imposition. For the division into conducive and antagonistic contributions to combined effects did cause related groups of factors to form conducive, mutually-supporting sets or systems. And in any complex situation, the direction of these proto-systems would be defined.

It is also important to understand just how dominance occurs: it is basically a version of Selection, which I have elsewhere termed Truly Natural Selection, and it occurs not only in Living Things, but at all levels, even between chemical reactions, which might compete for the same resources.

And, a working through at this basic level turned out to deliver a viable model. Mutually conducive or supporting processes, where the product of one was the necessary resource for another, would certainly mutually affect one another. And such could even develop into quite long sequences or even cycles.

Clearly, as such systems came together they would really be greatly more successful than lone processes or mutually contending pairs of processes. The conducive systems would soon collar the majority of the available resources and begin to dominate.
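This “collaring” of resources can be sketched as a toy simulation (an illustration of my own, under assumed rate constants, not a model from the text): two autocatalytic processes draw on the same resource pool, and even a marginally more efficient one ends up with the dominant share.

```python
# Toy "Truly Natural Selection" between two autocatalytic processes A and B
# competing for one shared resource R. Purely illustrative Euler integration;
# the rate constants and starting amounts are arbitrary assumptions.
def compete(rate_a, rate_b, steps=10_000, dt=0.01):
    a, b, r = 1.0, 1.0, 100.0            # A and B start equal, with a common resource pool
    for _ in range(steps):
        grow_a = rate_a * a * r * dt     # each process grows by consuming R autocatalytically
        grow_b = rate_b * b * r * dt
        a += grow_a
        b += grow_b
        r = max(r - grow_a - grow_b, 0.0)  # both draw down the same shared pool
    return a, b

a, b = compete(rate_a=0.011, rate_b=0.010)   # A is only 10% more efficient...
print(a > b, round(a / (a + b), 3))          # ...yet A ends with the larger share
```

Nothing here requires the loser to be destroyed: B simply finds the shared pool already collared, which is the sense of dominance described above.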

Yet, such sets would not all require the same conditions and resources, so many such systems would occur. 

The rivalry between them would be of a different character. It would not be direct competition – for they required different things – but efficiency and rate of production would tend to make some systems grow bigger than others.

NOTE: though too early to deal with it here, these ideas have led to the Theory of Emergences, which addresses how the “wholly new” comes into existence – clearly crucial in any complete theory of Evolution.

When the Bastille Finally Falls!

How can we identify the current cul de sac into which Modern Physics has purposely and noisily marched, and not merely criticise as Prophets of Doom, but also be able to present a ready alternative and much better show already waiting in the wings? Now, if such an alternative were both fully assembled and available, as a coherent, consistent and comprehensive standpoint, along with a clearly useable methodology, then there would be no real problem. But, that ideal situation is far from being the case at the present time.

There are, of course, many very good examples that could be brought into any ongoing argument, but altogether too few, and at this time, too little developed, to stand against a united chorus of “Yes, but” type responses from the sizeable majority representing the currently “universally-agreed” side.

For, in spite of the grave weaknesses of that currently accepted position, it has now been “in charge” for a very long time, and in any ping-pong battle, hurling examples from each side, there can be absolutely no doubt, who will have the deciding weight of projectiles.

It is certain, however, that if the philosophical case were allowed to be put, the new alternative would win hands down. But, who has such arguments about Philosophy these days? You know the answer, it is, “Nobody!”

And, the vast expansion in media of all kinds only reinforces that situation. Twitter one-liners dominate these days, so wit will trump argument, and humour will always trounce commitment. So, there is certainly a major problem in getting anything at all complicated out there, and then discussed.

Now, such episodes, when reaction rules, have happened before. There were times after the failures of revolutions across Europe in 1848, when reaction ruled, and even a new Bonaparte was installed as Emperor in Paris. And, similarly, after the demise of the 1905 revolution in Russia, the leadership of the Bolshevik Party was down, as Lenin said, to “You, me and him!”. Yet in 1871, in the aftermath of the Franco-Prussian War, the sans-culottes of Paris rose again and instituted the Paris Commune, while in Russia in 1917 the very same Bolsheviks actually took power, and established the World’s First Socialist State.

So, though the task at present seems impossible to carry out, it will NOT remain so!

The swoop downwards from all-powerful repression to powerless impotence does indeed occur, and it will occur in this task too - and for similar political reasons. As the Crisis of Capitalism again dominates across the World, and the Ruling Class as usual insists upon the Working Class footing the bill, the latter will finally SNAP! And all hell will break loose!

In such turmoil all prior short-odds predictive bets will be OFF, and everything will be up for debate!

06 April, 2013

Issue 29 of Shape

This small set of papers was a response to a significant change in the position of an establishment group of physicists, as their latest adjustment in coping with the continuing and unresolved Crisis in Physics.

For, though for many years (and even decades) mathematical-physicists have been rummaging through the seemingly endless depths of the World of Pure Form alone (Mathematics) for a solution to their evidently pressing need for a Theory of Everything, their many and varied, speculative journeys have become ever more unbelievable.

Yet, without in any way dramatically changing their avowed stance, these theorists have switched their attention to a very different area in the search for this required “end of the Rainbow”, and it is interesting what their new turn has involved!

For it does seem to acknowledge the real cause of their continuing dilemma – the lack of an appropriate philosophy as a basis for their driving laws!

So, from a purely descriptive/predictive pre-occupation with quantitative Form (equations) they have finally turned to the most “philosophical” of the Laws in their collection, with the purpose of finding there the hoped-for salvation.

They have turned away from trusting only Pure Form to instead address Pure Chaos!

Of course, though Mathematics has been, and still is, used even in this area, it is the Second Law of Thermodynamics that seems to fit their requirements most accurately. For it is not a relational law!

If anything, it is a philosophical Principle: that everything is perpetually running down: all Order is dissociating into all Chaos!

It is certainly appropriate in very many areas (and the engineers, who first thought of it, would insist that it pertains absolutely everywhere).

Two major contributions to this standpoint have recently appeared. One in the pages of New Scientist (2886) by Vlatko Vedral, and the other in a two-part TV series by Jim Al-Khalili on the BBC entitled Order and Disorder.

Here are my responses to these positions.

Read them here

Figure and Ground

The Dangers of Simplification

This seemingly interminable series of papers on Fields is a product of the way we always attempt to deal with such phenomena.

We have learned that the most productive approach is to avoid confusing complexity, and, instead, work to simplify situations as far as we possibly can. So, we select & isolate situations, attempting to leave only what we are seeking: we simplify first conceptually, and then concretely until we have both a revealing and amenable Domain - ideally conducive to our further studies.

By now, we are, without doubt, the masters of such isolating and constraining of phenomena in such a way as to “completely reveal” their supposedly “Key Relations”.

It has, indeed, become the fundamental approach for all our experimental set-ups, and, therefore, produces not what we think we have revealed – Fundamental and Universal Laws, but, on the contrary, specific and limited relations locked fast into the specially arranged, conducive situations we have erected.

Thus, our “Truths” are always fragments – particulars. And so, though we crave overarching and universal laws, we never actually get them.

We get a multiplicity of particular laws-plus-their-contexts.

So, with such a complex area as Fields, and indeed ALL actions-at-a-distance, this fragmentation is multiplied even more.

Yet, before this revelation gets too depressing, it has to be emphasized that we certainly know how to use what we currently extract. Our methods have been very successful, for we know precisely where to apply our “partial truths” – in the appropriately constrained situations! As long as these correct contexts are accurately constructed, we do indeed have places where our laws work: we can predict, and hence also produce!

Our methods equip us for production, but also inevitably disarm our ability to explain why things are the way that they are, and behave in the way that they do, when left to themselves!

We are very adept technologists, but not adept scientists (though we think that we are), and, most certainly, are nowhere near being even competent philosophers.

Now, the pragmatists will dismiss any such criticisms of both their method and standpoint, because their purposes are in no way compromised by the inadequacies of their approach.

Continuing “Progress” still appears to be continuously assured. But, of course, without the essential development of understanding as well as straightforward use, what we get can only be an aberrant growth.

It is really a maximal exploitation of a partial truth, rather than a step on the path to an ever wider and deeper understanding of our world. [Like the young man who built me a working Amplifier, but could not tell me why it worked, or what the various components were actually doing: neither could he use what he had to design something new.]

Indeed, if the stream of scientific explanations ceased forthwith, technology (as with my young electrical constructor) would etiolate and die, like a pea shoot without sustenance.

Science is the source and lifeblood of technological progress, and even more important, it can also be the means to actually understand the world.

Now, returning to our problems with Fields, the difficulty is that our isolating and simplifying also walls us off from what we are trying to understand. Such things are not amenable to such methods: Fields are certainly NOT isolatable phenomena! Why can I say this?

It is because the “Figure” and the “Ground” in such situations are not only inseparable, but also actually mutually defining and determining! We simply cannot separate them without destroying what they are.

For example, is a Field actually erected by its “causing” charge, or is it actually a response of the Background to the presence of that charge?

For we usually assume that our Grounds are always totally inert – mere formal references, whereas the holist suggestions outlined above change all of that!

The two always have a reciprocal relationship, and perhaps an evolutionary one too.

Now, rather than halting the conclusions here, and arguing whether these assertions fit all cases or not, let us first concede something called Dominance.

Though the philosophical basis for the ideas being explained here constitutes Holism, they are NOT the same as that early version espoused by The Buddha, though they are still much closer to his position than to that of the sub-atomic physicists of today.

It does, in contrast, admit that things are not all of equal weight, and in many situations particular relations can dominate to such a major extent that they can be fairly easily isolated, extracted and then used in the pluralist sense described above as the usual scientific experimental practice.

But, “Exceptions always make Bad Law”, and Dominance is not triumphant either everywhere, or permanently.

It is a surface effect, upon a holistic World, where literally everything does indeed affect everything else, and in many crucial areas we have to deal with not only Systems of Processes, but also hierarchies of such Systems too.

A great deal is always going on simultaneously, and our Simplifying, Isolating and Constraining in order to extract any usable order does indeed change the overall situations that we are trying to understand.

The classic example is, of course, the Weather, but there are many cases where such situations also defy Analysis by our usual pluralistic means.

My favourite is Miller’s Experiment, wherein he attempted to make an emulation of the conditions upon the primitive Earth – before Life had emerged, in the hope that he could reveal something of the developments leading to that revolutionary Origin of Life.

Sealing “everything necessary” in a glass containing-system, and adding heat and electrical discharges (as lightning), he set the system in motion, which was as near as he could get to the actual primaeval Weather System, in order to see what might occur.

As we all know, after only one week, the water in his system had already turned a deep reddish-brown, and on dismantling the system, he was able to show that amino acids had somehow been synthesized.

But as to how this had happened, there was no way that he could confirm the processes involved.

The absolutely essential isolation from any present-day contributions, also prohibited any time-based Analysis, and most certainly, many strands of changes must have been happening throughout that momentous week, both as parallel simultaneous processes, and as parts of crucial ongoing and changing sequences. So, without any possibility of intervention, NO further explanations were possible.

This is, and always has been, the classic dilemma of investigating a Holist World using the only available methods – pluralist science could get nowhere in such investigations. They seemed to be Unknowable. And in spite of the undoubted success of Miller’s Experiment, it was also the “end-of-the-line” in most scientists’ eyes. Pluralist science offered a great deal more, and it was there that ALL the research was concentrated.

So, these inevitable cul de sacs in attempts to develop a Holist Science did dissuade anyone else from embarking on such a seemingly doomed-to-failure route.

Yet, it would be wrong to consign this approach to the dustbin just yet. Darwin’s Origin of Species was a masterpiece of Holist Science, and other major holist contributions have also been made. But the philosophical ground for a general holistic, yet scientific, approach has still not been defined: it still awaits a generally applicable methodology!

Now, this author has attempted to apply such a method to the infamous Double Slit Experiments, beloved of the currently dominant Copenhagen School in Sub Atomic Physics, and he was finally able to explain all the anomalies involved, without any recourse to Wave/Particle Duality or the probabilistic formulae of the Copenhagenists.

So, with this demonstration the Copenhagen View was proved to be NOT the only possible approach, and he has since embarked upon a particular area of Physics, which has long annoyed him.

It is, of course, Action-at-a-Distance, the propagation of electromagnetic radiation through totally Empty Space, and, of course, the “daddy-of-them-all” FIELDS!

So, let us assume the very worst!

Let us say that our “Figure” is really composed of multifarious and mutually determining processes, while our “Ground” is not only very similar in its diverse content, but also both determines the behaviours of the contents of our supposed “Figure”, and is, in turn, modified by them.

Now, here is surely a suitably messy situation to attempt to make sense of.

How might we do it?

Well, we do have a vast set of pluralist techniques, that though compromised conceptually, do give us “something”; and what we get is never merely pure invention, it always contains some aspects or fragments of the Truth. So, as long as we don’t wander off down the usual road, we can use these gains in a different way.

Though all gains made by such methods are always predicated upon restricted and maintained Domains, they do include an important measure of what is called Objective Content.

So, rather than careering off down the pragmatic sweet, downhill road to Production, we should gather as many closely related sets of pluralist Results as possible, and attempt to make some sort of conceptual integration out of them instead.

And, with such a change of philosophy and of methodology things can change profoundly.

We now consider all the skewed, pluralistic evidence, knowing that it has been extensively processed, and hence treating much of what we have with a measure of scepticism, while attempting to formulate a common explanation – one that would, in each biased pluralist set-up, produce what has been extracted, but would integrate all cases into a single explanation.

Now, at this point we must address the universally applied frig that is the traditional answer to their “sets of pluralistic results”.

That frig is the belief that each pluralistically obtained relation (a Law) is in fact the actual Truth for those factors, and if we simply add all such obtained Truths together, totally unmodified, we will get True Reality.

It replaces the true inter-relating integrations with crude Complication. The various Laws are summed to reconstruct what really happens.

NO THEY DON’T! What has to be done is to attempt to merge the individual isolations into a functional and integrated whole. That is much more difficult, but is essential!

NOTE: The alternative to the Copenhagen explanations of the Double Slit Experiments – my own holist alternative – was amazingly different in every possible way. And though the Copenhagenists could immediately motor off with their probability equations, they also brought understanding to a dead halt. Whereas the holistic explanations have opened up theoretical prospects not only in these areas, but generally!