27 January, 2019

Issue 63: The Dialectics of Natural Selection


This edition re-issues my work on Truly Natural Selection, on its 10 year anniversary, alongside some more recent contributions to this vital subject.

This series of papers extends Natural Selection beyond the Living World and into Reality in general.

It sees all “complication” not just as a summation of Parts, but as a necessary development of things, involving wholly new features, when it is usually, and correctly, renamed Evolution.

Where with Life we have the mechanisms of qualitative change as variation based on mutation, plus selection via competition, this more general form drives change via selection between competing chemical processes, and the significant transformation of context.

Fitness to survive, reproduce and prosper in the form which drives Evolution, is replaced in the more basic form by advantage to conducive, complementary processes and the successive transformation of the underlying situations entirely without Life being either present or necessary.

This view of Reality runs entirely counter to the famed Second Law of Thermodynamics, and therefore requires physical explanation. We can give one in terms of context: the Second Law is a product of interludes of maximally constrained stability, while competitive advances-in-order occur in quite different situations of unconstrained, maximum opportunity. And these alternating phases turn out to be natural features of systems driven in cycles of any kind. The pattern of longer periods of relative stability, interspersed with short interludes of radical, qualitative change, is, in fact, the norm in the trajectories of such systems.

And the Key Events in these processes are the revolutionary episodes, which we call Emergences. Clearly, the most significant and undeniable of these has to be that which produced the very first Living Things. And this Event alone confirms that Selection in some form must have preceded Life! It was the source of Life on Earth.

Many important fallacies are addressed in these papers, including the usual mathematical definition of Probability, and its false use as a Cause of Life. And, most crucially, we address the concept of competition involving mutually conducive and mutually contending chemical processes, which are necessary for Selection in these circumstances.

The crux has to be the revolutionary Events called Emergences, which had clearly already occurred throughout the history of Reality, prior to the Emergence of Life, and which are generally ignored by most current Science.


Truly Natural Selection extrapolates Darwin’s Natural Selection backwards into non-living systems, and the competition between simultaneously acting processes, involving both the consumption of resources, and the generation of consequent products.

Such active systems would invariably transform their own bases: rampant positive-feedback situations would always dwindle as necessary resources were used up, while other processes could accelerate due to the adequate production of their resources by yet other processes. Now, apart from such relatively independent processes, there will always be other relations between simultaneous processes, all the way from necessary sequences of dependent processes, to mutually-supporting, conducive processes, and, at the opposite extreme, mutually-contending and opposing processes.

So, even in such non-living mixes, the processes would directly affect one another, and a kind of competition would most certainly ensue. And along with these, there would also invariably be the ever-present, one-way, Second Law of Thermodynamics type processes, which would seemingly prosper on a wide range of products and effectively benefit, parasitically, from all available productive processes.
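The competitive dynamic sketched above can be illustrated with a toy numerical model. This is purely my own illustrative sketch, not the author's formalism: an autocatalytic process A booms on a shared resource and then dwindles as that resource is used up, while a second process B can only accelerate once A has adequately produced B's resource.

```python
def step(r, a, b, dt=0.01, k1=0.8, k2=0.5):
    """One Euler step of the toy process-competition system."""
    rate_a = k1 * a * r      # autocatalytic: A's product catalyses its own production
    rate_b = k2 * b * a      # process B consumes A's product as its own resource
    r = max(r - rate_a * dt, 0.0)               # the shared resource is used up
    a = max(a + (rate_a - rate_b) * dt, 0.0)    # A's product: made, then eaten by B
    b = b + rate_b * dt                         # B's product accumulates
    return r, a, b

state = (10.0, 0.1, 0.01)    # initial: shared resource, A's product, B's product
for _ in range(5000):        # 50 time units
    state = step(*state)

r, a, b = state
print(f"resource left: {r:.3f}")                      # the feedback boom exhausts it
print(f"A's product: {a:.3f}, B's product: {b:.3f}")  # B ultimately prospers on A
```

The positive feedback in A burns through the resource early on, after which B, fed by A's output, overtakes it: a crude picture of selection between competing processes with no Life involved.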

These ideas, in a totally holist way, were developed to extend concepts originally thought to be confined only to the Evolution of Life, first to its actual Origin, and thereafter to the whole spectrum of developments that have occurred ever since the start of the Universe.

And these ideas finally became a cornerstone of my Theory of Emergences, and a new dialectical view of selection and change in nature.

25 January, 2019

Genes and Dialectics

Considering the currently-dominant reductionist methodology in the study of Genetics, it is crystal-clear that the same sort of drawbacks that are limiting current Sub Atomic Physics, will also restrict Genetics in similar ways.

Let us see what those ways are!

The first stance to emphasise must involve an essential switch from Plurality to Holism, as it is clear that functional areas in a Genome are NOT merely, or even solely, some fixed blueprint for the construction-of, and the processes-for, that Living Organism, in the usual sense at all. For, if I have it right, individual genes can not only influence functional processes elsewhere in the body of the organism, but also provide THE mechanism for the future Development of the whole collection of that species of organism, over successive generations. For it is THERE that the initiating changes take place!

And, the question, of course, must be "How?"

Now, to prepare to attempt to answer that question, I can only start by re-stating the alternative general Holist view of interactions. For, they naturally involve multiple, simultaneous and active factors, which, in most circumstances, do something akin to a Vector Sum, wherein individual directions are involved, as well as their more obvious contributing weights. In other words, you can get both support and/or contention from the various contributions to deliver an overall result - and that will not always be the same! Indeed, it will differ at various significant stages in a particular development.

Now, depending upon circumstances, some of these could selectively-cancel, leaving a dominant "summed" result - looking very similar to one of its components, or, surprisingly, something like that of its opposite.
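The "Vector Sum" analogy above can be made concrete with a few illustrative numbers (mine, purely hypothetical): each factor carries a weight and a direction, and contending contributions can largely cancel a supporting one, so the net result may resemble one component - or, surprisingly, its opposite.

```python
import math

def vector_sum(factors):
    """factors: list of (weight, direction_in_degrees) contributions."""
    x = sum(w * math.cos(math.radians(d)) for w, d in factors)
    y = sum(w * math.sin(math.radians(d)) for w, d in factors)
    return x, y

# One strong supporting factor, and two smaller contending ones
# acting in exactly the opposite direction (hypothetical weights):
factors = [(5.0, 0.0), (3.0, 180.0), (3.0, 180.0)]
x, y = vector_sum(factors)
print(round(x, 6), round(y, 6))   # -1.0 0.0: the net result *opposes* the largest factor
```

Shift any one weight slightly and the summed result changes sign or direction, which is the point: the same set of factors can deliver quite different overall results at different stages of a development.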

Relevant studies by this author can be found on the pre-Life stages in Truly Natural Selection. These have just been re-published in the latest issue of SHAPE (63).

Issue 63 of SHAPE on Natural Selection

Both of these are cases of Stability-maintained results, where small changes in one factor are compensated for by the very same causes eliciting balancing (restoring) changes in another, opposing factor.

Now, the above ideas emerged when considering relatively simple dynamical situations, but can, and indeed must, be extended to processes and even to complex Systems.

This rich, living world is no Lego-build, as is assumed by Plurality! Looking at the building blocks alone will undoubtedly prove insufficient.

For example, if we start by considering chemical reactions (processes), the factors involved are always both more numerous and of qualitatively different kinds. Both resources and products are related in a given chemical process. And the product of one can be a required resource for another. So, extended linear sequences could be linked by products-as-resources into chains of processes. And even Cycles become possible, where the End-Product of a linear sequence becomes the necessary Resource of the first process in that same series.

So, apart from the simpler stabilities, these relatively self-resourcing cycles produce a very different kind of Stability indeed, which also can deliver a whole variety of secondary products, which have proved to be absolutely crucial in Living Things. Considering these more complex aggregations, there will regularly be both secondary required resources and consequent extra products, involving many of the successive links in such structures: and, consequently, intricate networks will result - all-of-which will be susceptible to changes in resource supplies, both overall and internally within the structures.
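The chains, cycles and secondary products described above can be sketched as a small graph of processes, linked wherever one process's product is another's resource. The process names and the encoding are my own, purely for illustration:

```python
# Each hypothetical process is described by what it consumes and produces.
processes = {
    "P1": {"in": {"A"}, "out": {"B"}},
    "P2": {"in": {"B"}, "out": {"C"}},
    "P3": {"in": {"C"}, "out": {"A", "X"}},   # X is a secondary product
    "P4": {"in": {"X"}, "out": {"Y"}},        # fed by the cycle's by-product
}

# Build the product-as-resource graph: edge P -> Q when P's output feeds Q.
edges = {p: [q for q, spec in processes.items()
             if processes[p]["out"] & spec["in"]]
         for p in processes}

def find_cycle(start):
    """Depth-first walk; returns a closed chain of processes, if any."""
    stack = [(start, [start])]
    while stack:
        node, path = stack.pop()
        for nxt in edges[node]:
            if nxt == start:
                return path + [start]          # back to the beginning: a cycle
            if nxt not in path:
                stack.append((nxt, path + [nxt]))
    return None

print(find_cycle("P1"))   # ['P1', 'P2', 'P3', 'P1'] - a self-resourcing cycle
print(find_cycle("P4"))   # None - P4 hangs off the cycle but is not in one
```

P1-P2-P3 form the self-resourcing cycle, while P4 is one of the "secondary product" offshoots: disturb the supply of any one resource and both the cycle and its dependants are affected, which is the network susceptibility the text describes.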

The problems of the usually-applied pluralist approach are immediately evident. For ALL pluralist experimental set-ups are expressly designed to remove "all-but-one" of the involved factors, and then to promote the single remaining, isolated relation into an eternal Natural Law - always deemed to be totally independent of any of its contexts.

And, remember, these revelations were made when considering supposed "bottommost rungs" occurring in real world developments.

But, even these criticisms are nowhere near enough!

Mankind soon concentrated historically upon things that didn't seem to change qualitatively at all, as revealing the truly essential factors - it being their dominance that maintained things so well. We actually chose precisely those things, which effectively hid the crucial mechanisms of change, and consequently of Development too!

But, isn't the most important purpose of genes to deliver such developmental changes - so that, somehow, when a particular mutation of a gene becomes established, it, along with others, produces the advantageous change in the resulting organism that contributes to the dominance of that change via Natural Selection?

And, of course, the functions of the Genome will be far more complicated than the analogies called upon so far! For example, how does the Genome deliver its "instructions" - not only to the relevant tissues, but also internally, within the Genome itself? And, of course, exactly when do these processes take place?

Is it before conception in the gamete producing areas?

Is it after conception, but before the development of the new foetus?

Do many achieved unions get rejected very early on: do they get ejected regularly?

And, surely, many functions will only be activated later either during the pre-birth developments, or at particular times, post-birth, in the maturation, and even in the eventual decline, of the organism.

Now, the communication problem is solved in a remarkable way, for every single cell of the organism contains a complete copy of the whole Genome, as well as a secondary set held within its Mitochondria, so many necessary communications are present locally within the relevant cell.

But, in far more dispersed situations, actual communications had neither direct routes to follow, nor unique "addresses" to guide the message to the right place. So the "messages" were produced in abundance and broadcast literally everywhere, via the bloodstream; and, precisely because they were sent everywhere, each would indeed finally reach its intended target, where it would be recognised by finding its own mirror-image shape to connect perfectly with, triggering its necessary function.

But, such mechanisms generated problems, when dispersed elements had, not only to be both located and activated, but also despatched to the right place, along with others similarly located and despatched, to arrive at the right time and place to deliver the necessary functions.
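The broadcast-and-recognise mechanism just described can be caricatured in a few lines. This is a deliberately crude sketch: the "shapes", cell names and mirror rule are my own stand-ins, not biochemistry.

```python
def mirror(shape):
    """The complementary 'shape' - here crudely modelled as a reversed string."""
    return shape[::-1]

# Hypothetical cells, each holding receptors for particular signal shapes.
cells = {
    "muscle": {"receptors": {mirror("abc")}},
    "liver":  {"receptors": {mirror("xyz")}},
    "skin":   {"receptors": set()},
}

def broadcast(signal_shape):
    """The signal carries no address: every cell sees it, but only a cell
    holding the mirror-image receptor responds."""
    return [name for name, cell in cells.items()
            if mirror(signal_shape) in cell["receptors"]]

print(broadcast("abc"))   # ['muscle'] - only the matching receptor responds
print(broadcast("qqq"))   # [] - the broadcast reaches everyone, triggers no one
```

The point of the sketch is the addressing trick: sending everywhere and relying on shape-recognition at the destination removes any need for routes or addresses, at the cost of the timing and coordination problems raised above.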

I have to say that much of what I see about this on the web fails to address these problems, and instead expects to find all the answers within its ever-growing database of Genomes, but with no detailed description of the mechanisms involved.

This brief paper can do no more, at this stage, than indicate what is missing from this Science, which clearly will require a veritable revolution in the necessary underlying philosophy, before the really crucial questions can be adequately addressed.

The dialectical door was found by Hegel, and we were led through it to materialism by Marx, but the crucial domain of Science has not been comprehensively addressed from this stance, and until it has, such wrong turnings, as are evident in areas like Genetics, will not be overcome.

20 January, 2019

The Ghosts in a Ghost Substrate

A Hauntograph by Michael C Coldwell

The following is a quote from an article in New Scientist (3205) called How a ghostly, forgotten particle could be the saviour of physics, and I extract it here long before the writer gets to revealing which particular particle she is telling us about. For they are indeed legion in the standard Copenhagen approach, and also because there is an alternative theoretical stance which has proved that it can cope very well indeed with such problems, but which is currently anathema in consensus Sub Atomic Physics.

"THIS is the story of a particle that has refused to die. For 50 years, it has haunted particle physics, with hints of its presence appearing in maddeningly ambiguous ways. Some believe they have seen it. Others think it is a figment of our imagination. But every time we think it is definitely not there, a sudden gust of wind knocks over the furniture and once more there is confusion." 

Ghosts in New Scientist 3205

Now, the most fleeting of particles actually occur at the very heart of the Copenhagen Interpretation of Quantum Theory: indeed, they involve absolutely all of those included in the concept of Wave/Particle Duality, wherein such entities can sometimes act as classical Particles, while at other times they act as if subject to their own intrinsic Wave, which somehow determines, but does not reveal, where they are!

The reason for such remarkable behaviour is never explained, but instead is "put down to" Heisenberg's Uncertainty Principle, which makes the Sub Atomic Realm very different from the rest of Reality, by being totally indeterminate.

Yet, the suggested physical alternative to Copenhagen, achieves a general resolution of all its founding anomalies present in the ill-famed Double Slit Experiments, merely by including a Universal Substrate in the situations.

Though, it has to also be undetectable!

Now, I bring it up here, because just such an Undetectable Universal Substrate has been theoretically-defined, solely in terms of mutually-orbiting pairs of diametrically-opposite Leptons. And, several crucial Units of that Substrate involve the very Neutrinos that are considered in this article.

So, let us investigate exactly how the presence of just such a Substrate affects phenomena occurring within it.

For then, with such a Substrate, the Wave/Particle Duality construct of Copenhagen dissolves into a classical Particle interacting with the Substrate to produce Waves, which are transformed by passing through the Slits and thereafter interfere with each other, finally affecting their slow-moving, causing Particle when it arrives at the resulting Interference Pattern in that Substrate!

And thus such delayed interlopers, having been affected by their differing particular passages through that changed Substrate, finally produce, as Particles, the overall pattern on the detection screen.

Waves are rendered once more properties of extended, connected Substrates, when disturbed by a Particle: they scamper ahead and ultimately produce an interference pattern, which then affects its own much slower-moving cause.

And, the required undetectable Units for such a Substrate, are indeed possible via mutually-orbiting pairs of diametrically-opposite, composing sub-units.

You only have to consider Pair Productions and Pair Annihilations, each involving one electron and one positron! For we are informed that such processes convert Energy-to-Matter and Matter-to-Energy! How??

How about changing from and to mutually-orbiting-pairs instead? The pairs cannot be detected and so the particles appear to vanish or appear, as if from nowhere!

Ghost Particles by Michael C Coldwell

For, such have actually been observed to occur at Fermilab, in the Tevatron, and were named there as Positroniums. And, as it happens, all the other Leptons could also be so linked too!

Aren't such ghost particles usually called Photons - meaning disembodied Energy gobbets?

And, it makes more sense than electrical and magnetic sinusoidal oscillations, acting at right angles to one another, and carrying Energy, supposedly in totally Empty Space - not to mention turning into physical matter and antimatter particles spontaneously! How all this is meant to happen has never been explained.

In this alternative model it is. The full nature of electromagnetism is indeed encapsulated in an orbiting charged Particle, including the involved Energy, which occur in such mutually-orbiting pairs!

Now, I am well aware of the consensus stance, but reject it both philosophically and physically, for I consider such a purely Maths-based stance to be merely an idealist formal construct: indeed a wholly pluralist complexity, embodying all the premise-errors due to an insistence upon Eternal Natural Laws, and also to the total omission of all qualitative changes from all of Mathematics, as well as all consequent non-dialectical Reasoning too.

Though, the complexity of their multi-dimensional Mathematics effectively hides it, their involved philosophical stance is an illegitimate amalgam of several wholly contradictory stances, apparently justified only by the immature Pragmatism of "If it works, it is right!"

The problems outlined in this piece can be transformed, once a Universal Substrate of the kind outlined above is involved, and not only due to their actions as parts of that Substrate, but also as occasionally, temporarily free-moving units dissociated from their usual Substrate roles.

For example, the theoretical Substrate research has revealed the existence of several different modes of those units - both as transformed Substrates, and as free-moving Streams and Vortices - the latter allowing a totally non-Copenhagen explanation for quantised electron orbits in atoms.

"The bad news is that the latest round of experiments set up to look for it claim that it can’t possibly be there."

But, consider again the theoretical research - when particles can exist in three different modes as :-

  1. Free-moving as mutually-orbiting pairs
  2. Existing as a part of the Universal Substrate
  3. Free-moving as parts of a dissociated Substrate Unit

For then the described anomalies will be due to the transfers in-and-out of being Substrate Units, and in-and-out of being mutually orbiting Pairs!

But then:

"As long ago as the 1960s, however, physicists measuring the quantity of electron neutrinos reaching Earth found a major shortfall, with one experiment detecting only 25 per cent of the expected number."

Clearly, with a totally space-filling Universal Substrate composed, in part, of just such Particles, the capture of some of that Solar Stream into it seems more likely than not!
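As a purely arithmetical illustration (my parameters, not a physical claim by the author): if some fixed fraction of a free particle stream were captured into such a substrate per unit of path travelled, the surviving, detectable fraction would fall off exponentially with distance - so a capture mechanism can reproduce a shortfall like the quoted 25% just as readily as flavour oscillation can.

```python
import math

def surviving_fraction(capture_rate, path_length):
    """Fraction of the stream still free (hence detectable) after path_length,
    assuming a constant capture rate per unit of path (hypothetical)."""
    return math.exp(-capture_rate * path_length)

# Pick the (hypothetical) rate that reproduces the quoted 25% detection:
path = 1.0                               # normalised Sun-to-Earth path
rate = -math.log(0.25) / path            # solve exp(-rate * path) = 0.25
print(f"{surviving_fraction(rate, path):.2f}")       # 0.25 at the full path
print(f"{surviving_fraction(rate, path / 2):.2f}")   # 0.50 at half the path
```

This settles nothing by itself, of course: it only shows that the shortfall alone does not discriminate between the two explanations.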

The difference between a Totally Empty Space, and one filled with a complex Universal Substrate, surely has to be colossal. Indeed, not only would such a Substrate allow the explanation of many phenomena, but it would turn a vacuum peopled with only colliding particles, into a maelstrom of turbulences, propagations, transfers and Energy.


"Rather than being massless, each neutrino did in fact have a tiny amount of mass, no more than a millionth that of an electron. This mass gives neutrinos a remarkable ability to switch between flavours"

Now, all known Neutrinos have been integrated, via mutually-orbiting Pairs, into my Universal Substrate, and designated as potential Gravitons, or alternatively as a similar, though much tinier, Photon. So, clearly, in addition to the usual releases and captures to-and-from the Substrate, there will be regular dissociations and re-associations of pairs to-and-from their individual components, while the mini-photons will also be literally everywhere.

Thus, such a Substrate area, with its multiple processes, could very easily be misinterpreted as described in the above quote!

"That meant electron neutrinos produced in the sun’s core could transform into either muon or tau neutrinos, evading our searches on Earth."

OR, as I hope has become clear, a Universal Substrate, containing all these kinds of neutrinos, in abundance, will undergo dissociations due to the surges from the Sun, so enabling their detection as part of that Solar Stream. Not an oscillation between different types at all!

And the article goes on to mention many other anomalies - all succumbing to the same explanation as I have outlined above. I will not list them all!

But clearly, they all assume that these happen in totally Empty Space - either occurring naturally "in Space", or artificially maintained as in all pluralist experiments. But, clearly in neither situation is the undetectable Universal Substrate actually removed! It is always there, but in different states, perhaps.

However, once you have embraced the infinite variety of Ideality, causal explanations are no longer open to you, and instead you get this type of thing -

"the idea is to invent a fourth, “sterile” flavour of neutrino capable of shape-shifting into any of the other three."

Need I say more!

Now, to those who demand full explanations from myself, as the dissenter, may I say that I am just a theorist, and the task, if it is considered to be worthwhile by other Physicists, is surely to find ways of experimentally investigating my suggested undetectable Universal Substrate.

But, I'm afraid this will be impossible pluralistically!

The whole stance of current Physics prohibits such a task, which involves a complete revolution both philosophically and experimentally.

Finally, having read this article more than once, it has to be said that it reeks of Modern Mathematics, as it must, of course, if the only place to look for "reasons" is Form, therefore redirecting the focus totally from concrete Reality's Properties & Causes to Ideality's infinite Forms.

Notice how even experiment has become subordinated to conforming to the Mathematics!

06 January, 2019

New logo based on El Lissitzky's Red Wedge


Welcome to SHAPE

Special Issue 62: Change

This new special issue features a single essay on the crucial subject of Changing Reality, and our general inadequacy in dealing with it.

Even in the obviously mutable structures we erect, from houses to cities and societies, we fail to notice, address and understand the change taking place. And when we look at the natural world around us, we have even greater difficulty, deferring instead to eternal gods or unchanging Laws of Nature, ignoring the incessant flux, its rhythms and its tempos...

05 January, 2019

Real Holism?

What is Holism? It isn't the usual New Age pseudoscience which unfortunately co-opts the term. 

Having commenced reading a book upon supposedly "Holistic Science", I was surprised how very differently, and with very different conclusions to my own, that very idea of Holism was handled.

The Holistic Inspiration of Physics: The Underground History of Electromagnetic Theory by Val Dusek

For, my emphasis strongly contrasted Holism with what I term Plurality as crucial-and-opposite premises, in considering the World, whereas, at least in his introduction, the writer contrasted it with what he termed "Atomism" and Reductionism (or even "Mechanism"). His emphases upon top-down-causality as distinct from bottom-up-causality seemed to primarily define his Holist Stance.

Yet, in all Causality, it has to be Properties that primarily enable explanations of phenomena: and though this does seem easier in bottom-up explanations, and therefore naturally tends to favour Reductionism, the universal resilience of Stabilities, regularly overcoming challenges from below to nevertheless persist, also lends credit to some sort of apparently "top-down" causality too.

But, the absence of any reference whatsoever to Plurality does not bode well, and I will explain why this is so.

Plurality actually enabled Abstraction, for the very first time, to be greatly and powerfully extended into wholly new areas. It happened when the ancient Greeks invented Euclidean Geometry, and thereafter the much larger discipline of Mathematics that it begat. It was simultaneously a Great Enabling Revolution, and yet, at the same time, a potentially-destructive "time-bomb" for possible further developments at a later, more advanced stage in Mankind's intellectual thinking.

For, Abstractions don't exist as such, physically: they are aspects of Reality that can be extracted and discussed, because they reflect something important in what they are aspects of, while also being common to a wide range of other physically existing things too.
Number is the clearest example!

NOTE: But, as we will see, the absence of even a mention of all this, and the emphasis upon "top down drivers" opens the door implicitly, without direct mention, to Principles like that of Plurality, as well as to the allocation of Cause to Form - to mathematical versions applied to physical relationships too.

Mathematics is entirely Pluralist!

But, the Greeks took Abstraction further, to include descriptors of Shape, Position and Direction. And, they soon were able to relate these into "always true" Rules: which they established into a system of relationships, between these, amounting to a kind of Spatial Reasoning.

But, only of fixed Abstractions.

Now, though quantitative features can vary, the Principle of Plurality insists that the formal Abstractions and their inter-relationships don't: they cannot change qualitatively.

And, this limits Mathematics to only those areas where such will always be the case.

But, and this is very important, the success of Mathematics caused the Greeks to extend Plurality to general Reasoning too, delivering Formal Logic, and later to Science also - and this limitation to everywhere in these contexts is most certainly NOT legitimate!

Indeed Pluralist disciplines cannot deal with qualitative developments of any kind: they can only apply if the situations are either naturally or artificially kept rigidly-unchanging in the "essential" qualities involved, and only vary quantitatively.

Clearly, not even mentioning this crucial ground in his introduction does not bode at all well for where he personally will be taking the important topic.

In addition, many other features, that are often mentioned, also need explanations.

For example, how on earth can top-down causality work?

It cannot be the same as its role in bottom-up causality: instead of many-to-one causalities, which simply add up, many-to-many cannot be so easily dealt with, for the actual containing Context is also being changed, and even multiple Recursions, via a series of bottom-up caused changing Contexts, can often be unavoidable. Top-down causalities therefore often drift into one-to-many restrictions, such as Principles or Iron Laws from "on high"!

NOTE: this criticism also applies, though less often, in bottom-up causality; when it does occur, the established relation being assumed no longer pertains, and a partially, or even wholly, different one is usually required.

We call the situation when Plurality pertains the state of Stability, and, when it irretrievably fails, that of Instability!

NOTE: Science, as originally conceived of after the Greek Revolution, could only be relied upon by successfully imposing the most extensive and rigid maintenance of the Context - tailor-made for a single aimed-for Law!

Nevertheless, with the rapidly increasing skills of the scientists and technologists in both achieving and maintaining those requirements, the successful application of these methods rapidly increased.

I'm afraid that up to the end of his Introduction, his stance appeared to be, so far at least, wholly inadequate! 

But, there is an excuse for the current mess that is evident in the theoretical and philosophical bases in this area. For, they consequently reflect the dire standards, both in Education, and perhaps even more so, due to those same inadequacies in the political organisations of the Working Class. 


For, though the means to avoid these errors had finally been established - first by Hegelian thinking, and then, more generally, by Marx in his Dialectical Materialism of concrete Reality too - it was never comprehensively applied to Science, despite being at the forefront of philosophy from the early 19th century onwards.

The still fast-moving monolith of Science carried on exactly as before, constrained by its dominance by those committed to, and staffed by scientists exclusively from, the privileged classes.

But Marxism had been taken on by many in the Working Class Movement, and, hence, it was down to that strand to also apply it to Science.

But, to the detriment of both, that didn't happen either!

And, as indicated by Dusek's Introduction, this book will not be tackling the real problems either! Nonetheless, I will be following this initial critique with further contributions in the near future...

03 January, 2019

DNA and the Social Development of the Brain?

In a CARTA video available on YouTube, Leah Krubitzer delivers a remarkable alternative to the usual Genome-dominated idea of development in the brain. It clearly demonstrates that though the forms-of-development within the brain are solely determined by genetics, the actual contents of those developments are NOT so determined at all, but are primarily influenced by actual use, via behaviours in the real world - especially by behaviours caused by major crises inflicted upon the recipient.

This "seems" to return to the oldest problem of all in explaining development!

Is it wholly blue-printed within the DNA (Genome) of the living entity, which effectively determines everything that subsequently occurs, though occasionally changed by what are wholly accidental Random Damages to individual Genes, OR, can things learned during life be passed on to descendants as "Acquired Characteristics" (such as in current epigenetics and a return to Lamarckism)?

For, her contribution could also be seen in that way, but that certainly isn't, and indeed wasn't, her case either. What Leah Krubitzer reveals is more a development of the former than a return to the latter, as will, I hope, be revealed!

Indeed, various studies compared, over many individuals, the brains of unusually sensorially-equipped animals occurring naturally, with those of others having had their sensorial means artificially restricted.

The results were remarkable!

Actual use, over time, had physically-changed the brains involved.

Particular areas of the brain appeared transformed in the animals artificially reduced in sensorial abilities, so as to enhance some of the abilities that remained - ending up with a brain structure similar to that of the duck-billed Platypus, whose brain area dedicated to its super-sensory bill is relatively enormously enlarged. But the changes noticed, in both, amounted to a great multiplication of the connections from that part of the brain to other areas within it, with consequent increases in those particular areas too: the behaviours made possible were down to those vastly increased connections, and NOT to wholly new functional areas.

The brain was physically changed by enhanced behaviours, and then, via those new connections, it also enabled wholly-new further development of such behaviours.

But, there is a great deal more in these findings than what immediately presents itself.

Notice that the duck-billed Platypus, as a species, hasn't changed much in vast periods of time, and yet the particular individual animal (a possum) deprived of its sight changed at a remarkable rate, and made all the necessary developments very early in its life - the initial period was absolutely crucial.

Nothing new was available to that animal, indeed it was a deprivation that precipitated the developments.

But, we already have a process in which the given Genome of an animal delivers all its "built-in" behaviours, with any changes being down to random chance mutations of genes. And, in addition, it is supposed to be only by such purely random damage to the genome that any changes can ever happen, and thereafter by Natural Selection, which chooses the best adaptations to predominate over succeeding generations.

Is not that single one-way causality also somewhat challenged by these findings?

For, the deprived individual animals developed not only new connections in their brains, but also enhanced, or even new, behaviours benefiting from those changes.

Yet, the developments in the brain must also be inherited! How else do modern animals including ourselves come to be as they are? And, at the same time, why do some organisms remain unchanged for millions of years?

Mankind has, in the past, clearly revealed important processes in this area, but, as usual, always conceptually simplified them in order to more easily develop them further - for simplifying Abstractions always enable such things (see all of Mathematics!).

And, there has to be more to the largely redundant and unused majority of any organism's Genome, where the genes seem to sit in "rooms used as depositories of rubbish, and like those in many a Stately Home, full of no longer used cast-offs".

I don't believe in 'junk' DNA! It is more likely that this represents The Past of that organism in some way, in its evolutionary development: occasionally transformed by mutations, but later bypassed, either temporarily or permanently - a repository of things that once worked. Could this be how prior evolutionary solutions return, in supposedly convergent evolution, such as the return of fish-like traits in sea mammals?

Primarily, though, the dead weight of our pluralist history in Mathematics, Reasoning and even Science has imposed upon our thinking the myth of eternal Natural Laws and Reductionism, which fatally damages our ability to make sense of such Development at all, by falsely converting it into the mere Complexity of many summed-fixed-things and laws.

The consequent missing ingredient was therefore Real Qualitative Change, and hence the absolutely necessary means of ever understanding Creative Changes.

Darwin, Engels and Marx

Now, since Hegel and then Marx, the methodology of Dialectical Materialism had been devised to address such developments, culminating only in 2010 with the Theory of Emergences (by the writer of this paper), which finally tackled the trajectory of alternating Stabilities and Emergences that characterise Qualitative Development in literally all spheres.

And, echoes of it are clearly shown in the case of the deprivation of sight in a possum leading to a rush of developments in other senses - for in the Theory of Emergences the termination of seemingly permanent Stabilities can only be precipitated by crises that cannot be resolved from within them, necessarily resulting in both a collapse of the current Stability and thereafter the construction of a wholly new one, finally established with a wholly different set of new capabilities!

02 January, 2019


  1. Is wealth the legitimate reward for hard work?
  2. Is it the necessary spur to absolutely essential innovation?
  3. Do the owners of inherited Stately Homes and Landed Estates deserve their Life of Luxury?
  4. Is all wealth actually created by the Captains of Industry?
  5. And is all War absolutely necessary to defeat those who wish to take the privileges of others for themselves?


The Poor can lay claim to none of these things - except, of course, that they create absolutely all new value by their labour!

How else is wealth really produced?

Yet, we are supposed to live in a Democracy!

Why don't the majority just vote in a Socialist Government to redistribute wealth more fairly?

The answer is, once more, they don't have the wealth to do it!

Socialists don't have the mass media in their pockets - no daily newspapers to state their case. They don't have round-the-clock radio and television programmes on their side, nor the mainstream film industry, nor any glossy magazines.

They don't even have the resources to build their organisations, and fund their organisers, their researchers, their publicists, or their journalists and writers.

They can't pay for roadside adverts, or those every few minutes on TV!

They can't buy the votes of elected representatives, by Knighthoods, or the like, or even just a regular ticket to The Good Life.

But the Rich do have enough to do all of these things... as much as they like! Could that be how they always stay in the driving seat?

Now, the Labour Party has a committed socialist as its leader. If we had a General Election now, he would win!

What will the rich do?

What did they do in 1945?

You know, don't you?

So, how could they be stopped THIS TIME?