30 June, 2015

A New "Constructivist" Experiment

Long Exposure of double pendulum with LED at end of each pendulum

Here is a suggestion for a Couder-like constructivist experiment, intended to achieve similarly stable effects as results. The key elements of such experiments must lie in the tuneable interactions of mutually affecting oscillations, underpinned by a constantly applied vibration, which not only interacts significantly with them, but is also the crucial energy supply that keeps the resultant system going.

It is hoped that a final, overall rotation will again deliver the required coup de grâce – Quantization!

To deliver the appropriate conditions, let us choose a Compound Pendulum as our starting point, for even without any of the intended extra additions, these produce fascinating and complex behaviours. So, if it is successively modified and appropriately “tuned”, it should take us through a range of stabilities, and even to a final extra example of quantization – entirely as a result of oscillations and rotations alone.
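Even before any of the proposed additions, the "fascinating and complex behaviours" of the basic double pendulum can be seen numerically. The sketch below is not the proposed apparatus: it is a minimal, undriven, frictionless double pendulum integrated with classical Runge-Kutta, and all masses, lengths and the starting pose are invented for illustration.

```python
import math

G = 9.81
M1 = M2 = 1.0   # bob masses (kg) - assumed values
L1 = L2 = 1.0   # rod lengths (m) - assumed values

def derivs(state):
    """Standard equations of motion for an ideal, frictionless double pendulum."""
    t1, w1, t2, w2 = state
    d = t1 - t2
    den = 2 * M1 + M2 - M2 * math.cos(2 * d)
    a1 = (-G * (2 * M1 + M2) * math.sin(t1)
          - M2 * G * math.sin(t1 - 2 * t2)
          - 2 * math.sin(d) * M2 * (w2**2 * L2 + w1**2 * L1 * math.cos(d))
          ) / (L1 * den)
    a2 = (2 * math.sin(d) * (w1**2 * L1 * (M1 + M2)
                             + G * (M1 + M2) * math.cos(t1)
                             + w2**2 * L2 * M2 * math.cos(d))
          ) / (L2 * den)
    return (w1, a1, w2, a2)

def rk4_step(state, dt):
    """One classical fourth-order Runge-Kutta step."""
    def shift(s, k, h):
        return tuple(si + h * ki for si, ki in zip(s, k))
    k1 = derivs(state)
    k2 = derivs(shift(state, k1, dt / 2))
    k3 = derivs(shift(state, k2, dt / 2))
    k4 = derivs(shift(state, k3, dt))
    return tuple(s + dt / 6 * (a + 2 * b + 2 * c + e)
                 for s, a, b, c, e in zip(state, k1, k2, k3, k4))

def energy(state):
    """Total mechanical energy - constant for the ideal (undriven) system."""
    t1, w1, t2, w2 = state
    ke = (0.5 * M1 * (L1 * w1)**2
          + 0.5 * M2 * ((L1 * w1)**2 + (L2 * w2)**2
                        + 2 * L1 * L2 * w1 * w2 * math.cos(t1 - t2)))
    pe = -(M1 + M2) * G * L1 * math.cos(t1) - M2 * G * L2 * math.cos(t2)
    return ke + pe

state = (math.pi / 2, 0.0, math.pi / 2, 0.0)  # both arms horizontal, at rest
e0 = energy(state)
for _ in range(5000):                          # 5 s of motion at dt = 1 ms
    state = rk4_step(state, 0.001)
drift = abs(energy(state) - e0)                # should stay near zero
```

The chaotic trajectory is extremely sensitive to the starting pose, which is exactly why the proposed experiment needs careful tuning before any stable regime appears; the driving vibration and motorised rotation of the diagram would enter as extra terms in `derivs`.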




But the usual form of a compound pendulum will require several additions in order to tune it into the possible stable forms that we are seeking. The diagram below shows the necessary construction.





This new form would commence with changes to the structure of the pendulum and to the overall vertical vibration, to attempt a “Walker-like” stability. Once this has been achieved, we could add a horizontal rotation via the included motor, and see the effects achieved at different rates.

If the discoveries of the Couder “Walker” experiments are applicable generally in appropriate circumstances, when driven by an overall vibration as a key component that is also a continuous source of energy, we should expect to get both the establishment of a stable system, and, even more important, the clearly quantized results of the various added horizontal rotations.

Who fancies having a go at this task?

Greece: On the threshold of revolution?



Significant things are happening in Greece. Could the country that gave the World Democracy, now establish a European Socialist State?

The 2008 Slump was a clear indicator of the bankruptcy of Capitalism. Even its usual method of exporting all its problems to the Third World isn’t working as it used to. Now, we will have a return to Nation-State rivalries and even war in Europe.

The "West versus Russia" confrontation is between Capitalist States. No easy Communist targets remain!

So, once more, the Working Class is the major target in attempts at some sort of re-establishment of the Old Regime. They will be made to pay!

Greece proves it!

The pensioners and the youth are increasingly being made to pay by enforced poverty. The first break in the Phalanx of Pro-capitalist parties, including those saying they are socialist, is NOW occurring in that great Cradle of Civilisation.

Victory to the People of Greece!

Kick out Capitalism for the only rule for and by The People: S O C I A L I S M!

28 June, 2015

Persisting Resonances



Let us muse about the natural oscillations of a fixed set up.

The simplest is that of a fixed, taut string anchored at both ends, for such a string has a whole quantized series of possible natural oscillations. If a disturbance (such as a pluck) were applied near the centre of the string, it would oscillate with the so-called fundamental frequency of that particular length, weight and tension of string. You don’t have to apply a disturbance at any particular frequency – anything will do, as it is the addition of energy that sets it vibrating at its natural frequency.

In other words, the fundamental frequency is a property of that precise set up only. But if the agitation of the string were applied at precisely one quarter of its length, it would then vibrate at twice the fundamental frequency. And similar agitations, at particular precise points along the length of the string, will produce other so-called harmonics – frequencies simply related to the fundamental. But such a device is not rigidly controlled, so variations in string tension will change the set of related frequencies that are produced. Nevertheless, we have a very simple construction, which naturally possesses Quantized Frequencies.
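The quantized set described above can be written down directly: for an ideal string fixed at both ends, the n-th natural frequency is f_n = (n / 2L) √(T/μ). The length, tension and linear density in this sketch are invented purely for illustration.

```python
import math

def harmonic(n, length_m=0.65, tension_n=60.0, mu_kg_per_m=0.001):
    """Frequency (Hz) of the n-th natural mode of an ideal string fixed at both ends."""
    return (n / (2 * length_m)) * math.sqrt(tension_n / mu_kg_per_m)

fundamental = harmonic(1)
harmonics = [harmonic(n) for n in range(1, 5)]
# The whole set is "quantized": every member is an integer multiple
# of the fundamental, fixed by length, tension and weight alone.
ratios = [f / fundamental for f in harmonics]
```

Changing the tension shifts every frequency in the set together, which is exactly the "not rigidly controlled" variation the text notes.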

NOTE: this implies that elsewhere in Nature, where such quantized frequencies occur, they might also have very similar causes, as we will see!

Now, if our string is totally still, but its exact fundamental frequency is made to occur elsewhere, yet is clearly audible at our string, then it too would begin to vibrate, in resonance with the separately situated source.

The intervening substrate (the air) will itself have been set into that same oscillation, and will have communicated both the original’s frequency and its energy to our originally quiescent set up, setting it too into motion by resonance. In addition, many initiators may well be mixes of many frequencies, yet even these can set our string into resonance, though only at one or another of its harmonic set.

The footfalls of a marching squad of soldiers crossing a bridge will not be in perfect unison, but could still be enough to set the bridge in resonance, and even shake it to destruction.
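The bridge example can be made quantitative with the textbook driven, damped oscillator: its steady-state amplitude peaks sharply when the driving frequency sits near the natural frequency, which is how a loose, imperfect rhythm can still pump energy in. All the numbers below are invented for the sketch.

```python
import math

def steady_amplitude(drive_w, natural_w=2.0, damping=0.05, force_per_mass=1.0):
    """Steady-state amplitude of x'' + damping*x' + natural_w**2 * x = F*cos(drive_w*t)."""
    return force_per_mass / math.sqrt(
        (natural_w**2 - drive_w**2)**2 + (damping * drive_w)**2)

at_resonance = steady_amplitude(2.0)   # driving at the natural frequency
well_below = steady_amplitude(1.0)     # driving at half of it
well_above = steady_amplitude(4.0)     # driving at twice it
```

With light damping the response at resonance is tens of times larger than off resonance, so even a crudely matched driver transfers energy very effectively.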




The key thing is the transference of energy.

Now, various questions are posed.

The main one is concerned with the fact that, to set our string in motion at its fundamental frequency, it doesn’t have to be initiated by a direct input of that same frequency. It can be almost any mix of frequencies, as long as some related one is present. So, we can conceive of a non-specific energy input from without.

This poses questions about persistence of oscillation.

In Yves Couder’s famous Walker set up, he had a constant input of energy, at a particular frequency of the whole set up, which was externally driven, and this was certainly what kept the formed Walkers as persisting entities.

Now, considering the processes in Couder’s work, using examples involving vibrations of strings, may well mislead, as further work seems to indicate that it isn’t fixed frequencies that are the key, BUT related frequencies in all the participating vibrations. Indeed, it seems likely that the same could be achieved with a different set of frequencies, as long as they are appropriately related via careful tuning.

Of course, Couder’s Experiment was different, in that a second and different oscillation was set in motion by the applied persistent oscillation – a falling drop of the same oil onto the main oil bath. That drop was sent back up, thereafter delivering regular kicks via its persistent bouncing, and finally the surface of the oil in the bath also had a surface oscillation, which, because of the bouncing drop, was turned into an interacting standing wave.

Clearly, there is no reason why basically similar stable arrangements could not be set up in other appropriate media.

Our task is to relate these findings to the phenomena within the atom, by including consideration of an undetected universal substrate, even within the atom itself.




The Atom and the Substrate series is being published on Shape Journal. The first issue The Substrate is available now. Part 2 on The Atom will be published in the next week or two.
 

25 June, 2015

The Crucial Missing Ingredient?




When considering, more generally, the significance of the creation of Yves Couder’s Walkers in his famous experiments, the role of the substrate was crucial, because that was the only material thing involved. So, the question of a universal substrate seems, once more, to be back on the agenda.

Let us see why!

The remarkable thing about Couder’s experiments is that they needed only this substrate, plus various oscillations to actually create a stable entity, which, if that substrate were both universal and undetectable, would seem to have arisen entirely out of energy alone.

So, quite apart from the specifics of Couder’s experiments, these discoveries seem to pose much more general questions.

Instead of inert matter being set into various phenomena by disembodied laws, and energy alone, we are confronted with the possibility of there also being a substrate, with the capabilities revealed in Couder’s important experiments. Indeed, if such a universal substrate were composed of known material entities, but constructed in such a way as to be undetectable, the whole assumed basis for all phenomena is totally transformed, indeed, returning things back to the old positions based upon the assumption of The Ether.

And, that is not all!

Just by adding an overall rotation of the right kind, Couder managed to impose quantized orbits upon his Walkers. A substrate within the Atom then becomes not only a possibility, but may also deliver a physical explanation for quantized electron orbits, no longer down to the Copenhagen Interpretation of Quantum Theory. Clearly, a full explanation of Couder’s macro experiments could turn out to revolutionise Sub Atomic Physics too.

So, what exactly happened in these experiments and “Why?”

The primary question of the Couder experiments is, “What causes the creation of a localised stable structure, seemingly entirely out of vibrations within a single medium?” For, usually, oscillations in such a medium naturally propagate right across it, and are not restricted to some self-made locality. How on earth can they be concentrated into a small locality within the medium, and one which then persists and acts as a stable entity, with properties of its own?

Posed as I have put it above, there seems to be no possible answer to this question. But, of course, it turns out that there are several oscillations involved, which affect one another to produce such an entity. The key addition is that of a falling drop of the very same substrate, which, in appropriately tuned circumstances, actually bounces back up from the surface. This drop, which then continues to bounce, defines a particular point where the drop and substrate interact, and this point becomes the centre of the ultimately produced Walker.

But, even this doesn’t fully describe what occurs. Clearly, resonances are involved between the bouncing drop and the constantly vertically vibrating tray of substrate. But, on its regular return, the bouncing drop gives a regular kick to the surface skin of the substrate, and this causes a standing wave in the surface, surrounding the bouncing drop. So, we also have recursion occurring too.
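The standing-wave step above can be checked numerically: two identical waves travelling in opposite directions sum to 2A·sin(kx)·cos(ωt), a pattern whose nodes never move. This sketch is a generic demonstration of that superposition, not a model of the oil surface itself; the amplitude, wavenumber and frequency are arbitrary choices.

```python
import math

A, K, W = 1.0, 2 * math.pi, 3.0   # amplitude, wavenumber, angular frequency

def superposed(x, t):
    """Sum of right- and left-travelling waves of equal amplitude and frequency."""
    return A * math.sin(K * x - W * t) + A * math.sin(K * x + W * t)

node = 0.5            # sin(K * 0.5) = sin(pi) = 0, so this point never moves
antinode = 0.25       # sin(K * 0.25) = sin(pi/2) = 1, maximum swing here
node_samples = [superposed(node, 0.1 * i) for i in range(50)]
anti_samples = [superposed(antinode, 0.1 * i) for i in range(50)]
```

The fixed node positions are what let the bouncing drop keep meeting the surface at the same phase, kick after kick.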

Now, we can conceive of a forced vibration in a pond, which would simply produce continuous outward moving waves. But in the Couder experiments, we have a nexus of resonant and recursive processes, which are self re-enforcing, and with energy derived from the constantly vibrating substrate, these become the famous Walker.

When you attempt to trace through causes and effects, we see that it isn’t a simple linear sequence, but related cycles, so that quite apart from resonances, we also have causes producing effects, which then recursively become causes.

This is the basis for the walkers!

But, Couder didn’t merely set up the various experiments and they immediately worked. On the contrary, he had to adjust the overall vibrations of the substrate, and the size and height of the released drop until it gelled into -

1. His bouncing drop
2. The vibrations of the whole substrate
3. A standing wave in the surface of that substrate
4. The combined stable entity called The Walker.

Clearly, even the nature of the substrate would be crucial in allowing what occurred: would it also work with water, for example? Clearly, the surface tension and even the viscosity would also have to be right. The physical explanation of such a localisation and balanced, self-maintaining stability is crucial here. For its occurrence, even in Couder’s carefully arranged cases, must also change the general way we think about things. They do not happen in Empty Space, but always and ever within a substrate.

Considering only a proton and an electron, as can exist within an atom, would now be under question. If such entities exist within and affect a universal substrate, which itself also affects the entities involved, we will have to rethink the features that we find there, and explain them very differently. For it is ONLY with such a substrate that the necessary feedback can take place to produce that entity’s evident stability.


Part I of The Atom and the Substrate is now available.

15 June, 2015

New Special Issue: The Atom & the Substrate I



This new issue of the journal is the first in a landmark series outlining an entirely new approach to Sub-Atomic Physics. This will consist of two substantial issues containing work of truly great significance.

Such fanfare is not hyperbole. In the second of these installments there will be a full refutation of the too long incumbent Copenhagen Interpretation of Quantum Theory, both in general, and as a study of the Atom. Finally this regressive interlude in Physics can be stopped and the science re-orientated upon a wholly new path. For not only is this a debunking of Quantum Theory, but of the flawed pluralist position that preceded it.

These two new issues will be published online in quick succession, starting here with The Substrate and followed by The Atom. The issues tackled within are fundamental both philosophically and physically, and present for the first time, a purely physical explanation for quantisation.

Read it here.


13 June, 2015

Big Blip: Seeking Answers in the Knacker’s Yard


The Rejuvenated LHC?

In yet another celebration of the Large Hadron Collider, on the occasion of its resuscitation after a two-year refit, New Scientist (3022) recently proffered an article by Elizabeth Landau entitled The Big Blip.

It is a muse about what may be discovered by the updated device, and the possible questions posed to the consensus Standard Model of “fundamental particles” in the Sub Atomic realm by the confidently expected “new evidence”.

A major concern seems to be about matter and antimatter – indeed concerning a purely theoretical concept of their differences and origins. In what has become the usual theoretical route, these were defined to match known evidence, but without any idea as to why they came to be.

And, using this prodigious machine, an experiment has been devised to reveal more evidence, and enable its encapsulation into yet more formalist relations - NOT, I must emphasize, as any sort of physical explanation, but merely as a set of formalisms which can be massaged (via the usual methods) to fit the revealed evidence.

It is important to consider such events and their purposes, knowing that explanation, in general, had long ago been abandoned as mere self-kid. (The significant turn was made at Solvay in 1927). And, thereafter, the only accepted, reliable encapsulations of what was found were a zoo of fundamental particles, and various formalisms about them, embodied in mathematical equations.

And, with such a “revolution”, the means for further study unavoidably narrowed into a series of ever more powerful Accelerators (Colliders), of which the LHC is the latest and most powerful version. In addition, the methodology was also similarly restricted solely to smashing up such entities to see what they might be made of.




Indeed, such a History has produced a unique realm of concepts and rules, which are worked out by very competent mathematicians on blackboards – effectively working exclusively in abstractions. It has produced a compelling realm, which is necessarily consistent in its own terms, and is situated a very long way from the believed basis of the preceding Classical Physics.

The main imperative has been to reveal the Origins of Reality itself solely in terms of these ultimate, discrete and abstract entities – to supposedly lay bare the true determining basis for absolutely everything.

It is, of course, a lost cause! And, the reason is that it has become no longer a study of Reality, but something very different indeed. It is a study of abstract concepts, elicited from certain phenomena, but most definitely NOT a comprehensive and meaningful extraction, but, on the contrary, a selection according to a set of formal criteria, leaving absolutely everything that doesn’t so conform behind. Consequently it necessarily becomes a study of an entirely Formal World with the narrowest of connections with Concrete Reality.

Indeed, if you are to subscribe to this approach, you will no longer be investigating Reality-as-is, but, instead, only addressing a selection of simplified and idealised extractions, which are usually ONLY obtainable from intensively-farmed Domains within Reality. So, clearly, what is then studied in current Sub Atomic Physics is a version of that very same Formal World – the realm of Pure Form alone, which is termed Ideality, and served by its methodology, which we call Mathematics!

Yet, in spite of truly gigantic amounts of funds, resources and competent mathematicians, all invested into experiments such as those on the LHC, it certainly cannot be said that the original source, Reality itself, has been clearly defined.

On the contrary, the only way that the theorists involved have coped is by the adoption of the Copenhagen Interpretation of Quantum Theory, due initially to Bohr and Heisenberg, which embraces a supposed Reality with Wave/Particle Duality at its very heart, and Formal Description, which is deemed to be more profound and even determining than any attempted Physical Explanation.

Indeed, a “New World” is said to exist down there, which cannot be dealt with in any other way, and most particularly not in the “old ways” applied universally before the quantum! And, this essentially different realm must henceforth be dealt with solely according to the bases, assumptions and methods of Copenhagen.

Of course, it is also true that many of the premises and principles of Classical Physics were, indeed, wholly inadequate in these deeper investigations of the nature of Reality. Yet, instead of solving the evident shortcomings of that established approach, the denizens of Copenhagen turned the other way, and “on principle” looked solely to Forms as the true essential drivers of absolutely everything.

But, clearly such things are ONLY abstractions from Reality, mostly carefully arranged-for, to enable reliable predictions in given contexts, and which are totally unable to explain properties and causes.

Indeed, without a moment’s doubt, equations are now seen as the actual “causes” of phenomena, rather than being merely formal descriptions of them. Clearly, Copenhagen and its whole consequent realm is entirely an idealist retreat!

In the next few weeks, two new Special Issues of the SHAPE Journal will be published, which explain Quantization in a very different, physical and materialist way. 






11 June, 2015

Gravity is a push force!



This diagram is from a forthcoming new series called The Atom and the Substrate, which postulates that a heterogeneous sea of particles permeates the entire universe, allowing for the propagation of light, magnetism and even gravity. 

The idea that a substrate may cause gravity through collisions of inert particles came from Newton, via Glenn Borchardt. Wedded to my hypothetical Neutritron paving, we start to get a new model of the ether that begins to explain a whole range of phenomena, and poses a serious threat to Quantum Physics as a whole!





26 May, 2015

Capitalism’s Major Flaw


Profit!

The article in New Scientist (3022) entitled "Capitalism’s Hidden Web of Power" questioned the current analysis of the 2008 slump. Yet, the slump itself was more or less taken for granted, and the only important questions to be addressed were deemed to be about how to police irresponsible companies, so that such things could be “nipped in the bud”, and retrieved before too much damage was done. The idea was that, with sufficient information, an impending crisis could be avoided. The recession would not be so deep, and the recovery would be much swifter!

These correcting analysts insisted that the criteria for assessing risk were inadequate. Various institutions and involved researchers now vied to add the obviously as yet excluded component, which they termed “Complexity”. [What was meant was not the many other meanings of this word, but merely how complicated many companies had become]. But, of course, they were wrong too!

Complexity as it was now being revealed is there on purpose - to HIDE things!

But, it isn’t the methods used in organisation that are the unintended reasons for this problem, but exactly what they are always trying to hide!

The reason for these desperate swoops is the nature of Capitalism itself! And, of course, this has to be hidden at all costs. Tell me, are these regulators going to rearrange things so that the real causes are plainly evident? Of course they aren’t.

No one really looks closely at how Capitalism works. It is treated as a natural given, and all efforts are concentrated upon re-organising the deckchairs on the sinking Titanic!




So, why do slumps occur? For that they certainly do! How can a system sustain itself in the face of the most unavoidable chronic crisis?

If Capitalism, as is claimed, delivers, of itself alone, “a better life for all”, why is it so frail, and so regularly (and catastrophically) compromised? Can you really blame it all on Complexity? Of course not!

It is much more likely that it is inherently and fatally flawed, and simply cannot avoid these major calamities. Is it not inevitable, given how Capitalism actually works? Whilever the so-called experts are studying Complexity, no one is studying modern Capitalism as an economic system. And no one is addressing its regular crises and inevitable, ultimate collapse.

What is the guiding Principle of Capitalism? It is the acquiring of PROFIT! And, what precisely is that? Is it like wages for work done? NO, it certainly isn’t! It involves having wealth, and investing some of it to get a nice regular addition to it! It is “Much getting More” without really doing anything for it.

“But”, I hear the cry, “we are risking our wealth!”

No, they aren’t! The Stock Exchange guarantees that. Only the uninformed small investors will be wiped out! The big boys never get ruined. They even make money out of slumps. NOTE: In 2008 a famous British Capitalist was seen in Iceland, as its economy was collapsing, buying up whatever he could get for a song! How do you think he did out of his hurried trip?

PROFIT is an added overhead above all real costs and payments, to pay both owners and investors an unearned bonus. And, indeed, some of these playing the Stock Exchange don’t even care what their investments finance. For as soon as a profit is to be made they SELL!


  

Now, you may well ask, “How on earth does it ever work (between the slumps of course)?”

It is because there is always the promise of unearned profit literally forever.

That keeps moneyed people investing, and others setting up companies. But, the values generated by such activities are never real values. So-called confidence inflates expectations, and hence Market Values, always above Reality. Yes, always!

Indeed, within Capitalism, inflation is not only inevitable, it is actually essential, for it helps the company owners in two different ways. First, it decreases the current value of the wages they pay their workers, and, second, it also decreases the real value of any capital loans that they must pay back. [Just imagine what a mess they would be in when Deflation is in charge – the value of borrowed loans increases, and the value of their workers’ wages increases too]
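The bracketed claim is simple arithmetic, and a toy example makes it concrete: a fixed nominal sum (a loan to repay, or a wage bill) shrinks in real terms under inflation and grows under deflation. The 5% rates and the amounts here are invented for illustration.

```python
def real_value(nominal, annual_price_change, years):
    """Purchasing-power value of a fixed nominal sum after a number of years."""
    return nominal / ((1 + annual_price_change) ** years)

loan = 1_000_000.0
under_inflation = real_value(loan, 0.05, 5)    # prices rising 5% a year
under_deflation = real_value(loan, -0.05, 5)   # prices falling 5% a year
# Inflation: the owner repays roughly 783,500 in today's purchasing power.
# Deflation: the same nominal debt costs roughly 1,292,000 in real terms.
```

The same asymmetry applies to a fixed nominal wage bill, which is why the text argues owners benefit twice over from steady inflation.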

Capitalism is a system for owners, and is built upon Credit in its every corner.

And, of course, the mismatch between Reality and inflated values occasionally gets revealed, and the whole edifice begins to crumble.

And who then is made to pay?

Who is still paying for 2008 today?

17 May, 2015

Vortices





Diagrams taken from a forthcoming issue - a new theory of the atom.






10 May, 2015

Why Do Models Work?

“The Cognitive Art of Feynman Diagrams” by
Edward Tufte - installation at the Fermilab art gallery

Why Analogistic Models Contain Significant Content

Without a doubt, the actual content of analogistic models has to be the key to their relevance in scientific explanation. For, though they are clearly never fully accurate descriptions, and are certainly always less than totally accurate explanations of the phenomena they are used to tackle, they are also never totally arbitrary inventions: they must contain something objective about the modelled situation.

Let us attempt to reveal what their contained objectivity can be.

Now, though we can, and do, insist that they are analogues for what we are investigating, they are not, and could never be, 100% accurate – containing a full set of one-to-one mappings. Rather, they are intrinsically similar situations, and they, therefore, reflect the common sequences, and kinds of entities and properties, found throughout Reality quite naturally.

Perhaps surprisingly though, even artificial analogistic models can also be profitably constructed to aid in the understanding of newly addressed phenomena, as long as the constituents involved are taken from concrete evidence of possible components occurring elsewhere in concrete Reality. The method then is to involve such initially unrelated elements into a model, expressly to deliver the actually noticed properties of the thing that you are attempting to explain. Indeed, even more surprisingly, it is often these kinds of analogistic models that deliver the most profound insights, and can also demolish false assumptions dramatically. I will definitely include the mention of such a model later in this paper.

So, let us start by looking at a series of valuable examples of various kinds of analogistic models. James Clerk Maxwell’s famous model of the Ether (that was then presumed to fill all of the Empty Space in the Universe) was just such an informed and creative construct. He knew about many kinds of phenomena, which he had to explain, and the usual simple (and magical) Ether was just too vague to explain anything adequately for him. So, knowing what he wanted to produce from his model, he brought together (without any evidence) the sorts of constituents that might, if appropriately organised, deliver what he knew was necessary. He adventurously constructed “vortices” and “electrical particles” into an analogistic model, and from this he managed to deliver his famous equations of electromagnetic radiation.

His model did not by any means reveal the actual form of the Ether, and his constructs didn’t exist as such, but his model delivered a great deal more than any of its predecessors, and even more than he designed it to deliver. His resultant Equations were revolutionary. Now, before we explore why such “fictitious” models worked, let us look at some others. Einstein’s Space-Time continuum was also an analogistic model. Once again, no one could prove such a thing actually existed, but it did deliver what Einstein knew were properties that needed explanation. His famous Theory of Relativity was based upon this model, and many things that came out of his constructs, in addition to what he consciously put into it, have since been confirmed in Reality.

Even Niels Bohr’s original model of the structure of the atom with a central positively charged nucleus, surrounded by orbiting electrons in an entity which was mostly empty space, was taken from the Planet-moon systems observed in our Solar System. It was not a true description of it, but yet another analogistic model.

Once again, it delivered far more than the models that it replaced, and that was again because it contained more real features within its conceived-of forms.

Even later, when confronted with a confusing maze of “fundamental particles”, Richard Feynman devised his famous Feynman Diagrams – they were, of course, the most abstract of analogistic models, and delivered what no other models could, namely what was called Quantum Electro Dynamics (QED) – the most accurate and useable way of dealing with this amazing Particle Zoo.

And there is, perhaps, the most audacious version of an analogistic model, produced by Yves Couder in his attempt to find a new way of revealing the secrets of the sub atomic world, by modelling it in the Macro World out of unbelievable components. He revolutionised experimental physics by devising and constructing a model entirely out of silicone liquid and various vibrations, resonances and crucial recursions. He managed to create his famous “Walkers” entirely from the above, each a kind of self-maintaining entity with properties closely comparable to those within the atom.

Finally, the author of this paper, confronted by the anomalies of the famed Double Slit Experiments, decided to devise an undetectable Paving of Empty Space composed of undetectable particles – in fact, mutually orbiting pairs, each consisting of one electron and one positron, which, because of their opposite matter types and electrostatic charges, became undetectable in this joint form. Yet this paving actually fully explained the anomalies of the Double Slit Experiments, without any recourse to the flights of fancy delivered by the Copenhagen Interpretation of Quantum Theory, when that is used as the sole dependable source for dealing with all sub atomic phenomena.

All the anomalies fell away! Nothing of idealist philosophy was needed to make sense of what occurred; the new materialistic, analogistic model of Empty Space did it without difficulty. [It was both as analogistic, and as artificial, as Maxwell’s model of the very same thing].

Needless to say, a barrage of criticism followed, either from the mechanical materialists of the old school, or from the idealists of the new school, with, as a banker, the fact that no such Paving had been physically detected! But, of course, that isn’t the point, is it? What is important has to be whether this analogistic model explained a great deal more than anything else could. Now, how can we explain these relative successes, clearly based upon non-existing constructs?

Their value is that they are determined by the features in Reality to be explained. Initially, at least, this can only be achieved by organising what is generally known, using real features from elsewhere, into a purposely constructed amalgam, which delivers what is required. Such a model would never be the Absolute Truth, but it can be both intelligently and intelligibly constructed to contain more Objective Content – elements, parts or aspects of the Truth – than what it replaces. And, in doing so, it makes the actual real phenomenon more understandable; and, by crucially revealing things that were absent previously, it makes further developments more likely, if only by establishing a whole new kind of model, which gives us a great deal more to consider, with some aspects real, and others intelligent placeholders for what has yet to be revealed.

But why should these analogies even be available? Why should such similar (though possibly also profoundly different) resonances occur in such different contexts? The answers must be contained in what it is that is similar in all phenomena, and hence possible everywhere in one way or another.

We really have to address the question, “What is common throughout all possible phenomena that always guarantees that such analogies will certainly exist?” It must be that they are all – every single one of them – always produced as the result of many different, simultaneous factors, which will always come together into overall situations of Stability (if only temporary). For the possible results of such complexities arise when the factors present are NOT separable, eternal laws, but, on the contrary, mutually interacting and modifying arrangements, which will finally settle into a self-maintaining overall stability.

Clearly, features will become present which are a result of this higher level of stability, and hence of how such mutually modifying factors arrive at such a state. Such properties will be true, at least at the abstract level, of all such systems. Indeed, when you think about it, it is likely that all phenomena are such! The search for fundamental particles and their basic eternal laws is therefore a pluralist myth. No matter which “law” you choose to address, it is certain to be the stability reached by multiple factors at an even lower level! The aims of pluralist Sub Atomic Physics are impossible to achieve with the assumptions and principles that underlie the entire area of study.

The Principle of Reductionism is clearly based entirely upon Plurality, and hence assumes that Analysis will always be possible, all the way down to its targeted Fundamental Particles. These suggested analogistic commonalities seem to indicate, instead, that very different relations could be expected to reach such stabilities in very similar ways. Such things as spins and orbits are likely to occur at all levels, as are vibrations, resonances and recursions. It is very likely that this is what we are settling upon with our analogistic models: not ultimately Absolute Truths, but commonly occurring natural resonances, which we term Common Objective Contents.

This article has been taken from the new special issue of the Shape journal Analogistic Models III


New Special Issue: Analogistic Models III



The last installment of our Analogistic Models series of issues.
 

06 May, 2015

Singularities Suck!



What is a Singularity?

Let us take Zeno’s Achilles and the Tortoise Paradox to investigate. Achilles and the Tortoise are to have a race. Achilles gives the very slow Tortoise a head start, and lets it move out in front, while Achilles waits, confident of his vastly superior speed. Finally, he sets off after the Tortoise. But, by the time he reaches the place where the Tortoise used to be, some time will have elapsed, so the Tortoise will no longer be there: it will have moved on. Achilles again chases the Tortoise, but when he reaches the place where the Tortoise used to be, some time will again have elapsed, and the Tortoise will have moved on. Repeating this line of reasoning, it is clear that it is an infinitely repeating cycle. Using this algorithm, Achilles will traverse an infinite number of iterations without ever reaching the Tortoise!

Now, the reasoning is flawless, but it isn’t real! It really does produce an infinite, never-ending process.
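In formal terms (this is the standard mathematical reconstruction, not given in the original argument), the infinitely many steps of the algorithm nonetheless occupy only a finite total time. With Achilles’ speed $v_A$, the Tortoise’s speed $v_T < v_A$, and a head start $d$, each step takes $v_T/v_A$ times as long as the one before, so:

```latex
t_{\text{total}} \;=\; \frac{d}{v_A}\sum_{n=0}^{\infty}\left(\frac{v_T}{v_A}\right)^{n}
\;=\; \frac{d}{v_A}\cdot\frac{1}{1 - v_T/v_A}
\;=\; \frac{d}{v_A - v_T}
```

So the never-ending process belongs to the algorithm, not to the race: the real Achilles draws level at the finite time $d/(v_A - v_T)$.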

This is a Singularity!

It is what we call an “ill-formed algorithm”. And such algorithms are very frequently used, because they do, for a time, take us ever closer to our sought-for solution. But they never end, and we have to include a get-out-clause to terminate the infinite process.

An example in our race algorithm above would be something like: “when the gap between the runners drops below one inch (say), terminate the process immediately”. The algorithm is one example of how we simplify and idealise situations in order to solve on-going problems. It has long been our primary method, and still is to this day.
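The ill-formed algorithm and its get-out-clause can be sketched in a few lines of code. The following is purely my own illustration: the function name `chase` and the chosen speeds, head start and cutoff are all hypothetical, but the loop is exactly Zeno’s scheme, terminated by the suggested one-inch clause.

```python
def chase(achilles_speed, tortoise_speed, head_start, cutoff):
    """Run Zeno's scheme until the gap drops below `cutoff`.

    Assumes achilles_speed > tortoise_speed > 0 and head_start > 0.
    Returns (iterations, remaining_gap). Without the cutoff, the
    while-loop would never terminate: the gap shrinks but never
    reaches zero -- that is the Singularity in the algorithm.
    """
    gap = head_start
    steps = 0
    while gap >= cutoff:                 # the get-out-clause
        t = gap / achilles_speed         # time to reach where the Tortoise was
        gap = tortoise_speed * t         # meanwhile the Tortoise has moved on
        steps += 1
    return steps, gap

# Hypothetical run: speeds 10 and 1 (distance units per second),
# a head start of 100, and a one-inch cutoff expressed in feet.
steps, gap = chase(10.0, 1.0, 100.0, 1.0 / 12)
print(steps, gap)   # terminates after a handful of iterations
```

Note that each pass simply multiplies the gap by the speed ratio, so the termination clause is guaranteed to fire whenever Achilles is the faster runner.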

For such forms can press you ever closer to the solution you seek. By terminating at some finite stage in the cycling process, you can be left with a relatively accurate set of results. Indeed, there are many real-world situations which have to be addressed in such ways, as there are no others: all weather forecasting programs, for example, are entirely of this type. The whole method we call Simulation is similarly full of these kinds of methods and terminators.

But, there is even more to it than even that. Some extracted relations, usually encapsulated into equations termed Natural Laws (because they look eternal), also fail in the same way when extrapolated beyond their required conditions of applicability. Indeed, the whole discipline of Mathematics is full of Sinks and Explosions, where they were given the name – Singularities! Get-out-clauses abound!

Now, it should be crystal clear that to transfer such Singularities into reality is obviously a major error.

They do not exist in Reality.

Even the most famous Singularity of all – the Black Hole – must never be taken as what it seems to be, for it arises from the extrapolation of a relation beyond its range of applicability. In all such cases, the old relation reveals its inadequacies: it is no longer even approximately true; it is now just WRONG!

If by any chance you have abandoned physical explanations, and put your entire trust in Formal Equations, you are subsequently in very serious trouble. And, that is the Crisis in Modern Physics: for that is precisely what they have done!

So, how on Earth do they cope? With NO explanatory science to use in such situations they are forced to do TWO things. First, they change the way they use their Formalisms to deal in Probabilities. And, second, they resort to unsubstantiated Speculation to fill the (vast) gaps.

And, the dead ends proliferate all the time. Innumerable significant qualitative changes cannot be handled by their extrapolated formulae, and add to the number of unsolved (and by their methods unsolvable) problems of crucial importance, which are given over to unfounded speculations. Whether it is the Origin of the Universe or any wholly new development in reality. Whether it is the Origin of Life or the End of the Universe. None can be tackled because of this catastrophic retreat!

Don’t you believe me?

What about String Theory, Quantum Loop Gravity, Multiple Universes, Branes and the rest? You may well ask, “Why are they locked into this evidently wrong standpoint and methodology? Can’t they just change over to a sounder basis?”

Believe it or not, the answer is “NO”.

Yet, effective alternatives have been around for 2,500 years (and I am not talking of Ancient Greece). The main alternative to the usual Principle of Plurality (the cornerstone of western thought) was delivered then by The Buddha, who based himself upon the alternative Principle of Holism. Yet, in spite of others following that line, including the great German philosopher Friedrich Hegel, everyone else chased the clearly pragmatically effective Principle of Plurality. Over hundreds of years, innumerable disciplines have been constructed and extended via such means, and literally NONE of them will ever consider abandoning it.

There are those who have embarked upon this better way, but the establishment of Holist Science is still in its infancy.

Meanwhile the fictitious Singularity rules O.K.

01 May, 2015

The "Purest" Drivel!


You Cannot Make a Silk Purse out of a Pig’s Ear!

The article entitled “Quantum Purity” in New Scientist (3016) delivers no revolution!

Indeed, it is a perfect example of “re-arranging the deck-chairs on the sinking Titanic”. It offers NO profound additions to a majorly flawed consensus Theory about the quantum, and certainly presents no kind of alternative.

For the very real problems inherent in modern Sub Atomic Physics are not even recognised as existing, never mind any solutions to them being presented. The “New, Pure Theory” is not even scientific: it explains absolutely nothing, and, at best, hints at incomplete treatment of particular key areas, without delivering any solutions.

It does recognise that something is crucially mishandled by the usual “scientific” paradigms, but then proceeds to wrap them up in the same sort of idealist presentations as before! There may be something useful in recognising their two categories, but standing where they do, these theorists are clearly incapable of seeing what is actually causing these two views, and can only offer “insufficient information” as their diagnosis of the problem.

The question is, in a World pre-ordered and limited by a particular mistaken stance, can they ever break out?

The answer, as this article clearly demonstrates, has to be “NO!”

And, to clearly describe that stance, we have to go back long before Copenhagen and even Relativity redirected Physics, and be clear on exactly what was, and still is, being studied in Science, and why that perspective leads them astray.

For, it definitely isn’t Reality-as-is that is addressed - that would be much too messy and unintelligible.

What Mankind found that they could do, however, was carefully choose an area to study, then isolate it from everything else, and begin to selectively modify it – removing confusing factors, and controlling others, until, in an extremely well-farmed situation, some quantitative relation would be clearly displayed, and could, therefore, be extracted.
This is what scientists strive to achieve, and generally succeed in doing it.

But, it most certainly isn’t unfettered Reality, but, on the contrary, it is always an extremely well-farmed and different situation, with ITS consequently evident relation.

Now, let me be clear: all of this is eminently reasonable, especially if your determining objective is to USE that relation in some purposive way!

BUT, it is what is, thereafter, assumed about Reality that is the real and misleading problem.
Is the obtained relation an eternal Truth about Reality: always and everywhere: is it a Natural Law?

The answer has to be, “NO!”

It is a “law” of that precise and purposely limited context from which it was extracted. And the unopposable proof of this is revealed when you actually USE that “law”. For, you have to re-construct the exact same Domain that was created to deliver it. Otherwise, it just doesn’t work!

All of Technology proves this in every one of its applications. And, to get those vital conditions correct is what Engineers do: it is their valuable skill, especially when it comes to commercial production. So clearly, even the “technological” route is an entirely reasonable thing to do.

But, that is NOT the sole aim of the scientist.

Indeed, scientists have to conceive of how to discover it, then carry it out, and finally EXPLAIN why it is so. But our usual methodological tail wags the understanding dog! And it leads us to assume that the “extracted law” is actually present all over Reality, totally unchanged from the state in which we extracted it. Everywhere in unfettered Reality this law is considered to be the same: it is an eternal Natural Law. No, it isn’t!

And, as we extract further such “laws” (each with its own appropriately farmed Domain), we insist that it is precisely these “laws” that, together, in Reality-as-is, make all the phenomena that you see happen.

And there is more: to establish this assumption, there is a Principle assumed to be universally true – that of Plurality – which sees such “laws” as the real motive forces, and holds that they, each and every one, are intrinsically separate from one another. These “laws” don’t change, and any situation in Reality is then a mere SUM of such “laws” in various proportions: they remain exactly as we unearthed them in our experiments.

This is so fundamental that the whole discipline has to be necessarily re-labelled as Pluralist Science. And, that isn’t Science!

Now, that would be bad enough, but it isn’t all that is wrong.

For, the whole set of methods employed in Pluralist Science, have as their main objective, the extracting of these relations, via quantitative measurements, and, with these in their hands, they then arrive at an important process – they both simplify and idealise the data set, by fitting it to a pure, formal, mathematical relation. All deviations from this processed and tailored version are dismissed as irrelevant noise, due to non-essential factors.

So, to the error of Plurality, we also have to add that of Transforming Abstraction, which makes the data set into something else.

And, indeed, the formalised “law” actually becomes the “cause”(?)

Now this is clearly fundamental!

Though scientists historically always felt obligated to accompany their pluralist quantitative forms with Physical Explanations, these became more difficult to construct out of the increasingly opaque equations that were dredged from Mathematics to match JUST the extracted data sets, so the requirement for explanation was increasingly dropped.

Instead of the previously dualist philosophical stance, involving idealistic equations tempered by materialist explanations, this slipped into a wholly idealist stance, where it was the Equations that were the real determinants.

Now, it is this wholesale retreat that underpins the post-Copenhagen offerings, and without the necessary explanations, these were more and more evidently dead ends in the real understanding of Reality.

Indeed, even the new discoveries of these researchers, which contain real Objective Content, were emasculated by their subsequent theoretical treatments. They were on the threshold of realising the existence of holistic, mutually-affecting situations, which couldn’t be analysed in the old pluralist ways, but once more resorted to their usual solution – treating them statistically, without finding out exactly why they occurred. Indeed, Randomness has become the last refuge of Pluralist Science.


Postscript:

Now, clearly, this criticism needs further definition and amplification, and that has been undertaken in a whole series of papers and even whole Special Issues of the SHAPE Journal, which are already published on the Internet, or are already written and scheduled to be added to these in the near future. But here, we can briefly preface that work by answering the question:-

Where is the Door to Reality?

So having debunked the consensus in present day Sub Atomic Physics, as it has developed since the adoption of The Copenhagen Interpretation of Quantum Theory at Solvay in 1927, it is clearly necessary to indicate a superior alternative.

And, that is already underway and is beginning to deliver profound results.

For, in spite of Plurality and Formalism as the two evident impositions foisted upon Reality-as-is, there is also another purely physical limitation that prohibits an alternative being applied.

It is the assumption that Space is essentially Totally Empty – and this is not only out there in the Cosmos, but also everywhere else, down to the micro level, even between the supposed bottommost particles of Matter. But that isn’t true either!

Indeed, this theorist (Jim Schofield) decided to tackle the famed Double Slit Experiments that are widely regarded as the most important proofs of the legitimacy of the Copenhagen stance. And, his first assumption was that this “supposed Empty Space” is actually filled with a Universal Substrate.

Of course, it had to be undetectable by the usual means, so he also defined a substrate of particles, composed of already known and stable sub-particles, which could deliver this substrate, as well as things like the propagation of Electromagnetic radiation. With his defined substrate, he was able to explain ALL the anomalies of the Double Slit Experiments, without any recourse to Copenhagen at all!

The next major error, once the Ether could not be detected, had been to abandon the concept of a substrate altogether. Now, if this achievement with the Double Slit was all there was, there would be an argument to dismiss it, as it just doesn’t explain everything. But an Indian scientist, Mohan Tambe, has put forward a different particle to make up a substrate, which seemingly effectively tackles Electrical and Magnetic Fields, and the American scientist Glenn Borchardt has described yet another, which explains Gravity. Work is, therefore, already underway to conceive of so-called Empty Space as actually containing a variety of undetectable particles, which between them (or even in combinations) can deliver what Copenhagen certainly cannot!

Physics and Philosophy


The closing of Middlesex University Philosophy Department


Are both disciplines really about the Nature of Reality?

What do Philosophy Departments in our Universities do?

I am a physicist, and in my experience, Physics Departments (worth their salt, anyway) do Physics! They do experiments.

So, what happens in Philosophy? Is it like the Art Department in my own (undergraduate) University, which didn’t actually do any Art, but just talked about it? Indeed, I, as a physicist, ended up as the secretary of the Art Society in the University for two years.

Do the members of staff in Philosophy actually practice Philosophy? I ask this question because as a physicist today I simply cannot avoid Philosophy – and, I don’t mean the Philosophy of Science, for such restrictions merely end up being about the History of Mankind in Science, and not about the Understanding of Reality. Surely, Science and Philosophy should be very close bedfellows and work together on this same crucial task?

But, I have also observed that very few of the “doing” physicists have any real idea of their own philosophic stance and underlying assumptions. Yet, after generations of scientists being confirmed materialists (as a matter of course), they, as a body, in the latter part of the 19th century, found that the ground beneath their feet was beginning to shake, and deliver ever more damaging tremors, which threatened the very fabric of their once rock-steady stance. The discovery of the Quantum opened up the biggest ever can of worms. Indeed, great fissures opened up, and something had to be done!

The result, finally consummated in 1927 at the Solvay Conference, was a total abandonment of their prior Mechanical Materialism, but what replaced it was even worse.

In Sub Atomic Physics a totally idealist standpoint, with “materialist” experiments was adopted. Instead of the search for physical causes, the whole approach was re-orientated to merely seek Form in experimental results as sufficient in itself. Causes were either unknown or unknowable, but here in our hands were the useable results. We would not only abandon the next question, namely “Why?”, but we would seek our patterns as themselves being the causes of phenomena. The equations produced were conceived of as the Natural Laws of Reality.

The inference was clear: the found equations were the driving imperatives of Nature – they actually caused observed phenomena.

Let us reiterate that stance!

The purely formal relations (abstracted from Reality most certainly, but thereafter simplified and idealised from purposely farmed experimental situations) were turned into being the sole drivers of Reality.

Science had become a branch of Mathematics (and could be researched mainly on a blackboard with chalk). It had been changed into an idealist discipline.


Of course, a fig leaf of “explanation” was vigorously defended, but it certainly was not an attempt at real explanation in terms of physical substances and their properties. It had been removed into the Parallel World of Pure Form alone, which I call Ideality, and as such was doomed.

Of course, there had been a great deal wrong with the prior scientific standpoint. It had for centuries been a dualist compromise between an idealist belief in form and a materialist search for causal explanations, which somehow remained together as a workable compromise. The equations were so useful, you see: they had to be an objective in any research!

And, along with this illegitimate compromise, the materialist stance was indeed entirely mechanist: it did not address reality's development at all.

The odd genius, such as Darwin, though transcending that stance, did not, and at that stage could not, change the basic standpoint one iota. The basic sciences were about eternal Natural Laws, as encapsulated in formal equations.

You could not THEN challenge that belief!

The vast majority of physicists really did believe that everything in the Universe could be directly explained in terms of a straight-through series of causes, right back to fundamental particles, and also that these were accurately described by the formal equations – the Natural Eternal Laws. You can see the amalgam. It wasn’t easy to demolish by individual gains in isolated areas. It was the generally agreed ground!

Thus, though many linking gaps were evident, they would ultimately be closed, resulting in everything being derived ultimately from the Primary Science that was Physics. And, increasingly within that subject the nitty gritty would then have to be the Study of Fundamental Particles.

Of course, such a stance did not represent the real situation. Indeed, it was miles away from a comprehensive position, but it had been productive at all levels for centuries, and would not be renounced with the occurrence of as yet unexplained anomalies.

Until undeniably demolished, most scientists would stick to Reductionism – the concept that eternal Natural Laws at the bottommost layer of Reality generated absolutely everything else above it, all the way to Life, Consciousness and Society. And, in such a context, the various different subjects, be they Chemistry, Biology or even Psychology, were then only forced sub-divisions of a single structure, as would be proved as the missing links were found one by one.

But, of course, that assumption is incorrect!

The divisions into different sciences are not down to mere human categorisation. They are, in fact, reflective of wholly different Levels of Reality, which when they first happened resulted in new worlds with their own relationships, that did not exist prior to that Event.

NOTE: The inability of scientists to discover the Origin of Life proves this conclusively: it isn’t a mere complication of existing laws from a lower Level, but the product of a seismic Revolution.

And, the reason that one can be so adamant about this is that the prevailing stance ignores Development almost entirely, and simply believes it is merely complication.

Indeed, this criticism is proved by the predominance of the Second Law of Thermodynamics in the views of present-day Physicists. For such a Meta-law is ONLY about dissolution, and says absolutely nothing about what created the evident structures in an indisputable prior Development. Puny efforts have been made to ally the Second Law with Randomness and Chance, in which that crucial progress in the history of Reality is put down to the roles of these random contributions, but the evident ineffectiveness of such efforts proves that they are mere constructs, and reflect nothing of the real Development that has occurred at all!

The problem is, and always was, the shortness of the life spans of individual human beings. No individual could actually observe big developments in reality, and indeed, for millennia, Humanity has considered reality as a fixed thing – an achievement and then a maintenance of a natural stability in all things.

Now, stability, as commonly observed, appeared to be the normal state of Reality, and all change, if there was any at all, was tiny and incremental – occasionally passing some formal quantitative threshold, and thereafter slipping things over into a more conducive and maintainable alternative stability. It was an understandable mistake, but incorrect due to lack of evidence.

The first cracks appeared, centuries ago, in the studies of geologists, who revealed a changing world clearly recorded in the rocks beneath our feet.



But, this development was incredibly long winded – even thousands of years were insufficient to reveal changes, and significant transformative changes were much rarer – usually only apparent over millions of years. And, as the science developed, what it revealed not only included vast changes in the material forms of continents and even the oceans, but also revealed an indisputable evolution of Life itself, AND even the actual time of Life’s Origin on earth.

Yet this subject, Geology, was regarded as merely a “secondary discipline”: its conclusions were majorly discounted by the fact that it was a limited discipline, where investigative experiments were impossible. So, clearly, the testable and superior discipline of Physics, which could explain things from bottom to top, remained unchallenged as the explanation of everything there is.

And sadly, even Geology was mute when it came to the clearly evident step-changes in the record of the rocks.

For such a record, very slowly built up over tens and even hundreds of millions of years, could only clearly deliver established stable levels gradually laid down over colossal periods of time, and the crucial dramatic changes would be located in time, but totally absent as an investigatable record. The essential interludes of major and often qualitative changes were simply unrecorded: all that could be clearly seen would be an impenetrable step-change.

These crucial interludes were invariably of relatively short duration. Indeed, the first indications that such transforming interludes actually occurred were found in the recorded histories and the archaeological revelations of earlier Human Societies. For there, without any doubt, such transforming interludes definitely occurred.

Even in the found remains of earlier human beings, there were clear examples of indisputable evidence of what was then described as the Neolithic Revolution. And, this occurred at a time when the only tools available were knapped fragments of flint, plus a few shaped softer materials such as wood and horn.

Yet, in this remarkable interlude, Mankind invented animal husbandry, farming, pottery, weaving and many other new techniques. It was no slow, incremental set of changes, but just such a relatively short Revolution.

Also, from the opposite end of Mankind’s studies, in Social History, came the concepts of Stability and Revolution – an oscillation between long periods of slow, incremental quantitative changes, and short interludes of major, qualitatively transforming changes.

Could this be the natural pattern of all developments at all levels?

Slowly, evidence began to accumulate that this was indeed the case... But, did it change the stance of our physicists? No! They actually admitted that switches occurred in the development of Reality, but also insisted that all of these would ultimately be totally explicable in the usual mechanical materialist ways. No change was necessary in their philosophic stance.

Clearly, with this blinkering, a crisis was unavoidably looming, and it would be a biggy! Indeed, there was a Revolution in the offing. Then, finally, that time arrived: the avalanche of contradictions multiplied all the time, and a transcending of their causes (long overdue) became imperative. Yet these head-down scientists were extremely unlikely to allow any generalist, head-up conceptions (by uninformed dreamers) to deflect them from their “holy” path.

The intervention just had to come from without!

It had happened in 19th century politics with the intervention of Karl Marx (a philosopher) into areas such as Social History and Economics, but the necessary intervention in Science was never carried through. Marx was initially only an academic philosopher – a follower of the idealist giant Hegel – yet his intervention transformed politics.


The question therefore posed at the outset of this paper can now be restated:- “Do academic philosophers DO philosophy?”

What is their position on the Copenhagen Interpretation of Quantum Theory, and the consequent switch among scientists to Reality itself, being determined by Formal Relations, as embodied in Equations?

Something MUST be going on, which simply must be in the remit of practising philosophers. For in my developmental studies, contributions by the philosopher Salmon enabled me to see the role of the Principle of Plurality in science. Yet, no moves are then made by today’s philosophers to address this flawed principle, as Hegel would most certainly have done 200 years ago!

The question has to be, “Why?”

Now, if, as seems to be the case, what is studied (and explained?) is merely the History of Philosophy, cannot the trajectory towards Truth be discerned in that, as it was in History by the philosopher Marx?

29 April, 2015

John Berger: ‘Writing is an off-shoot of something deeper’


"Words, terms, phrases can be separated from the creature of their language and used as mere labels. They then become inert and empty. The repetitive use of acronyms is a simple example of this. Most mainstream political discourse today is composed of words that, separated from any creature of language, are inert. And such dead “word-mongering” wipes out memory and breeds a ruthless complacency." JB
Very interesting piece by Berger for the Guardian newspaper on language and writing as a process of understanding - he always was a hero of mine!

Link to article

Lost Wisdom



The Pragmatic Road to Knowledge

I was reading a piece the other day which mentioned Zeno and his Paradoxes (from 500 BC), but “explained” them with a couple of throw-away lines. Yet the crucial question, “Why did those Paradoxes work, as Zeno had intended?”, was certainly not answered! Indeed, if you looked very carefully at the proffered “explanations”, you could only come up with something like – “He was using his assumptions in the wrong places!”

But Zeno was making his points long before the discipline we now call Science existed, so he took the principles that he and his contemporaries employed as being generally true, and then proved that they weren’t. But, in addition to losing this crucial context, the throwaway “explanations” were also totally insufficient in themselves, because what was actually being dealt with was an example of a Dichotomous Pair – where ideas derived from the very same grounds (as assumptions and principles) ultimately delivered contradictory concepts. They couldn’t both be true.

Yet, they had been developed from what everyone believed to be a single consistent and coherent set of premises. But, even this analysis proved to be so subtle that it was not arrived at for a further 2,300 years. It wasn’t dealt with until Hegel’s researches into Human Thinking arrived at what he realised were unavoidable products of incomplete and inaccurate premises, caused by our only available method of Abstracting-from-Reality. For, though this had been a remarkable and important invention, it had to both simplify and idealise what was being observed in order to be able to make any current sense of any studied situations.

It wasn’t a mistake, for it was inevitable at our then state of understanding.

But, what was really devastating, was that Hegel realised that inaccuracies would always recur, time after time, for Mankind is making up his methods of Thought as he goes. These Dichotomous Pairs would definitely recur continuously.

Now, before there is a general chant of, “Give up now you’ll never do it”, two things have to be made clear.

First, such abstractions were still very valuable indeed, because they did contain some Truth, if not all Truth! So, while thinking with such abstractions could most times be extremely useful, at some point further on they could also produce these contradictory Dichotomous Pairs, which couldn’t both be true, and hence cast doubt on everything else founded on those same basic premises.

And, Hegel did not only deliver all this revelation: he also devised a methodology for transcending these dichotomies.

Now, somewhat surprisingly, to this day, most reasoning is totally unaware of his successfully devised methods of transcending the inevitable impasses. Listen to any politician, or reporter, and you will get absolutely no recognition of this vital feature of our thinking, never mind any attempt to apply the means to transcend them. Yet Hegel lived over 200 years ago!

Now, you would think that such a story is totally unbelievable, but the reason for Hegel’s gains being largely ignored is to do with where they inevitably led. For, Hegel’s best, and most dedicated students – the Young Hegelians, took his ideas much further – indeed, they transferred them wholesale from Hegel’s own Idealism into the opposite Materialist stance. And, thereafter, they were applied to absolutely all Development – not only in our heads but also to evolving Reality in general.

The leader of this philosophic revolution was Karl Marx, and he spent the rest of his life on the side of the Working Class, and against the ruling Capitalist Class.

Such ideas became anathema, and were never allowed to be taken further in the Citadels of Wisdom of the Capitalist States (unless, that is, it was to emasculate them!).

24 April, 2015

Marxism and The Origin of Life


How Philosophy Aids Science

“What?”, I hear you say, “What could possibly be a Marxist view of such an Event?”

Well, it is the only approach capable of solving that important question. And, the reason that this is the case is because Marxism, from the outset, was, and still is, a Philosophy. It is not only a political stance in the Modern World. It was first devised by Karl Marx, a philosopher and follower of the great German idealist philosopher, Friedrich Hegel, and everything that Marx did consequently stemmed from what he and his tutor had managed to find out about Reality, and Mankind's place in attempting to understand it.

It is claimed by many (who do not understand it fully) to be "Scientific Socialism", but such a description fails because, at present, there is no comprehensive and consistent "Marxist Science". And, such a description never tallies with what people see in the everyday actions of those claiming to be committed Marxists.

It is certainly more correct to call it “Philosophical Socialism”, because truly great gains in Philosophy were its real foundation stones.

And, the most significant steps in that direction were taken beginning almost 200 years ago: first by Hegel, with his truly profound studies into Human Thinking, and, thereafter, by Marx, who "Stood the idealist Hegel upon his head, or rather on his feet!"

By the time of Marx’s Communist Manifesto of 1848, the new philosophical stance had already been applied to History and Social Revolutions, and was later comprehensively applied to the overall development of Human Society and its latest phase, Capitalism, in his book, Das Kapital.

But, the most obvious and potentially fruitful alliance, benefiting both, would certainly be with the other strongly materialist discipline, Science.

But, that did not happen!

The reasons were understandable, but also unforgivable. Science at that time (and still to this day) was in the hands of the privileged classes, and they could not stomach Marx's conclusions about the need for revolutionary change. Occasional exceptions among scientists were too few, and too amateurish philosophically and/or scientifically, to divert the enormous momentum of technology-inspired success in Science, and the necessary turnaround of that important discipline never happened.

Yet, as with all such necessary revolutions, there always has to occur a major crisis and consequent collapse of the old stability to initiate major changes, and that finally occurred – too late! For those who could have led such a Revolution were dead, or removed from having any influence within the organisations of the Working Classes internationally.

The turning point should have been in 1927, when Bohr and Heisenberg won the day in turning Physics into a thoroughly idealist discipline with their Copenhagen Interpretation of Quantum Theory. But crucially, there were absolutely NO scientists who could deliver the telling blows, and demolish the speculative ideas of these so-called scientists.

But, also, and crucially, there were NO self-proclaimed Marxists who could do it either. Since the Russian Revolution, the development of Marxism itself had, more or less, come to a halt.

But, finally, at long last, the situation is beginning to turn around. There are an increasing number of scientists who are turning away from Copenhagen and towards a more holistic approach, which grows ever closer to a real Marxist standpoint philosophically.

They don’t necessarily term their new position Dialectical Materialism, but that is where things are heading.

And within the Working Class movement there are Marxists who know what has to be done, and are beginning to do it.

In the last few years, the Theory of Emergences has been described, with the remit of delivering a stance applicable to all disciplines, and the proof of the pudding has been in the eating: the famed Double Slit Experiments have finally been explained physically, without any recourse to the Copenhagen position.

And, we are proud to present a collection of contributions to the number one question in Science: the Origin of Life on Earth.

And a series of papers starts with "Ideas on the Origin of Life" in the current SHAPE Journal:


Issue 38 of Shape: Ideas on the Origin of Life



This latest edition started as a reaction to an article in New Scientist (3008) on Eukaryotic and Prokaryotic cells in the development of life, but soon drew in the prior work by this theorist on the Origin of Life itself.

It is worth stressing that working downwards from living entities, and working upwards from non-living entities, would both fail to explain this crucial event which, rather than being a mere incremental development in the evolution of matter, was certainly a kind of revolution, and must have occurred in what we now term an Emergent Event. Thus, this collection of papers became a kind of review of the ideas vital to a solution to the most important problem in Science: why does life exist at all?