29 March, 2015

Excuse me, I think you have it the wrong way round!


Göbekli Tepe

According to the editorial in a recent New Scientist (3014), it seems that I am what it calls a "secularist", and one who invalidly imposes an economic imperative onto the evolution of mankind.

In the opinion of this editor, by contrast, the discoveries at Göbekli Tepe, along with more recent Mayan findings, “prove” that what we call civilisation was a direct result of changes in religion. Yet in that very short piece absolutely zero reasons are provided as to why all the achievements of the Neolithic revolution – Farming, Animal Husbandry, Pottery, Weaving, Writing and all the rest – could have come solely from some religious imperative. What rubbish!

Nobody can doubt that ancient hunter/gatherers would have had some sort of religion, and that such beliefs would have given them confidence when shared with others, certainly aiding their survival. But to make such a cerebral change the reason for the emergence of civilisation is actually laughable. In those days there were almost as many languages as there were tribes, so social belief systems across significant populations (religions) could not develop. And to say that such beliefs were the causes and mainsprings of civilisation, and of the above-detailed revolution, is, to say the least, amazingly ignorant of real processes of qualitative development. Something else was needed to make larger social groupings viable, in order for such things to develop.

Primarily you have to be able to explain why human beings could economically remain in one place, and not only survive, but actually prosper. The density of population possible with hunter/gatherers was very low indeed, for there simply wasn't enough food in any single place for a collection of peoples to settle. Exceptions to this norm did exist, but they were both rare and temporary. 


Lascaux

In the south of France following the last Ice Age, herds of animals moved north through a narrow valley to summer pastures and returned the same way in autumn. Here it was possible for hunter/gatherers to exploit this bounty, remain in one place, and flourish. The cave paintings at Lascaux demonstrate this: an always-travelling group could never have achieved such a wealth of images in one place; they wouldn't have been anywhere long enough to develop such rituals.

But such things are not only exceptional cases, they never lead to the larger populations required for civilisation. When we address the development of mankind in general, we simply cannot establish such revolutionary changes, as are embodied in a universal concept like civilisation, upon such rare exceptions.

The gains had to be long-lasting and, in the modern parlance, sustainable.

So, the only general explanation for static villages, and ultimately cities and civilisation, was agriculture. The most important settlements were always near rivers, allowing possible irrigation (or flooding), replenishing the fertility of the land. 

May I roundly condemn such an editorial in a supposedly scientific magazine as positively reactionary?! To say that religion led to these developments is, at best, confusing cause and effect. This so-called “secularist” prefers his own, more accurate self-description – a scientist.

I have written about Göbekli Tepe before, as it is certainly a very important site in these matters. The piece appears in Special Issue 24 of Shape Journal and offers a critique of the philosophy and method of contemporary archaeologists who seem to have rejected an economic imperative for social development (see V. Gordon Childe) in exchange for a more idealistic, cultural cause.



26 March, 2015

New Special Issue: The Unknown Ocean III


The 33rd Special Issue of the SHAPE Journal is now available. It is the third and last part of our Unknown Ocean series, continuing our journey deeper into the uncharted depths of reality, this time concentrating on the new methodology necessary for truly understanding it.

25 March, 2015

Thinking Randomly


The Inevitable Dead End of Pluralist Science

I am about to tackle a whole special issue of New Scientist (3012) on Random Chance (contributed by several different science writers). Now, I have, for some time, rejected purely formal investigations into physical phenomena, as being solely about Abstract Form, and, therefore, concentrating only upon description, rather than the actual physical causes of the natural happenings. Indeed, I have demonstrated that the seeming coherence of formal equations is more about the consistency in the parallel World of Pure Form alone (which I term Ideality) rather than being about the true nature of, and causes within, Reality itself.

Clearly, there are, indeed, some real situations in which physical reasons for a random-like outcome are also valid. But, both kinds came to be seen as the same – when they are most certainly NOT! Indeed, the two get swapped around completely: with the Copenhagen Interpretation of Quantum Theory, randomness is promoted from being a mere consequence of multiple opposing causes to itself becoming the cause!

Thus, as tackling this may be a massive undertaking, I feel it necessary to anticipate all the usual tricks. And, as I have learned from similar situations in the past, I do so by consciously laying out my current stance, so that I will be able to register precisely where the two approaches begin to diverge significantly. In this preparation, we must, of course, start where the two approaches actually coincide – where physical reasons result in a situation best seen as completely random – and then, thereafter, trace out the diverging paths (and consequent assumptions) from that mutually agreed situation.

We must also, of course, make clear the differing assumptions and principles of the two approaches. They are categorised by the very different principles of Holism and Plurality, which turn out to be the grounds for the two alternative approaches.

Holism sees Reality as one composed of multiple, mutually-affecting factors; while Plurality sees such factors as separate and eternal Laws.

The distinctions turn out to cause the two alternatives to lead off in entirely different directions, because there is no doubt that such complex situations have contributions from all involved factors. All phenomena, in one way or another, are results of many different factors acting simultaneously.

The crux of the problem is whether that joint causality is merely a SUM of independent, unchanging factors, OR the result of all the acting factors being changed by the presence of all the others, delivering a result which then cannot be explained in the same way at all.

For, the pluralist approach, where phenomena are mere complications of unchanging natural laws, assumes that explanations can be arrived at by extracting each of those involved, one-at-a-time, until all the laws involved have been identified.

Whereas, from the Holist stance, that conclusion is seen as merely a pragmatic method of deriving simplified and idealised “laws” via experiment; Holism, in contrast, would have, somehow, to see how each of the various factors involved is modified by its context – that is, by all the other factors – and the same will be true for each and every one of the rest.
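Purely as a toy illustration (the factors and the coupling coefficient below are invented for the example), the distinction can be sketched in code: the pluralist result is a mere SUM of fixed contributions, while the holist result first lets each factor be modified by all the others.

```python
# Toy sketch only: contrasting a pluralist "sum of fixed laws" with a
# holist picture in which each factor is modified by all the others.
# The factor values and the coupling strength are purely hypothetical.

def pluralist_result(factors):
    # Each factor contributes independently and unchanged.
    return sum(factors)

def holist_result(factors, coupling=0.1):
    # Each factor is first modified by its context (all the other
    # factors), so the outcome is not a simple sum of the originals.
    modified = []
    for f in factors:
        context = sum(factors) - f      # all the other factors
        modified.append(f * (1 + coupling * context))
    return sum(modified)

factors = [1.0, 2.0, 3.0]
print(pluralist_result(factors))   # 6.0 - the mere SUM
print(holist_result(factors))      # 8.2 - the context-modified outcome
```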

Now, the pluralist view does approximate reality very well in carefully isolated, filtered and controlled situations, so, needless-to-say, that approach soon came to dominate our way of doing things. But it could never cope with certain results, the most significant being those occurring in qualitative developments, where wholly new features emerge, and most dramatically in what are termed Emergences (or in another parlance – Revolutions).

Plurality became the norm, for it could be made to be both predictable and very productive, as long as it occurred in stable situations (whether naturally stable situations, if available, or purpose-built, man-made stabilities).

Now, this coloured the whole level of discoveries arrived at with such a stance and method, and also unavoidably caused multiple areas of study to be separated out from one another, simply because causal connections between them were unavailable.




Reality-as-is was literally constantly being sub-divided into many specialisms, or even disciplines, which could not be made to cohere in any simple approach. And, as the required “missing-links” just proliferated, the gulfs between these separate, created areas of study grew ever larger – even though the belief was that, in time, these inexplicable gaps would finally be filled with explanations of the same type as within the known and explained areas, but not, as yet, discovered.

It amounted to cases of investigators “painting themselves into corners”, by following their primary principle – Plurality.

Now, such a trajectory was historically unavoidable, because the gains of such pluralistic science within their defined and maintained contexts were so successful and employable in effective use.
Naturally and pragmatically, Mankind moved as swiftly as possible, where they could, and the flowering of this effective, pluralistic stance was, of course, Technology. But, such approaches could never address the really big questions. And, of course, both understanding and development were replaced by increasing complexity and specialisation.
So, quite clearly, such a mechanistic conception could never cope with the emergence of the entirely new: such features were always described as particularly involved complexities, yet never explained as such.

Now, to return to the supposed “common ground”, this was interpreted as a complex, multi-factor situation, in which opposing contributions tended to “cancel out”, and the result was a situation best described as one in which random chance predominated. And, of course, such could indeed be true, in certain situations – for example in an enclosed volume of mixed gases, you could derive both temperature and pressure by assuming a situation of random movements and collisions.
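The standard kinetic theory of gases shows exactly this: for N molecules of mass m moving randomly in a volume V, with mean squared speed ⟨v²⟩, the usual results are

$$ P V = \tfrac{1}{3} N m \langle v^2 \rangle, \qquad \tfrac{3}{2} k_B T = \tfrac{1}{2} m \langle v^2 \rangle, $$

so both the pressure P and the temperature T emerge purely from the statistics of random motions and collisions.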

But, such required it to be a continuing and stable overall situation. If the gases were reacting with one another, and other substances were present, and reacting too, then the persistence of that stability might not reign very long.

As always, such an assumption was invariably a simplification and an idealisation of the given situation.

Now, such assumptions were frequently adhered to, as stabilities are the rule in most circumstances, so such ideas can be used with a measure of confidence. BUT, whenever any development began to occur, such a stance could never deliver any reasons why things behaved as they did.

In Sub Atomic Physics, for example, as investigations penetrated ever deeper into Reality, problems began to occur with increasing frequency. The discovery of the Quantum became the final straw! 



For, in that area, phenomena occurred which blew apart prior fundamental assumptions about the nature of Reality at that level: what had always been considered to be fundamental particles, for example, could seem to be acting like extended Waves. Yet, with the smallest of interventions, these could immediately revert back to acting as particles again. These phenomena were deemed to be beyond explanation, and what was used instead of the usual explanations was the mathematics of random chance – not about a single given position, but significantly about the probabilities of the entity being found in each of a comprehensive set of positions (presumably relating to a wave) when it reverted back to being a particle again.

The formalisms of waves were used in this very different and illegitimate context. At the same time, physical explanations were dumped completely. The only reliable means of predicting things was in the probability mathematics developed originally for Randomness.
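That probability mathematics is, in the standard quantum formalism, Born's rule: the wave function ψ delivers not a trajectory, but only the probability of finding the particle at a given position,

$$ P(x) = |\psi(x)|^2, \qquad \int |\psi(x)|^2 \, dx = 1, $$

so only the distribution over that comprehensive set of positions is ever predicted – never any physical reason for a particular outcome.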

Now, the question arises - what do the Copenhagenists think they are doing with this new position?

First, they have NO explanations as to why it happens the way that it does.

Second, they deal only in Forms – equations and relations! Do they consider these to be causes?

Well, maybe you have guessed their position. They have no adequate answers, and hide behind the known physically-existing random situations (as described earlier), and say that their equations have exactly the same validity as those. Not so! For, this researcher has managed to explain all the anomalies of the Double Slit Experiments merely by adding a universal substrate into the situation. You don’t (as the Copenhagenists insist) need their idealist “interpretation” at all.

But, nevertheless, a great deal more still remains to be done. If the real problem is the incorrect Principle of Plurality, and the only possible alternative is the Principle of Holism, then we are still a very long way from a comprehensive explanation of natural phenomena from this stance.

The whole edifice of current Science (and not just the current problems in Sub Atomic Physics) is undoubtedly based upon flawed assumptions - not only in how things are explained, but also in Method. It is exactly how we do all our experiments!

To change track will be an enormous undertaking, and a look at the current state of “a Holist Science” reveals that almost nothing has yet been achieved. Honourable exceptions, such as the work of Darwin, Wallace and Miller, are simply not enough to fully re-establish Science upon a coherent, consistent and comprehensive basis, for that will be an enormous task. But, there is a way!

First, we carry on with the current pluralist methods, BUT no longer stick to the pluralist principle as correctly describing the nature of Reality. It becomes a pragmatic method only. We can use it both to reveal the contents of a situation and to actually organise productive use, BUT we must abandon the myth of eternal Natural Laws revealed by those methods. We must accept that what we can extract in pluralist experiments are NOT “Laws”, but actually approximations confined to the particular imposed conditions, in which such were extracted.

Second, we must begin to devise and implement a new paradigm for scientific experiments. Stanley Miller’s famous experiments into the early stages in the Origin of Life, and Yves Couder’s “Walkers” experiments, clearly show two valid routes.

But, most important of all, scientists must become competent Philosophers!

Pragmatism and positivist philosophical stances are just too damaging. A clear, consistent and developed Materialism must be returned to, but it can no longer be the old Mechanical version.

We must learn the lessons of Hegel, and actively seek the Dichotomous Pairs that indicate conclusively that we have gone astray, and thereafter, via his dialectical investigations into premises, recast our mistaken assumptions and principles to transcend our own limitations, at each and every guaranteed impasse.

The main problem in Holist Science is the fact of the many, mutually-affecting factors in literally all situations in Reality-as-is. Plurality validated the farming of experimental Domains to simplify such situations, cutting down the affecting contributions either to a single one, or to one that was evidently dominant. So, with repeated experiments upon different target relations, such investigations delivered NOT Reality-as-is, but a simplified and idealised set-up: each factor, in its own particular Domain, presented a version which could be replicated, allowing effective use of the idealised form extracted. But the whole series of different tailor-made Domains, each with one of the factors as target, also delivered a list of the factors involved.

Pragmatically, each could be used effectively to some required end; but the list, though each entry was idealised, would, as a full set, allow a theorist to attempt an explanation of what was seen in the unfettered and natural “Reality-as-is”. Though achieved using pluralist methods, and hence distorting each factor, a Holist theoretician could use that list in a very different way.

It was, after all, this situation – including BOTH views – that had been the usual way of doing things in the past, before Wave/Particle Duality and the Copenhagen stance brought it to an end.

Yet, of course, the new stance was not the same as before. For now, plurality was just a pragmatic trick, and the real Science was in Holist explanations.

The pluralist mistake of assuming eternal Natural Laws, which produced all phenomena merely by different summations, was no longer allowed. And the belief in Natural Laws was replaced by an acceptance that all extracted laws would not only be approximations for limited circumstances, but would also, and inevitably, at some point produce contradictory concepts; and that only by unearthing the premises that had produced these, and changing them for something better, would the produced impasse be transcended, and further theoretical development made.

The Holist scientist did not deal in supposedly eternal Natural Laws, but in factors that could indeed be affected by their contexts. So, this new breed made the investigation of how factors were changed by their contexts their central concern, and were always prepared to directly address major impasses when they arose.
Now, there are different ways of doing this sort of Science. In this theorist’s re-design of Miller’s Experiment, he realised that the provision of inactive pathways within the apparatus would facilitate certain interactions and their sequencing. So, a series of versions of the experiment were devised with such appropriate inactive channels, which were also equipped with non-intrusive sensors, sampling on a time-based schedule. Then, on a particular run with a given internal channelling, the data collected from the various phases could be analysed afterwards, to attempt to discover the differing contexts obtaining at different times.
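No such apparatus yet exists, but, purely as a sketch of the data-collection scheme just described (every name, channel layout and timing below is invented, and no real apparatus or API is implied), the time-based sampling across differently-channelled runs might be organised like this:

```python
# Hypothetical sketch only: time-scheduled, non-intrusive sensor
# readings across several runs of a re-designed Miller-type experiment.
# All layouts, sensors and timings are invented for illustration.

CHANNEL_LAYOUTS = ["layout_A", "layout_B", "layout_C"]  # inactive pathways
SENSORS = ["temperature", "ph", "spectrum"]             # non-intrusive probes
SAMPLE_INTERVAL = 60.0      # seconds between samples
RUN_DURATION = 3600.0       # one run lasts an hour

def read_sensor(sensor, layout, t):
    """Placeholder for a real, non-intrusive measurement at time t."""
    return 0.0

def run_experiment(layout):
    """One run with a given internal channelling; returns phased records."""
    records = []
    t = 0.0
    while t < RUN_DURATION:
        records.append({
            "layout": layout,
            "time": t,
            "readings": {s: read_sensor(s, layout, t) for s in SENSORS},
        })
        t += SAMPLE_INTERVAL
    return records   # analysed afterwards, run by run, phase by phase

all_data = {layout: run_experiment(layout) for layout in CHANNEL_LAYOUTS}
```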


Something similar is currently done by Weather Forecasters in their simulation models linked to large numbers of sensors at weather stations, but the crucial difference in the New Miller’s Experiment was in the Control that was in the hands of the experimenters. That was never available to the Weather Forecaster: he had to take what he was given. The new holistic experiments allowed something greatly superior to the forecasters’ Thresholds, at which one law is sidelined to bring in another. Weather simulations with thresholds were a crude invention to attempt to deal with a real Holist situation pluralistically.
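Such a threshold can be caricatured as follows (the variable, cutoff and coefficient are all invented for illustration): one fixed “law” is simply swapped for another at a set point, instead of the factors being allowed to modify one another continuously.

```python
# Toy caricature of a forecaster's "threshold": one fixed law is simply
# swapped for another at an invented cutoff, rather than the factors
# being allowed to modify one another continuously.

HUMIDITY_THRESHOLD = 0.7    # hypothetical cutoff

def rainfall_rate(humidity):
    if humidity < HUMIDITY_THRESHOLD:
        return 0.0                                   # "dry law" applies
    return 5.0 * (humidity - HUMIDITY_THRESHOLD)     # "wet law" takes over
```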

In the described Miller-type experiments the investigation would be constantly re-designed, both in the inactive structures provided and in the kinds of sensors employed, to maximise the information gathered and allow progress to be made. Clearly, such control is never available to weather forecasters.

21 March, 2015

The Past & Future Bases for Science



Can we definitively establish the philosophical bases on which Science has been built?

There are extremely important reasons for this question. First, there is the evident way that it has totally transformed human society via Technology. And second, it has inexorably led its practitioners (and the rest of society) into an unavoidable theoretical cul de sac, with persisting and innumerable contradictions and anomalies.

It surely, therefore, has to be the most important task of our age to attempt to transcend the resulting impasse, and to understand Reality more fully?

There must be a major and universal, but un-admitted crisis in Science as a whole. But, it isn’t transparently evident, because both scientists and the public are deceived by the continuing and even accelerating successes of Technology.

Another confirmation of this crisis is that I, in a long life in Science, have never seen so many of its leading practitioners shouting out its believed-in virtues so loudly (and regularly) from the rooftops!

They doth proclaim too much!

Practitioners are uncomfortable, but see only more contradictions and crises ahead, and have to convince themselves (and everyone else) that the current ideas and methods will see us all through to what we seek.

But, the proof of the pudding has to be in the eating!

And, in Sub Atomic Physics that is evident both in its experimental practice and in its so-called “Theory”. The experimental width has narrowed down to a solely technology-led development: new, ever more powerful telescopes (which see ever more) and Accelerators (which smash “fundamental” particles to smithereens in order to see what they are made of). And both deliver a constant stream of new discoveries and associated data – now swelling into a relative torrent – to fuel the established narrow paths of development.

Yet, the so-called theoretical results are indeed philosophically abysmal. And while ever-new, but inexplicable, discoveries in Space also lead to ideas like Dark Matter and Dark Energy, the cosmologists retreat into unfounded speculation concerning multiple dimensions and unseen Universes, along with the purest of pure Forms in Mathematics, such as String Theory and a Space-Time 4-dimensional “purely formal substrate” to deal with Gravitation.



The current Grand synthesis (though it is currently being challenged by even worse idealist nonsense) has for some time now been The Copenhagen Interpretation of Quantum Theory, with its “cornerstone proof” delivered by the famous Double Slit Experiments, and its abandonment of explanatory theory for pure mathematical descriptions, standing indubitably upon an entirely idealist stance.

Crisis? They are in a terrible mess, and daily only dig themselves deeper into the mire!

Oh, I am well aware they would all deny it vehemently, but I have never, anywhere, met worse philosophers than are presented by modern physicists. If the reader needs convincing of this, may I recommend reading the truly atrocious Physics and Philosophy by Werner Heisenberg, who, along with Niels Bohr, established the Copenhagen stance, and won the day over giants like Einstein at the Solvay Conference in 1927. [Yes, it is a long time ago, and confirms my description of the impasse as still totally unresolved to this day]

But, the actual causes for this crisis were never revealed, and were by no means even new. They had been around for literally millennia, and never resolved. Indeed, from the time of the illustrious ancient Greeks, some 2,500 years ago, a major split took place in what later became known as Philosophy. The two streams that emerged concerned the ultimate Nature of Reality: one typified it as holistic, as embodied in the ideas of the Buddha, while the other was pluralistic, subscribed to by literally all of the famous Greek philosophers.

It was an important split, and, at that time, unavoidable, due to Mankind’s still undeveloped ideas in such areas. They were, in fact, very different ways of trying to understand Reality, and both made assumptions to facilitate those investigations.

Perhaps, the holist view was, surprisingly, much closer to the true Nature of Reality, but at that time pretty well unable to direct any sort of investigative practice.

The pluralist view, on the other hand, was very different, for it did, indeed, provide an effective and useful method of studying Reality. It made it perfectly OK to “hold Reality still” – to actually both isolate and change an area of Reality by farming it in carefully protected and maintained plots, where as many confusing or “inconsequential” factors as possible were either removed or held constant. And, this was considered not only legitimate, but actually the only way to reveal the eternal causing Laws that were actually driving it.

Now, from a purely pragmatic view, the comparison seemed to be a case of “no contest”, for the holist view merely led to a human-based outlook, while the pluralist view led to a technological methodology that could be made to actually work. What it involved was an active intervention into Reality within isolated Domains, which were ideally tailored to reveal normally hidden relationships, by the elimination or control of literally everything else.

It was clearly a powerful approach, and when carried out in the best possible way, did reveal something, not only informative as to cause, but also useable in producing required outcomes. It was indeed almost magical! Clear patterns were revealed that could be extracted and even formulated into mathematical equations, to be used when required.

BUT, they could never be used in totally unfettered Reality. Not only had Reality to be completely controlled, but it had to be controlled in exactly the same way that had enabled the extraction of the revealed form, if that form were now to be used. But, nevertheless, it was a great success! However, its consequent formulation into a philosophical standpoint was far from the Truth.

For philosophically, it identified the actual motive forces of Reality as precisely these relations that had so carefully been arranged for in the very unnatural and necessary conditions for their revelation and extraction. It just wasn’t true!

What had been developed was a Technology for using parts of Reality, but NOT yet a means of understanding Reality. So, even alongside this technological stance and productive method, there appeared a very different stance, which was holistic, and attempted to explain phenomena in Reality in terms of the revealed properties of the substances involved.

These were, of course, the theoreticians, who were always trying to integrate various discoveries into a comprehensive overall scheme.

Remember, these two different stances co-existed awaiting an expected future bridge between the two. But, this situation also led to an inevitable growing apart of the various groups of scientists involved. We had the observers and data takers – the Experimentalists! Also, there were the theorists with their ever-wider attempts at explanation. And finally we had the technologists, with their equations and pragmatic purposes.

And, as time passed they didn’t ever manage to approach a single view, but instead grew ever further apart.

Indeed, an interesting and much wider social effect was imposed, due to the continuing success of the technologists: these gains were increasingly turned into the main purpose of Science, so that the constant questions became, “But, how can you use it to benefit everyone?” and, “What can you make using your discovery?”

The technological tail increasingly wagged the investigative dog!

The proof is evident absolutely everywhere. Bigger and better telescopes, and bigger and better Colliders are now the only sources of new data, and computing products get smaller and more powerful by the day. Can anyone doubt the totally dominating imperative in today’s Science?



But, the defeat of the explanatory theorists at Solvay also involved a major turn “theoretically” – for without the presence of both alternative stances, as embodied in the pluralist/holist Dichotomous Pair involving both physical explanations and formal descriptions, the very last vestiges of real theory were rooted out and replaced completely with mathematical forms, with some apparently “theoretical” trimmings; real, coherent and comprehensive explanations were banned!

Sub Atomic Physics now subscribed fully to the idealist position that Formal laws drove Reality entirely.

What “explanations” there were, became entirely secondary and descended into mere analogies, rather than delivering a real understanding of what was occurring and why. And this descent was thoroughgoing.

Comparing the new offerings with those employed by James Clerk Maxwell makes the distinction clear. He had effectively used an analogistic model, of his own design, of the nature of the supposed Ether filling all of Space, and used it to derive his famous Electromagnetic Equations. But, nothing remotely like this was now evident. Where Maxwell's analogy was constructed in accordance with known phenomena, and succeeded in delivering both those phenomena and his equations, the new analogues were very loose and uninformed placeholders to excuse their now untouchable equations. At best, these analogues were mere "illustrative tales from macro Reality".

Now, some 200 years ago, there was a strong opportunity to rescue Science from its inevitable major crisis. It came in the work of the German idealist philosopher Friedrich Hegel, from his studies into “Thinking about Thought”. For, he realised that human beings were in a seemingly impossible position when it came to actually understanding Reality, for they were not evolved to tackle such questions at all. Physically and mentally, human beings were, and still are, hunter/gatherers, for since that time the usual process of Natural Selection has no longer determined the human race’s abilities.

Via bipedal locomotion, and the new and flexible use of the hand, the brains of hominids had indeed developed. But with homo sapiens, and particularly after the Neolithic Revolution (much too recent to allow significant genetic changes to be Naturally Selected), things changed: future developments had absolutely nothing to do with the selection of genetic changes, and instead were all about socially passed-on knowledge, via teaching and learning. The nature of Man’s thinking had been genetically selected for in a past era, which involved absolutely nothing about the questions being addressed here.

So, all of today’s human beings did not come into the world equipped-and-ready to tackle the questions arising in Science and Philosophy.

They would be forced to “pull themselves up by their own bootlaces” – in other words, to use what they were equipped with to find pragmatic and indirect routes to greater understanding. And this would be by making concepts that were NOT correct, but, nevertheless, contained enough Objective Content to allow effective uses to be achieved in certain achievable conditions. Such conceptions would always and inevitably have finite “shelf-lives” – the time would always arrive when they would completely run out of steam in attempting to solve certain problems.

And, at such points, the incorrect concepts would lead to Pairs of consequent concepts that were completely contradictory. They couldn’t both be true!

They were termed Dichotomous Pairs, and the only way Man could conceive of dealing with them was first to attempt to find which of the two was correct; and when this inevitably failed, as it always did, to use whichever one of the Pair fitted a particular problem effectively, and to switch to the opposite one when it didn’t!

This “pragmatic” solution is still what we do to this day.

But, Hegel went further: he had a method of transcending the impasse presented by such a crisis.
He soon realised that the problem could never be solved by merely comparing the two opposites within the dichotomy. What had to be done (which was by no means easy) was to unearth the key assumptions and even principles common to both concepts in such a Pair.



The task became to undertake a deep critique of these bases and then find better alternatives. Only by this “dialectical” method could such impasses be truly transcended.

Clearly, what Hegel had discovered about Human Thinking, also applied to Science. For, all our theories and derivations had been devised within Human Thinking too. Science wasn’t independent of us, and never could be!

But, though we would always, time after time, hit the buffers, the impasse would be signalled by the arrival of these Dichotomous Pairs, which could indeed be tackled indirectly, by discovering and replacing the things upon which they were ultimately based.

With this achievement, the way was indeed open for the unification of Science involving this valuable new approach.

Why didn’t it happen?

It wasn’t just another trick in the Formal Logic set of reasoning methods. It was based upon an alternative holistic approach, which not only differed fundamentally from centuries of pluralistic methods in Science, but also did not deliver an alternative methodology for the full programme of scientific activities and uses. The task was NEVER undertaken!

Yet, though the pluralist approach had proved very effective in a very wide variety of practical uses, its requirement of the necessary provision of ideal conditions for each and every “law” being used was a very expensive way of doing things. And, its inability to accurately represent totally unfettered (indeed naturally acting) Reality was not only misleading theoretically, but continually led to seemingly insuperable impasses – Pairs of contradictory concepts generated from the very same bases – which brought each and every sequence of consequent steps to its own dead-end. And, as investigations penetrated ever deeper into the very heart of Reality, the system became increasingly useless in delivering any real understanding at all.

Technicians would undoubtedly continue in the old ways, upon their well-trodden and successful paths, but that would no longer suffice for scientists.

A conscious and essential Revolution had to take place, which would be upon literally “all new ground”. A new Holistic Science had to be built, and the universally employed pluralistic methods would in many important areas have to be totally transformed, using an, as yet undelivered, holistic set of methods, and, of course, an explanatory approach to Theory once more.

The direction of investigations will have to be wrested away from those only interested in the exploitation of new discoveries, and back into the hands of those attempting to truly understand Reality.

14 March, 2015

The Origins and Development of the Solar System


If there was a substrate...

On watching a Horizon programme on BBC 2 recently (03/03/15), I found the imponderables about this “assumed-to-be-solved” area of Cosmology increased considerably.

We might have expected a “nice” Newton’s Laws explanation of the system (well within our observational range, and even with some voyages of discovery), presumably commencing with a Cosmic Cloud of a range of particles produced by a preceding Supernova. Nevertheless, various problems seemed to be as yet unsolved literally everywhere.

Indeed, the pluralist stance, which assumes absolutely everything can be explained by an increasingly complex mix of fixed Natural Laws, has been under threat for centuries, and never seems to deliver the fruits of an assumed Reductionism, as it is supposed to do; instead we are to take it on trust that such will be delivered, somehow, “in the future”. But, the sort of things revealed in this programme, from a stance situated at the very heart of the established scientific approach, should have been fairly straightforward, yet were clearly nothing of the kind.

Though the programme writers and makers didn’t highlight it at all, the resounding question had to be about that pluralist stance. If it is incapable of delivering answers in the very area in which Newton and his colleagues originally established the initial bases of Science as we now know it, then it will be absolutely nowhere when it comes to considering the multiple Levels of Reality above this basic case – Life and Consciousness are way beyond such a stance.

So, if by some chance, you are not a pluralist – indeed, the very opposite – namely a holist, then you would not be surprised by these difficulties. You may not be able to explain an alternative Origin and Development, but you would know that such is definitely required.

Before we go any further, let us just make these alternative positions a little clearer.

The pluralist conception puts eternal Natural Laws as the active, producing factors, initially organising the Cosmic Cloud into an inevitable system.

But, of course, such a stance is idealism – for where do such laws come from? These Laws in their universally accepted way of encapsulating them – as formal Equations, cannot possibly be the primary sources of any process of development: they can only be the results of the interactions of physical entities with certain properties.

How can Laws possibly be primary?

Also, as the original pieces were organised into ever bigger aggregates, with new relations between them, the laws not only then came into existence, but would most surely change as things developed.

So this is the alternative holist position:

Now, a basic scenario has been devised (by such holists) for such developments – knitted together from a study of many diverse developments occurring at many different Levels of Reality. And these ideas, paradoxically, occurred very late in Mankind’s own development, and came, originally, out of serious studies entitled “Thinking about Thought” by their initiator Friedrich Hegel around 200 years ago, and thereafter from the study of significant changes in History, first by Michelet and then by Marx.

And, what came out of those investigations was a trajectory, which seemed to be universal in all developments of every possible kind.

Perhaps surprisingly, this was never a matter of incessant, incremental changes – ultimately adding up to new forms – but, on the contrary, of the seemingly strongly enforced maintenance of current forms over long periods of seemingly permanent Stability, in which the overall structure did not change significantly. Yet these long and dominating periods were, nevertheless, interleaved with short interludes of major qualitative changes, termed Revolutions – or, when considered in all possible contexts, Emergences.

Clearly, once such a trajectory was looked at, for developments like that of the Solar System, it became clear that no simple, formal Law of Gravity would be sufficient.

For, such dramatic changes can only be the result of competing factors – balanced in Stability, finally overcoming their constraints and causing a collapse in the old set-up, and the subsequent building of a new balance within a new, different and higher Stability.




Now, the pluralist Laws – the only factors that, so far, Man had been able to call upon – were always, and inevitably, in the form of Abstractions: arrived at from data taken from Reality, but both simplified and idealised, so as to be representable as formal relations (Equations), and brought together, according to Formal Logic, into merely consistent complexities.

Instead of being the assumed “primary drivers” of Reality in development, these were clean, man-modified versions taken from particular and conducively-designed contexts.

In addition, the ground for the usual cosmological considerations was originally totally Empty Space (which is, of course, also a man-made construct).

Yet, to have totally Empty Space, as a universal ground required an origin too. And, when literally everything else is matter obeying its own Laws, then we have another series of Problems.

Where does the matter come from?

What initial form did it take?

Also, it becomes obvious that the concept of Empty Space itself is another of Mankind’s simplifications and idealisations. By making space devoid of all matter, it just became a stage upon which absolutely everything could happen – a static, inactive reference system, against which everything else could be measured.

But, the alternative, of a Space full of stuff, is just as likely – at least it gives us something to watch changing and developing in itself!

And, indeed, when Man began to study Reality, he soon filled Space with what he called a medium – a continuous, elastic and invisible substrate, which could effectively explain many things – such as the propagation of light and heat across the seeming void, which most definitely occurred.

Clearly, we cannot readily disentangle real, physical Reality from our always-inadequate conceptions of it.

For, Man didn’t arrive both ready-made and adequately equipped to arrive, merely by thinking, at the truth of any aspect of Reality. On the contrary, he too emerged from lesser beings. So all, yes ALL, of his conceptions about Reality would be limited by his own current state of development. The Laws he found, at the same time as reflecting Reality-as-is, would also be limited by the current state of development of Man himself. How could it possibly be otherwise?



Even our assumptions about the past will never be totally objective. Indeed, Man will always both simplify and idealise whatever he studies, to have any chance of even beginning to understand it.

And turning these abstractions into Natural Laws, entirely independent of Man, has to be erroneous.

Yet, at the same time, we cannot merely dismiss Man’s abstractions as simply wrong, for that also isn’t true. Clearly, there is Objective Content – aspects or parts of the truth, in his conceptions, and that is why they can be used both effectively and reliably in certain defined situations.

So, how do we characterise such conceptions, and plot a path in which these are brought, ever closer, to the Truth? The original answer to this question, by the ancient Greeks, was an example of the tail wagging the dog. For, Man established “Truth” by the consistency of his abstractions overall.

He was able to do it by assigning NO significant changes to things generally, and this allowed the formulation of the system termed Formal Logic to become established as the means of testing and even of developing conceptions.

Now, because of the ever-present Stability of Reality, the basic assumptions were approximately true, so the foundation stone of Formal Logic – A = A, the Identity Relation – could be assumed as the banker premise within normal situations. Everyone now knows, and even knew then, that some things definitely changed; but such changes were seen as insignificant at worst, and merely incrementally significant at best, and so, in most cases of productive use, were basically a nuisance. So, a methodology was designed of “keeping things still” while studying them, so that the eternal Natural Laws would emerge, un-blurred by unimportant variabilities, to reveal what really mattered.

This approach became known as Science, and its use in carefully controlled Domains, by Man, became what we now think of as Technology.



Now, I must ask the reader to forgive this extremely cursory glimpse at Thought and Truth, but it had to be included here to begin to disentangle our unavoidable abstractions from our actual objective – Reality-as-is. And, hopefully, it will serve us too in the present context, as we attempt to work out the actual Origin and Development of the Solar System.

Let me indulge in one more essential diversion to reveal the dangers inherent in our lauded methods.

In Physics, with its original assumption of a medium filling Space (Aether theory), no trace of that medium could ever be found, so the concept of it, after much discussion, was finally, and supposedly irrevocably, dumped!

Space was back to being totally empty once again.

But, it didn’t help!

The problems began to proliferate, especially after the Discovery of the Quantum. For, this encapsulation of electromagnetic energy into disembodied gobbets was clearly incompatible with a continuous medium of any kind.
But, it left a gaping hole in the required physical explanations of a whole raft of phenomena.

And, in addition, the, soon to become infamous, Double Slit Experiments also pushed the crisis to the limit! For, the results from these experiments seemed to simultaneously allot two totally contradictory properties to the key entities involved.

Sometimes, they acted as particles, while at other times like waves! Wave/Particle Duality was born, and surprisingly accepted as “The Truth”!

Yet, when this theorist included a certain kind of substrate, occupying all the spaces in this set up, ALL the anomalies just vanished!

Now, if the principle of an increase in Objective Content is true, and a real measure of the closeness to Truth (as mentioned earlier), this means that the new ideas must replace those of Wave/Particle Duality – the so-called Copenhagen conception – because they deliver more Objective Content!

Now, it was this researcher who came up with the new theory for the Double Slit Experiments; and, presented with the many imponderables in current ideas of the Solar System, it seemed worthwhile to reassess that conception by bringing in the same universal substrate as had proved so effective in the Double Slit problem.

Now, all the advantages of a totally empty void would be gone, and a wholly new set of problems and solutions would be unavoidable.

The gulf between a totally Empty Void, and that filled completely with an active substrate is, of course, enormous.

For example, aggregations would have to occur within such a substrate, and any movement of the resultant bodies would also have to plough through this substrate like a ship through water. The effects of such disturbances upon propagation would have to be determined, and, thereafter, some means of the re-establishment of the normal conditions would also need to be explained.

The analogy with oceans may well be relevant, for they too propagate waves in spite of being seemingly messed up by the passage of ships and whales.

The complex state of any substrate, and, in particular, the forces of re-establishment following disturbing passages through it, would have to be established and the involved phenomena explained.

It, as an assumed initial state, would certainly play some sort of role in aggregation – not least in propagating whatever causes such, for without a substrate, even Gravity becomes yet another case of the fabled Action-at-a-Distance once more.

To set up the required mindset for addressing all the usual problems of a universal substrate, we must commence by considering all the indisputable properties which will have to be delivered by such a filling of Space, in absolutely everything that we know occurs within it.

The analogy with ships in a sea gives us a starting point, but, certainly, cannot be taken too far, for though single entity transits may be similar, whole avalanches of particles would certainly not be the same.

For, as was assumed in the approach to the Double Slit phenomena, a single, moving particle (an electron, for example) would continually be disturbing the substrate – causing continuous propagations of such disturbances. So, with a veritable torrent of such particles, these disturbances would be both repeatedly initiated and racing well ahead of their causes, in a resulting stream of disturbances within the substrate.

This is because every disturbance moves ahead of its cause (at the Speed of Light), while the causing particle itself moves much more slowly.

Thus each and every particle, on interacting with the substrate, will project ahead of itself a strong beam of disturbances, well ahead of what actually caused them.
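The simple kinematics behind this (a sketch, assuming a constant particle speed v): after a time t the earliest disturbance has travelled ct, while the particle has travelled only vt, so the front of the projected beam leads its cause by

$$ \Delta(t) = (c - v)\,t, \qquad v < c. $$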

And, perhaps, even more significant, absolutely all previously considered Actions-at-a-Distance, from Gravity to electrical and magnetic propagations and fields will also have to be fully explained in terms of this substrate too.

NOTE: At this point I must relate that several other independent researchers into the possibility of a substrate, coming to the problem from very different specialist areas, have naturally each concentrated upon features that arise in their own discipline, and have each come up with very different candidates as the units comprising such a substrate.

An electrical engineer was concerned with the subtending of magnetic fields within such a substrate, and defined his suggested unit accordingly. Another, perplexed by quantization, used a liquid as his model of the substrate, and concentrated in all his investigations on both Resonances of oscillations and Recursive effects to produce quantized orbits (which, by the way, he succeeded in doing). And an American scientist, preoccupied with Gravity, defined his units in such a way as to deliver exactly that gravitational effect.

And, of course, lastly we have the writer of this paper, who, in tackling the Double Slit Experiments, ended up with a joint particle as the basic unit, with its internal sub-particles mutually orbiting one another. His objectives were to produce units of what he calls “a Paving”, which would be undetectable due to the opposite properties of its component parts, and to produce such a structure as to be able to hold and pass on quanta of electromagnetic energy, via the promotion and demotion of those internal orbits.

Now, such diversity is unavoidable.

For, the problems each individual researcher is addressing are all undisputedly real. So all these lines of study are revealing Objective Content. The problem is how can they be integrated into a comprehensive, explaining-all theory?

The current conclusion of this researcher is that the substrate must be heterogeneous – that is, it contains several different units.

After all, criterion-number-one is that they are all undetectable, and that means not only to the usual means of detection, but also to each other – they could indeed co-exist.

And, what inter-reactions they do have are considered to only occur in very close proximity to one another – being very local indeed.

Now, the original suggestion, of this theorist, was that it was composed of a single kind of entity, but evidence from other researchers and theorists makes that assertion unsustainable.

And this relative independence of one another will cause them to act as if they alone inhabit Space. Each type may well inter-react with others of the same kind via different overall structures. 



It would be a mistake to assume that all of these different units act in an identical way like different elements in a gas. Just as in Maxwell’s remarkable model of the ether, it seems likely that different structures will be involved.

Remember, it was his model that enabled Maxwell to develop his still-used equations of Electromagnetism, so something of his model must certainly have been a reasonable assumption.
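For the record, the equations that emerged from that model are, in their modern differential form:

$$ \nabla \cdot \mathbf{E} = \frac{\rho}{\varepsilon_0}, \qquad \nabla \cdot \mathbf{B} = 0, \qquad \nabla \times \mathbf{E} = -\frac{\partial \mathbf{B}}{\partial t}, \qquad \nabla \times \mathbf{B} = \mu_0 \mathbf{J} + \mu_0 \varepsilon_0 \frac{\partial \mathbf{E}}{\partial t}. $$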

I have included his diagram from Margaret Morrison’s paper on the subject.

Various features of this model are relevant to today’s concerns. Crucially, he has reciprocal effects between his two involved entities. Dropping Maxwell’s model entirely along with the concept of the Ether was a major error, and it stems from the pluralist conception of Reality driven by Natural Laws. The alternative holist view does not see the gains of Science as Absolute Truths, but as analogistic models with a measure of Objective Content, so that all gains are partial and temporary. The best we can do is to strive to increase the amount of Objective Content in our constructions.

So, in our current problem, concerning the Origin and Development of the Solar System, we will have to solve, on a cosmic scale, what is also currently being addressed upon a Sub Atomic scale.

And, though very far apart in scale, they will still be related, as the substrate involved is common to both realms, but with different priorities to be solved.

On the Cosmic scale, Gravity and the general problem of Actions-at-a-Distance will be the dominant considerations. While at the Sub Atomic scale, it has to be the problem of quantization.

But, in an important way, the universality of forms such as orbiting, occurring at both levels, though with very different causes, again proves that Forms are secondary, and can never be causes in themselves.

The dominance of equations has to be a failed diversion from the real causes of phenomena, whatever the level.

Sound Fountains

13 March, 2015

Inflation in Capitalism


In capitalist societies, the phenomenon of Inflation is unavoidable; indeed, it turns out to be actually essential for the maintenance of the economic system. This is due to the invaluable roles it plays within such a system, based entirely upon borrowed capital to finance the establishment and development of the companies that carry out literally all its transactions.

So, when a loan (or investment) is required, it will usually involve two commitments. Taking a loan will involve interest, to be paid on the as-yet outstanding parts of that loan. And, secondly, the paying back of the borrowed Capital will also be required, usually within a given time period.

NOTE: Investments are somewhat different as the money invested is not paid back, but can be retrieved by selling the investment on the Stock Exchange (maybe at a profit), or if the firm collapses, most, if not all of it, could be lost.

Naturally, with such a risk, the annual dividends paid would therefore be higher.

Now, such pressures upon the system, where the whole development is based upon borrowed money, with additional charges unavoidable, necessarily causes Inflation – the value of the currency is always being devalued to some extent. But, let us consider the effect of such inflation upon the value of the sum borrowed. If every year the value of the currency is decreased by inflation, it means that what is paid back off the borrowed sum will be of less value than it was when borrowed. So, over extended periods paying off the capital becomes less and less in current values, and hence costs the borrower less in real current values to redeem it.

Now, no lender will be happy with such a decline in the money they have lent, so they reduce those effects by limiting the period of time that the loan, or any part of it can be outstanding. They also change the yearly interest charged to offset the decline in value of their loan.

Clearly, the lender must ensure that the terms of the loan, including the rate of interest, balance the books, and guarantee an acceptable profit upon the transaction. So, clearly, that rate of interest will not only exceed the rate of inflation, but will also actually contribute to that rate of inflation itself.
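A worked example makes the mechanism plain (the 5% inflation rate and the £1,000 repayment are invented figures): a fixed repayment made n years into a period of steady inflation is worth only payment/(1 + rate)ⁿ in the values of the day the money was lent.

```python
# Illustrative arithmetic only: the real (original-value) cost of a
# fixed repayment made n years into a period of steady inflation.
# The 5% rate and the 1000-unit repayment are invented figures.
def real_value(payment, inflation_rate, years):
    return payment / (1 + inflation_rate) ** years

print(real_value(1000, 0.05, 10))  # ~613.91: a 1000 repayment in year 10
                                   # costs only ~614 in year-0 money
```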

Also, it is always possible that the borrower will default, and the lender will lose some if not all of his loan. So, if there is any doubt about the borrower, the appended yearly interest rate will be upped considerably to protect the lender.

We should also see how bad it is for a borrower in difficulties, for the more problems he encounters, the more interest he will be charged upon subsequent loans, so such a business can be pushed into bankruptcy, while a dominant and flourishing business will be able to borrow money ever more cheaply.

You can certainly see why slumps occur!

Let us now look at all this from the other end – that of a worker in a company. Let us see how inflation affects the wages of such an employee.

Given a fixed wage, inflation will reduce its buying power. It will have the same effect as a cut in wages – hitting the earner, while helping his boss! So, employers borrowing money to finance their business get a “double help” from inflation. It decreases their loans, while also decreasing the real wages of their employees.

Clearly, a worker wanting to maintain the value of his or her wages will require regular wage rises, at least in line with inflation, and often above inflation, to increase their cut of the process in which they are involved. The current slump, 2008-2015 (so far), has seen no real increase in wages, while inflation has been regularly present over almost seven years. The wages of workers have therefore been regularly cut, while their employers have seen their own owings decline throughout. It is crystal clear who is being made to pay for the crisis, even though it was certainly nothing to do with those workers.
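In equally invented figures: a wage w frozen for seven years under a steady 3% inflation retains only

$$ \frac{w}{(1.03)^{7}} \approx 0.81\,w $$

of its original buying power – an effective cut of roughly a fifth, exactly as described above.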



In addition, during that same period, literally millions of reasonably paid full-time jobs have been terminated – mostly in the public sector – and replaced by private sector jobs at lower wages, often involving either part-time working or even zero hours contracts.

The whole of the slump is being reversed by getting the necessary wherewithal out of the workers.

But, this has only been possible in the more powerful capitalist states. The weaker ones (as explained above on loans) have been pushed more and more to the wall, by debts and ever-higher interest rates.

The points made here about loans, inflation and interest rates mean that those in the deepest holes are forced to pay the highest rates for their necessary loans, and the local bourgeoisie implement the cuts in jobs and wages to try to foot the ever-expanding bills. The interest rates being charged to working class families in financial difficulties have now reached phenomenal proportions on the so-called “pay day loans”, and along with fines for late payment, and even bigger hikes in interest rates, those families are being impoverished.

Inflation is a necessary cog in the mechanism that is Capitalism, and it is also a cause of such things as slumps – and the means to extract the owners from those slumps, on the backs of the workers.

12 March, 2015

Quantitative Easing – Printing Money


How long can they get away with it?

Recently, on this blog, we have been revealing the essential role of inflation in capitalism, and how remunerative it is to business, both in borrowing money and in paying wages. But, that general principle exists within a context of production. So, something useable and buyable gets produced – value is added to primary resources to deliver more valuable products. So, how does Quantitative Easing fit into the picture?

For Quantitative Easing is merely another name for printing money! And, doing that with absolutely NO real capital to back it: absolutely no existing or created value is involved. It has merely been invented out of nothing. No one lent previously acquired value to finance things. It is just printed out of paper and ink, and expected to be usable as if it were a holder of real value.

So, what was the intention of those who did it, in the midst of a capitalist crisis?

In the 1930s, Public Works were the suggested answer, for though the wages involved were just printed paper, they put unemployed workers back to work, and produced something of value in the process.

But, that is not the case with Quantitative Easing!

Now, merely printing money and lending it out at very low interest, as governments are actually doing, would be expected simply to debase the currency. But how would that normally show itself? In a global slump, such as the one they are still attempting to extricate themselves from, the relative values of the currencies will not immediately and generally reflect such devaluing actions. Individual countries will exploit any temporary gains, at the expense of other countries, and the Money Markets will deliver confusing indicators of what is going on.

But, the rival nations won’t sit back bemoaning how they are being treated: they will retaliate.

Indeed, having initially refused to follow the Americans and the British, the European Union has now decided to do likewise. Quantitative Easing is being tried by more and more countries.

The eventual results will not need a genius to predict! A return to real values will cause an avalanche of devaluations. The essential "confidence" in the power of the dominant capitalists will inevitably evaporate!

In the past there has always been another area of the world to exploit, to re-steady the boat, but where are they to turn now?

This crisis occurred when they finally decided to try their usual tricks, but this time directly upon the working classes, via bound-to-fail mortgages to the poor, and then the deceitful repackaging and selling of these, worldwide, as lucrative investments. They expected to reclaim the properties, many times over, as the short-term owners failed to fulfil their financial commitments; but when they tried to foreclose, they found they didn't have the property they had counted on: it had been wrecked! The bottom fell out of the whole scheme, and the worldwide slump of 2008 ensued.


So, who was left to screw? There simply wasn’t anybody! So, a more subtle means had to be both devised and employed.

One effective trick was to keep wages below inflation for FIVE years, which made ordinary workers everywhere pay significantly to help capitalism out of its crisis.

Also, by sacking vast numbers of public service workers, and then hiring them back into privately owned firms – on zero-hours contracts and part-time jobs, or even encouraged into "black economy" self-employment – the lie that the working class was being safeguarded was erected.

But such tricks cannot be said to have worked globally. Indeed, we have been forced back into the rivalry between capitalist states, each seeking national advantage over the others. The anti-Russian clamour is part of this (remember, Russia is now back in the capitalist fold, but ploughing its own furrow in opposition to the old dominators of world capitalism – especially, of course, the USA).

Also, the cheap-labour advantages of China and other capitalist or neo-capitalist economies must wane with the demands for appropriate wages in those countries. So the globalisation trick of getting your manufacturing done where labour is cheap cannot be maintained indefinitely – especially as it has also been necessary to impoverish and de-skill the western working classes, sometimes producing echoes of the past, with sub-classes emerging even in countries like the USA and the UK.

And the importation of ex-colonial and ex-communist-bloc workers to the West, as a cheap-labour alternative to the indigenous working classes, has led to increasing hostility among the native unemployed, with the immigrants consequently being blamed for conditions in the metropolitan countries. The radical Islamist movement, both in the ex-colonial countries and in immigrant populations, is a symptom of the failure of this ploy.

And, there has been another crucial problem.

Middle-class investors are getting little or no interest on their savings, and small or even no dividends from their investments, so a major source of investment capital is gradually drying up.

Another way of re-building a source for such investments is by diverting as much profit as possible to the already very rich. So, at the time when the poor are getting poorer, the rich must be made richer, or they too will just sit upon their cash.

Never has the gulf between rich and poor been this big – the richest 1% of the population now own as much as the other 99%.


But, the dynamics involved in all this are not planned. These measures are resorted to, in various ways, to constantly re-steady the foundering boat. But, nothing seems to be solving the crisis!

Take the Eurozone, the UK's biggest market: it is teetering on the edge of a new deflationary recession once again, and the right-wing Tories and UKIP want to blame the slump upon it, and hence want to leave the European Union (while keeping all the trading privileges, of course).

The methods now being used are getting exceptionally desperate.

The economic blockade of Russian capitalism has been stepped up to enormous proportions, made possible by Russia's dependence upon selling Gas and Oil to other countries. But, on top of this, the price of oil has tumbled, due to US fracking and Saudi Arabian full-throttle selling, and these, along with the World Crisis, have left more oil on the market than anyone needs to buy.

You can see that this is still a major crisis, with no end in sight.

So, what will Quantitative Easing do in the long term? The Americans have got away with it because of their dominance in world trade, while the UK also managed to employ it, because of its dominance in world finance.

But now the Europeans are about to try it too.

If it becomes universally applied, any advantages so far seen will vanish for everybody.

Printing money on a global scale can only have one outcome. The value of all currencies will tumble – shown not in exchange rates, but in roaring inflation.
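A crude way to see why, assuming the classical quantity-of-money relation (money stock × velocity = price level × real output – an assumption brought in here for illustration, not an argument made above), with invented figures:

    # Illustrative only: if the money stock doubles while real output and
    # the velocity of circulation stay fixed, the price level must double.
    money = 1.0e12      # assumed initial money stock
    velocity = 2.0      # assumed constant
    output = 1.0e12     # assumed constant real output

    price_level = money * velocity / output
    print(f"Price level before printing: {price_level:.1f}")

    money *= 2          # universal Quantitative Easing
    price_level = money * velocity / output
    print(f"Price level after printing:  {price_level:.1f}")

    # With every major currency being printed at once, no exchange rate
    # moves – the debasement appears instead as a general rise in prices.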

The crisis will get worse.

07 March, 2015

Pure and Applied?


Philosophically, Mathematics has shown itself to be a very unusual discipline, in that it seems to be investigating Forms found in Reality, but is, in fact, more to do with how we humans conceive of, extract and handle such Forms in our thinking.

Historically, the recognition of such Forms in Reality-as-is was never a straightforward discovery – they were not directly extractable as such from that source. For, to even recognise what he was seeing in that complex and varying context, Man had to both simplify and idealise what he glimpsed into something fixed and intelligible. The Forms arrived at were not what was actually seen, for what was seen was invariably imperfect and often transitory; so observers felt they were seeking an underlying perfection beneath the really existing complexity, and, with experience and over time, they became very adept at the processes needed to reveal perfected and unchanging patterns.

As already mentioned, though the investigators saw themselves as unearthing these Forms, they were always both simplifying and idealising them into "ideal versions", and, thereafter, studying those instead.

The assumed justification for such processing was that these Pure Forms did seem to exist "out there", and the processes of extraction were considered mere "tidying up" – releasing the crucial Forms from a natural context of confusing and inessential "noise", caused by a complexity of other, diverse causes, which could be removed to reveal the "real determining heart" of a situation. And, having extracted these causal essences, they could be investigated and their important intrinsic properties revealed.

It was, historically, the very beginning of Science, even if the actual causality had been inverted, and effects labelled as causes!

Yet surprisingly, many of these initial conceptions have been, at least partially, retained ever since!

And, again surprisingly, this discipline was the very first that Mankind was able to construct into what seemed to be a self-consistent set of relations and rules underlying Reality. And this was so universally taken up by those involved in such things, that, from the outset, these Forms were given Causal Attributes – real things were seen as behaving as they did because they were obeying the eternal Laws of these Forms.

Hence, these first steps were clearly idealist, and not materialist, in the march of the human intellect.

NOTE: It ought to be mentioned at this stage that the almighty retreat in 20th century Sub Atomic Physics, embodied in the infamous Copenhagen Interpretation of Quantum Theory, was merely a retrenchment to this ancient idealist stance.

Henceforth, in that area, Form was deemed to be the cause of ALL phenomena, and Equations replaced physical explanations almost completely.

So, in Ancient Greece, truly remarkable strides were made, particularly in Geometry, which was immediately available for investigation via drawing, and this amazing development culminated in what we now call Euclidean Geometry – which is still taught as a cornerstone of Mathematics worldwide.



But, it did have major drawbacks! And these lie not only in its idealist stance, but also in its major simplification of imposing eternality upon all its contributing Forms. They were fixed! And the implications of this, along with the assumption that these Forms were also the causes of phenomena, were deleterious for the real study of Reality.

Clearly, the extrapolation of this assumption onto many other, non-mathematical ideas was inevitable, as well as profoundly mistaken. For though Forms could be "found" in real situations, they were never the determining causes of those situations, and as the real determinants changed, so did the evident Forms.

Forms, as such, were permanent (that is as Formal abstractions), but their real existence was always temporary and never causal!

The consequences were extremely damaging: most things were dealt with as unchanging, and the basic tenet of Formal Logic (itself an intellectual product of this development) – that is, A = A, the Identity Relation – cast in stone the unchangeability of the ideas and elements involved in this major extension of what had been learned in Mathematics.

NOTE: That this is still around, and propagated, was revealed in a book entitled A Certain Ambiguity by Gaurav Suri and Hartosh Singh Bal, published only a few years ago (2010 - Ed).

But after, maybe, 2,000 years of Greek Science based upon such a position, a breach was made in the then towering edifice of that "science-based-upon-Logic", replacing it with one based upon careful observation of concrete Reality. Finally, a new approach was developed, which was, in fact, materialist Science.

Perhaps surprisingly, the new approach prospered in tandem with a rejuvenated Mathematics, because Science was now based upon quantitative measurements, and hence was regularly delivering dependable data sets, clearly revealing Forms to be extracted and formulated into useable Laws. And, of course, the "ideal experts" for doing this were the mathematicians, who by this juncture had amassed a truly remarkable number of Forms "to fit all possible patterns".

An extremely fruitful cooperation developed between idealist mathematicians and materialist scientists, and sometime, somewhere, something was bound to give! A unifying concept arose: that of Natural Law. When experimental data was turned into an Equation by the mathematicians, it was agreed by both parties to be a causal and eternal Law – the scientists had, in fact, succumbed to the idealist position of the mathematicians.

But, this purely pragmatic compromise was never a solution. Indeed, it was yet another example of a Dichotomous Pair of contradictory conceptions, which couldn’t both be true!

Yet, without a transcendence of the ever-evident theoretical impasse, the two groups pragmatically “agreed to differ” (at least partially), and both stances were kept – using one rather than the other, when it was clearly productive to do so.

And, though the idealism of Mathematics affected Science, the materialism of Science also affected Mathematics, and the result of this was the wholly different discipline of Technology – the application of scientific discoveries in advantageous inventions and devices.


These technologists were not interested in Theory, and they were also not enamoured of Pure Mathematics either.

They required a kind of Mathematics that served their purposes – they claimed Applied Mathematics as their own vital Toolkit, and indeed, regularly invented new tricks to facilitate their objectives, whether or not these conformed to either theoretical stance. They were completely pragmatic and nothing more.

Interestingly, the three, closely-related disciplines, not only went their own ways, but on quite different philosophical bases.

Science was primarily materialist.

Mathematics was entirely idealist (termed Pure Mathematics).

Technology stuck to Applied Mathematics, but was entirely pragmatic – “if it works, it is right!”

New Special Issue: The Unknown Ocean II


The second part of our Unknown Ocean series, and the 32nd Special Issue of the SHAPE Journal, continues our journey deeper into the uncharted depths of reality, questioning the impasses and anomalies proliferating in all areas of Modern Science today, and endeavouring to construct a new and sounder basis for our explorations of the world around us.

06 March, 2015

Why we must save the NHS!


Michael Sheen speaks on protecting the NHS - Bedwellty Park