11 September, 2015

Review: Quantum Weirdness is Reality

Jackson Pollock

A Quantum Tangle

What Bell's Inequalities are actually about is not what is claimed. They concern formal descriptions of reality, not the material world itself.

In so-called Quantum Entanglement, the assertion is that measuring one of an entangled pair of particles influences the quantum state of the other, even if the two are a million light years apart, and that it does so literally instantaneously (at any rate, obviously much faster than the speed of light).

For any ordinary mortals reading this, I must point out that the believers in this "magic" are subatomic physicists - and this kind of drivel is pretty well par-for-the-course in those quarters.

However, when it comes to actually "confirming" this phenomenon, they must measure one entity and then the other, or even both simultaneously, to prove their case. My concern is, "How do the experimenters know a change has been made to the other one of the pair?" For, if you measured the first, it would immediately influence the other member of the pair. Clearly there is a problem here.



Quantum weirdness proved real in first loophole-free experiment by Jacob Aron, New Scientist (3037)

Do they regularly measure them alternately or simultaneously to attempt to establish a pattern? Questions arise even for those who support the theory. How could you ever know what the undisturbed state of either one was? 

You can't of course! So what do the "believers" say? 

They insist that, prior to measurement, both particles are simultaneously in "all possible states at once", and that actually measuring one of them forces it into a particular one of those possible states.

Such ideas recur throughout this theoretical stance: it is the basic myth of superposition once again! This concept states that a particle (before measurement) is simultaneously in all possible positions (like a wave), but with a fixed probability of being in each and every one. And, this remains the case until we measure its position, and by doing so, fix it into a single possible position.

Ryoji Ikeda

Now, though this is totally counter-intuitive (and most probably wrong), it does allow statistics to be used over a number of cases, and the statistically arrived-at answers do indeed match certain observations in reality. 

The mathematicians make it all work by taking a Wave Equation and associating a value with every possible point on the wave, each of which is then interpreted as the probability that the particle is at that position.
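To see what that recipe amounts to in practice, here is a minimal sketch (my own illustration in Python, not anything taken from the article under review): a wavefunction is sampled on a grid of positions, and the square of its magnitude at each point is treated as the probability of finding the particle there - something that only ever makes statistical sense over many repeated measurements. The Gaussian wave-packet used is an arbitrary illustrative choice.

```python
import numpy as np

# Minimal sketch of the Born-rule recipe described above (my own illustration):
# sample a wavefunction on a grid and turn it into per-position probabilities.

x = np.linspace(-10.0, 10.0, 1001)          # positions on a 1-D grid
dx = x[1] - x[0]

# An arbitrary illustrative wavefunction: a Gaussian wave-packet.
psi = np.exp(-x**2 / 4.0) * np.exp(1j * 1.5 * x)

# Born rule: probability density is |psi|^2, normalised over the grid.
density = np.abs(psi)**2
density /= density.sum() * dx

# Probability of finding the particle in each small interval dx.
prob_per_cell = density * dx

# A single particle's position is then treated purely statistically:
# only an ensemble of many "measurements" reproduces this distribution.
samples = np.random.choice(x, size=100000, p=prob_per_cell / prob_per_cell.sum())
print("mean position over many measurements:", samples.mean())
```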



Notice that this method cannot deal with the position of a single particle, but can give overall estimates of a whole group!

As a physicist myself (and one who was originally a mathematician), I have a name for such methods - I call them frigs! They are mere tricks. Such techniques are often used in Mathematics as clever ways of arriving at hard-to-find solutions to purely abstract equations. 

So maybe you can see how they happened.

With this in mind we return to Quantum Entanglement. In this totally counter-intuitive standpoint, the before-measurement particle is described as existing only "as a fuzzy cloud of probabilities of all its possible states." And this is how they avoid the otherwise necessary infinite regress! Instead of an oscillation with each and every measurement, we are expected to believe that, before measurement, such quantum entities are not in any particular state at all, but that when measured an entity will suddenly be in a specific state, and its remote entangled partner will somehow be affected by this intervention too!

In other words, more generally, we can conceive of such things as particles, yet nevertheless treat something as particulate as a particle's position as a quantum property, as if it were controlled by a wave. The trick involved, for it can be nothing else, is to represent every possible position on a wave by the probability of the particle being there. And this is, of course, complete nonsense, both in the way it is presented and in the way it is used by these scientists.

Unless, that is, you consider there to be an actual substrate, filling all of space, which is both affected by, and can in turn itself affect, the enclosed particle.

In previous work undertaken by this researcher, all the various anomalies of the infamous Double Slit experiments were completely explained away by the assumption of the presence of such a universal substrate - at the time called Empty Photons.



The idea of a substrate was far from a new supposition: it had, at one time, been the consensus view. But a substrate was never detected, so the prior theoretical idea, known as The Ether, was permanently dumped as insupportable - despite the fact that James Clerk Maxwell, using his theoretical model of The Ether, derived a correct set of Electromagnetic Equations which are still used to this day.

Clearly, all the points made here must be addressed. In fact, this theorist suggests that the whole myth of superposition and multiple simultaneous states, was invented to explain the results of things such as Quantum Entanglement.

Now, the reader might wonder how scientists could be so influenced: for it runs counter to the basic materialist conceptions that are the key premises of Science. The actual reason for this is clear. They have abandoned Physical Explanation for Purely Formal Description. They are no longer physicists, for they have rejected the physical world - they are merely mathematicians!

Einstein's dismissal of Quantum Entanglement is encapsulated perfectly in his phrase:

"The Universe is real - observing it doesn't bring it into existence by crystallising vague probabilities"
For such are most certainly idealistic notions.

There can, however, without recourse to idealism, exist a hidden universal substrate with wave-like characteristics, and a sort of symbiotic relation between that substrate and physical particles moving through it.

It is the impossibility of answering my question about the measurement of "entangled particles" that precipitates this monstrosity of a theory! The counter to that position by de Broglie, and later by David Bohm, involving so-called "hidden variables", did not solve it, as these features were never found, no matter how detailed the study of the particles involved became.



What was really needed to explain the subatomic world was a separate substrate, involving a reciprocal and recursive relationship between a particle and its context. For then, and only then, can there be a passage of time between the initial influence and the subsequent recursive effect. The assumption of an intrinsic "Pilot Wave" meant simultaneous effects, but the role of a substrate as intermediary allowed this crucial delay.

It is the formal, and even stilted nature of the mathematical physicists' thinking, that draws them inexorably towards the Copenhagen myths, and unfortunately away from reality. 

Niels Bohr's insistence that the Quantum States "explained" things that classical physics could not was false in the first part, while true in the latter condemnation. In fact neither approach could explain our observations. Bohr's position was descriptive of certain forms, but not in the least bit explanatory. Forms do not explain! They can describe reality, but they don't even do that perfectly. All equations are descriptions of idealised forms, they are not even accurate descriptions of any natural part of reality, they are always approximations, simplifications. Those forms can then only be applied back on to areas of reality that we have carefully prepared, or farmed into experimental domains. Here lies the role of technology in all our investigations. The form's validity is then confirmed by successful use in these domains. 

The battle between the two standpoints embedded in Science was never resolved, because both sides of the argument subscribed to the same belief - that equations represent reality as it is - an obvious fallacy when you stop to think about it. Both the classicists (such as Einstein and de Broglie) and the new school of mathematical physicists (Bohr, Heisenberg et al) were completely wedded to form.


Even Einstein's Relativity and Space-Time Continuum were crucially formal ideas.

So, with a small section of physicists still refusing to embrace the Copenhagen Interpretation of Quantum Theory, these remnants (in the 1960s, after 30 years of argument) required a final means of settling the dispute. And for the Copenhageners, John Bell's suggestion was a godsend.

But he did this using only the purest basic forms of Mathematics, to which both sides mistakenly subscribed: Bell used Set Theory, and its embodiment in Venn Diagrams, to "do it".

Now here had to be the most inappropriate "proof" concerning anything in concrete reality, for it only dealt in idealistic laws, and this was to prove what reality really was, and to do it by this means alone!

Bell used this method to construct a set of inequalities which could be clearly demonstrated in Venn diagrams, and as such he had to be handling fixed things: no qualitative modifications or evolution of those forms could take place, as such things are impossible by those means. It would be accurate to describe such a basis as comprising the premises of Mathematics and Formal Logic only.






Bell used these as a basis for tests about reality. He used his Inequalities to set limits, and if in concrete reality they were clearly exceeded, then the claims of the opposing realists were "proved to be wrong", and Quantum Entanglement was proved correct.
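For readers who want to see what "setting limits" means concretely, here is a minimal sketch (my own illustration, in Python, of the commonly tested CHSH form of the Inequalities - not anything from Bell's own paper or from the Delft experiment): any local, fixed-properties model is bounded by |S| <= 2, whereas the quantum-mechanical prediction for an entangled pair, E(a,b) = -cos(a-b), reaches about 2.83 at suitably chosen detector angles, and it is precisely this excess that the experiments report.

```python
import numpy as np

# Sketch of the CHSH form of Bell's Inequalities (my own illustration).
# For a local "fixed-properties" model,
#   S = E(a,b) - E(a,b') + E(a',b) + E(a',b')
# is bounded: |S| <= 2. Quantum mechanics predicts E(a,b) = -cos(a - b)
# for the singlet state, which exceeds that bound.

def E_quantum(a, b):
    """Quantum-mechanical correlation for the singlet state."""
    return -np.cos(a - b)

# Standard angle choices that maximise the quantum value.
a, a_prime = 0.0, np.pi / 2
b, b_prime = np.pi / 4, 3 * np.pi / 4

S = (E_quantum(a, b) - E_quantum(a, b_prime)
     + E_quantum(a_prime, b) + E_quantum(a_prime, b_prime))

print("CHSH value predicted by quantum mechanics:", abs(S))   # ~2.828
print("Bound on S for any local fixed-properties model: 2")
```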

Many will have noticed that it was a proof which only really convinced the faithful! This kind of "proof" was reality to them: it was their everyday modus operandi. But this gave the Copenhageners the result they required. The vast majority of physicists now claimed Quantum Mechanics victorious, and Realism finally defeated.

Bell had generated his required test using Formal Mathematics, and, as that was the "Essence of Reality", it simply had to deliver a valid and correct test. But the actual conclusion of this method should be: no, you cannot prove the nature of concrete reality solely by resorting to Formal Logic. Only other forms are provable solely by Mathematics, and only phenomena consistent with Formal Logic are provable by the methods of Formal Logic! Nothing else is possible in either case.

Nevertheless, though all experiments seemed to support the idea that Bell's Inequalities proved the Quantum Mechanical position to be true, the fact that it wasn't correct simply refused to go away. However, the recent Dutch experiment reported in New Scientist was supposed to settle the dispute forever...

The test was run over many productions of entangled pairs, and it was the overall statistics of the many runs that delivered the required answer to Bell's Inequalities.

So, what had actually been achieved? 

It was his formalisms that were proved correct! 

He had suggested a test for his Formal reasoning, not for any feature of concrete reality. 

Lots of so-called "loopholes" - all put forward by scientists who actually agreed with their opponents on the mathematics involved - turned out to be not only wrong, but entirely inapplicable. And as they came from the same camp in their attitude to the primacy of form, proving things via these loopholes was inappropriate anyway. Closing them merely corrected formal mistakes - absolutely nothing to do with concrete reality at all! All the Dutch group achieved was the defeat of their opponents on Formal Reasoning only.

Hanson lab at Delft University of Technology
However, it is easily proven that by the means he used, Bell's Inequalities can only be used to address Ideality - the world of purely formal relations. They don't actually mean anything in concrete reality at all!

I concur that this condemnation of the precedence of Form over Content is still not enough to debunk these ideas. The most crucial principle in all such experimental investigations, both classical and Copenhagen school, is the cornerstone of all formalism and all analysis - the Principle of Plurality. This view sees the world as composed of many simultaneous natural laws, with a different mix of them acting in each and every observed situation. It stands in contrast to Holism, which sees all things as inter-connected and inter-dependent, whereas Plurality sees only inherently separable component parts, which can always be deconstructed and analysed. Analysis can be made of any given situation, however complex, through isolation, simplification and control of those components, extracting the laws from the mix. Such methods are the basis of literally all scientific experiments.

This is all fine (and Science certainly wouldn't exist without such analytical methods), until erroneous assumptions are made about what this means - Plurality assumes that any law extracted in this way, is identical to that when acting in totally unfettered reality. And this is not true. In unfettered reality all "laws" are modified by their context. Realising this is the first step towards a Holist stance on Science. The objectivity of this position is confirmed by the fact that any law extracted by the usual method of farming and controlling a context for the experiment, can only be reliably used in that same context. The Pluralist view survives (and indeed thrives and dominates) because we are extremely adept at arranging for it, at controlling our environment, and this makes both prediction and production possible.

But, in theory, all reasoning using such laws as the actual components of reality, is bound to be wrong. Pluralist laws and techniques are pragmatically extremely powerful, but theoretically greatly misleading.

It isn't just specialisation that leads to scientific teams consisting of experimenters, theoreticians and technologists - all of these roles are actually differing standpoints, and all are essential to the Scientific process. But they will contradict one another! Disagreements are unavoidable, and dead ends are likely in many scenarios.

Postscript

This paper concentrates upon the underlying bases of the methods and reasoning used in postulating Quantum Entanglement. Despite the fact that I think this torpedoes Quantum Entanglement from the word go, QE forms the last line of defence for the regressive Copenhagen Interpretation of Quantum Theory, which must be defeated, so a job must be done on it!

I am currently working on a new body of work entitled Quantum Disentanglement, which I hope to publish as an extended issue of the Shape Journal in coming weeks...

10 September, 2015

Socialists: Are you adequately equipped?



This is a message to Socialists in the Capitalist West; but it is also relevant to those in the newly Capitalist East too. What has been missing, largely due to the diversion of what we term Stalinism, is real Marxism. We need to equip ourselves to do again what Lenin and his comrades did in 1917.

We have had a massive slump on a worldwide scale, and it isn’t over yet. Capitalism is faltering and it should be our greatest opportunity, yet in spite of the Arab Spring uprisings, and the Revolution in East Ukraine, we are virtually invisible... And, without a socialist alternative, in the recent General Election in the UK the Tories got in, again!

What have you been doing? I’m afraid even 250,000 in London after the event is no good!

The Greeks have been demonstrating all the time, and got an anti-austerity party in, followed by a resounding "NO!" to the austerity merchants. Even the Egyptians had a Revolution and occupied Freedom Square incessantly, yet ended up with a military dictatorship once again.

The problem, surely, is in understanding the processes involved. It is not enough to condemn Capitalism. You have to know what to do about it!

Do you call yourself a Marxist?

If so, do you think that such a position is merely a political stance, or is it a philosophy? (In fact, the most sophisticated that Mankind has ever produced.) But, do you know what it is and how to use it? I don't mean tactics, and regular day-to-day activities; I mean, do you use it to understand these situations, and, crucially, how things can be changed?

I am certain that literally all activists against Capitalism have no idea of the power of this philosophy! To get some idea of its range and power see SHAPE Journal, on the Web, where it has covered all issues from Politics to Science, Philosophy to Art, and many more.

Did you know that the greatest archaeologist, V. Gordon Childe, was a Marxist?

Did you know that Lenin wrote a book condemning the world-famous physicists Poincaré and Mach for their Empirio-Criticist stance, which was the immediate predecessor to the current idealist stance in Modern Physics?

Have you read “The Part Played by Labour in the Transition from Ape to Man” by Engels?

Currently, SHAPE Journal addresses the Philosophy of Marxism in all disciplines, and has published 74 monthly Issues over the last six years, which have included over 450 articles, with another 250 posts on the SHAPE Blog. Have you seen them? They are all available for free from www.e-journal.org.uk

And currently the main theorist on SHAPE is cooperating with others in the USA and India to bring about a replacement for the current, so-called Copenhagen stance in Physics.

Don’t you think, as a socialist, you should be addressing this body of current Marxist works, and even contributing yourself?

Jim Schofield

05 September, 2015

Socialism


Let us consider a few important questions in Economics.

In Capitalism, the establishment of a company not only needs ideas for a required and viable product, but also, and primarily, the money (or Capital) to establish the organisation, its necessary equipment, accommodation and staff to carry out the whole scheme. For, only with sufficient start-up Capital can all this be assembled and organised. And, there is, thereafter, an ongoing need for extra Capital to make changes and keep the company competitive.

Capitalism, as its name implies, requires constant access to more Capital. Indeed, there must be a regular supply of the resources necessary to make the intended products, long before payments for those, as sales, will be arriving.

The whole process depends upon the availability of such Capital, and the increasing march of Technology, also means that the costs involved soon far outreach the resources of craft manufacture and demand the most sophisticated and expensive machines to stay competitive.

The prior system to Capitalism could not deliver such things, so Capitalism was indeed an advance. But, of course, still the resources of Capital have to come from somewhere. Where would that normally be?

Now, in spite of being a life-long socialist, I am also an educator of long experience at every level of Education, as well as a research scientist and computer expert. So, at a particular point in our careers, a colleague and I struggled to establish a company producing the most advanced multimedia aids in the world – designed to be used in certain very difficult areas of education. We knew we could do it, for in funded research we had made exceptional products, which won a National award "for excellence" in the United Kingdom from the British Interactive Video Awards organisation (BIVA), later led to the award of degrees based on our achievements, and in addition took the most prestigious award in our field in the USA. Yet we got only meagre financial resources, and often no grants at all, to fund new products and market them worldwide. Now, there we were in a Capitalist System: why didn't we apply to the usual sources of such Capital – for example the Entrepreneurs and the Banks?

Well, there were very good reasons for this.

We were primarily educators, working at this stage in Universities, so our primary job was that, and it came first. And secondly, our experience with several "interested parties" revealed that their interest, and main criterion, was how much they could make out of the venture. NOTE: The Dragons' Den TV programme reveals investors' concerns very clearly.

Our objectives were very different, indeed, from all of theirs. Frankly, the means to access this Capital meant that you had to subscribe to the motivations of that system or you would get NOTHING. We refused such offers, and decided to do two things.
  • First continue to apply for grants, and
  • Second, to work for NO pay and use what we would have earned from sales as our future Capital.
It was for these very reasons that we had a long uphill struggle, lasting 10 years, before a major breakthrough with our product Wild Child became a worldwide hit.

Now did we do the right thing? Of course we did! All the imperatives were determined by the discipline involved, which was teaching Dance Performance and Choreography, and our approach led in just 6 more years to having products distributed in over 100 countries. We even managed to outflank the establishments in our field by using the web.

So, there are important lessons here, as to what will be necessary in a Socialist Society, where there is no super wealthy class, and the Banks are all publicly owned, with an entirely different remit from what caused the 2008 crash world wide.

So, Capitalism concentrates available Capital into the hands of the class who see it as THE generator of more of the same. Indeed, the practical properties of what is produced are definitely secondary to the company's power to make Money. And it has become a self-defining and self-perpetuating system, with Capital as both its means and its purpose.

Interestingly, the efficacy of the products produced is not primary. So, a perfect product will still not persist perpetually for it cannot generate constant replacements, and, therefore, the requisite flows of Capital. Only new products can do this, so the whole system is constantly renewing itself, in order to regularly increase profits (Capital).

Now, it isn’t actually sustainable, because it is regularly disposing of past solutions to replace them with new ones, involving some new feature, so everyone is pressed to update to be at the very forefront of what is currently available. Clearly, such an imperative definitely will maximise profits.

Yet, such a system cannot be said to be reliable: it is always under pressure to renew, and this renewal cannot be said to be aimed at improving products' efficacy. So it pours Capital through the system via credit, which must then be paid back, with sufficient interest, all the time! Short-term returns are the accepted measure of success, and NOT how much is owned and owed by the company involved.

Any loss of confidence among the investors, and a recession or even a Slump will ensue!

In fact, no one can actually repay what is owed. Many loans are taken on to repay previous loans, so the system is never-ending.

Also, no bank can return all deposited money back to account holders, and a run on a bank, if it isn’t rescued by getting a loan itself, will very quickly ruin it.




So, to replace Capitalism, you have to change the whole system from bottom-to-top.

At present it is the holders of most wealth who determine what happens. After all, they hold the real purse strings! And, of course, they have their own purposes (usually to get even richer, or at least protect their wealth and status).

Clearly, such a set up cannot continue forever: it is increasingly unstable, and its crises get more and more difficult to address.

Now, it is clear that Capital is necessary, but that doesn’t mean that Capitalism is inevitable. We have to make a clear decision after a revolution, “Who should hold and invest the wealth?”

Let us learn from what happened in the Russian Revolution!

The major institutions were all nationalised, and without a penny compensation to the ex-owners. They started as thieves, and continued as parasites, so they will get absolutely Nothing!

Now, this aspect was crucial. For what they considered to be their Capital was never really theirs in the first place. It had always been generated by production, yet always ended up (primarily) in someone else's capacious pockets. When addressing this mess, these thieves shouldn't get a single penny.

They will have to work for a living, like everybody else.

Capital will not be allowed to go to any private individual. So, who, or more accurately, what should hold and allocate all Capital?

There is only one answer! It has to be the democratic organisations of those involved in its creation – the Working People! At first, it will be in their now-worker-owned Production Companies, but then, later, in their Democratic Organisations, such as the Soviets (or elected Councils) at every single level.

And, no individuals should wield total control of such wealth, even within such people's organisations. For they would inevitably use that power to their own personal advantage.

Clearly, though the State will play a role, the real question has to be about the actual form of Democracy that will be involved. And to totally prevent the building of organisations against these principles, there will be NO Stock Markets, and NO private Banks! No singularly powerful groups or cliques will be allowed – only democratic organisations, responsible below to their electorate, and above to the next democratic level of organisation. Clearly no such easy solutions can, this time, be allowed.

What will be involved here is a real Revolution, and by its very nature, exactly what will be created in such an Event, cannot be prescribed completely beforehand. We don’t and can’t know what will emerge, except that such an event is the most powerful creative force that can exist!!

But, we must constantly guard against the rising of individuals and/or groups, who will undermine what is being constructed, to their own advantage. This is the risk.



No to a Cromwell! No to a Napoleon! No to a Stalin, or to a Mao!

This time the Revolution must be for The People!

01 September, 2015

Stability within the Atom?



Internal Turbulence & External Context

A General Introduction

One important consideration, when dealing with vortices within a substrate, must be to include the initial background state before their creation. For such a state would most certainly affect those vortices, in addition to the obvious causes arising from any moving material intruder.

And, what is most clearly shown in the turbulent atmospheres both of the Earth, and even of Jupiter, is that the vortices occurring in those circumstances are not simple consequences of the localised swift flows, or intrusions of some kind, that cause them.

In the case of Earth's atmospheric disturbances, one of the key causes-plus-results of the overall situation is the high speed of the clearly determining Jet Stream, which itself has causes, governed by the heat supplied by the Sun and the spin of the planet. Whereas on Jupiter, the initial causes, at least, seem to come from processes inside the planet itself, along with a similar context of planetary spin.

Clearly, the spinning of both of these planetary bodies significantly affects the moveable, cloaking substrate of the atmosphere, and, once again, the results react back to become further affecting causes in themselves.

Now, though all these considerations are vital, particularly in the examples mentioned on the planetary scale, the key question arises: "What will be different when considering a Universe-wide substrate, composed of micro components, inevitably also extending to within the Atom?"



It is just possible that “down there”, we could profitably assume, as a simplifying, first approximation, that NO such external disturbances will be significant. We could locally assume a totally quiescent substrate only disturbed by the extremely close effects of internal components, and especially the orbiting electron (when considering, of course, a Hydrogen atom as the simplest possible case).

And, unlike the majority of disturbances in atmospheres, the situation within that enclosed space must be affected, from the outset, not only by delivered effects, but also by the recursive effects of the vortices caused by the orbiting electron, acting back upon their original causes. For, with orbits, the conditions are constantly repeated, time after time, with each and every orbital return of the electron to previously affected parts of the substrate.


And, of course, these “same again” effects will either accumulate to dissociate the atom, or, much more likely, settle, somehow into a stable and persisting situation.

The situation is crucial because, almost uniquely, this set-up produces Quantised Orbits of the electron involved. And this means that a fixed set of allowed orbits results, of different radii, and hence a consequent set of fixed energy levels. And it is these which allow the storing of electromagnetic energy therein, and, in turn, govern precisely both the frequencies and the energies of any released quanta.

The processes involved in these energy transactions are brought about by the demotion of a previously promoted electron, from a higher orbit and Energy Level to a lower one.
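As a purely standard textbook illustration of how fixed levels govern the released quanta (nothing here is specific to the substrate theory being developed), the familiar hydrogen values E_n = -13.6 eV / n² give, for a demotion from the third to the second allowed orbit, a quantum of about 1.89 eV, whose frequency follows from ν = ΔE / h:

```python
# Standard textbook illustration (not specific to the substrate theory):
# fixed hydrogen energy levels and the quantum emitted on demotion.

h = 6.626e-34          # Planck's constant, J*s
eV = 1.602e-19         # one electron-volt in joules

def level_eV(n):
    """Bohr energy level of hydrogen, in electron-volts."""
    return -13.6 / n**2

n_high, n_low = 3, 2   # a demotion from the 3rd to the 2nd allowed orbit
delta_E = (level_eV(n_high) - level_eV(n_low)) * eV   # energy of the released quantum, J

frequency = delta_E / h                      # nu = E / h
wavelength_nm = 3.0e8 / frequency * 1e9

print(f"Emitted quantum: {delta_E / eV:.2f} eV")                       # ~1.89 eV
print(f"Frequency: {frequency:.3e} Hz, wavelength ~{wavelength_nm:.0f} nm")  # ~656 nm
```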

Indeed, a comparison with Yves Couder's experiments in a silicone oil substrate, in which a stable entity (termed the Walker) was established, strongly suggests that similar possibilities are relevant for the Atom: both cases seem to indicate a sufficiently undisturbed environment for such stabilities to be established purely internally.

Indeed, the case of Couder’s Walker seems to suggest a very similar conjunction of resonances of various oscillations plus recursive feedback to be the crucial physical, formative causes for the remarkable occurrences in the atom too.

To get a better idea of what this means, let us take a more common case – the production of vortices by a narrow, fast-moving stream entering and passing through a still pond. It seems to be a relatively simple situation, but it is nothing like as localised, and indeed "locked-in", as is likely to be the case within the atom. First, the causing stream is unlikely to come close to the usually-idealised "streamline flow". On the contrary, it is certain to carry with it disturbances from its own forming history.

So, these are also brought into the still pool. 



Also, there are no close constraints upon its subsequent passage, so vortices would be created, and then inevitably left behind, throughout that passage, to simply dissociate into almost random disturbances, while the continuing stream generates more vortices elsewhere, thus contributing to its own ultimate demise as a discernible, coherent stream.

But, within the Atom, on the contrary, the causing orbiting electron is a constantly returning entity, repeatedly interacting with its own previously created vortices. And all of this is happening within the close confines of the atom, within a tiny, local area.

So, as with Couder’s Walker, stabilities could, and indeed must, be possible, if all the interactions are appropriately tuned to elicit the observed special effect of stable quantized orbits.

NOTE: Intended here is a suggestion, by this theoretical physicist, that the physical arrangement of the atom is such that an electron NOT in a stable (quantized) orbit will inevitably lose energy to the substrate, and so reduce that orbit until it matches an allowed level, at which it will become stable, and will stay the same until some external conditions cause it to change internally.
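A purely hypothetical toy sketch of that suggestion follows: an electron that starts between two allowed levels simply bleeds energy to the substrate until it lands on the nearest allowed level below, and then stays there. The loss rate and the borrowed Bohr-style level values are illustrative assumptions only, not anything derived from the theory itself.

```python
# Purely hypothetical toy sketch of the suggestion in the NOTE above:
# an electron not on an allowed level bleeds energy to the substrate
# until it reaches the nearest allowed level below, then stays put.
# Loss rate and level values are illustrative inventions.

allowed_levels_eV = [-13.6 / n**2 for n in range(1, 6)]   # borrowed Bohr values

def relax(E, loss_per_step=0.01):
    """Reduce E until it reaches the nearest allowed level at or below it."""
    floor = max(level for level in allowed_levels_eV if level <= E)
    while E - loss_per_step > floor:
        E -= loss_per_step          # energy handed over to substrate vortices
    return floor                    # settles on the allowed level

print(relax(-1.0))   # starts between the n=4 and n=3 levels, settles at ~-1.51 (n=3)
```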

Now, clearly, these few inclusions are nowhere near a full explanation. The reason for getting as far as we have with this problem is the concrete evidence of Couder's Walker, where, with only a substrate and various oscillations, a stable Walker was not only produced, but also maintained for as long as the producing conditions persisted.

That alone was sufficient to begin to assume the possibility of a similar occurrence within the Atom. 

http://fuckyeahfluiddynamics.tumblr.com

But, just as the Walker required consideration and consequent explanation of the properties of the substrate and the bouncing drop, and of course the absolutely essential matching of the involved vibrations, so the situation in the atom will unavoidably also involve many other considerations.

For each element's atom is different, with different quantized levels, and hence the influence of the many different nuclei (which also perform their own small orbits) will have to be included in a final and comprehensive Theory.


Postscript:

The current state of play in these theoretical considerations has now been taken beyond these brief notes in the SHAPE Special entitled The Atom published in SHAPE Journal on the Internet in July 2015. You can read that issue here.


28 August, 2015

Drug-Funded Capitalism?



Let us be crystal clear what a slump or depression is in this Capitalist World. It isn’t due to the policies of a particular government, but entirely due to the very nature of the Capitalist Economic System itself.

For, the basis of this economic system is the borrowing of Capital (that is Money) to initially finance new start-ups, but mostly to finance vital changes to the current state of companies to safeguard or improve their profitability in competition with their competitors. And, the people with that required Capital, also have their own agenda: they invest it to get annual dividends from the companies’ profits, or even make instant killings by selling the shares they own of a succeeding company.

But clearly, this presupposes two essential things.

First, there must be in existence sufficient well-heeled investors able and willing to supply the necessary Capital.

And second, there must be sufficient confidence that the investments involved will be capable of being paid back, along with a tidy stream of ample dividends throughout the duration of that loan.

Now, of course, the very existence of the Capitalist System would only be possible if there existed sufficient wealthy individuals with the wherewithal to invest. And that was true even before there were any capitalist enterprises to supply such amounts.

So, the original source of Capital was not yet available from within the system!

Historically, its amassing always had to involve some kind of large-scale theft. War and booty was one source, but it meant that the financiers were military rulers or monarchs, using the proceeds of their victorious wars.

England's route, though, was originally via the taking over of Church properties by Henry VIII, and later by the financing of Privateers (Pirates) by monarchs like Elizabeth I. Licences were given to these Pirates to attack and capture vessels bringing the Spanish Monarchs' booty back from South America.

Such sources were able to pump-prime the new system, which at that stage was mainly to do with trade and plantations in new colonies.

In addition, the rewarding of monarchs' supporters by selling ex-monasteries to them at cut-down rates enabled a new Middle Class to appear and grow, rapidly becoming the main source of Capital for the new economic system.

So, capitalism could only work if such wealth reserves of the very rich were available for investment.

Now, that original means of delivering an incessant supply of such wealth could not be counted on forever, so a regular problem for Capitalism, during its regular and unavoidable collapses due to lack of Capital or even lack of confidence, had to be to find new sources of this essential component of a continuing Capitalist Economy.


For the system constantly requires new sources of Capital.

The most common one was the promise of vast gains from new, as yet untapped resources; an early example was what could supposedly be obtained easily from islands in the Pacific Ocean – the so-called South Sea Bubble. But such highly touted schemes never survived for long, and Capitalism became a repeating cycle of booms and slumps, always needing to find new promises of "inexhaustible wealth" to provide its essential life blood – Capital.

Now, a History of Capitalism would list all these boom-creators, as well as their following slumps when each was exhausted of its potential.

And the lesson of the current slump, dating from 2008, has to come from asking: "Who had to be found, prior to that collapse, to fund Capitalism in the following stage?"

Surprisingly, the decision was to lend money in the form of mortgages to the very poor.

Yes, that is correct – they turned to those who would inevitably default on their payments at the first difficulty.



This started in the USA, where it was considered to be a guaranteed winner! And, the reason for this optimism was that the mortgage holders would, ultimately, fail to pay, and the properties would revert back to those who had lent the money. And, the lenders could then sell the property to some other poor family, by getting them their required mortgage.

Clearly, with each property being constantly returned to the lenders and resold, along with some repayments from each short term “owner”, the investment would quickly amass a great deal of income, while at the end of a long process they would end up owning the property once again.

And, the idea was agreed by all bankers and financiers to be a total winner. Everyone involved on their side was so convinced that they parcelled up large numbers of these situations into what they called certain winners, and sold them to Banks and investors worldwide (making an extra profit in that transaction too).

It should have been a new and lucrative “bubble”, except that as soon as it became clear what the intentions of the mortgage lenders were, the local communities took it on themselves to completely trash the properties after yet another poor incumbent had been kicked out.

So, instead of decades of profit, the lenders ended up losing both the properties and their means of getting continuous profits from them.

The losses went through the roof, not only in the USA, but also worldwide, for those who had bought into this “wonderful opportunity” to milk the poor.

After a very short time, and along with the vast sizes of outstanding loans all over the world by normal capitalist endeavours, the World Slump began!

But equally clearly, the poor could not be the source of the next necessary boom.

Where on earth could it come from?

Both present-day Mexico, and the USA's own history with the Mafia and De Lorean, are showing the way.

Investors there are choosing illegal drug cartels as the best places for their investments, and the result is that the whole country, including the government, is descending into a culture based upon Drug Profits.



It is my contention that the next boom will be based upon Drugs.

What do you think?

13 August, 2015

Energy Retention within Atomic Electron Orbits



One question that hasn't yet been adequately addressed in our developing Alternative Theory of the Atom is just how essential its inclusion of a Universal Substrate is: one which could not only easily absorb energy from an orbiting electron within the atom, but also, somehow, return it all, in full, to steadfastly maintain that orbit entirely undiminished. It is, indeed, an unavoidable question! For the whole theory rests upon just how easily such a substrate absorbs and then propagates such energy.

So, within the atom, there is seemingly a damaging contradiction, which, if it isn’t adequately explained away, will most certainly torpedo the entire Theory.




Indeed, any suggestion of such a substrate, in the prior history of science, was always finally dismissed, not only because it was never detected, but also because its absence seemed essential to guarantee the stability of the atom (among other similar arguments).

The intermediary for both holding and paying back any lost energy (in our new theory) was assumed to be a whole series of caused vortices, created within this substrate, actually inside the atom. For, though such features seem unavoidable, and indeed essential, in those circumstances, they would at the same time appear to be impossible in straightforward linear sequences of movements within an unbounded substrate; the special case within the atom, however, was considered to be significantly different.

And, seeming to confirm this exception, the brilliant experiments by Yves Couder et al, with silicone oil and vibrations alone, delivering his celebrated "Walkers", showed that such maintaining phenomena were indeed possible, given the necessary conditions. The persisting stability of Couder's Walkers seemed to be achieved by interacting vibrations that, via both resonances and recursion, produced the seemingly inexplicable and resolutely stable Walkers. And, as Couder also delivered "quantized orbits" of these Walkers at the macro level, the implications of these discoveries for the micro level clearly demanded to be addressed too.


Quantised orbits performed by "walkers"

NOTE: It also must be mentioned here that the assumption of a Universal Substrate (and its detailed definition) devised by the author of this paper, has already fully explained all the anomalies of the Double Slit Experiments, without any recourse, whatsoever, to Copenhagen, as well as full explanations of Electromagnetic Propagation, and even both Pair Production and Pair Annihilation too.

And, all these were made possible by the assumption and description of an undetectable, but real, substrate of particles.

The key question, of course, had to be about what precise kind of energy would be involved in these “within-atom” transactions. For, the units of the proposed substrate, could both hold and pass on energy to and from their atom’s internal orbits, but could also be moved bodily, thus involving Kinetic Energy as an alternative.

Clearly, if the units of the substrate are to be disturbed from a relatively static arrangement via shearing/contact effects caused by a moving particle (the orbiting electron), then, it would seem most likely that Kinetic Energy from the orbiting electron would be transferred to become Kinetic Energy of substrate units in the usual vortex forms.

Now, if this is correct, the integrity of the orbit will be breached, and, the only way that such lost energy could escape, permanently, from the atom, would be if the caused vortices gave up their acquired Kinetic Energy to either similar translational movements or even vortex-like movements to other sets of substrate units.

Now, with vortices occurring in the usual way, in a liquid like water, for example, that is the only transfer that can happen and such spin-off systems of vortices carry such gained-energy away, and these then ultimately dissociate as vortex-forms and become mere disturbances of the molecules of the liquid involved.

Cymatic vortices


But, here we are considering a very different substrate, which is not at all like molecular water.

The units are very much smaller than molecules or even individual atoms. And, as they don’t move about to any extent but more or less remain where they are, the means of energy transfer away from the cause is not available by mere translational movements.

So, let us attempt to determine what could be possible in this special case.

First, we can take another well-known situation where, say, an electron is moving through a substrate in a straight line. There can be no doubt that, because of that movement, the caused disturbances, including whatever energy they have absorbed from the electron, will be left behind and lost to that electron forever. Ultimately that energy would be dissipated throughout the substrate and be unrecoverable as a whole.

But, within the atom the situation is certain to be very different, for the causing electron is maintained within the atom and constantly returns to re-encounter the vortices it caused earlier, and will do this repeatedly, at a whole array of vortices all around the orbit. And, on each such re-encounter between the electron and such a vortex, there would be the possibility of transfers back, as well as of further transfers out, between vortices and the orbiting electron.

Now, though this is indisputable, it doesn't mean that all the lost energy will be returned. So, most scientists would still not accept the maintenance of the orbit by such means.

Until, that is, Yves Couder's Walker experiments achieved the impossible, via Resonances and Recursion, and established fixed, quantised orbits of his Walkers. Clearly, these effects, PLUS energy taken in from the substrate generally, were somehow able to establish and maintain those orbits.

Indeed, elsewhere in other studies, it has been established, by this theorist, that such a Universal Substrate is certain to constantly act as both Sump for waste energy, and Source for energy when demanded by such processes as Resonance.

Now, exactly how, and by what modes of energy retention, the substrate units acted in this case, isn’t yet absolutely clear, but it is obvious that a significant recursive pay back or even a resonant external topping up could indeed occur.

For more information on this theory please read The Atom & The Substrate on Shape Journal.



10 August, 2015

Where Should and Where Does Real Power Reside?

Michael Coldwell - Westminster (2011)

Democracy is always claimed to deliver effective rule both for and by the people. But that, of course, is never the actual case.

It gives the appearance of key decisions being made by the mechanism of elections, yet, if none of the allowed candidates are intending to perform a dedicated service to the people, what then is such an election actually about?

And, what else will determine the positions of elected candidates? Will any at all reflect the real requirements of the majority of the population?

Or, will they all be members of various powerful groups, aiming for state-power to act in favour of their own decided positions? Indeed, will they merely carry out what they (as privileged groups) think is best for their own interests?

Clearly, no ordinary citizen could have anything like the same rights and privileges as the members of these groups.

For, he or she couldn’t just decide to stand, and be accepted as a candidate. The whole process is so big, and completely separated from ordinary people, that to join the election as a candidate would be both too difficult and too expensive, and would confer NO rights or resources to propagate your position to the electorate via the media.

Only those with enough wealth and power can get such privileges.

And, as even Local Authorities now are increasingly forced by financial constraints to dance to the instructions from Central Government, no real independent local route to political activity and publicity exists.

Let us be clear, no matter what interest groups form and agitate for particular policies, they have absolutely no power to do anything about it.

There were such groups, not so long ago, that could indeed do so: they were called Trades Unions, and they could, en masse, withdraw their labour to affect decisions with regard to their members' rights, conditions and remuneration. But, increasingly, in the decades since Margaret Thatcher, the Unions have been rapidly emasculated in pursuing that sole, powerful right of Working People, the Strike!

Now, it gets more and more difficult as the governments pass laws against such rights.

Yet, many years ago in a large European country, the people were in revolt against a World War and dictatorial central power, and they invented their own answer. Wherever they worked, whatever their job, and even where they lived, they began to form Councils, or as they termed them in their own language – Soviets.

These were never allocated from on high and handed down – ready-made by those above. On the contrary, they were totally devised from below, and differed markedly, depending upon circumstances.

But everyone was allowed to speak, and decisions were made by simple majority votes. Anyone elected within the Council to do a particular job was both elected to do it and mandated to carry out actions on a whole series of issues, and was trusted to act as the electors would desire if other issues arose.

Yet, any mounting disagreement with what such an appointee was doing would allow instant recall, and replacement by another elected appointee, if a set threshold of those against the actions of the erring incumbent were surpassed. There was never a carte blanche for such people: they had to act in the interests of the majority of their electors.

Now, this isn’t what Stalin did, later on, in Russia! For he decided from the top-down what Soviets were and what they could and couldn't do. But, what the people had done in their own Soviets, whether they were workers or soldiers or peasants, was designed to always reflect their demands.

A Key Event occurred in 1917, for, by the usual top-down methods, those in charge had organised a statewide Constituent Assembly (a Parliament), at exactly the same time as the people were gathering in their own countrywide Congress of Soviets. Both claimed sovereignty and were sitting simultaneously.

Who would actually win power?

The Bolsheviks had attained leadership of the Congress of Soviets, and simultaneously stormed the Winter Palace where the government was sitting, immediately arresting the whole Provisional Government. At the Congress of Soviets, Lenin stepped onto the rostrum and stated, quite calmly, "We shall now construct the Socialist Order!"

07 August, 2015

We're on Facebook



Selling books and joining Facebook - whatever next... Don't worry, we're not selling out to the man! Hopefully we'll reach out to some new people and get some interesting discussions going on our new Facebook page, Shape Forum.

The Atom and the Substrate (book for sale!)



A limited edition, high quality print version of my Atom and Substrate work is now available to buy through Blurb books.

What if space isn't really empty? What if the entire universe is actually full of many different types of particles that we can't detect, even inside the atom itself? Jim Schofield's controversial new theory postulates that such a substrate exists and that its presence can explain the propagation of light through space, magnetism and even gravity. This publication is a print-to-order edition of the Shape Journal, compiling Special Issues 36 & 37 in one book.

05 August, 2015

The Significance of a Substrate (...and more generally of a context)

David Moore - Light pattern, camera in motion (1948)

We always, as a first approximation, ignore the effects of any underlying substrate, particularly when dealing with dynamical situations of clearly existent and visible entities “moving through space”.

Yet, this is, actually, an absolutely necessary assumption, as, without any real understanding of such an all-pervading context, we just have to simplify to even begin the attempt to understand. And the theories and models that we do manage to develop, in this way, can indeed suffice in many situations, while forming a relatively sound basis for further improvements too.

But, also, as our own abilities and consequent requirements develop, we must, inevitably, start to include more developed concepts of the evident substrate in our models and theories, when our initial efforts clearly fail to deliver.

The omission of substrates certainly simplifies the relations we can find and extract, but we must never forget that in employing such methods we are also both simplifying and idealising the real situation which we are trying to understand, by considering only both the most obvious, and the easiest-to-deal-with aspects of a much more complex situation.

NOTE: before we go any further, it must be emphasized that ignoring the possible presence of a substrate, severely distorts the way we deal with certain important phenomena.

The Propagation of Electromagnetic Energy though Space, the idea of Action-at-a-Distance, and a whole further set of phenomena, such as Pair Productions and Pair Annihilations, all make no sense at all without the presence of some sort of universal substrate. And, theorists simply abandon the attempt to understand, and are instead satisfied with a useable description only – indeed, they replace all causative explanations by purely mathematical descriptions – namely Equations.

Initial efforts applied to entities moving through the air, for example, do not have to include the effects of that substrate (such as friction), and they are never included initially. Using only a dynamical model, and then making adjustments based upon results, will take us a long way towards our objective, without involving the effects of the substrate.

But, if and when, an actual substrate is, itself, affected by the passage of such a moving body, and then reacts back upon that moving body (or another one closely following behind) then the consequent vortices and the recursive feedback cannot be ignored, and our conceptions, theories and models have to be developed upon a very different level.

So, our first inclusion of substrate effects will undoubtedly be a negative/frictional addition.

But, further studies also show that created vortices can significantly aid by enhancing the speed of following bodies, which is clearly a positive additional effect upon that movement.

What is also slowly becoming clear is that, at yet another, higher level, energy caused by the motion of a body (particularly an oscillation) can be communicated via a substrate to another, quite separate body elsewhere. With ongoing vibrations of the source, this transference of energy to the receiving body is termed Resonance.
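As a reminder of what Resonance means in the ordinary mechanical sense (a standard textbook picture, not a model of the proposed substrate), a lightly damped oscillator driven through a medium absorbs energy most strongly, and reaches its largest steady amplitude, when driven near its own natural frequency. The sketch below, with arbitrary illustrative parameters, shows the effect.

```python
import numpy as np

# Textbook illustration of Resonance (not a model of the substrate):
# a damped oscillator driven at various frequencies reaches its largest
# steady amplitude, and absorbs the most energy, near its natural frequency.

omega0 = 2.0 * np.pi * 1.0    # natural frequency of the receiving body (rad/s)
gamma = 0.3                   # light damping

def steady_amplitude(omega_drive, force=1.0):
    """Steady-state amplitude of a driven, damped harmonic oscillator."""
    return force / np.sqrt((omega0**2 - omega_drive**2)**2 + (gamma * omega_drive)**2)

for f in (0.5, 0.9, 1.0, 1.1, 2.0):            # drive frequencies in Hz
    w = 2.0 * np.pi * f
    print(f"drive {f:.1f} Hz -> amplitude {steady_amplitude(w):.3f}")
# The amplitude peaks close to 1.0 Hz: energy transfer is maximal at resonance.
```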

Berenice Abbott - The Exposure of Standing Waves 

So, considering this involvement of the substrate somewhat further, we can encounter a situation where communicated resonant energy, along with a returning recursive effect reaching back to the original causing source and interacting with it, will change that source significantly.

Now, situations where such things can happen are certainly not common in everyday experience, but they do occur. And, in investigating such an actual case, the French physicist Yves Couder arranged a set-up consisting only of a single substance, silicone oil, to be set in motion by the falling of a single drop of the very same substance; in carefully tuned circumstances a wholly new and stable entity, termed the Walker, was created.

Now, this was a remarkable discovery, for what was achieved was totally inexplicable by the usual means, but, clearly, both Resonance and Recursion were involved.

Though this was a highly controlled experiment, no one could call it complex: it consisted of a single substance in the form of a substrate and an incident drop, along with an applied vibration – and absolutely nothing else!



Why had it never been observed before? And what, in the way we considered such phenomena, even now somehow prevented us from being able to explain what was going on?

The answer to this latter question is clearly crucial. And, this is because, what was preventing an explanation was a rarely admitted, but universally applied principle, termed the Principle of Plurality.

It is certainly this Principle which takes the driving forces of Reality to be the entirely separable and unchanging Laws of Nature, which are said to cause all observed phenomena by merely summing together, without any mutual transformations ever occurring. And, let's be clear, it is the very basis of Analysis itself, where we attempt to find all the Natural Laws involved, and then explain the phenomenon solely in terms of those Laws - as unchanging components.

But, such a stance is almost never true!

And, the alternative Principle of Holism assumes, on the contrary, that the direct opposite is always the case, indeed, “Everything always affects everything else and changes it, so that nothing is eternal!” Clearly, if this is so, our methodology, for many, many centuries, has been quite definitely pluralistic, and certainly misleading us in literally every single case to some extent, at least, and, has survived, in spite of its inadequacies, by both the simplifying and idealising all of what we find.

Indeed, as was clear from the outset of this paper, we just cannot investigate actual Reality-as-is, because it doesn’t behave in a directly explicable way. So, to make situations amenable, we subscribe to the Principle of Plurality, as the basis of all complexity, and actually achieve a local, organised and maintained situation that conforms to Plurality - and then study that. To achieve this, we isolate, filter and constrain a locality, where the phenomenon we are interested in occurs, and, it is optimised to reveal just ONE of the involved, supposedly, “separable components”, while this “ideal” situation is maintained constantly, throughout our investigations.

This idealisation works because, by trial and error, experimenters finally adjust the context until a single targeted component is acting almost alone, and so can be displayed, observed and quantified to deliver its own, idealised Natural Law. But then to assume that this seemingly eternal Law, extracted in that farmed environment, will remain exactly the same in all contexts is a myth.

With such a belief, it “became possible” to display and extract all the assumed-to-be “unchanging Laws” contributing to a given real-world situation, and to analyse it into its “constituent Natural Laws”.

Clearly, we never actually crack any situation in totally unfettered Reality, but instead investigate, and indeed “crack”, a whole series of highly transformed and maintained idealised Domains. And this means that whatever we do find can only ever be applied in the very same artificial conditions in which it was discovered.

Postscript:

Two new Special Issues of the SHAPE Journal are now available on this subject by the physicist and philosopher Jim Schofield. The first is entitled The Substrate, and the second The Atom.



23 July, 2015

Both Ways Causality


The Crucial Importance of Recursion

Is causality a one-way mechanism? Do certain circumstances simply produce a single predictable outcome? Are cause-and-effect phenomena a one-way transaction, governed by an unchanging Natural Law? For, if all these were indeed true, then the outcomes, produced in a given context, may then become the cause in yet another unavoidable happening.

Causes and effects could then be chained into long, forward-moving linear sequences to predict future outcomes, and just as certainly traced backwards to original starting points, as is assumed in both Analysis and Reductionism.

Now, essentially, this position is clearly embodied in the rarely even mentioned Principle of Plurality, and it alone has established the assumptions underlying the investigative methods we employ in many areas of study, and, most importantly, in what we call Science.

Though normally left entirely unstated, this Principle is almost universally dominant, and this is because it alone seems to provide us with an applicable investigative and theoretical process, which both delivers what are termed Natural Laws, and, as already mentioned, allows extension via both Analysis of complex situations, and even a tracing back through Time via Reductionism.

Indeed, physicists studying what they regard as the Fundamental Particles of all Reality, and hence the basis of all the sciences, justify that claim by assuming that Plurality is indeed the true Nature of Reality. Many leading scientists have claimed as much, such as Murray Gell-Mann, in his book The Quark and the Jaguar.




But, is this unstated, assumed Principle really true? The simple and correct answer is, of course, “No!” Nevertheless, it has been, and still is, a useful initial stance, for in carefully chosen, adjusted and then maintained circumstances it can work quite well in many applications, where certain outcomes are required and can reliably be made to occur.

The best description I can think of is that this approach is akin to The Farming of Reality. And, when this is done properly, it can certainly be made to work, and for the same reasons that Agriculture has been so effective in the large-scale production of food.

Instead of relying exclusively upon Reality-as-is, simply using what we could find naturally occurring in the World around us “as found” (as was the case in our Hunter/Gatherer stage of development), we found ways to change and prepare the land, to facilitate high yields from purposely sown-and-grown crops of what we required. And that is exactly what pluralist Science has managed to achieve across a much wider range of situations and possibilities in our World - we are managing reality in order to use it.

Farming reality - agriculture as seen from space

But, just as Agriculture did not form a basis for understanding the Biology of Plants, so purely pluralist methods cannot automatically lead to a full understanding of Reality. It is clearly a pragmatic system of assumptions, concepts and methods, which necessarily kills the thing to be investigated, in order to try to understand it. It holds the piece of Reality dead still, in order to study it!

“Mix thoroughly and wait for equilibrium to become established before taking measurements” - does such a credo tell us anything about what was dynamically achieving our finally measured result, while we sat and waited?

So, if we can get by with most of our requirements by believing in this Principle, why is it also so misleading, and ultimately inadequate for addressing, and then explaining, the most important problems in Science?

The problem is partly to do with its too-linear idea of Cause-and-Effect, as well as its over-simplifying approach to descriptions and definitions. Effectively, it is an ideal method for studying static or stable situations only. But it is useless for dealing with dynamic, evolving, or even merely qualitatively changing situations.

So, its consequent way of dealing with Recursion is almost useless, for it turns an actually ongoing and repeating series of phases into a single caused effect, ignoring the ongoing, and frequently recursive, transformations entirely.

Yet, when certain causes produce an effect, which only later on brings about a significant change in the original cause, a pluralist approach does not deal with that vital process at all: for it is, instead, addressed as just another cause affecting the original situation at a later time only. But it certainly isn’t that: it is a recursive consequence of the earlier cause, reflecting back upon it.
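To make the distinction concrete, here is a minimal, hedged sketch in Python (the functions, coefficients and numbers are purely illustrative, not drawn from any real system) contrasting a one-way causal chain, where a fixed “law” simply turns each effect into the next cause, with a recursive one, where the effect reaches back and modifies the very relation that produced it.

```python
# Hedged toy contrast: one-way causality vs. an effect feeding back on its cause.
# All names and numbers are illustrative only.

def one_way(cause, steps=10):
    """Each step's effect becomes the next cause, but never alters the rule
    itself - the pluralist, linear-chain picture."""
    history = [cause]
    for _ in range(steps):
        effect = 0.5 * history[-1] + 1.0   # a fixed, unchanging 'law'
        history.append(effect)
    return history

def with_recursion(cause, steps=10):
    """The effect reaches back and modifies the coefficient of the very 'law'
    that produced it - causality working both ways."""
    law = 0.5
    history = [cause]
    for _ in range(steps):
        effect = law * history[-1] + 1.0
        law = law + 0.05 * (effect - history[-1])   # the effect reshapes its own cause
        history.append(effect)
    return history

print(one_way(1.0)[-1])         # settles at the fixed point of an unchanging rule
print(with_recursion(1.0)[-1])  # keeps drifting, because the 'law' itself has changed
```

In the first case the outcome settles at the fixed point of the unchanging rule; in the second, the rule itself drifts, so no analysis of the original coefficients alone can predict where the process will end up.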

Now, such dissociated causes and methods are also carried over to become the main cornerstone of the ever more widely used method of Simulation. For that has to be the only possible pluralist approach in complex, real-world scenarios which cannot be separated into single, effectively dominant contributions to be studied individually without losing the crucial dynamic of the overall process.



To cope with unavoidably complex situations, which have various intermediate outcomes that cannot be predicted from isolated causes, a significant and unpredictable qualitative change is instead identified, from prior experience, with the passing of a threshold value of a certain variable, so that, when this occurs, the situation is merely switched from one dominant model, and its equations, directly to another dominant form, with its own equations. Both dominant phases are simplifications of what really occurs, and the threshold variable is simply used as a switch between them, without the smallest vestige of an explanation as to why the change occurs in the real situation.

The pluralist solution to a complex, holistic, real-world situation is therefore a large set of pluralist, dominant simplifications, with the switches between them governed by rules of thumb involving particular variables and the threshold values at which the switches should be made.

Such means are not only very crude attempts, but are also entirely retrospective, in that they will only cover already-experienced phases: they can never deliver the wholly new!
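For concreteness, here is a hedged sketch of the regime-switching style of simulation being criticised here. The two “models”, the variable and the threshold value are all invented for illustration; the point is only the structure - two farmed simplifications, and a bare switch between them that explains nothing about why the qualitative change occurs.

```python
# Hedged sketch of threshold-switched simulation: two simplified models,
# each valid only in its own "farmed" domain, joined by a rule-of-thumb switch.
# The models, variable and threshold are illustrative assumptions.

THRESHOLD = 50.0   # the empirically observed switching value

def model_a(temp, dt):
    """Dominant behaviour below the threshold (e.g. slow, linear change)."""
    return temp + 2.0 * dt

def model_b(temp, dt):
    """Dominant behaviour above the threshold (e.g. rapid, runaway change)."""
    return temp + 0.1 * temp * dt

def simulate(temp=20.0, steps=100, dt=1.0):
    for _ in range(steps):
        # the switch itself is a rule of thumb: nothing here explains
        # WHY the qualitative change happens at this particular value
        temp = model_a(temp, dt) if temp < THRESHOLD else model_b(temp, dt)
    return temp

print(simulate())
```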

Now, Man is a highly intelligent animal and has honed these methods to a remarkable degree. Just as our hunter/gatherer ancestors invented and effectively used Bows-and-Arrows, which transformed their lives without their knowing why they worked so well, so the scientists of today manage to get remarkable results out of their studies, even though their resultant Laws are NOT of natural Reality-as-is, but only of specially farmed sections of Reality - and even then never representing what is actually going on, but instead using one particularly dominant part of that situation, until it has to be replaced entirely by another.

Clearly, Pluralist Science is the study of Static and Appropriately Farmed Domains, and totally omits the really important processes of a Developing situation, which very often will unavoidably involve Recursion.

True Causality is not one-way, and rarely strictly linear.

We can make farmed areas of it, in specially prepared and maintained Domains, but true Reality doesn’t work like that, particularly when it is on the move! The reason that wholly new phenomena can be created is that Reality can, and indeed does, work both ways. The result of a certain conjunction of causes can, and does, react back upon those causes, changing them, and their subsequent consequences, significantly.

NOTE: It is certainly worth mentioning here that the recent solution to the perplexing anomalies in the famous series of Double Slit Experiments involved just such recursive effects: effects caused initially, by the things travelling towards those slits, upon a reactive substrate, which later reacted back upon those very same causes to deliver the full set of anomalies, unpredictable by any pluralist means.

Such significant, qualitative changes do not happen constantly, of course, for Reality’s commonest mode is one of settling into a series of relatively Stable Regimes - into so-called Eras of Temporary Stability - but every single one of these periods is always finally terminated: for developments occur in what are called Emergent Interludes outwith Stability, which start with crises, deepen into wholesale collapses, and only then finally climb again, via tumultuous and even chaotic periods of new, qualitative changes, which we term Emergences.

NOTE: So, it isn’t only the holistic changes that are necessary; equally important is recognising the oscillation between Stability and Emergence, without which creative Evolution could never happen.

And, in attempting to understand what occurs in these crucial interludes, Plurality is totally useless.

The only effective approach has to be the direct opposite: one which is emphatically holistic, replacing the static with the dynamic, and the “keep it stable” approach with the alternation of Stability and Emergence that is actually crucial to understanding qualitative changes!

17 July, 2015

The Paths to Truth



Perhaps surprisingly, I see important resonances between the best scientists and the best fictional writers. I am sure the reason is that there is no direct path to Absolute Truth in Humanity’s abilities and techniques. Whether you are a committed scientist or a dedicated writer, you have to find meaningful indirect routes, carrying sufficient Objective Content to deliver a measure of progress. But, of themselves, and in any particular production, these routes turn out never to be wholly sufficient.

Of course, I am not talking about all scientists, or all writers. I mean the very best in both these fields, who manage to reveal something of the Truth in their work.

This aspect won’t be immediately evident when considering the work of the majority of scientists, nor in that of most writers. For the overwhelming majority of scientists are really technologists, primarily intent upon finding exploitable discoveries, while most writers readily admit to being entertaining storytellers. I am talking, in both areas, of those who earnestly tackle the question “Why?”.

For, believe it or not, such an imperative is not common to most scientists or most writers, for whom a version of answering “How?” is considered sufficient. But when you come across the very best writers, you are immediately aware that important things are being addressed, and you come away from reading their work with a definitely deeper understanding of important questions.

And it is likewise with the work of the best scientists (though I must emphasize that I do not include in this definition the “equation manipulators” who currently dominate Sub Atomic Physics). I mean those who glimpse possible significant meanings, and are motivated to address why their studied areas are as they are.

We must realise that such people, in both areas, do not comfortably fit into the general social status quo! And, they are frequently given a hard time by those who feel threatened by their work. For, such purposely questioning work cannot but come up against the many things in their world, which hold it back, and keep it safe for those who currently hold sway.

For example, the writer of this piece is a scientist, and throughout his career he has been forced into two unavoidable and directly contrasting roles in the opinions of his superiors. At first, he would invariably be the “blue-eyed boy”, for his abilities were soon evident, and he was seen as an asset to his superiors and his place of work. But such an interlude would never last! For these entrepreneurs could not count upon his subservience to their requirements. He evidently had imperatives of his own to pursue. So, the alternative phase would always be that of the “enemy of the people” (see Ibsen’s famous play), in which he was seen as a thorn in the side of his superiors (and sometimes even his colleagues). The work being achieved was still of the same quality, but it no longer fitted in with the ambitions of his superiors.



The only way he was able to continue to make progress was by using his achievements to get a new job in another institution. It wasn’t that easy, for competition would always be with known locals, but in the end he usually made a good move, and the “blue-eyed boy”/”enemy of the people” oscillation would begin again. The policy adopted did in fact work, and he ended up as a professor in London University.

But, such a tumultuous journey was never plain sailing!

He, only rarely, found colleagues who had a similar approach, though when he did, he would remain (without promotion) at that particular post for a long time. In his most fruitful post, he stayed for ten years, and completed work, which finally allowed him to move into Higher Education.

The general state of his scientific colleagues in literally all of his posts was determined by the social imperatives of those in charge, so most employees subordinated themselves to that in order to slowly ascend the promotional ladder.

Though I am now a writer myself, I am not a creator of novels, and it is with the best in that field that I feel the strongest resonances. But they could not be more different from the usual scientist. Whereas the latter seeks equations and usable discoveries, these writers never deal in anything similar, for it is invariably the qualities of living, both good and bad, that they pursue and reveal.

13 July, 2015

The coffin for Copenhagen now awaits!


Some comments on Bush’s Recent Review of
Yves Couder’s Experiments

The recent review entitled Pilot-Wave Hydrodynamics by John W. M. Bush (of MIT’s Department of Mathematics), published in Annu. Rev. Fluid Mech. 2015, is without doubt comprehensive, and has confirmed this theorist’s (Jim Schofield’s) own ideas upon what the brilliant French physicist Yves Couder has finally revealed in an important series of experiments.

I would recommend it to all who are interested in that paradigm-changing work.

Yet, in spite of almost completely evident (yet unstated) conclusions about the errors in present-day physicists’ Copenhagen stance, it chickens out in the very last paragraph of its final conclusions, by subscribing to Everett’s Many-Worlds stance. (We must not forget that this contributor is a mathematician rather than a physicist, so he reasons accordingly.)

And, for the same reasons, the account is saturated with the current consensus stance in such investigations. This should not surprise us, as he is from MIT’s Department of Mathematics, and, at present, in Sub Atomic Physics (the evident target for Couder’s discoveries) the formal equation rules OK: the approach is one in which the objective is to unearth and formulate purely mathematical descriptions, as if they were the essences of any physical attempt at explanation.

So, his review is automatically two stages removed from what is actually required: it does not explain what Couder and his historical antecedents have revealed so brilliantly, and the consequences of this work for Sub Atomic Physics are omitted altogether.

There is, consequently, both the formalisation of that Sub Atomic Realm and the whole ethos of the professional mathematician, which together prohibit the required physical explanation of what has been revealed.

Nevertheless, what is available in this review is excellent in its historical references, and will give truly physical investigators access to the important gains and misdirections of the past. And these are indeed important! If the result of this work and this review is to re-interpret the Atom, and finally bury the Copenhagen Interpretation of Quantum Theory forever, then a great deal will have been contributed to that essential objective.

Postscript:

Two new Special Issues of the SHAPE Journal are now available by the physicist and philosopher Jim Schofield. The first is entitled The Substrate, and the second The Atom, and constitute, as far as that writer can tell, the first suggestions as to a physical alternative to Copenhagen.



09 July, 2015

New Special Issue: The Atom and the Substrate 2


The 37th Special Issue of the SHAPE Journal and the second in a landmark series outlining an entirely new approach to Sub-Atomic Physics.

Clearly, if we are to seriously consider the presence of a Universal Substrate (like the now discarded Ether, but of a concrete composition), which is nevertheless undetectable by the usually applied means, we also have to address the dominating emptiness which, using all the current models, exists within the atom.

For, taking the known sizes of even the simplest atom’s components, and their distances apart, it would be hard to exclude any general substrate from filling those spaces too.

Now, if the consequence of such a substrate outside all the “material components” was a major rethink, then the situation within atoms will certainly demand an even fuller explanatory account. Indeed, the current Copenhagen Interpretation of Quantum Theory is definitely NOT a physical description, never mind a physical explanation, of phenomena in that realm, but, on the contrary, only a probabilistic description - involving only formal, abstracted elements, supported by a great deal of unsupported speculation.

Now, in this theorist’s treatment of the famed Double Slit Experiments, it was the presence of a Universal Substrate alone which enabled an adequate, coherent and comprehensive explanation of all the confusing phenomena occurring there.

Thus, as we switch to the Sub Atomic Realm, we simply must consider all the effects that would be caused by the presence of that same substrate on all phenomena occurring inside the atom too. So, this Special Issue of SHAPE Journal has as its remit the physical explanation of those phenomena - including, of course, the quantization of the orbits of contained electrons, and the presence of caused vortices in that substrate, which transform exactly how such phenomena are caused and interrelated with one another.

06 July, 2015

Bravo Varoufakis!


The Greeks move on

Having won the Referendum on rejecting the creditors' terms for continuing financial help, the socialist Prime Minister Alexis Tsipras and his Finance Minister Yanis Varoufakis turn up the heat yet another notch.

Just as their creditors are clearly trying to demonise the self-confessed Marxist Varoufakis for his strident condemnation of their tactics, he resigns out of the blue.

These Greek leaders are not falling for it. With Varoufakis tactically excluding himself from the negotiating team, the capitalists can no longer use him as the damning "fly in the ointment", and the intended mudslinging has been undermined.

I wonder why the BBC took 4 hours to announce his resignation. Didn't they know what to say?

The restructuring of the debts now becomes the key demand of the Greeks, and with some others and the IMF beginning to rethink the situation, they can fight hard to stop further Austerity.

Also, Spain is moving towards a general election later this year, and a party not unlike Syriza (Podemos) is in the running, and growing in strength with every gain made in the Greek battle. 

Clearly the credit-based capitalist union is being severely questioned. Why should the poorest pay for its evident and recurring weaknesses?

The fight-back has begun. 

Bravo Varoufakis!

Victory for the Greek people!