|David Moore - Light pattern, camera in motion (1948)|
Yet, this is, actually, an absolutely necessary assumption, as, without any real understanding of such an all-pervading context, we just have to simplify to even begin the attempt to understand. And the theories and models that we do manage to develop, in this way, can indeed suffice in many situations, while forming a relatively sound basis for further improvements too.
But, also, as our own abilities and consequent requirements develop, we must, inevitably, start to include more developed concepts of the evident substrate in our models and theories, when our initial efforts clearly fail to deliver.
The omission of substrates certainly simplifies the relations we can find and extract, but we must never forget that, in employing such methods, we are also both simplifying and idealising the real situation we are trying to understand, by considering only the most obvious, and the easiest-to-deal-with, aspects of a much more complex situation.
NOTE: before we go any further, it must be emphasized that ignoring the possible presence of a substrate severely distorts the way we deal with certain important phenomena.
The Propagation of Electromagnetic Energy through Space, the idea of Action-at-a-Distance, and a whole further set of phenomena, such as Pair Production and Pair Annihilation, all make no sense at all without the presence of some sort of universal substrate. And theorists simply abandon the attempt to understand, and are instead satisfied with a usable description only – indeed, they replace all causative explanations with purely mathematical descriptions – namely Equations.
Initial efforts, applied to entities moving through the air, for example, do not have to include the effects of that substrate (such as friction), and indeed they never are included, at first. Using only a dynamical model, and then making adjustments based upon results, will take us a long way towards our objective, without involving the effects of the substrate.
But, if and when, an actual substrate is, itself, affected by the passage of such a moving body, and then reacts back upon that moving body (or another one closely following behind) then the consequent vortices and the recursive feedback cannot be ignored, and our conceptions, theories and models have to be developed upon a very different level.
So, our first inclusion of substrate effects will undoubtedly be a negative/frictional addition.
But, further studies also show that created vortices can significantly aid by enhancing the speed of following bodies, which is clearly a positive additional effect upon that movement.
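Both of these substrate effects can be sketched in the simplest possible way. The following toy simulation is purely illustrative: the linear drag law, and every numerical value in it, are assumptions chosen for the example, not drawn from this text.

```python
# A body moving through a resisting substrate, integrated with Euler steps.
# dv/dt = -(k/m) * v is the assumed (linear) drag law; all numbers are
# illustrative, not measured values.

def simulate(v0, k, m=1.0, dt=0.01, steps=1000):
    """Return the final speed of a body with initial speed v0 and drag k."""
    v = v0
    for _ in range(steps):
        v += -(k / m) * v * dt
    return v

free = simulate(v0=10.0, k=0.0)       # no substrate: speed is unchanged
dragged = simulate(v0=10.0, k=0.5)    # frictional substrate: speed decays
drafting = simulate(v0=10.0, k=0.25)  # reduced drag, crudely standing in for
                                      # a following body aided by a wake
```

Here the negative, frictional effect appears as decay of the free speed, while the positive, wake-assisted effect appears, very crudely, as a reduced effective drag on the follower.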
What is also slowly becoming clear is that, at yet another, higher level, energy caused by the motion of a body (particularly an oscillation) can be communicated via a substrate to another, quite separate body elsewhere; and, with ongoing vibrations of the source, this transference of energy to the receiving body is termed Resonance.
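The energy transfer just described is the classic behaviour of a driven oscillator: the receiving body absorbs energy most strongly when the source vibrates at, or near, that body's own natural frequency. A minimal numerical sketch, using the standard steady-state amplitude formula for a driven, damped oscillator (all parameter values are assumed for illustration):

```python
import math

def steady_amplitude(drive_freq, nat_freq=1.0, damping=0.1, force=1.0):
    """Steady-state amplitude of x'' + 2c*x' + w0^2*x = F*cos(w*t)."""
    w, w0, c = drive_freq, nat_freq, damping
    return force / math.sqrt((w0**2 - w**2) ** 2 + (2 * c * w) ** 2)

off_resonance = steady_amplitude(0.2)   # driven far from the natural frequency
near_resonance = steady_amplitude(1.0)  # driven at the natural frequency
```

The response at the natural frequency dwarfs the off-resonance response, which is precisely why a sustained vibration of the source can pump significant energy, via the substrate, into a distant receiver.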
|Berenice Abbott - The Exposure of Standing Waves|
Now, the circumstances in which such things can happen are certainly not common in everyday experience, but they do occur. And, in investigating just such a case, the French physicist Yves Couder arranged a set-up consisting only of a single substance, silicone oil, set in motion by the falling of a single drop of that very same substance; and, in carefully tuned circumstances, a wholly new and stable entity, termed the Walker, was created.
Now, this was a remarkable discovery, for what was achieved was totally inexplicable by the usual means, but, clearly, both Resonance and Recursion were involved.
Though this was a highly controlled experiment, no one could call it complex: it consisted of a single substance in the form of a substrate and an incident drop, along with an applied vibration – and absolutely nothing else!
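The mechanism behind the Walker — a droplet propelled by the decaying waves its own past bounces leave in the substrate — can be caricatured in a one-dimensional toy model, loosely in the spirit of the simplified "stroboscopic" models used in the hydrodynamic-walker literature. Everything here is an illustrative assumption: the sinusoidal wave kernel, the per-bounce memory decay, the drag, and all numerical values bear no relation to Couder's actual experimental parameters.

```python
import math

def walk(bounces=500, amp=0.01, memory=0.9, drag=0.1, v0=0.1):
    """Toy 1D 'walker': each bounce leaves a standing wave that decays by a
    factor `memory` per bounce; the droplet is pushed by the summed slope of
    all surviving past waves at its current position (recursion + resonance)."""
    xs, v = [0.0], v0
    for n in range(1, bounces):
        x = xs[-1]
        # Wave force: bounce j left a wave whose slope at x is sin(x - xj),
        # weighted by how much of it survives after (n - j) bounces.
        force = sum(amp * math.sin(x - xj) * memory ** (n - j)
                    for j, xj in enumerate(xs))
        v = (1.0 - drag) * v + force
        xs.append(x + v)
    return v

with_memory = walk()        # wave feedback sustains a steady walking speed
no_memory = walk(amp=0.0)   # without its own wave field, the droplet stops
```

The point of the sketch is the contrast: remove the self-generated wave field and drag kills the motion; retain it, and the droplet settles into stable, self-propelled walking — an emergent entity made of nothing but substrate, drop, and feedback.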
Why had it never been observed before? And what, in the way we considered such phenomena, prevented us then – and somehow still prevents us now – from being able to explain what was going on?
The answer to this latter question is clearly crucial. For what was preventing an explanation was a rarely admitted, but universally applied, principle, termed the Principle of Plurality.
It is, certainly, this Principle which takes the driving forces of Reality to be the entirely separable and unchanging Laws of Nature, which are said to cause all observed phenomena by merely summing together, without any mutual transformations ever occurring. And, let’s be clear, it is the very basis of Analysis itself, where we attempt to find all the Natural Laws involved, and then explain the phenomenon solely in terms of those Laws – as unchanging components.
But, such a stance is almost never true!
And the alternative Principle of Holism assumes, on the contrary, that the direct opposite is always the case: indeed, “Everything always affects everything else and changes it, so that nothing is eternal!” Clearly, if this is so, our methodology has, for many, many centuries, been quite definitely pluralistic, and has certainly been misleading us, to some extent at least, in literally every single case; and it has survived, in spite of its inadequacies, by both simplifying and idealising all that we find.
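The difference between the two Principles can be made concrete in a few lines. Under Plurality, separately extracted "laws" merely sum; under Holism, they also transform one another. The two toy "laws", the coupling term, and all values below are hypothetical, invented purely to illustrate the contrast:

```python
def f1(x):
    return 2.0 * x      # a "law" extracted in an idealised, isolated context

def f2(x):
    return -0.5 * x     # a second "law", extracted separately

def plural_total(x):
    # Plurality: the laws contribute independently and merely sum.
    return f1(x) + f2(x)

def holist_total(x, coupling=0.3):
    # Holism (caricatured): the components also modify one another,
    # here via an assumed interaction term.
    return f1(x) + f2(x) + coupling * f1(x) * f2(x)
```

Wherever the coupling term is non-negligible, the plural prediction fails — which is exactly the situation the text claims holds, to some extent, in every real case.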
Indeed, as was clear from the outset of this paper, we just cannot investigate actual Reality-as-is, because it doesn’t behave in a directly explicable way. So, to make situations amenable, we subscribe to the Principle of Plurality as our basis for dealing with all complexity, and actually construct a local, organised and maintained situation that conforms to Plurality – and then study that. To achieve this, we isolate, filter and constrain a locality where the phenomenon we are interested in occurs, and optimise it to reveal just ONE of the involved, supposedly “separable”, components, while this “ideal” situation is maintained constantly throughout our investigations.
This idealisation works because, by trial and error, experimenters finally adjust the context until a single targeted component is acting almost alone, and so can be displayed, observed and quantified to deliver its own, idealised Natural Law. But the assumption that this seemingly eternal Law, extracted in that farmed environment, will remain exactly the same in all contexts is a myth.
With such a belief, it “became possible” to display and extract all the assumed-to-be, “unchanging Laws” contributing to a given real world situation, and analyse it into its “constituent Natural Laws”.
Clearly, we never actually crack any situation in totally unfettered Reality, but instead investigate, and indeed “crack”, a whole series of highly transformed and maintained idealised Domains. And this means that whatever we do find can only ever be applied in the very same artificial conditions in which it was discovered.
Two new Special Issues of the SHAPE Journal are now available on this subject by the physicist and philosopher Jim Schofield. The first is entitled The Substrate, and the second The Atom.