From the article:
"The world is not a logically consistent one, but a profoundly paradoxical one. Again, this is illustrated in science, which shows that two things can be one at the same time — light, for instance, acts as both a particle and a wave."
I don't understand the need for constantly interspersing an otherwise flowing article with references to science. It feels like the author thinks that the authenticity of his thoughts and personal journey hinges on validation from science.
It is fairly common in Zen publications to point out apparent exceptions to an otherwise binary (i.e. logical) system. It isn't seeking validation from science; it is invalidating science as the be-all and end-all of understanding our existence.
Outside of Zen (though very closely related), phenomenology deals with this subject/object paradox that our thoughts, language, and science revolve around, and with how it comes up short in describing some phenomena.
I re-read the article. Nowhere is it clear that there is even an attempt at invalidating science. He really does seem to be trying to use science to shore up his ideas. Another excerpt:
"This is the basis of Zen itself — that all life and existence is based on a kind of dynamic emptiness (a view now supported by modern science, which sees phenomena at a subatomic level popping in and out of existence in a ‘quantum froth’)."
I cringed every time he referenced Physics. It's something I've seen other philosophers do in an attempt to legitimize their beliefs by implying that they codify the natural order.
In truth, dynamic emptiness has absolutely nothing to do with quantum froth, which could have been described using any other terms and would still make sense. The notion that the model has some unseen physical manifestation arises from the terms a theoretician chose to identify elements of his mathematical model. They're essentially undefined terms, given meaning by their context in the larger physical system.
A lot of people get lost in the undefined terms, thinking that because we use a familiar word for the term, the item in the model which the term identifies must somehow actually exist. In fact, there's no proof that the quantum froth exists as a physical object, any more than there is that light is somehow both a wave and a particle. Rather, wave/particle "duality" is a consequence of different mathematical models which happen to generate results consistent with observation.
The "duality" doesn't actually exist, and the way light behaves is so incredibly nuanced that people dedicate their lives to modeling and testing it. Waveness and Particleness are models which produce mathematics consistent with the natural world.
In the cases where more than one mathematical model fits the same set of observations, scientific controversies arise. A physical theory is accepted when the mathematical model produces results consistent with observation and sufficient experimental work has been done to rule out other mathematical models with overlapping explanatory power.
My thermodynamics professor opened all of his classes by saying "Everything I am about to tell you is a carefully constructed series of lies."
Why not? Picking everything to pieces is exactly what those people (theoretical and practical scientists) who seem to have the most success in approximating truth do. These are the methods that best produce accurate predictions about the future and best produce designs for new technologies that actually work.
Beliefs are what we use to model how the world is and what the results of actions and experiments will be. Reality is what actually determines those results, regardless of what we might believe. Truth is then the set of beliefs about the world that accurately model reality. And I'll say it again, the scientists who spend decades picking things to pieces are the ones who end up with the most accurate beliefs. They find the most truth.
It's a correspondence definition. And as I understand it, correspondence has been one of the most popular theories of truth since Plato and Aristotle, continuing to the present.
A child who knows his toys well might find it tough to ignore liberties taken in discussing them.
As someone interested in mindfulness for its own pursuits, I want to learn what's in my teacher's head, rather than listen to them try unsuccessfully to tie that to what's firmly in mine. I'm sorry to say it's distracting :)
I find that pill very hard to swallow. It's just too self-serving. Any religious figure is going to try to convince people not to pay any attention to the "man behind the curtain."
He certainly does take a potshot at reductionism and science's prolific use of that thinking:
--- While it is refreshing that Zen philosophy is supported in many ways by present scientific knowledge, it is also a critique of scientific thought. The scientific tradition requires things to be cut up — both mentally and physically — into smaller and smaller pieces to investigate them. It suggests that the only kind of knowledge is empirical and that the rigid laws of scientific method are the only kind that are valid.
Zen implies that this is like throwing the baby out with the bathwater — scientific thinking might be immensely useful, but it also does violence to a meaningful conception of life. It tends to screen out the essential connectedness of things. We live in an imprecise world. Nature is extraordinarily vague. Science promotes the idea of hard, clear ‘brute facts’ — but some facts are soft. A ‘cutting-up’ attitude to life gives us dead knowledge, not live knowledge.---
There are spiritual observations that precede science. Saying that there isn't scientific evidence for something is simply stating that there isn't scientific evidence for it YET. And showing how science has met spiritual observations, down the road, lends credence to the validity of the earliest explorations of existence.
> Saying that there isn't scientific evidence for something is simply stating that there isn't scientific evidence for it YET.
It can also be a weasel-y way of avoiding mentioning that there is plenty of scientific evidence favouring the null hypothesis when tested against that something. E.g. mind-reading.
Is it now time for this ancient dichotomy to be embraced by computer science?
For the sake of argument, if the principles of OOP are: dynamic dispatch; abstraction; subtype polymorphism; and inheritance, what might be the equivalent for Subject-Oriented Programming?
In my view, a characteristic of SOP must be that data is personal and unique. Every single usage of data specifies a new unique identifier. To read is to interpret, which is to record a new ID. Here, identity (x==y) gets broken, so another operator is needed (x~=y).
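As a toy sketch of what such "subjective" data might look like in code (the names SubjectiveValue, read, and similar are invented for illustration, not an established API):

```python
import itertools

# Toy sketch: every read of a value is a fresh "interpretation" with its
# own unique identifier, so plain identity (x == y) is deliberately broken.
_ids = itertools.count()

class SubjectiveValue:
    def __init__(self, payload):
        self.payload = payload
        self.id = next(_ids)  # a new unique ID for this interpretation

    def read(self):
        # To read is to interpret: reading records a *new* identifier.
        return SubjectiveValue(self.payload)

    def __eq__(self, other):
        # Identity compares interpretations, not payloads.
        return isinstance(other, SubjectiveValue) and self.id == other.id

    def similar(self, other):
        # The x~=y operator from above: payload-level resemblance.
        return isinstance(other, SubjectiveValue) and self.payload == other.payload

x = SubjectiveValue("sunset")
y = x.read()
print(x == y)        # False: each reading is a distinct interpretation
print(x.similar(y))  # True: but the two interpretations resemble each other
```

Whether such a scheme is workable at scale is another question; the point is just that equality and identity come apart by design.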
Somewhat tangential to your point about implementing a so-called SOP: I think the discussion of state, as it relates to OOP or otherwise, can lead to very philosophical conversations. See here: http://clojure.org/state
As an ex-philosophy major, immutable state and its implications for identity interest me greatly. I feel like this, and other ideas from FP, could be considered a step toward a so-called Subject-Object Programming paradigm.
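A rough Python sketch of the Clojure-style view of state (an identity is a reference to a succession of immutable values over time; the names Atom, deref, and swap mirror Clojure's, but this is only an illustration, not Clojure's actual implementation):

```python
import threading

class Atom:
    """An identity: a mutable reference to a succession of immutable values."""

    def __init__(self, value):
        self._value = value
        self._lock = threading.Lock()

    def deref(self):
        # The value the identity refers to *now*; the value itself never mutates.
        return self._value

    def swap(self, f, *args):
        # Replace the current immutable value with a new one derived from it.
        with self._lock:
            self._value = f(self._value, *args)
            return self._value

scores = Atom((1, 2, 3))               # an immutable tuple as the value
old = scores.deref()
scores.swap(lambda t, x: t + (x,), 4)  # build a new value; never mutate old one
print(old)             # (1, 2, 3): the old value is untouched
print(scores.deref())  # (1, 2, 3, 4): the identity now refers to a new value
```

The philosophical payoff is that "the same thing changing" is re-described as one identity pointing to a series of distinct, unchanging values.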
It actually feels un-Zen-like for him to obsess over that. It sort of illustrates his example of dukkha. On one hand he tells us that Zen rejects the need for logical consistency. On the other hand, he keeps using science as validation. Even if there are paradoxes or contradictions in science, scientists are constantly working to resolve them because they don't accept the idea of an inconsistent world. His constant return to science is an example of dukkha. Just let it go, man...
I think the author does this not for his own validation, but to cement the connections in the minds of Western readers, who are raised with a strong basis in science and logic.
The text promotes keeping an open mind and being humble. Often, the exceptions to a common rule are dismissed as minor and without effect, which allows us to keep our certainties. The light example in this context is well chosen: it is fundamental, factual and therefore cannot be dismissed easily. The scientific aspect is secondary to those advantages.
> The light example in this context is well chosen: it is fundamental, factual and therefore cannot be dismissed easily.
I'm curious, have you actually studied quantum mechanics? Because the impression I get from people who have is that the "light is a particle and a wave" thing is pretty much bullshit. (Which isn't the same thing as false, but more like "not actually a thing that you tell people if you want them to have a better understanding of the nature of reality".) But I haven't studied it myself.
I'll chime in here as someone who studied to become a physicist: it's not bullshit; the truth about the nature of matter is too complex (and weird) for someone without a deep understanding of mathematics to understand. Particle/wave duality is a useful abstraction that relates the math to something humans can grasp intuitively.
You are not refuting the GP, you are just redefining bullshit.
Yes, the wave/particle duality is a nice abstraction we can apply to QM so that we can extract some results from it intuitively (or with just a few calculations). Besides, it's a concept with deep historical significance. Even if it wasn't a good abstraction, it would still be important.
But the duality is also something you can wave in front of people who haven't studied QM to make sure they don't question your arguments. Doing that makes the argument harder for them to understand, while sounding like an explanation (that is, something that should make it easier to understand).
"Bullshit" is a common name for that second kind of usage.
The (thought) experiment in quantum mechanics that demonstrates that light is neither a classical particle nor a classical wave is the double-slit experiment: careful analysis shows that the observations are consistent neither with the behavior of a classical particle nor with that of a classical (say, water) wave.

Instead, the state of the system (in the Copenhagen interpretation) is described by a probability amplitude ψ (if it is a "pure" state) or, more generally, by a "density matrix" ρ, and there is a semi-rigorous prescription for translating classical notions like position, momentum, and energy into "observables" of the system. The expectation value of an observable A for a system in state ρ is then given by Trace(ρA). If you have taken a course in linear algebra, this is the same trace you met there, and there are simple quantum mechanical systems that can be described by two-by-two matrices (the spin of an electron going through a sequence of inhomogeneous magnetic fields). Once you have introduced probability amplitudes, you can reason about them as if they were classical particles, within certain limits; you find such reasoning in the Wikipedia article on the double-slit experiment, for example.
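A minimal numerical illustration of the Trace(ρA) prescription for a spin-1/2 system, using plain two-by-two matrices (real entries suffice for these particular states; the helper names matmul2 and trace2 are mine):

```python
# Expectation value <A> = Trace(rho * A) for a spin-1/2 system.

def matmul2(a, b):
    # Product of two 2x2 matrices represented as nested lists.
    return [[sum(a[i][k] * b[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

def trace2(m):
    # Trace of a 2x2 matrix: sum of diagonal entries.
    return m[0][0] + m[1][1]

# Pauli matrix sigma_z: the observable "spin along z" (in units of hbar/2).
sigma_z = [[1, 0], [0, -1]]

# Pure state spin-up along x: density matrix rho = |+x><+x|.
rho_plus_x = [[0.5, 0.5], [0.5, 0.5]]

# Pure state spin-up along z: density matrix rho = |0><0|.
rho_up = [[1, 0], [0, 0]]

print(trace2(matmul2(rho_plus_x, sigma_z)))  # 0.0: x-polarized spin has <sigma_z> = 0
print(trace2(matmul2(rho_up, sigma_z)))      # 1: definite spin-up gives <sigma_z> = +1
```

The same Trace(ρA) formula works unchanged for mixed states, which is one reason the density-matrix formulation is preferred over bare amplitudes.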
The crucial idea behind all of this is "quantization": physicists have worked out a way of obtaining quantum systems from classical systems, and in particular a mapping of familiar classical observables to quantum mechanical observables.
One insight of Feynman was that this can be done via the so-called Lagrangian formulation of classical physics. That formulation assigns to each configuration of a system a quantity called the Lagrangian; given two possible states of the system and a path in state space between them, integrating the Lagrangian along that path gives an "action" for that possibility, and the theory postulates that the path of stationary (typically minimal) action is the one taken. With appropriate choices of Lagrangian this correctly predicts the behavior of all classical mechanical systems, and also Maxwell's equations.

Feynman instead postulated that to obtain the probability amplitude for a system to transition between two states, one has to sum over all possible paths in the classical state space, weighting each path by exp(iS/ħ), where the action S is the result of integrating the Lagrangian along that path and ħ = h/2π is the reduced Planck constant. Then, by analogy with a result of 19th-century analysis (http://en.wikipedia.org/wiki/Method_of_steepest_descent), the sum is dominated by contributions close to the critical (classical) path, so that for S much larger than ħ (which is true for classical mechanics) we recover the original Lagrangian formulation.
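In symbols, the two postulates sketched above can be written as follows (with ħ = h/2π, and a one-dimensional trajectory q(t) for concreteness):

```latex
% Classical action of a path q(t) between fixed endpoints:
S[q] = \int_{t_a}^{t_b} L\bigl(q(t), \dot q(t)\bigr)\, dt

% Feynman's postulate: the transition amplitude sums over all paths,
% each weighted by a pure phase determined by its action:
K(b, a) \;\propto\; \sum_{\text{paths } q\,:\, a \to b} e^{\, i S[q] / \hbar}

% For S \gg \hbar the phases of neighboring paths cancel except near the
% stationary path, recovering the classical principle of least action.
```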
So to summarize: physicists use the intuition gained from classical physical systems to study quantum mechanical systems, but not in the naive way of "light is both a particle and a wave", which is at best misleading. What constitutes a quantum system is largely a question of length or energy scales. The method of path-integral quantization suggests a very concrete way to quantize a classical physical system.
One should take into account the fact that Watts was presenting his material to an audience of western intellectuals who had been so deeply indoctrinated into "scientific orthodoxy" that one really had no chance of getting anything across if one did not choose to invoke science. I don't see this as any different than the Buddha choosing to give teachings using concepts familiar to the brahmins of his time.