> The light example in this context is well chosen: it is fundamental and factual, and therefore cannot be dismissed easily.
I'm curious, have you actually studied quantum mechanics? Because the impression I get from people who have is that the "light is a particle and a wave" thing is pretty much bullshit. (Which isn't the same thing as false, but more like "not actually a thing that you tell people if you want them to have a better understanding of the nature of reality".) But I haven't studied it myself.
I'll chime in here as someone who studied to become a physicist: it's not bullshit; the truth about the nature of matter is too complex (and weird) for someone without a deep understanding of mathematics to understand. Particle/wave duality is a useful abstraction that relates the math to something humans can grasp intuitively.
You are not refuting the GP, you are just redefining bullshit.
Yes, the wave/particle duality is a nice abstraction we can apply to QM so that we can extract some results from it intuitively (or with just a few calculations). Besides, it's a concept with deep historical significance. Even if it wasn't a good abstraction, it would still be important.
But the duality is also something you can wave in front of people who didn't study QM to make sure they don't question your arguments. Used that way, it makes the argument harder for them to understand while sounding like an explanation (that is, something that should make things easier to understand).
"Bullshit" is a common name for that second kind of usage.
The (thought) experiment in quantum mechanics that demonstrates that light is neither a (classical) particle nor a wave is the double slit experiment: careful analysis shows that the observations are consistent neither with the behavior of a classical particle nor with that of a classical (say, water) wave. Instead, the state of the system (in the Copenhagen interpretation) is described by a probability amplitude ψ (if it is a "pure" state) or, more generally, by a "density matrix" ρ, and there is a semi-rigorous prescription for translating classical notions like position, momentum and energy into "observables" of the system. The expectation value of an observable A in a particular state ρ is then given by Trace(ρA). If you have taken a course in linear algebra, this is the same trace you met there, and there are simple quantum mechanical systems that can be described by two-by-two matrices (the spin of an electron passing through a sequence of inhomogeneous magnetic fields). Once you have introduced probability amplitudes, you can reason about them as if they were classical particles, within certain limits; you can find such reasoning in the Wikipedia article on the double slit experiment, for example.
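To make the Trace(ρA) prescription concrete, here is a minimal numerical sketch for the two-by-two (spin-1/2) case mentioned above, with units chosen so that ħ = 1; the variable names are my own, not standard notation:

```python
import numpy as np

# Pauli z-matrix; the observable "spin along z" is Sz = (hbar/2) * sigma_z.
# Units chosen so that hbar = 1.
sigma_z = np.array([[1, 0], [0, -1]], dtype=complex)
Sz = 0.5 * sigma_z

# A pure spin-up state, written as a density matrix rho = |psi><psi|.
up = np.array([[1], [0]], dtype=complex)
rho_pure = up @ up.conj().T

# A maximally mixed state: a classical 50/50 mixture of up and down.
rho_mixed = 0.5 * np.eye(2, dtype=complex)

def expectation(rho, A):
    """Expectation value of observable A in state rho: <A> = Trace(rho A)."""
    return np.trace(rho @ A).real

print(expectation(rho_pure, Sz))   # 0.5: the state is definitely spin-up
print(expectation(rho_mixed, Sz))  # 0.0: no preferred spin direction
```

The same `Trace(ρA)` formula covers both cases uniformly, which is one reason the density matrix is the more general description of a state.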
The crucial idea behind all of this is "quantization": physicists have worked out ways of obtaining quantum systems from classical systems, and in particular a mapping of familiar observables to quantum mechanical observables.
One insight of Feynman was that this can be done via the so-called Lagrangian formulation of classical physics. It postulates that the future state of a system can be predicted from its current state by assigning to each possible evolution a so-called Lagrange density. Given two possible states of the system and a path in state space between them, integrating the Lagrange density along that path gives an "action" for that possibility; the theory postulates that the path with stationary (typically minimal) action is the one realized. With appropriate choices of Lagrange density this correctly predicts the behavior of all classical mechanical systems, and also Maxwell's equations. Feynman instead postulated that to obtain the probability amplitude for a system to transition between two states, one has to sum over all possible paths in the classical state space, weighting each with exp(iS/ℏ), where the action S is the result of integrating the Lagrange density along that path and ℏ = h/2π is the reduced Planck constant. Then, by analogy with a result of 19th century analysis (http://en.wikipedia.org/wiki/Method_of_steepest_descent), the sum is dominated by contributions close to the stationary (classical) path, so that for S much larger than ℏ (which is the case in classical mechanics) we recover the original Lagrangian formulation.
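Written out in symbols (with ℏ = h/2π), the two prescriptions side by side are:

```latex
% Classical: the action of a path q(t), and the stationarity condition
% that singles out the classical path q_cl.
S[q] = \int_{t_a}^{t_b} L\bigl(q(t), \dot q(t)\bigr)\, dt,
\qquad \delta S[q_{\mathrm{cl}}] = 0

% Feynman: the transition amplitude sums over all paths, each weighted
% by a pure phase.
K(b, a) = \sum_{\text{paths } q} e^{\, i S[q]/\hbar},
\qquad \hbar = \frac{h}{2\pi}

% For S >> hbar the phases of paths far from q_cl oscillate rapidly and
% cancel (stationary phase), so only paths near q_cl contribute and
% classical mechanics is recovered.
```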
So, to summarize: physicists use the intuition gained from classical physical systems to study quantum mechanical systems, but not in the naive way of "light is both a particle and a wave", which is at best misleading. Whether a system must be treated quantum mechanically is largely a question of length or energy scales. The method of path integral quantization suggests a very concrete way to quantize a classical physical system.