How does science make progress? In school we all learned about the "scientific method": data, hypothesis, experiment, new hypothesis...resulting in incremental improvement to our models of the world. When it comes to fundamental physics, however, this paradigm is rather inadequate, because it gives the impression that hypotheses are arbitrary, unconstrained, and concocted as needed to fit new data. Nothing could be farther from the truth.
Hypotheses in fundamental physics take the form of mathematical theories of the underlying structure of the universe, and mathematics is neither arbitrary nor unconstrained. Only certain mathematical structures exist, and our theories must be built using these. New structures can be discovered, of course, but the latitude for constructing them is tightly limited by the requirement of logical consistency. For this reason, very sweeping hypotheses may often be put forth on the basis of little new data, or even none at all, simply by investigating the mathematical consequences of our existing theories and fixing purely mathematical flaws.
In this case theorizing proceeds, not inductively, by gathering more data and seeking models to fit it, but rather deductively, by seeking some new mathematical structures which can resolve problems in the existing framework. Often enough there is only one mathematical structure which can achieve this. Of course it must be validated by experiments before we believe it, but if we find a complete, mathematically coherent hypothesis, chances are good that it is correct, because such hypotheses are not common. One or two key experiments may be all it takes to convince the community of a new theory when no mathematically compelling rival has been found.
It is not too much of an exaggeration to say that all of modern physics was born in this fashion. In the early to mid 19th century, Michael Faraday and James Clerk Maxwell had introduced a major new mathematical concept to the world, the "field", and had argued convincingly that the phenomena of light, electricity, and magnetism could all be unified in a new theory based on this new concept. The new theory was called Electromagnetism, and was the first major advance in physics since Newton's laws. The new theory scored success after success, but after several decades some clear thinkers began to notice that the field concept contained certain inherent difficulties. These were Lord Kelvin's famous "two small clouds" on the horizon of physics, and they would grow, respectively, into the revolutionary storms of Relativity and Quantum Mechanics.
To describe these problems let's back up and review physics as it was before the advent of the field. Before fields, there were particles. Particles were discrete bundles of matter, not subject to further analysis, and had a definite location at each moment in time. They exerted forces on each other by instantaneous action-at-a-distance (Newton's law of gravity, and the similar laws of electric charge attraction and repulsion).
The phenomenon of light, however, is very difficult to understand with a particle model. Its diffraction, refraction, and interference behaviors can only be explained by assuming light is a wave. But a wave of what? Something has to be "waving", just like the water whose up-and-down movement constitutes water waves; and that something is the newly invented concept of the field.
A field, unlike a particle, exists everywhere. In every nook and cranny of space, at all times, within and without any other matter, the field is there. It is somewhat analogous to temperature and pressure in the Earth's atmosphere; for every point in the space above the Earth, there is a temperature number and a pressure number. Likewise, a field is described by a certain set of numeric values at every point in space and time (for electromagnetism, there are six values). The larger the values, and the more rapidly they are changing, the more energy the field contains at a particular location. A disturbance at one location, like a pebble dropped in water, spreads by waves into the surrounding space.
We can't go further into the physics of fields and waves here, but the important point to grasp is that the field is a new kind of mathematical beast. Particles are defined by a location; fields are defined by a value at every possible location.
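To make the contrast concrete, here is a toy sketch in Python (all names, types, and values are invented for illustration): a particle's state is a single location, while a field must be able to supply a set of values at any point you care to ask about.

```python
# Toy contrast between the two kinds of mathematical object.
from dataclasses import dataclass

@dataclass
class Particle:
    x: float
    y: float
    z: float          # a particle is fully described by one location

def em_field(x: float, y: float, z: float):
    """A field assigns six numbers (E and B components) to EVERY point.
    This toy returns a static, made-up configuration."""
    return (0.0, 0.0, 1.0,   # electric-field components (arbitrary values)
            0.0, 1.0, 0.0)   # magnetic-field components (arbitrary values)

p = Particle(1.0, 2.0, 3.0)        # three numbers, and we are done
values = em_field(0.5, -1.0, 2.0)  # the field has values even where nothing "is"
print(len(values))  # 6
```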
Every possible location is a lot of locations, and therein lies the first, and most crucial, problem with fields: they have too much energy-storage capacity. You can always pack more energy into a given little region just by making the field fluctuate more rapidly in that region. This, unfortunately, makes it impossible to cook food! An oven works by heating up the surroundings of the food, so that heat is transferred to the food. The surroundings of the food include, of course, the electromagnetic field, so this must be heated up. But no matter how much energy you pump into the field in the oven, there is always room for more - there are always higher-frequency modes of fluctuation which are not yet filled. The field, therefore, can never be heated to any temperature; both the oven and the food in it would see all of their energy sucked away by the field, making them colder than they started (indeed, taking them to absolute zero).
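The arithmetic behind the oven paradox can be put into numbers. The sketch below uses the classical Rayleigh-Jeans expression for the field's energy per unit volume per unit frequency (a standard result of classical electromagnetism, not derived in this post; the oven temperature is chosen arbitrarily) and shows that the energy stored below a frequency cutoff grows without limit: each doubling of the cutoff multiplies the total by roughly eight.

```python
import math

# Rayleigh-Jeans spectral energy density (classical field theory):
#   u(nu) = 8*pi*nu^2 * k*T / c^3
# It grows like nu^2 forever - there is always "room for more" energy.
k = 1.380649e-23   # Boltzmann constant, J/K
c = 2.99792458e8   # speed of light, m/s
T = 500.0          # oven temperature, K (arbitrary choice)

def rayleigh_jeans(nu):
    return 8 * math.pi * nu**2 * k * T / c**3

def energy_up_to(nu_max, steps=100_000):
    """Midpoint-rule integral of u(nu) from 0 to nu_max (J per m^3)."""
    dnu = nu_max / steps
    return sum(rayleigh_jeans((i + 0.5) * dnu) for i in range(steps)) * dnu

# Doubling the frequency cutoff multiplies the stored energy by ~8,
# so with no cutoff the field's capacity is bottomless.
e1 = energy_up_to(1e14)
e2 = energy_up_to(2e14)
print(e2 / e1)  # ~8.0
```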
This paradox of fields was known for technical reasons as the "ultraviolet catastrophe", and it shows quite starkly that a classical field theory such as electromagnetism cannot be a fundamental theory of nature. No matter how well it seems to match many experiments, it is not mathematically possible for it to truly represent a universe in which any structure, e.g. life, could exist.
Max Planck was obsessed with this problem and, in perhaps the most remarkable bout of theorizing in the history of physics, he concocted a mathematical formula to resolve the oven problem, and a profoundly non-intuitive mechanism to underlie it. Planck's formula was ad hoc and just the tip of the iceberg - the first glimpse of a new, consistent mathematical structure which mostly contains the old field theory and resolves its problem of "too muchness". The new structure, called Quantum Field Theory (QFT), was created in the 1930's and its profound mathematical depths are being plumbed to this day.
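For the curious, Planck's formula can be compared directly with the classical one. The sketch below (standard SI constants; the temperature and sample frequencies are arbitrary choices) shows that Planck's expression agrees with the classical value at low frequencies but shuts the high-frequency modes off exponentially - which is exactly what rescues the oven.

```python
import math

# Planck's spectral energy density:
#   u(nu) = (8*pi*h*nu^3 / c^3) / (exp(h*nu/(k*T)) - 1)
h = 6.62607015e-34   # Planck constant, J*s
k = 1.380649e-23     # Boltzmann constant, J/K
c = 2.99792458e8     # speed of light, m/s
T = 500.0            # temperature, K (arbitrary choice)

def planck(nu):
    return (8 * math.pi * h * nu**3 / c**3) / math.expm1(h * nu / (k * T))

def rayleigh_jeans(nu):
    return 8 * math.pi * nu**2 * k * T / c**3

# Low frequency: the two formulas agree (the classical limit).
print(planck(1e9) / rayleigh_jeans(1e9))    # ~1.0

# High frequency: Planck's exponential cutoff kills the catastrophe.
print(planck(1e15) / rayleigh_jeans(1e15))  # vanishingly small
```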
Between Planck's discovery, in 1900, and the advent of QFT in the 1930's, physicists were engaged in working out a preliminary stage of this theory, namely Quantum Mechanics. Quantum Mechanics is a theory of particles, not fields, and this has obscured the fact that it came into existence to resolve a problem with fields. The universe could be made of classical, Newtonian particles; or, it could be made of Quantum particles; but it cannot be made of classical, Faraday/Maxwell fields.
Classical field theories cannot underlie a real universe because of the oven problem, and so far no way has been found to resolve this outside of the Quantum. It appears that the "purpose" of the Quantum is to make field theories mathematically possible.
Thus Quantum physics arose out of the mathematical difficulties of fields. The Theory of Relativity also arose from the mathematics of fields, not as a problem but rather a very unexpected mathematical consequence.
Recall that energy propagates through a field by waves, just like the water waves when a pebble drops. So what? Well, the funny thing about waves is that they have a predetermined speed. You can't push on water waves to make them go any faster; any kind of splashing or pushing you do just makes more waves, but the new waves move at the same, predetermined speed as the old ones. This is completely different from particles, which move faster if you push them harder.
Now let's imagine that everything in the universe is described by a field of one kind or another (which, in fact, is believed to be the case). Imagine an object, for example a wristwatch, which consists of various different parts. These parts have to communicate with each other in order for the watch to work. The communication happens by waves of the fields, and these waves move at a certain speed. Now here's the kicker: what if the watch itself is moving at a speed close to the wave speed? Then the waves emitted from the parts behind are going to have an awfully hard time "catching up" to the parts ahead. This moving watch is very unlikely to tick at the same rate as a stationary watch; indeed, when we look at it this way it seems surprising that it can keep working at all.
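The "catching up" effect can be put into numbers with a deliberately naive calculation (units chosen so the wave speed is 1; this toy ignores length contraction, which the full theory adds, so it overshoots the true slowdown):

```python
# A toy version of the "moving watch" argument (made-up units, c = 1).
# A signal travels from the back part of the watch to the front and returns.
# When the watch moves at speed v, the outbound wave must chase the receding
# front part, and that delay is not fully repaid on the return leg.
c = 1.0   # wave speed, fixed by the field - pushing harder doesn't change it
L = 1.0   # separation between the communicating parts
v = 0.6   # speed of the watch, as a fraction of the wave speed

t_rest   = 2 * L / c                  # round trip when standing still
t_moving = L / (c - v) + L / (c + v)  # chasing leg + return leg

print(t_moving / t_rest)  # 1/(1 - v**2/c**2), i.e. ~1.5625 for v = 0.6
```

The naive round trip is longer by the factor 1/(1 - v²/c²); Einstein's full analysis, with lengths contracting too, gives the smaller factor 1/√(1 - v²/c²). But the qualitative conclusion already stands: the moving watch's internal signaling, and hence its ticking, slows down.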
Einstein thought very hard about this problem, albeit from a somewhat different angle, and the result is his famous Theory of Relativity, in which moving clocks run slow, moving objects shrink, and matter equates to energy. I will fill in more of the logical steps here in a later blog, but the point to take away is that things built from fields act funny when they move, because waves travel with a fixed speed. Depending on what the fields are like exactly, moving things can act funny in a simple way or in arbitrarily complex ways. Einstein's hypothesis is that they act funny in the simplest possible way. His theory is often regarded as a theory "about space and time", but I think it is more correct to regard it as a theory about the behavior of moving matter; however, this discussion must wait for a later blog.
In closing let me note that the problems and mathematical developments brought about by the field concept are far from finished. It turns out that most Quantum Field Theories still suffer from the problem of "too muchness". In Quantum Physics, particles (i.e., local field fluctuations) can pop into existence temporarily from nothing, and if there are too many possible modes of fluctuation (roughly speaking) the theory doesn't make mathematical sense. This appears to be the case for any possible Quantum theory of gravity, so that gravity cannot coexist with the theories we have now for other types of matter. Something beyond a QFT is needed - and so far the only compelling candidate is String Theory.
Therefore, with some exaggeration, we can say that all of modern fundamental physics, from Relativity to Quantum Physics to String Theory, was implicit in the purely mathematical difficulties which arise from the field hypothesis. Had all scientific experimentation stopped in 1850, it is quite possible that all of modern physics would still have been discovered by mathematicians, and that they would have become convinced of its truth based on consistency alone, and lack of any other discoverable alternatives.
A field is a theoretical construct. It may not always correspond perfectly to real phenomena. For example, the temperature gradient in a piece of metal or the velocity of a fluid can be described by the use of fields, but the implicit assumption is of a continuous and arbitrarily divisible substance. And this assumption is not true, not for real materials made of atoms. For many situations the difference is not important; but for some it is.
Ovens that work by convection do not depend on radiation, and will therefore work quite well regardless of the Rayleigh breakdown problem. In fact, the classical descriptions of radiant heat transfer work pretty well in predicting the function of an oven. At least they did in 1991 when I designed a 40 kW drying oven as a thesis project and used classical radiant heat transfer models to predict the performance.
As for mathematical constructs in physics: I think one will find that Schroedinger's equation is descriptive, not prescriptive. And Einstein's special theory of relativity is simply a body of reasoning applied to the axiom that the speed of light is a fixed constant. In both cases the math is descriptive.
Sometimes it is helpful to advance the mathematical tools. Sometimes it is helpful to make new and better observations. But I think it is most helpful to think of science as working toward a correspondence between a model and the real world.
Thanks for your comments. You could be right about the ontological status of fields, etc.
However, it is a matter of historical record that both Quantum Physics and Relativity arose from people pondering purely mathematical difficulties with fields.
Planck was trying to understand how thermal equilibrium could be possible given a field with unlimited high-frequency modes, while Einstein was trying to figure out how Maxwell's equations could still hold true for someone moving at light speed (of course, they can't).
Neither Planck nor Einstein made use of any special observational data in their great works. Mathematics must be internally consistent, and when it isn't, one can be sure that it is not yet the right mathematics.