# THE MICHELSON-MORLEY EXPERIMENT

The prevalent belief in the nineteenth century was that light is a wave, carried by a subtle medium, the aether, which is at rest in the universe. On this view, the sun is at rest at the center of the universe, and the earth moves through the aether, around the sun, at about 30 km/sec.

The Michelson-Morley (M&M) experiment was designed to test this belief. If light is sent back and forth on earth in the direction of the earth's movement, the round trip should take longer than it would if there were no aether. Light sent back and forth perpendicular to the earth's movement should also take a little longer, though not as much as in the parallel direction. The M&M experiment therefore used two identical rods, perpendicular to each other, along which the light moved back and forth.
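The expected timing difference can be sketched numerically. In the classical aether picture, with arm length L and aether-wind speed v, the parallel round trip takes L/(c−v) + L/(c+v) and the perpendicular one takes 2L/√(c²−v²). A minimal sketch, in which the 11-metre arm length is an assumption chosen to roughly match the 1887 apparatus:

```python
import math

c = 299_792_458.0   # speed of light, m/s
v = 30_000.0        # earth's orbital speed, m/s (about 30 km/sec)
L = 11.0            # arm length, m (assumed; roughly the 1887 path length)

# Round trip parallel to the motion: out against the wind, back with it.
t_parallel = L / (c - v) + L / (c + v)          # = 2*L*c / (c**2 - v**2)

# Round trip perpendicular to the motion (classical aether picture).
t_perpendicular = 2 * L / math.sqrt(c**2 - v**2)

print(t_parallel > t_perpendicular)   # True: the parallel trip takes slightly longer
print(t_parallel / t_perpendicular)   # ratio equals 1 / sqrt(1 - (v/c)**2)
```

The ratio of the two times is exactly the factor the experiment was designed to detect.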

This experiment, first performed in 1881, repeated in 1887, and often thereafter, could find no indication of a difference. But the belief could not be shaken, and Hendrik Lorentz, in 1895 and again in 1904, proposed that the rod aligned with the direction of the earth's movement might contract, due to that movement, just enough to make the round-trip time equal to what it would be if there were no movement. The equations he developed are known as the Lorentz transformation (LT).

To understand the challenge and the significance of the M&M experiment, and to explain the mathematical error, we can look at an analogous situation. The velocity of a plane is measured with respect to the air stream through which it moves. If we fly from, say, San Francisco to New York and back, and the air is still, the trip takes six hours each way, a total of 12 hours, at 500 miles per hour. If the air is moving at 100 miles per hour from west to east, and the plane flies at 500 miles per hour with respect to the air stream, the distance of about 3000 miles is covered in five hours (at 500+100 miles per hour) and the return trip takes 7.5 hours (at 500-100 miles per hour). So the total time is not 12 hours but 12.5 hours! The gain and the loss, due to the movement of the air, don't quite balance out. Analogously, that would be the situation if the velocity of light were measured with respect to the aether while the earth moves through the aether.
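The arithmetic of the analogy can be checked directly, using the distances and speeds given in the text:

```python
distance = 3000.0   # miles, SF to NY (as in the text)
plane = 500.0       # airspeed, mph
wind = 100.0        # jet stream speed, mph, west to east

# Still air: the same speed over the ground in both directions.
still_air = 2 * distance / plane

# With the jet stream: faster going east, slower coming back.
with_wind = distance / (plane + wind) + distance / (plane - wind)

print(still_air)   # 12.0
print(with_wind)   # 12.5 -- the gain and loss do not balance out
```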

It is easy to show that if the round trip distance is reduced by 240 miles, that is, the distance between SF and NY by 120 miles (one half of this amount), it takes 4.8 hours going and 7.2 hours returning, and the total time will be 12 hours - the same as it would be if there were no jet stream. We cannot reduce the distance between NY and SF by the square root of 240, about 16 miles, since that would not be enough to bring the time back to 12 hours.
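The required shortening can be verified the same way, with the 120-mile figure from the text:

```python
plane, wind = 500.0, 100.0
shortened = 3000.0 - 120.0      # one-way distance reduced by 120 miles

going = shortened / (plane + wind)       # with the jet stream
returning = shortened / (plane - wind)   # against the jet stream
total = going + returning

print(going)       # 4.8 hours
print(returning)   # 7.2 hours
print(total)       # 12.0 -- same as with no jet stream
```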

But there is a subtle point that must be noted. If we want to deal with a single direct path that is in some sense the ‘average' of coming and going, we could, mathematically, take the square root of the total path reduction that is needed. This produces the Lorentz factor (often called the ‘gamma' factor), which appears in the denominator of many important equations in particle physics as well as in astronomy. It applies only to a fictitious one-way path that has no basis in reality.
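The algebraic step at issue can be reproduced numerically. With speed ratio β = v/c, the round trip along the wind is longer by a factor 1/(1 − β²); taking the square root of that factor yields the familiar gamma factor. A sketch of the arithmetic, using the 100/500 speed ratio of the plane analogy:

```python
import math

beta = 100.0 / 500.0                          # wind speed / plane speed = 0.2

round_trip_factor = 1.0 / (1.0 - beta**2)     # excess time factor for the round trip
gamma = math.sqrt(round_trip_factor)          # square root: the 'gamma' factor

print(round_trip_factor)   # 1.0416..., i.e. 12.5 hours instead of 12
print(gamma)               # 1.0206..., applied to a single one-way leg
```

Note that `round_trip_factor` is exactly 12.5/12, the ratio obtained in the plane example above.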

The problem with this step lies in the relationship between mathematics and physics:

Taking a square root, that is, a geometric mean, is appropriate ONLY if a second factor follows and acts on a first factor, that is to say, if the factors multiply. For example, in the case of the Doppler factor, a stretch of a time interval on one leg of a journey is followed by a second stretch on the return path; the Doppler factors (not the Doppler shifts) multiply (see the chapter on Doppler). Similarly, if I stretch a rubber band by a factor of 2, and then do a second stretch, by a factor of 8, the total is 2x8 = 16 (and not 2 + 8 = 10). The ‘average', in the sense of a geometric mean, is then the square root of 16, or 4 (since 4x4=16). In the case of addition we would not take the square root of 10 to get the average of 2 + 8.

But that, in essence, is what Lorentz did! Apparently he did not realize that time of travel, unlike the Doppler factor, does not involve stretching or shrinking. The shorter time on one leg of a journey is not further modified on the return leg. We simply get a longer time of transit, and the two times add - they do not multiply. There is no average airflow and no average time that makes sense.
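The distinction between the two kinds of ‘average' is easy to state in code, using the rubber-band numbers from the text:

```python
import math

a, b = 2.0, 8.0   # two successive stretch factors

# Multiplicative process: the factors compound, so the geometric mean applies.
total_stretch = a * b                  # 16
geometric_mean = math.sqrt(a * b)      # 4, since 4 x 4 = 16

# Additive process: the quantities add, so the arithmetic mean applies.
total_time = a + b                     # 10
arithmetic_mean = (a + b) / 2          # 5, not the square root of 10

print(geometric_mean, arithmetic_mean)   # 4.0 5.0
```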

Simply put, the mathematics of the Lorentz transformation, in its last step of taking the square root as pertaining to the time for one leg of the journey, is pure fantasy, with no connection to reality: it ignores the physics of two-way travel. There can be no physical significance to the square root, or to any concepts or quantities containing this factor.

Einstein maintained that he developed the Lorentz transformation independently. He does not cite Lorentz in his 1905 paper. It is most curious, therefore, that he made the same mathematical mistake of taking the square root at the same point in the development of the transform as his predecessor.

The defect in the LT does not show up in down-to-earth experiments because there we are dealing with velocities one hundred thousand to one million times smaller than the velocity of light. The defect becomes readily apparent only when the velocity is about one tenth the velocity of light, or greater. This happens in cosmology and in particle physics, and it requires a rethinking of much of 20th-century physical theory. In cosmology, we encounter large redshifts, especially with Type Ia supernovae. Using a revised formula for the Doppler factor, we are led to a radically different view of the origin and destiny of the universe.
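The scale of the effect can be illustrated by evaluating the factor √(1 − β²) at everyday and at relativistic speeds, where β = v/c. The sample speeds below are illustrative choices, not values from the text:

```python
import math

c = 299_792_458.0   # speed of light, m/s

for v, label in [(30.0, "highway car"),
                 (30_000.0, "earth's orbital speed"),
                 (0.1 * c, "one tenth of c")]:
    beta = v / c
    factor = math.sqrt(1.0 - beta**2)
    # The deviation from 1 is negligible for the first two speeds
    # and becomes appreciable (about half a percent) at 0.1 c.
    print(f"{label}: deviation from 1 is {1.0 - factor:.3e}")
```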