xXIHAYDOIXx
My friend and I have debated this for some time and, because the vast majority of the scientific community backs him, I presume he is correct, but I still do not understand why. My knowledge of relativity is far from extensive, but I will try not to make a complete fool of myself.
From what I understand, much of the theory of time dilation rests on the model of, and experiments concerning, the light clock: a clock whose interval of time equals the time it takes a beam of light to strike one photosensitive plate, "bounce" off, and strike another. It is relatively simple to understand how this clock is affected by time dilation. If an astronaut had such a clock aboard his stationary spaceship, it would keep time equally well for an equally stationary outside observer and for the astronaut himself. However, were the astronaut to fire up his futuristic ship and travel at 0.99c, the clock would still keep time for anyone aboard the ship; but to the previously stationary observer, if he could somehow see the beam, the light would have to travel farther to reach the other plate and would appear to move diagonally. Because c is constant, each tick would take longer along that longer path, so the outside observer would measure more time per tick than the astronaut does; the moving clock would appear to run slow.
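The diagonal-path argument above is exactly how the standard time-dilation factor is derived: if the plates are a distance d apart, one onboard tick takes t0 = 2d/c, while for the outside observer the light covers the hypotenuse of a right triangle whose other leg is the sideways motion of the ship, giving t = t0 / sqrt(1 - v^2/c^2). A minimal sketch of that arithmetic (variable names are my own):

```python
import math

c = 299_792_458.0   # speed of light (m/s)
v = 0.99 * c        # ship speed from the example above

# Light-clock geometry: in half a tick of duration t/2, the light travels
# c*t/2 along the diagonal while the plate moves v*t/2 sideways, so
#   (c*t/2)^2 = d^2 + (v*t/2)^2.
# Solving for t with t0 = 2d/c gives t = gamma * t0, where:
gamma = 1.0 / math.sqrt(1.0 - (v / c) ** 2)

print(f"gamma at 0.99c = {gamma:.3f}")
# Each onboard tick looks roughly 7x longer to the outside observer.
```

At 0.99c the factor works out to about 7.09, which is why the effect is dramatic in such examples but negligible at everyday speeds.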
That much I know (or think I know; it is possible my confusion comes from a poor understanding of what I tried to explain above). Here is where I get stuck.
Why is this not just an error in one particular method of keeping time? A light clock would have this problem, but why would two analog clocks, perfectly synchronized before the journey, give different readings after one of them travels at 0.99c? As I said before, the example seems to expose only an error in man's attempt to measure time, not in time itself as kept by the universe (e.g., cellular degradation, planetary orbits, the birth and death of stars). Where am I going wrong? Any and all help would be greatly appreciated.