While trying to understand how each frame of reference predicts that the other's clock runs slower, I ran into what looks like an asymmetry. It appears to be possible to determine which frame of reference is truly at rest. But that's impossible, right? I'd really like to know where my mistake is.
So here it goes. Imagine a rocket traveling away from Earth at v = 282647 km/s, which is the speed at which time in the rocket runs at one third the rate of time on Earth. When 6 seconds pass in the rocket, 18 seconds pass on Earth.
A person on Earth wants to measure that time runs slower in the rocket. How does he do that? When the rocket passes him, he reads 2:00 sharp on the rocket's clock. He wants to take a second reading 6 seconds later that shows time really running at one third rate in the rocket. After 6 seconds, he knows the rocket has traveled a distance of 6*v kilometers. The light showing 2 seconds (6/3 = 2, i.e. the reading 2:02) on the spaceship's clock must cross that distance at the speed of light before it hits his eyes (or his measuring instrument). That light travel time is 6 s * v/c = 5.65 s. So after 6 + 5.65 seconds, the person on Earth predicts that the light from the spaceship's clock will reach him.
So far so good. Now the question is: how does the person in the rocket predict that time on Earth runs slower? He should follow the same procedure, assuming that he is at rest and that Earth is moving.
He reckons that Earth moves away at the same speed. After 6 seconds of his time (which is really 18 seconds on Earth), the rocket has traveled three times farther than he is aware of. He then waits 5.65 s in his frame, which is 5.65 * 3 = 16.95 s in Earth's frame of reference. If everything is symmetric, the light rays showing 2:02 from the clock on Earth should reach the rocket after 18 + 16.95 seconds in Earth's frame, which equals 6 + 5.65 seconds in the rocket. If that were true, the situation would be symmetric and it would be impossible to tell whose time is really running slower.
However, when I did the calculation, it turned out that the light rays from the Earth clock showing 2:02 need much more time to catch up with the rocket, so the rocket observer's prediction fails and he would conclude that time on Earth runs even slower than expected.
So what is wrong? Could this method really be used to distinguish between frames of reference?
Here are my calculations:
If the light from the Earth clock showing 2:02 is to reach the rocket after 18 + 16.95 = 34.95 s (Earth time), the following equation should hold:

c * 28.95 = v * 34.95

The light leaving the Earth clock at 2:02 has to catch up with the rocket, which has a 6-second head start, hence 28.95 = 34.95 - 6.
It turns out that the equation doesn't hold:
v * 34.95 ≈ 9878512 km
c * 28.95 ≈ 8678978 km
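The mismatch can be reproduced directly, plugging in the same rounded values (v = 282647 km/s, c = 299792 km/s) used above:

```python
# Compare how far the rocket gets in 34.95 s (Earth time) with how far
# light travels in the 28.95 s it has after the rocket's 6 s head start.
v = 282647.0            # rocket speed, km/s
c = 299792.0            # speed of light, km/s (rounded)
rocket_distance = v * 34.95   # ~9.88 million km
light_distance = c * 28.95    # ~8.68 million km
print(rocket_distance > light_distance)  # True: the light has not caught up yet
```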
Why doesn't it add up? If the person on Earth does the measurement, he receives the right light rays and confirms that time runs at one third rate in the rocket. But if the person in the rocket follows the same procedure, he gets the wrong result?
A big thanks to anyone who can help me out here.