Arbitrageur
This question needs some framing first:
We track time at the Earth's surface using atomic clocks.
We know that objects in orbit have clocks that run slower because of velocity and faster because of reduced gravity, and which effect is dominant depends on altitude as this illustration shows:
https://en.wikipedia.org/wiki/Error_analysis_for_the_Global_Positioning_System
http://upload.wikimedia.org/wikipedia/commons/3/36/Orbit_times.svg
Eyeballing the figures on the chart, at GPS orbit, it looks like time speeds up 520 picoseconds per second due to lower gravity, and time slows down 100 picoseconds per second due to velocity, for a net increase of perhaps 420 picoseconds per second.
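Those eyeballed numbers can be sanity-checked with the standard weak-field approximations: the gravitational speed-up relative to the surface is GM/c² · (1/R⊕ − 1/r), and the velocity slow-down for a circular orbit is v²/2c² with v² = GM/r. A rough sketch (the GPS orbital radius of about 26,560 km and the standard value of Earth's GM are assumed):

```python
# Rough sanity check of the eyeballed GPS chart values using
# weak-field approximations (not a full GR treatment).

GM_EARTH = 3.986004418e14   # Earth's gravitational parameter, m^3/s^2
R_EARTH = 6.371e6           # mean Earth radius, m
R_GPS = 2.656e7             # GPS orbital radius (~20,200 km altitude), m
C = 2.99792458e8            # speed of light, m/s

# Gravitational speed-up of a GPS-altitude clock relative to the surface,
# in seconds per second: GM/c^2 * (1/R_earth - 1/r).
grav = GM_EARTH / C**2 * (1 / R_EARTH - 1 / R_GPS)

# Special-relativistic slow-down for a circular orbit: v^2 / (2 c^2),
# with v^2 = GM / r.
vel = GM_EARTH / R_GPS / (2 * C**2)

print(f"gravitational speed-up: {grav * 1e12:.0f} ps/s")  # ~530 ps/s
print(f"velocity slow-down:     {vel * 1e12:.0f} ps/s")   # ~83 ps/s
print(f"net:                    {(grav - vel) * 1e12:.0f} ps/s")
```

This lands close to the chart's values, so reading roughly 520 ps/s off the gravitational curve at GPS altitude seems about right.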
I would like to disregard the velocity figure for the purpose of my question and focus on gravity because I don't want to talk about orbits around Earth, but rather about a hypothetical observer who is stationary relative to the cosmic microwave background.
My question is,
Q: What is the maximum value in picoseconds per second the gravitational speed-up plot on this chart can reach anywhere in the universe?
To find this value I imagine the hypothetical observer would be not only outside the solar system, but also outside the Milky Way galaxy, and even outside the Local Group. For starters we could put them at the point of weakest gravitational potential between the Local Group and a neighboring galaxy cluster. Even lower-potential regions elsewhere in the universe may not differ much from such a remote spot, though I'm not really sure about that.
I assure you this isn't a homework question, just something I'm curious about. I might be able to do the math for the effects of the Sun and Moon, since I know their distances and masses and can find that sort of information online, but I have no idea what the time dilation effect of the Milky Way's gravitational field is, nor that of the Local Group (our local collection of galaxies), and I didn't have much luck finding either.
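For what it's worth, the same weak-field formula gives each body's dilation contribution relative to an observer at rest far away: Δ ≈ GM/(rc²). A sketch using textbook values for the Earth and Sun, plus a deliberately rough, assumed figure for the Milky Way's enclosed mass at the Sun's orbit (the galactic entry is an order-of-magnitude illustration only, not a real measurement):

```python
# Weak-field gravitational time dilation relative to a distant observer:
# delta = GM / (r c^2), in seconds per second.
# The Milky Way entry uses ASSUMED round numbers for illustration only.

C = 2.99792458e8  # speed of light, m/s

def dilation(gm: float, r: float) -> float:
    """Fractional clock slow-down at distance r from a mass with parameter GM."""
    return gm / (r * C**2)

bodies = {
    # name: (GM in m^3/s^2, distance in m)
    "Earth (at surface)": (3.986e14, 6.371e6),
    "Sun (at 1 AU)": (1.327e20, 1.496e11),
    # Assumed: ~1e11 solar masses enclosed within the Sun's ~8 kpc orbit.
    "Milky Way (at Sun's orbit, rough)": (1.327e20 * 1e11, 2.5e20),
}

for name, (gm, r) in bodies.items():
    print(f"{name}: {dilation(gm, r) * 1e12:.0f} ps/s")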
Usually I can find answers like this online within 5 minutes, but after about 90 minutes of searching, the answers I did find were wrong. I know they were wrong because the questions asked the same thing I'm asking about gravitational time dilation, yet the answers discussed the Sun's velocity around the Milky Way and the corresponding velocity-based time dilation, which isn't relevant to the gravitational question, at least not directly.
My guess is that the maximum value between galaxy clusters might be something like an order of magnitude higher than the 520 picoseconds per second at GPS orbit. That guess could be way off, though I'd be a little surprised if it were two orders of magnitude higher. Has anyone done the math on this, or at least made a real estimate better than my guess?