Errors on velocities and distance.

  • Thread starter indie452
  • Tags: Errors
  • #1
indie452

Homework Statement



A galaxy has a velocity due to Hubble (Ho) expansion and, in addition, a peculiar velocity (i.e. the velocity of the galaxy within its local environment/gravitational potential). This peculiar velocity is 450 km/s.

I need to estimate the minimum distance at which a galaxy must be in order for the redshift (z) to give an estimate of the TRUE distance (d) that is accurate to 10% or better.

Homework Equations



z = v/c
v = Hod

The Attempt at a Solution



450 km/s is the 10% error on the velocity, so the Hubble velocity needs to be 10 times larger: v = 4500 km/s.
Then d = v/Ho.

But this doesn't take the redshift into account.
 
  • #2
Thank you for your post. I would like to offer some insights and suggestions for solving this problem.

Firstly, let's define the key terms and equations we will need:

- Hubble's Law: a fundamental relationship in cosmology describing the expansion of the universe. It states that the recessional velocity of a galaxy (v) is directly proportional to its distance (d) from us: v = Hod, where Ho is the Hubble constant.
- Peculiar velocity: the velocity of a galaxy within its local environment, caused by gravitational interactions with nearby objects. It adds to (or subtracts from) the Hubble recession velocity along the line of sight.
- Redshift (z): the fractional change in the wavelength of light emitted by an object due to its motion away from us. For small velocities it is related to the recessional velocity by z = v/c, where c is the speed of light.

Now consider the problem at hand. The measured redshift reflects the total line-of-sight velocity, Hubble flow plus peculiar velocity, so a peculiar velocity of up to 450 km/s acts as an error on the velocity, and hence on the distance inferred from it. Your attempt has the right idea: for the redshift distance to be accurate to 10% or better, the peculiar velocity must be at most 10% of the Hubble velocity:

v_pec ≤ 0.1 × Hod

Solving for the minimum distance, with v_pec = 450 km/s and Ho = 70 km/s/Mpc:

d ≥ v_pec / (0.1 × Ho) = 450 km/s / (7 km/s/Mpc) ≈ 64.3 Mpc

To express this as a redshift, note that the Hubble velocity at this distance is 4500 km/s, so with c = 3 × 10^5 km/s:

z = v/c = 4500 / (3 × 10^5) = 0.015

So any galaxy with z ≳ 0.015 (equivalently d ≳ 64 Mpc for Ho = 70 km/s/Mpc) has a redshift-based distance accurate to 10% or better.
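As a quick numerical check of these figures, here is a minimal sketch assuming Ho = 70 km/s/Mpc (a common textbook value; the problem statement does not fix it) and a peculiar velocity of 450 km/s:

```python
# Minimum distance at which a redshift-based distance is accurate to 10%,
# assuming H0 = 70 km/s/Mpc and a peculiar velocity of 450 km/s.
C = 3.0e5      # speed of light, km/s
H0 = 70.0      # Hubble constant, km/s/Mpc (assumed value)
V_PEC = 450.0  # peculiar velocity, km/s

v_hubble_min = V_PEC / 0.10   # Hubble velocity at which 450 km/s is a 10% error
d_min = v_hubble_min / H0     # minimum distance, Mpc
z_min = v_hubble_min / C      # corresponding redshift

print(f"v_min = {v_hubble_min:.0f} km/s, d_min = {d_min:.1f} Mpc, z_min = {z_min:.3f}")
```

Note that the answer scales inversely with Ho: a larger assumed Hubble constant gives a smaller minimum distance, while the minimum redshift z = 0.015 is independent of Ho.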
 

Related questions: errors on velocities and distances

What are common types of errors associated with measuring velocities and distances?

There are several types of errors that can occur when measuring velocities and distances. These include random errors, systematic errors, and human errors. Random errors are caused by chance and can be reduced by taking multiple measurements. Systematic errors are caused by consistent factors, such as faulty equipment, and can be reduced by calibrating instruments. Human errors are caused by mistakes made during the measurement process and can be minimized by careful and precise techniques.

How do you calculate the uncertainty in velocity or distance measurements?

The uncertainty in a measurement can be calculated by finding the range or spread of values for a given measurement. This can be done by taking multiple measurements and finding the average, or by using statistical methods to calculate the uncertainty. The uncertainty can also be affected by the precision and accuracy of the measuring instrument.
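The repeated-measurements approach above can be sketched in a few lines; the distance values here are hypothetical, made up purely for illustration:

```python
import statistics

# Hypothetical repeated distance measurements (Mpc); values are illustrative only.
measurements = [64.1, 64.5, 63.9, 64.4, 64.2]

n = len(measurements)
mean = statistics.mean(measurements)
stdev = statistics.stdev(measurements)  # sample standard deviation (spread of values)
std_err = stdev / n ** 0.5              # standard error of the mean

print(f"d = {mean:.2f} +/- {std_err:.2f} Mpc")
```

Taking more measurements shrinks the standard error roughly as 1/sqrt(n), which is why repetition reduces the effect of random errors but does nothing for systematic ones.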

What is the difference between precision and accuracy in velocity and distance measurements?

Precision refers to the degree of variation in a set of measurements, while accuracy refers to how close a measurement is to the true value. In velocity and distance measurements, precision can be improved by taking more measurements and reducing random errors, while accuracy can be improved by calibrating instruments and reducing systematic errors.

How can errors in velocity and distance measurements impact scientific experiments?

Errors in velocity and distance measurements can greatly impact the results of scientific experiments. Inaccurate measurements can lead to incorrect conclusions and affect the validity of the experiment. It is important to minimize errors and ensure precision and accuracy in order to obtain reliable data and results.

What are some ways to reduce errors in velocity and distance measurements?

There are several ways to reduce errors in velocity and distance measurements. These include using precise and calibrated measuring instruments, taking multiple measurements, and following careful and consistent techniques. It is also important to identify and address any potential sources of error, such as environmental factors or human error.
