#1 spoonthrower
How far away from a converging lens must an object be (in terms of the focal length, f) so that the difference between the image distance and the focal length is 1% of the focal length?
I tried to translate the words into an equation based on the thin lens equation, but I don't know if it is right or how to solve it for x in terms of f. Tell me if I am right:
1/(xf) + 1/(f + 0.01f) = 1/f
Please help! Thanks.
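For what it's worth, here is a minimal sketch of how the algebra could go, assuming the object distance is written as xf and the image distance as f + 0.01f = 1.01f (i.e. the image falls just beyond the focal point, which is the physical case for a real object in front of a converging lens):

```latex
% A minimal sketch, assuming d_o = x f (object distance) and
% d_i = 1.01 f (image distance one percent beyond the focal length).
\[
\frac{1}{x f} + \frac{1}{1.01 f} = \frac{1}{f}
\quad\Longrightarrow\quad
\frac{1}{x f} = \frac{1}{f} - \frac{1}{1.01 f} = \frac{0.01}{1.01 f}
\quad\Longrightarrow\quad
x f = \frac{1.01 f}{0.01} = 101 f .
\]
```

If that setup is right, the object would sit about 101 focal lengths from the lens.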