ykaravas
I am looking for an equation describing the optical power dissipation of light (ultraviolet light in particular) in the atmosphere, but I cannot find anything. I know that UV light undergoes Rayleigh scattering, but I don't know much else. For example, if I shone a 280 nm UV LED with a power output of 1 mW at a detector 100 meters away, how much of the original 1 mW would reach the detector? In essence, how much of the UV light is "lost" to atmospheric attenuation.
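For context, the general form of what I think I'm after is the Beer-Lambert (exponential extinction) law, P = P0 * exp(-alpha * x), where alpha lumps together Rayleigh scattering, absorption (e.g. by ozone at 280 nm), and aerosol losses. A minimal sketch, using a purely hypothetical extinction coefficient since I don't know the real value:

```python
import math

def transmitted_power(p0_mw: float, distance_m: float, alpha_per_m: float) -> float:
    """Beer-Lambert attenuation: P = P0 * exp(-alpha * x).

    p0_mw      -- source power in mW
    distance_m -- path length in meters
    alpha_per_m -- total extinction coefficient in 1/m (scattering + absorption)
    """
    return p0_mw * math.exp(-alpha_per_m * distance_m)

# alpha below is a made-up placeholder; the real 280 nm value depends on
# Rayleigh scattering, ozone absorption, and aerosols, and would have to
# come from measured atmospheric data or a transmission model.
alpha = 1e-3  # 1/m, assumed for illustration only
received = transmitted_power(1.0, 100.0, alpha)
print(f"Received power: {received:.3f} mW")
```

If someone can point me to realistic values of alpha (or its components) at 280 nm, that would answer the question.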
Can anyone refer me to a formula or a link which has this information?
Thank you.