Dux
I'm hoping that someone on this board can tell me where I'm going wrong here.
Basically, I'm trying to calculate the upward angle at which one would need to aim a gun so that the bullet perfectly compensates for the effect of gravity over a given flight time and distance. Using standard kinematic equations, I have come up with the following:
-1/2gt^2 + Vi * t = Dv
g = acceleration due to gravity, 9.8 m/s^2
t = time
Vi = initial upward velocity
Dv = total vertical displacement
My thought was that by simply setting Dv equal to 0 and solving for Vi, I could determine the initial upward velocity needed to counteract the downward acceleration due to gravity, like so:
-1/2gt^2 + Vi * t = 0
Vi * t = 1/2gt^2
Vi = 1/2gt
I then figured I could combine this number with the total magnitude of the bullet's initial velocity (the bullet's muzzle velocity) and use a little trigonometry to determine the angle like so:
θ = asin(Vi/V)
This produces an angle that seems fairly reasonable, but when I plug it into a ballistics simulation I'm using, the bullet actually falls slightly below the point of aim. The longer the distance/time, the more pronounced the error becomes. Beyond 600 meters, the error becomes very substantial.
I feel like I must be missing something. Anyone have any ideas?
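For reference, here's a small sketch of the calculation I'm doing, in Python. The function name `hold_over_angle` is just something I made up for this post; it implements exactly the Vi = 1/2gt and θ = asin(Vi/V) steps above, nothing more (so it has whatever flaw my derivation has):

```python
import math

def hold_over_angle(muzzle_velocity, flight_time, g=9.8):
    """Upward aim angle (radians) from the derivation above:
    Vi = 1/2 * g * t, then theta = asin(Vi / V)."""
    vi = 0.5 * g * flight_time        # initial upward velocity, Vi = 1/2gt
    return math.asin(vi / muzzle_velocity)

# Example: 800 m/s muzzle velocity, 1 s time of flight
theta = hold_over_angle(800.0, 1.0)
print(math.degrees(theta))            # roughly 0.35 degrees
```

That's the angle that "seems fairly reasonable" but ends up low in the simulation.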