Grumple
Hello,
I am trying to accurately simulate motion in 1D with a jerk that is changing non-linearly, but predictably. As an arbitrary example, picture jerk increasing logarithmically over time. This is being done in the context of a physics simulation that is 'stepping' frame-by-frame (ie 60 steps per second), calculating distance traveled, new accel, velocity, etc, at the end of each frame. Up until now I have always assumed a constant jerk frame-to-frame, which keeps things pretty simple. However, this is not very accurate when jerk is in fact changing during the frame itself.
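For reference, the constant-jerk per-frame update I'm using looks roughly like this (a minimal sketch; the variable names are mine, not from any particular engine):

```python
def step_constant_jerk(x, v, a, j, h):
    """Advance position, velocity, and acceleration by one frame of
    length h, assuming jerk j is constant over the whole frame."""
    x_new = x + v * h + a * h**2 / 2.0 + j * h**3 / 6.0
    v_new = v + a * h + j * h**2 / 2.0
    a_new = a + j * h
    return x_new, v_new, a_new
```

These are just the standard constant-jerk kinematics formulas applied once per frame, so they are exact only when jerk really is constant within the frame.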
I can make the simulation more accurate by increasing the number of frames per second, but only to a certain point before hitting a performance wall, and this would still fundamentally ignore the fact that jerk is not actually constant during the frame.
For example, let's say that the jerk value at time T is the initial jerk raised to the power of T. I'm trying to figure out how to calculate distance traveled, delta accel, delta vel, etc in a given frame knowing jerk did in fact change (non-linearly) during the frame. I know all variables at the start of the frame, and just want to accurately generate the same variables for the end of the frame, knowing the constant elapsed time.
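As a concrete way to check accuracy, I can compare any per-frame approximation against a brute-force version that sub-steps the frame very finely, treating jerk as constant only within each tiny sub-step (a rough sketch; `jerk_fn` and the sub-step count are made up for illustration, and this is far too slow for the real simulation):

```python
def step_substepped(x, v, a, jerk_fn, t0, h, n=1000):
    """Brute-force reference: split the frame [t0, t0 + h] into n tiny
    sub-steps and apply the constant-jerk formulas within each one,
    re-sampling jerk_fn (jerk as a function of absolute time) each time."""
    dt = h / n
    for i in range(n):
        j = jerk_fn(t0 + i * dt)
        x += v * dt + a * dt**2 / 2.0 + j * dt**3 / 6.0
        v += a * dt + j * dt**2 / 2.0
        a += j * dt
    return x, v, a
```

This is essentially the "more frames per second" idea taken to an extreme, but done offline it gives a ground-truth number to measure any cheaper per-frame scheme against.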
As far as I can tell, there is no way to exactly factor in the formula-based change in jerk, so instead I am considering using an 'average jerk' for the frame to get a best approximation with the constant-jerk motion formulas. I am doing this by calculating what the jerk would be at the end of a given frame based on the formula for change in jerk over time, then calculating an average jerk for the frame from the frame start (J0) and end (JT) values (ie literally (J0 + JT)/2.0).
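In code, the approach I'm describing would look something like this (a sketch; `jerk_fn` is a stand-in for whatever the formula for jerk over time is):

```python
def step_average_jerk(x, v, a, jerk_fn, t0, h):
    """One frame of length h starting at absolute time t0, using the
    endpoint average j_avg = (J0 + JT) / 2 as a stand-in constant jerk
    in the usual constant-jerk formulas."""
    j_avg = (jerk_fn(t0) + jerk_fn(t0 + h)) / 2.0
    x_new = x + v * h + a * h**2 / 2.0 + j_avg * h**3 / 6.0
    v_new = v + a * h + j_avg * h**2 / 2.0
    a_new = a + j_avg * h
    return x_new, v_new, a_new
```

The endpoint average is effectively a trapezoid estimate of the jerk's true mean over the frame, which is why I'd expect it to beat just using the start-of-frame value.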
Does this make sense? I know the average jerk I calculate isn't necessarily accurate as the jerk isn't changing linearly, but it seems like it would be a better approximation than assuming the jerk for the entire frame was whatever it was at the start of the frame.
Am I missing something obvious that would allow a more accurate calculation of metrics each frame with this formula-based change in jerk over time? Maybe a more accurate way to generate the 'average' knowing the formula for change in jerk?
Thanks for any help!