kiranimo
I'm going to simplify this situation so I can hopefully better understand what is going on. A bowling ball is given a certain initial rotational velocity and a certain initial linear velocity when thrown down a bowling lane. Assume everything needed to solve the problem is given, such as the radius of the ball, the coefficient of friction, and so on. How far down the lane will the bowling ball begin to break?
What I don't understand is how, at one point, the ball can be spinning but not rolling from side to side; then, presumably due to friction, the rotational velocity changes until, at a certain speed, the ball starts to roll from one side to the other (when the ball breaks). Why doesn't it roll at first but does later on? How can I model this mathematically?
Hopefully my questions make sense...
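One way to set up the model you're asking about: while the contact point slips relative to the lane, kinetic friction acts opposite the slip velocity, decelerating the ball and changing its spin, until the contact point stops slipping (pure rolling) — roughly where the ball "breaks". Below is a minimal Euler-integration sketch of that idea for a solid sphere. All numbers (radius, friction coefficient, initial velocity and spin) are illustrative assumptions, not measured bowling values:

```python
import math

# Illustrative parameters -- not measured bowling values.
R = 0.108      # ball radius (m)
mu = 0.10      # coefficient of kinetic friction (assumed)
g = 9.81       # gravity (m/s^2)

# State: v is the 2D velocity of the ball's centre in the lane plane
# (x = down-lane, y = across-lane); w is the spin vector (rad/s).
v = [8.0, 0.0]          # thrown straight down the lane at 8 m/s
w = [30.0, 40.0, 0.0]   # w[1] = forward-roll component, w[0] = side spin

def contact_slip(v, w):
    """Velocity of the contact point relative to the lane:
    u = v + w x r_c, with r_c = (0, 0, -R) from centre to contact."""
    return [v[0] - R * w[1], v[1] + R * w[0]]

dt, t, x, y = 1e-4, 0.0, 0.0, 0.0
u = contact_slip(v, w)
while math.hypot(*u) > 1e-3:          # still slipping -> friction acts
    s = math.hypot(*u)
    ux, uy = u[0] / s, u[1] / s       # unit vector along the slip
    # Friction force F = -mu*m*g*(ux, uy): decelerates the centre...
    v[0] -= mu * g * ux * dt
    v[1] -= mu * g * uy * dt
    # ...and its torque r_c x F changes the spin, with I = (2/5) m R^2
    # for a solid sphere: dw_x/dt = -5*mu*g*uy/(2R), dw_y/dt = +5*mu*g*ux/(2R).
    w[0] -= (5 * mu * g * uy) / (2 * R) * dt
    w[1] += (5 * mu * g * ux) / (2 * R) * dt
    x += v[0] * dt
    y += v[1] * dt
    t += dt
    u = contact_slip(v, w)

print(f"slip ends after {x:.1f} m down-lane, {y:.2f} m sideways, t = {t:.2f} s")
```

This captures the behaviour you describe: early on the ball slides over the lane with little sideways motion; once friction has brought the contact point to rest, the ball rolls and the accumulated sideways velocity shows up as the hook. (In this simplified model the slip direction stays fixed and its magnitude decays linearly at a rate of (7/2)·mu·g, so the transition distance can also be found in closed form.)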