My question may be naïve, but my recollection of physics theory is that the ideal launch angle for maximum distance of a projectile (such as a golf ball) is 45 degrees, exactly. To my knowledge, no club in a golfer's bag conforms to this. Indeed, as the distance of clubs rises from wedge through the irons and beyond, the launch angle decreases away from this optimum, yet the distance the ball flies increases. Is it more to do with the increasing length of the club, which increases the clubhead speed? Is it this increase in speed which accounts for the increase in distance? Yes? No? And why do drivers not aim for a launch angle at or near the theoretical optimum?
I’m sure there is a simple answer. I just don’t know what it is. Help!!
Your question is a valid one, and the answer does seem to violate our intuition and our everyday experience of spraying water, throwing stones, or shooting bullets.
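For reference, the textbook result behind the 45-degree figure comes from ignoring the air entirely. In a vacuum, the range of a projectile launched at speed $v$ and angle $\theta$ is:

```latex
% Range of a projectile in a vacuum (no drag, no lift):
R(\theta) = \frac{v^2 \sin(2\theta)}{g}
% \sin(2\theta) peaks when 2\theta = 90^\circ,
% so R is maximized at exactly \theta = 45^\circ.
```

That "exactly 45 degrees" answer is only valid with no air; once aerodynamic forces enter, the optimum moves.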
The difference between a stone and a golf ball is that a golf ball, in every case except a topped shot, has backspin. The dimples trip the air into a thin turbulent layer that clings to the ball's surface, and the backspin drags that layer around with the ball, creating a different air pressure on opposite sides of the ball.
You have observed this effect in baseball, where sidespin produces a curve ball. In golf, because the spin is backspin, the same effect creates a lift (or curve) upward which holds the ball up in flight, fighting gravity. The ball thus performs more like a glider than a stone. Since lift, not a high launch, is now doing much of the work of keeping the ball airborne, the optimum launch angle drops well below 45 degrees. It is this lift force, which differs between balls because of dimple design and which varies with the spin rate, that dictates the optimum launch angle for a particular ball speed and spin rate.
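You can see the effect numerically with a toy simulation. The sketch below is a minimal 2D point-mass model, not a real golf aerodynamics model: the drag and spin-lift coefficients are made-up illustrative numbers, drag is taken as proportional to speed squared and opposing motion, and Magnus lift as proportional to speed squared and perpendicular to the velocity (upward for backspin). With both forces off, the best angle comes out near 45 degrees; switching them on pushes the optimum noticeably lower.

```python
import math

def carry_distance(speed, angle_deg, lift_coeff=0.0, drag_coeff=0.0, dt=0.001):
    """Euler-integrate a 2D trajectory and return carry distance (m).

    Per-unit-mass accelerations (illustrative model, not real golf data):
      drag = -drag_coeff * |v| * v      (opposes motion)
      lift =  lift_coeff * |v| * (-vy, vx)  (perpendicular to velocity;
                                             points upward for forward flight,
                                             mimicking backspin/Magnus lift)
    """
    g = 9.81
    angle = math.radians(angle_deg)
    x, y = 0.0, 0.0
    vx, vy = speed * math.cos(angle), speed * math.sin(angle)
    while True:
        v = math.hypot(vx, vy)
        ax = -drag_coeff * v * vx - lift_coeff * v * vy
        ay = -drag_coeff * v * vy + lift_coeff * v * vx - g
        vx += ax * dt
        vy += ay * dt
        x += vx * dt
        y += vy * dt
        if y <= 0.0 and vy < 0.0:  # ball has come back down
            return x

def best_angle(speed, lift_coeff=0.0, drag_coeff=0.0):
    """Scan launch angles from 5 to 60 degrees in 0.5-degree steps."""
    angles = [a * 0.5 for a in range(10, 121)]
    return max(angles, key=lambda a: carry_distance(speed, a, lift_coeff, drag_coeff))

# Vacuum case: the optimum sits near 45 degrees.
print("no air:        ", best_angle(70.0))
# With drag and backspin lift (made-up coefficients), the optimum drops.
print("drag + backspin:", best_angle(70.0, lift_coeff=0.002, drag_coeff=0.003))
```

The coefficients (0.002, 0.003) and the 70 m/s ball speed are purely illustrative; the point is only the qualitative shift of the optimum away from 45 degrees once spin-induced lift and drag are in play.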
Hope this helps.