almost all ways of computing integrals (except anti-differentiation, that is, finding a primitive) use some kind of "simple function" as a proxy for the function we're integrating.
Riemann sums use constant functions
the trapezoid rule uses linear functions
Simpson's rule (in its most basic form) uses parabolas (quadratic functions)
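here's a minimal sketch (in python; the function names are just illustrative) of all three rules applied to a test integral:

```python
import math

def riemann(f, a, b, n):
    """left-endpoint riemann sum: f is treated as constant on each subinterval."""
    h = (b - a) / n
    return h * sum(f(a + i * h) for i in range(n))

def trapezoid(f, a, b, n):
    """trapezoid rule: f is replaced by a straight line on each subinterval."""
    h = (b - a) / n
    return h * (0.5 * f(a) + sum(f(a + i * h) for i in range(1, n)) + 0.5 * f(b))

def simpson(f, a, b, n):
    """composite simpson's rule (n must be even): a parabola through each pair of subintervals."""
    if n % 2:
        raise ValueError("n must be even")
    h = (b - a) / n
    s = f(a) + f(b)
    s += 4 * sum(f(a + i * h) for i in range(1, n, 2))
    s += 2 * sum(f(a + i * h) for i in range(2, n, 2))
    return s * h / 3

# test integral: int_0^pi sin(x) dx = 2 exactly
for rule in (riemann, trapezoid, simpson):
    print(rule.__name__, rule(math.sin, 0.0, math.pi, 100))
```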
all of these can be seen as "special" cases of using a polynomial instead of f(x), so if we're brave enough, we can use a Taylor approximation.
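concretely, if we replace f(x) by its degree-n Taylor polynomial about some point $x_0$ (with the usual coefficients $c_k = f^{(k)}(x_0)/k!$), the integral becomes a sum we can do term by term with the power rule:

$$\int_a^b \sum_{k=0}^{n} c_k (x-x_0)^k \, dx \;=\; \sum_{k=0}^{n} \frac{c_k}{k+1}\left[(b-x_0)^{k+1} - (a-x_0)^{k+1}\right]$$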
something along a different tack is using a Fourier series (a trigonometric approximation). once the Fourier coefficients are known (which, unfortunately, requires computing some OTHER integrals first), integrating is very simple, as the integrals of the terms:

$$\int \sin(nx)\,dx = -\frac{\cos(nx)}{n} + C, \qquad \int \cos(nx)\,dx = \frac{\sin(nx)}{n} + C$$

are straightforward (there may be some "adjustment factors" to fit the period to the interval [a,b], which can result in some constant factors not shown).
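concretely (assuming the common convention where the series is written with period L = b - a; the 2πk/L factors below are exactly those "adjustment factors"), integrating a truncated series term by term over a subinterval [c, d] of [a, b] gives:

$$\int_c^d \left[\frac{a_0}{2} + \sum_{k=1}^{N}\left(a_k\cos\frac{2\pi k x}{L} + b_k\sin\frac{2\pi k x}{L}\right)\right]dx = \frac{a_0}{2}(d-c) + \sum_{k=1}^{N}\frac{L}{2\pi k}\left[a_k\left(\sin\frac{2\pi k d}{L}-\sin\frac{2\pi k c}{L}\right) - b_k\left(\cos\frac{2\pi k d}{L}-\cos\frac{2\pi k c}{L}\right)\right]$$

(note that over the full period [a, b] every trig term vanishes and only the $a_0$ term survives.)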
all of these are important, because there are integrals that are fairly simple to write down but for which no primitive exists in terms of "elementary functions" (that is, combinations of polynomials, logs, and exponentials; if one allows the Euler definition of sine and cosine, this includes the trigonometric functions). the most famous of these is probably this integral:
$$\int e^{-x^2}\ dx$$
which occurs quite frequently in applications of mathematics (as a (suitably adjusted) "normal distribution" in probability, and it is also extremely important in signal processing). this means we NEED numerical approximations of integrals to solve "real problems".
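as a quick sanity check, here's a small sketch in python that applies the taylor-series trick from above to this integral on [0, 1] and compares against math.erf from the standard library (erf is, essentially, this integral with a normalizing constant):

```python
import math

# integrate e^{-x^2} on [0, 1] by integrating its taylor series term by term:
# e^{-x^2} = sum_k (-1)^k x^(2k) / k!,  so the integral is sum_k (-1)^k / (k! (2k+1))
series = sum((-1) ** k / (math.factorial(k) * (2 * k + 1)) for k in range(20))

# compare against the standard-library error function:
# erf(x) = (2/sqrt(pi)) * integral_0^x e^{-t^2} dt
exact = math.sqrt(math.pi) / 2 * math.erf(1.0)

print(series, exact)   # both print roughly 0.7468241...
```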
it turns out, for example, that calculating the arc-length of an elliptical arc is one such difficult problem (surprisingly enough, calculating the area under an elliptical arc is not so bad... you might suspect from this that "boundaries" of regions often tend to be more intractable than the regions themselves, and you'd be right).
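here's a minimal sketch (python, assuming the standard parametrization x = a cos t, y = b sin t) that gets at an ellipse's perimeter numerically, since the arc-length integrand has no elementary primitive (it's a complete elliptic integral of the second kind):

```python
import math

def ellipse_perimeter(a, b, n=100_000):
    """perimeter of the ellipse x = a*cos(t), y = b*sin(t), via a midpoint
    riemann sum of  4 * int_0^{pi/2} sqrt(a^2 sin^2 t + b^2 cos^2 t) dt."""
    h = (math.pi / 2) / n
    total = sum(
        math.sqrt(a**2 * math.sin(t)**2 + b**2 * math.cos(t)**2)
        for t in (h * (i + 0.5) for i in range(n))
    )
    return 4 * h * total

# sanity checks: a unit circle should give 2*pi,
# and a nearly flat ellipse (b -> 0) should approach 4*a (twice the major axis)
print(ellipse_perimeter(1, 1), 2 * math.pi)
print(ellipse_perimeter(1, 0.001))
```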