So the other day, I was talking to a friend of mine about complex analysis — particularly, the application of the residue theorem to the evaluation of definite (real) integrals. He stated the following (I’ve paraphrased/tried to make it precise):
Theorem 1. Provided that $zf(z) \to 0$ as $|z| \to \infty$ and $f$ is holomorphic on all of $\mathbb{C}$ except possibly with finitely many isolated singularities occurring in $\mathbb{C} \setminus \mathbb{R}$, we have that
$$\int_{-\infty}^{\infty} f(x)\,dx = 2\pi i \sum_k \operatorname{Res}(f, z_k),$$
where the sum is taken over all singularities $z_k$ in the upper half-plane.
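To see the theorem in action numerically, here's a quick sanity check in Python (the function $f(z) = 1/(1+z^2)$ is my choice of example; it has a single upper-half-plane singularity at $z = i$ with residue $1/(2i)$, so the theorem predicts the integral equals $\pi$):

```python
import numpy as np
from scipy.integrate import quad

# f(z) = 1/(1 + z^2) has one singularity in the upper half-plane, at z = i,
# with residue 1/(2i). Theorem 1 predicts the real integral is 2*pi*i * 1/(2i) = pi.
lhs, _ = quad(lambda x: 1.0 / (1.0 + x**2), -np.inf, np.inf)
rhs = (2j * np.pi * (1 / 2j)).real

print(lhs, rhs)  # both ~ 3.14159...
```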
The condition that $zf(z) \to 0$ as $|z| \to \infty$ simply ensures that we can take the semicircle $\gamma_R$ which starts at $-R$, goes to $R$, and then arcs back to $-R$ in a counterclockwise direction, and letting $R$ go to infinity, the contour integral of $f$ around the curve will indeed converge to the desired improper integral (the left-hand side of the above equation). That is, it ensures that the contribution of the “arc” portion of the curve to the integral converges to zero as $R \to \infty$.
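We can watch the arc contribution die off numerically. This is a rough sketch (my own helper `arc_integral`, using a simple trapezoid rule; for $f(z) = 1/(1+z^2)$ the arc integral is bounded by $\pi R/(R^2 - 1)$, so it decays roughly like $1/R$):

```python
import numpy as np

def arc_integral(f, R, n=20000):
    # Integrate f along the upper semicircular arc z = R*exp(i*theta),
    # theta in [0, pi], via a trapezoid rule: f(z) dz = f(z) * i*R*e^{i*theta} d(theta).
    theta = np.linspace(0.0, np.pi, n)
    z = R * np.exp(1j * theta)
    y = f(z) * 1j * z
    return np.sum((y[1:] + y[:-1]) / 2 * np.diff(theta))

f = lambda z: 1.0 / (1.0 + z**2)
for R in (10.0, 100.0, 1000.0):
    print(R, abs(arc_integral(f, R)))  # shrinks roughly like 1/R
```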
After he said this, I started thinking about the following theorem:
Theorem 2. If a function $f$, holomorphic on $\mathbb{C}$ except for finitely many isolated singularities, satisfies $zf(z) \to 0$ as $|z| \to \infty$, then the sum of all of its residues must be 0.
This means that if a function has no singularities on the real line, then the sum of its residues in the upper half-plane must be the negation of the sum of its residues in the lower half-plane.
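This upper/lower cancellation is easy to check numerically for a rational function. Here's a sketch with poles of my own choosing (for $f(z) = 1/\prod_k (z - a_k)$ with simple poles, the residue at $a_k$ is $1/\prod_{j \neq k}(a_k - a_j)$, and the degree-4 denominator guarantees the decay condition):

```python
import numpy as np

# Example poles on both sides of the real axis, none real (my choice, for illustration).
poles = np.array([1j, -2j, 3 + 1j, -1 - 1j])

def residue(k):
    # For f(z) = 1/prod_j (z - a_j) with simple poles:
    # Res(f, a_k) = 1/prod_{j != k} (a_k - a_j).
    others = np.delete(poles, k)
    return 1.0 / np.prod(poles[k] - others)

total = sum(residue(k) for k in range(len(poles)))
upper = sum(residue(k) for k in range(len(poles)) if poles[k].imag > 0)
lower = sum(residue(k) for k in range(len(poles)) if poles[k].imag < 0)

print(abs(total))   # ~ 0, as Theorem 2 predicts
print(upper, -lower)  # equal: upper residues negate the lower ones
```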
Going back to the discussion of Theorem 1 now. We could also take the semicircle in the bottom half-plane, whereby the curve would wind around any enclosed singularities in a clockwise direction. The residue theorem will then spit out a negative sign in front of the sum, since the winding number will be $-1$ (due to the clockwise rotation):
$$\int_{-\infty}^{\infty} f(x)\,dx = -2\pi i \sum_k \operatorname{Res}(f, z_k),$$
where the sum is now taken over all singularities $z_k$ in the lower half-plane — whereby we observe the residues in the upper and lower half-planes summing to zero, as expected.
A quick corollary of this is the following: if $p$ is a polynomial with $\deg p \geq 2$ and with all of its roots in one particular open half-plane, then $1/p$ clearly satisfies the limit condition, but then the sum on the right-hand side of the equation is zero by Theorem 2, which means that
$$\int_{-\infty}^{\infty} \frac{dx}{p(x)} = 0.$$
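A concrete check of the corollary, with a polynomial of my own choosing: $p(z) = (z - i)(z - 2i)$ has both roots in the upper half-plane and degree 2, so the (complex-valued) integral of $1/p$ over the real line should vanish. Integrating real and imaginary parts separately with scipy:

```python
import numpy as np
from scipy.integrate import quad

# p(z) = (z - i)(z - 2i): both roots in the upper half-plane, deg p = 2.
p = lambda x: (x - 1j) * (x - 2j)

# quad handles real integrands, so split 1/p into real and imaginary parts.
re, _ = quad(lambda x: (1.0 / p(x)).real, -np.inf, np.inf)
im, _ = quad(lambda x: (1.0 / p(x)).imag, -np.inf, np.inf)

print(re, im)  # both ~ 0
```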
I’ll write another post soon; right now I have to sleep. Here’s a plot of