16 Sequences and Series of Real Numbers
We have already looked at limits of
sequences in a metric space, as a generalization of the limit of a sequence in $\mathbb{R}$.
Chapter 4 in the textbook takes a more in-depth look at sequences in $\mathbb{R}$, and it
covers series (infinite sums) in detail. Some of the material will be familiar from
Calculus II, but there are many new ideas.
We consider sequences of real numbers. To review: Remember that a sequence
$\{x_n\}_{n=1}^\infty$ is really a function from $\mathbb{N}$ to $\mathbb{R}$. We say that $\lim_{n\to\infty} x_n = L$
if for every $\epsilon > 0$, there is an $N \in \mathbb{N}$ such that $|x_n - L| < \epsilon$ for all $n > N$.
If a sequence has a limit, we say that the sequence converges; a sequence
that does not converge is said to diverge. A special case of divergence is
divergence to $\pm\infty$. We say $\lim_{n\to\infty} x_n = \infty$ if for every $M \in \mathbb{R}$,
there is an $N \in \mathbb{N}$ such that $x_n > M$ for all $n > N$. We say $\lim_{n\to\infty} x_n = -\infty$
if for every $M \in \mathbb{R}$, there is an $N \in \mathbb{N}$ such that $x_n < M$ for all $n > N$. It is important
to remember that when $\lim_{n\to\infty} x_n = \infty$ or $\lim_{n\to\infty} x_n = -\infty$,
the sequence is divergent.
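As a quick sanity check of the $\epsilon$–$N$ definition, here is a short Python sketch. The sequence $x_n = 1/n$ (with limit $L = 0$) and the witness $N = \lceil 1/\epsilon \rceil$ are my illustrative choices, not taken from the text:

```python
import math

def x(n):
    # Example sequence x_n = 1/n, which converges to L = 0.
    return 1.0 / n

def N_for(epsilon):
    # For x_n = 1/n and L = 0, |x_n - 0| < epsilon whenever n > 1/epsilon,
    # so N = ceil(1/epsilon) witnesses the definition.
    return math.ceil(1.0 / epsilon)

# Spot-check the definition for a few tolerances: every term past N
# is within epsilon of the limit.
for epsilon in (0.1, 0.01, 0.001):
    N = N_for(epsilon)
    assert all(abs(x(n) - 0.0) < epsilon for n in range(N + 1, N + 1000))
```

Of course, a finite check like this is only an illustration; the definition quantifies over all $n > N$.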
The new material on sequences of real numbers is mainly concerned with two important
classes of sequences: monotone sequences and Cauchy sequences.
Definition: A sequence $\{x_n\}_{n=1}^\infty$ is
increasing if for all $n$ and $m$, if $n < m$ then $x_n \le x_m$.
It is decreasing if for all $n$ and $m$, if $n < m$ then $x_n \ge x_m$.
And it is monotone if it is either increasing or decreasing.
(Note that "increasing" in this definition should probably be called non-decreasing,
and "decreasing" should be non-increasing.)
It should be clear that an increasing sequence that is bounded above is convergent.
In fact, it converges to $\sup\{x_n \mid n \in \mathbb{N}\}$, its least upper bound. Similarly, a decreasing
sequence that is bounded below converges to its greatest lower bound.
These facts are stated in this theorem:
Monotone Convergence Theorem:
An increasing sequence that is bounded above is convergent. An increasing sequence
that is not bounded above diverges to infinity. A decreasing
sequence that is bounded below is convergent. A decreasing sequence that
is not bounded below diverges to minus infinity.
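To see the Monotone Convergence Theorem in action, here is a sketch with a standard example of my choosing (not from the text): the recursively defined sequence $x_1 = 0$, $x_{n+1} = \sqrt{2 + x_n}$, which induction shows is increasing and bounded above by $2$:

```python
import math

# Illustrative example: x_1 = 0, x_{n+1} = sqrt(2 + x_n).
# Induction shows the sequence is increasing and bounded above by 2,
# so the Monotone Convergence Theorem guarantees it converges;
# its limit is the fixed point of sqrt(2 + x), which is 2.
x = 0.0
terms = [x]
for _ in range(50):
    x = math.sqrt(2.0 + x)
    terms.append(x)

assert all(a <= b for a, b in zip(terms, terms[1:]))  # increasing
assert all(t <= 2.0 for t in terms)                   # bounded above by 2
print(terms[-1])  # very close to the limit, 2
```

The theorem guarantees convergence without telling us the limit; identifying the limit as $2$ uses the continuity of $\sqrt{2+x}$ at the fixed point.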
A Cauchy sequence is one in which the terms of the sequence get arbitrarily
close to each other as $n \to \infty$. The major result is that any Cauchy
sequence of real numbers is convergent. Note that both this fact and the
Monotone Convergence Theorem depend on the completeness of $\mathbb{R}$ in an
essential way. For example, it is not true that a Cauchy sequence of
rational numbers must converge to a rational number. We also consider
a particular type of sequence called a "contraction."
Definition: A sequence $\{x_n\}_{n=1}^\infty$ is
said to be a Cauchy sequence if for any $\epsilon > 0$, there is
an $N \in \mathbb{N}$ such that for any $n > N$ and $m > N$, $|x_n - x_m| < \epsilon$.
Cauchy Convergence Theorem:
Any Cauchy sequence of real numbers is convergent.
Theorem (The Contraction Principle):
Let $\{x_n\}_{n=1}^\infty$ be a sequence of real numbers. Suppose that there
is a real number $r$ in the range $0 < r < 1$ such that for all $n$,
$|x_{n+2} - x_{n+1}| \le r\,|x_{n+1} - x_n|$. Then the sequence is convergent.
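A classic contraction, chosen here for illustration (it is not an example from the text), is the iteration $x_1 = 1$, $x_{n+1} = \cos x_n$. After the first step every iterate lies in $[\cos 1, 1]$, where $|\sin t| \le \sin 1 < 1$, so the Mean Value Theorem gives the contraction inequality with $r = \sin 1$:

```python
import math

# Illustrative contraction: x_1 = 1, x_{n+1} = cos(x_n).
# On [cos 1, 1] we have |cos'(t)| = |sin t| <= sin(1) < 1, so by the
# Mean Value Theorem |x_{n+2} - x_{n+1}| <= sin(1) * |x_{n+1} - x_n|.
r = math.sin(1.0)
xs = [1.0]
for _ in range(60):
    xs.append(math.cos(xs[-1]))

diffs = [abs(b - a) for a, b in zip(xs, xs[1:])]
# Successive differences shrink at least geometrically with ratio r
# (a small floating-point tolerance is allowed).
assert all(d2 <= r * d1 + 1e-15 for d1, d2 in zip(diffs, diffs[1:]))
print(xs[-1])  # near the unique fixed point of cos, about 0.739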
Note by the way that all of these convergence criteria will apply to a
sequence that "eventually" meets the criteria, that is, if
there is an $M \in \mathbb{N}$ such that the criterion holds for the subsequence
$\{x_n\}_{n=M}^\infty$. This is because $\{x_n\}_{n=1}^\infty$ and $\{x_n\}_{n=M}^\infty$
clearly have the same convergence behavior (since
convergence is about what happens to $x_n$ in the long run, which doesn't
depend on the first $M$ terms of the sequence).
For example, a bounded-above sequence will converge by the Monotone
Convergence Theorem if it is consistently increasing after the
$M$-th term, no matter what it does for the first $M$ terms.
Closely related to infinite sequences are infinite series.
An infinite series is written as an infinite sum such as $\sum_{n=1}^\infty a_n$.
To make sense of such a sum, we have to look at the infinite sequence of partial
sums of the series.
Definition: The $k$-th partial
sum of the infinite series $\sum_{n=1}^\infty a_n$ is the value $s_k$ of the
finite sum $\sum_{n=1}^k a_n$. The infinite series is convergent
if the sequence of partial sums, $\{s_k\}_{k=1}^\infty$, is convergent;
otherwise the series is divergent. For a convergent series,
the sum of the series is defined to be the limit of the
sequence of partial sums, and we write $\sum_{n=1}^\infty a_n = \lim_{k\to\infty} s_k$.
Similarly, we write $\sum_{n=1}^\infty a_n = \pm\infty$ when
$\lim_{k\to\infty} s_k = \pm\infty$.
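The definition above can be made concrete by computing partial sums. The geometric series $\sum_{n=1}^\infty (1/2)^n$, whose sum is $1$, is a standard example of my choosing (not from the text):

```python
# Partial sums of the geometric series sum_{n=1}^inf (1/2)^n, whose sum is 1.
partial_sums = []
s = 0.0
for n in range(1, 31):
    s += 0.5 ** n
    partial_sums.append(s)

# Here s_k has the closed form 1 - (1/2)^k, so the partial sums
# increase toward the sum of the series, which is 1.
assert all(abs(sk - (1 - 0.5 ** k)) < 1e-12
           for k, sk in enumerate(partial_sums, start=1))
print(partial_sums[-1])  # close to 1
```

The series converges precisely because this sequence of partial sums converges; the "sum" is the limit, not an actual infinite addition.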
Much of the theory of infinite sequences carries over to infinite series. For example,
suppose that $a_n \ge 0$ for all $n$. In that case, the sequence of partial sums is
increasing, so the Monotone Convergence Theorem applies to the sequence of partial
sums. This means that the series will either converge or will diverge to infinity,
depending on whether or not the sequence of partial sums is bounded above.
This fact is one of the reasons that we often consider series of non-negative
terms. A non-negative series
is one whose terms are all greater than or equal to zero.
Major results about series include a variety of tests for convergence and divergence.
Note that many of these tests apply only to non-negative series. However, the
$n$-th term test applies to any series. It says that a series cannot converge
unless its terms get small. It is usually used as a test for divergence.
Theorem (The $n$-th term test)
If the series $\sum_{n=1}^\infty a_n$ converges, then $\lim_{n\to\infty} a_n = 0$.
Equivalently, if $\lim_{n\to\infty} a_n$ does not exist, or if the limit is some
non-zero value, then the series diverges.
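It is worth stressing that the converse of the term test fails, and the harmonic series is the standard counterexample (my choice of illustration): its terms tend to $0$, yet its partial sums grow without bound. The classical grouping argument gives the lower bound $s_{2^k} \ge 1 + k/2$, which a short sketch can check numerically:

```python
# The harmonic series sum 1/n has terms tending to 0 but diverges.
# The classical grouping argument (1/3 + 1/4 >= 1/2, etc.) shows that
# the partial sum up to 2^k is at least 1 + k/2.
def harmonic_partial_sum(k):
    return sum(1.0 / n for n in range(1, k + 1))

for k in (1, 2, 4, 8):
    assert harmonic_partial_sum(2 ** k) >= 1 + k / 2
```

So the term test can prove divergence, but never convergence.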
Theorem (Comparison test)
Suppose that $\sum_{n=1}^\infty a_n$ and $\sum_{n=1}^\infty b_n$ are
non-negative series.
If $a_n \le b_n$ for all $n$ and if $\sum_{n=1}^\infty b_n$
converges, then $\sum_{n=1}^\infty a_n$ also converges.
If $a_n \ge b_n$ for all $n$ and if $\sum_{n=1}^\infty b_n$
diverges, then $\sum_{n=1}^\infty a_n$ also diverges.
Theorem (Ratio test)
Suppose that $\sum_{n=1}^\infty a_n$ is a non-negative series, and that
$L = \lim_{n\to\infty} \frac{a_{n+1}}{a_n}$ (where $L$ can be either
a non-negative number or $\infty$). If $L < 1$, then
the series converges. If $L > 1$, then the series diverges.
(If $L = 1$, then the test gives no information about convergence of the series.)
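Here is a sketch of the ratio test applied to $a_n = n/2^n$, an example of my choosing (not from the text). The ratios are $a_{n+1}/a_n = (n+1)/(2n) \to 1/2 < 1$, so the series converges; its sum happens to be $2$, a standard fact used here only as a check:

```python
# Ratio test for a_n = n / 2^n: a_{n+1}/a_n = (n+1)/(2n) -> 1/2 < 1.
def a(n):
    return n / 2.0 ** n

ratios = [a(n + 1) / a(n) for n in range(1, 200)]
assert abs(ratios[-1] - 0.5) < 0.01  # the ratios approach L = 1/2

# The partial sums settle down near 2, the known sum of sum n/2^n.
s = sum(a(n) for n in range(1, 200))
assert abs(s - 2.0) < 1e-12
```

Note that the test only tells us the series converges; computing the actual sum requires a separate argument.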
Theorem (Root test)
Suppose that $\sum_{n=1}^\infty a_n$ is a non-negative series, and that
$L = \lim_{n\to\infty} \sqrt[n]{a_n}$ (where $L$ can be either
a non-negative number or $\infty$). If $L < 1$, then
the series converges. If $L > 1$, then the series diverges.
(If $L = 1$, then the test gives no information about convergence of the series.)
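The root test is most natural when $a_n$ is an $n$-th power. A small sketch with $a_n = \big(\tfrac{n}{2n+1}\big)^n$, my own illustrative example: here $\sqrt[n]{a_n} = \tfrac{n}{2n+1} \to \tfrac{1}{2} < 1$, so the series converges:

```python
# Root test for a_n = (n / (2n + 1))^n:
# the n-th roots are n/(2n+1), which tend to 1/2 < 1.
def a(n):
    return (n / (2.0 * n + 1.0)) ** n

roots = [a(n) ** (1.0 / n) for n in range(1, 200)]
assert abs(roots[-1] - 0.5) < 0.01  # the roots approach L = 1/2
```

For this example the ratio $a_{n+1}/a_n$ is awkward to simplify, which is exactly the situation where the root test is the more convenient tool.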
Infinite series also have a linearity property, similar to the linearity
property of the definite integral:
Theorem:
If $\sum_{n=1}^\infty a_n$ and $\sum_{n=1}^\infty b_n$ are convergent series,
then the series $\sum_{n=1}^\infty (a_n + b_n)$ also converges, and
$\sum_{n=1}^\infty (a_n + b_n) = \sum_{n=1}^\infty a_n + \sum_{n=1}^\infty b_n$.
If $\sum_{n=1}^\infty a_n$ is a convergent series and $c \in \mathbb{R}$, then
$\sum_{n=1}^\infty c\,a_n$ also converges, and $\sum_{n=1}^\infty c\,a_n = c \sum_{n=1}^\infty a_n$.
For series that can have both positive and negative terms, we can distinguish between
"absolute" convergence and "conditional" convergence. It can be shown that a series that
is absolutely convergent is also convergent. And we get one more test for
convergence that applies to alternating series, in which the terms alternate
between positive and negative.
Definition:
Let $\sum_{n=1}^\infty a_n$ be a series. The series is said to be
absolutely convergent if the series $\sum_{n=1}^\infty |a_n|$
is convergent. A series that is convergent but not absolutely convergent
is said to be conditionally convergent.
That is, $\sum_{n=1}^\infty a_n$ is conditionally convergent if
it converges but $\sum_{n=1}^\infty |a_n|$ diverges.
Theorem: If the series $\sum_{n=1}^\infty a_n$
converges absolutely, then it converges.
Theorem (Alternating Series Test)
Let $\{a_n\}_{n=1}^\infty$ be a decreasing sequence of non-negative terms,
such that $\lim_{n\to\infty} a_n = 0$. Then the series $\sum_{n=1}^\infty (-1)^{n+1} a_n$
converges.
The most typical example of a conditionally convergent series is
$\sum_{n=1}^\infty \frac{(-1)^{n+1}}{n}$. This series converges by the
Alternating Series Test, but the series of absolute values is the
well-known harmonic series,
$\sum_{n=1}^\infty \frac{1}{n}$, which diverges.
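A sketch of this example numerically. The sum of the alternating harmonic series is $\ln 2$, a standard fact used here only as a check, and for an alternating series with decreasing terms the error after $k$ terms is at most the first omitted term, $1/(k+1)$:

```python
import math

# Partial sums of the alternating harmonic series sum (-1)^{n+1}/n,
# which converges conditionally; its sum is ln 2.
def partial_sum(k):
    return sum((-1) ** (n + 1) / n for n in range(1, k + 1))

# Alternating series error bound: |sum - s_k| <= a_{k+1} = 1/(k+1).
for k in (10, 100, 1000):
    assert abs(partial_sum(k) - math.log(2.0)) <= 1.0 / (k + 1)
```

The slow $1/(k+1)$ error decay reflects the conditional (rather than absolute) convergence: the series of absolute values diverges.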
It is worth noting that the Ratio and Root tests can be adapted for
series that can include negative as well as non-negative terms, using absolute
convergence:
Theorem (Ratio test)
Let $\sum_{n=1}^\infty a_n$ be any series, and suppose that
$L = \lim_{n\to\infty} \left|\frac{a_{n+1}}{a_n}\right|$ (where $L$ can be either
a non-negative number or $\infty$.) If $L < 1$, then
the series converges absolutely. If $L > 1$, then the series diverges.
(If $L = 1$, then the test gives no information about convergence of the series.)
Theorem (Root test)
Let $\sum_{n=1}^\infty a_n$ be any series, and suppose that
$L = \lim_{n\to\infty} \sqrt[n]{|a_n|}$ (where $L$ can be either
a non-negative number or $\infty$.) If $L < 1$, then
the series converges absolutely. If $L > 1$, then the series diverges.
(If $L = 1$, then the test gives no information about convergence of the series.)