Ratio Test

The Ratio Test is a method used in calculus to determine the convergence or divergence of an infinite series. It applies to series with arbitrary real or complex terms, and it is particularly effective when the terms involve factorials or exponentials, since consecutive terms then have a simple ratio. The test examines the limit of the absolute ratio of consecutive terms in the series.

Given an infinite series ∑a_n whose terms are eventually nonzero, the Ratio Test states:

1. If lim (n→∞) |a_(n+1) / a_n| = L < 1, the series converges absolutely.
2. If lim (n→∞) |a_(n+1) / a_n| = L > 1, the series diverges.
3. If lim (n→∞) |a_(n+1) / a_n| = L = 1, the test is inconclusive, and the series may converge or diverge.
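As a numerical illustration of the three cases, the following Python sketch (the term functions and helper here are illustrative, not part of the test's statement) computes |a_(n+1) / a_n| for ∑ 1/n!, whose ratio 1/(n+1) tends to 0 < 1 (absolute convergence), and for ∑ 2^n, whose ratio is the constant 2 > 1 (divergence):

```python
import math

def ratio(a, n):
    """Compute |a(n+1) / a(n)| for a term function a at index n."""
    return abs(a(n + 1) / a(n))

# Terms of sum 1/n!: the ratio simplifies to 1/(n+1), which tends to 0 < 1.
factorial_term = lambda n: 1 / math.factorial(n)

# Terms of sum 2^n: the ratio is exactly 2 > 1 for every n.
geometric_term = lambda n: 2.0 ** n

for n in (5, 20, 100):
    print(n, ratio(factorial_term, n), ratio(geometric_term, n))
```

Running this shows the factorial series' ratio shrinking toward 0 while the geometric series' ratio stays fixed at 2, matching cases 1 and 2 of the test.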

When the Ratio Test is inconclusive (L = 1), other convergence tests, such as the Root Test, the Comparison Test, or the Integral Test, can be used to determine the behavior of the series.
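The inconclusive case L = 1 can be made concrete with a short sketch (the term functions here are illustrative): the harmonic series ∑ 1/n diverges while the p-series ∑ 1/n² converges, yet both have ratio limits equal to 1, so the Ratio Test cannot distinguish them:

```python
from fractions import Fraction

def ratio(a, n):
    """Compute |a(n+1) / a(n)| exactly using rational arithmetic."""
    return abs(a(n + 1) / a(n))

harmonic = lambda n: Fraction(1, n)       # sum 1/n diverges
p_series = lambda n: Fraction(1, n * n)   # sum 1/n^2 converges

# Both ratios, n/(n+1) and n^2/(n+1)^2, approach 1 as n grows,
# so the test gives L = 1 for both series despite their different behavior.
for n in (10, 1000, 100000):
    print(n, float(ratio(harmonic, n)), float(ratio(p_series, n)))
```

This is why a test such as the Integral Test or a direct comparison is needed to settle both of these series.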