Harder than It Seems

Problem 1. Let f be differentiable on \mathbb R and

\displaystyle \lim_{x \to -\infty} f(x) = \lim_{x \to \infty} f(x)= 0.

Prove that there exists c \in \mathbb R such that f'(c) = 0.
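
(For a concrete instance of the hypotheses, consider f(x) = e^{-x^2}: it is differentiable on \mathbb R, tends to 0 in both directions, and f'(0) = 0. We revisit this example alongside the solution below.)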


Solution. The main idea is that f(x) \approx 0 for sufficiently large |x|, so if f is nonzero somewhere, it must attain equal values on either side of that point, which sets up Rolle's theorem.

If f = 0 identically, then f' = 0, and in particular f'(0) = 0. Now suppose f \neq 0, so there exists x_0 \in \mathbb R such that f(x_0) \neq 0. Replacing f with -f if necessary (this preserves differentiability and both limits, and f' vanishes exactly where (-f)' does), we may assume without loss of generality that f(x_0) > 0.

Define \epsilon_0 := f(x_0)/2 > 0. Using the two limits, choose N > |x_0| \geq 0 such that

|x| \geq N \quad \Rightarrow \quad f(x) \leq |f(x)| < \epsilon_0.
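
To make the choice of N concrete in the example f(x) = e^{-x^2} above (an illustration only, not part of the proof): take x_0 = 0, so that \epsilon_0 = f(0)/2 = 1/2, and

e^{-x^2} < \frac{1}{2} \quad \iff \quad x^2 > \ln 2 \quad \iff \quad |x| > \sqrt{\ln 2} \approx 0.833,

so N = 1 works.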

In particular, \max\{ f(-N), f(N)\} < \epsilon_0 < f(x_0). Since f is differentiable, it is continuous, so the intermediate value theorem applies on the intervals [-N, x_0] and [x_0, N] to give c_1 \in (-N, x_0) and c_2 \in (x_0, N) such that

f(c_1) = f(c_2) = \epsilon_0.

Since f is continuous on [c_1, c_2], differentiable on (c_1, c_2), and f(c_1) = f(c_2), Rolle's theorem yields c \in (c_1, c_2) such that f'(c) = 0, as required.
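
In the example f(x) = e^{-x^2}, one may take c_1 = -\sqrt{\ln 2} and c_2 = \sqrt{\ln 2}, since f(\pm \sqrt{\ln 2}) = e^{-\ln 2} = 1/2 = \epsilon_0; Rolle's theorem then locates c = 0 between them, where

f'(x) = -2x\,e^{-x^2}

indeed vanishes.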

—Joel Kindiak, 25 Oct 24, 2153H

