[Calculus] 7. Differential

Definition 7.1 Given a function \(f:S(\sube\R) \to \R\) and a point \(a\in S\), we say that \(f\) is differentiable/left differentiable/right differentiable at \(a\) if:

  1. \(\exists\delta>0\big((a-\delta,a+\delta)/(a-\delta,a]/[a,a+\delta) \sube S\big)\)
  2. \(\lim_{x\to a/a^-/a^+}\frac{f(x)-f(a)}{x-a}\) exists. Or \(\lim_{h\to 0/0^-/0^+}\frac{f(a+h)-f(a)}{h}\) exists.

If this is the case, we call the limit the derivative/left derivative/right derivative of \(f\) at \(a\), and denote it by \(f'(a)=\frac{\d f(a)}{\d x} = \frac{\d f}{\d x}(a)/f_{-}'(a)/f_{+}'(a)\).

We say \(f\) is differentiable on \(S\) if it is differentiable at every \(x\in S\).
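For example, for \(f(x) = x^2\) and any \(a \in \R\), the limit in the definition can be computed directly:

\[ \lim_{h\to 0}\frac{(a+h)^2 - a^2}{h} = \lim_{h\to 0}\frac{2ah + h^2}{h} = \lim_{h\to 0}(2a + h) = 2a \]

so \(f\) is differentiable at every \(a\), with \(f'(a) = 2a\).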

Proposition 7.2 If \(f\) is differentiable/left differentiable/right differentiable at \(a\), then \(f\) is continuous/left continuous/right continuous at \(a\). Moreover, \(f\) is differentiable at \(a\) iff \(f\) is both left differentiable and right differentiable at \(a\) with \(f_{-}'(a) = f_{+}'(a)\).

Proof (1)

\[ \lim_{x\to a}(f(x)-f(a)) = \lim_{x\to a }\left(\frac{f(x)-f(a)}{x-a}(x-a)\right) = f'(a)\cdot 0 = 0 \]

i.e., \(\lim_{x\to a}f(x) = f(a)\).

For the left/right differentiable cases, the same argument works with \(x\to a\) replaced by \(x\to a^-\) or \(x\to a^+\).

(2)

This follows from the definition of limits: a two-sided limit exists iff both one-sided limits exist and are equal, applied to the difference quotient.
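A standard example separating these notions is \(f(x) = |x|\) at \(a = 0\): the one-sided difference quotients give

\[ \lim_{h\to 0^-}\frac{|h| - 0}{h} = -1, \qquad \lim_{h\to 0^+}\frac{|h| - 0}{h} = 1 \]

so \(f\) is left and right differentiable at \(0\) (and continuous there), but not differentiable, since the one-sided derivatives disagree.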

Definition 7.3 A function \(f:I\to \R\) on an interval \(I\sube \R\) is called a convex function if \(\forall a,b \in I, t\in (0,1)\big(f((1-t)a+tb) \le (1-t)f(a) + tf(b)\big)\). We say it is strictly convex when \(\le\) is replaced by \(<\).

Proposition 7.4 For \(f: I \to \R\), the following statements are equivalent (the inequalities being required for all \(a < c < b\) in \(I\)):

  • \(f\) is (strictly) convex
  • \(f(c) \le(<) \frac{b-c}{b-a}f(a) + \frac{c-a}{b-a} f(b)\).
  • \(\frac{f(c)-f(a)}{c-a}\le(<)\frac{f(b)-f(c)}{b-c}\).

Proof (1-2)

Let \(c=(1-t)a + tb\), then \(t = \frac{c-a}{b-a}\) and \(1-t = \frac{b-c}{b-a}\).

Then,

\begin{align*} f((1-t)a + tb) &\le (1-t)f(a) + tf(b) \\ f(c) & \le \frac{b-c}{b-a}f(a) + \frac{c-a}{b-a}f(b) \end{align*}

(1-3)

By (1-2),

\begin{align*} (b-a)f(c) & \le (b-c)f(a) + (c-a)f(b)\\ bf(c) - (b-c)f(a) &\le (c-a)f(b) + af(c)\\ bf(c) - (b-c)f(a) - cf(c) &\le (c-a)f(b) + af(c) - cf(c)\\ (b-c)(f(c) - f(a)) &\le (c-a)(f(b) - f(c))\\ \frac{f(c) - f(a)}{c-a} &\le \frac{f(b) - f(c)}{b-c} \end{align*}
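As a concrete check, \(f(x) = x^2\) satisfies the third condition: for any \(a < c < b\),

\[ \frac{c^2 - a^2}{c - a} = c + a < c + b = \frac{b^2 - c^2}{b - c} \]

so \(x^2\) is strictly convex.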


Proposition 7.5 Let \(f,g:S\to\R\) both be differentiable at \(a\in S\). Then:

  1. \((f\pm g)'(a) = f'(a) \pm g'(a)\)
  2. (Leibniz's product rule) \((fg)'(a) = f'(a)g(a) + f(a)g'(a)\).
  3. If \(g(a)\ne 0\), then \(\left(\frac{f}{g}\right)'(a) = \frac{f'(a)g(a) - f(a)g'(a)}{g(a)^2}\).

Proof (1)

\[ \frac{f(a+h)\pm g(a+h) - (f(a)\pm g(a))}{h} = \frac{f(a+h) - f(a)}{h} \pm \frac{g(a+h) - g(a)}{h} \]

As \(h\to 0\), the right-hand side converges to \(f'(a) \pm g'(a)\); hence the left-hand side converges as well, and \((f\pm g)'(a) = f'(a) \pm g'(a)\).

(2)

\[ \frac{f(x)g(x) - f(a)g(a)}{x-a} = \frac{f(x) - f(a)}{x-a}g(x) + \frac{g(x) - g(a)}{x-a}f(a) \]

Letting \(x\to a\) and using the continuity of \(g\) at \(a\) (Proposition 7.2), the right-hand side converges to \(f'(a)g(a) + g'(a)f(a)\).

(3) Similar to (2).
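For instance, assuming the standard derivatives \((\sin x)' = \cos x\) and \((\cos x)' = -\sin x\) (not derived in this section), the quotient rule gives, wherever \(\cos x \ne 0\),

\[ (\tan x)' = \left(\frac{\sin x}{\cos x}\right)' = \frac{\cos x\cos x - \sin x(-\sin x)}{\cos^2 x} = \frac{1}{\cos^2 x} \]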

Proposition 7.6 (differentiating inverse functions) Let \(f: I \to J\) be continuous bijection between intervals \(I\) and \(J\) in \(\R\). Let \(g: J \to I\) be its inverse map. For \(a\in I\), if \(f\) is differentiable at \(a\) and \(f'(a) \ne 0\), then \(g\) is differentiable at \(f(a)\), and \(g'(f(a)) = \frac{1}{f'(a)}\).

Proof Let \(b = f(a)\), then \(a = g(b)\).

\begin{align*} \frac{g(y) - g(b)}{y-b} &= \frac{g(y) - a}{f(g(y)) - f(a)}\\ &= \frac{1}{\frac{f(g(y)) - f(a)}{g(y) - a}} \end{align*}

Let \(F(x) = \frac{1}{\frac{f(x) - f(a)}{x-a}}\); then \(\lim_{x\to a}F(x) = \frac{1}{f'(a)}\), since \(f'(a) \ne 0\).

The first expression is exactly \(F(g(y))\). Since \(g\) is continuous and injective, \(g(y)\to a\) with \(g(y)\ne a\) as \(y\to b\), \(y\ne b\); hence \(\lim_{y\to b}F(g(y)) = \lim_{x\to a}F(x) = \frac{1}{f'(a)}\), i.e., \(g'(b) = \frac{1}{f'(a)}\).
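For example, take \(f: (0,\infty) \to (0,\infty)\), \(f(x) = x^2\), with inverse \(g(y) = \sqrt{y}\). Since \(f'(a) = 2a \ne 0\) for \(a > 0\), Proposition 7.6 gives

\[ g'(b) = \frac{1}{f'(a)} = \frac{1}{2a} = \frac{1}{2\sqrt{b}}, \qquad b = a^2 \]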

Lemma 7.7 Let \(f: S(\sube\R) \to \R\) be a function, and \(a\in S\). Then \(f\) is differentiable at \(a\) iff there exist \(A\in\R\) and a function \(\eta\) defined on a neighbourhood of \(0\), such that \(f(a+h) = f(a) + Ah + \eta(h)h\) for all \(h\) near \(0\), and \(\lim_{h\to0}\eta(h) = \eta(0) = 0\). Furthermore, if either side holds, then \(A=f'(a)\).

Proof (\(\Rarr\))

Let \(A = f'(a) = \lim_{h \to 0} \frac{f(a+h) - f(a)}{h}\), and

\[ \eta(h) = \begin{cases} \frac{f(a+h) - f(a) - f'(a)h}{h} & h\ne0\\ 0 & h=0 \end{cases} \]

By the definition of \(f'(a)\), \(\lim_{h\to0}\eta(h) = 0 = \eta(0)\), so \(\eta\) is continuous at \(0\).

We have thus found \(A\) and \(\eta\) satisfying the conditions, with \(f(a+h) = f(a) + Ah + \eta(h)h\) for all \(h\) near \(0\).

(\(\Larr\))

\[ \frac{f(a+h) - f(a)}{h} = A + \eta(h) \]

As \(h\to 0\), the right-hand side tends to \(A\), so the difference quotient converges; hence \(f\) is differentiable at \(a\) and \(f'(a) = A\).
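The lemma is easy to see in a concrete case: for \(f(x) = x^2\),

\[ f(a+h) = a^2 + 2ah + h^2 = f(a) + 2a\,h + h\cdot h \]

so \(A = 2a = f'(a)\) and \(\eta(h) = h\), which indeed satisfies \(\lim_{h\to 0}\eta(h) = \eta(0) = 0\).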

Theorem 7.9 (the chain rule) Given \(f: S \to T\), \(g: T \to \R\), and \(a\in S, b=f(a) \in T\). If \(f\) is differentiable at \(a\), and \(g\) is differentiable at \(b\), then \(g\circ f\) is differentiable at \(a\), and \((g\circ f)'(a) = g'(b)f'(a)\).

Proof For \(g\), we have \(g(y) = g(b) + g'(b)(y-b) + \xi(k)(y-b)\), where \(k=y-b\).

For \(f\), we have \(f(x) = f(a) + f'(a)(x-a) + \eta(h)(x-a)\), where \(h=x-a\).

\begin{align*} (g\circ f)(x) &= (g\circ f)(a) + g'(b)[f'(a)(x-a) + \eta(h)h] + \xi(k)(f'(a)(x-a) + \eta(h)h)\\ &=(g\circ f)(a) + g'(b)f'(a)(x-a) + \Delta(h) h \end{align*}

where \(\Delta(h) = g'(b)\eta(h) + \xi(k)\big(f'(a) + \eta(h)\big)\) collects the remaining terms and \(k = f'(a)h + \eta(h)h\). Since \(k\to 0\) and hence \(\xi(k)\to 0\) as \(h\to 0\), we get \(\lim_{h\to0}\Delta(h) = 0 = \Delta(0)\). By Lemma 7.7, \(g\circ f\) is differentiable at \(a\) with \((g\circ f)'(a) = g'(b)f'(a)\).
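For example, with \(f(x) = x^2 + 1\) and \(g(y) = y^3\), the chain rule gives

\[ (g\circ f)'(x) = g'(f(x))\,f'(x) = 3(x^2+1)^2 \cdot 2x = 6x(x^2+1)^2 \]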

Definition 7.10 (local extremum) Given a function \(f:X\to\R\), where \(X\) is a metric space, and \(a\in X\). We say that \(f\) achieves a local maximum/minimum at \(a\) if \(\exists \delta > 0 \forall x \in B_\delta(a)[f(x) \le(\ge) f(a)]\).

Proposition 7.11 Given a function \(f:S\to\R\) which achieves a local maximum at \(a\in S\), suppose the left and right limits of \(\frac{f(x) - f(a)}{x - a}\) at \(a\) both exist. Then \(\lim_{x\to a^+}\frac{f(x) -f(a)}{x-a} \le 0\) and \(\lim_{x\to a^-}\frac{f(x) -f(a)}{x-a} \ge 0\). In particular, if \(\lim_{x\to a}\frac{f(x) - f(a)}{x - a}\) exists, then it must be \(0\).

Proof As \(x \to a^+\), we have \(x-a > 0\) and \(f(x) - f(a) \le 0\) (since \(f\) achieves a local maximum at \(a\)), so \(\frac{f(x) -f(a)}{x - a} \le 0\) for \(x\) close enough to \(a\); hence the right limit is also \(\le 0\).

The other side is similar.

If the two-sided limit exists, it equals both one-sided limits, so it is simultaneously \(\ge 0\) and \(\le 0\); hence it must be \(0\).
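Note that the converse fails. For \(f(x) = x^3\),

\[ f'(0) = \lim_{x\to 0}\frac{x^3 - 0}{x - 0} = \lim_{x\to 0} x^2 = 0 \]

yet \(0\) is neither a local maximum nor a local minimum, since \(x^3 < 0 < y^3\) whenever \(x < 0 < y\).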

Theorem 7.12 (Rolle's theorem) Given \(f:[a,b] \to \R\) which is continuous on \([a,b]\) and differentiable on \((a,b)\), if \(f(a) = f(b)\), then \(\exists c \in (a,b)[f'(c) = 0]\).

Proof Since the domain of \(f\) is compact, by Theorem 5.14, \(f\) attains a global maximum and a global minimum.

If both the maximum and the minimum are attained at the endpoints, then since \(f(a) = f(b)\) the maximum equals the minimum, so \(f\) is constant and any \(c\in(a,b)\) satisfies \(f'(c) = 0\). Otherwise, the maximum or the minimum is attained at some interior point \(c\in (a,b)\).

A global extremum attained at an interior point is in particular a local extremum. Then by Proposition 7.11, we have \(f'(c) = 0\).
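For instance, \(f(x) = x(1-x)\) on \([0,1]\) is continuous and differentiable with \(f(0) = f(1) = 0\), and indeed

\[ f'(x) = 1 - 2x = 0 \iff x = \tfrac{1}{2} \in (0,1) \]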

Theorem 7.13 (the mean value theorem) If \(f:[a,b] \to \R\) is continuous on \([a,b]\) and differentiable on \((a,b)\), then \(\exists c \in (a,b)[\frac{f(b)-f(a)}{b-a} = f'(c)]\).

Proof Let \(F(x) = f(x) - \left[f(a) + \frac{f(b) - f(a)}{b - a}(x-a)\right]\). Then \(F(a) = F(b) = 0\), and \(F\) is continuous on \([a,b]\) and differentiable on \((a,b)\).

By Rolle's theorem, there exists a \(c\in(a,b)\), such that \(F'(c) = 0\), i.e., \(\frac{f(b) - f(a)}{b - a} = f'(c)\).
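As a worked example, for \(f(x) = x^2\) on \([a,b]\) the theorem asks for a \(c\) with

\[ \frac{b^2 - a^2}{b - a} = a + b = f'(c) = 2c, \qquad\text{i.e.,}\quad c = \frac{a+b}{2} \in (a,b) \]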

Definition 7.14 A map \(f: X \to Y\) between metric spaces is called Lipschitz if \(\exists C > 0 \forall x,x' \in X [ d(f(x),f(x')) \le Cd(x,x') ]\).

If \(f\) is Lipschitz, then it is uniformly continuous.

Proof For any \(\epsilon > 0\), let \(\delta = \frac{\epsilon}{C} > 0\), where \(C\) is the Lipschitz constant. Then for all \(x, x'\) with \(d(x,x') < \delta\),

\[ d(f(x),f(x')) \le Cd(x,x') < C\delta = \epsilon \]

so \(f\) is uniformly continuous.

Remark 7.15 Consider \(f(x) = \sin x\). By the mean value theorem, for any \(x \ne x'\) there exists some \(c\) between \(x\) and \(x'\) such that

\[ \left| \frac{\sin x - \sin x'}{x - x'} \right| = |\cos c| \le 1 \]

Therefore \(\sin x\) is Lipschitz (with constant \(1\)), hence uniformly continuous.

This shows that differentiability can be used to establish continuity properties: a differentiable function with bounded derivative is Lipschitz by the mean value theorem, hence uniformly continuous. This is often easier than checking the definition directly.
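For example, assuming the standard formula \((\arctan x)' = \frac{1}{1+x^2}\) (not derived here), the mean value theorem gives, for all \(x \ne x'\),

\[ \left|\frac{\arctan x - \arctan x'}{x - x'}\right| = \frac{1}{1 + c^2} \le 1 \]

so \(\arctan\) is Lipschitz with constant \(1\), hence uniformly continuous on \(\R\).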

Theorem 7.16 (generalized mean value theorem) If \(f,g: [a,b] \to \R\) are continuous on \([a,b]\) and differentiable on \((a,b)\), then there exists a \(c \in (a,b)\) such that

\[ f'(c)[g(b) - g(a)] = g'(c)[f(b) - f(a)] \]

Proof Apply Rolle's theorem to \(F(x) = f(x)[g(b) - g(a)] - g(x)[f(b) - f(a)]\). Since \(F(a) = f(a)g(b) - f(a)g(a) - f(b)g(a) + f(a)g(a) = f(a)g(b) - f(b)g(a)\) and \(F(b) = f(b)g(b) - f(b)g(a) - f(b)g(b) + f(a)g(b) = f(a)g(b) - f(b)g(a)\), which are equal, there exists a \(c\) between \(a\) and \(b\), such that \(F'(c) = f'(c)[g(b) - g(a)] - g'(c)[f(b) - f(a)] = 0\).

For \(f,g,h: [a,b] \to \R\), all continuous on \([a,b]\) and differentiable on \((a,b)\), there exists a \(c \in (a,b)\) such that

\[ \begin{vmatrix} f(a) & g(a) & h(a) \\ f(b) & g(b) & h(b) \\ f'(c) & g'(c) & h'(c) \end{vmatrix} = 0 \]

Proof Apply Rolle's theorem to \(F(x) = \begin{vmatrix} f(a) & g(a) & h(a) \\ f(b) & g(b) & h(b) \\ f(x) & g(x) & h(x) \end{vmatrix}\). Since \(F(a) = F(b) = 0\) (the determinant has two equal rows), there exists a \(c\in(a,b)\) with \(F'(c) = 0\), and \(F'(c)\) is exactly the determinant above.
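Taking \(h \equiv 1\) recovers Theorem 7.16: expanding the determinant along the third column gives

\[ g'(c)[f(b) - f(a)] - f'(c)[g(b) - g(a)] = 0 \]

and taking additionally \(g(x) = x\) recovers the mean value theorem.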

Theorem 7.17 (L'Hospital's rule) Given differentiable functions \(f,g: (a,b) \to \R\), where \(a < b\), if

  • \(g'(x) \ne 0\) for all \(x \in (a,b)\)
  • either \(\lim_{x \to a^+}f(x) = \lim_{x \to a^+}g(x) = 0\) or \(\lim_{x \to a^+}|g(x)| = \infty\)
  • \(\lim_{x \to a^+} \frac{f'(x)}{g'(x)} = L\)

then

\[ \lim_{x\to a^+} \frac{f(x)}{g(x)} = L \]

Proof Before anything else, we show that \(g\) cannot vanish at more than one point of \((a,b)\).

If \(g\) vanished at two distinct points, then by Rolle's theorem there would be a point between them at which \(g' = 0\), contradicting the first hypothesis. Therefore \(g\) has at most one zero in \((a,b)\).

If such a zero exists, we may shrink the interval so that it does not contain the zero of \(g\); since only the behaviour as \(x \to a^+\) matters, this does not affect the conclusion.

Special case

\(a\in \R\), and \(\lim_{x \to a^+}f(x) = \lim_{x \to a^+}g(x) = 0\).

Consider the right limit. Construct two extended functions:

\[ F(x) = \begin{cases} f(x) & x\in (a,b)\\ 0 & x = a \end{cases}, \qquad G(x) = \begin{cases} g(x) & x\in (a,b)\\ 0 & x = a \end{cases} \]

Then \(F\) and \(G\) are continuous on \([a,x]\) and differentiable on \((a,x)\) for any \(x\in(a,b)\), so by the generalized mean value theorem there exists some \(c\in (a,x)\) such that

\[ \frac{f(x)}{g(x)} = \frac{F(x) - F(a)}{G(x) - G(a)} = \frac{f'(c)}{g'(c)} \]

And we have \(\lim_{x \to a^+} \frac{f'(x)}{g'(x)} = L\), i.e.,

\[ \forall \epsilon > 0\,\exists \delta > 0 \left[ a < x < a + \delta \Rarr \left| \frac{f'(x)}{g'(x)} - L \right| < \epsilon\right] \]

Since \(a < c < x < a + \delta\),

\[ \left| \frac{f'(c)}{g'(c)} - L \right| < \epsilon, \quad\text{i.e.,}\quad \left| \frac{f(x)}{g(x)} - L \right| < \epsilon \]

which means \(\frac{f(x)}{g(x)}\) converges to \(L\) as \(x \to a^+\).
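Before the general argument, a quick check of the \(0/0\) case on a polynomial example:

\[ \lim_{x\to 1^+}\frac{x^3 - 1}{x^2 - 1} = \lim_{x\to 1^+}\frac{3x^2}{2x} = \frac{3}{2} \]

Both numerator and denominator tend to \(0\) as \(x \to 1^+\), and \((x^2 - 1)' = 2x \ne 0\) on an interval \((1, b)\), so the hypotheses are satisfied.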

General proof (following Wikipedia)

For any \(y\in(a,b)\), let \(m(y) = \inf \frac{f'(\xi)}{g'(\xi)}\) and \(M(y) = \sup \frac{f'(\xi)}{g'(\xi)}\), where \(\xi\) ranges over all values in \((a,y)\).

By the generalized mean value theorem, for any \(x,y\in (a,b)\) with \(x < y\), there exists a \(c \in (x,y)\) such that

\[ \frac{f(x) - f(y)}{g(x) - g(y)} = \frac{f'(c)}{g'(c)} \Rarr m(y) \le \frac{f(x) - f(y)}{g(x) - g(y)} \le M(y) \]

Case 1 \(\lim_{x \to a} f(x) = \lim_{x \to a} g(x) = 0\)

For any \(y\in (a,b)\), and \(x \in (a,y)\), we have

\[ m(y) \le \frac{f(x) - f(y)}{g(x) - g(y)} \le M(y) \]

Letting \(x \to a\), \(f(x)\) and \(g(x)\) tend to zero, and so

\[ m(y) \le \frac{f(y)}{g(y)} \le M(y) \]

Since \(\lim_{y \to a} m(y) = \lim_{y \to a} M(y) = L\) by hypothesis, the squeeze test establishes that \(\lim_{y \to a}\frac{f(y)}{g(y)}\) exists and equals \(L\).

Case 2 \(\lim_{x\to a}|g(x)| = \infty\)

For any \(y \in (a,b)\), define \(S_y = (a,y)\). Dividing the numerator and denominator of \(\frac{f(x)-f(y)}{g(x)-g(y)}\) by \(g(x)\) gives, for \(x \in S_y\),

\[ m(y) \le \frac{\frac{f(x)}{g(x)} - \frac{f(y)}{g(x)}}{1 - \frac{g(y)}{g(x)}} \le M(y) \]

As \(x \to a\), both \(\frac{f(y)}{g(x)}\) and \(\frac{g(y)}{g(x)}\) tend to zero, since \(|g(x)| \to \infty\). So, writing \(m := \liminf_{x\in S_y}\frac{f(x)}{g(x)}\) and \(M := \limsup_{x\in S_y}\frac{f(x)}{g(x)}\) (the limits taken as \(x \to a\)), \(m(y)\) is a lower bound and \(M(y)\) an upper bound for \(\frac{f(x)}{g(x)}\) in the limit, i.e., \(m(y) \le m\) and \(M \le M(y)\). That is,

\[ m(y) \le \liminf_{x\in S_y}\frac{f(x)}{g(x)} \le \limsup_{x\in S_y}\frac{f(x)}{g(x)} \le M(y) \]

By hypothesis,

\[ \lim_{y \to a} m(y) = \lim_{y \to a} M(y) = \lim_{y \to a} \frac{f'(y)}{g'(y)} = L \]

and

\begin{align*} \lim_{y \to a}\left(\liminf_{x \in S_y}\frac{f(x)}{g(x)}\right) &= \liminf_{y \to a}\frac{f(y)}{g(y)} \\ \lim_{y \to a}\left(\limsup_{x \in S_y}\frac{f(x)}{g(x)}\right) &= \limsup_{y \to a}\frac{f(y)}{g(y)} \end{align*}

By the squeeze test,

\[ \liminf_{y \to a}\frac{f(y)}{g(y)} = \limsup_{y \to a}\frac{f(y)}{g(y)} = \lim_{y \to a}m(y) = \lim_{y \to a}M(y) = L \]

so the limit \(\lim_{y\to a}\frac{f(y)}{g(y)}\) exists and is equal to \(L\).
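As an example of the \(|g| \to \infty\) case, assuming the standard derivative \((\ln x)' = \frac{1}{x}\) (not derived in this section):

\[ \lim_{x\to 0^+} x\ln x = \lim_{x\to 0^+} \frac{\ln x}{1/x} = \lim_{x\to 0^+} \frac{1/x}{-1/x^2} = \lim_{x\to 0^+} (-x) = 0 \]

Here \(\left|\frac{1}{x}\right| \to \infty\) as \(x \to 0^+\) and \(\left(\frac{1}{x}\right)' = -\frac{1}{x^2} \ne 0\) on \((0,b)\), so the hypotheses hold.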

Proposition 7.18 Given a differentiable function \(f: I \to \R\), where \(I\) is an open interval, if the corresponding condition below holds for all \(x \in I\), then

\[ f'(x) \begin{cases} \ge 0 & (> 0)\\ = 0\\ \le 0 & (< 0) \end{cases} \quad\Rarr\quad f\text{ is } \begin{cases} \text{increasing} & (\text{strictly increasing})\\ \text{constant}\\ \text{decreasing} & (\text{strictly decreasing}) \end{cases} \]

Proof By the mean value theorem, for any \(x,x' \in I\), and \(x < x'\), there exists some \(c \in (x,x')\), such that

\[ f(x') - f(x) = \frac{f(x') - f(x)}{x' - x}(x' - x) = f'(c)(x' - x) \]

Therefore, whether \(f(x') - f(x)\) is greater than, equal to, or less than \(0\) is determined by the sign of \(f'(c)\) (since \(x' - x > 0\)), which gives each of the cases above.
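For example, for \(f(x) = x^3 - 3x\),

\[ f'(x) = 3x^2 - 3 = 3(x-1)(x+1) \]

so \(f' > 0\) on \((-\infty,-1)\) and on \((1,\infty)\), and \(f' < 0\) on \((-1,1)\); by the proposition, \(f\) is strictly increasing on the first two intervals and strictly decreasing on \((-1,1)\).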

Proposition 7.19 If \(P(x)\) is a polynomial of degree \(k\), then for every \(a\),

\[ P(x) = P(a) + P'(a)(x-a) + \frac{P''(a)}{2!}(x - a)^2 + \cdots + \frac{P^{(k)}(a)}{k!}(x - a)^k \]
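For instance, take \(P(x) = x^3\) and \(a = 1\): \(P(1) = 1\), \(P'(1) = 3\), \(P''(1) = 6\), \(P'''(1) = 6\), and indeed

\[ x^3 = 1 + 3(x-1) + \frac{6}{2!}(x-1)^2 + \frac{6}{3!}(x-1)^3 = 1 + 3(x-1) + 3(x-1)^2 + (x-1)^3 \]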

Proposition 7.20 If \(P(x) \underset{k}{\sim} Q(x)\) and \(Q(x) \underset{k}{\sim} R(x)\) as \(x\to a\), then \(P(x) \underset{k}{\sim} R(x)\) as \(x\to a\), and hence \(P(a) = R(a)\) and \(P^{(j)}(a) = R^{(j)}(a)\) for \(j = 1,2,\cdots,k\).
