Note: less than 1 lecture, optional, see also the optional Section 4.3
Let \(U \subset \R^n\) be an open set and \(f \colon U \to \R\) a function. Denote our coordinates by \(x = (x_1,x_2,\ldots,x_n) \in \R^n\text{.}\) Suppose \(\frac{\partial f}{\partial x_{\ell}}\) exists everywhere in \(U\text{.}\) Then it is itself a function \(\frac{\partial f}{\partial x_{\ell}} \colon U \to \R\text{,}\) and it makes sense to talk about its partial derivatives. We denote the partial derivative of \(\frac{\partial f}{\partial x_{\ell}}\) with respect to \(x_m\) by
\[
\frac{\partial^2 f}{\partial x_m \partial x_{\ell}} \coloneqq
\frac{\partial}{\partial x_m} \left( \frac{\partial f}{\partial x_{\ell}} \right) .
\]
Taking \(k\) successive partial derivatives, first with respect to \(x_{m_1}\text{,}\) then \(x_{m_2}\text{,}\) and so on, yields
\[
\frac{\partial^k f}{\partial x_{m_k} \partial x_{m_{k-1}} \cdots \partial x_{m_1}} \coloneqq
\frac{\partial}{\partial x_{m_k}}
\left(
\frac{\partial}{\partial x_{m_{k-1}}}
\left( \cdots \left( \frac{\partial f}{\partial x_{m_1}} \right) \cdots \right)
\right) .
\]
Such a derivative is called a partial derivative of order \(k\text{.}\)
Sometimes the notation \(f_{x_{\ell} x_m}\) is used for \(\frac{\partial^2 f}{\partial x_m \partial x_{\ell}}\text{.}\) This notation swaps the order in which we write the derivatives, which may be important.
Definition 8.6.1.
Suppose \(U \subset \R^n\) is an open set and \(f \colon U \to \R\) is a function. We say \(f\) is a \(k\)-times continuously differentiable function, or a \(C^k\) function, if all partial derivatives of all orders up to and including order \(k\) exist and are continuous.
So a continuously differentiable, or \(C^1\text{,}\) function is one where all first order partial derivatives exist and are continuous, which agrees with our previous definition due to Proposition 8.4.6. We could have required only that the \(k\)th order partial derivatives exist and are continuous: the existence of the lower order partial derivatives is clearly necessary to even define the \(k\)th order ones, and these lower order partial derivatives are then continuous, as they are (continuously) differentiable functions.
When the partial derivatives are continuous, we can swap their order.
Proposition 8.6.2.
Suppose \(U \subset \R^n\) is open and \(f \colon U \to \R\) is a \(C^2\) function, and \(\ell\) and \(m\) are two integers from \(1\) to \(n\text{.}\) Then
\[
\frac{\partial^2 f}{\partial x_m \partial x_{\ell}} =
\frac{\partial^2 f}{\partial x_{\ell} \partial x_m} .
\]
Proof. Fix a \(p \in U\text{,}\) and let \(e_{\ell}\) and \(e_m\) be the standard basis vectors. Pick two positive numbers \(s\) and \(t\) small enough so that \(p+s_0e_{\ell} +t_0e_m \in U\) whenever \(0 < s_0 \leq s\) and \(0 < t_0 \leq t\text{.}\) This can be done as \(U\) is open and so contains a small open ball (or a box if you wish) around \(p\text{.}\)

Consider the quotient
\[
g(s,t) \coloneqq
\frac{\bigl( f(p+s e_{\ell} + t e_m) - f(p+t e_m) \bigr) - \bigl( f(p+s e_{\ell}) - f(p) \bigr)}{st} .
\]
Apply the mean value theorem to the function \(\tau \mapsto f(p+s e_{\ell} + \tau e_m) - f(p + \tau e_m)\) to find a \(t_0 \in (0,t)\) such that
\[
g(s,t) =
\frac{\frac{\partial f}{\partial x_m}(p+s e_{\ell} + t_0 e_m) - \frac{\partial f}{\partial x_m}(p+t_0 e_m)}{s} .
\]
Apply the mean value theorem again, to \(\sigma \mapsto \frac{\partial f}{\partial x_m}(p+\sigma e_{\ell} + t_0 e_m)\text{,}\) to find an \(s_0 \in (0,s)\) such that
\[
g(s,t) = \frac{\partial^2 f}{\partial x_{\ell} \partial x_m}(p+s_0 e_{\ell} + t_0 e_m) .
\]
See Figure 8.14. The \(s_0\) and \(t_0\) depend on \(s\) and \(t\text{,}\) but \(0 < s_0 < s\) and \(0 < t_0 < t\text{.}\) Let the domain of the function \(g\) be the set \((0,\epsilon) \times (0,\epsilon)\) for some small \(\epsilon > 0\text{.}\) As \((s,t) \in (0,\epsilon) \times (0,\epsilon)\) goes to \((0,0)\text{,}\) \((s_0,t_0)\) also goes to \((0,0)\text{.}\) By continuity of the second partial derivatives,
\[
\lim_{(s,t) \to (0,0)} g(s,t) = \frac{\partial^2 f}{\partial x_{\ell} \partial x_m}(p) .
\]
Now reverse the roles of \(s\) and \(t\) (and \(\ell\) and \(m\)). Start with the function \(\sigma \mapsto f(p+\sigma e_{\ell} + te_m)-f(p + \sigma e_{\ell})\) and apply the mean value theorem to find an \(s_1 \in (0,s)\) such that
\[
g(s,t) =
\frac{\frac{\partial f}{\partial x_{\ell}}(p+s_1 e_{\ell} + t e_m) - \frac{\partial f}{\partial x_{\ell}}(p+s_1 e_{\ell})}{t} .
\]
Apply the mean value theorem once more, to \(\tau \mapsto \frac{\partial f}{\partial x_{\ell}}(p+s_1 e_{\ell} + \tau e_m)\text{,}\) to find a \(t_1 \in (0,t)\) such that
\[
g(s,t) = \frac{\partial^2 f}{\partial x_m \partial x_{\ell}}(p+s_1 e_{\ell} + t_1 e_m) .
\]
As \((s,t)\) goes to \((0,0)\text{,}\) so does \((s_1,t_1)\text{,}\) and by continuity,
\[
\lim_{(s,t) \to (0,0)} g(s,t) = \frac{\partial^2 f}{\partial x_m \partial x_{\ell}}(p) .
\]
The two limits are of the same function and hence equal, which proves the proposition.
The proposition does not hold if the derivatives are not continuous; see Exercise 8.6.2. Notice also that we did not really need a \(C^2\) function; we only needed the two second order partial derivatives involved to be continuous functions.
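Though beyond the scope of the text, a quick numerical experiment illustrates both the proposition and its failure without continuity. The sketch below (all function choices are ours, not from the text) approximates mixed partials by central difference quotients; the second function is the classic example of mixed partials disagreeing at the origin.

```python
import math

def f_bad(x, y):
    # Classic example: C^1 everywhere, but the second partials are not
    # continuous at the origin, and the mixed partials there disagree.
    if x == 0 and y == 0:
        return 0.0
    return x * y * (x * x - y * y) / (x * x + y * y)

def f_good(x, y):
    # A smooth function for comparison (arbitrary choice).
    return x**3 * y**2 + math.sin(x * y)

def d_dx(f, x, y, h=1e-6):
    return (f(x + h, y) - f(x - h, y)) / (2 * h)

def d_dy(f, x, y, h=1e-6):
    return (f(x, y + h) - f(x, y - h)) / (2 * h)

def fxy(f, x, y, h=1e-3):
    # partial of f_x with respect to y
    return (d_dx(f, x, y + h) - d_dx(f, x, y - h)) / (2 * h)

def fyx(f, x, y, h=1e-3):
    # partial of f_y with respect to x
    return (d_dy(f, x + h, y) - d_dy(f, x - h, y)) / (2 * h)

# Smooth case: the two mixed partials agree (Proposition 8.6.2).
print(fxy(f_good, 0.5, 0.7), fyx(f_good, 0.5, 0.7))

# Discontinuous second partials: at the origin they disagree,
# approximately -1 and +1.
print(fxy(f_bad, 0.0, 0.0), fyx(f_bad, 0.0, 0.0))
```

The step sizes are a compromise: too large and the difference quotient is far from the limit, too small and floating point cancellation dominates.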
Exercises
8.6.1.
Suppose \(f \colon U \to \R\) is a \(C^2\) function for some open \(U \subset
\R^n\) and \(p \in U\text{.}\) Use the proof of Proposition 8.6.2 to find an expression in terms of just the values of \(f\) (analogue of the difference quotient for the first derivative), whose limit is \(\frac{\partial^2 f}{ \partial x_{\ell} \partial x_m}(p)\text{.}\)
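One way to test a candidate answer numerically: the sketch below (our own; the test function, point, and step sizes are arbitrary) evaluates the double difference built from values of \(f\) that appears in the proof, and compares it with the mixed partial computed by hand. Whether this is the sought expression is for the reader to verify.

```python
import math

def f(x, y):
    # an arbitrary smooth test function
    return math.exp(x) * math.sin(y) + x * y**3

def double_quotient(f, px, py, s, t):
    # ( f(p + s e1 + t e2) - f(p + t e2) - f(p + s e1) + f(p) ) / (s t)
    return (f(px + s, py + t) - f(px, py + t)
            - f(px + s, py) + f(px, py)) / (s * t)

px, py = 0.3, 1.1
# For this f, the mixed partial is exp(x) cos(y) + 3 y^2 by hand.
exact = math.exp(px) * math.cos(py) + 3 * py**2
approx = double_quotient(f, px, py, 1e-4, 1e-4)
print(approx, exact)   # close for small s and t
```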
8.6.2.
Define \(f \colon \R^2 \to \R\) by \(f(x,y) \coloneqq \frac{xy(x^2-y^2)}{x^2+y^2}\) for \((x,y) \not= (0,0)\) and \(f(0,0) \coloneqq 0\text{.}\) Prove:

The first order partial derivatives exist and are continuous.

The partial derivatives \(\frac{\partial^2 f}{\partial x \partial y}\) and \(\frac{\partial^2 f}{\partial y \partial x}\) exist, but are not continuous at \((0,0)\text{,}\) and \(\frac{\partial^2 f}{\partial x \partial y}(0,0) \not=
\frac{\partial^2 f}{\partial y \partial x}(0,0)\text{.}\)
8.6.3.
Let \(f \colon U \to \R\) be a \(C^k\) function for some open \(U \subset \R^n\) and \(p \in U\text{.}\) Suppose \({\ell}_1,{\ell}_2,\ldots,{\ell}_k\) are integers between \(1\) and \(n\text{,}\) and \(\sigma=(\sigma_1,\sigma_2,\ldots,\sigma_k)\) is a permutation of \((1,2,\ldots,k)\text{.}\) Prove
\[
\frac{\partial^k f}{\partial x_{\ell_1} \partial x_{\ell_2} \cdots \partial x_{\ell_k}}(p)
=
\frac{\partial^k f}{\partial x_{\ell_{\sigma_1}} \partial x_{\ell_{\sigma_2}} \cdots \partial x_{\ell_{\sigma_k}}}(p) .
\]
8.6.4.
Suppose \(\varphi \colon \R^2 \to \R\) is a \(C^k\) function such that \(\varphi(0,\theta) = \varphi(0,\psi)\) for all \(\theta,\psi \in \R\) and \(\varphi(r,\theta) = \varphi(r,\theta+2\pi)\) for all \(r,\theta \in \R\text{.}\) Let \(F(r,\theta) \coloneqq \bigl(r \cos(\theta), r \sin(\theta) \bigr)\) from Exercise 8.5.8. Show that the function \(g \colon \R^2 \to \R\text{,}\) given by \(g(x,y) \coloneqq \varphi \bigl(F^{-1}(x,y)\bigr)\text{,}\) is well-defined (notice that \(F^{-1}(x,y)\) can only be defined locally), and that when restricted to \(\R^2 \setminus \{ 0 \}\) it is a \(C^k\) function. Note: Feel free to use what you know about sine and cosine from calculus.
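As a concrete sanity check of well-definedness (our own illustration, not part of the exercise), take \(\varphi(r,\theta) \coloneqq r\cos(\theta)\text{,}\) which satisfies both hypotheses; the periodicity in \(\theta\) is exactly what makes the value independent of which local branch of \(F^{-1}\) is used, and here \(g(x,y) = x\text{:}\)

```python
import math
import random

def phi(r, theta):
    # satisfies phi(0, theta) constant in theta and 2*pi-periodicity
    return r * math.cos(theta)

def g(x, y):
    # one local branch of F^{-1}: r = sqrt(x^2 + y^2), theta = atan2(y, x)
    r = math.hypot(x, y)
    theta = math.atan2(y, x)
    return phi(r, theta)

# Shifting theta by 2*pi (a different branch) gives the same value.
print(phi(1.5, 0.3), phi(1.5, 0.3 + 2 * math.pi))

random.seed(1)
for _ in range(5):
    x, y = random.uniform(-2, 2), random.uniform(-2, 2)
    print(g(x, y), x)   # agree up to rounding
```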
8.6.5.
Suppose \(f \colon \R^2 \to \R\) is a \(C^2\) function. For all \((x,y) \in \R^2\text{,}\) compute
in terms of the partial derivatives of \(f\text{.}\)
8.6.6.
Suppose \(f \colon \R^2 \to \R\) is a function such that all first and second order partial derivatives exist. Furthermore, suppose that all second order partial derivatives are bounded functions. Prove that \(f\) is continuously differentiable.
8.6.7.
Follow the strategy below to prove the following simple version of the second derivative test for functions defined on \(\R^2\) (using \((x,y)\) as coordinates): Suppose \(f \colon \R^2 \to \R\) is a twice continuously differentiable function with a critical point at the origin, \(f'(0,0) = 0\text{.}\) If
\[
\frac{\partial^2 f}{\partial x^2}(0,0) > 0
\qquad \text{and} \qquad
\frac{\partial^2 f}{\partial x^2}(0,0) \,
\frac{\partial^2 f}{\partial y^2}(0,0)
-
\left( \frac{\partial^2 f}{\partial x \partial y}(0,0) \right)^2 > 0 ,
\]
then \(f\) has a (strict) local minimum at \((0,0)\text{.}\) Use the following technique: First suppose without loss of generality that \(f(0,0) = 0\text{.}\) Then prove:
There exists an \(A \in L(\R^2)\) such that \(g = f \circ A\) is such that \(\frac{\partial^2 g}{\partial x \partial y} (0,0) = 0\text{,}\) and \(\frac{\partial^2 g}{\partial x^2} (0,0) =
\frac{\partial^2 g}{\partial y^2} (0,0) = 1\text{.}\)
For every \(\epsilon > 0\text{,}\) there exists a \(\delta > 0\) such that \(\abs{g(x,y) - x^2 - y^2} < \epsilon (x^2+y^2)\) for all \((x,y) \in B\bigl((0,0),\delta\bigr)\text{.}\) Hint: You can use Taylor’s theorem in one variable.
This means that \(g\text{,}\) and therefore \(f\text{,}\) has a strict local minimum at \((0,0)\text{.}\)
Note: You must avoid the temptation to just apply the one variable second derivative test along lines through the origin, see Exercise 8.3.11.
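As a numerical companion (our sketch, not part of the exercise), one can approximate the Hessian entries at a critical point by central differences and apply the stated criterion to sample functions:

```python
# Numerical sketch of the two-variable second derivative test at a
# critical point, with Hessian entries from central differences.
def hessian(f, x, y, h=1e-4):
    fxx = (f(x + h, y) - 2 * f(x, y) + f(x - h, y)) / h**2
    fyy = (f(x, y + h) - 2 * f(x, y) + f(x, y - h)) / h**2
    fxy = (f(x + h, y + h) - f(x + h, y - h)
           - f(x - h, y + h) + f(x - h, y - h)) / (4 * h**2)
    return fxx, fxy, fyy

def is_strict_local_min(f, x, y):
    fxx, fxy, fyy = hessian(f, x, y)
    # criterion: f_xx > 0 and f_xx f_yy - (f_xy)^2 > 0
    return fxx > 0 and fxx * fyy - fxy**2 > 0

# x^2 + x*y + y^2 has a strict local minimum at the origin;
# x^2 - y^2 is a saddle and does not.
print(is_strict_local_min(lambda x, y: x * x + x * y + y * y, 0.0, 0.0))  # True
print(is_strict_local_min(lambda x, y: x * x - y * y, 0.0, 0.0))          # False
```

A numerical check like this can of course never replace the proof: it only inspects the Hessian at one point, which is precisely the criterion the exercise asks you to justify.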