Recall that if a function \(f = f(x)\) of a single variable is differentiable at \(x=x_0\text{,}\) then
\begin{equation*}
f'(x_0) = \lim_{h \to 0} \frac{f(x_0+h)-f(x_0)}{h}
\end{equation*}
exists. We saw in single variable calculus that the existence of \(f'(x_0)\) means that the graph of \(f\) is locally linear at \(x=x_0\text{:}\) the graph of \(f\) looks like its linearization \(L(x) = f(x_0)+f'(x_0)(x-x_0)\) for \(x\) close to \(x_0\text{.}\) That is, the values of \(f(x)\) are closely approximated by \(L(x)\) as long as \(x\) is close to \(x_0\text{.}\) We can measure how well \(L(x)\) approximates \(f(x)\) with the error function
\begin{equation*}
E(x) = L(x) - f(x) = f(x_0)+f'(x_0)(x-x_0) - f(x).
\end{equation*}
As \(x\) approaches \(x_0\text{,}\) \(E(x)\) approaches \(f(x_0)+f'(x_0)(0) - f(x_0) = 0\text{,}\) and so \(L(x)\) provides increasingly better approximations to \(f(x)\) as \(x\) gets closer to \(x_0\text{.}\)

Show that, even though \(g(x,y) = \sqrt{|xy|}\) is not locally linear at \((0,0)\text{,}\) its error term
\begin{equation*}
E(x,y) = L(x,y) - g(x,y)
\end{equation*}
at \((0,0)\) has a limit of \(0\) as \((x,y)\) approaches \((0,0)\text{.}\) (Use the linearization you found in part (b).) This shows that just because an error term goes to \(0\) as \((x,y)\) approaches \((x_0,y_0)\text{,}\) we cannot conclude that a function is locally linear at \((x_0,y_0)\text{.}\)
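As a quick numerical sketch of the phenomenon, the snippet below assumes the linearization from part (b) is \(L(x,y)=0\) (both partial derivatives of \(g\) at the origin are \(0\text{,}\) since \(g(h,0)=g(0,h)=0\) for every \(h\)). It checks that \(E(x,y)\) shrinks to \(0\) along the path \(y=x\text{,}\) while the \emph{relative} error \(|E(x,y)|/\|(x,y)\|\) stays bounded away from \(0\) along that same path, which is exactly why \(g\) fails to be locally linear.

```python
import math

def g(x, y):
    return math.sqrt(abs(x * y))

# Assumed from part (b): g_x(0,0) = g_y(0,0) = 0, so L(x,y) = 0.
def L(x, y):
    return 0.0

def E(x, y):
    return L(x, y) - g(x, y)

# The error itself goes to 0 along y = x: E(t,t) = -|t|.
for t in [0.1, 0.01, 0.001]:
    print(t, E(t, t))

# But the error relative to the distance from the origin does not:
# |E(t,t)| / ||(t,t)|| = |t| / (sqrt(2)|t|) = 1/sqrt(2) for every t != 0.
for t in [0.1, 0.01, 0.001]:
    print(t, abs(E(t, t)) / math.hypot(t, t))
```

The second loop is the heart of the exercise's moral: local linearity requires the error to vanish faster than \(\|(x,y)-(x_0,y_0)\|\text{,}\) not merely to vanish.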