# Differentiability of Functions of a Complex Variable ... Markushevich, Theorem 7.1 ... ...

#### Peter

##### Well-known member
MHB Site Helper
I am reading the book: "Theory of Functions of a Complex Variable" by A. I. Markushevich (Part 1) ...

I need some help with an aspect of the proof of Theorem 7.1 ...

The statement of Theorem 7.1 reads as follows:

At the start of the above proof by Markushevich we read the following:

"If $$\displaystyle f(z)$$ has a derivative $$\displaystyle f'_E(z_0)$$ at $$\displaystyle z_0$$, then by definition

$$\displaystyle \frac{ \Delta_E f(z) }{ \Delta z } = f'_E(z_0) + \epsilon ( z, z_0 )$$

where $$\displaystyle \epsilon ( z, z_0 ) \to 0$$ as $$\displaystyle \Delta z \to 0$$. ... ... "

Now previously in Equation 7.1 at the start of Chapter 7, Markushevich has defined $$\displaystyle f'_E(z_0)$$ as follows:

$$\displaystyle f'_E(z_0) = \frac{ f(z) - f(z_0) }{ z - z_0 } = \frac{ \Delta_E f(z) }{ \Delta z }$$ ... ... ... (1)

How, formally and rigorously, is equation (1) exactly the same as $$\displaystyle \frac{ \Delta_E f(z) }{ \Delta z } = f'_E(z_0) + \epsilon ( z, z_0 )$$ ...

... strictly speaking, shouldn't Markushevich be deriving $$\displaystyle \frac{ \Delta_E f(z) }{ \Delta z } = f'_E(z_0) + \epsilon ( z, z_0 )$$ ... from equation (1) ...

Peter

#### steep

##### Member
> I am reading the book: "Theory of Functions of a Complex Variable" by A. I. Markushevich (Part 1) ...
> At the start of the above proof by Markushevich we read the following:
>
> "If $$\displaystyle f(z)$$ has a derivative $$\displaystyle f'_E(z_0)$$ at $$\displaystyle z_0$$, then by definition
>
> $$\displaystyle \frac{ \Delta_E f(z) }{ \Delta z } = f'_E(z_0) + \epsilon ( z, z_0 )$$
>
> where $$\displaystyle \epsilon ( z, z_0 ) \to 0$$ as $$\displaystyle \Delta z \to 0$$. ... ... "
>
> Now previously in Equation 7.1 at the start of Chapter 7, Markushevich has defined $$\displaystyle f'_E(z_0)$$ as follows:
>
> $$\displaystyle f'_E(z_0) = \frac{ f(z) - f(z_0) }{ z - z_0 } = \frac{ \Delta_E f(z) }{ \Delta z }$$ ... ... ... (1)
>
> How exactly (formally and rigorously) is equation (1) exactly the same as $$\displaystyle \frac{ \Delta_E f(z) }{ \Delta z } = f'_E(z_0) + \epsilon ( z, z_0 )$$ ...
>
> ... strictly speaking, shouldn't Markushevich be deriving $$\displaystyle \frac{ \Delta_E f(z) }{ \Delta z } = f'_E(z_0) + \epsilon ( z, z_0 )$$ ... from equation (1) ...
>
> Peter
I think I know what is being said, though the notation is a bit all over the place here. E.g. you wrote
$$\displaystyle f'_E(z_0) = \frac{ f(z) - f(z_0) }{ z - z_0 } = \frac{ \Delta_E f(z) }{ \Delta z }$$
but I didn't see a limit explicitly taken so it doesn't seem like a definition of a derivative that I'm familiar with.

Anyway, as is often the case, why not try to estimate the difference between those two definitions, and use the oh so important triangle inequality.

This gives

$\Big \vert \frac{ f(z) - f(z_0) }{ z - z_0 } - \big(f'_E(z_0) + \epsilon ( z, z_0 )\big) \Big \vert \leq \Big \vert \frac{ f(z) - f(z_0) }{ z - z_0 } - f'_E(z_0) \Big \vert + \Big \vert \epsilon ( z, z_0 ) \Big \vert$

by the triangle inequality. Now pass to the limit: for any $\epsilon \gt 0$ we can select a $\delta_1$ neighborhood (i.e. for all $\big \vert z - z_0 \big \vert \lt \delta_1$) such that
$\Big \vert \frac{ f(z) - f(z_0) }{ z - z_0 } - f'_E(z_0) \Big \vert \lt \frac{\epsilon}{2}$

and $\delta_2$ neighborhood such that
$\Big \vert \epsilon ( z, z_0 ) \Big \vert \lt \frac{\epsilon}{2}$

and select $\delta = \min \big(\delta_1, \delta_2\big)$ so you have

$\Big \vert \frac{ f(z) - f(z_0) }{ z - z_0 } - \big(f'_E(z_0) + \epsilon ( z, z_0 )\big) \Big \vert \leq \Big \vert \frac{ f(z) - f(z_0) }{ z - z_0 } - f'_E(z_0) \Big \vert + \Big \vert \epsilon ( z, z_0 ) \Big \vert \lt \frac{\epsilon}{2} + \frac{\epsilon}{2} = \epsilon$
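Alternatively, the equivalence can be made immediate: take the quoted equation as the *definition* of $\epsilon(z, z_0)$, i.e. set

$\epsilon(z, z_0) := \frac{ \Delta_E f(z) }{ \Delta z } - f'_E(z_0)$

Then the limit definition (1) and the statement "$\epsilon(z, z_0) \to 0$ as $\Delta z \to 0$" are just rearrangements of one another, since

$\lim_{\Delta z \to 0} \frac{ \Delta_E f(z) }{ \Delta z } = f'_E(z_0) \iff \lim_{\Delta z \to 0} \Big( \frac{ \Delta_E f(z) }{ \Delta z } - f'_E(z_0) \Big) = 0 \iff \lim_{\Delta z \to 0} \epsilon(z, z_0) = 0$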

#### Peter

##### Well-known member
MHB Site Helper
> I think I know what is being said, though the notation is a bit all over the place here. E.g. you wrote
> $$\displaystyle f'_E(z_0) = \frac{ f(z) - f(z_0) }{ z - z_0 } = \frac{ \Delta_E f(z) }{ \Delta z }$$
> but I didn't see a limit explicitly taken so it doesn't seem like a definition of a derivative that I'm familiar with.
>
> Anyway, as is often the case, why not try to estimate the difference between those two definitions, and use the oh so important triangle inequality.
>
> This gives
>
> $\Big \vert \frac{ f(z) - f(z_0) }{ z - z_0 } - \big(f'_E(z_0) + \epsilon ( z, z_0 )\big) \Big \vert \leq \Big \vert \frac{ f(z) - f(z_0) }{ z - z_0 } - f'_E(z_0) \Big \vert + \Big \vert \epsilon ( z, z_0 ) \Big \vert$
>
> by triangle inequality. Now pass limits
> i.e. consider for any $\epsilon \gt 0$ we can select a $\delta_1$ neighborhood (i.e. for all $\big \vert z - z_0 \big \vert \lt \delta_1$) such that
> $\Big \vert \frac{ f(z) - f(z_0) }{ z - z_0 } - f'_E(z_0) \Big \vert \lt \frac{\epsilon}{2}$
>
> and $\delta_2$ neighborhood such that
> $\Big \vert \epsilon ( z, z_0 ) \Big \vert \lt \frac{\epsilon}{2}$
>
> and select $\delta = \min \big(\delta_1, \delta_2\big)$ so you have
>
> $\Big \vert \frac{ f(z) - f(z_0) }{ z - z_0 } - \big(f'_E(z_0) + \epsilon ( z, z_0 )\big) \Big \vert \leq \Big \vert \frac{ f(z) - f(z_0) }{ z - z_0 } - f'_E(z_0) \Big \vert + \Big \vert \epsilon ( z, z_0 ) \Big \vert \lt \frac{\epsilon}{2} + \frac{\epsilon}{2} = \epsilon$

Hi steep...

Thanks so much for your post ...

I am still reflecting on what you have written ...

I must apologize for a serious typo in equation (1) ...

I wrote ...

$$\displaystyle f'_E(z_0) = \frac{ f(z) - f(z_0) }{ z - z_0 } = \frac{ \Delta_E f(z) }{ \Delta z }$$ ... ... ... (1)

when I should have written

$$\displaystyle f'_E(z_0) = \lim_{ z \to z_0} \frac{ f(z) - f(z_0) }{ z - z_0 } = \lim_{ \Delta z \to 0} \frac{ \Delta_E f(z) }{ \Delta z }$$ ... ... ... (1)
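As a quick numerical sanity check of this limit definition (my own illustration, not from Markushevich), we can watch the error term $\epsilon(z, z_0) = \frac{ \Delta_E f(z) }{ \Delta z } - f'_E(z_0)$ shrink for the concrete example $f(z) = z^2$, where $f'(z_0) = 2 z_0$:

```python
# Illustration (not from Markushevich): for f(z) = z**2 we have f'(z0) = 2*z0,
# and the error term eps(z, z0) = (f(z) - f(z0))/(z - z0) - f'(z0)
# should tend to 0 as Delta z -> 0.

def eps(f, fprime, z0, dz):
    """Gap between the difference quotient at z = z0 + dz and f'(z0)."""
    return (f(z0 + dz) - f(z0)) / dz - fprime(z0)

f = lambda z: z ** 2
fprime = lambda z: 2 * z
z0 = 1 + 1j

for k in range(1, 5):
    dz = (1 + 1j) * 10 ** (-k)  # let Delta z -> 0 along a diagonal direction
    print(f"|dz| = {abs(dz):.1e}   |eps| = {abs(eps(f, fprime, z0, dz)):.1e}")
```

For $f(z) = z^2$ the error term works out to exactly $\Delta z$, so $\vert \epsilon \vert$ shrinks in lockstep with $\vert \Delta z \vert$.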

I should have also posted the beginnings of Markushevich's start to Chapter 7 to give readers access to his definitions ... so I am posting that now ... as follows:

Hope that helps ...

Peter

#### steep

##### Member
btw it seems worth pointing out that the (limiting) difference quotient is in some sense the 'original' derivative definition for single variable calc.

However the second definition introduced here is, in effect, that the derivative (if it exists) is the best linear approximation of a function over a sufficiently small neighborhood, period. This definition / interpretation is one that generalizes to higher dimensions. And since complex analysis is one variable analysis, but is 'kind of like' multivariable analysis ($\mathbb R^2$ looms), any perceived difference between the definitions is a good thing to dwell on.
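Concretely, multiplying Markushevich's equation through by $\Delta z$ gives the "best linear approximation" form I mean:

$\Delta_E f(z) = f'_E(z_0) \, \Delta z + \epsilon(z, z_0) \, \Delta z$

i.e. $f(z) = f(z_0) + f'_E(z_0)(z - z_0) + \epsilon(z, z_0)(z - z_0)$, where the remainder $\epsilon(z, z_0)(z - z_0)$ goes to zero faster than $\Delta z$. This is the form that generalizes: in higher dimensions the term $f'_E(z_0) \, \Delta z$ is replaced by a linear map applied to the increment.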

Another thing-- a long overdue release / update to Beardon's "Complex Analysis: The Argument Principle" is coming next month as a Dover book. I think you may have a bigger math library than me but I thought I'd mention it.

#### Peter

##### Well-known member
MHB Site Helper
> btw it seems worth pointing out that the (limiting) difference quotient is in some sense the 'original' derivative definition for single variable calc.
>
> However the second definition introduced here is, in effect, that the derivative (if it exists) is the best linear approximation of a function over a sufficiently small neighborhood, period. This definition / interpretation is one that generalizes to higher dimensions. And since complex analysis is one variable analysis, but is 'kind of like' multivariable analysis ($\mathbb R^2$ looms), any perceived difference between the definitions is a good thing to dwell on.
>
> Another thing-- a long overdue release / update to Beardon's "Complex Analysis: The Argument Principle" is coming next month as a Dover book. I think you may have a bigger math library than me but I thought I'd mention it.

Thanks for your most helpful posts, steep ...

I'll definitely keep a watch out for the release of Beardon's book ... I find being able to consult a number of texts' treatments of a mathematical topic helpful to learning ...

Thanks again ...

Peter