Sin and cos

1/11/2014

We present a way to define $\sin$ and $\cos$ which is quite traditional, but then show a non-canonical way to "prove" that these definitions are equivalent to the geometric ones.

First, let's define the derivative of a function $f:\mathbb{R} \rightarrow \mathbb{C}$:

Definition: Given a function $f:\mathbb{R} \rightarrow \mathbb{C}$, write $f(x)=u(x)+i\,v(x)$, where $u=\Re f$ and $v=\Im f$ are real-valued. Define:

$f'(x):=u'(x)+i\,v'(x)$


Remark: Note that theorems like "the derivative of a sum is the sum of the derivatives" still hold, and this definition agrees with the usual limit of difference quotients.
Remark: Note also that this ISN'T the derivative of a function $f:\mathbb{C} \rightarrow \mathbb{C}$. We are concerned with functions with real domain.
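As a small numerical illustration of this componentwise derivative, here is a minimal sketch; the example function $f(x)=x^2+ix^3$ and the helper names are assumptions made up just for this example:

```python
# Numerical illustration of the componentwise derivative of f: R -> C.
# Hypothetical example: f(x) = x^2 + i x^3, so f'(x) = 2x + 3i x^2.

def f(x):
    return x**2 + 1j * x**3

def f_prime(x):
    return 2 * x + 3j * x**2

x, h = 0.8, 1e-6
difference_quotient = (f(x + h) - f(x)) / h
print(difference_quotient)  # close to f_prime(x)
print(f_prime(x))           # exact value: 1.6 + 1.92i
```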


Now, extend the definition of exponentiation (read the first post on this blog) to complex numbers:


Definition: $\displaystyle e^z:=\sum_{n=0}^{\infty}\frac{z^n}{n!}$


The series converges for every complex $z$ by the ratio test, and the formula $e^{z+w}=e^ze^w$ still holds by the Cauchy product formula. Now, let's calculate the derivatives of $e^x$ and $e^{ix}$, where $x$ is real.
It is common to do this via theorems about power series. We shall not use them; instead, we use more elementary estimates.
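But first, a quick numerical sanity check of the series definition and of the product formula. This is a sketch only: `exp_series` is a hypothetical helper name, and Python's `cmath.exp` is used purely for comparison.

```python
import cmath
from math import factorial

def exp_series(z, terms=40):
    """Partial sum of the defining series: sum of z^n / n! for n < terms."""
    return sum(z**n / factorial(n) for n in range(terms))

# The partial sums agree with the built-in exponential up to round-off.
for z in [1.0, 2.5j, 1 + 1j, -3 + 0.5j]:
    print(abs(exp_series(z) - cmath.exp(z)))

# The product formula e^(z+w) = e^z e^w also holds up to round-off.
z, w = 0.7 + 0.2j, -1.1 + 0.9j
print(abs(exp_series(z + w) - exp_series(z) * exp_series(w)))
```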
For the derivative of $e^x$:

$\displaystyle \lim_{h\rightarrow 0} \frac{e^{x+h}-e^x}{h}=e^{x}\lim_{h\rightarrow 0} \frac{e^h-1}{h}$

Now, to evaluate the last limit (without using theorems of power series), do the following:

Fix an arbitrary $H >0$.
Now, given an $\epsilon >0$, there exists $n \in \mathbb{N}$ such that:

$$\frac{H^{n}}{(n+1)!}+\frac{H^{n+1}}{(n+2)!}+...  \leq \epsilon$$

since the series $\displaystyle \sum_{k=0}^{\infty}\frac{H^k}{(k+1)!}$ converges by the ratio test, so its tails tend to $0$. Now, multiplying through by any $h$ with $0<h<H$, this implies:

$$\frac{hH^{n}}{(n+1)!}+\frac{hH^{n+1}}{(n+2)!}+... \leq \epsilon h$$

Since $h<H$:

$$\frac{h^{n+1}}{(n+1)!}+\frac{h^{n+2}}{(n+2)!}+...  \leq \frac{hH^{n}}{(n+1)!}+\frac{hH^{n+1}}{(n+2)!}+... \leq \epsilon h$$

But then, we have:

$$e^h \leq 1+h+\frac{h^2}{2!}+\frac{h^3}{3!}+...+\frac{h^n}{n!} + \epsilon h$$

Which gives us:

$$\frac{e^h -1}{h} \leq 1+\frac{h}{2!}+\frac{h^2}{3!}+...+\frac{h^{n-1}}{n!} + \epsilon$$


But $1\leq \frac{e^h -1}{h}$ is obvious from the definition of $e^h$, since $e^h \geq 1+h$ for $h>0$. So, taking limits as $h\rightarrow 0^{+}$:


$$1 \leq \displaystyle \lim_{h\rightarrow 0^{+}} \frac{e^h -1}{h} \leq 1+\epsilon$$


But $\epsilon>0$ was arbitrary, which gives:


$$\lim_{h\rightarrow 0^{+}} \frac{e^h -1}{h} =1$$

Now, note that (using $e^{-h}=1/e^h$, which follows from the product formula):

 $$\displaystyle \lim_{h\rightarrow 0^{-}} \frac{e^h -1}{h} =
\lim_{h\rightarrow 0^{+}}  \frac{e^{-h}-1}{-h}= \lim_{h\rightarrow 0^{+}} \frac{\frac{1}{e^h}-1}{-h}=
\lim_{h\rightarrow 0^{+}} \frac{e^h-1}{h}\cdot\frac{1}{e^h}=1$$

Hence, the two-sided limit equals $1$ (in the last step we also used that $e^h \rightarrow 1$ as $h\rightarrow 0^{+}$, which follows from the bounds above), and it is proved that the derivative of $e^x$ is $e^x$. $\blacksquare$
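Numerically, reusing the hypothetical `exp_series` sketch from above, the difference quotient indeed approaches $1$:

```python
# lim_{h -> 0} (e^h - 1)/h = 1, illustrated with the series-based partial sums.
for h in [0.1, 0.01, 0.001, 1e-6]:
    print(h, (exp_series(h) - 1) / h)
# The printed quotients approach 1 as h shrinks (about 1.0517 already for h = 0.1).
```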

Now, we will calculate the derivative of $e^{ix}$:

$\displaystyle \lim_{h\rightarrow 0} \frac{e^{i(x+h)}-e^{ix}}{h}=e^{ix}\lim_{h\rightarrow 0} \frac{e^{ih}-1}{h}$.

But $e^{ih}=1+ih-\frac{h^2}{2!}-i\frac{h^3}{3!}+\frac{h^4}{4!}+...$. Since the series is absolutely convergent, we may separate it into two pieces: the real part and the imaginary part. The same estimates used before apply to each piece, and since the coefficient of $h$ is $i$, we obtain:

$$\lim_{h\rightarrow 0} \frac{e^{ih}-1}{h}=i$$

So, the derivative of $e^{ix}$ is $ie^{ix}$.
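The same kind of numerical check (again with the hypothetical `exp_series` from above) shows the quotient $\frac{e^{ih}-1}{h}$ approaching $i$:

```python
# (e^{ih} - 1)/h should tend to i: real parts shrink to 0, imaginary parts to 1.
for h in [0.1, 0.01, 0.001]:
    q = (exp_series(1j * h) - 1) / h
    print(h, q.real, q.imag)
```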

You may ask at this point: where are $\cos$ and $\sin$?

Definition: 
$\displaystyle \cos(x):=\frac{e^{ix}+e^{-ix}}{2}$
$\displaystyle \sin(x):=\frac{e^{ix}-e^{-ix}}{2i}$

By the definition of $e^z$, $e^{\overline{z}}=\overline{e^z}$. Hence $\overline{\cos(x)}=\frac{e^{-ix}+e^{ix}}{2}=\cos(x)$, and similarly for $\sin$, so $\cos$ and $\sin$ are real-valued functions. Moreover, it is evident that:

$$e^{ix}=\cos(x)+ i \sin(x)$$

We also have:

$$|e^{ix}|^2=e^{ix}\cdot\overline{e^{ix}}=e^{ix}e^{-ix}=1$$

which implies:

$$|e^{ix}|=1 \Rightarrow \sin^2(x)+\cos^2(x)=1$$

Also, directly from definition:

$$\cos'(x)=-\sin(x), ~~~~~~\sin'(x)=\cos(x)$$

And also directly from definition: $\cos(0)=1$, $\sin(0)=0$
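These properties can also be checked numerically. Below is a sketch (hypothetical names `cos_` and `sin_`, built on the `exp_series` helper from before) verifying the Pythagorean identity and the values at $0$:

```python
def cos_(x):
    # (e^{ix} + e^{-ix})/2 is real; .real just drops the tiny round-off residue.
    return ((exp_series(1j * x) + exp_series(-1j * x)) / 2).real

def sin_(x):
    # (e^{ix} - e^{-ix})/(2i) is real as well.
    return ((exp_series(1j * x) - exp_series(-1j * x)) / 2j).real

for x in [0.0, 0.5, 1.0, 2.0]:
    print(cos_(x)**2 + sin_(x)**2)   # each value equals 1 up to round-off
print(cos_(0.0), sin_(0.0))          # 1.0 and 0.0
```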

Now, why on earth are those definitions the sine and cosine we know?

We will prove they must be. How?

Proposition: Let $c:\mathbb{R} \rightarrow \mathbb{R}$ and $s: \mathbb{R} \rightarrow \mathbb{R}$ be functions such that:

(1) $c(0)=1$, $s(0)=0$;
(2) $c'(x)=-s(x)$, $s'(x)=c(x)$.

Then $s(x)=\sin(x)$ and $c(x)=\cos(x)$.

This way, since the sine and cosine we know geometrically satisfy those properties, they must coincide with the $\sin$ and $\cos$ we just defined.

Proof: Suppose we have functions $c, s$ satisfying those properties.
Define the function $f(x):=(\cos(x)-c(x))^2+(\sin(x)-s(x))^2$. We have:

$$f'(x)=2(\cos(x)-c(x))(-\sin(x)+s(x))+2(\sin(x)-s(x))(\cos(x)-c(x))=0$$

Therefore, $f$ is constant.

But $f(0)=(1-1)^2+(0-0)^2=0$. So $f(x)=0$ for all $x \in \mathbb{R}$. 

But this can only be true if $\sin(x)=s(x)$ and $\cos(x)=c(x)$ for all $x \in \mathbb{R}$. $\blacksquare$
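As a final illustration of the proposition, the standard library's `math.cos` and `math.sin` (which play the role of the geometric sine and cosine here) agree numerically with the series-based `cos_` and `sin_` sketched above:

```python
import math

# The geometric cosine and sine coincide with the series-based definitions,
# exactly as the uniqueness proposition predicts.
for x in [-2.0, -0.3, 0.0, 0.7, 3.1]:
    print(abs(cos_(x) - math.cos(x)), abs(sin_(x) - math.sin(x)))  # all tiny
```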
