Proving the angle sum and difference identities for sine and cosine without involving the functions' geometric meanings
For the well-known identities
$$
\sin(\alpha \pm \beta) = \sin\alpha\cos\beta \pm \cos\alpha\sin\beta
$$
$$
\cos(\alpha \pm \beta) = \cos\alpha\cos\beta \mp \sin\alpha\sin\beta
$$
is it possible to provide a proof that does not involve the geometric meaning of the sine and cosine functions (that is, the use of Ptolemy's theorem)?
trigonometry
asked Feb 17 at 14:10 by scrutari · edited Feb 17 at 14:37 by Blue
Then do you want to start with alternative definitions of these functions? Any algebraic definition would reduce the proof to an easy calculation. For example, define them as the real and imaginary parts of $e^{ix}$.
– Aravind, Feb 17 at 14:27
3 Answers
The answer is that it depends on how you define the sine and cosine functions; if they have a geometric definition, then geometry has to come in somewhere. In fact, they relate closely to the concept of similarity in the Euclidean plane, and are useful in this context because similar triangles have equal angles.
Sine and cosine are also related to the exponential function in the complex plane through the identity $$e^{ia}=\cos a + i\sin a,$$ and we can compute
\begin{align}
e^{i(a+b)}&=e^{ia}e^{ib}\\
\cos(a+b)+i\sin(a+b)&=(\cos a + i\sin a)(\cos b + i\sin b)\\
&=(\cos a\cos b-\sin a\sin b)+i(\sin a\cos b+\cos a\sin b).
\end{align}
Equating real and imaginary parts then gives what we want, and this is applicable generally. Some geometric proofs and constructions apply only to a specific range of values or require considering various cases.
answered Feb 17 at 14:31 by Mark Bennet · edited Feb 17 at 17:02 by YawarRaza7349
could you please explain the third equality?
– scrutari, Feb 17 at 14:55
@scrutari That comes from expanding the product of brackets and using $i^2=-1$. There are two terms in each bracket, therefore four in the product, and these are gathered together in pairs. I've probably put the second pair in a confusing order.
– Mark Bennet, Feb 17 at 15:18
Apologies for the confusion, I meant the transition from $\cos(a + b) + i \sin(a + b)$ to $(\cos a + i \sin a)(\cos b + i \sin b)$.
– scrutari, Feb 17 at 16:01
@scrutari: That's saying $e^{i(a+b)}=e^{ia}e^{ib}$.
– J. W. Tanner, Feb 17 at 16:05
This sort of just reduces it to the question "How do you prove that $e^{i(a+b)} = e^{ia} e^{ib}$ without using the sum identities?" It's easy to want to say it's obvious, because the notation $e^z$ for the exponential function makes you assume that it obeys the laws of exponents, but that's something that needs to be proved.
– Nate Eldredge, Feb 17 at 20:55
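As a quick numerical sanity check of the derivation above (an illustration only, not a proof; the Python code, random seed, and tolerance are choices made here, not part of the answer), one can confirm that $e^{i(a+b)}=e^{ia}e^{ib}$ and that its real and imaginary parts match the right-hand sides of the sum formulas for a handful of random angles:

```python
# Numerically spot-check the complex-exponential derivation for random angles.
# This only illustrates the identities; it does not prove them.
import cmath
import math
import random

random.seed(0)  # arbitrary seed, chosen only for reproducibility
for _ in range(5):
    a = random.uniform(-10.0, 10.0)
    b = random.uniform(-10.0, 10.0)
    lhs = cmath.exp(1j * (a + b))                # e^{i(a+b)}
    rhs = cmath.exp(1j * a) * cmath.exp(1j * b)  # e^{ia} e^{ib}
    assert cmath.isclose(lhs, rhs, abs_tol=1e-12)
    # Real part: cos(a+b) = cos a cos b - sin a sin b
    assert math.isclose(lhs.real,
                        math.cos(a) * math.cos(b) - math.sin(a) * math.sin(b),
                        abs_tol=1e-12)
    # Imaginary part: sin(a+b) = sin a cos b + cos a sin b
    assert math.isclose(lhs.imag,
                        math.sin(a) * math.cos(b) + math.cos(a) * math.sin(b),
                        abs_tol=1e-12)
```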
Here's an approach I recently assigned as homework that needs nothing more than simple calculus facts. No complex numbers, in particular.
Suppose you know only the following:
$$\sin'=\cos, \quad \cos'=-\sin, \quad \sin 0 = 0, \quad \cos 0 = 1$$
As a warmup, prove the Pythagorean identity $\sin^2 x + \cos^2 x = 1$. (Hint: let $f(x) = \sin^2 x + \cos^2 x$ and compute $f'$.) In particular, $|\sin x|\le 1$ and $|\cos x| \le 1$.
Now fix $a$ and consider the function
$$g(x) = \sin(x+a) - \sin x \cos a - \cos x \sin a.$$
Compute $g'$ and $g''$, and note that $g'' = -g$. Verify that $g^{(n)}(0) = 0$ for every $n$, and that $|g^{(n)}(x)| \le 3$ for every $n,x$. Now apply Taylor's theorem with Lagrange remainder (which is really just a consequence of the mean value theorem) to bound the difference $|g(x) - p_n(x)|$, where $p_n$ is the $n$th-degree Taylor polynomial of $g$ centered at $0$. But $p_n=0$. Letting $n \to \infty$, you can conclude $g \equiv 0$.
For the identity involving $\cos(x+a)$, consider $g'$. The minus versions may be done similarly via the function $h(x) = \sin(a-x) - \sin a\cos x + \cos a \sin x$, or by showing separately that $\sin$ is an odd function and $\cos$ is an even function.
Some variations:
- If you know about real analytic functions, and you know that $\sin x$, $\cos x$, $\sin(x+a)$ are all real analytic, then you are done as soon as you show that $g^{(n)}(0)=0$ for every $n$.
- If you know about uniqueness of solutions to ODEs, then just note that $g(0) = g'(0) = 0$ and $g'' = -g$.
answered Feb 17 at 16:50 by Nate Eldredge
I like this approach, quite intense in using theorems from various parts of Maths, though "No complex numbers" in fact means "but use the rest of Maths instead" :)
– scrutari, Feb 17 at 19:55
I think if you look carefully, other than the variations, this really doesn't use anything except basic facts about derivatives (product rule, chain rule, etc.) and the mean value theorem. (Taylor's theorem is just a repeated application of the mean value theorem.)
– Nate Eldredge, Feb 17 at 20:50
This is a nice approach (+1). There are various ways of showing that the derivatives are as stated in the assumptions. How this is done depends on the definition, and I think it has to be done with some care to avoid being circular.
– Mark Bennet, Feb 17 at 21:52
@MarkBennet: Yes, certainly. The OP didn't state definitions, so one has to assume something, and I wanted to state clearly what I was assuming. But I think these are fairly safe. They could be the definition, if you know about existence and uniqueness of solutions to ODEs. Or, if your definition is in terms of power series, these facts are almost immediate if you are careful about convergence.
– Nate Eldredge, Feb 17 at 21:56
@scrutari: To do that, you'll have to give a "pure algebraic" definition of the sine and cosine functions. Given that they are transcendental, this seems hard to do.
– Nate Eldredge, Feb 17 at 21:58
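For readers who want to check the premises this answer leans on, here is a small symbolic sketch, assuming SymPy is available (it only verifies that $g'' = -g$ and $g(0) = g'(0) = 0$; it does not replace the Taylor-remainder or ODE-uniqueness step):

```python
# Symbolically verify the facts about g used in the answer above, via SymPy.
import sympy as sp

x, a = sp.symbols('x a', real=True)
g = sp.sin(x + a) - sp.sin(x) * sp.cos(a) - sp.cos(x) * sp.sin(a)

# g satisfies the ODE g'' = -g ...
assert sp.simplify(sp.diff(g, x, 2) + g) == 0
# ... and vanishes to first order at 0, exactly like the zero function.
assert g.subs(x, 0).simplify() == 0
assert sp.diff(g, x).subs(x, 0).simplify() == 0
print("g(0) = g'(0) = 0 and g'' = -g verified")
```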
Just to share my idea. I will not say that my proof doesn't involve geometry.
Let $A=(\cos\alpha,\sin\alpha)$ and $B=(\cos\beta,\sin\beta)$. Then
$$AB^2=(\cos\alpha-\cos\beta)^2+(\sin\alpha-\sin\beta)^2=2-2\cos\alpha\cos\beta-2\sin\alpha\sin\beta$$
On the other hand, by the law of cosines in the triangle $OAB$ (where $O$ is the origin),
$$AB^2=OA^2+OB^2-2(OA)(OB)\cos\angle AOB=1^2+1^2-2(1)(1)\cos(\alpha-\beta)$$
This proves that $\cos(\alpha-\beta)=\cos\alpha\cos\beta+\sin\alpha\sin\beta$. The proof holds for arbitrary $\alpha$ and $\beta$.
$\cos(\alpha+\beta)=\cos\alpha\cos(-\beta)+\sin\alpha\sin(-\beta)=\cos\alpha\cos\beta-\sin\alpha\sin\beta$
$\sin(\alpha+\beta)=\cos\left(\frac{\pi}{2}-\alpha-\beta\right)=\cos\left(\frac{\pi}{2}-\alpha\right)\cos\beta+\sin\left(\frac{\pi}{2}-\alpha\right)\sin\beta=\sin\alpha\cos\beta+\cos\alpha\sin\beta$
$\sin(\alpha-\beta)=\sin\alpha\cos(-\beta)+\cos\alpha\sin(-\beta)=\sin\alpha\cos\beta-\cos\alpha\sin\beta$
answered Feb 17 at 14:41 by CY Aries · edited Feb 18 at 10:07 by Bysshed
(+1) haven't seen you in a while!
– TheSimpliFire, Feb 17 at 19:04
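A quick numerical illustration of the two computations of $AB^2$ in this answer (again just a spot check in Python with arbitrary random angles and tolerance, not a proof):

```python
# Compare the coordinate computation of AB^2 with the law-of-cosines one,
# and confirm the resulting formula for cos(alpha - beta), for random angles.
import math
import random

random.seed(1)  # arbitrary seed for reproducibility
for _ in range(5):
    alpha = random.uniform(-10.0, 10.0)
    beta = random.uniform(-10.0, 10.0)
    # Distance formula for A = (cos alpha, sin alpha), B = (cos beta, sin beta).
    ab_sq_coords = ((math.cos(alpha) - math.cos(beta)) ** 2
                    + (math.sin(alpha) - math.sin(beta)) ** 2)
    # Law of cosines in triangle OAB with OA = OB = 1; cos(angle AOB) = cos(alpha - beta).
    ab_sq_law = 2 - 2 * math.cos(alpha - beta)
    assert math.isclose(ab_sq_coords, ab_sq_law, abs_tol=1e-12)
    # Equating the two expressions gives the difference formula for cosine.
    assert math.isclose(math.cos(alpha - beta),
                        math.cos(alpha) * math.cos(beta) + math.sin(alpha) * math.sin(beta),
                        abs_tol=1e-12)
```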