Prove $\sum\limits_{n=0}^\infty\frac{x^n}{n!}=\lim\limits_{n\rightarrow\infty}\left(1+\frac{x}{n}\right)^n$ [duplicate]











This question already has an answer here:




  • Validity and Equivalence of two definitions of the real exponential function

    2 answers




$x\in\mathbb{R}$. I can prove that both sides converge, and presumably we need to show that
$\lim\limits_{n\rightarrow\infty}\left|\sum\limits_{k=0}^{n}\frac{x^k}{k!}-\left(1+\frac{x}{n}\right)^n\right|=0$, but handling the right-hand side is a little tricky for me.
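Not a proof, of course, but a quick numerical sanity check of the identity (a minimal sketch in Python; the helper names are just illustrative):

    import math

    def series_exp(x, n):
        # Partial sum sum_{k=0}^{n} x^k / k!, accumulated term by term
        # so that no huge powers or factorials are formed separately.
        total, term = 1.0, 1.0
        for k in range(1, n + 1):
            term *= x / k
            total += term
        return total

    def compound_exp(x, n):
        # The compound-interest expression (1 + x/n)^n.
        return (1 + x / n) ** n

    for x in (1.0, -2.5, 3.0):
        for n in (10, 100, 10000):
            print(f"x={x:5} n={n:6} series={series_exp(x, n):.8f} "
                  f"compound={compound_exp(x, n):.8f} exp={math.exp(x):.8f}")

Both columns approach $\exp(x)$, with the partial sums converging much faster than the $(1+x/n)^n$ values.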






























real-analysis sequences-and-series limits
















asked Nov 18 at 22:10









The R

516








marked as duplicate by RRL (real-analysis)
Nov 18 at 22:58


This question has been asked before and already has an answer. If those answers do not fully address your question, please ask a new question.
















  • 1




    Show that they both solve the differential equation $f'(x) = f(x)$ with initial condition $f(0) = 1$ (the solution is unique; this isn't hard to prove). The LHS solves it by finding its Taylor series and the RHS solves it using Euler's method.
    – Qiaochu Yuan
    Nov 18 at 22:15
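To spell out the Euler's-method remark in the comment above (a sketch, not a complete argument): applying Euler's method with step size $h = x/n$ to $y' = y$, $y(0) = 1$ gives
$$y_{k+1} = y_k + h\,y_k = \left(1 + \frac{x}{n}\right) y_k, \qquad y_0 = 1,$$
so after $n$ steps $y_n = \left(1 + \frac{x}{n}\right)^n$, which is exactly the right-hand side; it then remains to show that the Euler approximations converge to the exact solution as $n \to \infty$.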










  • Expand the right side (binomial expansion) and match up term by term (powers of $x$) to the left side.
    – herb steinberg
    Nov 18 at 22:24
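To make the term-by-term comparison suggested in the comment above concrete (again only a sketch): by the binomial theorem, for $n \ge 1$,
$$\left(1 + \frac{x}{n}\right)^n = \sum_{k=0}^{n} \binom{n}{k} \frac{x^k}{n^k} = \sum_{k=0}^{n} \frac{x^k}{k!} \prod_{j=0}^{k-1} \left(1 - \frac{j}{n}\right),$$
and for each fixed $k$ the coefficient $\frac{1}{k!} \prod_{j=0}^{k-1} \left(1 - \frac{j}{n}\right)$ tends to $\frac{1}{k!}$ as $n \to \infty$. Because every term is bounded in absolute value by $\frac{|x|^k}{k!}$, whose sum is finite, the limit may be interchanged with the sum (a tail estimate, or dominated convergence for series), which yields $\sum_{k=0}^{\infty} \frac{x^k}{k!}$.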




























1 Answer



























Rather than proving this directly, I'd proceed as follows:



(1) Define $\exp(x)$ to be the unique solution to $y' = y$ with $y(0) = 1$. Let $\log$ denote its inverse (which exists because $\exp$ is increasing).



(2) Use uniqueness of $\exp$ to show that $\exp(x + y) = \exp(x)\exp(y)$, and thus $\log(xy) = \log(x) + \log(y)$.
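A sketch of the uniqueness argument in (2): fix $y$ and note that $t \mapsto \exp(t + y)/\exp(y)$ solves $u' = u$ with $u(0) = 1$ (here $\exp(y) \neq 0$, since a solution of $y' = y$ that vanishes anywhere is identically zero by uniqueness), so it equals $\exp(t)$, i.e. $\exp(t + y) = \exp(t)\exp(y)$. Applying this with $t = \log a$, $y = \log b$ and taking $\log$ gives $\log(ab) = \log a + \log b$.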



(3) Use the defining differential equation of $\exp$ to show that $\exp(x) = \sum x^n/n!$.
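One way to carry out (3), assuming termwise differentiation is justified (the series and its termwise derivative converge locally uniformly, since the radius of convergence is infinite): the function $f(x) = \sum_{n \ge 0} \frac{x^n}{n!}$ satisfies
$$f'(x) = \sum_{n \ge 1} \frac{n x^{n-1}}{n!} = \sum_{n \ge 1} \frac{x^{n-1}}{(n-1)!} = f(x), \qquad f(0) = 1,$$
so $f = \exp$ by the uniqueness in (1).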



(4) Use the definition of $\log$ to show that $\log'(1) = 1$, then compute the log of the right-hand limit with l'Hopital's rule.
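A sketch of the computation in (4), written here with the difference quotient for $\log'(1)$ rather than l'Hopital (it amounts to the same thing): for $x \neq 0$ and $n$ large enough that $1 + x/n > 0$,
$$\log\left(\left(1 + \frac{x}{n}\right)^n\right) = n \log\left(1 + \frac{x}{n}\right) = x \cdot \frac{\log(1 + x/n) - \log 1}{x/n} \longrightarrow x \log'(1) = x \quad (n \to \infty),$$
so by continuity of $\exp$ the right-hand side tends to $\exp(x)$, which equals the series by (3); the case $x = 0$ is trivial.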












































        edited Nov 19 at 0:27

























        answered Nov 18 at 22:27









        anomaly

        17.1k42662


















