How to create very hard problems on Lagrange Multipliers












This is a rather odd request.

I only recently started studying Lagrange multipliers, and was given the task of creating some challenging problems on them and also providing solutions. Could somebody please suggest how I could get started on this? Some example problems would be welcome!

Thanks very much!












calculus multivariable-calculus lagrange-multiplier






asked May 3 '15 at 7:27 by Ishan (last edited Dec 2 '18 at 16:17)












  • Have you searched the site? Running the risk of excessive self-promotion, I rather liked this trick. Not sure whether I would call it very hard though. – Jyrki Lahtonen, May 3 '15 at 7:46

  • You can use <br> or <hr> to separate parts of the text. See editing help. – Martin Sleziak, May 3 '15 at 7:50

  • @Jyrki It is a nice question. But I need to create at least 5 very difficult questions. Could you suggest ways and examples, please? I've given one of the questions I have in the post (I just edited it). – Ishan, May 3 '15 at 7:55

  • I'm searching now. – Ishan, May 3 '15 at 7:58

  • @Jyrki Sir, I have looked at several problems. However, due to my limited knowledge, I cannot judge which is a very challenging problem. I know I'm asking for a lot, but Sir, would it be possible for you to please select a problem which, according to you, is very, very difficult? I would be truly grateful to you. – Ishan, May 3 '15 at 8:05




























3 Answers



















Sure, here are two:

A really good one to start with is the optimization problem behind Principal Component Analysis:

$$\max_{\mathbf v}\ \langle \mathbf{v},\Sigma_n \mathbf{v}\rangle$$ $$\text{s.t.}$$ $$\|\mathbf{v}\|_2=1$$

where $\mathbf v$ is a vector and $\Sigma_n= \frac 1n\sum_{i=1}^n(\mathbf x_i -\mu_n)(\mathbf x_i - \mu_n)^T$ is the covariance matrix of the data points $\mathbf x_i$.

The maximizing $\mathbf v$ is the vector that points in the direction of the eigenvector of $\Sigma_n$ corresponding to its largest eigenvalue $\lambda_{\max}$.
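A minimal numerical sketch of this one, assuming NumPy and SciPy are available (the random data, the seed, and all variable names below are illustrative only): it maximizes $\langle\mathbf v,\Sigma_n\mathbf v\rangle$ on the unit sphere with a generic constrained solver and compares the result with the top eigenpair.

    # Illustrative check: the constrained maximizer of <v, Sigma v> over ||v||=1
    # should line up with the eigenvector of Sigma for the largest eigenvalue.
    import numpy as np
    from scipy.optimize import minimize

    rng = np.random.default_rng(0)
    X = rng.normal(size=(200, 3))                 # toy data, one point per row
    mu = X.mean(axis=0)
    Sigma = (X - mu).T @ (X - mu) / len(X)        # covariance matrix Sigma_n

    res = minimize(lambda v: -v @ Sigma @ v,      # maximize v^T Sigma v ...
                   x0=np.array([1.0, 0.0, 0.0]),
                   constraints={"type": "eq",     # ... subject to ||v||^2 = 1
                                "fun": lambda v: v @ v - 1.0})

    eigvals, eigvecs = np.linalg.eigh(Sigma)
    print(-res.fun, eigvals[-1])                  # the two maxima should agree
    print(abs(res.x @ eigvecs[:, -1]))            # about 1: same direction up to sign

The same check works in any dimension; only the shape of the data matrix changes.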



Another really good application of Lagrange multipliers, and a genuinely difficult problem, is deriving the Euler equation in economics for logarithmic utility. This is also central to the theory of dynamic programming.

$$\max \sum_{t=0}^{T-1} \ln c_t + \ln x_T$$ $$\text{s.t.}$$ $$x_{t+1} =\alpha (x_t-c_t)$$

Be careful: $x_t$ shows up twice. Interpret it as maximizing consumption and final wealth, where $x_t$ is the wealth at period $t$ (with $t=0$ the initial wealth) and $c_t$ is the consumption in period $t$.

The solution is $x_t^*=\frac {T+1-t}{T+1}\left(\frac 1{\alpha}\right)^{-t}x_0$ and $c_t^*=\frac {\left(\frac 1{\alpha}\right)^{-t}x_0}{T+1}$.
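A quick numerical cross-check of this closed form, again a sketch assuming NumPy and SciPy ($T$, $\alpha$ and $x_0$ below are arbitrary illustrative values): optimize over the consumption path directly and compare it with $c_t^*$.

    # Illustrative check: maximize sum_t ln(c_t) + ln(x_T) with x_{t+1} = a*(x_t - c_t)
    # numerically and compare with the closed-form path c_t* = a^t * x0 / (T+1).
    import numpy as np
    from scipy.optimize import minimize

    T, a, w0 = 5, 1.05, 10.0                       # horizon, alpha, initial wealth

    def neg_utility(c):
        if np.any(c <= 0):
            return 1e6                             # keep the search in the feasible region
        w = w0
        for ct in c:
            w = a * (w - ct)                       # wealth transition
            if w <= 0:
                return 1e6
        return -(np.sum(np.log(c)) + np.log(w))    # w is now the terminal wealth x_T

    c_star = a ** np.arange(T) * w0 / (T + 1)      # claimed closed form
    res = minimize(neg_utility, x0=np.full(T, w0 / (2 * T)),
                   method="Nelder-Mead",
                   options={"xatol": 1e-10, "fatol": 1e-12,
                            "maxiter": 20000, "maxfev": 20000})
    print(np.round(res.x, 4))
    print(np.round(c_star, 4))                     # the two paths should agree closely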



These two problems apply Lagrange multipliers to (1) a problem involving matrix calculus, and (2) a problem in which an optimal consumption path (a function) is determined, which has ties to optimal control and functional optimization.






answered May 3 '15 at 8:13 by user237393

Here is a moderately complicated example, cooked up in such a way that it can be solved explicitly all the way to the end.

Given are the two points $A=(0,-7)$, $B=(-7,0)$, and the circle $\gamma:\ x^2+y^2=4$. Determine two points $P$, $Q\in\gamma$ such that the quantity
$$d(P,Q):=|AP|^2+|PQ|^2+|QB|^2$$
becomes maximal, resp. minimal.

(Figure omitted; it is reproduced on p. 212 of the pdf linked below.)

You will obtain four conditionally stationary situations $(P_k,Q_k)$: the two extremal situations and two "saddle points". The latter had to be expected, since the surface determined by the conditions is a torus.

A full solution (in German) is given on pp. 212–214 of the following pdf file. The file contains a full chapter (pages 128–236) of a textbook for engineering students. Scroll down to page 212; there you will see the figure referred to above.

https://people.math.ethz.ch/~blatter/Inganalysis_5.pdf
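A small numerical sketch of this problem, assuming NumPy (grid size and variable names are my own choices): parametrize $P=(2\cos s,\,2\sin s)$ and $Q=(2\cos t,\,2\sin t)$ and scan $d(P,Q)$ over the torus of angle pairs; the grid extrema approximate the maximal and minimal configurations, while the two saddle configurations only show up through the Lagrange conditions themselves.

    # Illustrative scan of d(P,Q) = |AP|^2 + |PQ|^2 + |QB|^2 over the torus of
    # angle pairs (s, t); the grid extrema approximate the constrained max/min.
    import numpy as np

    A = np.array([0.0, -7.0])
    B = np.array([-7.0, 0.0])

    angles = np.linspace(0.0, 2.0 * np.pi, 1000, endpoint=False)
    s, t = np.meshgrid(angles, angles, indexing="ij")
    P = np.stack([2.0 * np.cos(s), 2.0 * np.sin(s)], axis=-1)
    Q = np.stack([2.0 * np.cos(t), 2.0 * np.sin(t)], axis=-1)

    d = (np.sum((A - P) ** 2, axis=-1)
         + np.sum((P - Q) ** 2, axis=-1)
         + np.sum((Q - B) ** 2, axis=-1))

    i_min = np.unravel_index(np.argmin(d), d.shape)
    i_max = np.unravel_index(np.argmax(d), d.shape)
    print("min d ~", d[i_min], "at P =", P[i_min], "Q =", Q[i_min])
    print("max d ~", d[i_max], "at P =", P[i_max], "Q =", Q[i_max])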






answered May 3 '15 at 9:08 by Christian Blatter (edited May 5 '15 at 11:14)
    • I'm sorry Sir, but there is no page 212 in the link you have given. – Ishan, May 3 '15 at 10:17

    • This file is one long webpage with page numbers and headers for every included book page from 128 to 236. You cannot access page 212 by typing 212 somewhere. Scroll down until you get to page 212. I won't go into fabricating a translation. – Christian Blatter, May 5 '15 at 18:12

    • @ChristianBlatter, very interesting problem. Where did you get inspired to create it? How did you formulate this question? What was your thought process? – danilocn94, Oct 26 '18 at 22:59

    • @danilocn94: I wanted a problem with two conditions but no boundaries to inspect. Then the setup should be as simple as possible, no square roots at the beginning of the computation, and "small numbers" in the result. A lot of "reverse engineering" went into the final version of this problem. – Christian Blatter, Oct 27 '18 at 8:06

    • @ChristianBlatter, what I'd like to know is about the geometry of the problem. How did you formulate this question (geometrically)? What were your inspiration and thought process? – danilocn94, Nov 19 '18 at 7:03




















Here are two other ones:

Exercise 1:

Let $\alpha\in\mathbb{R}_{+}$. Find the optimal constant $C\left(\alpha\right)\in\mathbb{R}$ such that, for all $\left(x,y,z\right)\in\mathbb{R}^{3}$,
$$x^{3}-\left(\frac{z}{\alpha}\right)^{3}\leq C\left(\alpha\right)\left(x^{2}+y^{2}+z^{2}\right)^{3/2},$$
and make the case of equality precise in terms of $\alpha$.

Solution:

We notice that both sides of the above inequality are homogeneous functions of degree $3$: it suffices to consider the problem on the unit sphere $\mathcal{S}=\left\{ \left(x,y,z\right)\in\mathbb{R}^{3};\ x^{2}+y^{2}+z^{2}=1\right\}$.
Let $f$ and $g$ be the functions defined on $\mathbb{R}^{3}$ by
$$\begin{cases}
f\left(x,y,z\right)=x^{3}-\left(\frac{z}{\alpha}\right)^{3}\\
g\left(x,y,z\right)=x^{2}+y^{2}+z^{2}-1
\end{cases}$$

As $f$ is continuous and $\mathcal{S}$ is compact, $f_{\mid\mathcal{S}}$ is bounded and attains its extreme values.
For all $\left(x,y,z\right)\in\mathbb{R}^{3}$ we have
$$\begin{cases}
\mathrm{d}f\left(x,y,z\right)=3x^{2}\,\mathrm{d}x-3\frac{z^{2}}{\alpha^{3}}\,\mathrm{d}z\\
\mathrm{d}g\left(x,y,z\right)=2x\,\mathrm{d}x+2y\,\mathrm{d}y+2z\,\mathrm{d}z
\end{cases}$$
or, in matrix notation in the basis $\left(\mathrm{d}x,\mathrm{d}y,\mathrm{d}z\right)$ of $\left(\mathbb{R}^{3}\right)^{*}$,

$$\begin{pmatrix}3x^{2} & 0 & -3\frac{z^{2}}{\alpha^{3}}\\
2x & 2y & 2z
\end{pmatrix}.$$

Setting to zero all $\binom{3}{2}=3$ of the $2\times2$ determinants, so that $\mathrm{d}f\left(x,y,z\right)$ and $\mathrm{d}g\left(x,y,z\right)$ are linearly dependent, yields:

$$\begin{vmatrix}3x^{2} & 0\\
2x & 2y
\end{vmatrix}=6x^{2}y=0\Longleftrightarrow\left(x=0\right)\text{ or }\left(y=0\right),$$
$$\begin{vmatrix}3x^{2} & -3\frac{z^{2}}{\alpha^{3}}\\
2x & 2z
\end{vmatrix}=6xz\left(x+\frac{z}{\alpha^{3}}\right)=0\Longleftrightarrow\left(x=0\right)\text{ or }\left(z=0\right)\text{ or }\left(x=-\frac{z}{\alpha^{3}}\right),$$
$$\begin{vmatrix}0 & -3\frac{z^{2}}{\alpha^{3}}\\
2y & 2z
\end{vmatrix}=6y\frac{z^{2}}{\alpha^{3}}=0\Longleftrightarrow\left(y=0\right)\text{ or }\left(z=0\right).$$

Then, for all $\left(x,y,z\right)\in\mathbb{R}^{3}$:

$(i)$ if $\alpha<1$, $$f\left(x,y,z\right)\leq\frac{1}{\alpha^{3}}\left(x^{2}+y^{2}+z^{2}\right)^{3/2}$$
with equality iff $\left(x,y,z\right)\in\left\{ \left(0,0,-\lambda\right);\lambda\in\mathbb{R}_{+}^{*}\right\}$;

$(ii)$ if $\alpha=1$, $$f\left(x,y,z\right)\leq\left(x^{2}+y^{2}+z^{2}\right)^{3/2}$$
with equality iff $\left(x,y,z\right)\in\left\{ \left(\lambda,0,0\right);\lambda\in\mathbb{R}_{+}^{*}\right\}$
or $\left(x,y,z\right)\in\left\{ \left(0,0,-\lambda\right);\lambda\in\mathbb{R}_{+}^{*}\right\}$;

$(iii)$ if $\alpha>1$,
$$f\left(x,y,z\right)\leq\left(x^{2}+y^{2}+z^{2}\right)^{3/2}$$
with equality iff $\left(x,y,z\right)\in\left\{ \left(\lambda,0,0\right);\lambda\in\mathbb{R}_{+}^{*}\right\}$.
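As a quick sanity check of these three cases, here is a sketch assuming NumPy (the sample size, seed and test values of $\alpha$ are arbitrary): sample the unit sphere densely and compare $\max f$ with the constant claimed above, namely $1/\alpha^{3}$ for $\alpha<1$ and $1$ for $\alpha\geq1$.

    # Illustrative check: on a dense random sample of the unit sphere, max f
    # should approach C(alpha) = 1/alpha^3 for alpha < 1 and 1 for alpha >= 1.
    import numpy as np

    rng = np.random.default_rng(1)
    v = rng.normal(size=(2_000_000, 3))
    v /= np.linalg.norm(v, axis=1, keepdims=True)   # uniform points on the sphere
    x, y, z = v.T

    for alpha in (0.5, 1.0, 2.0):
        f = x ** 3 - (z / alpha) ** 3
        C_claimed = max(1.0, 1.0 / alpha ** 3)
        print(alpha, f.max(), C_claimed)            # f.max() should be close to C_claimed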



Exercise 2:

Let $n,\alpha\in\mathbb{N}^{*}$ with $1\leq\alpha\leq n$. Find the optimal constant $C\left(n,\alpha\right)\in\mathbb{R}$ such that, for all $\left(x_{1},\ldots,x_{n}\right)\in\mathbb{R}^{n}$,
$$x_{1}\cdots x_{n}\leq C\left(n,\alpha\right)\left(x_{1}^{\alpha}+\ldots+x_{n}^{\alpha}\right)^{n/\alpha},$$
and make the case of equality precise in terms of $\alpha$.

Solution:

We notice that both sides of the above inequality are homogeneous functions of degree $n$: it suffices to consider the problem on the hypersurface $\mathcal{S}=\left\{ \left(x_{1},\ldots,x_{n}\right)\in\mathbb{R}^{n};\ x_{1}^{\alpha}+\ldots+x_{n}^{\alpha}=1\right\}$.
Let $f$ and $g$ be the functions defined on $\mathbb{R}^{n}$ by

$$\begin{cases}
f\left(x_{1},\ldots,x_{n}\right)=x_{1}\cdots x_{n}\\
g\left(x_{1},\ldots,x_{n}\right)=x_{1}^{\alpha}+\ldots+x_{n}^{\alpha}-1
\end{cases}$$

We can assume that $x_{j}>0$ for all $1\leq j\leq n$ (the inequality is stronger when an even number of the $x_{j}$ are negative, and that case reduces to taking them all non-negative).
For all $\left(x_{1},\ldots,x_{n}\right)\in\mathbb{R}_{+}^{n}$ we have

$$\begin{cases}
\mathrm{d}f\left(x_{1},\ldots,x_{n}\right)=\sum_{j=1}^{n}\left(\prod_{k\neq j}x_{k}\right)\mathrm{d}x_{j}\\
\mathrm{d}g\left(x_{1},\ldots,x_{n}\right)=\alpha\sum_{j=1}^{n}x_{j}^{\alpha-1}\,\mathrm{d}x_{j}
\end{cases}$$
or, in matrix notation,

$$\begin{pmatrix}x_{2}x_{3}\ldots x_{n} & x_{1}x_{3}x_{4}\ldots x_{n} & \cdots & x_{1}x_{2}\ldots x_{n-2}x_{n-1}\\
\alpha x_{1}^{\alpha-1} & \alpha x_{2}^{\alpha-1} & \cdots & \alpha x_{n}^{\alpha-1}
\end{pmatrix}.$$

We set to zero all $\binom{n}{2}$ of the $2\times2$ determinants, so that $\mathrm{d}f\left(x_{1},\ldots,x_{n}\right)$ and $\mathrm{d}g\left(x_{1},\ldots,x_{n}\right)$ are linearly dependent: for all $1\leq k,\ell\leq n$ with $k\neq\ell$,
$$\begin{vmatrix}x_{1}\ldots x_{k-1}x_{k+1}\ldots x_{n} & x_{1}\ldots x_{\ell-1}x_{\ell+1}\ldots x_{n}\\
\alpha x_{k}^{\alpha-1} & \alpha x_{\ell}^{\alpha-1}
\end{vmatrix}=\alpha x_{1}\ldots x_{k-1}x_{k+1}\ldots x_{\ell-1}x_{\ell+1}\ldots x_{n}\left(x_{\ell}^{\alpha}-x_{k}^{\alpha}\right)=0
\Longleftrightarrow x_{k}=x_{\ell}\Longrightarrow x_{1}=\left(\frac{1}{n}\right)^{\frac{1}{\alpha}},$$
using the constraint $g\left(x_{1},\ldots,x_{n}\right)=0$.
We get $$f\left(\left(\frac{1}{n}\right)^{\frac{1}{\alpha}},\ldots,\left(\frac{1}{n}\right)^{\frac{1}{\alpha}}\right)=\left(\frac{1}{n}\right)^{\frac{n}{\alpha}}$$
and hence $$f_{\mid\mathcal{S}}\left(x_{1},\ldots,x_{n}\right)\leq\left(\frac{1}{n}\right)^{\frac{n}{\alpha}}.$$
We extend this result by homogeneity to $\mathbb{R}^{n}$:
for all $\left(x_{1},\ldots,x_{n}\right)\in\mathbb{R}^{n}$, $$x_{1}\cdots x_{n}\leq\left(\frac{x_{1}^{\alpha}+\ldots+x_{n}^{\alpha}}{n}\right)^{\frac{n}{\alpha}}$$
with equality iff $x_{1}=\ldots=x_{n}$.
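And a short numerical sketch of the final inequality, assuming NumPy ($n$, $\alpha$ and the sampling range are arbitrary): draw random positive vectors and check both the inequality and the equality case $x_{1}=\ldots=x_{n}$.

    # Illustrative check of x_1*...*x_n <= ((x_1^a + ... + x_n^a)/n)^(n/a)
    # on random positive vectors, plus the equality case of equal coordinates.
    import numpy as np

    rng = np.random.default_rng(2)
    n, a = 5, 3

    x = rng.uniform(0.1, 10.0, size=(100_000, n))
    lhs = np.prod(x, axis=1)
    rhs = np.mean(x ** a, axis=1) ** (n / a)
    print(np.all(lhs <= rhs * (1 + 1e-12)))         # expected: True

    x_eq = np.full(n, 2.5)                          # equal coordinates: equality holds
    print(np.isclose(np.prod(x_eq), np.mean(x_eq ** a) ** (n / a)))   # expected: True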






    • Could you please tell me the meaning of optimal constant? Thanks! – Ishan, May 3 '15 at 10:00

    • @BetterWorld The optimal constant means the smallest one such that the above inequalities hold. – Nicolas, May 3 '15 at 10:40











    Your Answer





    StackExchange.ifUsing("editor", function () {
    return StackExchange.using("mathjaxEditing", function () {
    StackExchange.MarkdownEditor.creationCallbacks.add(function (editor, postfix) {
    StackExchange.mathjaxEditing.prepareWmdForMathJax(editor, postfix, [["$", "$"], ["\\(","\\)"]]);
    });
    });
    }, "mathjax-editing");

    StackExchange.ready(function() {
    var channelOptions = {
    tags: "".split(" "),
    id: "69"
    };
    initTagRenderer("".split(" "), "".split(" "), channelOptions);

    StackExchange.using("externalEditor", function() {
    // Have to fire editor after snippets, if snippets enabled
    if (StackExchange.settings.snippets.snippetsEnabled) {
    StackExchange.using("snippets", function() {
    createEditor();
    });
    }
    else {
    createEditor();
    }
    });

    function createEditor() {
    StackExchange.prepareEditor({
    heartbeatType: 'answer',
    autoActivateHeartbeat: false,
    convertImagesToLinks: true,
    noModals: true,
    showLowRepImageUploadWarning: true,
    reputationToPostImages: 10,
    bindNavPrevention: true,
    postfix: "",
    imageUploader: {
    brandingHtml: "Powered by u003ca class="icon-imgur-white" href="https://imgur.com/"u003eu003c/au003e",
    contentPolicyHtml: "User contributions licensed under u003ca href="https://creativecommons.org/licenses/by-sa/3.0/"u003ecc by-sa 3.0 with attribution requiredu003c/au003e u003ca href="https://stackoverflow.com/legal/content-policy"u003e(content policy)u003c/au003e",
    allowUrls: true
    },
    noCode: true, onDemand: true,
    discardSelector: ".discard-answer"
    ,immediatelyShowMarkdownHelp:true
    });


    }
    });














    draft saved

    draft discarded


















    StackExchange.ready(
    function () {
    StackExchange.openid.initPostLogin('.new-post-login', 'https%3a%2f%2fmath.stackexchange.com%2fquestions%2f1264201%2fhow-to-create-very-hard-problems-on-lagrange-multipliers%23new-answer', 'question_page');
    }
    );

    Post as a guest















    Required, but never shown

























    3 Answers
    3






    active

    oldest

    votes








    3 Answers
    3






    active

    oldest

    votes









    active

    oldest

    votes






    active

    oldest

    votes









    2












    $begingroup$

    Sure, here's 2:



    a really good one to start out with is the optimization problem behind Principal Component Analysis.



    $$text{max$_{bf v}$ } langle bf{v},Sigma_n bf{v}rangle$$ $$text {s.t.}$$ $$||bf {v}||_2=1$$



    where $bf v$ is a vector and $Sigma_n= frac 1nsum_{i=1}^n(bf x_i -mu_n)(bf x_i - mu_n)^T$ is the Covariance matrix of the data points $bf x_i$



    The answer for $bf v$ is the vector that points in the direction of the eigenvector of $Sigma_n$ corresponding to its largest eigenvalue $lambda_{max}$



    Another really good application of Lagrange multipliers/ difficult problem involving Lagrange multipliers is solving for the Euler equation in Economics for logarithmic utility. This is extremely important in the theory of dynamic programming as well.



    $$max sum_{t=0}^{T-1} lnc_t + lnx_T$$ $$text{s.t.}$$ $$x_{t+1} =alpha (x_t-c_t)$$



    Be careful, $x_t$ shows up twice. Interpret as maximizing consumption and final wealth where $x_t$ is the wealth at period $t$ with $t=0$ corresponding to initial wealth and $c_t$ is the consumption for period $t$.



    The solution is: $x_t^*=frac {T+1-t}{T+1}(frac 1{alpha})^{-t}x_0$ and $c_t^*=frac {(frac 1{alpha})^{-t}x_0}{T+1}$



    These two problems include applications of lagrange multipliers to 1. a problem involving matrix calculus. and 2. a problem in which an optimal consumption path is determined (function), which has ties to optimal control and the functional optimization.






    share|cite|improve this answer









    $endgroup$


















      2












      $begingroup$

      Sure, here's 2:



      a really good one to start out with is the optimization problem behind Principal Component Analysis.



      $$text{max$_{bf v}$ } langle bf{v},Sigma_n bf{v}rangle$$ $$text {s.t.}$$ $$||bf {v}||_2=1$$



      where $bf v$ is a vector and $Sigma_n= frac 1nsum_{i=1}^n(bf x_i -mu_n)(bf x_i - mu_n)^T$ is the Covariance matrix of the data points $bf x_i$



      The answer for $bf v$ is the vector that points in the direction of the eigenvector of $Sigma_n$ corresponding to its largest eigenvalue $lambda_{max}$



      Another really good application of Lagrange multipliers/ difficult problem involving Lagrange multipliers is solving for the Euler equation in Economics for logarithmic utility. This is extremely important in the theory of dynamic programming as well.



      $$max sum_{t=0}^{T-1} lnc_t + lnx_T$$ $$text{s.t.}$$ $$x_{t+1} =alpha (x_t-c_t)$$



      Be careful, $x_t$ shows up twice. Interpret as maximizing consumption and final wealth where $x_t$ is the wealth at period $t$ with $t=0$ corresponding to initial wealth and $c_t$ is the consumption for period $t$.



      The solution is: $x_t^*=frac {T+1-t}{T+1}(frac 1{alpha})^{-t}x_0$ and $c_t^*=frac {(frac 1{alpha})^{-t}x_0}{T+1}$



      These two problems include applications of lagrange multipliers to 1. a problem involving matrix calculus. and 2. a problem in which an optimal consumption path is determined (function), which has ties to optimal control and the functional optimization.






      share|cite|improve this answer









      $endgroup$
















        2












        2








        2





        $begingroup$

        Sure, here's 2:



        a really good one to start out with is the optimization problem behind Principal Component Analysis.



        $$text{max$_{bf v}$ } langle bf{v},Sigma_n bf{v}rangle$$ $$text {s.t.}$$ $$||bf {v}||_2=1$$



        where $bf v$ is a vector and $Sigma_n= frac 1nsum_{i=1}^n(bf x_i -mu_n)(bf x_i - mu_n)^T$ is the Covariance matrix of the data points $bf x_i$



        The answer for $bf v$ is the vector that points in the direction of the eigenvector of $Sigma_n$ corresponding to its largest eigenvalue $lambda_{max}$



        Another really good application of Lagrange multipliers/ difficult problem involving Lagrange multipliers is solving for the Euler equation in Economics for logarithmic utility. This is extremely important in the theory of dynamic programming as well.



        $$max sum_{t=0}^{T-1} lnc_t + lnx_T$$ $$text{s.t.}$$ $$x_{t+1} =alpha (x_t-c_t)$$



        Be careful, $x_t$ shows up twice. Interpret as maximizing consumption and final wealth where $x_t$ is the wealth at period $t$ with $t=0$ corresponding to initial wealth and $c_t$ is the consumption for period $t$.



        The solution is: $x_t^*=frac {T+1-t}{T+1}(frac 1{alpha})^{-t}x_0$ and $c_t^*=frac {(frac 1{alpha})^{-t}x_0}{T+1}$



        These two problems include applications of lagrange multipliers to 1. a problem involving matrix calculus. and 2. a problem in which an optimal consumption path is determined (function), which has ties to optimal control and the functional optimization.






        share|cite|improve this answer









        $endgroup$



        Sure, here's 2:



        a really good one to start out with is the optimization problem behind Principal Component Analysis.



        $$text{max$_{bf v}$ } langle bf{v},Sigma_n bf{v}rangle$$ $$text {s.t.}$$ $$||bf {v}||_2=1$$



        where $bf v$ is a vector and $Sigma_n= frac 1nsum_{i=1}^n(bf x_i -mu_n)(bf x_i - mu_n)^T$ is the Covariance matrix of the data points $bf x_i$



        The answer for $bf v$ is the vector that points in the direction of the eigenvector of $Sigma_n$ corresponding to its largest eigenvalue $lambda_{max}$



        Another really good application of Lagrange multipliers/ difficult problem involving Lagrange multipliers is solving for the Euler equation in Economics for logarithmic utility. This is extremely important in the theory of dynamic programming as well.



        $$max sum_{t=0}^{T-1} lnc_t + lnx_T$$ $$text{s.t.}$$ $$x_{t+1} =alpha (x_t-c_t)$$



        Be careful, $x_t$ shows up twice. Interpret as maximizing consumption and final wealth where $x_t$ is the wealth at period $t$ with $t=0$ corresponding to initial wealth and $c_t$ is the consumption for period $t$.



        The solution is: $x_t^*=frac {T+1-t}{T+1}(frac 1{alpha})^{-t}x_0$ and $c_t^*=frac {(frac 1{alpha})^{-t}x_0}{T+1}$



        These two problems include applications of lagrange multipliers to 1. a problem involving matrix calculus. and 2. a problem in which an optimal consumption path is determined (function), which has ties to optimal control and the functional optimization.







        share|cite|improve this answer












        share|cite|improve this answer



        share|cite|improve this answer










        answered May 3 '15 at 8:13









        user237393user237393

        449211




        449211























            3












            $begingroup$

            Here is a moderately complicated example, cooked up in such a way that it can be solved explicitly all the way to the end.



            Given are the two points $A=(0,-7)$, $B=(-7,0)$, and the circle $gamma: >x^2+y^2=4$. Determine two points $P$, $Qingamma$ such that the quantity
            $$d(P,Q):=|AP|^2+|PQ|^2+|QB|^2$$
            becomes maximal, resp., minimal.



            enter image description here



            You will obtain four conditionally stationary situations $(P_k,Q_k)$: the two extremal situations and two "saddle points". The latter had to be expected, since the surface determined by the conditions is a torus.



            A full solution (in German) is given on pp. 212–214 of the following pdf-file. The file contains a full chapter (pages 128–236) of a textbook for engineering students. Scroll down to page 212; there you will see the figure printed above.



            https://people.math.ethz.ch/~blatter/Inganalysis_5.pdf






            share|cite|improve this answer











            $endgroup$













            • $begingroup$
              I'm sorry Sir, but there is no page 212 in the link you have given.
              $endgroup$
              – Ishan
              May 3 '15 at 10:17












            • $begingroup$
              This file is one long webpage with page numbers and headers for every included book page from 128 to 236. You cannot access page 212 by typing 212 somewhere. Scroll down until you get to page 212. I won't go into fabricating a translation.
              $endgroup$
              – Christian Blatter
              May 5 '15 at 18:12










            • $begingroup$
              @ChristianBlatter , very interesting problem. Where did you get inspired to create it? How did you formulate this question ? What was your thought process ?
              $endgroup$
              – danilocn94
              Oct 26 '18 at 22:59






            • 1




              $begingroup$
              @danilocn94: I wanted a problem with two conditions but no boundaries to inspect. Then the setup should be as simple as possible, no square roots at the beginning of the computation, and "small numbers" in the result. A lot of "reverse engineering" went into the final version of this problem.
              $endgroup$
              – Christian Blatter
              Oct 27 '18 at 8:06












            • $begingroup$
              @ChristianBlatter , what I'd like to know is about the geometry of the problem. How did you formulate this question (geometrically) ? What were your inspiration and thought process ?
              $endgroup$
              – danilocn94
              Nov 19 '18 at 7:03
















            3












            $begingroup$

            Here is a moderately complicated example, cooked up in such a way that it can be solved explicitly all the way to the end.



            Given are the two points $A=(0,-7)$, $B=(-7,0)$, and the circle $gamma: >x^2+y^2=4$. Determine two points $P$, $Qingamma$ such that the quantity
            $$d(P,Q):=|AP|^2+|PQ|^2+|QB|^2$$
            becomes maximal, resp., minimal.



            enter image description here



            You will obtain four conditionally stationary situations $(P_k,Q_k)$: the two extremal situations and two "saddle points". The latter had to be expected, since the surface determined by the conditions is a torus.



            A full solution (in German) is given on pp. 212–214 of the following pdf-file. The file contains a full chapter (pages 128–236) of a textbook for engineering students. Scroll down to page 212; there you will see the figure printed above.



            https://people.math.ethz.ch/~blatter/Inganalysis_5.pdf






            share|cite|improve this answer











            $endgroup$













            • $begingroup$
              I'm sorry Sir, but there is no page 212 in the link you have given.
              $endgroup$
              – Ishan
              May 3 '15 at 10:17












            • $begingroup$
              This file is one long webpage with page numbers and headers for every included book page from 128 to 236. You cannot access page 212 by typing 212 somewhere. Scroll down until you get to page 212. I won't go into fabricating a translation.
              $endgroup$
              – Christian Blatter
              May 5 '15 at 18:12










            • $begingroup$
              @ChristianBlatter , very interesting problem. Where did you get inspired to create it? How did you formulate this question ? What was your thought process ?
              $endgroup$
              – danilocn94
              Oct 26 '18 at 22:59






            • 1




              $begingroup$
              @danilocn94: I wanted a problem with two conditions but no boundaries to inspect. Then the setup should be as simple as possible, no square roots at the beginning of the computation, and "small numbers" in the result. A lot of "reverse engineering" went into the final version of this problem.
              $endgroup$
              – Christian Blatter
              Oct 27 '18 at 8:06












            • $begingroup$
              @ChristianBlatter , what I'd like to know is about the geometry of the problem. How did you formulate this question (geometrically) ? What were your inspiration and thought process ?
              $endgroup$
              – danilocn94
              Nov 19 '18 at 7:03














            3












            3








            3





            $begingroup$

            Here is a moderately complicated example, cooked up in such a way that it can be solved explicitly all the way to the end.



            Given are the two points $A=(0,-7)$, $B=(-7,0)$, and the circle $gamma: >x^2+y^2=4$. Determine two points $P$, $Qingamma$ such that the quantity
            $$d(P,Q):=|AP|^2+|PQ|^2+|QB|^2$$
            becomes maximal, resp., minimal.



            enter image description here



            You will obtain four conditionally stationary situations $(P_k,Q_k)$: the two extremal situations and two "saddle points". The latter had to be expected, since the surface determined by the conditions is a torus.



            A full solution (in German) is given on pp. 212–214 of the following pdf-file. The file contains a full chapter (pages 128–236) of a textbook for engineering students. Scroll down to page 212; there you will see the figure printed above.



            https://people.math.ethz.ch/~blatter/Inganalysis_5.pdf






            share|cite|improve this answer











            $endgroup$



            Here is a moderately complicated example, cooked up in such a way that it can be solved explicitly all the way to the end.



            Given are the two points $A=(0,-7)$, $B=(-7,0)$, and the circle $gamma: >x^2+y^2=4$. Determine two points $P$, $Qingamma$ such that the quantity
            $$d(P,Q):=|AP|^2+|PQ|^2+|QB|^2$$
            becomes maximal, resp., minimal.



            enter image description here



            You will obtain four conditionally stationary situations $(P_k,Q_k)$: the two extremal situations and two "saddle points". The latter had to be expected, since the surface determined by the conditions is a torus.



            A full solution (in German) is given on pp. 212–214 of the following pdf-file. The file contains a full chapter (pages 128–236) of a textbook for engineering students. Scroll down to page 212; there you will see the figure printed above.



            https://people.math.ethz.ch/~blatter/Inganalysis_5.pdf







            share|cite|improve this answer














            share|cite|improve this answer



            share|cite|improve this answer








            edited May 5 '15 at 11:14

























            answered May 3 '15 at 9:08









            Christian BlatterChristian Blatter

            172k7113326




            172k7113326












            • $begingroup$
              I'm sorry Sir, but there is no page 212 in the link you have given.
              $endgroup$
              – Ishan
              May 3 '15 at 10:17












            • $begingroup$
              This file is one long webpage with page numbers and headers for every included book page from 128 to 236. You cannot access page 212 by typing 212 somewhere. Scroll down until you get to page 212. I won't go into fabricating a translation.
              $endgroup$
              – Christian Blatter
              May 5 '15 at 18:12










            • $begingroup$
              @ChristianBlatter , very interesting problem. Where did you get inspired to create it? How did you formulate this question ? What was your thought process ?
              $endgroup$
              – danilocn94
              Oct 26 '18 at 22:59






            • 1




              $begingroup$
              @danilocn94: I wanted a problem with two conditions but no boundaries to inspect. Then the setup should be as simple as possible, no square roots at the beginning of the computation, and "small numbers" in the result. A lot of "reverse engineering" went into the final version of this problem.
              $endgroup$
              – Christian Blatter
              Oct 27 '18 at 8:06












            • $begingroup$
              @ChristianBlatter , what I'd like to know is about the geometry of the problem. How did you formulate this question (geometrically) ? What were your inspiration and thought process ?
              $endgroup$
              – danilocn94
              Nov 19 '18 at 7:03


















            • $begingroup$
              I'm sorry Sir, but there is no page 212 in the link you have given.
              $endgroup$
              – Ishan
              May 3 '15 at 10:17












            • $begingroup$
              This file is one long webpage with page numbers and headers for every included book page from 128 to 236. You cannot access page 212 by typing 212 somewhere. Scroll down until you get to page 212. I won't go into fabricating a translation.
              $endgroup$
              – Christian Blatter
              May 5 '15 at 18:12










            • $begingroup$
              @ChristianBlatter , very interesting problem. Where did you get inspired to create it? How did you formulate this question ? What was your thought process ?
              $endgroup$
              – danilocn94
              Oct 26 '18 at 22:59






            • 1




              $begingroup$
              @danilocn94: I wanted a problem with two conditions but no boundaries to inspect. Then the setup should be as simple as possible, no square roots at the beginning of the computation, and "small numbers" in the result. A lot of "reverse engineering" went into the final version of this problem.
              $endgroup$
              – Christian Blatter
              Oct 27 '18 at 8:06












            • $begingroup$
              @ChristianBlatter , what I'd like to know is about the geometry of the problem. How did you formulate this question (geometrically) ? What were your inspiration and thought process ?
              $endgroup$
              – danilocn94
              Nov 19 '18 at 7:03
















            $begingroup$
            I'm sorry Sir, but there is no page 212 in the link you have given.
            $endgroup$
            – Ishan
            May 3 '15 at 10:17






            $begingroup$
            I'm sorry Sir, but there is no page 212 in the link you have given.
            $endgroup$
            – Ishan
            May 3 '15 at 10:17














            $begingroup$
            This file is one long webpage with page numbers and headers for every included book page from 128 to 236. You cannot access page 212 by typing 212 somewhere. Scroll down until you get to page 212. I won't go into fabricating a translation.
            $endgroup$
            – Christian Blatter
            May 5 '15 at 18:12




            $begingroup$
            This file is one long webpage with page numbers and headers for every included book page from 128 to 236. You cannot access page 212 by typing 212 somewhere. Scroll down until you get to page 212. I won't go into fabricating a translation.
            $endgroup$
            – Christian Blatter
            May 5 '15 at 18:12












            $begingroup$
            @ChristianBlatter , very interesting problem. Where did you get inspired to create it? How did you formulate this question ? What was your thought process ?
            $endgroup$
            – danilocn94
            Oct 26 '18 at 22:59




            $begingroup$
            @ChristianBlatter , very interesting problem. Where did you get inspired to create it? How did you formulate this question ? What was your thought process ?
            $endgroup$
            – danilocn94
            Oct 26 '18 at 22:59




            1




            1




            $begingroup$
            @danilocn94: I wanted a problem with two conditions but no boundaries to inspect. Then the setup should be as simple as possible, no square roots at the beginning of the computation, and "small numbers" in the result. A lot of "reverse engineering" went into the final version of this problem.
            $endgroup$
            – Christian Blatter
            Oct 27 '18 at 8:06






            $begingroup$
            @danilocn94: I wanted a problem with two conditions but no boundaries to inspect. Then the setup should be as simple as possible, no square roots at the beginning of the computation, and "small numbers" in the result. A lot of "reverse engineering" went into the final version of this problem.
            $endgroup$
            – Christian Blatter
            Oct 27 '18 at 8:06














            $begingroup$
            @ChristianBlatter , what I'd like to know is about the geometry of the problem. How did you formulate this question (geometrically) ? What were your inspiration and thought process ?
            $endgroup$
            – danilocn94
            Nov 19 '18 at 7:03




            $begingroup$
            @ChristianBlatter , what I'd like to know is about the geometry of the problem. How did you formulate this question (geometrically) ? What were your inspiration and thought process ?
            $endgroup$
            – danilocn94
            Nov 19 '18 at 7:03











            2












            $begingroup$

            Here two other ones :



            Exercice 1 :




            Let $alphainmathbb{R}_{+}$. For all $left(x,y,zright)inmathbb{R}^{3}$, find the optimal constant $Cleft(alpharight)inmathbb{R}$
            such that $$x^{3}-left(frac{z}{alpha}right)^{3}leq Cleft(alpharight)left(x^{2}+y^{2}+z^{2}right)^{3/2}$$
            and precise the case of equality in term of $alpha$.




            Solution :



            We notice that the both sizes of the above equation are homogeneous functions of degree $3$ : it suffices to consider the problem on the unit sphere $mathcal{S}=left{ left(x,y,zright)inmathbb{R}^{3};x^{2}+y^{2}+z^{2}=1right}$.
            Soient $f$ et $g$ the functions defined on $mathbb{R}^{3}$ by $$begin{cases}
            fleft(x,y,zright)=x^{3}-left(frac{z}{alpha}right)^{3}\
            gleft(x,y,zright)=x^{2}+y^{2}+z^{2}-1
            end{cases}.$$



            As $f$ is continuous and $mathcal{S}$ is compact, $f_{midmathcal{S}}$ is bounded and attain its maximum values.
            For all $left(x,y,zright)inmathbb{R}^{3}$, we have
            $$begin{cases}
            mathrm{d}fleft(x,y,zright)=3x^{2}dx-3frac{z^{2}}{alpha^{3}}dz\
            mathrm{d}gleft(x,y,zright)=2xdx+2ydy+2zdz
            end{cases}$$
            or in matricial notations in the basis $left(mathrm{d}x,mathrm{d}y,mathrm{d}zright)$
            of $left(mathbb{R}^{3}right)^{*}$



            $$begin{pmatrix}3x^{2} & 0 & -3frac{z^{2}}{alpha^{3}}\
            2x & 2y & 2z
            end{pmatrix}.$$



            Cancelling all the $begin{pmatrix}3\2end{pmatrix}=3$ $2times2$-determinants so that $mathrm{d}fleft(x,y,zright)$ and $mathrm{d}gleft(x,y,zright)$ are linearly dependant yields :



            $$begin{vmatrix}3x^{2} & 0\
            2x & 2y
            end{vmatrix}=6x^{2}y=0Longleftrightarrowleft(x=0right)textrm{ ou }left(y=0right).$$
            $$ begin{vmatrix}3x^{2} & -3frac{z^{2}}{alpha^{3}}
            2x & 2z
            end{vmatrix}=6xzleft(x+frac{z}{alpha^{3}}right)=0Longleftrightarrowleft(x=0right)textrm{ ou }left(z=0right)textrm{ ou }left(x=-frac{z}{alpha^{3}}right).$$
            $$begin{vmatrix}0 & 3frac{z^{2}}{alpha^{3}}\
            2y & 2z
            end{vmatrix}=6yfrac{z^{2}}{alpha^{3}}=0Longleftrightarrowleft(y=0right)textrm{ ou }left(z=0right).$$



            Then, for all $left(x,y,zright)inmathbb{R}^{3}$ :



            $(i)$ if $alpha<1$, $$fleft(x,y,zright)leqfrac{1}{alpha^{3}}left(x^{2}+y^{2}+z^{2}right)^{3/2}$$
            with equality iff $left(x,y,zright)inleft{ left(0,0,-lambdaright);lambdainmathbb{R}_{+}^{*}right}$ ;



            $(ii)$ if $alpha=1$, $$fleft(x,y,zright)leqleft(x^{2}+y^{2}+z^{2}right)^{3/2}$$
            with equality iff $left(left(x,y,zright)inleft{ left(lambda,0,0right);lambdainmathbb{R}_{+}^{*}right} right)$
            or $left(left(x,y,zright)inleft{ left(0,0,-lambdaright);lambdainmathbb{R}_{+}^{*}right} right)$ ;



            $(iii)$ If $alpha>1$,
            $$fleft(x,y,zright)leqleft(x^{2}+y^{2}+z^{2}right)^{3/2}$$
            with equality iff $left(x,y,zright)inleft{ left(lambda,0,0right);lambdainmathbb{R}_{+}^{*}right} $.



            Exercice 2 :




            Let $n$,$alphainmathbb{N}^{*}$, $1leqalphaleq n$.
            For all $left(x_{1},ldots,x_{n}right)inmathbb{R}^{n}$, find the optimal constant $Cleft(n,alpharight)inmathbb{R}$ such that
            $$x_{1}ldots x_{n}leq Cleft(n,alpharight)left(x_{1}^{alpha}+ldots+x_{n}^{alpha}right)^{n/alpha}$$
            and precise the case of equality in term of $alpha$.




            Solution :



            We notice that the both sizes of the above equation are homogeneous functions of degree $n$ : it suffices to consider the problem on the hypersurface $mathcal{S}=left{ left(x_{1},ldots,x_{n}right)inmathbb{R}^{n};x_{1}^{alpha}+ldots+x_{n}^{alpha}=1right}$.
            Let $f$ and $g$ the functions defined on $mathbb{R}^{n}$ by



            $$begin{cases}
            fleft(x_{1},ldots,x_{n}right)=x_{1}ldots x_{n}\
            gleft(x_{1},ldots,x_{n}right)=x_{1}^{alpha}+ldots+x_{n}^{alpha}-1
            end{cases}.$$



            We can assume that $x_{j}>0$ for all $1leq jleq n$ (the inequality is stronger in the case where there is an even number of $x_j<0$, which is the same that consider them all non-negative).
            For all $left(x_{1},ldots,x_{n}right)inmathbb{R}_{+}^{n}$, we have



            $$begin{cases}
            dfleft(x_{1},ldots,x_{n}right)=sum_{j=1}^{n}left(prod_{kneq j}x_{k}right)dx_{j}\
            dgleft(x_{1},ldots,x_{n}right)=alphasum_{j=1}^{n}left(x_{j}^{alpha-1}right)dx_{j}
            end{cases}$$
            or in matricial notations



            $$begin{pmatrix}x_{2}x_{3}ldots x_{n} & x_{1}x_{3}x_{4}ldots x_{n} & cdots & x_{1}x_{2}ldots x_{n-2}x_{n-1}\
            alpha x_{1}^{alpha-1} & alpha x_{2}^{alpha-1} & cdots & alpha x_{n}^{alpha-1}
            end{pmatrix}.$$



            We cancel all the $begin{pmatrix}n\2end{pmatrix}$ $2times2$-determinants so that $mathrm{d}fleft(x_{1},ldots,x_{n}right)$ and $mathrm{d}gleft(x_{1},ldots,x_{n}right)$ linearly dependant : for all $1leq k$, $ellleq n$ with $kneqell$,
            $$begin{vmatrix}x_{1}ldots x_{k-1}x_{k+1}x_{n} & x_{1}ldots x_{ell-1}x_{ell+1}ldots x_{n}\
            alpha x_{k}^{alpha-1} & alpha x_{ell}^{alpha-1}
            end{vmatrix}=alpha x_{1}ldots x_{k-1}x_{k+1}ldots x_{ell-1}x_{ell+1}ldots x_{n}left(x_{ell}^{alpha}-x_{k}^{alpha}right)=0
            Longleftrightarrow x_{k}=x_{ell}Longrightarrow x_{1}=left(frac{1}{n}right)^{frac{1}{alpha}}$$
            with the constraint $gleft(x_{1},ldots,x_{n}right)=0$.
            We get $$fleft(left(frac{1}{n}right)^{frac{1}{alpha}},ldots,left(frac{1}{n}right)^{frac{1}{alpha}}right)=left(frac{1}{n}right)^{frac{n}{alpha}}$$
            and hence $$f_{midmathcal{S}}left(x_{1},ldots,x_{n}right)leqleft(frac{1}{n}right)^{frac{1}{alpha}}.$$
            We extend this result by homothetie to $mathbb{R}^{n}$ :
            for all $left(x_{1},ldots,x_{n}right)inmathbb{R}^{n}$, $$x_{1}ldots x_{n}leqleft(frac{x_{1}^{alpha}+ldots+x_{n}^{alpha}}{n}right)^{frac{n}{alpha}}$$
            with equality iff $x_{1}=ldots=x_{n}$.






            share|cite|improve this answer











            $endgroup$













            • $begingroup$
              Could you please tell me the meaning of optimal constant? Thanks!
              $endgroup$
              – Ishan
              May 3 '15 at 10:00










            • $begingroup$
              @BetterWorld The optimal constant means the smallest one such that the above equalities hold.
              $endgroup$
              – Nicolas
              May 3 '15 at 10:40
















            2












            $begingroup$

            Here two other ones :



            Exercice 1 :




            Let $alphainmathbb{R}_{+}$. For all $left(x,y,zright)inmathbb{R}^{3}$, find the optimal constant $Cleft(alpharight)inmathbb{R}$
            such that $$x^{3}-left(frac{z}{alpha}right)^{3}leq Cleft(alpharight)left(x^{2}+y^{2}+z^{2}right)^{3/2}$$
            and precise the case of equality in term of $alpha$.




            Solution :



            We notice that the both sizes of the above equation are homogeneous functions of degree $3$ : it suffices to consider the problem on the unit sphere $mathcal{S}=left{ left(x,y,zright)inmathbb{R}^{3};x^{2}+y^{2}+z^{2}=1right}$.
            Soient $f$ et $g$ the functions defined on $mathbb{R}^{3}$ by $$begin{cases}
            fleft(x,y,zright)=x^{3}-left(frac{z}{alpha}right)^{3}\
            gleft(x,y,zright)=x^{2}+y^{2}+z^{2}-1
            end{cases}.$$



            As $f$ is continuous and $mathcal{S}$ is compact, $f_{midmathcal{S}}$ is bounded and attain its maximum values.
            For all $left(x,y,zright)inmathbb{R}^{3}$, we have
            $$begin{cases}
            mathrm{d}fleft(x,y,zright)=3x^{2}dx-3frac{z^{2}}{alpha^{3}}dz\
            mathrm{d}gleft(x,y,zright)=2xdx+2ydy+2zdz
            end{cases}$$
            or in matricial notations in the basis $left(mathrm{d}x,mathrm{d}y,mathrm{d}zright)$
            of $left(mathbb{R}^{3}right)^{*}$



            $$begin{pmatrix}3x^{2} & 0 & -3frac{z^{2}}{alpha^{3}}\
            2x & 2y & 2z
            end{pmatrix}.$$



            Cancelling all the $begin{pmatrix}3\2end{pmatrix}=3$ $2times2$-determinants so that $mathrm{d}fleft(x,y,zright)$ and $mathrm{d}gleft(x,y,zright)$ are linearly dependant yields :



            $$begin{vmatrix}3x^{2} & 0\
            2x & 2y
            end{vmatrix}=6x^{2}y=0Longleftrightarrowleft(x=0right)textrm{ ou }left(y=0right).$$
            $$ begin{vmatrix}3x^{2} & -3frac{z^{2}}{alpha^{3}}
            2x & 2z
            end{vmatrix}=6xzleft(x+frac{z}{alpha^{3}}right)=0Longleftrightarrowleft(x=0right)textrm{ ou }left(z=0right)textrm{ ou }left(x=-frac{z}{alpha^{3}}right).$$
            $$begin{vmatrix}0 & 3frac{z^{2}}{alpha^{3}}\
            2y & 2z
            end{vmatrix}=6yfrac{z^{2}}{alpha^{3}}=0Longleftrightarrowleft(y=0right)textrm{ ou }left(z=0right).$$



            Then, for all $left(x,y,zright)inmathbb{R}^{3}$ :



            $(i)$ if $alpha<1$, $$fleft(x,y,zright)leqfrac{1}{alpha^{3}}left(x^{2}+y^{2}+z^{2}right)^{3/2}$$
            with equality iff $left(x,y,zright)inleft{ left(0,0,-lambdaright);lambdainmathbb{R}_{+}^{*}right}$ ;



            $(ii)$ if $alpha=1$, $$fleft(x,y,zright)leqleft(x^{2}+y^{2}+z^{2}right)^{3/2}$$
            with equality iff $left(left(x,y,zright)inleft{ left(lambda,0,0right);lambdainmathbb{R}_{+}^{*}right} right)$
            or $left(left(x,y,zright)inleft{ left(0,0,-lambdaright);lambdainmathbb{R}_{+}^{*}right} right)$ ;



            $(iii)$ If $alpha>1$,
            $$fleft(x,y,zright)leqleft(x^{2}+y^{2}+z^{2}right)^{3/2}$$
            with equality iff $left(x,y,zright)inleft{ left(lambda,0,0right);lambdainmathbb{R}_{+}^{*}right} $.



            Exercice 2 :




            Let $n$,$alphainmathbb{N}^{*}$, $1leqalphaleq n$.
            For all $left(x_{1},ldots,x_{n}right)inmathbb{R}^{n}$, find the optimal constant $Cleft(n,alpharight)inmathbb{R}$ such that
            $$x_{1}ldots x_{n}leq Cleft(n,alpharight)left(x_{1}^{alpha}+ldots+x_{n}^{alpha}right)^{n/alpha}$$
            and precise the case of equality in term of $alpha$.




            Solution :



            We notice that the both sizes of the above equation are homogeneous functions of degree $n$ : it suffices to consider the problem on the hypersurface $mathcal{S}=left{ left(x_{1},ldots,x_{n}right)inmathbb{R}^{n};x_{1}^{alpha}+ldots+x_{n}^{alpha}=1right}$.
            Let $f$ and $g$ the functions defined on $mathbb{R}^{n}$ by



            $$begin{cases}
            fleft(x_{1},ldots,x_{n}right)=x_{1}ldots x_{n}\
            gleft(x_{1},ldots,x_{n}right)=x_{1}^{alpha}+ldots+x_{n}^{alpha}-1
            end{cases}.$$



            We can assume that $x_{j}>0$ for all $1leq jleq n$ (the inequality is stronger in the case where there is an even number of $x_j<0$, which is the same that consider them all non-negative).
            For all $left(x_{1},ldots,x_{n}right)inmathbb{R}_{+}^{n}$, we have



            $$begin{cases}
            dfleft(x_{1},ldots,x_{n}right)=sum_{j=1}^{n}left(prod_{kneq j}x_{k}right)dx_{j}\
            dgleft(x_{1},ldots,x_{n}right)=alphasum_{j=1}^{n}left(x_{j}^{alpha-1}right)dx_{j}
            end{cases}$$
            or in matricial notations



            $$begin{pmatrix}x_{2}x_{3}ldots x_{n} & x_{1}x_{3}x_{4}ldots x_{n} & cdots & x_{1}x_{2}ldots x_{n-2}x_{n-1}\
            alpha x_{1}^{alpha-1} & alpha x_{2}^{alpha-1} & cdots & alpha x_{n}^{alpha-1}
            end{pmatrix}.$$



            We cancel all the $begin{pmatrix}n\2end{pmatrix}$ $2times2$-determinants so that $mathrm{d}fleft(x_{1},ldots,x_{n}right)$ and $mathrm{d}gleft(x_{1},ldots,x_{n}right)$ linearly dependant : for all $1leq k$, $ellleq n$ with $kneqell$,
            $$begin{vmatrix}x_{1}ldots x_{k-1}x_{k+1}x_{n} & x_{1}ldots x_{ell-1}x_{ell+1}ldots x_{n}\
            alpha x_{k}^{alpha-1} & alpha x_{ell}^{alpha-1}
            end{vmatrix}=alpha x_{1}ldots x_{k-1}x_{k+1}ldots x_{ell-1}x_{ell+1}ldots x_{n}left(x_{ell}^{alpha}-x_{k}^{alpha}right)=0
            Longleftrightarrow x_{k}=x_{ell}Longrightarrow x_{1}=left(frac{1}{n}right)^{frac{1}{alpha}}$$
            with the constraint $gleft(x_{1},ldots,x_{n}right)=0$.
            We get $$fleft(left(frac{1}{n}right)^{frac{1}{alpha}},ldots,left(frac{1}{n}right)^{frac{1}{alpha}}right)=left(frac{1}{n}right)^{frac{n}{alpha}}$$
            and hence $$f_{midmathcal{S}}left(x_{1},ldots,x_{n}right)leqleft(frac{1}{n}right)^{frac{1}{alpha}}.$$
            We extend this result by homothetie to $mathbb{R}^{n}$ :
            for all $left(x_{1},ldots,x_{n}right)inmathbb{R}^{n}$, $$x_{1}ldots x_{n}leqleft(frac{x_{1}^{alpha}+ldots+x_{n}^{alpha}}{n}right)^{frac{n}{alpha}}$$
            with equality iff $x_{1}=ldots=x_{n}$.






            share|cite|improve this answer











            $endgroup$













            • $begingroup$
              Could you please tell me the meaning of optimal constant? Thanks!
              $endgroup$
              – Ishan
              May 3 '15 at 10:00










            • $begingroup$
              @BetterWorld The optimal constant means the smallest one such that the above equalities hold.
              $endgroup$
              – Nicolas
              May 3 '15 at 10:40














edited May 3 '15 at 10:47

answered May 3 '15 at 8:44

– Nicolas











