Maximizing a weighted sum of logarithms











I'm trying to show that for:



$$p^* \equiv \max_{x}\ \sum_{k=1}^{n} a_k \ln x_k$$



Where:



$
x \in \mathbb{R}^n \\
x \geq 0 \\
\sum_{k=1}^{n} x_k = c \\
c > 0 \\
a_k > 0 \ \forall k \\
a \equiv \sum_{k=1}^{n} a_k
$



That:



$$
p^* = a\ln\frac{c}{a}+\sum_{k=1}^{n} a_k \ln a_k
$$



I'm really not certain where to even begin with this derivation.










  • Try the KKT approach. – LinAlg, Nov 14 at 21:36
  • @LinAlg what's a good resource for KKT? I'm not super familiar. – mmam, Nov 14 at 22:08
  • en.wikipedia.org/wiki/… – LinAlg, Nov 14 at 23:02















inequality optimization summation logarithms convex-optimization






asked Nov 14 at 20:59 by mmam; edited Nov 22 at 17:57 by Batominovski

2 Answers
I shall come back with a Lagrangian approach tomorrow. However, I now present a solution using the weighted AM-GM inequality.

We have
$$\sum_{k=1}^n \frac{a_k}{a}\left(\frac{a\,x_k}{a_k}\right)=\sum_{k=1}^n x_k=c\,,$$
with weights that sum to one:
$$\sum_{k=1}^n \frac{a_k}{a}=\frac{\sum\limits_{k=1}^n a_k}{a}=\frac{a}{a}=1\,.$$
By the weighted AM-GM inequality,
$$c=\sum_{k=1}^n \frac{a_k}{a}\left(\frac{a\,x_k}{a_k}\right)\geq \prod_{k=1}^n \left(\frac{a\,x_k}{a_k}\right)^{\frac{a_k}{a}}\,.$$
Taking the logarithm of both sides of the inequality above, we get
$$\ln(c)\geq \sum_{k=1}^n \frac{a_k}{a}\,\ln\left(\frac{a\,x_k}{a_k}\right)=\frac{1}{a}\sum_{k=1}^n \Big(a_k\ln\left(x_k\right)-a_k\ln\left(a_k\right)+a_k\ln(a)\Big)\,.$$
Ergo,
$$\sum_{k=1}^n a_k\ln\left(x_k\right)\leq a\ln(c)-\left(\sum_{k=1}^n a_k\right)\ln(a)+\sum_{k=1}^n a_k\ln\left(a_k\right)=a\ln\left(\frac{c}{a}\right)+\sum_{k=1}^n a_k\ln\left(a_k\right)\,.$$
The inequality above becomes an equality if and only if
$$\frac{x_1}{a_1}=\frac{x_2}{a_2}=\ldots=\frac{x_n}{a_n}=\frac{c}{a}\,,$$
or equivalently,
$$\left(x_1,x_2,\ldots,x_n\right)=\left(\frac{c\,a_1}{a},\frac{c\,a_2}{a},\ldots,\frac{c\,a_n}{a}\right)\,.$$
This shows that
$$p^*= a\ln\left(\frac{c}{a}\right)+\sum_{k=1}^n a_k\ln\left(a_k\right)\,.$$

answered Nov 22 at 18:09 by Batominovski
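A quick numerical sanity check of the closed form and of the equality condition above (a sketch using NumPy; not part of the original answer):

```python
import numpy as np

rng = np.random.default_rng(0)
a = rng.uniform(0.5, 2.0, size=5)   # arbitrary positive weights a_k
c = 3.0
A = a.sum()                          # "a" in the answer's notation

# Claimed maximizer x_k = c * a_k / a and the closed-form optimum
x_star = c * a / A
p_star = float(np.sum(a * np.log(x_star)))
closed = float(A * np.log(c / A) + np.sum(a * np.log(a)))
assert np.isclose(p_star, closed)

# Any other feasible point (x > 0, sum x_k = c) does no better
for _ in range(1000):
    x = rng.uniform(0.01, 1.0, size=5)
    x *= c / x.sum()                 # rescale onto the constraint sum x_k = c
    assert np.sum(a * np.log(x)) <= p_star + 1e-9
```

Random feasible points never beat the claimed optimum, consistent with the AM-GM bound.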






With the Lagrange multiplier method, you want to maximize
$$p(x)=\sum_k a_k \ln(x_k) + \lambda\left(c-\sum_k x_k\right)\,.$$

Differentiating with respect to the $x_k$ and setting the derivatives equal to zero gives $\frac{a_k}{x_k}=\lambda$.

Imposing the constraint gives $c=\sum_k x_k=\frac{\sum_k a_k}{\lambda}=\frac{a}{\lambda}$, so $\lambda=a/c$.

Hence $x_k=\frac{a_k}{\lambda}=\frac{c\,a_k}{a}$.

Therefore
$$p^*=\sum_k a_k \ln(x_k)=\sum_k a_k \ln\left(\frac{c\,a_k}{a}\right)=\sum_k a_k \left(\ln\frac{c}{a} + \ln a_k\right)=a\ln\frac{c}{a}+\sum_k a_k \ln(a_k)\,.$$

answered Nov 22 at 18:37 by user121049
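The stationarity and constraint conditions above can be checked numerically for a concrete choice of weights (a small NumPy sketch; the values of `a` and `c` are illustrative, not from the question):

```python
import numpy as np

a = np.array([1.0, 2.0, 3.0])        # weights a_k > 0
c = 6.0
A = a.sum()                          # "a" in the notation above

lam = A / c                          # lambda = a / c
x = a / lam                          # stationary point x_k = a_k / lambda = c a_k / a

assert np.isclose(x.sum(), c)        # the equality constraint holds
assert np.allclose(a / x, lam)       # first-order condition a_k / x_k = lambda
p = float(np.sum(a * np.log(x)))
assert np.isclose(p, A * np.log(c / A) + np.sum(a * np.log(a)))
```

Since the objective is concave and the constraint is linear, this stationary point is the global maximum.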





