Showing the Bernstein polynomials form a basis











Hello, I want to show that the Bernstein polynomials $$B_{n,k}(x)=\binom{n}{k}x^k(1-x)^{n-k},\qquad k=0,\dots,n,$$ form a basis. For linear independence I got a hint from my teacher to expand the binomial $(1-x)^{n-k}$. This way I get $$B_{n,k}(x)=\binom{n}{k}x^k\sum_{j=0}^{n-k}\binom{n-k}{j}(-1)^jx^j,$$ and shifting the index of summation gives $$B_{n,k}(x)=\sum_{j=k}^{n}\binom{n}{k}\binom{n-k}{j-k}(-1)^{j-k}x^{j}=\sum_{j=k}^{n}\binom{n}{j}\binom{j}{k}(-1)^{j-k}x^j.$$ Now I have to show that all the $\alpha_i$ are $0$ in the relation $\sum_{i=0}^{n}\alpha_iB_{n,i}=0$, i.e. $$\alpha_0\sum_{j=0}^n(-1)^j\binom{n}{j}\binom{j}{0}x^j+\alpha_1\sum_{j=1}^n(-1)^{j-1}\binom{n}{j}\binom{j}{1}x^j+\dots+\alpha_{n}\sum_{j=n}^n(-1)^{j-n}\binom{n}{j}\binom{j}{n}x^j=0.$$ What can I do now, and how can I finish this problem? Thanks in advance!
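For small $n$ the claimed monomial expansion can be checked symbolically; the following is a minimal sketch assuming SymPy is available (the degree $n=4$ is just an arbitrary test value):

```python
# Check B_{n,k}(x) == sum_{j=k}^n (-1)^(j-k) * C(n,j) * C(j,k) * x^j for one small n.
from sympy import symbols, binomial, expand

x = symbols('x')
n = 4  # arbitrary small test degree

for k in range(n + 1):
    bernstein = binomial(n, k) * x**k * (1 - x)**(n - k)
    claimed = sum((-1)**(j - k) * binomial(n, j) * binomial(j, k) * x**j
                  for j in range(k, n + 1))
    assert expand(bernstein - claimed) == 0, (n, k)

print("monomial expansion verified for n =", n)
```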










Tags: linear-algebra, polynomials, hamel-basis






asked May 1 at 11:25 by Zacky, edited May 1 at 11:29

3 Answers






Accepted answer (score 3)










When you expand them, you see that only one of the Bernstein polynomials has a non-zero constant term, namely $B_{n,0}(x)=\binom{n}{0}(1-x)^n$. So, if $$\sum_{k=0}^n\alpha_kB_{n,k}(x)=0,\tag1$$ then $\alpha_0=0$.



Now, there are only two Bernstein polynomials in which the coefficient of $x$ is non-zero, namely $B_{n,0}(x)$ and $B_{n,1}(x)$. But you already know that $\alpha_0=0$, so it follows from $(1)$ that $\alpha_1=0$.



          And so on…
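To see this coefficient-by-coefficient argument concretely, here is a minimal sketch assuming SymPy is available (degree $n=3$ chosen arbitrarily); it collects the coefficients of $1,x,\dots,x^n$ in $\sum_k\alpha_k B_{n,k}(x)$ and solves the resulting triangular system:

```python
# Collect the monomial coefficients of sum_k alpha_k * B_{n,k}(x) and solve for the alphas.
from sympy import symbols, binomial, Poly, solve

x = symbols('x')
n = 3  # arbitrary small test degree
alphas = symbols(f'alpha0:{n + 1}')  # alpha0, ..., alpha3

combo = sum(a * binomial(n, k) * x**k * (1 - x)**(n - k)
            for k, a in enumerate(alphas))

# Every coefficient of 1, x, ..., x^n must vanish; the only solution is the trivial one.
equations = Poly(combo, x).all_coeffs()
print(solve(equations, alphas))  # {alpha0: 0, alpha1: 0, alpha2: 0, alpha3: 0}
```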






answered May 1 at 11:35 by José Carlos Santos, edited Oct 8 at 16:59
• I am stuck on the first part. Are you saying that the only non-zero coefficient of $x$ is the one in $B_{n,n}$? All I could do after you said to expand them is to show that $\alpha_0=0$, but that is the coefficient of $B_{n,0}$.
            – Zacky
            May 1 at 13:34












• @Zacky You're right. My mistake. I've edited my answer and I hope that everything is clear now.
            – José Carlos Santos
            May 1 at 13:38


















Answer (score 3)













          Hint: The matrix that expresses the Bernstein polynomials with respect to the canonical monomial basis is triangular with a diagonal of binomial coefficients, and so is invertible.



For instance, when $n=3$, we have
$$
\begin{pmatrix}
B_{3,0}(x) \\ B_{3,1}(x) \\ B_{3,2}(x) \\ B_{3,3}(x)
\end{pmatrix}
=
\begin{pmatrix}
(1-x)^3 \\ 3x(1-x)^2 \\ 3x^2(1-x) \\ x^3
\end{pmatrix}
=
\begin{pmatrix}
1 & -3 & \hphantom{-}3 & -1 \\
0 & \hphantom{-}3 & -6 & \hphantom{-}3 \\
0 & \hphantom{-}0 & \hphantom{-}3 & -3 \\
0 & \hphantom{-}0 & \hphantom{-}0 & \hphantom{-}1
\end{pmatrix}
\begin{pmatrix}
1 \\ x \\ x^2 \\ x^3
\end{pmatrix}
$$
          The exact entries in the matrix are not important. The key point is that the $x^k$ factor in $B_{n,k}(x)$ ensures that in the $k$-th row all entries before the diagonal are zero, and so the matrix is triangular.
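As a concrete check of this hint, here is a minimal sketch assuming SymPy is available (degree $n=5$ chosen arbitrarily); it builds the coefficient matrix of the Bernstein polynomials in the monomial basis and confirms it is triangular with non-zero diagonal, hence invertible:

```python
# Build the matrix whose k-th row holds the coefficients of B_{n,k}(x) on 1, x, ..., x^n.
from sympy import symbols, binomial, Poly, Matrix

x = symbols('x')
n = 5  # arbitrary test degree

rows = []
for k in range(n + 1):
    p = Poly(binomial(n, k) * x**k * (1 - x)**(n - k), x)
    rows.append(p.all_coeffs()[::-1])  # coefficients in ascending powers of x

M = Matrix(rows)
assert M.is_upper                       # the x^k factor makes every entry below the diagonal zero
print([M[k, k] for k in range(n + 1)])  # diagonal entries binomial(n, k): all non-zero
print(M.det())                          # non-zero determinant, so the B_{n,k} are independent
```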






answered May 1 at 12:36 by lhf, edited May 2 at 11:50
          • Thank you for the answer, how can I find that matrix?
            – Zacky
            May 1 at 12:39










• @Zacky, you don't need to find the matrix explicitly, just to argue that it is triangular. Note the factor $x^k$.
            – lhf
            May 1 at 12:46










• A bit too hard for me, but thanks!
            – Zacky
            May 1 at 13:49


















Answer (score 1)













First of all, note the relation between the power basis and the Bernstein polynomials [equation shown as an image in the original answer].

Also, if there exist constants $c_0, c_1, \dots, c_n$ such that the identity $$0 = c_0\,B_{n,0}(t) + c_1\,B_{n,1}(t) + \dots + c_n\,B_{n,n}(t)$$ holds for all $t$, then all the $c_i$ must be zero: writing each $B_{n,k}(t)$ out in the power basis [equation shown as an image in the original answer] and using that the power basis is a linearly independent set, every coefficient in the resulting expansion must vanish [system shown as an image in the original answer]. This implies that $c_0 = c_1 = \dots = c_n = 0$ ($c_0$ is clearly zero; substituting this into the second equation gives $c_1 = 0$, substituting these two into the third equation gives $c_2 = 0$, and so on).
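Since $n+1$ linearly independent polynomials in the $(n+1)$-dimensional space of polynomials of degree at most $n$ automatically span it, the spanning direction follows at once; it can also be checked directly with the standard identity $x^j=\sum_{k=j}^{n}\frac{\binom{k}{j}}{\binom{n}{j}}B_{n,k}(x)$. The following is a minimal sketch assuming SymPy is available (degree $n=4$ chosen arbitrarily):

```python
# Verify x^j == sum_{k=j}^n ( C(k,j) / C(n,j) ) * B_{n,k}(x) for every j, one small n.
from sympy import symbols, binomial, expand, Rational

x = symbols('x')
n = 4  # arbitrary small test degree

def bernstein(n, k):
    return binomial(n, k) * x**k * (1 - x)**(n - k)

for j in range(n + 1):
    rhs = sum(Rational(binomial(k, j), binomial(n, j)) * bernstein(n, k)
              for k in range(j, n + 1))
    assert expand(rhs - x**j) == 0, j

print("each monomial 1, x, ..., x^n lies in the span of the Bernstein polynomials")
```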






answered Nov 22 at 9:07 by Mahmood Dadkhah, edited Nov 22 at 9:32
          • Welcome to MSE. For some basic information about writing mathematics at this site see, e.g., basic help on mathjax notation, mathjax tutorial and quick reference, main meta site math tutorial and equation editing how-to.
            – José Carlos Santos
            Nov 22 at 9:29










