Relation between different presentations of a positive semidefinite matrix























Let $P$ be a positive semidefinite $n\times n$ matrix of rank $r$. Then we may write $$ P=\sum_{j=1}^r v_jv_j^*, \quad\text{where } v_1,\ldots,v_r\in\mathbb C^n.$$



If we also have $P=\sum_{j=1}^r w_jw_j^*$ for some $w_1,\ldots,w_r\in\mathbb C^n$, what is the relation between $v_1,\ldots,v_r$ and $w_1,\ldots,w_r$?











linear-algebra matrices functional-analysis

asked Nov 22 at 4:20 · edited Nov 22 at 22:16 · Martin Argerami

          2 Answers






























I am not sure what purpose your exercise serves, but the answer can be more succinct. Put the vectors $v_i$ together as the columns of a matrix $V$, and define $W$ analogously. Then both $V$ and $W$ have full column rank and $VV^\ast=WW^\ast$. Hence $V^\ast x=0$ if and only if $W^\ast x=0$, i.e. the column spaces of $V$ and $W$ have a common orthogonal complement $\mathcal X$. Let $X$ be a matrix whose columns form a basis of $\mathcal X$. Then both augmented matrices $[V|X]$ and $[W|X]$ are invertible. Since $VV^\ast=WW^\ast$, we get $[V|X][V|X]^\ast = [W|X][W|X]^\ast$. Thus $U=[W|X]^{-1}[V|X]$ is a unitary matrix. Comparing columns in $[V|X]=[W|X]U$ (the columns of $V$ lie in the column space of $W$, and the columns of $X$ are reproduced exactly), one sees that $U=U_1\oplus I$ is block diagonal, with $U_1\in M_r(\mathbb C)$ unitary and $V=WU_1$.
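To see the construction in action, here is a small numerical sketch (assuming NumPy and SciPy; the random test matrices and the helper names `VX`, `WX`, `U1` are mine, not part of the answer):

    import numpy as np
    from scipy.linalg import null_space

    rng = np.random.default_rng(0)
    n, r = 6, 3

    # Random n-by-r matrix V of full column rank, and W = V Q for a random
    # unitary Q, so that V V* = W W* (the same rank-r positive semidefinite P).
    V = rng.standard_normal((n, r)) + 1j * rng.standard_normal((n, r))
    Q, _ = np.linalg.qr(rng.standard_normal((r, r)) + 1j * rng.standard_normal((r, r)))
    W = V @ Q

    # X: orthonormal basis of the common orthogonal complement, i.e. ker(V*).
    X = null_space(V.conj().T)

    VX = np.hstack([V, X])
    WX = np.hstack([W, X])
    U = np.linalg.solve(WX, VX)          # U = [W|X]^{-1} [V|X]

    print(np.allclose(U @ U.conj().T, np.eye(n)))                  # U is unitary
    U1 = U[:r, :r]                                                 # upper-left r-by-r block
    print(np.allclose(U[r:, :r], 0), np.allclose(U[:r, r:], 0))    # block diagonal
    print(np.allclose(V, W @ U1))                                  # V = W U1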






answered Nov 22 at 5:13 · community wiki · user1551























• Very neat.
  – Martin Argerami
  Nov 22 at 5:27































The relation is that there exists a unitary $U\in M_r(\mathbb C)$ such that $$w_j=\sum_{k=1}^r U_{jk}v_k.$$

Indeed, fix an orthonormal basis $e_1,\ldots,e_n$ of $\mathbb C^n$ whose first $r$ vectors span the range of $P$ (which contains every $v_k$ and $w_k$). Then we may write
$$
v_k=\sum_{j=1}^r s_{jk}e_j,\qquad w_k=\sum_{j=1}^r t_{jk}e_j.
$$

Now consider $S,T\in M_r(\mathbb C)$ with $S_{kj}=s_{kj}$, $T_{kj}=t_{kj}$. Then
$$
P=\sum_{k=1}^r v_kv_k^*=\sum_{k=1}^r\sum_{j,\ell=1}^r s_{jk}\overline{s_{\ell k}}\,e_je_\ell^*=\sum_{k=1}^r\sum_{j,\ell=1}^r S_{jk}(S^*)_{k\ell}\,e_je_\ell^*=\sum_{j,\ell=1}^r (SS^*)_{j\ell}\,e_je_\ell^*.
$$

Similarly,
$$
P=\sum_{j,\ell=1}^r (TT^*)_{j\ell}\,e_je_\ell^*.
$$

As the $e_je_\ell^*$ ($1\le j,\ell\le r$) are linearly independent (they play the role of the matrix units of $M_r(\mathbb C)$), we get $TT^*=SS^*$. Now write the polar decompositions
$$
S^*=W(SS^*)^{1/2},\qquad T^*=V(TT^*)^{1/2},
$$

where $W,V\in M_r(\mathbb C)$ are unitaries. Then $S=(SS^*)^{1/2}W^*$, and
$$
T=(TT^*)^{1/2}V^*=(SS^*)^{1/2}V^*=SWV^*=SZ,
$$

where $Z=WV^*$. Now
\begin{align}
w_j&=\sum_{k=1}^r t_{kj}e_k=\sum_{k=1}^r(SZ)_{kj}e_k=\sum_{k,m=1}^r S_{km}Z_{mj}e_k\\
&=\sum_{m=1}^r Z_{mj}\sum_{k=1}^r S_{km}e_k=\sum_{m=1}^r Z_{mj}v_m\\
&=\sum_{m=1}^r U_{jm}v_m,
\end{align}

where $U=Z^T$.
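For readers who want a numerical check of the steps above, here is a short sketch (assuming NumPy and SciPy; the matrices `Vmat`, `Wmat`, `E` and the random test data are mine, not part of the answer) that builds $S$ and $T$, takes the polar decompositions, and verifies that $U=Z^T$ is unitary and relates the two families:

    import numpy as np
    from scipy.linalg import polar

    rng = np.random.default_rng(1)
    n, r = 6, 3

    # Columns of Vmat are v_1,...,v_r; Wmat = Vmat Q (Q unitary) gives the same P.
    Vmat = rng.standard_normal((n, r)) + 1j * rng.standard_normal((n, r))
    Q, _ = np.linalg.qr(rng.standard_normal((r, r)) + 1j * rng.standard_normal((r, r)))
    Wmat = Vmat @ Q
    assert np.allclose(Vmat @ Vmat.conj().T, Wmat @ Wmat.conj().T)   # same P

    # e_1,...,e_r: orthonormal basis of the range of P; S, T hold the coefficients.
    E, _ = np.linalg.qr(Vmat)              # n-by-r, orthonormal columns
    S = E.conj().T @ Vmat                  # S_{jk} = <e_j, v_k>
    T = E.conj().T @ Wmat
    assert np.allclose(S @ S.conj().T, T @ T.conj().T)               # SS* = TT*

    # Polar decompositions S* = W (SS*)^{1/2} and T* = V (TT*)^{1/2};
    # scipy's polar() returns a unitary factor even when the input is singular.
    W, _ = polar(S.conj().T)
    V, _ = polar(T.conj().T)
    Z = W @ V.conj().T
    U = Z.T

    print(np.allclose(U @ U.conj().T, np.eye(r)))     # U is unitary
    print(np.allclose(Wmat, Vmat @ U.T))              # w_j = sum_k U_{jk} v_k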






answered Nov 22 at 4:20 · edited Nov 22 at 13:34 · Martin Argerami























• By polar decomposition, $W,V$ are only partial isometries; why are they unitaries? Besides, the other representation of $P$ may be a sum of $m>r$ rank-one matrices.
  – mathrookie
  Nov 22 at 21:06












• If you do the polar decomposition in $M_r(\mathbb C)$, you can always take them to be unitaries. As for $m>r$, enlarge the smaller family with zeroes.
  – Martin Argerami
  Nov 22 at 22:15










