Intuition for similar matrices












If matrices $A$ and $B$ satisfy the definition of similar matrices, i.e. $B=PAP^{-1}$, then we can interpret $A$ and $B$ as the same linear transformation under different bases. But my question is: how can I "grok" this interpretation just by looking at the definition of similar matrices?







      linear-algebra matrices






asked May 25 '12 at 16:33 by chaohuang, edited Jul 15 '12 at 19:56






















4 Answers






Sure. $P$ is a linear transformation which takes things to a new basis. After applying $B$, you bring your (transformed) basis back to where it used to be. If this is the same thing as your original transformation ($A$), then $A$ and $B$ must fundamentally be the same transform, just in a different basis. I visualize it with a rotation example: rotate to a new basis ($P$), do your rotation $B$, and rotate back ($P^{-1}$). If this is the same as the single rotation $A$, then they were the same.

$Start \rightarrow^P Rotated$

$\downarrow A \hspace{40pt} \downarrow B$

$End \leftarrow^{P^{-1}} Twice$

If you end at the same place then $A$ and $B$ must be the same.







answered Jun 29 '12 at 0:51 by Robert Mastragostino
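To make the rotation picture concrete, here is a minimal NumPy sketch (my own illustration with assumed angles, not part of the answer above): $A$ rotates about the $z$-axis, $P$ tilts the frame about the $x$-axis, and $B=PAP^{-1}$ is the same rotation seen in the tilted frame; it fixes the tilted axis and has the same trace, yet the two matrices differ.

```python
import numpy as np

def rot_z(t):
    """Rotation by angle t about the z-axis."""
    c, s = np.cos(t), np.sin(t)
    return np.array([[c, -s, 0.0], [s, c, 0.0], [0.0, 0.0, 1.0]])

def rot_x(t):
    """Rotation by angle t about the x-axis."""
    c, s = np.cos(t), np.sin(t)
    return np.array([[1.0, 0.0, 0.0], [0.0, c, -s], [0.0, s, c]])

A = rot_z(0.7)                      # "do your rotation" in the original frame
P = rot_x(0.3)                      # change of basis: tilt the whole frame
B = P @ A @ np.linalg.inv(P)        # the same rotation, written in the tilted frame

axis = P @ np.array([0.0, 0.0, 1.0])          # the tilted z-axis
assert np.allclose(B @ axis, axis)            # B fixes the tilted axis
assert np.allclose(np.trace(B), np.trace(A))  # same angle: trace = 1 + 2*cos(theta)
assert not np.allclose(A, B)                  # but A and B are different matrices
```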























I always understood that the matrix $P$ is a change-of-basis matrix from the domain to itself. So, since the codomain and the domain are the same vector space, the definition is saying that $A$ and $B$ are similar if transforming by $A$ is the same as changing basis, then transforming via $B$, then changing back again.
Hope it helps.







answered May 25 '12 at 17:10 by Alex J Best, edited Dec 23 '18 at 22:56
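A quick numerical check of this statement (my own sketch with randomly generated matrices, not from the answer): with $B=PAP^{-1}$, applying $A$ to a vector gives the same result as changing basis with $P$, applying $B$, and changing back with $P^{-1}$.

```python
import numpy as np

rng = np.random.default_rng(1)
P = rng.standard_normal((4, 4))       # assumed invertible (true with probability 1)
A = rng.standard_normal((4, 4))
B = P @ A @ np.linalg.inv(P)          # similar to A by definition

x = rng.standard_normal(4)
# transform by A  ==  change basis, transform by B, change back
assert np.allclose(A @ x, np.linalg.inv(P) @ (B @ (P @ x)))
```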























I'm not sure if this helps, but...

The way I think of it is that if $\pi \in \mathbb{R}^n$ is the representation of the vector $v$ in the basis formed from the columns of $P$ (i.e., $v = \sum \pi_i p_i$), then $P\pi$ is the representation of $v$ in the basis $e_1,\dots,e_n$ (since $v = \sum \pi_i p_i = \sum \pi_i P e_i = P \sum \pi_i e_i = \sum [P\pi]_i e_i$).

Specifically, this means that $P$ is the identity mapping with different bases for the domain and range ($v$ in basis $p_1,\dots,p_n$ maps to $v$ in basis $e_1,\dots,e_n$).

Similarly, $P^{-1}$ is the identity mapping with the domain and range bases swapped.

So, I think of the $P$ and $P^{-1}$ in the expression $P^{-1}AP$ as identity mappings, and both $A$ and $B$ represent the same linear operator in some 'coordinate-free' space.







answered May 25 '12 at 17:34 by copper.hat
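Here is a small NumPy sketch of this coordinate bookkeeping (my own example with made-up matrices): if $\pi$ holds the coordinates of $v$ in the basis given by the columns of $P$, then $P\pi$ recovers $v$ in the standard basis, and applying $B = PAP^{-1}$ to standard coordinates agrees with converting to $P$-coordinates, applying $A$, and converting back.

```python
import numpy as np

rng = np.random.default_rng(0)
P = rng.standard_normal((3, 3))     # columns p_1, ..., p_n form the new basis
A = rng.standard_normal((3, 3))     # the operator written in the P-basis
B = P @ A @ np.linalg.inv(P)        # the same operator in the standard basis

x = rng.standard_normal(3)          # a vector in standard coordinates
pi = np.linalg.solve(P, x)          # its coordinates in the P-basis: pi = P^{-1} x

# P acts as the "identity map" between coordinate systems: x = sum_i pi_i p_i
assert np.allclose(P @ pi, x)

# B in standard coordinates == convert to P-coordinates, apply A, convert back
assert np.allclose(B @ x, P @ (A @ pi))
```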























My current understanding; not sure this helps:

$B=PAP^{-1}$ means $BP=PA$, so if $A$ is diagonal, $BP=PA$ is just the definition of eigenvectors, which is easy to understand. Since we know a matrix and its diagonal form are the same transformation under different bases, to extrapolate to non-diagonal cases we can say that any square matrices satisfying $B'P=PA'$, i.e. $B'=PA'P^{-1}$, represent the same transformation under different bases; such matrices are called similar.

Edit: $B=PAP^{-1}$ means $A=P^{-1}BP$. Think of $P$ as the change-of-basis matrix; then $P^{-1}BP$ means change basis first, then do the linear transformation, then change the basis back, so $A$ and $B$ represent the same linear transformation under different bases. See here: https://youtu.be/P2LTAUO1TdA?t=9m12s







answered Jul 15 '12 at 20:13 by chaohuang, edited Sep 12 '16 at 2:58
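A short NumPy sketch of the diagonal case (my own example with assumed numbers): diagonalizing a matrix $B$ gives $BP = PA$ with $A$ diagonal, which column by column is exactly the eigenvector equation $Bp_i = \lambda_i p_i$, so $B$ and its diagonal form $A$ are similar.

```python
import numpy as np

B = np.array([[2.0, 1.0],
              [1.0, 2.0]])

eigvals, P = np.linalg.eig(B)   # columns of P are eigenvectors of B
A = np.diag(eigvals)            # diagonal form of B (eigenvalues on the diagonal)

# B P = P A, column by column: B p_i = lambda_i p_i
assert np.allclose(B @ P, P @ A)

# Equivalently B = P A P^{-1}, so B and A are similar
assert np.allclose(B, P @ A @ np.linalg.inv(P))
```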





























