Are these square matrices always diagonalisable?


When trying to solve a physics problem on decoupling a system of ODEs, I found myself needing to address the following problem:




Let $A_n\in M_n(\mathbb R)$ be the matrix with all $1$s directly above its main diagonal, all $-1$s directly below its main diagonal, and $0$s everywhere else. Is $A_n$ always diagonalisable? If so, what is its diagonalisation (equivalently: what are its eigenvalues and corresponding eigenvectors)?




For example,
$$A_3=\begin{bmatrix}0&1&0\\-1&0&1\\0&-1&0\end{bmatrix},\quad A_5=\begin{bmatrix}0&1&0&0&0\\-1&0&1&0&0\\0&-1&0&1&0\\0&0&-1&0&1\\0&0&0&-1&0\end{bmatrix}.$$




Assuming my code is correct, Mathematica has been able to verify that $A_n$ is diagonalisable for every $n$ up to $1000$. If we use $\chi_n(t)\in\mathbb Z[t]$ to denote the characteristic polynomial of $A_n$, a straightforward cofactor expansion also shows that
$$\chi_n(t)=-t\chi_{n-1}(t)+\chi_{n-2}(t)\tag{1}$$
for all $n\geq4$. Furthermore, note that $A_n=-A_n^t$, so that in the case where the dimension is even,
$$\det(A_{2n}-\lambda I)=\det(A_{2n}^t-\lambda I)=\det(-A_{2n}-\lambda I)=\det(A_{2n}+\lambda I).$$
This implies that whenever $\lambda$ is an eigenvalue of $A_{2n}$, so is $-\lambda$. In other words, $\chi_{2n}(t)$ is always of the form $(t^2-\lambda_1^2)(t^2-\lambda_2^2)\dotsm(t^2-\lambda_n^2)$ for some $\lambda_i$.
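
For reference, here is a minimal version of that numerical check, written as a Python/NumPy sketch standing in for the Mathematica computation mentioned above (the constructor A(n) is just an illustrative helper):

    import numpy as np

    def A(n):
        """Illustrative helper: the n x n matrix with +1 on the superdiagonal,
        -1 on the subdiagonal, and 0 elsewhere."""
        return np.diag(np.ones(n - 1), 1) - np.diag(np.ones(n - 1), -1)

    for n in (3, 4, 5, 8, 13):
        # Real skew-symmetric matrices have purely imaginary spectra,
        # so it suffices to look at the imaginary parts of the eigenvalues.
        s = np.sort(np.linalg.eigvals(A(n)).imag)
        print(n,
              np.allclose(s, -s[::-1]),      # eigenvalues come in +/- pairs
              np.min(np.diff(s)) > 1e-9)     # and appear to be pairwise distinct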



And this is where I am stuck. In order for $A_n$ to be diagonalisable, we must have that all the eigenvalues are distinct, but trying to use the recurrence $(1)$ with strong induction, or trying to use the factored form in the even case, has not helped at all. It seems like the most promising line of attack would be to somehow show that
$$\chi_{2n}'(t)=2t\sum_{k=1}^n\frac{\chi_{2n}(t)}{t^2-\lambda_k^2}$$
never shares a common zero with $\chi_{2n}$ (which would resolve the even case), though I don't see how to make this work.
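
As a related sanity check, the "no common zero" condition can be tested symbolically for small $n$ straight from the recurrence $(1)$; below is a Python/SymPy sketch (the helper char_poly is introduced purely for illustration):

    import sympy as sp

    t = sp.symbols('t')

    def char_poly(n):
        """Illustrative helper: chi_n(t) = det(A_n - t*I), built from the
        recurrence (1) together with chi_0 = 1 and chi_1 = -t."""
        chi_prev, chi = sp.Integer(1), -t
        for _ in range(n - 1):
            chi_prev, chi = chi, sp.expand(-t * chi + chi_prev)
        return chi

    for n in range(2, 13):
        # A nonzero discriminant means chi_n has no repeated roots, i.e.
        # chi_n and chi_n' share no common zero.
        print(n, sp.discriminant(char_poly(n), t) != 0)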




Note: I do not have any clue how to actually find the eigenvalues/eigenvectors, even in the case where the $A_n$ are diagonalisable. So even if someone cannot answer the second part of the question but can prove that the $A_n$ are diagonalisable, I would appreciate that as an answer as well. Above I looked at the special case where the dimension is even, though of course a proof for all $n$, odd and even, would be more valuable. Even if this is not possible, for my purposes I just need an unbounded subset $S\subseteq\mathbb Z$ for which the conclusion holds for all $n\in S$, so any such approach is welcome too.



Thank you in advance!










linear-algebra eigenvalues-eigenvectors determinant diagonalization

asked Apr 23 at 22:43 by YiFan







  • All eigenvalues distinct is a sufficient but not necessary condition for a matrix to be diagonalizable. – Henning Makholm, Apr 24 at 0:23










  • @HenningMakholm that's a very good point. But before the responses to the question, it was the only method I knew (hence all my approaches were based on it). – YiFan, Apr 24 at 4:37















3 Answers
The matrix $A_n$ is a tridiagonal Toeplitz matrix with diagonal entries $\delta = 0$ and off-diagonal entries $\tau = 1$ and $\sigma = -1$. Hence, we can use the formula in this paper to show that the eigenvalues are $$\lambda_k = 2i\cos\left(\dfrac{k\pi}{n+1}\right),$$ for $k = 1,\ldots,n$, and the corresponding eigenvectors $v_1,\ldots,v_n$ have entries $$v_k[m] = i^m\sin\left(\dfrac{mk\pi}{n+1}\right).$$
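
A quick numerical spot check of these formulas, added as a Python/NumPy sketch for illustration (the helper A(n) simply rebuilds the matrix from the question):

    import numpy as np

    def A(n):
        """Illustrative helper: rebuilds the matrix A_n from the question."""
        return np.diag(np.ones(n - 1), 1) - np.diag(np.ones(n - 1), -1)

    n = 7
    k = np.arange(1, n + 1)
    predicted = 2j * np.cos(k * np.pi / (n + 1))   # eigenvalues from the closed form
    computed = np.linalg.eigvals(A(n))             # eigenvalues computed numerically

    # Both spectra are purely imaginary, so compare them sorted by imaginary part.
    predicted = predicted[np.argsort(predicted.imag)]
    computed = computed[np.argsort(computed.imag)]
    print(np.allclose(predicted, computed))        # expected output: True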






answered Apr 23 at 22:52 (edited Apr 23 at 22:58) by JimmyK4542








  • There's probably a connection to be made between these matrices and the Chebyshev polynomials, but I can't quite nail it down. Note that the roots of the Chebyshev polynomial $T_n(\lambda)$ are given by $\lambda = \cos(k\pi/n)$ for $k \in \mathcal Z$. It's also possible to show that the characteristic polynomials $P_n(\lambda)$ of the given matrices obey the recursion relation $P_n(\lambda) = \lambda P_{n-1}(\lambda) + P_{n-2}(\lambda)$; this is quite similar to the Chebyshev recursion relation $T_n(\lambda) = 2\lambda T_{n-1}(\lambda) - T_{n-2}(\lambda)$. – Michael Seifert, Apr 24 at 13:58
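
For what it's worth, here is one way to make that connection precise, sketched under the question's convention $\chi_n(t)=\det(A_n-tI)$ and writing $U_n$ for the Chebyshev polynomial of the second kind. Comparing the recurrence $(1)$ (with $\chi_0=1$, $\chi_1=-t$) against $U_n(x)=2xU_{n-1}(x)-U_{n-2}(x)$ gives, by induction,
$$\chi_n(t)=i^{\,n}\,U_n\!\left(\frac{it}{2}\right).$$
Since the roots of $U_n$ are $\cos\left(\frac{k\pi}{n+1}\right)$ for $k=1,\dots,n$, this recovers the eigenvalues $2i\cos\left(\frac{k\pi}{n+1}\right)$ above and shows that they are simple.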


















All those matrices are anti-symmetric and therefore they are normal matrices. And every normal matrix is diagonalizable over $\mathbb C$, by the spectral theorem.
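
To spell out the normality step (a short check added for completeness, using only that $A$ has real entries, so that $A^\ast=A^t=-A$):
$$A^\ast A=(-A)A=-A^2=A(-A)=AA^t=AA^\ast,$$
so $A$ commutes with its conjugate transpose, and the complex spectral theorem then gives an orthonormal basis of eigenvectors.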






answered Apr 23 at 22:56 by José Carlos Santos












  • Thank you very much for the quick response! I hope you don't mind that I accepted JimmyK4542's answer, since it also gives the eigenvectors and eigenvalues explicitly. – YiFan, Apr 23 at 23:06






  • I would have been surprised if you had not accepted that answer, since it provides more information than mine. – José Carlos Santos, Apr 23 at 23:08


















Using the fact that your matrices are skew-symmetric, you get that they are diagonalizable. See the section on spectral theory in this Wikipedia article.






answered Apr 23 at 22:57 by gcousin












  • Thank you very much for the quick response! I hope you don't mind that I accepted JimmyK4542's answer, since it also gives the eigenvectors and eigenvalues explicitly. – YiFan, Apr 23 at 23:06










  • I think, compared to the detailed answer above, the only interest of mine is to show that you could have found all this information on your own using Google, as soon as you had observed that your matrices were skew-symmetric. – gcousin, Apr 24 at 17:30











