OMOUS 2024

Let $A$ be a real matrix such that $A+A^2A^t+(A^t)^2=0$. Prove that $A=0$.

$\textbf{Solution}.$ Multiply the given equation by $A^t$ from the right to get $AA^t+A^2(A^2)^t+(A^t)^3=0$, hence $(A^t)^3=-AA^t-A^2(A^2)^t$. The right-hand side is a symmetric negative semi-definite matrix, since each of $AA^t$ and $A^2(A^2)^t$ is of the form $BB^t$ and therefore symmetric positive semi-definite. Thus $(A^t)^3$ is symmetric negative semi-definite; being symmetric, it equals its transpose, so $A^3=\big((A^t)^3\big)^t=(A^t)^3$ is symmetric negative semi-definite as well.
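In detail, the semi-definiteness amounts to the quadratic-form computation: for every real column vector $x$,
$$x^t\left(AA^t+A^2(A^2)^t\right)x=\|A^tx\|^2+\|(A^2)^tx\|^2\ge 0,$$
so $AA^t+A^2(A^2)^t$ is positive semi-definite and its negative, $(A^t)^3$, is negative semi-definite.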

Now multiply the given equation by $A$ from the left to get $A^2+A^3A^t+A(A^t)^2=0$. Using $A^3=(A^t)^3$, so that $A^3A^t=(A^t)^4$, this becomes $A^2+(A^t)^4+A(A^t)^2=0$. Transposing this equation we get $(A^t)^2+A^4+A^2A^t=0$, and subtracting the original equation gives $A^4=A$. Hence the minimal polynomial of $A$ divides $x^4-x=x(x-1)(x^2+x+1)$, which has no repeated roots, so $A$ is diagonalizable over $\mathbb{C}$ and every eigenvalue $\lambda$ of $A$ satisfies $\lambda^4=\lambda$, i.e. $\lambda=0$ or $\lambda^3=1$. But $A^3$ is real symmetric and negative semi-definite, so its eigenvalues $\lambda^3$ are real and satisfy $\lambda^3\le 0$, which rules out $\lambda^3=1$. Hence every eigenvalue of $A$ is $0$, and since $A$ is diagonalizable, $A=0$.
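To spell out the algebra behind the transposition and comparison: transposition reverses products, so $\big((A^t)^4\big)^t=A^4$ and $\big(A(A^t)^2\big)^t=A^2A^t$, and subtracting the original relation from the transposed one gives
$$\big[(A^t)^2+A^4+A^2A^t\big]-\big[A+A^2A^t+(A^t)^2\big]=A^4-A=0.$$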
