Simple one (Lucas' theorem)

Consider the integers $${n \choose k}$$
for $k=0,1,\ldots,n$. It turns out that the number of those which are odd is always a power of $2$.

Proof. This is a direct application of Lucas' theorem: $ \displaystyle {n \choose k}$ is odd exactly when every $1$ in the binary representation of $k$ corresponds to a $1$ in the binary representation of $n$. If $n$ has exactly $d$ ones in its binary representation, then the number of such $k$ is $2^d$.
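The count is easy to verify computationally; a minimal sketch in Python (the function name is mine):

```python
from math import comb

def count_odd_binomials(n: int) -> int:
    """Count the k in 0..n for which C(n, k) is odd."""
    return sum(comb(n, k) % 2 == 1 for k in range(n + 1))

# By Lucas' theorem this should equal 2**(number of 1-bits of n).
for n in range(200):
    assert count_odd_binomials(n) == 2 ** bin(n).count("1")
```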

Yet another solution of an old problem

Consider a problem we already solved (in a weaker setting) in a previous post.
Let $x_1,\ldots, x_n$ be complex numbers such that $$x_1^k+\cdots +x_n^k=0$$ for all $k\in\{1,\ldots,n\}$. Prove that all of the numbers are zero.

Here we investigate another approach. The idea is simple: first we construct a matrix $A$ whose characteristic polynomial has $x_1,\ldots, x_n$ as its roots. Then we use the fact that $\text{Tr}(A^k)=x_1^k+\cdots+x_n^k$ to eliminate the coefficients one by one, showing that $x_1,\ldots,x_n$ are roots of $x^n=0$.
Assume $x^n-a_1 x^{n-1}-\cdots-a_{n-1}x-a_n$ is a polynomial whose roots are precisely $x_1,\ldots,x_n$. Then the matrix
$$  A= \begin{pmatrix}
        a_1       & a_2 & a_3 & \dots & a_{n-1} & a_n \\
        1       & 0 & 0 & \dots  &0 & 0 \\
         0     & 1 & 0  & \dots &0 & 0 \\
         \vdots          &\vdots & \vdots &\vdots   &\ddots   &\vdots\\
        0      & 0 & 0 & \dots &1  & 0
    \end{pmatrix}
 $$ has this polynomial as its characteristic polynomial.
Since $\text{Tr}(A)=0$ we have $a_1=0$. Now assume we have deduced that $a_1=\cdots=a_k=0$ for some $k\le n-1$.  It is easy to see by induction that $$A^{k+1}=\begin{pmatrix}
        a_{k+1}      & a_{k+2}  & a_{k+3} & \dots & \dots & a_{n}     & 0 & \dots & 0\\
        0    & a_{k+1} & a_{k+2}        & \dots  & \dots & a_{n-1}              & a_n &\dots       & 0\\   
    \vdots   &\vdots &\vdots &\vdots &\vdots &\vdots &\vdots &\vdots &\vdots\\
       0       & 0 & 0        & \dots  &a_{k+1}              & \dots     & \dots & \dots & a_n  \\
1      & 0 & 0        & \dots  &\dots            &\dots &\dots &\dots &0  \\
0       & 1 & 0        & \dots  &\dots &\dots &\dots &\dots  &0\\

    \vdots   &\vdots &\vdots &\vdots &\vdots &\vdots &\vdots&\vdots&\vdots\\

          0      & 0 & \dots & 1 & 0 & \dots & \dots &\dots & 0
    \end{pmatrix}$$
 (the matrix is schematic: in the first $k+1$ rows the entries $a_{k+1},\ldots,a_n$ shift one column to the right in each successive row, so $a_{k+1}$ stays on the diagonal, while the last $n-k-1$ rows form a shifted identity block)
From here it is evident that $\text {Tr}(A^{k+1})=(k+1) a_{k+1}$, since $a_{k+1}$ occurs exactly $k+1$ times on the diagonal. Hence $(k+1)a_{k+1}=0$, so $a_{k+1}=0$. By induction all the coefficients are zero, the polynomial is $x^n$, and therefore all the roots $x_1,\ldots,x_n$ are zero.
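Both facts used above — that the matrix $A$ has the stated characteristic polynomial and that $\text{Tr}(A^k)=x_1^k+\cdots+x_n^k$ — can be sanity-checked numerically. A sketch using NumPy (the random roots are arbitrary; `np.poly` returns the monic coefficients $1, c_1,\ldots,c_n$, so $a_i=-c_i$ in the notation above):

```python
import numpy as np

rng = np.random.default_rng(0)
roots = rng.standard_normal(5) + 1j * rng.standard_normal(5)  # x_1, ..., x_n
n = len(roots)

# Monic polynomial with these roots: x^n + c_1 x^{n-1} + ... + c_n,
# so in the post's notation a_i = -c_i.
coeffs = np.poly(roots)            # [1, c_1, ..., c_n]
a = -coeffs[1:]

# Companion matrix as in the post: first row a_1..a_n, identity below it.
A = np.zeros((n, n), dtype=complex)
A[0, :] = a
A[1:, :-1] = np.eye(n - 1)

# Check Tr(A^k) = x_1^k + ... + x_n^k for k = 1..n.
for k in range(1, n + 1):
    assert np.isclose(np.trace(np.linalg.matrix_power(A, k)),
                      np.sum(roots ** k))
```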

Comment. Does this solution generalize the problem to a more abstract setting?

Equation

Prove that for any $c\in (0,1)$ the equation $$\left(1-\left(c-cx\right)^{2/3}\right)^3=(1-c^2)x^2$$ has a unique real root in $(0,1)$, and find this root.

Comment. This problem is connected with a property of the astroid: it is the envelope of the family of unit-length segments whose endpoints lie on the two coordinate axes.
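The post leaves the root unstated, but substituting $x=1-c^2$ gives $c-cx=c^3$, so both sides equal $(1-c^2)^3$. A quick numerical sweep (a sketch, not part of the original) supports that this is the only root in $(0,1)$, and that it is a point of tangency rather than a sign change:

```python
import numpy as np

def f(x, c):
    """Difference LHS - RHS of the equation, for 0 < x < 1."""
    return (1 - (c * (1 - x)) ** (2 / 3)) ** 3 - (1 - c ** 2) * x ** 2

for c in (0.3, 0.5, 0.7):
    xs = np.linspace(1e-6, 1 - 1e-6, 100_001)
    vals = f(xs, c)
    i = vals.argmin()
    assert vals.min() > -1e-9                  # LHS - RHS stays nonnegative ...
    assert abs(vals[i]) < 1e-6                 # ... touching zero ...
    assert abs(xs[i] - (1 - c ** 2)) < 1e-3    # ... at x = 1 - c^2
```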

Cofinal set is uncountable

Consider sequences of natural numbers. We say that a sequence $a$ majorizes $b$, notation $a\succ b$, if $$\lim_{n\to \infty}\frac{b_n}{a_n}=0.$$
We call a set of sequences $C$ cofinal if for any sequence $v$ there is $u\in C$ such that $u$ majorizes $v$.


$\textbf{Statement}.$ Every cofinal set is uncountable.

$\textbf{Proof}.$ The proof is an application of Cantor's diagonal argument.
Assume $C$ is a countable cofinal set and denote its elements by $a^k$ for $k=1,2,3,\ldots$. First observe that $\succ$ is a transitive relation, and that for any finite set of sequences there is a sequence $u$ which majorizes all of them (take $n$ times the pointwise maximum).
Construct a sequence of sequences $b^k$ as follows:
take $b^1$ to be a nondecreasing sequence which majorizes $a^1$ (e.g. $b^1_n=n\max\{a^1_1,\ldots,a^1_n\}$). Suppose we have constructed $b^j$ for $j=1,\ldots,k$. Construct $b^{k+1}$ as a nondecreasing sequence which majorizes all of $a^i$ for $i=1,\ldots,k+1$ and all of $b^j$ for $j=1,\ldots,k$.
Thus $b^j\succ a^j$, $b^{j+1}\succ b^j$, and each $b^j$ is a nondecreasing sequence.
Let $n_1$ be such that $\displaystyle \frac{b^1_{j}}{b^{2}_j}<\frac{1}{2}$ for $j\ge n_1$. For $i>1$ take $n_i \ge\max\{i, \ n_{i-1}\}$ such that $\displaystyle \frac{b^i_{j}}{b^{i+1}_j}<\frac{1}{2}$ for $j\ge n_i$. Now define $c_k=b_{n_k}^k$. Fix some $a^j$; we know $b^j\succ a^j$. For $k>j$ we have $$\frac{b^j_{n_k}}{c_k}=\frac{b^j_{n_k}}{b_{n_k}^k}=\frac{b^{k-1}_{n_{k}}}{b_{n_k}^k}\cdot \frac{b^{k-2}_{n_{k}}}{b_{n_{k}}^{k-1}}\cdots \frac{b_{n_k}^{j}}{b_{n_k}^{j+1}}.$$ Since $n_k\ge n_i$ when $k>i$, we have $$\frac{b^{i}_{n_{k}}}{b^{i+1}_{n_k}}< \frac{1}{2}$$ for $i\in\{j,\ldots,k-1\}$. Multiplying these bounds gives $$\frac{b^j_{n_k}}{c_k}<\frac{1}{2^{k-j}}.$$ Since $b^j$ is nondecreasing and $n_k\ge k$, we have $$\frac{b^j_{k}}{c_k}\le \frac{b^j_{n_k}}{c_k}<\frac{1}{2^{k-j}}.$$ This means that $c\succ b^j$ for all $j$, and since $b^j\succ a^j$, transitivity gives $c\succ a^j$ for all $j$. But then no $a^j$ can majorize $c$ (that would give $a^j\succ a^j$), which contradicts the fact that $C$ is cofinal.
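The auxiliary fact used at the start of the proof — that $n$ times the pointwise maximum majorizes any finite family — can be illustrated on concrete sequences (a minimal sketch; the example family and names are mine):

```python
# A finite family of sequences, given as functions of n >= 1.
family = [lambda n: n, lambda n: 2 * n + 7, lambda n: n + n % 5]

def majorant(n: int) -> int:
    """n times the pointwise maximum of the family at n."""
    return n * max(a(n) for a in family)

# Each ratio a(n)/majorant(n) is at most 1/n, hence tends to 0,
# so the majorant majorizes every member of the family.
for a in family:
    for n in range(1, 1000):
        assert a(n) / majorant(n) <= 1 / n
```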

Characterization of limit ordinals

An ordinal $\alpha$ is a limit ordinal if and only if there exists an ordinal $\beta\ge 1$ such that $\alpha=\omega\beta$.

Proof. $\Leftarrow)$ If $\beta=1$ then $\alpha=\omega$, which is limit. Assume $\omega\gamma$ is limit for all $1\le\gamma<\beta$. We will prove that $\omega\beta$ is limit.
First assume $\beta$ is limit. Then $\omega\beta=\sup\{\omega\gamma \ \big| \ \gamma<\beta\}$. Let $\theta<\omega\beta$; then $\theta<\omega\gamma'$ for some $1\le\gamma'<\beta$. But $\omega\gamma'$ is limit, thus $\theta+1<\omega\gamma'$, so $\theta+1<\omega\beta$, hence $\omega\beta$ is limit.
Now let $\beta=\beta'+1$ for some $\beta'$. Then $\omega\beta=\omega\beta'+\omega$. If $\theta<\omega\beta=\omega\beta'+\omega=\sup\{\omega\beta'+n\ \big| \ n<\omega\}$, then $\theta<\omega\beta'+n_0$ for some $n_0$, and then $\theta+1<\omega\beta'+n_0+1\le\sup\{\omega\beta'+n\ \big| \ n<\omega\}$, hence $\omega\beta$ is limit.
$\Rightarrow)$ Let $\beta=\sup\{\gamma\ \big| \ \omega\gamma\le\alpha\}$. We have $\omega\beta\le \alpha$. Assume $\omega\beta<\alpha$. Since $\alpha$ is limit we find that $\omega\beta+1<\alpha$, and inductively $\omega\beta+n<\alpha$ for all $n<\omega$. Hence $\omega\beta+\omega\le \alpha$, i.e. $\omega(\beta+1)\le \alpha$, contradicting the definition of $\beta$. Therefore $\omega\beta=\alpha$.

Binomial identity

$$\displaystyle \sum_{k=1}^n(-1)^{n-k}\binom{n}{k}k^j=\begin{cases} n!, & j=n\\
0, & j=1,2,\ldots,n-1\end{cases}$$
Proof. Consider the number of surjective functions $f:\{1,2,\ldots,j\}\to\{1,2,\ldots,n\}$. If we count them using the inclusion-exclusion principle we get the sum in the problem. On the other hand this number is trivially $n!$ if $j=n$ and $0$ if $j<n$.
Remark 1. For $j>n$ the sum again counts the number of surjections, so it provides a closed form for that number for every such $j$.
Remark 2. An interesting identity which presents the above in a unified manner is the following: $$\sum_{k=0}^n(-1)^{n-k}\binom{n}{k}(x+k)^n=n!$$ for all $x$ (in $\mathbb{R}$, for example, though any commutative ring works).
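Both identities are easy to confirm by brute force for small parameters (a quick check; the function names are mine):

```python
from math import comb, factorial

def alt_sum(n: int, j: int) -> int:
    """Sum of (-1)^(n-k) C(n,k) k^j over k = 0..n."""
    return sum((-1) ** (n - k) * comb(n, k) * k ** j for k in range(n + 1))

# n! for j = n, and 0 for j = 1, ..., n-1.
for n in range(1, 8):
    assert alt_sum(n, n) == factorial(n)
    for j in range(1, n):
        assert alt_sum(n, j) == 0

def shifted_sum(n: int, x: int) -> int:
    """Unified version: sum of (-1)^(n-k) C(n,k) (x+k)^n over k = 0..n."""
    return sum((-1) ** (n - k) * comb(n, k) * (x + k) ** n for k in range(n + 1))

# Equals n! for every x (the n-th finite difference of a monic degree-n polynomial).
for n in range(1, 8):
    for x in range(-3, 4):
        assert shifted_sum(n, x) == factorial(n)
```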

Property of join

Let $X$ be path-connected and $Y$ an arbitrary topological space. Then the join $X*Y$ is simply connected. $\textbf{Proof}.$ We use Van K...