
Problem of the Week #38 - February 18th, 2013


Chris L T521

Here's this week's problem.

-----

Problem: Let $V$ be a finite-dimensional $K[X]$-module, and let $\phi$ be the associated operator on $V$. Suppose that $\Delta$ represents $\phi$ with respect to some basis. Prove that if $\Delta$ is a diagonal matrix (no nonzero entries off the diagonal), and the diagonal entries of $\Delta$ are pairwise distinct, then $V$ is a cyclic $K[X]$-module.

-----

Remember to read the POTW submission guidelines to find out how to submit your answers!
 

Chris L T521

This week's question was correctly answered by jakncoke. You can find his solution below.

Let $V$ be an $n$-dimensional $K[X]$-module, and let $T: V \to V$ be the associated operator, represented by a diagonal matrix with respect to some basis.

$\operatorname{Char}(T) = (x-\lambda_{1})\cdots(x-\lambda_{n})$, where the $\lambda_{i}$ are the diagonal entries of the matrix.
$\deg(\operatorname{Char}(T)) = n$, and since the diagonal entries are pairwise distinct, $T$ has $n$ distinct eigenvalues and thus $n$ linearly independent eigenvectors.

Now let $f(x)$ be the minimal polynomial of $T$.
Then $f(x)$ divides $\operatorname{Char}(T)$, say $\operatorname{Char}(T) = p(x)f(x)$. Each eigenvalue $\lambda_i$ is a root of $f(x)$, and the $\lambda_i$ are pairwise distinct, so $\deg(f(x)) \geq n$.
Since also $\deg(f(x)) \leq \deg(\operatorname{Char}(T)) = n$, it follows that $f(x) = \operatorname{Char}(T)$.


I will now prove that there exists a $v \in V$ such that $\{v, T(v), \dots, T^{n-1}(v)\}$ is a linearly independent set.
Assume that no such $v$ exists.
Then for every vector $v$ we can find scalars $c_1,\dots,c_n \in K$, not all zero, such that $c_1 v + c_2 T(v) + \cdots + c_n T^{n-1}(v) = 0$; equivalently, every $v$ is annihilated by some nonzero polynomial $g_v(x) \in K[x]$ of degree less than $n$.
In particular, this applies to $v = e_1 + \cdots + e_n$, the sum of the $n$ eigenvectors corresponding to the $n$ distinct eigenvalues: there is a nonzero $g(x)$ with $\deg(g) < n$ and $g(T)(e_1 + \cdots + e_n) = g(\lambda_1)e_1 + \cdots + g(\lambda_n)e_n = 0$. Since the eigenvectors are linearly independent, $g(\lambda_i) = 0$ for every $i$, so $g$ would be a nonzero polynomial of degree less than $n$ with $n$ distinct roots, which is impossible.
So there is at least one vector $v$ for which $c_1 v + c_2 T(v) + \cdots + c_n T^{n-1}(v) = 0$ implies $c_1 = \cdots = c_n = 0$, i.e. $\{v, T(v), \dots, T^{n-1}(v)\}$ is linearly independent.

Since $\dim(V) = n$ and, for this $v$, $\{v, T(v), \dots, T^{n-1}(v)\}$ consists of $n$ linearly independent vectors, it spans $V$; thus $V = K[X]v$, and $V$ is a cyclic $K[X]$-module.
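
For concreteness, here is a small worked instance of the above (taking, for illustration, $n = 2$ with $\lambda_1 = 1$ and $\lambda_2 = 2$):
\begin{equation*}
\Delta =
\left(
\begin{array}{cc}
1 & 0 \\
0 & 2 \\
\end{array}
\right),
\qquad
\operatorname{Char}(T) = (x-1)(x-2) = f(x),
\qquad
v = e_1 + e_2 =
\left(
\begin{array}{c}
1 \\
1 \\
\end{array}
\right),
\end{equation*}
and $\{v, T(v)\} = \{(1,1)^T, (1,2)^T\}$ is linearly independent, so this $v$ already generates $V$ as a $K[X]$-module.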

Here's my solution as well:

Suppose \begin{equation*}
\Delta =
\left(
\begin{array}{cccc}
\lambda_1 & 0 & \cdots & 0 \\
0 & \lambda_2& \cdots & 0 \\
\vdots& \vdots & \ddots & \vdots\\
0 & 0& \cdots &\lambda_n \\
\end{array}
\right)
\end{equation*}
where $\lambda_i \neq \lambda_j$, for $i \neq j;\ i,j = 1,2,\cdots, n$. Then for $k=1,2,\cdots,n-1$,
\begin{equation*}
\Delta^k = \left(
\begin{array}{cccc}
\lambda^k_1 & 0 & \cdots & 0 \\
0 & \lambda^k_2& \cdots & 0 \\
\vdots& \vdots & \ddots & \vdots\\
0 & 0& \cdots &\lambda^k_n \\
\end{array}
\right)
\end{equation*}

Let $v = (1,1,\cdots,1)^T \in V$. Then for $k=1,2,\cdots,n-1$,
$$\phi^k(v) = \Delta^k v = (\lambda^k_1, \lambda^k_2, \cdots, \lambda^k_n)^T.$$

Now we prove that $\{v,\phi(v),\phi^2(v), \cdots, \phi^{n-1}(v)\}$ is linearly independent. Suppose that for some $k_1,k_2,\cdots,k_n \in K$,
$$k_1 v + k_2\phi(v) + k_3\phi^2(v) + \cdots + k_n\phi^{n-1}(v) = 0.$$

i.e.

\begin{equation*} k_1
\left(
\begin{array}{c}
1 \\
1\\
\vdots \\
1\\
\end{array}
\right)
+k_2 \left(
\begin{array}{c}
\lambda_1 \\
\lambda_2\\
\vdots \\
\lambda_n\\
\end{array}
\right) +\cdots+ k_n \left(
\begin{array}{c}
\lambda^{n-1}_1 \\
\lambda^{n-1}_2\\
\vdots \\
\lambda^{n-1}_n\\
\end{array}
\right) = 0.
\end{equation*}

That is
\begin{equation*}
\left(
\begin{array}{cccc}
1 & \lambda_1 & \cdots & \lambda^{n-1}_1 \\
1 & \lambda_2& \cdots & \lambda^{n-1}_2 \\
\vdots& \vdots & \ddots & \vdots\\
1 & \lambda_n& \cdots &\lambda^{n-1}_n \\
\end{array}
\right)
\left(
\begin{array}{c}
k_1 \\
k_2\\
\vdots \\
k_n\\
\end{array}
\right) = 0.
\end{equation*}

The matrix above is a Vandermonde matrix; therefore
\begin{equation*}
\det
\left(
\begin{array}{cccc}
1 & \lambda_1 & \cdots & \lambda^{n-1}_1 \\
1 & \lambda_2& \cdots & \lambda^{n-1}_2 \\
\vdots& \vdots & \ddots & \vdots\\
1 & \lambda_n& \cdots &\lambda^{n-1}_n \\
\end{array}
\right) = \prod_{1\leqslant i < j \leqslant n} (\lambda_j - \lambda_i).
\end{equation*}

By assumption, $\lambda_i \neq \lambda_j$ for $i \neq j$, so the determinant above is nonzero and the matrix is nonsingular. Thus the system admits only the zero solution, i.e.
$$k_1 = k_2 = \cdots = k_n = 0.$$
Hence $\{v,\phi(v),\phi^2(v), \cdots, \phi^{n-1}(v)\}$ is linearly independent, and since it contains $n$ vectors it is a basis for $V$. By definition, $V$ is a cyclic $K[X]$-module.
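
As a quick sanity check of the Vandermonde computation, take for illustration $n = 3$ and $(\lambda_1, \lambda_2, \lambda_3) = (1, 2, 3)$ (an arbitrary choice of distinct values, not part of the problem): with $v = (1,1,1)^T$,
\begin{equation*}
\det
\left(
\begin{array}{ccc}
1 & 1 & 1 \\
1 & 2 & 4 \\
1 & 3 & 9 \\
\end{array}
\right) = (2-1)(3-1)(3-2) = 2 \neq 0,
\end{equation*}
so $\{v, \phi(v), \phi^2(v)\}$ is indeed a basis of $V$ and $v$ is a cyclic vector.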
 