Friday, April 15, 2016

CSIR NET Solved Problems on Eigenvalues and Eigenvectors.



1. Suppose the matrix

A=\begin{pmatrix} 40&-29&-11\\ -18&30&-12\\ 26&24&-50 \end{pmatrix}

has a certain complex number \lambda \neq 0 as an eigenvalue.

Which of the following numbers must also be an eigenvalue of A ?
  • \lambda+20
  • \lambda-20
  • 20-\lambda
  • -20-\lambda
Solution:

A=\begin{pmatrix} 40&-29&-11\\ -18&30&-12\\ 26&24&-50 \end{pmatrix} \longrightarrow \begin{pmatrix} 0&-29&-11\\ 0&30&-12\\ 0&24&-50 \end{pmatrix} \quad (C_1\to C_1+C_2+C_3)

The column operation does not change the determinant, and the transformed matrix has a zero column, so \det(A)=0 and one eigenvalue is 0.

And since \text{Trace}(A)=20, the three eigenvalues sum to 20; with 0 and \lambda among them, the third eigenvalue is 20-\lambda.

So, 3rd option is correct.
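
As a quick numerical sanity check, here is a minimal numpy sketch (assuming the signs of the matrix as reconstructed above) confirming that one eigenvalue is 0 and the remaining two sum to the trace 20:

import numpy as np

A = np.array([[40, -29, -11],
              [-18, 30, -12],
              [26, 24, -50]], dtype=float)

eigs = np.linalg.eigvals(A)
print(np.trace(A))        # 20.0
print(np.round(eigs, 4))  # one eigenvalue is (numerically) 0; the other two sum to 20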




2. Let A be a 3\times 3 matrix with real entries such that \det(A)=6 and the trace of A is 0. If \det(A+I)=0 , where I denotes the 3\times 3 identity matrix, then the eigenvalues of A are





-1,2,3
-1,2,-3
1,2,-3
-1,-2,3

Solution:


Some results related to eigenvalues that are often used: if \lambda is an eigenvalue of A , then

o k\lambda is an eigenvalue of kA
o \lambda+k is an eigenvalue of A+kI
o \lambda^2 is an eigenvalue of A^2
o \frac{|A|}{\lambda} is an eigenvalue of adj(A) provided |A|\neq 0
o \frac{1}{\lambda} is an eigenvalue of A^{-1} provided |A|\neq 0
o P(\lambda) is an eigenvalue of P(A) for any polynomial P
Also,
Sum of eigenvalues = Trace of the matrix, and Product of eigenvalues = Determinant of the matrix.
So let a,b,c be the eigenvalues of A . Then a+b+c=0 (trace) and abc=6 (determinant).


Since \det(A+I)=0 , one eigenvalue of A+I is 0, so one eigenvalue of A must equal -1.
So we have a+b-1=0 and -ab=6 , i.e. a+b=1 and ab=-6 . Solving, a and b are 3 and -2 , so the eigenvalues of A are -1,-2,3 .
So, 4th option is correct.
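
A minimal numpy sketch checking that the eigenvalues -1, -2, 3 are consistent with the given data (using a diagonal representative matrix as an illustrative stand-in):

import numpy as np

A = np.diag([-1.0, -2.0, 3.0])   # a matrix with the claimed eigenvalues
I = np.eye(3)
print(round(np.linalg.det(A), 6))      # 6.0
print(np.trace(A))                     # 0.0
print(round(np.linalg.det(A + I), 6))  # 0.0, i.e. -1 is an eigenvalue of A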




3. Let P be a 2\times 2 complex matrix such that P^* P is the identity matrix, where P^* is the conjugate transpose of P . Then the eigenvalues of P are



• real
• complex conjugates of each other
• reciprocals of each other
• of modulus 1


Solution:


Some basic results:

Eigenvalues of real symmetric and Hermitian matrices are real.
Eigenvalues of real skew-symmetric and skew-Hermitian matrices are purely imaginary (or zero).
Eigenvalues of unitary and orthogonal matrices have modulus |z|=1 .
All eigenvalues of a nilpotent matrix are 0.


So, since P^*P=I means P is unitary, by the result above the 4th option is correct.
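
A small numpy sketch illustrating this: the matrix below is an arbitrary unitary matrix of my own choosing (a phase times a rotation), and its eigenvalues have modulus 1.

import numpy as np

theta = 0.7                                    # arbitrary illustrative angle
P = np.exp(0.3j) * np.array([[np.cos(theta), -np.sin(theta)],
                             [np.sin(theta),  np.cos(theta)]])

print(np.allclose(P.conj().T @ P, np.eye(2)))  # True: P*P = I, so P is unitary
print(np.abs(np.linalg.eigvals(P)))            # [1. 1.]: both eigenvalues have modulus 1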



4. Let \{v_1,\cdots , v_n\} be a linearly independent subset of a vector space V , where n \geq 4 . Set w_{ij} =v_i-v_j . Let W be the span of \{w_{ij}|1\leq i,j\leq n\} . Then





\{w_{ij}|1\leq i<j\leq n\} spans W .
\{w_{ij}|1\leq i<j\leq n\} is a linearly independent subset of W .
\{w_{ij}|1\leq i\leq n-1,j=i+1\} spans W .
\dim W=n

Solution: 


Let S=\{w_{ij}|1\leq i\leq n-1,\ j=i+1\} =\{w_{12},w_{23},\cdots,w_{n-1,n}\}
We will prove that the set S spans W .
Take any w_{pq}=v_p-v_q\in W . If p>q then

w_{pq}
=v_p-v_q
=-((v_q-v_{q+1})+(v_{q+1}-v_{q+2})+\cdots +(v_{p-1}-v_p))
=-(w_{q,q+1}+w_{q+1,q+2}+\cdots +w_{p-1,p})

If q>p then
w_{pq}
=v_p-v_q
=(v_p-v_{p+1})+(v_{p+1}-v_{p+2})+\cdots +(v_{q-1}-v_q)
=w_{p,p+1}+w_{p+1,p+2}+\cdots +w_{q-1,q}

(and w_{pp}=0 ).

So, \{w_{ij}|1\leq i\leq n-1,j=i+1\} spans W .

Also, S contains only n-1 elements, and since \{v_1,\cdots,v_n\} is linearly independent, the vectors w_{12},w_{23},\cdots,w_{n-1,n} are linearly independent as well. So \dim W=n-1 , and option 4 is wrong.
(A spanning set always contains a basis, so \dim W\leq n-1 ; linear independence of S gives equality.)

Also, \{w_{ij}|1\leq i<j\leq n\} contains
\{w_{ij}|1\leq i\leq n-1,j=i+1\} ,
therefore it also spans W (but, having more than n-1 elements, it is not linearly independent).

So, 1,3 are correct.
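
A short numpy sketch verifying this for one concrete choice (taking v_i = e_i, the standard basis of R^n, which is one valid linearly independent set):

import numpy as np

n = 5                                # any n >= 4; 5 is an arbitrary choice
V = np.eye(n)                        # columns are v_1, ..., v_n
W_all = np.array([V[:, i] - V[:, j] for i in range(n) for j in range(n)])
S = np.array([V[:, i] - V[:, i + 1] for i in range(n - 1)])   # w_{i,i+1}

print(np.linalg.matrix_rank(W_all))  # 4 = n - 1, so dim W = n - 1
print(np.linalg.matrix_rank(S))      # 4 as well, so S spans the same space W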



5. Which of the following matrices is not diagonalizable over \mathbb{R} ?





\begin{pmatrix} 1&1&0\\ 0&2&0\\0&0&2\end{pmatrix}

\begin{pmatrix} 1&1&0\\0&2&1\\0&0&3 \end{pmatrix}

\begin{pmatrix} 1&1&0\\0&1 &0\\0&0&2\end{pmatrix}

\begin{pmatrix} 1&0&1\\0&2&0\\0&0&3\end{pmatrix}



Solution: 

We don't need to compute the eigenvalues of every matrix.

Notice that
\begin{pmatrix} 1&1&0\\ 0&1 &0\\ 0&0&2 \end{pmatrix}
is already in Jordan canonical form, with a 2\times 2 Jordan block for the eigenvalue 1: the eigenvalue 1 has algebraic multiplicity 2 but geometric multiplicity 1, so this matrix is not diagonalizable over \mathbb{R} .
(The 2nd and 4th matrices have three distinct eigenvalues 1, 2, 3, and in the 1st matrix the repeated eigenvalue 2 has two linearly independent eigenvectors, so those three are diagonalizable.)
Hence the 3rd option is correct.
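
A minimal numpy sketch of the multiplicity check for the third matrix:

import numpy as np

A = np.array([[1, 1, 0],
              [0, 1, 0],
              [0, 0, 2]], dtype=float)

# geometric multiplicity of the eigenvalue 1 = nullity of (A - I)
geo_mult = 3 - np.linalg.matrix_rank(A - np.eye(3))
print(geo_mult)   # 1, but the algebraic multiplicity of 1 is 2 => not diagonalizable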



6. Let A be a 3\times 3 matrix with A^3=-I . Which of the following statements are correct?




A has three distinct eigenvalues.

A is diagonalizable over \mathbb{C} .
A is triangularizable over \mathbb{C} .
A is non singular.


Solution:


A^3=-I

Therefore every eigenvalue \lambda of A satisfies
\lambda^3=-1
\Rightarrow \lambda^3+1=0
\Rightarrow (\lambda+1)(\lambda^2-\lambda+1)=0

so the possible eigenvalues are -1 and the complex pair \frac{1\pm i\sqrt{3}}{2} .

The minimal polynomial of A divides x^3+1=(x+1)(x^2-x+1) , which has no repeated roots, so A is diagonalizable over \mathbb{C} , and hence also triangularizable over \mathbb{C} (every complex matrix is triangularizable).
The eigenvalues need not be distinct: A=-I satisfies A^3=-I and has -1 as its only eigenvalue, so option 1 is not necessarily true.
Also, \det(A)^3=\det(A^3)=\det(-I)=-1 , so \det(A)\neq 0 and A is non-singular.

So, the 2nd, 3rd and 4th options are correct.
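
As a sanity check, here is a numpy sketch with one concrete real matrix satisfying A^3 = -I (a 60-degree rotation block together with -1; an illustrative choice, not from the problem):

import numpy as np

c, s = np.cos(np.pi / 3), np.sin(np.pi / 3)
A = np.array([[c, -s, 0.0],
              [s,  c, 0.0],
              [0.0, 0.0, -1.0]])     # rotation by 60 degrees plus a -1 block

print(np.allclose(np.linalg.matrix_power(A, 3), -np.eye(3)))  # True: A^3 = -I
print(np.round(np.linalg.eigvals(A), 4))   # -1 and (1 +/- i*sqrt(3))/2
print(round(np.linalg.det(A), 6))          # -1.0, so A is non-singular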



7. Let N be a non-zero 3\times 3 matrix with the property N^2=0 . Which of the following is/are true?





• N is not similar to a diagonal matrix.
• N is similar to a diagonal matrix.
• N has a non-zero eigenvector.
• N has three linearly independent eigenvectors.



Solution:


N is a nilpotent matrix, and we know that all eigenvalues of a nilpotent matrix are 0.

A nilpotent matrix is diagonalizable only if it is the zero matrix; since N\neq 0 , N is not similar to a diagonal matrix.

For example, let N=\begin{pmatrix}0&0&0\\0&0&1\\0&0&0\end{pmatrix}

This matrix satisfies the given conditions, and its eigenvectors (all belonging to the eigenvalue 0) are spanned by \begin{pmatrix}1\\0\\0\end{pmatrix}\quad \begin{pmatrix}0\\1\\0\end{pmatrix} .

So N has non-zero eigenvectors, but not three linearly independent ones.

So, 1st and 3rd options are correct.
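
A quick numpy sketch for the example matrix used above:

import numpy as np

N = np.array([[0, 0, 0],
              [0, 0, 1],
              [0, 0, 0]], dtype=float)

print(np.allclose(N @ N, np.zeros((3, 3))))  # True: N^2 = 0
print(np.linalg.eigvals(N))                  # [0. 0. 0.]: all eigenvalues are 0
print(3 - np.linalg.matrix_rank(N))          # 2: only two independent eigenvectors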



8. Let A be a 3\times 3 and b be a 3\times 1 matrix with integer entries. Suppose that the system Ax = b has a complex solution. Then





• Ax = b has an integer solution.
• Ax = b has a rational solution.
• The set of real solutions to Ax = 0 has a basis consisting of rational solutions.
• If b\neq 0 then A has positive rank.


Solution:



Options 1 & 2: If A is invertible, then x=A^{-1}b=\frac{adj(A)}{|A|}\times b . Here adj(A) and b have integer entries, so adj(A)\times b is an integer vector, and |A| is a non-zero integer. Since the quotient of two integers is a rational number, x is a rational solution. (Integers are closed under +,-,\times but not under division, so x need not have integer entries.)

For example, A=2I and b=(1,1,1)^T give the rational solution (\tfrac{1}{2},\tfrac{1}{2},\tfrac{1}{2})^T but no integer solution, so option 1 is not necessarily true. If A is singular, row reduction of the augmented matrix over \mathbb{Q} still yields a rational solution, because the system is consistent (it has a complex solution). So option 2 is true.


Option 3:

If Ax=b has a complex solution whose imaginary part is non-zero, then Ax=0 has at least one non-zero real solution.
To illustrate this, suppose the complex solution is

u=\left( \begin{array}{c}x_1+i y_1\\x_2+i y_2\\x_3+i y_3\end{array} \right)
= \left( \begin{array}{c}x_1\\x_2 \\x_3\end{array} \right)+i\left( \begin{array}{c}y_1\\y_2\\y_3\end{array}\right)
=X+i Y \quad (\text{where } Y \text{ is not the zero vector}).

Then A(X+iY)=b+i\,0 . Since the entries of A and b are real, comparing real and imaginary parts gives two real systems: AX=b and AY=0 . So Ax=0 has a solution other than zero (namely Y ), and the null space has dimension at least 1.
More generally, since A has rational entries, row reduction over \mathbb{Q} produces a basis of the null space of A consisting of rational vectors, so option 3 is true.

Option 4: If b\neq 0 and Ax=b has a solution, then A cannot be the zero matrix (otherwise Ax=0\neq b for every x ). So A has positive rank, and option 4 is true.


So, options 2, 3 and 4 are correct.
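
A short numpy sketch of the real/imaginary splitting used for option 3. The integer system below is an arbitrary example of my own (with a singular A), not taken from the problem:

import numpy as np

A = np.array([[1, 2, 3],
              [2, 4, 6],
              [0, 1, 1]], dtype=float)   # integer entries, singular (rank 2)
b = np.array([1, 2, 0], dtype=float)

u = np.array([1 - 1j, -1j, 1j])          # one complex solution of A u = b
X, Y = u.real, u.imag

print(np.allclose(A @ u, b))   # True
print(np.allclose(A @ X, b))   # True: the real part is a (rational) solution of Ax = b
print(np.allclose(A @ Y, 0))   # True: the imaginary part solves Ax = 0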




9. Let S:\mathbb{R}^n \rightarrow \mathbb{R}^n be given by v \longmapsto \alpha v for a fixed \alpha \in \mathbb{R} , \alpha \neq 0 . Let T:\mathbb{R}^n \rightarrow \mathbb{R}^n be a linear transformation such that \mathbb{B}=\{v_1,\cdots , v_n\} is a set of linearly independent eigenvectors of T . Then




• The matrix of T with respect to \mathbb{B} is diagonal.

• The matrix of T-S with respect to \mathbb{B} is diagonal.

• The matrix of T with respect to \mathbb{B} is not necessarily diagonal, but upper triangular.
• The matrix of T with respect to \mathbb{B} is diagonal but the matrix of  (T-S) with respect to \mathbb{B} is not diagonal.


Solution:


Given that \mathbb{B}=\{v_1,\cdots , v_n\} is a basis of \mathbb{R}^n consisting of eigenvectors of T , we have

T(v_1)=\lambda_1 v_1+0+\cdots
T(v_2)=0+\lambda_2 v_2+0+\cdots
T(v_3)=0+0+\lambda_3 v_3+0+\cdots
\cdots \cdots

So the matrix of the linear transformation T with respect to \mathbb{B}
is

\begin{pmatrix}\lambda_1&0&0&0&\cdots&\cdots\\0&\lambda_2&0&0&\cdots&\cdots\\0&0&\lambda_3&0&\cdots&\cdots\\\cdots&\cdots&\cdots&\cdots\\\cdots&\cdots&\cdots&\cdots\end{pmatrix}

So clearly it is a diagonal matrix.
Again,

S(v_1)=\alpha v_1+0+\cdots

S(v_2)=0+\alpha v_2+0+\cdots

S(v_3)=0+0+\alpha v_3+0+\cdots

\cdots \cdots

So the matrix of the linear transformation S with respect to \mathbb{B} is

\begin{pmatrix}\alpha&0&0&0&\cdots&\cdots\\0&\alpha&0&0&\cdots&\cdots\\0&0&\alpha&0&\cdots&\cdots\\\cdots&\cdots&\cdots&\cdots\\\cdots&\cdots&\cdots&\cdots\end{pmatrix}

So the matrix of S is also a diagonal matrix.

And since (T-S)(v)=T(v)-S(v) , the matrix of T-S with respect to \mathbb{B} is the difference of two diagonal matrices, hence also diagonal.

So, options 1 and 2 are correct.
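
A small numpy sketch in dimension 2 (the eigenvectors, eigenvalues and alpha below are arbitrary illustrative choices): with the eigenvectors as the columns of P, the matrix of T in the basis B is P^{-1}TP, and likewise for T - S = T - alpha*I.

import numpy as np

P = np.array([[1.0, 1.0],
              [0.0, 1.0]])             # columns of P are the eigenvectors v_1, v_2
D = np.diag([2.0, 5.0])                # their eigenvalues under T
T = P @ D @ np.linalg.inv(P)           # T written in the standard basis
alpha = 3.0                            # S(v) = alpha * v

T_in_B   = np.linalg.inv(P) @ T @ P                          # matrix of T in basis B
TmS_in_B = np.linalg.inv(P) @ (T - alpha * np.eye(2)) @ P    # matrix of T - S in basis B
print(np.round(T_in_B, 6))     # diag(2, 5)
print(np.round(TmS_in_B, 6))   # diag(-1, 2)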




10. For an n \times n real matrix A , \lambda \in \mathbb{R} and a nonzero vector v\in \mathbb{R}^n , suppose that (A-\lambda I)^kv=0 for some positive integer k . Let I be the n \times n identity matrix. Then which of the following is/are always true?


(A-\lambda I)^{k+r}v=0 for all positive integers r .

(A-\lambda I)^{k-1}v=0
(A-\lambda I) is not injective.
\lambda is an eigenvalue of A .


Solution:



(A-\lambda I)^kv=0 with v\neq 0 .

Let j\leq k be the smallest positive integer with (A-\lambda I)^jv=0 , and put w=(A-\lambda I)^{j-1}v (where (A-\lambda I)^0v=v ). Then w\neq 0 and (A-\lambda I)w=0 , so w is an eigenvector of A with eigenvalue \lambda .

Therefore \lambda is an eigenvalue of A , and A-\lambda I has a non-trivial kernel, so it is not injective. Also, (A-\lambda I)^{k+r}v=(A-\lambda I)^r\left((A-\lambda I)^kv\right)=0 for every positive integer r .

Option 2 is not always true: for A=\begin{pmatrix}\lambda &1\\0&\lambda\end{pmatrix} , v=\begin{pmatrix}0\\1\end{pmatrix} and k=2 we have (A-\lambda I)^2v=0 but (A-\lambda I)v=\begin{pmatrix}1\\0\end{pmatrix}\neq 0 .

So options 1, 3 and 4 are correct.
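
A numpy sketch of the Jordan-block counterexample used above (lam = 4 is an arbitrary choice):

import numpy as np

lam = 4.0
A = np.array([[lam, 1.0],
              [0.0, lam]])             # a 2x2 Jordan block
v = np.array([0.0, 1.0])               # a generalized eigenvector with k = 2
M = A - lam * np.eye(2)

print(M @ M @ v)                               # [0. 0.]: (A - lam*I)^2 v = 0
print(M @ v)                                   # [1. 0.]: (A - lam*I) v != 0, so option 2 fails
print(np.linalg.matrix_rank(M) < 2)            # True: A - lam*I is not injective
print(np.allclose(np.linalg.eigvals(A), lam))  # True: lam is an eigenvalue of A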


11. Let A\in M_{10}(\mathbb{C}) , the vector space of 10\times 10 matrices with entries in \mathbb{C} . Let W_A be the subspace of M_{10}(\mathbb{C}) spanned by \{A^n|n \geq 0\} . Choose the correct statements.

• for any A, \dim(W_A)\leq 10
• for any A, \dim(W_A)< 10
• for some A, 10<\dim(W_A)< 100
• for any A, \dim(W_A)= 100



Solution:



For a 10\times 10 matrix A the characteristic polynomial has degree 10 and is satisfied by A .
There may also exist a polynomial of smaller degree satisfied by A (the minimal polynomial of A ).

Let the characteristic polynomial be a_0+a_1\lambda+a_2 \lambda^2+\cdots +a_{10} \lambda^{10} (with a_{10}=1 ).

So, by the Cayley-Hamilton theorem:

a_0 I+a_1 A+a_2 A^2+\cdots +a_{10} A^{10}=0

Hence A^{10} (and, inductively, every higher power of A ) is a linear combination of I, A, A^2,\cdots, A^9 . So W_A , the subspace spanned by \{A^n|n\geq 0\} , is spanned by the 10 matrices I,A,\cdots,A^9 , i.e. \dim(W_A)\leq 10 for every A .

(The bound can be attained, e.g. for a single 10\times 10 Jordan block the minimal polynomial has degree 10, so option 2 is wrong; and \dim(W_A)\leq 10<100 rules out options 3 and 4.)

So, 1st option is correct.
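
A numpy sketch computing dim(W_A) as the rank of the flattened powers I, A, A^2, ... for two illustrative matrices (a single Jordan block, which attains the bound 10, and a scalar matrix, which does not):

import numpy as np

def dim_W(A, n=10):
    # rank of the span of I, A, A^2, ..., flattened as vectors of length n*n
    powers = [np.linalg.matrix_power(A, k).flatten() for k in range(2 * n)]
    return np.linalg.matrix_rank(np.array(powers))

n = 10
J = np.eye(n) + np.diag(np.ones(n - 1), 1)   # a single 10x10 Jordan block
print(dim_W(J))                  # 10: the bound dim(W_A) <= 10 is attained
print(dim_W(2.0 * np.eye(n)))    # 1: for a scalar matrix dim(W_A) is much smaller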


