Saturday, April 16, 2016

Mathematics: CSIR: Previous year questions on Vector Spaces and Linear Transformations.

1.The dimension of the vector space of all symmetric matrices of order $n \times n (n \geq 2) $ with real entries and trace equal to zero is
  • $(n^2-n)/2-1 $
  • $(n^2+n)/2-1 $
  • $(n^2-2n)/2-1 $
  • $(n^2+2n)/2-1 $
Solution:
The space of all \(n\times n\) real matrices has dimension \(n^2\). For a symmetric matrix \(a_{ij}=a_{ji}\ \forall i,j\), so the matrix is determined by its entries on and above the diagonal.
Number of entries strictly above the diagonal \(=\frac{n^2-n}{2}\) (the \(n\) diagonal entries are counted separately).
Here Trace\(=0\), which is one linear condition on the diagonal: one diagonal entry is determined by the other \(n-1\), so only \(n-1\) diagonal entries are free. Hence the dimension of the space of symmetric matrices with trace zero is \(\frac{n^2-n}{2}+(n-1)=\frac{n(n+1)}{2}-1=\frac{n^2+n}{2}-1\).
2nd option is correct.
2.The dimension of the vector space of all symmetric matrices $A=(a_{jk}) $ of order $n\times n (n\geq 2) $ with real entries and $a_{11}=0 $ is
  • $(n^2+n-4)/2 $
  • $(n^2+n+4)/2 $
  • $(n^2+n-3)/2 $
  • $(n^2+n+3)/2 $
Solution:
The entries below the diagonal mirror the entries above it, so the number of free entries strictly above the diagonal is \(\frac{n^2-n}{2}\).
Since \(a_{11}=0\), the number of free diagonal entries is \(n-1\).
So the dimension of the required space is \(\frac{n^2-n}{2}+(n-1)=\frac{n^2+n-2}{2}\).
None of the given options matches; the correct value is \(\frac{n^2+n-2}{2}\).
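For readers who like to double-check such dimension counts, here is a small Python sketch (the random-sampling approach and the function names are my own choice, not part of the standard solution). It estimates the dimension of each subspace as the rank of a stack of vectorised random elements:

```python
import numpy as np

rng = np.random.default_rng(0)

def subspace_dim(sample, n, samples=200):
    # stack vectorised random elements of the subspace; their rank is its dimension
    return np.linalg.matrix_rank(np.array([sample(n).ravel() for _ in range(samples)]))

def sym_trace_zero(n):                       # problem 1: symmetric, trace 0
    B = rng.standard_normal((n, n))
    S = (B + B.T) / 2
    return S - (np.trace(S) / n) * np.eye(n)

def sym_a11_zero(n):                         # problem 2: symmetric, a_11 = 0
    B = rng.standard_normal((n, n))
    S = (B + B.T) / 2
    S[0, 0] = 0.0
    return S

for n in (2, 3, 4, 5):
    print(n, subspace_dim(sym_trace_zero, n), n*(n + 1)//2 - 1,
             subspace_dim(sym_a11_zero, n), (n*n + n - 2)//2)
```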
3.Consider the following row vectors: $\alpha_1=(1,1,0,1,0,0)\quad \alpha_2=(1,1,0,0,1,0)\quad \alpha_3=(1,1,0,0,0,1) $ $\alpha_4=(1,0,1,1,0,0)\quad \alpha_5=(1,0,1,0,1,0)\quad \alpha_6=(1,0,1,0,0,1) $ The dimension of the vector space spanned by these row vectors is
  • $6 $
  • $5 $
  • $4 $
  • $3 $
Solution: We have to find the dimension of the row space of the matrix whose rows are the given vectors:
$ C=\begin{pmatrix}1&1&0&1&0&0\\1&1&0&0&1&0\\1&1&0&0&0&1\\1&0&1&1&0&0\\1&0&1&0&1&0\\1&0&1&0&0&1\end{pmatrix} $
Now let $ A=\begin{pmatrix}1&1&0\\1&1&0\\1&1&0\end{pmatrix} $ and $ B=\begin{pmatrix}1&0&1\\1&0&1\\1&0&1\end{pmatrix} $
So \(C=\begin{pmatrix}A&I\\B&I\end{pmatrix} \rightarrow \begin{pmatrix}A&I\\B-A&0\end{pmatrix}\) after subtracting the first three rows from the last three.
The rows of \((A\quad I)\) are linearly independent because of the identity block, so the rank of \(C\) equals \(3\) plus the rank of \(B-A=\begin{pmatrix}0&-1&1\\0&-1&1\\0&-1&1\end{pmatrix}\). All rows of this matrix are equal, so its rank is \(1\). Hence the required dimension is \(3+1=4\), and the third option is correct.
Note: You can even calculate it directly but that will take much more time.
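The rank of \(C\) can also be checked directly in a couple of lines (a quick sketch using numpy, not needed in the exam):

```python
import numpy as np

C = np.array([[1, 1, 0, 1, 0, 0],
              [1, 1, 0, 0, 1, 0],
              [1, 1, 0, 0, 0, 1],
              [1, 0, 1, 1, 0, 0],
              [1, 0, 1, 0, 1, 0],
              [1, 0, 1, 0, 0, 1]])
print(np.linalg.matrix_rank(C))   # 4, the dimension of the row space (third option)
```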
4.Let $T:\mathbb{R}^n \rightarrow \mathbb{R}^n $ be a linear transformation. Which of the following statements implies that $T $ is bijective?
  • $nullity(T)=n $
  • $Rank(T)=nullity(T)=n $
  • $Rank(T)+nullity(T)=n $
  • $Rank(T)-nullity(T)=n $
Solution:
For any linear map \(T:\mathbb{R}^n \rightarrow \mathbb{R}^n\), the rank-nullity theorem gives \(Rank(T)+nullity(T)=n\), so the third option holds for every \(T\) and implies nothing. If in addition \(Rank(T)-nullity(T)=n\), then adding the two equations gives \(nullity(T)=0\) and \(Rank(T)=n\); hence \(T\) is one-one, and since it maps between spaces of the same dimension, it is also onto, i.e., bijective.
So, the fourth option is correct.
5.Let $S:\mathbb{R}^3 \rightarrow \mathbb{R}^4 $ and $T:\mathbb{R}^4 \rightarrow \mathbb{R}^3 $ be linear transformations such that $T \circ S $ is the identity map of $\mathbb{R}^3 $. Then
  • $S \circ T $ is the identity map of $\mathbb{R}^4 $.
  • $S \circ T $ is one-one, but not onto.
  • $S \circ T $ is onto, but not one-one.
  • $S \circ T $ is neither one-one nor onto.
Solution: Note that \(S\circ T:\mathbb{R}^4 \rightarrow \mathbb{R}^4\) factors through \(\mathbb{R}^3\), so its rank is at most \(3\); it can therefore be neither onto nor one-one. For a concrete example, define
\(T(a,b,c,d)=(a,b,c)\) and \(S(a,b,c)=(a,b,c,0)\),
so that \(T\circ S(a,b,c)=T(a,b,c,0)=(a,b,c)\), the identity on \(\mathbb{R}^3\).
Then \(S\circ T(a,b,c,d)=S(a,b,c)=(a,b,c,0)\): it is not one-one, since \((0,0,0,1)\) is mapped to \((0,0,0,0)\), and it is not onto, since \((0,0,0,1)\) has no pre-image. So, the fourth option is correct.
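The example above can also be checked with matrices (a sketch; the arrays below are just the matrix versions of the maps \(S\) and \(T\) chosen in the solution):

```python
import numpy as np

S = np.array([[1, 0, 0],
              [0, 1, 0],
              [0, 0, 1],
              [0, 0, 0]])            # S(a,b,c) = (a,b,c,0) : R^3 -> R^4
T = np.array([[1, 0, 0, 0],
              [0, 1, 0, 0],
              [0, 0, 1, 0]])         # T(a,b,c,d) = (a,b,c) : R^4 -> R^3

print(np.array_equal(T @ S, np.eye(3)))   # True: T∘S is the identity on R^3
print(np.linalg.matrix_rank(S @ T))       # 3 < 4: S∘T is neither one-one nor onto
```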
6.Let $W$ be the vector space of all real polynomials of degree at most \(3\). Define $T:W \rightarrow W $ by $T(p(x))=p'(x) $ where $p' $ is the derivative of $p $. The matrix of $T $ in the basis $\{1,x,x^2,x^3\} $, considered as column vectors, is given by
  • $ \begin{pmatrix}0&0&0&0\\0&1&0&0\\0&0&2&0\\0&0&0&3\end{pmatrix} $
  • $ \begin{pmatrix}0&0&0&0\\1&0&0&0\\0&2&0&0\\0&0&3&0\end{pmatrix} $
  • $ \begin{pmatrix}0&1& 0&0\\0&0&2&0\\0&0&0&3\\0&0&0&0\end{pmatrix} $
  • $ \begin{pmatrix}0&1&2&3\\ 0&0&0&0\\0&0&0&0\\0&0&0&0\end{pmatrix} $
Solution:
\(T(1)=0\times 1+0\times x+0\times x^2+0\times x^3\)
\(T(x)=1\times 1+0\times x+0\times x^2+0\times x^3\)
\(T(x^2)=0\times 1+2\times x+0\times x^2+0\times x^3\)
\(T(x^3)=0\times 1+0\times x+3\times x^2+0\times x^3\)
The columns of the matrix of \(T\) are the coordinate vectors of \(T(1),T(x),T(x^2),T(x^3)\), so the matrix of the linear transformation is:
$ \begin{pmatrix}0&1&0&0\\0&0&2&0\\0&0&0&3\\0&0&0&0\end{pmatrix} $
Third option is correct.
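The same matrix can be produced symbolically (a sketch using sympy; the helper coords() below just reads off coordinates in the basis \(\{1,x,x^2,x^3\}\) and is my own naming):

```python
import sympy as sp

x = sp.symbols('x')
basis = [sp.Integer(1), x, x**2, x**3]

def coords(p, deg=3):
    # coordinate vector of a polynomial in the basis {1, x, ..., x^deg}
    p = sp.expand(p)
    return [p.coeff(x, k) for k in range(deg + 1)]

A = sp.Matrix([coords(sp.diff(b, x)) for b in basis]).T   # columns = coordinates of T(basis)
sp.pprint(A)   # matches the third option
```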
7.Let $N $ be the vector space of all polynomials of degree at most 3. Define $S:N \rightarrow N \text{ by } S(p(x))=p(x+1), p\in N $ Then the matrix of $S $ in the basis $\{1,x,x^2,x^3\} $, considered as column vectors, is given by
  • $ \begin{pmatrix}1&0&0&0\\0&2&0&0\\0&0&3&0\\0&0&0&4\end{pmatrix} $

  • $ \begin{pmatrix}1&1&1&1\\0&1&2&3\\0&0&1&3\\0&0&0&1\end{pmatrix} $

  • $ \begin{pmatrix}0&2&0&0\\0&0&2&0\\0&0&0&3\\0&0&0&0\end{pmatrix} $

  • $ \begin{pmatrix}0&1&2&3\\0&0&0&0\\0&0&0&0\\ 0&0&0&0\end{pmatrix} $
Solution:
\(S(1)=1\)
\(S(x)=x+1=1\cdot 1+1\cdot x\)
\(S(x^2)={(x+1)}^2=1\cdot 1+2\cdot x+1\cdot x^2\)
\(S(x^3)={(x+1)}^3=1\cdot 1+3\cdot x+3\cdot x^2+1\cdot x^3\)
Taking these coordinate vectors as columns, the matrix of \(S\) is $ \begin{pmatrix}1&1&1&1\\0&1&2&3\\0&0&1&3\\0&0&0&1\end{pmatrix} $, so the second option is correct.
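As before, a short sympy sketch reproduces this matrix (coords() is the same helper idea as in the previous snippet):

```python
import sympy as sp

x = sp.symbols('x')
basis = [sp.Integer(1), x, x**2, x**3]

def coords(p, deg=3):
    p = sp.expand(p)
    return [p.coeff(x, k) for k in range(deg + 1)]

M = sp.Matrix([coords(b.subs(x, x + 1)) for b in basis]).T
sp.pprint(M)   # the upper-triangular matrix of binomial coefficients (second option)
```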
8.A linear transformation $T $ rotates each vector in $\mathbb{R}^2 $ clockwise through $90^{\circ} $. The matrix of $T $ relative to the standard ordered basis $\left(\begin{bmatrix}1\\ 0 \end{bmatrix} ,\begin{bmatrix} 0\\ 1 \end{bmatrix} \right) $ is
  • $\begin{bmatrix}0&-1\\ -1&0\end{bmatrix} $

  • $\begin{bmatrix}0&1\\ -1&0\end{bmatrix} $

  • $\begin{bmatrix}0&1\\ 1&0\end{bmatrix} $

  • $\begin{bmatrix}0&-1\\ 1&0\end{bmatrix} $
Solution: Rotation through an angle \(\theta\) (measured counterclockwise) has matrix $\begin{bmatrix}\cos (\theta)&-\sin(\theta)\\ \sin(\theta)&\cos(\theta)\end{bmatrix} $.
A clockwise rotation through \(90^{\circ}\) corresponds to \(\theta=-90^{\circ}\), so the required matrix is $\begin{bmatrix}\cos (-90^{\circ})&-\sin(-90^{\circ})\\ \sin(-90^{\circ})&\cos(-90^{\circ})\end{bmatrix}=\begin{bmatrix}0&1\\ -1&0\end{bmatrix} $. Equivalently, \(T\) sends \((1,0)\mapsto (0,-1)\) and \((0,1)\mapsto (1,0)\), and these images form the columns. So, the second option is correct.
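A quick numerical check (a sketch; the negative angle encodes the clockwise direction):

```python
import numpy as np

theta = -np.pi / 2                                  # clockwise 90 degrees
R = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])
print(np.round(R))                                  # [[ 0.  1.] [-1.  0.]] -- second option
print(np.round(R @ [1, 0]), np.round(R @ [0, 1]))   # (1,0) -> (0,-1), (0,1) -> (1,0)
```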
9. Let $n $ be a positive integer and let $M_n(\mathbb{R}) $ denote the space of all $n\times n $ real matrices. If $T:M_n(\mathbb{R})\rightarrow M_n(\mathbb{R}) $ is a linear transformation such that $T(A)=0 $ whenever $A \in M_n(\mathbb{R}) $ is symmetric or skew-symmetric, then the rank of $T $ is
  • $n(n+1)/2 $
  • $n(n-1)/2 $
  • $n $
  • $0 $
Solution: It is given that \(T(A)=0\) for every symmetric and every skew-symmetric matrix \(A\).
We know that every matrix can be expressed as the sum of a symmetric and a skew-symmetric matrix:
for \(M\in M_n(\mathbb{R})\),
\(M=\frac{1}{2} (M+M^T)+\frac{1}{2} (M-M^T)\)
\(\Rightarrow T(M)=\frac{1}{2} T(M+M^T)+\frac{1}{2} T(M-M^T)=0+0=0\)
So \(T\) is the zero map and its rank is \(0\). The fourth option is correct.
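The decomposition used above is easy to illustrate numerically (a sketch; the random matrix is an arbitrary test case):

```python
import numpy as np

M = np.random.default_rng(1).standard_normal((4, 4))
sym, skew = (M + M.T) / 2, (M - M.T) / 2

print(np.allclose(sym, sym.T), np.allclose(skew, -skew.T))   # True True
print(np.allclose(M, sym + skew))                            # True, so T(M) = T(sym) + T(skew) = 0
```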
10.Consider $\mathbb{R}^3 $ with the standard inner product. Let $W $ be the subspace of $\mathbb{R}^3 $ spanned by $(1,0,-1) $. Which of the following is a basis for the orthogonal complement of $W $?
  • $\{(1,0,1),(0,1,0)\} $
  • $\{(1,2,1),(0,1,1)\} $
  • $\{(2,1,2),(4,2,4)\} $
  • $\{(2,-1,2),(1,3,1),(-1,-1,-1)\} $
Solution:
The inner product of \((1,0,-1)\) with each of \((1,0,1)\) and \((0,1,0)\) is zero, these two vectors are linearly independent, and \(\dim W^{\perp}=3-1=2\), so they form a basis of \(W^{\perp}\).
So, the first option is correct.

11.Let $A=\begin{bmatrix} 1&3&5&a&13\\0&1&7&9&b\\0&0&1&11&15\end{bmatrix} $ where $a,b \in \mathbb{R} $. Choose the correct statement.
  • There exist values of $a $ and $b $ for which the columns of $A $ are linearly independent.
  • There exist values of $a $ and $b $ for which $Ax=0 $ has $0 $ as only solution.
  • For all values of $a $ and $b $ the rows of $A $ span a $3- $dimensional subspace of $\mathbb{R}^5 $.
  • There exist values of $a $ and $b $ for which $Rank(A)=2 $.
Solution: The matrix is already in row echelon form, with \(3\) pivots for every choice of \(a\) and \(b\).
So the dimension of the row space \(=\) dimension of the column space \(=3\).
No matter what the values of \(a,b\) are, the rows of \(A\) span a \(3\)-dimensional subspace of \(\mathbb{R}^5\). (The other options fail: the \(5\) columns live in \(\mathbb{R}^3\) and can never be linearly independent, \(Ax=0\) always has nullity \(5-3=2>0\), and the rank is never \(2\).)
So, the third option is correct.
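A quick sympy check that the rank stays \(3\) for any \(a,b\) (a sketch; the sample values are arbitrary):

```python
import sympy as sp

for a, b in [(0, 0), (1, -7), (100, 3)]:
    A = sp.Matrix([[1, 3, 5, a, 13],
                   [0, 1, 7, 9,  b],
                   [0, 0, 1, 11, 15]])
    print(a, b, A.rank())   # rank is 3 in every case
```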
12.Let $A $ be a $4\times 4 $ invertible real matrix. Which of the following is NOT necessarily true?
  • The rows of $A $ form a basis of $\mathbb{R}^4 $.
  • Null space of $A $ contains only the $0 $ vector.
  • $A $ has 4 distinct eigenvalues.
  • Image of the transformation $x \mapsto Ax $ on $\mathbb{R}^4 $ is $\mathbb{R}^4 $.
Solution: The third option is the one that is NOT necessarily true; invertibility does not force distinct eigenvalues.
For example, \(\begin{bmatrix} 1&1&0&0\\ 0&1&0&0\\0&0&2&1\\0&0&0&2 \end{bmatrix}\) is invertible (determinant \(4\)), but its eigenvalues are \(1,1,2,2\).
The other three statements hold for every invertible matrix.
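Checking the counterexample (a small numpy sketch):

```python
import numpy as np

A = np.array([[1., 1., 0., 0.],
              [0., 1., 0., 0.],
              [0., 0., 2., 1.],
              [0., 0., 0., 2.]])
print(np.linalg.det(A))       # 4.0 (up to rounding), so A is invertible
print(np.linalg.eigvals(A))   # approximately [1. 1. 2. 2.] -- not distinct
```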
13.Let $n $ be an integer, $n\geq 3 $, and let $u_1,u_2,\cdots,u_n $ be $n $ linearly independent elements in a vector space over $\mathbb{R} $. Set $u_0=0 $ and $u_{n+1}=u_1 $. Define $v_i=u_i+u_{i+1} $ and $w_i=u_{i-1}+u_{i} $ for $i=1,2,\cdots, n $. Then
  • $v_1,v_2,\cdots,v_n $ are linearly independent if $n=2010 $.
  • $v_1,v_2,\cdots,v_n $ are linearly independent if $n=2011 $.
  • $w_1,w_2,\cdots,w_n $ are linearly independent if $n=2010 $.
  • $w_1,w_2,\cdots,w_n $ are linearly independent if $n=2011 $.
Solution:
Given,
\(v_1=u_1+u_2\)
\(v_2=u_2+u_3\)
\(\cdots\)
\(v_n=u_n+u_{n+1}=u_n+u_1\) (since \(u_{n+1}=u_1\))
Now consider
\(a_1v_1+a_2v_2+\cdots +a_nv_n=0\)
\(\Rightarrow (a_1+a_n)u_1+(a_1+a_2)u_2+(a_2+a_3)u_3+\cdots +(a_{n-1}+a_n)u_n=0\) (substituting and collecting the coefficient of each \(u_i\))
Since the \(u_i\) are linearly independent,
\(a_1+a_2=0,\quad a_2+a_3=0,\ \cdots,\ a_{n-1}+a_n=0,\quad a_n+a_1=0\)
Let \(a_1=\alpha\). Then \(a_2=-\alpha,\ a_3=\alpha,\ \cdots,\ a_n=(-1)^{n+1}\alpha\).
The last relation \(a_n+a_1=0\) gives \((-1)^{n+1}\alpha=-\alpha\). If \(n\) is even this holds for every \(\alpha\), so the \(v_i\) are linearly dependent; if \(n\) is odd it forces \(\alpha=0\), hence all \(a_i=0\) and the \(v_i\) are linearly independent.
So the \(v_i\) are linearly independent exactly when \(n\) is odd, and the second option (\(n=2011\)) is correct.
For the \(w_i\), consider
\(a_1w_1+a_2w_2+\cdots +a_nw_n=0\)
\(\Rightarrow a_1(0+u_1)+a_2(u_1+u_2)+\cdots +a_n(u_{n-1}+u_n)=0\)
\(\Rightarrow (a_1+a_2)u_1+(a_2+a_3)u_2+\cdots +(a_{n-1}+a_n)u_{n-1}+a_nu_n=0\)
Since $u_1,u_2,\cdots,u_n $ are linearly independent, we get
\(a_1+a_2=a_2+a_3=\cdots=a_{n-1}+a_n=a_n=0\)
\(\Rightarrow a_n=0\), and then successively \(a_{n-1}=\cdots=a_2=a_1=0\), whatever \(n\) is.
So the \(w_i\) are always linearly independent; the third and fourth options are both correct.
Hence options \(2,3,4\) are correct.
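Taking \(u_i=e_i\) in \(\mathbb{R}^n\), the parity pattern is easy to confirm numerically (a sketch; small values of \(n\) stand in for \(2010\) and \(2011\), since only the parity matters):

```python
import numpy as np

def ranks(n):
    u = np.eye(n)                                                          # u_1, ..., u_n as rows
    v = np.array([u[j] + u[(j + 1) % n] for j in range(n)])                # v_i = u_i + u_{i+1}, u_{n+1} = u_1
    w = np.array([u[j] if j == 0 else u[j - 1] + u[j] for j in range(n)])  # w_i = u_{i-1} + u_i, u_0 = 0
    return np.linalg.matrix_rank(v), np.linalg.matrix_rank(w)

for n in (4, 5, 6, 7):
    print(n, ranks(n))   # v's have full rank n only for odd n; w's always have full rank
```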
14. Let $n $ be a positive integer and $V $ be an $(n+1)- $dimensional vector space over $\mathbb{R} $. If $\{e_1,e_2,\cdots,e_{n+1}\} $ is a basis of $V $ and $T:V\rightarrow V $ is the linear transformation satisfying $T(e_i)=e_{i+1} \text{ for } i=1,2,\cdots, n \text{ and } T(e_{n+1})=0. $ Then
  • trace of $T $ is nonzero.
  • rank of $T $ is $n $.
  • nullity of $T $ is $1$.
  • $T^n=T\circ T \circ \cdots \circ T $(n times) is the zero map.
Solution:
\(T(e_1)=e_2,\ T(e_2)=e_3,\ \cdots,\ T(e_n)=e_{n+1},\ T(e_{n+1})=0\), so the matrix of \(T\) in the basis \(\{e_1,\cdots,e_{n+1}\}\) is the \((n+1)\times(n+1)\) matrix with \(1\)'s just below the diagonal and zeros elsewhere:
\(\begin{pmatrix}0&0&\cdots&0&0\\1&0&\cdots&0&0\\0&1&\cdots&0&0\\\vdots&&\ddots&&\vdots\\0&0&\cdots&1&0\end{pmatrix}\)
From this matrix it is clear that the trace of \(T\) is \(0\) (first option is false), the rank of \(T\) is \(n\) (the images \(e_2,\cdots,e_{n+1}\) are linearly independent) and the nullity of \(T\) is \(1\) (the kernel is spanned by \(e_{n+1}\)).
For the last option note that \(T^k(e_1)=e_{k+1}\) for \(k\leq n\); in particular \(T^n(e_1)=e_{n+1}\neq 0\), so \(T^n\) is NOT the zero map (only \(T^{n+1}\) is).
So, the second and third options are correct.
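For a small \(n\) the four statements can be checked directly from the matrix (a sketch; \(n=3\) is an arbitrary choice):

```python
import numpy as np

n = 3
A = np.zeros((n + 1, n + 1))
for i in range(n):
    A[i + 1, i] = 1                     # T(e_i) = e_{i+1}; last column stays 0 since T(e_{n+1}) = 0

print(np.trace(A))                                                # 0.0: trace is zero
print(np.linalg.matrix_rank(A))                                   # n = 3, so nullity is 1
print(np.linalg.matrix_rank(np.linalg.matrix_power(A, n)))        # 1: T^n is NOT the zero map
print(np.linalg.matrix_rank(np.linalg.matrix_power(A, n + 1)))    # 0: only T^{n+1} is zero
```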
15.Let $a,b,c,d \in \mathbb{R}$ and let $T:\mathbb{R}^2 \rightarrow \mathbb{R}^2 $ be the linear transformation defined by $T\left (\begin{bmatrix} x\\ y \end{bmatrix}\right) =\begin{bmatrix} ax+by\\ cx+dy \end{bmatrix} \text{ for } \begin{bmatrix} x\\ y \end{bmatrix}\in \mathbb{R}^2, $ Let $S:\mathbb{C} \rightarrow \mathbb{C}$ be the corresponding map defined by $S(x+iy)=(ax+by)+i(cx+dy) \text{ for } x,y\in \mathbb{R}. $ Then
  • $S $ is always $\mathbb{C} $linear, that is $S(z_1+z_2)=S(z_1)+S(z_2) $ for all $z_1,z_2 \in \mathbb{C} $ and $S(\alpha z)=\alpha S( z) $ for all $\alpha \in \mathbb{C}$ and $z \in \mathbb{C} $
  • $S$ is always $\mathbb{C}- $linear if $b=-c $ and $d=a $.
  • $S$ is always $\mathbb{C}- $linear only if $b=-c $ and $d=a $.
  • $S$ is always $\mathbb{C}- $linear if and only if $T $ is identity transformation.
Solution:
\(S\) is always additive and \(\mathbb{R}\)-homogeneous (it is just \(T\) under the identification \(\mathbb{C}\cong \mathbb{R}^2\)), so \(\mathbb{C}\)-linearity reduces to the single condition \(S(iz)=iS(z)\) for all \(z\).
Write \(z=x+iy\). Then
\(S(iz)=S(-y+ix)=(bx-ay)+i(dx-cy)\)
\(iS(z)=i\big((ax+by)+i(cx+dy)\big)=-(cx+dy)+i(ax+by)\)
Comparing real and imaginary parts for all \(x,y\) gives \(b=-c\) and \(d=a\).
So \(S\) is \(\mathbb{C}\)-linear if and only if \(b=-c\) and \(d=a\): the second and third options are correct. The first option fails, e.g., for \(a=1,\ b=c=0,\ d=-1\), which gives \(S(z)=\bar z\); conjugation is \(\mathbb{R}\)-linear but not \(\mathbb{C}\)-linear. The fourth option is also false, since \(T\) need not be the identity (any \(a=d,\ b=-c\) works, e.g. a rotation).
Remark:
If you look closely, \(T\) and \(S\) are the same map viewed through the identification of \(\mathbb{R}^2\) with \(\mathbb{C}\); the condition \(b=-c,\ d=a\) is exactly what makes this real-linear map complex-linear (compare the Cauchy-Riemann equations).
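The condition is easy to test numerically on sample points (a sketch; the coefficient choices below are arbitrary, with small-integer arithmetic so the comparison is exact):

```python
def S(z, a, b, c, d):
    x, y = z.real, z.imag
    return (a*x + b*y) + 1j*(c*x + d*y)

z = 2 - 1j
for (a, b, c, d) in [(2, -3, 3, 2),     # b = -c, d = a: C-linear
                     (1, 0, 0, -1)]:    # conjugation: R-linear but not C-linear
    print(S(1j*z, a, b, c, d) == 1j*S(z, a, b, c, d))   # True, then False
```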
16.For a positive integer $n $, let $P_n $ denote the space of all polynomials $p(x) $ with coefficients in $\mathbb{R} $ such that $\deg p(x) \leq n $, and let $B_n $ denote the standard basis of $P_n $ given by $B_n=\{1,x,x^2,\cdots , x^n\} $. If $T:P_3 \rightarrow P_4 $ is the linear transformation defined by $T(p(x))=x^2p'(x)+\int_0^x p(t)dt $ and $A=(a_{ij}) $ is the $5\times 4 $ matrix of $T $ with respect to the standard bases $B_3 $ and $B_4 $, then
  • $a_{32}=\frac{3}{2},a_{33}=\frac{7}{3} $
  • $a_{32}=\frac{3}{2},a_{33}=0 $
  • $a_{32}=0,a_{33}=\frac{7}{3} $
  • $a_{32}=0,a_{33}=0 $
Solution:
\(T(1)=x^2\cdot 0+\int_0^x 1\, dt=x\)
\(T(x)=x^2\cdot 1+\int_0^x t\, dt=x^2+\frac{x^2}{2}=\frac{3}{2}x^2\)
\(T(x^2)=x^2\cdot 2x+\int_0^x t^2\, dt=2x^3+\frac{x^3}{3}=\frac{7}{3}x^3\)
\(T(x^3)=x^2\cdot 3x^2+\int_0^x t^3\, dt=3x^4+\frac{x^4}{4}=\frac{13}{4}x^4\)
Taking these coordinate vectors in \(B_4\) as columns, the \(5\times 4\) matrix of \(T\) is
\(\begin{pmatrix}0&0&0&0\\1&0&0&0\\0&\frac{3}{2}&0&0\\0&0&\frac{7}{3}&0\\0&0&0&\frac{13}{4}\end{pmatrix}\)
Clearly \(a_{32}=\frac{3}{2}\) and \(a_{33}=0\).
2nd option is correct.
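The same matrix can be generated with sympy (a sketch; coords() is the same coordinate-reading helper used earlier, extended to degree 4):

```python
import sympy as sp

x, t = sp.symbols('x t')
B3 = [sp.Integer(1), x, x**2, x**3]

def T(p):
    return x**2 * sp.diff(p, x) + sp.integrate(p.subs(x, t), (t, 0, x))

def coords(p, deg):
    p = sp.expand(p)
    return [p.coeff(x, k) for k in range(deg + 1)]

A = sp.Matrix([coords(T(b), 4) for b in B3]).T   # the 5x4 matrix of T
sp.pprint(A)
print(A[2, 1], A[2, 2])                          # 3/2 and 0, i.e. a_32 = 3/2, a_33 = 0
```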
17. A linear operator  $T $ on a complex vector space  $V $ has characteristic polynomial  $x^3(x-5)^2 $ and minimal polynomial  $x^2(x-5) $.
Choose all correct options.

  • The Jordan form of  $T $ is uniquely determined by the given information.
  • There are exactly  $2 $  Jordan blocks in the Jordan decomposition of  $T $.
  • The operator induced by  $T $ on the quotient space  $V/\ker (T-5I) $ is nilpotent, where  $I $ is the identity operator.
  • The operator induced by  $T $ on the quotient space  $V/\ker (T-5I) $ is a scalar multiple of the identity operator.
Solution:
The minimal polynomial is $x^2(x-5) $, so the largest Jordan block for the eigenvalue \(0\) is \(2\times 2\) and the largest block for \(5\) is \(1\times 1\).
The characteristic polynomial gives algebraic multiplicities \(3\) for \(0\) and \(2\) for \(5\), so the remaining \(0\) can only form a \(1\times 1\) block and \(5\) contributes two \(1\times 1\) blocks. The Jordan canonical form is therefore
\(\begin{pmatrix} 0&1&0&0&0\\0&0&0&0&0\\0&0&0&0&0\\0&0&0&5&0\\0&0&0&0&5\end{pmatrix}\)
which is unique up to the order of the Jordan blocks, so the first option is correct. Note that there are \(4\) Jordan blocks in total (sizes \(2,1\) for \(0\) and \(1,1\) for \(5\)), so the second option is false.
All eigenvalues of the operator induced by \(T\) on the quotient space \(V/\ker (T-5I)\) are \(0\), so it is nilpotent and the third option is correct. It is not a scalar multiple of the identity: a nilpotent scalar multiple of the identity would be zero, but the \(2\times 2\) block for \(0\) makes the induced operator nonzero, so the fourth option is false.
So, options \(1\) and \(3\) are correct.
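One can confirm that the displayed Jordan matrix has the stated characteristic and minimal polynomials (a sketch using sympy):

```python
import sympy as sp

x = sp.symbols('x')
J = sp.Matrix([[0, 1, 0, 0, 0],
               [0, 0, 0, 0, 0],
               [0, 0, 0, 0, 0],
               [0, 0, 0, 5, 0],
               [0, 0, 0, 0, 5]])
I5 = sp.eye(5)

print(sp.factor(J.charpoly(x).as_expr()))        # x**3*(x - 5)**2
print(J**2 * (J - 5*I5) == sp.zeros(5, 5))       # True:  x^2 (x - 5) annihilates J
print(J * (J - 5*I5) == sp.zeros(5, 5))          # False: x (x - 5) does not, so the minimal polynomial is x^2 (x - 5)
```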





