Posts

Showing posts with the label Linear Algebra

Can You Completely Permute The Elements Of A Matrix By Applying Permutation Matrices?

Answer : It is not generally possible to do so. For a concrete example, we know that there can exist no permutation matrices $P, Q$ such that
$$P\pmatrix{1&2\\2&1}Q = \pmatrix{2&1\\2&1}.$$
If such a $P$ and $Q$ existed, then both matrices would necessarily have the same rank (permutation matrices are invertible), but the matrix on the left has rank $2$ while the matrix on the right has rank $1$. Let me add one more argument. For $n \ge 2$: suppose the entries of the $n \times n$ matrix $A$ are all distinct. Then there are $(n^2)!$ distinct permutations of $A$. There are $n!$ row-permutations of $A$ (generated by premultiplication by the various permutation matrices) and $n!$ column-permutations of $A$ (generated by postmultiplication by permutation matrices). If we consider all expressions of the form $RAC$, where $R$ and $C$ each range independently over all $n!$ permutation matrices, we get at most $(n!)^2$ possible results. But for $n > 1$...
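As a quick numerical illustration of both points, here is a minimal sketch in Python with numpy; the 3×3 test matrix, the variable names, and the use of `itertools.permutations` are my own choices for illustration, not part of the original answer:

```python
import itertools
import math

import numpy as np

# Rank obstruction in the 2x2 example: permutation matrices are invertible,
# so P M Q always has the same rank as M.
M = np.array([[1, 2], [2, 1]])
N = np.array([[2, 1], [2, 1]])
print(np.linalg.matrix_rank(M), np.linalg.matrix_rank(N))  # 2 1

# Counting argument for n = 3 with all-distinct entries.
n = 3
A = np.arange(1, n * n + 1).reshape(n, n)

# One permutation matrix per permutation of the rows of the identity.
perm_mats = [np.eye(n, dtype=int)[list(p)] for p in itertools.permutations(range(n))]

# Every matrix reachable as R @ A @ C, stored as a flat tuple so it is hashable.
reachable = {tuple((R @ A @ C).ravel()) for R in perm_mats for C in perm_mats}

print(len(reachable), math.factorial(n) ** 2, math.factorial(n * n))
# 36 reachable matrices, (3!)^2 = 36 pairs, versus 9! = 362880 entry permutations
```

For $n = 3$ the script finds exactly $(3!)^2 = 36$ reachable matrices, far fewer than the $9! = 362880$ permutations of the nine distinct entries.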

Basis Of The Polynomial Vector Space

Answer : A basis for a polynomial vector space $P=\{ p_1,p_2,\ldots,p_n \}$ is a set of vectors (polynomials in this case) that spans the space and is linearly independent. Take, for example, $S=\{ 1,x,x^2 \}$. This spans the set of all polynomials ($P_2$) of the form $ax^2+bx+c$, and no vector in $S$ can be written as a linear combination of the other two. The set $\{ 1,x,x^2,x^2+1 \}$, on the other hand, spans the space, but the 4th vector can be written as a linear combination of the first and third (so the set is not linearly independent); thus it is not a basis. The simplest possible basis is the monomial basis: $\{1,x,x^2,x^3,\ldots,x^n\}$. Recall the definition of a basis. The key property is that some linear combination of basis vec...
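To make the dependence concrete, one can encode each polynomial by its coefficient vector in the monomial basis and compare ranks. This is a minimal sketch; the encoding and variable names are assumptions made for illustration, not part of the original answer:

```python
import numpy as np

# Represent the polynomial c + b*x + a*x^2 by the coefficient vector [c, b, a].
S = np.array([
    [1, 0, 0],  # 1
    [0, 1, 0],  # x
    [0, 0, 1],  # x^2
])
T = np.vstack([S, [[1, 0, 1]]])  # append x^2 + 1

print(np.linalg.matrix_rank(S))  # 3: S is linearly independent and spans P_2
print(np.linalg.matrix_rank(T))  # still 3: row 4 equals row 1 + row 3, so T is dependent
```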

Can An Idempotent Matrix Be Complex?

Answer : I assume that by "can $A$ be complex", you mean "can $A$ have any non-real entries". Well, it can! For instance, take
$$A = \pmatrix{1&i\\0&0}.$$
In general: for any nonzero complex column vector $x$,
$$A = \frac{xx^*}{x^*x}$$
(where $*$ denotes the conjugate transpose) is such a matrix. A projection onto a subspace is idempotent; therefore $A$ has no reason to be real. For example, take a subspace $S$ of $\mathbb{C}^2$ and let $A$ be the matrix of the projection onto $S$ with respect to the standard basis. Any matrix
$$A = \pmatrix{a&b\\c&1-a}$$
will be idempotent provided that $a^2+bc=a$.
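A short verification of all three claims, as a minimal numpy sketch; the helper name `is_idempotent`, the particular vector `x`, and the parameter values are illustrative assumptions, not from the original answer:

```python
import numpy as np

def is_idempotent(A, tol=1e-12):
    """Return True if A @ A equals A up to floating-point tolerance."""
    return np.allclose(A @ A, A, atol=tol)

# The explicit 2x2 example with a non-real entry.
A1 = np.array([[1, 1j], [0, 0]])
print(is_idempotent(A1))  # True

# Projection onto the span of a complex vector x: A = x x* / (x* x).
x = np.array([[1 + 2j], [3 - 1j]])
A2 = (x @ x.conj().T) / (x.conj().T @ x)
print(is_idempotent(A2))  # True

# The parametrized 2x2 family: idempotent whenever a^2 + b*c = a.
a, b = 0.5 + 0.5j, 2.0
c = (a - a**2) / b  # choose c so the condition a^2 + b*c = a holds
A3 = np.array([[a, b], [c, 1 - a]])
print(is_idempotent(A3))  # True
```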