

Observation: As we observed in Matrix Operations, two non-null vectors X = (x_1, …, x_n) and Y = (y_1, …, y_n) of the same shape are orthogonal if their dot product is 0, i.e. X ∙ Y = x_1y_1 + ⋯ + x_ny_n = 0.

Note that if X and Y are n × 1 column vectors, then X ∙ Y = X^T Y = Y^T X, while if X and Y are 1 × n row vectors, then X ∙ Y = XY^T = YX^T.

It is easy to see that (cX) ∙ Y = c(X ∙ Y), (X + Y) ∙ Z = X ∙ Z + Y ∙ Z, X ∙ X = ‖X‖² > 0 for any non-null X, and other similar properties of the dot product.
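For illustration, here is a minimal NumPy sketch (the vectors X, Y, Z and the scalar c are arbitrary values chosen for this example, not taken from the text) that checks the orthogonality condition and these dot product properties numerically:

```python
import numpy as np

# Arbitrary example column vectors (n x 1) and a scalar, chosen for illustration
X = np.array([[1.0], [2.0], [3.0]])
Y = np.array([[4.0], [-2.0], [0.0]])   # X . Y = 1*4 + 2*(-2) + 3*0 = 0, so X and Y are orthogonal
Z = np.array([[1.0], [1.0], [1.0]])
c = 5.0

def dot(U, V):
    """Dot product of two column vectors: U^T V."""
    return (U.T @ V).item()

print(dot(X, Y))                                          # 0.0 -> X and Y are orthogonal
print(dot(X, Y) == dot(Y, X))                             # True: X^T Y = Y^T X
print(np.isclose(dot(c * X, Y), c * dot(X, Y)))           # True: (cX) . Y = c(X . Y)
print(np.isclose(dot(X + Y, Z), dot(X, Z) + dot(Y, Z)))   # True: (X + Y) . Z = X . Z + Y . Z
print(dot(X, X) > 0)                                      # True: X . X = ||X||^2 > 0 for non-null X
```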

Property 1: If A is an m × n matrix, X is an n × 1 column vector and Y is an m × 1 column vector, then

(AX) ∙ Y = X ∙ (A^T Y)

Proof: (AX) ∙ Y = (AX)^T Y = (X^T A^T) Y = X^T (A^T Y) = X ∙ (A^T Y)
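A quick numerical check of Property 1, again only a sketch with arbitrarily chosen A, X and Y:

```python
import numpy as np

rng = np.random.default_rng(0)
m, n = 4, 3
A = rng.standard_normal((m, n))   # m x n matrix
X = rng.standard_normal((n, 1))   # n x 1 column vector
Y = rng.standard_normal((m, 1))   # m x 1 column vector

lhs = ((A @ X).T @ Y).item()      # (AX) . Y
rhs = (X.T @ (A.T @ Y)).item()    # X . (A^T Y)
print(np.isclose(lhs, rhs))       # True: (AX) . Y = X . (A^T Y)
```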

Property 2: If X_1, …, X_m are mutually orthogonal (non-null) vectors, then they are independent.

Proof: Suppose X_1, …, X_m are mutually orthogonal and let c_1X_1 + ⋯ + c_mX_m = 0. Then for any j,

0 = X_j ∙ (c_1X_1 + ⋯ + c_mX_m) = c_j(X_j ∙ X_j)

since X_j ∙ X_i = 0 when i ≠ j. But since X_j ∙ X_j > 0, it follows that c_j = 0. Since this is true for any j, X_1, …, X_m are independent.

Property 3: Any set of n mutually orthogonal n × 1 column vectors is a basis for the set of n × 1 column vectors. Similarly, any set of n mutually orthogonal 1 × n row vectors is a basis for the set of 1 × n row vectors.

Proof: This follows from Corollary 4 of Linear Independent Vectors and Property 2.

Observation: Let C_j be the jth column of the identity matrix I_n. As we mentioned in the proof of Corollary 4 of Linear Independent Vectors, it is easy to see that for any n, the vectors C_1, …, C_n form a basis for the set of all n × 1 column vectors.
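The sketch below (illustrative values only) checks Properties 2 and 3 for three mutually orthogonal 3 × 1 vectors, then recovers the coordinates of an arbitrary vector V in this basis by applying the dot product argument from the proof of Property 2 to V instead of 0; the coefficient formula c_j = (X_j ∙ V)/(X_j ∙ X_j) is an extrapolation of that argument rather than a statement from the text. The columns of the identity matrix appear at the end as the simplest orthogonal basis.

```python
import numpy as np

# Three mutually orthogonal (non-null) 3 x 1 column vectors X_1, X_2, X_3,
# stored as the columns of B
B = np.array([[1.0,  1.0,  1.0],
              [1.0, -1.0,  1.0],
              [1.0,  0.0, -2.0]])

# Mutual orthogonality: the off-diagonal entries of B^T B are the dot products X_i . X_j
G = B.T @ B
print(np.allclose(G, np.diag(np.diag(G))))   # True: X_i . X_j = 0 for i != j

# Independence (Property 2): c_1 X_1 + c_2 X_2 + c_3 X_3 = 0, i.e. B c = 0, forces c = 0,
# so B has full rank and its columns form a basis (Property 3)
print(np.linalg.matrix_rank(B) == 3)         # True

# Any 3 x 1 vector V can be written as c_1 X_1 + c_2 X_2 + c_3 X_3; taking the dot product
# with X_j (as in the proof of Property 2) gives c_j = (X_j . V) / (X_j . X_j)
V = np.array([[3.0], [-1.0], [2.0]])         # arbitrary example vector
c = (B.T @ V) / np.diag(G).reshape(-1, 1)    # coefficients c_1, c_2, c_3
print(np.allclose(B @ c, V))                 # True: V is recovered from the orthogonal basis

# Simplest case: the columns C_1, ..., C_n of the identity matrix I_n
I3 = np.eye(3)
print(np.allclose(I3 @ V, V))                # V = v_1 C_1 + v_2 C_2 + v_3 C_3
```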
