Dimension and solution to matrix-vector product

  • #1
DumpmeAdrenaline
Let $$ X \in \mathbb{R}^{m \times n} $$ where m = n. If rank(X) < m, then at least one equation can be written as a linear combination of the other equations. Let $$ \beta \in \mathbb{R}^{n} $$.
$$ X\beta=y $$
Suppose we have x < m independent, consistent equations, each formed by taking the dot product of one of x independent row vectors of X with the column vector β. Each independent equation represents a geometrical object of dimension n-1 (n-1 degrees of freedom). We have x such objects in n dimensions, and we are trying to find the intersection of all of them that satisfies the right-hand side y. The dimension of the row space is x, which corresponds to the number of independent equations. Can we say that we are reducing the problem to finding a vector that is perpendicular to all x row vectors, which all lie on some geometric object of dimension n-1? If this is the case, why are there infinitely many solutions?

I understand why we have infinitely many solutions if we think of X in terms of its column vectors. If y is in the column space (we can check this by comparing the rank of X with the rank of X augmented with y), then there are infinitely many choices of scalars for the independent columns whose linear combination yields y.
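The rank comparison described above can be sketched numerically. This is a minimal hypothetical example (the matrix and vectors are made up, not from the thread): a 3×3 matrix whose third row is the sum of the first two, so rank(X) < n, with y chosen in the column space so the system is consistent but underdetermined.

```python
import numpy as np

# Hypothetical example: row 3 = row 1 + row 2, so rank(X) = 2 < 3.
X = np.array([[1.0, 2.0, 3.0],
              [4.0, 5.0, 6.0],
              [5.0, 7.0, 9.0]])

# Pick y in the column space so the system X @ beta = y is consistent.
beta_particular = np.array([1.0, 1.0, 1.0])
y = X @ beta_particular

rank_X = np.linalg.matrix_rank(X)
rank_aug = np.linalg.matrix_rank(np.column_stack([X, y]))

# Equal ranks mean y is in the column space (consistent system);
# rank < n means the solution, if it exists, is not unique.
consistent = (rank_X == rank_aug)
underdetermined = rank_X < X.shape[1]
```

Here `consistent` and `underdetermined` are both true, which is exactly the situation with infinitely many solutions.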
 
  • #2
DumpmeAdrenaline said:
Let $$ X \in R^{m*n} $$
Your notation is unusual, so difficult to follow.
Many textbooks would write something like this:
Let ##A \in M^{n \times n}## where the coefficients of A are real.
DumpmeAdrenaline said:
where m=n with rank(X)<m then there is at-least one equation which can be written as a linear combination of other equations. Let $$ \beta \in R^{n} $$.
$$ X\beta=y $$
Or more clearly,
Let ##\vec x \in \mathbb R^n## and let ##\vec y = A\vec x##.
DumpmeAdrenaline said:
Suppose we have x < m independent, consistent equations, each formed by taking the dot product of one of x independent row vectors of X with the column vector β. Each independent equation represents a geometrical object of dimension n-1 (n-1 degrees of freedom). We have x such objects in n dimensions, and we are trying to find the intersection of all of them that satisfies the right-hand side y. The dimension of the row space is x, which corresponds to the number of independent equations. Can we say that we are reducing the problem to finding a vector that is perpendicular to all x row vectors, which all lie on some geometric object of dimension n-1? If this is the case, why are there infinitely many solutions?
I don't see how this reduces the problem. The matrix equation (in my notation) ##\vec y = A\vec x## represents n - 1 equations in n unknowns. Each equation represents a hyperplane in n-dimensional space. If the n - 1 equations are linearly independent, then the solution space is a line in n-dimensional space, thus there are infinitely many solutions.
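The hyperplane picture can be illustrated with a small hypothetical example (numbers made up for illustration): two independent equations in three unknowns, where any particular solution plus any multiple of a null-space direction still solves the system, so the solution set is a whole line.

```python
import numpy as np

# Two independent equations (two planes) in three unknowns.
A = np.array([[1.0, 1.0, 0.0],
              [0.0, 1.0, 1.0]])
y = np.array([2.0, 3.0])

# A particular solution (least-squares gives one since the system is consistent).
x_p, *_ = np.linalg.lstsq(A, y, rcond=None)

# Null-space direction via SVD: the right singular vector for the
# zero singular value is perpendicular to both rows of A.
_, _, Vt = np.linalg.svd(A)
n_dir = Vt[-1]

# A whole line of solutions: x_p + t * n_dir for every scalar t.
for t in (-1.0, 0.0, 2.5):
    assert np.allclose(A @ (x_p + t * n_dir), y)
```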
DumpmeAdrenaline said:
I understand why we have infinitely many solutions if we think of X in terms of its column vectors. If y is in the column space (we can check this by comparing the rank of X with the rank of X augmented with y), then there are infinitely many choices of scalars for the independent columns whose linear combination yields y.

 
  • #3
Mark44 said:
I don't see how this reduces the problem. The matrix equation (in my notation) ##\vec y = A\vec x## represents n - 1 equations in n unknowns. Each equation represents a hyperplane in n-dimensional space. If the n - 1 equations are linearly independent, then the solution space is a line in n-dimensional space, thus there are infinitely many solutions.
Unfortunately this is the notation used by the author of the textbook I am studying from. How do we have n - 1 equations when we take the dot product of each of the n rows of matrix A with the n entries of the column vector x? Can we think of it this way:

Given a 3×3 matrix A in which two of the row vectors are linearly independent, those two rows span a plane in 3D, and the third row lies in that plane.

In a similar way, suppose we have x < n linearly independent row vectors. The x independent rows span some geometrical object in n-dimensional space in which all the row vectors lie. We are trying to find a column vector which, when dotted with the n row vectors, yields the entries of the column vector y.
 

  • #4
DumpmeAdrenaline said:
Given a 3×3 matrix A in which two of the row vectors are linearly independent, those two rows span a plane in 3D, and the third row lies in that plane.
Again, I don't see how this reduces the problem. With a 3 × 3 matrix, if all three rows are linearly independent, then there is a unique solution. If two of the rows are linearly independent and the third row is a linear combination of the other two, then the solution set is all points on the line of intersection of the two planes represented by the independent rows. If two of the rows are multiples of the remaining row, then the solution set is all points on the plane represented by that row.

In what sense does this reduce the problem?
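The three cases above follow the rank-nullity theorem: for a consistent system, the dimension of the solution set is n minus the rank. A minimal sketch with made-up 3×3 matrices, one per case:

```python
import numpy as np

def solution_dim(A):
    # Dimension of the solution set of A x = y (for consistent y):
    # n - rank(A), by the rank-nullity theorem.
    return A.shape[1] - np.linalg.matrix_rank(A)

full = np.eye(3)                      # rank 3: unique solution (a point)
rank2 = np.array([[1.0, 0.0, 0.0],
                  [0.0, 1.0, 0.0],
                  [1.0, 1.0, 0.0]])   # third row = row 1 + row 2
rank1 = np.array([[1.0, 2.0, 3.0],
                  [2.0, 4.0, 6.0],
                  [3.0, 6.0, 9.0]])   # all rows parallel

assert solution_dim(full) == 0   # point
assert solution_dim(rank2) == 1  # line
assert solution_dim(rank1) == 2  # plane
```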
 
  • #5
I thought it reduces the problem in the sense that all the independent vectors that span the row space lie on a geometric object of dimension x.
 

What is the dimension of the resulting vector in a matrix-vector product?

The resulting vector of a matrix-vector product is a column vector with the same number of entries as the matrix has rows: an m × n matrix times an n-entry vector gives an m-entry vector.

How do you multiply a matrix by a vector?

To multiply a matrix by a vector, you take the dot product of each row of the matrix with the vector. Each dot product gives one entry of the resulting vector.
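This row-by-row recipe can be checked against numpy's built-in product (a minimal sketch with made-up numbers):

```python
import numpy as np

A = np.array([[1.0, 2.0],
              [3.0, 4.0],
              [5.0, 6.0]])
x = np.array([10.0, 1.0])

# Row picture: entry i of A @ x is the dot product of row i with x.
by_rows = np.array([np.dot(row, x) for row in A])
assert np.allclose(by_rows, A @ x)  # [12., 34., 56.]
```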

What is the solution to a matrix-vector product?

The solution to a matrix-vector product is a new vector that is a linear combination of the columns of the matrix, where the coefficients are the elements of the vector being multiplied.
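The column picture can be verified the same way: the product equals the columns of the matrix weighted by the entries of the vector (same made-up numbers, for illustration):

```python
import numpy as np

A = np.array([[1.0, 2.0],
              [3.0, 4.0],
              [5.0, 6.0]])
x = np.array([10.0, 1.0])

# Column picture: A @ x is a linear combination of the columns of A,
# with coefficients taken from the entries of x.
by_cols = x[0] * A[:, 0] + x[1] * A[:, 1]
assert np.allclose(by_cols, A @ x)
```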

What is the significance of matrix-vector products in linear algebra?

Matrix-vector products are fundamental operations in linear algebra and are used in various applications such as solving systems of linear equations, transforming vectors in geometric transformations, and representing linear mappings.

How does the dimension of the matrix affect the matrix-vector product?

The matrix's dimensions determine the shape of the result: an m × n matrix times an n-entry vector yields an m-entry vector. A matrix with more rows than columns therefore produces an output longer than the input vector, while a matrix with more columns than rows produces a shorter one.
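The shape rule can be demonstrated directly (hypothetical matrices of ones, chosen only to show the shapes):

```python
import numpy as np

tall = np.ones((4, 2))   # more rows than columns
wide = np.ones((2, 4))   # more columns than rows

# The output length equals the matrix's row count m, not the input's length n.
assert (tall @ np.ones(2)).shape == (4,)   # longer than the 2-entry input
assert (wide @ np.ones(4)).shape == (2,)   # shorter than the 4-entry input
```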
