Expressing matrices as outer product of two vectors

In summary, a matrix M is expressible as an outer product ##uv^T## exactly when it has rank 1. Via the singular value decomposition, any matrix is a linear combination of rank-one matrices, with the singular values as coefficients, so M = ##uv^T## requires all but one singular value to vanish. Naive degree-of-freedom counting is misleading here: even for a symmetric 2x2 matrix, where the two vectors have 4 entries against the matrix's 3 DOF, a generic matrix is not an outer product.
  • #1
mnb96
Hello,

it is known that given two column vectors u and v with N entries, the quantity ##uv^T## is an N×N matrix, and this operation is usually called the outer product (or tensor product).

My questions are:

1) How can we test whether or not a matrix M is expressible as an outer product of two vectors, that is M = ##uv^T##?

2) Is the identity matrix expressible as an outer product of two (possibly complex) vectors?

Thanks.
 
  • #2
Hey mnb96.

Have you come across dyads before?
 
  • #3
I haven't, although I am familiar with the basics of Clifford algebra, if that can help.
 
  • #4
The key to answering both questions is that ##uv^T## is a matrix of rank 1.
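For illustration, a minimal numpy sketch (the vectors here are made up):

[code]
import numpy as np

# Any outer product u v^T of nonzero vectors has rank 1:
# every column of np.outer(u, v) is a scalar multiple of u.
u = np.array([1.0, 2.0, 3.0])
v = np.array([4.0, 5.0, 6.0])
M = np.outer(u, v)                    # M[i, j] = u[i] * v[j]

print(np.linalg.matrix_rank(M))       # 1
[/code]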
 
  • #5
OK, write M = uv', and take the 3x3 case for discussion. A general 3x3 matrix has 9 degrees of freedom (df), but two vectors with 3 elements each have only 6 between them. So, to make the discussion more useful, let's make the matrix M symmetric: it now has 6 df (down from 9), the same total as the two vectors. For a 4x4 symmetric matrix there are 10 df (4+3+2+1), but the vectors have only 8 in total. Going down, a 2x2 symmetric matrix has 3 df (2+1), while the vectors have 4 in total, more than enough(?). Explore the 2x2 case:

[tex]\begin{bmatrix} a & c \\ c & d \end{bmatrix} = \begin{bmatrix} e \\ f \end{bmatrix}\begin{bmatrix} g & h \end{bmatrix} = \begin{bmatrix} eg & eh \\ fg & fh \end{bmatrix}[/tex]

Then a = eg; c = eh = fg; and d = fh.
Let E = ln(e), F = ln(f), G = ln(g), H = ln(h), and A = ln(a), C = ln(c), D = ln(d), assuming all logs are defined (??). Then

1*E + 0*F + 1*G + 0*H = A
1*E + 0*F + 0*G + 1*H = C
0*E + 1*F + 1*G + 0*H = C
0*E + 1*F + 0*G + 1*H = D

The coefficient matrix of this system has rank 3. Subtracting the last equation from the 3rd:
0*E + 0*F + 1*G - 1*H = C - D
and subtracting this from the 1st:
1*E + 0*F + 0*G + 1*H = A - C + D
But this is inconsistent with the 2nd equation:
1*E + 0*F + 0*G + 1*H = C
unless it happens that C = (A + D)/2, i.e. c^2 = ad, which is exactly the condition that the determinant of the symmetric matrix vanishes. One can also verify that the original matrix of coefficients cannot be inverted.
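One can confirm this numerically; a small numpy sketch of the coefficient matrix above:

[code]
import numpy as np

# Coefficient matrix of the system in (E, F, G, H):
#   E + G = A,  E + H = C,  F + G = C,  F + H = D
coef = np.array([
    [1, 0, 1, 0],
    [1, 0, 0, 1],
    [0, 1, 1, 0],
    [0, 1, 0, 1],
], dtype=float)

print(np.linalg.matrix_rank(coef))    # 3, not 4
print(np.linalg.det(coef))            # 0.0, so it cannot be inverted
[/code]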

The 3x3 case:

[tex]\begin{bmatrix} a & b & c \\ b & d & e \\ c & e & f \end{bmatrix} = \begin{bmatrix} g \\ h \\ i \end{bmatrix}\begin{bmatrix} j & k & l \end{bmatrix} = \begin{bmatrix} gj & gk & gl \\ hj & hk & hl \\ ij & ik & il \end{bmatrix}[/tex]

So a = gj; b = gk = hj; c = gl = ij; d = hk; e = hl = ik; f = il.

Let G = ln(g), H = ln(h), etc., again assuming all the log transforms are defined (??):

1*G + 0*H + 0*I + 1*J + 0*K + 0*L = A
1*G + 0*H + 0*I + 0*J + 1*K + 0*L = B
0*G + 1*H + 0*I + 1*J + 0*K + 0*L = B
1*G + 0*H + 0*I + 0*J + 0*K + 1*L = C
0*G + 0*H + 1*I + 1*J + 0*K + 0*L = C
0*G + 1*H + 0*I + 0*J + 1*K + 0*L = D
0*G + 1*H + 0*I + 0*J + 0*K + 1*L = E
0*G + 0*H + 1*I + 0*J + 1*K + 0*L = E
0*G + 0*H + 1*I + 0*J + 0*K + 1*L = F
So we have a system of 9 equations in 6 unknowns. Again, we may not always be able to solve the system, even in the 3x3 symmetric case. Forget about higher dimensions.
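As a sanity check, here is a numpy sketch (with a made-up rank-one symmetric matrix M = ww^T, entries positive so the logs exist) showing that the 9-equation system above is then consistent:

[code]
import numpy as np

w = np.array([2.0, 3.0, 5.0])
M = np.outer(w, w)                    # rank-one, symmetric, positive entries

# Unknowns (G, H, I, J, K, L) = logs of (g, h, i, j, k, l);
# one row per equation listed above.
coef = np.array([
    [1, 0, 0, 1, 0, 0],   # G + J = A
    [1, 0, 0, 0, 1, 0],   # G + K = B
    [0, 1, 0, 1, 0, 0],   # H + J = B
    [1, 0, 0, 0, 0, 1],   # G + L = C
    [0, 0, 1, 1, 0, 0],   # I + J = C
    [0, 1, 0, 0, 1, 0],   # H + K = D
    [0, 1, 0, 0, 0, 1],   # H + L = E
    [0, 0, 1, 0, 1, 0],   # I + K = E
    [0, 0, 1, 0, 0, 1],   # I + L = F
], dtype=float)
a, b, c = M[0]
d, e, f = M[1, 1], M[1, 2], M[2, 2]
rhs = np.log([a, b, b, c, c, d, e, e, f])

sol, _, rank, _ = np.linalg.lstsq(coef, rhs, rcond=None)
print(rank)                           # 5: one overall scaling freedom
print(np.allclose(coef @ sol, rhs))   # True: consistent for a rank-1 M
[/code]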
 
  • #6
mnb96 said:
1) How can we test whether or not a matrix M is expressible as an outer product of two vectors, that is M = ##uv^T##?

Look at the various interpretations of the Singular Value Decomposition (SVD) of a matrix. One interpretation of the SVD is that any matrix can be expressed as a linear combination of rank-one matrices of the form [itex] U_i V_i^T [/itex], where the [itex]U_i[/itex] and [itex]V_i[/itex] are the singular vectors. The singular values are the coefficients of this linear combination. You are asking when all the singular values except one are zero.
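A minimal numpy sketch of that test (the function name is made up):

[code]
import numpy as np

def is_outer_product(M, tol=1e-12):
    """True iff M = u v^T for some vectors, i.e. rank(M) <= 1."""
    s = np.linalg.svd(M, compute_uv=False)   # singular values, descending
    return s[0] == 0 or np.all(s[1:] <= tol * s[0])

print(is_outer_product(np.outer([1.0, 2.0], [3.0, 4.0])))  # True
print(is_outer_product(np.eye(2)))                         # False
[/code]

The second call also answers question 2: the n×n identity has n nonzero singular values, hence rank n, so for n > 1 it is not an outer product, real or complex.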
 
  • #7
ajkoer said:
OK, write M = uv', and take the 3x3 case for discussion.

Your counting of degrees of freedom isn't right, because if M = uv' then also M = (au)(v/a)' for any nonzero scalar a.

Your idea for finding u and v (if they exist) doesn't work for a 2x2 matrix, or even a symmetric 2x2, where there are 4 entries in the vectors and only 3 DOF in the matrix.

Suppose ##\begin{bmatrix} a \\ b\end{bmatrix} \begin{bmatrix} c & d \end{bmatrix} =
\begin{bmatrix}ac & ad \\ bc & bd \end{bmatrix} = \begin{bmatrix} p & q \\ r & s \end{bmatrix}##.
Ignoring special cases when some matrix entries are zero, from the first row we have
ac = p, ad = q, or c = p/a, d = q/a
From the second row
bc = r, bd = s, or bp/a = r, bq/a = s
So b/a = r/p = s/q
That cannot be satisfied unless r/p = s/q, i.e. ps = qr (the 2x2 determinant vanishes), which means p, q, r, and s cannot be four arbitrary values. Making the matrix symmetric, so q = r, doesn't help, because you still need q/p = s/q.

The above applies to every 2x2 submatrix of any matrix bigger than 1x1.

Note: the result is still true if some of p, q, r, and s are zero; the proof is similar, but there are too many special cases to write them all out here.
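The same criterion can be written without any divisions, which sidesteps the zero special cases; a minimal sketch (names made up):

[code]
import numpy as np

def rank_at_most_one(M, tol=1e-12):
    """M = u v^T is possible iff every 2x2 minor vanishes:
    M[i,k]*M[j,l] - M[i,l]*M[j,k] == 0, the cross-multiplied
    form of r/p = s/q."""
    m, n = M.shape
    for i in range(m):
        for j in range(i + 1, m):
            for k in range(n):
                for l in range(k + 1, n):
                    if abs(M[i, k] * M[j, l] - M[i, l] * M[j, k]) > tol:
                        return False
    return True

print(rank_at_most_one(np.outer([1.0, 2.0], [3.0, 4.0])))    # True
print(rank_at_most_one(np.array([[1.0, 2.0], [3.0, 4.0]])))  # False
[/code]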
 
  • #8
AlephZero:

I fixed my error, only to then see your commentary.

My revised conclusion was that the system of equations, even in the 2x2 symmetric case over the positive numbers, is generally inconsistent (no solution), which I believe is what your analysis implies as well. Note that I say generally, as I may have identified a very special case in which a solution is possible in the 2x2 case.
 
  • #9
Thank you all for your advice.
I think that, combining all your contributions, I can consider my original questions answered.

I'll take the risk of going off topic, but I was wondering whether it's possible to answer the first question in my original post when, instead of two vectors u and v in ##\mathbb{R}^n##, I have two continuous and integrable functions u(x) and v(x), and instead of the tensor product I consider the 2-dimensional function M(x,y) = u(x)v(y).

When can M(x,y) be expressed as the product of two functions u(x)v(y)?
It's a bit difficult now to talk about SVD and matrix ranks.
 
  • #10
mnb96 said:
When can M(x,y) be expressed as the product of two functions u(x)v(y)?

Going back to the matrices, you can express any matrix as a sum of k outer products, where k is the rank of the matrix. For example, if the matrix has full rank, a trivial solution is to take the u vectors to be the unit vectors (each containing a single entry 1) and the v vectors to be the rows of the matrix, though this is not a unique solution.
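Both constructions are easy to check numerically; a small sketch (the matrix is made up):

[code]
import numpy as np

M = np.array([[1.0, 2.0], [3.0, 4.0]])        # full rank, so k = 2

# Trivial solution: u_i = i-th unit vector, v_i = i-th row of M.
trivial = sum(np.outer(np.eye(2)[i], M[i]) for i in range(2))
print(np.allclose(trivial, M))                # True

# SVD solution: M = sum_i sigma_i * u_i v_i^T.
U, s, Vt = np.linalg.svd(M)
from_svd = sum(s[i] * np.outer(U[:, i], Vt[i]) for i in range(2))
print(np.allclose(from_svd, M))               # True
[/code]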

The more interesting version of your question is to extend this to an "infinite number of dimensions" and ask whether ##M(x,y)## can be expressed as an infinite sum like
$$M(x,y) = \sum_{i=0}^\infty u_i(x)v_i(y)$$ or, with coefficients, $$M(x,y) = \sum_{i=0}^\infty \sum_{j=0}^\infty a_{ij}\, u_i(x)v_j(y)$$

Note, this does not have the same "trivial" solution as the finite dimension matrix problem, because there are only a countably infinite number of functions being summed, but you want the sum to hold for an uncountably infinite number of values of x and y.
 
  • #11
mnb96 said:
When can M(x,y) be expressed as the product of two functions u(x)v(y)?
It's a bit difficult now to talk about SVD and matrix ranks.


That is a very general, but interesting, question. In view of the counterintuitive possibilities of simply using addition (e.g. the PDF: http://www.google.com/url?sa=t&rct=...sg=AFQjCNF1XfRRDuAqJ6_7C60ttGO81rmoQw&cad=rja) I suppose there is reason to be optimistic about such a representation. I think you must establish a more specific scenario to get answers.

For example if we assume that we can evaluate M(x,y) numerically at the points (0,0) ,(0,1), (1,0), (1,1) then we can get 4 equations for the 4 unknowns u(0),u(1),v(0),v(1). If we can solve those, we have reconstructed u() and v() at those points. If M(x,y) can be evaluated at arbitrary (x,y) (for example, if you know a specific formula for M(x,y), but one that isn't factored), we'd have to think about conditions that guarantee that all the possible simultaneous equations we can set up have some consistent set of answers.
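A minimal sketch of that reconstruction on the four grid points (the example M is made up; the scaling is fixed by setting u(0) = 1):

[code]
import numpy as np

def M(x, y):
    return np.exp(-x) * (1.0 + y)     # made-up separable example

# M(xi, yj) = u(xi) * v(yj); fix the overall scale with u(0) = 1.
u0 = 1.0
v0 = M(0, 0) / u0
v1 = M(0, 1) / u0
u1 = M(1, 0) / v0

# The fourth equation is then a consistency check:
print(np.isclose(u1 * v1, M(1, 1)))   # True only if M is separable here
[/code]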

If we assume M(x,y) is differentiable, perhaps we can write differential equations for u(x) and v(y).
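One concrete way to do that (assuming M is positive and twice differentiable): taking logs of M(x,y) = u(x)v(y) gives ln M = ln u(x) + ln v(y), so a necessary condition, which is also sufficient on a rectangle, is

[tex]\frac{\partial^2}{\partial x \, \partial y} \ln M(x,y) = 0.[/tex]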
 
  • #12
I didn't know that the problem I posed is related to Kolmogorov's superposition theorem. Quite an interesting remark.

I agree with Stephen that, in order to get some answer, we probably must establish a specific scenario. For example, I was thinking that if we suppose [itex]M(x,y)=f(x)g(y)[/itex], with both f and g integrable and their integrals over the reals nonzero, then we have:

[tex]M_x(x) = \int_{-\infty}^{+\infty}M(x,y)dy = \lambda_g f(x)[/tex]

[tex]M_y(y) = \int_{-\infty}^{+\infty}M(x,y)dx = \lambda_f g(y)[/tex]

where [itex]\lambda_f = \int_{-\infty}^{+\infty} f(x)dx[/itex], and [itex]\lambda_g = \int_{-\infty}^{+\infty} g(y)dy[/itex].

In other words, the projections Mx and My of M onto the x-axis and y-axis suffice to reconstruct M (up to a multiplicative scalar). This means that, under the aforementioned restrictions on M, a necessary condition for M to be expressible as the product of two one-variable functions is that:

[tex]M(x,y) = k \cdot M_x (x) M_y (y)[/tex]

for some scalar k.

Unfortunately I don't know how to address the case when either one of the projections is zero.
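A quick numerical check of this necessary condition, using scipy for the integrals and a made-up separable Gaussian:

[code]
import numpy as np
from scipy.integrate import quad

def M(x, y):
    return np.exp(-x**2) * np.exp(-2 * y**2)   # separable by construction

def Mx(x):   # projection onto the x-axis: integrate out y
    return quad(lambda y: M(x, y), -np.inf, np.inf)[0]

def My(y):   # projection onto the y-axis: integrate out x
    return quad(lambda x: M(x, y), -np.inf, np.inf)[0]

# k = 1 / (lambda_f * lambda_g) should be the same at every point:
pts = [(0.3, -0.5), (1.0, 0.2), (-0.7, 0.9)]
ks = [M(x, y) / (Mx(x) * My(y)) for x, y in pts]
print(np.allclose(ks, ks[0]))   # True: M = k * Mx * My for a single k
[/code]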
 
  • #13
...actually, I just realized that there is an easier way to handle the more general case in which none, one, or both of the projections are zero.
Consider a function M(x,y) for which there exists at least one point (x0,y0) where the function is non-zero, and assume that:

(1)
[tex]M(x,y)=f(x)g(y)[/tex]

It follows that [itex]f(x_0)\neq 0[/itex] and [itex]g(y_0)\neq 0[/itex].
And if we consider the horizontal and vertical slices of M at [itex](x_0, y_0)[/itex]:

[tex]M(x, y_0)=g(y_0)\cdot f(x)[/tex]
[tex]M(x_0, y)=f(x_0)\cdot g(y)[/tex]

we have that:

(2) [tex]M(x,y) = \lambda M(x, y_0)M(x_0, y)[/tex]

for some scalar λ; in fact [itex]\lambda = 1/(f(x_0)g(y_0)) = 1/M(x_0,y_0)[/itex].

It is also possible to verify that [itex](2) \Rightarrow (1)[/itex]. This means that a function M(x,y) that is nonzero at some point, and that is expressible as a product of two functions, can be reconstructed, up to a multiplicative scalar, by the product of the horizontal and vertical slices of M at a point where the function M is non-zero.

This should also be valid for matrices, and it is related to the fact that, as AlephZero pointed out, such a matrix must have rank 1.
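The slice criterion is straightforward to turn into a test; a minimal sketch (function names and sample points are made up):

[code]
import numpy as np

def separable_at(M, x0, y0, pts, tol=1e-12):
    """Check condition (2): M(x,y) * M(x0,y0) == M(x,y0) * M(x0,y),
    i.e. lambda = 1 / M(x0,y0), at the given sample points."""
    if abs(M(x0, y0)) < tol:
        raise ValueError("need a point where M is non-zero")
    return all(abs(M(x, y) * M(x0, y0) - M(x, y0) * M(x0, y)) <= tol
               for x, y in pts)

pts = [(0.1, 0.4), (1.2, -0.3), (-0.8, 2.0)]
print(separable_at(lambda x, y: np.sin(x) * np.exp(y), 1.0, 0.0, pts))  # True
print(separable_at(lambda x, y: x + y, 1.0, 1.0, pts))                  # False
[/code]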
 

Related to Expressing matrices as outer product of two vectors

1. What does it mean to express a matrix as an outer product of two vectors?

Expressing a matrix as an outer product of two vectors means writing the matrix as the product of a column vector and a row vector. If the column vector has m entries and the row vector has n entries, the result is an m×n matrix whose (i,j) entry is the product of the i-th entry of the column vector and the j-th entry of the row vector.

2. How do you express a matrix as an outer product of two vectors?

To express a matrix as an outer product, multiply a column vector u by a row vector v^T using ordinary matrix multiplication: M = uv^T. (This can also be viewed as a special case of the Kronecker product.) As discussed in the thread above, this is only possible when the matrix has rank 1.
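For instance, a short numpy sketch:

[code]
import numpy as np

u = np.array([[1], [2]])   # 2x1 column vector
v = np.array([[3, 4]])     # 1x2 row vector
print(u @ v)               # [[3 4], [6 8]], same as np.outer([1, 2], [3, 4])
[/code]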

3. What are the benefits of expressing a matrix as an outer product of two vectors?

Expressing a matrix as an outer product of two vectors can simplify calculations and make large matrices more manageable: an m×n matrix is then described by only m + n numbers instead of m·n. It also makes the structure of the data easier to visualize and interpret, since every row (and every column) is a scalar multiple of a single vector.

4. Can any matrix be expressed as an outer product of two vectors?

No, not every matrix can be expressed as an outer product of two vectors. Only matrices of rank at most one can be written this way; equivalently, every 2x2 minor must vanish. For any other matrix, no pair of vectors multiplies out to give it.

5. What are some real-world applications of expressing matrices as outer products of two vectors?

The outer product of two vectors is commonly used in fields such as signal processing, image compression, and machine learning. It can also be used in data analysis and statistics to identify patterns and relationships between variables. Additionally, the outer product can be used in engineering and physics to represent physical quantities and their interactions.
