Dot product = Product of norms

In summary: yes, the converse holds. If |u·v| = ||u|| ||v|| for a pair of vectors u and v, then the vectors must be parallel. This is exactly the equality case of the Cauchy-Schwarz inequality, and it can be proved either by a small modification of the standard Cauchy-Schwarz proof or by reformulating the problem in terms of the projection of one vector onto the other.
  • #1
Bipolarity
Hey guys, I am a beginner in linear algebra. I am studying vectors now, and I just noticed that when two vectors are parallel (or antiparallel), the product of their norms is equal to the absolute value of their dot product, or

[tex] |u \cdot v | = ||u|| \ ||v|| [/tex]

I know that this is a special case of the Cauchy-Schwarz inequality. My question is, is the converse necessarily true? In other words, if you know the above equation to be true for a pair of vectors u and v, must they necessarily be parallel? How might one go about proving this? Could we assume the contrary and show an inconsistency?

By the way, by "parallel" I mean that there exists a scalar (a real number, not necessarily positive) that scales one vector onto the other.

BiP
 
  • #2
Hey Bipolarity.

You can prove this if you want by letting one vector be a scalar multiple of the other (say v = c*u, where c is a non-zero real number) and then using the properties of norms and inner products to show that the above equation holds.

For inner products we know that <u,cu> = c<u,u>, and for norms we know that ||cu|| = |c|*||u||; the rest follows from there.
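
For instance, carrying that out (a short sketch, assuming v = c*u with c a non-zero real number) gives

[tex] |u \cdot v| = |u \cdot (cu)| = |c| \, |u \cdot u| = |c| \, ||u||^2 = ||u|| \ ||cu|| = ||u|| \ ||v|| [/tex]

so the equality holds whenever one vector is a scalar multiple of the other.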
 
  • #3
chiro said:
Hey Bipolarity.

You can prove this if you want by letting one vector be a scalar multiple of the other (say v = c*u, where c is a non-zero real number) and then using the properties of norms and inner products to show that the above equation holds.

For inner products we know that <u,cu> = c<u,u>, and for norms we know that ||cu|| = |c|*||u||; the rest follows from there.

Hey chiro, I am afraid you might have misunderstood my question. I already proved that parallel vectors satisfy the above equation. What I am trying to prove is whether the satisfaction of the above equation necessitates that two vectors be parallel.

Why would you assume the vectors are parallel if you are trying to prove exactly that?
Also, I have not studied inner product spaces yet. Do you mean the dot product when you say inner product?

BiP
 
  • #4
Well, as you are aware, Cauchy-Schwarz is an inequality, which means that if a pair of vectors does not sit exactly at the boundary of the inequality then it lies strictly within its bounds. So yes, you should be able to show that if the equality is satisfied then the vectors are parallel.

In terms of an actual proof, if you show that the two statements imply each other (a biconditional), then by contraposition the negation of one implies the negation of the other: if one doesn't hold, the other can't either.
 
  • #5
chiro said:
Well, as you are aware, Cauchy-Schwarz is an inequality, which means that if a pair of vectors does not sit exactly at the boundary of the inequality then it lies strictly within its bounds. So yes, you should be able to show that if the equality is satisfied then the vectors are parallel.

In terms of an actual proof, if you show that the two statements imply each other (a biconditional), then by contraposition the negation of one implies the negation of the other: if one doesn't hold, the other can't either.

Yes that is what I am asking. How would I go about proving that the vectors are parallel? I would have to prove the existence of a scalar that scales one vector onto the other. What would that scalar be, given that the equation above is satisfied? Somehow I would have to define this scalar in terms of the dot product and the norms of the vectors. I feel that is the only way to prove this conjecture.

BiP
 
  • #6
Do you know the proof of the Cauchy-Schwarz inequality?? Can you write it down?
What you want to prove usually follows from a small modification in the proof.
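
For reference, one standard version of that proof (a sketch, assuming v ≠ 0) subtracts the projection of u onto v and examines the residual [itex] w = u - \frac{u \cdot v}{||v||^2} v [/itex]:

[tex] 0 \le ||w||^2 = \left|\left| u - \frac{u \cdot v}{||v||^2} \, v \right|\right|^2 = ||u||^2 - \frac{(u \cdot v)^2}{||v||^2} [/tex]

Rearranging gives [itex] (u \cdot v)^2 \le ||u||^2 \, ||v||^2 [/itex], and equality forces ||w|| = 0, i.e. [itex] u = \frac{u \cdot v}{||v||^2} v [/itex], which exhibits the required scalar explicitly (the degenerate case v = 0 can be handled separately).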
 
  • #7
I've thought about this, and I think one way is to use projections to reformulate the problem and then give a bidirectional proof based on the difference between the original vector and the result of projecting it onto the other.

So if you want to project u onto v, then you calculate proj(u,v) = (<u,v>/||v||)*v^ = (<u,v>/||v||^2)*v, where v^ = v/||v|| is the unit vector in the direction of v.

Now let x = ||u|| - ||proj(u,v)|| be the difference of the norms of those vectors. If x > 0 then the vectors are not parallel (that is, not linearly dependent).

Now, for the other direction, we need to show that if u is not linearly dependent on v then the equality fails (i.e. x > 0). To do this, break up a general vector u into a scalar multiple of v plus a residual term (i.e. u = a*v + w, where w is the left-over component).

Using distributivity of the inner product we get proj(u,v) = (<u,v>/||v||^2)*v = (<a*v + w,v>/||v||^2)*v = (<a*v,v>/||v||^2)*v + (<w,v>/||v||^2)*v = av + proj(w,v).

So all I have done is extended the above to take into account a vector with a linearly dependent term and a linearly independent term.

You will have to rearrange that yourself (I'm just sketching the skeleton of the idea), but if you show that the equality can only be obtained when w = 0, then obtaining the equality forces w = 0, and w = 0 is exactly linear dependence. To establish the converse direction, suppose w ≠ 0 and derive a contradiction.

The projection approach might be a good one to consider, since it shows explicitly how <u,v> is related to the dependent and independent parts, and you can relate these ideas to Cauchy-Schwarz as well.
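
As an illustrative numerical check of this projection idea, here is a small Python/NumPy sketch (the vectors are made-up examples) that compares |<u,v>| with ||u||*||v|| and looks at the norm of the residual u - proj(u,v):

[code]
import numpy as np

def cauchy_schwarz_equality(u, v, tol=1e-9):
    # True when |<u,v>| equals ||u|| * ||v|| (up to floating-point tolerance)
    return abs(abs(np.dot(u, v)) - np.linalg.norm(u) * np.linalg.norm(v)) < tol

def residual(u, v):
    # w = u - proj(u, v); v is assumed non-zero
    return u - (np.dot(u, v) / np.dot(v, v)) * v

u = np.array([2.0, -4.0, 6.0])
v = np.array([-1.0, 2.0, -3.0])   # v = -0.5 * u, so u and v are parallel
w = np.array([1.0, 0.0, 0.0])     # not parallel to u

print(cauchy_schwarz_equality(u, v), np.linalg.norm(residual(u, v)))  # True, ~0
print(cauchy_schwarz_equality(u, w), np.linalg.norm(residual(u, w)))  # False, ~7.21
[/code]

In the parallel case the residual is zero and the equality holds; in the non-parallel case the residual is non-zero and the strict inequality appears.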
 
  • #8
micromass said:
Do you know the proof of the Cauchy-Schwarz inequality?? Can you write it down?
What you want to prove usually follows from a small modification in the proof.

Thanks micromass! I remember my professor showing us the proof, and I remember having a notion of solving this problem using that proof. Now I know that it can be done.

I will try to apply the Cauchy-Schwarz inequality; however, the proof is a bit too lengthy for me to type up in LaTeX here.

Thanks for the help!

BiP
 
  • #9
One can show that the dot product of two vectors u and v is [itex]u\cdot v=|u||v|\cos(\theta)[/itex], where [itex]\theta[/itex] is the angle between the two vectors. (One can prove that in two or three dimensions; in higher-dimensional vector spaces, it is typically taken as the definition of "angle".) So if [itex]|u\cdot v|= |u||v|[/itex], then we must have [itex]\cos(\theta)[/itex] equal to 1 or -1, which in turn means that the angle between u and v is either 0 or 180 degrees.
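
For a concrete example in the plane, take u = (1, 2) and v = (2, 4), so that v = 2u:

[tex] u \cdot v = 1 \cdot 2 + 2 \cdot 4 = 10, \qquad |u|\,|v| = \sqrt{5} \cdot \sqrt{20} = 10, \qquad \cos(\theta) = \frac{10}{10} = 1 [/tex]

so [itex]\theta = 0[/itex], consistent with the two vectors being parallel.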
 

Related to Dot product = Product of norms

1. What is the Dot Product?

The dot product, also known as the scalar or inner product, is a mathematical operation that takes two vectors and produces a single scalar value. It is calculated by multiplying the corresponding components of the two vectors and then summing the results.

2. How is the Dot Product calculated?

The dot product is calculated by multiplying the x component of one vector by the x component of the other vector, the y component of one vector by the y component of the other vector, and so on. The results are then summed together to get the final scalar value.
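
As a small illustration, here is a sketch in Python with made-up example vectors:

[code]
# Dot product as the sum of componentwise products
u = [1, 2, 3]
v = [4, -5, 6]
dot = sum(ui * vi for ui, vi in zip(u, v))  # 1*4 + 2*(-5) + 3*6
print(dot)  # 12
[/code]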

3. What is the relationship between the Dot Product and the Product of Norms?

The product of norms is the product of the lengths of two vectors, while the dot product is the sum of the products of their corresponding components. The relationship between the two is that the dot product equals the product of the norms multiplied by the cosine of the angle between the vectors. In particular, the absolute value of the dot product equals the product of the norms exactly when that cosine is 1 or -1, i.e. when the vectors are parallel.

4. What are the applications of the Dot Product?

The dot product has many applications in mathematics, physics, and engineering. It is used in vector calculus, mechanics, computer graphics, and many other fields. It is also used in machine learning and data analysis for tasks such as dimensionality reduction and feature extraction.

5. How is the Dot Product useful in understanding vector relationships?

The dot product is useful in understanding vector relationships because it can be used to determine the angle between two vectors and whether they are parallel or perpendicular. It can also be used to project one vector onto another and calculate the work done by a force on an object in a given direction.
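
For example, here is a short Python sketch (with made-up force and displacement vectors) that recovers the angle between two vectors and the work done by a force along a displacement:

[code]
import math

def dot(a, b):
    # Sum of componentwise products
    return sum(x * y for x, y in zip(a, b))

def norm(a):
    # Euclidean length of a vector
    return math.sqrt(dot(a, a))

F = [3.0, 4.0]   # example force vector
d = [2.0, 0.0]   # example displacement vector

theta = math.acos(dot(F, d) / (norm(F) * norm(d)))  # angle between F and d
work = dot(F, d)                                    # work done by F along d

print(round(math.degrees(theta), 2))  # 53.13
print(work)                           # 6.0
[/code]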
