Vector Spaces: Provide a counter example to disprove

In summary, the conversation discusses the concept of vector spaces and why the set [itex]A=\{(x,y) \in \mathbb{R}^{2} : x\geq 0\}[/itex] does not qualify as a vector space: it is not closed under scalar multiplication. It is also mentioned that the terms "commutative" and "associative" are not typically used for the scalar operation; instead, "mixed associativity" and "compatibility of scalar multiplication with field multiplication" are used. The possibility of defining commutativity for this relationship is also discussed, but it is not considered an important concept for vector spaces.
  • #1
boings
Demonstrate with the help of a counter-example why the following is not a vector space.

1. A= ((x,y) [itex]\ni[/itex] R[itex]^{2}[/itex]/ x[itex]\geq[/itex]0)

I have many more questions like this, but since I cannot get the first one I think I might have a chance if I understand it.

As far as an attempt at an answer, I can only grasp that vector space must be commutative and associative, and I can guess that this isn't the case because as y is negative x may become negative as well, which would be outside the vector space. But how might I say that if there are no determined operations on the set of (x,y) variables?

thank you!
 
  • #2
Hi boings,
You should be able to multiply any element of your vector space by a scalar and get a new element of the same vector space.
Can you think of some scalars that clearly break this rule?
Cheers...
 
  • #3
Ok, so that would be any negative scalar which would break the rule of x[itex]\geq[/itex]0?
 
  • #4
Yes :)
Multiply any element of your original 'candidate vector space' by a negative number, and the result clearly doesn't belong to said 'original candidate';
therefore it can't be a valid candidate :)
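
To make this concrete, here is one explicit instance of the failure (the particular vector and scalar are just one possible choice):

[tex]v=(1,2)\in A\ \text{since}\ 1\geq 0,\qquad (-3)\cdot v=(-3,-6)\notin A\ \text{since}\ -3<0.[/tex]

So A is not closed under scalar multiplication by negative reals, and hence cannot be a vector space under the usual operations of [itex]\mathbb{R}^{2}[/itex].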
 
  • #5
Some remarks (and I mean them to be constructive):

boings said:
Demonstrate with the help of a counter-example why the following is not a vector space.

1. A= ((x,y) [itex]\ni[/itex] R[itex]^{2}[/itex]/ x[itex]\geq[/itex]0)

You have the wrong math symbol here. It should be [itex]\in[/itex] instead of [itex]\ni[/itex]. The LaTeX code is \in

I have many more questions like this, but since I cannot get the first one I think I might have a chance if I understand it.

As far as an attempt at an answer, I can only grasp that vector space must be commutative and associative

A vector space being commutative and associative makes no sense. It is the operation + on the vector space that is commutative and associative.
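
For reference, those two axioms are statements about vector addition alone:

[tex]u+v=v+u,\qquad (u+v)+w=u+(v+w)\quad\text{for all } u,v,w\in V.[/tex]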
 
  • #6
micromass said:
You have the wrong math symbol here. It should be [itex]\in[/itex] instead of [itex]\ni[/itex]. The LaTeX code is \in

Hah, yeah I thought that was wrong, but couldn't find the right one, thanks!

So am I correct in saying that the scalar operation on a vector space is commutative and associative?

thank you both
 
  • #7
boings said:
So am I correct in saying that the scalar operation on a vector space is commutative and associative?

Very good question.

Hmmm. I know what you mean and you are correct. But the words commutative and associative are not typically used for the scalar operation.
The scalar operation is defined as

[tex]\mathbb{R}\times V\rightarrow V:(\alpha,v)\mapsto \alpha\cdot v[/tex]

Associativity would mean that [itex]\alpha\cdot (\beta \cdot v)= (\alpha \beta)\cdot v[/itex]. This law certainly holds true, but we don't use the word associative for it. The reason we don't is that [itex]\alpha,\beta[/itex] and v belong to different sets. We usually only talk about associativity when the sets are the same. The terms I have seen used to describe this law are "mixed associativity" and "compatibility of scalar multiplication with field multiplication". Of course, there is nothing wrong with thinking of it as associativity, and many authors do.

Commutativity is also a bit tricky. You want to say that [itex]\alpha\cdot v=v\cdot \alpha[/itex]. But if we are rigorous, then I have to remark that [itex]v\cdot \alpha[/itex] is not even defined. We defined scalar multiplication such that the scalar is always on the left of the vector. We did not define anything where the scalar is to the right of the vector. Of course, we can just define [itex]\alpha\cdot v=v\cdot \alpha[/itex] to be true (and many authors indeed do this). But I would still be careful in calling this relation "commutativity" (again, because [itex]\alpha[/itex] and v belong to different sets).
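
As a quick sanity check of the compatibility law in [itex]\mathbb{R}^{2}[/itex] (the numbers are arbitrary):

[tex]2\cdot\big(3\cdot(1,-4)\big)=2\cdot(3,-12)=(6,-24)=(2\cdot 3)\cdot(1,-4).[/tex]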
 
  • #8
thanks a lot, that's a great reply.

So if commutativity in this case were to be proven, has it been referenced also as "mixed commutativity"?
 
  • #9
boings said:
thanks a lot, that's a great reply.

So if commutativity in this case were to be proven, has it been referenced also as "mixed commutativity"?

Commutativity can't be proven; it needs to be defined. Remember, when you have a vector space, the only thing that is defined is something of the form [itex]\alpha\cdot v[/itex]. A product where the scalar is on the right, like [itex]v\cdot \alpha[/itex], is not in general defined. If you want [itex]\alpha\cdot v=v\cdot \alpha[/itex], then you will need to define this to be true, as the right-hand side does not make any sense until you define it.

But anyway, even if you do have [itex]\alpha \cdot v=v\cdot \alpha[/itex], there is nothing stopping you from calling this commutativity. However, I have not yet heard the term "commutativity" used for this relationship. I don't think many authors consider it an interesting relationship for vector spaces in the first place, since it is just a definition, and not really a very useful one.
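
If one does adopt such a convention, its content is simply that the left-hand side is declared to mean the right-hand side, e.g.

[tex](1,2)\cdot 5 := 5\cdot(1,2)=(5,10),[/tex]

where only the right-hand expression is something the vector space axioms actually define.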
 
  • #10
You're right, I'm going to take your word and their word on this and not lose any sleep over whether it will be useful to me :)

thanks
 

Related to Vector Spaces: Provide a counter example to disprove

1. What is a vector space?

A vector space is a mathematical structure that consists of a set of objects (vectors) that can be added together and multiplied by scalars (numbers). It must satisfy certain axioms, such as closure under addition and scalar multiplication, commutativity and associativity of addition, and the existence of an additive identity and additive inverses.
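
The standard example is [itex]\mathbb{R}^{2}[/itex] with componentwise operations:

[tex](x_1,y_1)+(x_2,y_2)=(x_1+x_2,\,y_1+y_2),\qquad \alpha\cdot(x,y)=(\alpha x,\,\alpha y).[/tex]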

2. What does it mean to disprove a vector space?

To "disprove" a vector space means to find a counterexample that violates one or more of the vector space axioms. This shows that the set of objects being considered, together with the given operations, does not form a vector space.

3. Can you provide a counter example showing that a given set is not a vector space?

Yes. Consider the set [itex]A=\{(x,y) \in \mathbb{R}^{2} : x\geq 0\}[/itex] from this thread, with the usual addition and scalar multiplication of [itex]\mathbb{R}^{2}[/itex]. It is closed under addition, but multiplying the element (1,0) by the scalar -1 gives (-1,0), which is not in the set. Since the set is not closed under scalar multiplication, it does not form a vector space.

4. Is it possible for a set of objects to satisfy some, but not all, of the axioms of a vector space?

Yes, it is possible for a set of objects to satisfy some of the axioms of a vector space but not all of them. For example, the set of integers [itex]\mathbb{Z}[/itex], regarded as a candidate vector space over the real numbers, is closed under addition and has an additive identity and additive inverses, but it is not closed under scalar multiplication: [itex]\frac{1}{2}\cdot 1 = \frac{1}{2}[/itex] is not an integer. Hence it does not form a vector space over [itex]\mathbb{R}[/itex].

5. What is the importance of counter examples in disproving vector spaces?

Counter examples play a crucial role in disproving vector spaces as they provide concrete evidence that a certain set of objects does not fulfill the requirements of a vector space. They also help in understanding the limitations of a particular set and the importance of each axiom in defining a vector space.
