Centre of an Algebra .... and Central Algebras ....

  • #1
Math Amateur
I am reading Matej Bresar's book, "Introduction to Noncommutative Algebra" and am currently focussed on Chapter 1: Finite Dimensional Division Algebras ... ...

I need help with some remarks of Bresar on the centre of an algebra ...

Commencing a section on Central Algebras, Bresar writes the following:
[Image: Bresar - Central Algebras - Part 1]


In the above text we read the following:

" ... The center of a unital algebra obviously contains scalar multiples of unity ... ... "Now the center of a unital algebra ##A## is defined as the set ##Z(A)## such that

##Z(A) = \{ c \in A \ | \ cx = xc \text{ for all x } \in A \}##Now ... clearly ##1 \in Z(A)## since ##1x = x1## for all x ...

BUT ... why do elements like ##3## belong to ##Z(A)## ... ?

That is ... how would we demonstrate that ##3x = x3## for all ##x \in A## ... ?

Hope someone can help ...

Peter
 

  • #2
For every vector space ##V## over a field ##\mathbb{F}##, we have ##c\cdot v = v \cdot c## for all ##c \in \mathbb{F}## and all ##v \in V##.
And algebras are vector spaces. It is the same argument you used with the real numbers and the division algebra ##D## over ##\mathbb{R}##.
This means that for an algebra ##\mathcal{A}## we even have ##c\cdot (v\cdot w)=(v \cdot c) \cdot w = v \cdot (c \cdot w) = (v\cdot w) \cdot c## for all ##c\in \mathbb{F}\, , \,v,w \in \mathcal{A}## (by definition). Unlike modules, which can be left- or right-modules, a vector space is always both a left- and a right-module. One can weaken the requirement and let ##\mathbb{F}## be a ring instead of a field, but then we speak of modules rather than vector spaces. However, this doesn't change the rule for scalar multiplication: if algebras are considered over a ring, it is still required. But in general the scalars are taken from a field unless explicitly stated otherwise.
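
As a concrete illustration (a minimal sketch in Python, assuming NumPy, with the matrix algebra ##M_2(\mathbb{R})## standing in for ##\mathcal{A}##):

Code:
import numpy as np

rng = np.random.default_rng(0)
c = 3.0                          # a scalar from the field R
v = rng.standard_normal((2, 2))  # two elements of the algebra M_2(R)
w = rng.standard_normal((2, 2))

# compatibility of scalar and algebra multiplication: c(vw) = (cv)w = v(cw)
assert np.allclose(c * (v @ w), (c * v) @ w)
assert np.allclose(c * (v @ w), v @ (c * w))

# writing the scalar on the right denotes the same element: (vw)c = c(vw)
assert np.allclose((v @ w) * c, c * (v @ w))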
 
  • #3
fresh_42 said:
For every vector space ##V## over a field ##\mathbb{F}##, we have ##c\cdot v = v \cdot c## for all ##c \in \mathbb{F}## and all ##v \in V##.
And algebras are vector spaces. It is the same argument you used with the real numbers and the division algebra ##D## over ##\mathbb{R}##.
This means that for an algebra ##\mathcal{A}## we even have ##c\cdot (v\cdot w)=(v \cdot c) \cdot w = v \cdot (c \cdot w) = (v\cdot w) \cdot c## for all ##c\in \mathbb{F}\, , \,v,w \in \mathcal{A}## (by definition). Unlike modules, which can be left- or right-modules, a vector space is always both a left- and a right-module. One can weaken the requirement and let ##\mathbb{F}## be a ring instead of a field, but then we speak of modules rather than vector spaces. However, this doesn't change the rule for scalar multiplication: if algebras are considered over a ring, it is still required. But in general the scalars are taken from a field unless explicitly stated otherwise.
Thanks for the help, fresh_42 ...

But ... I am not completely following you ... sorry to be slow ...

You write:

" ... ... For every vector space ##V## over a field ##\mathbb{F}## holds: ##c\cdot v = v \cdot c## for all ##c \in \mathbb{F}## and all ##v \in V##. ... ... "

But why exactly is this true ...? It does not seem to be one of the axioms ... see below ...

The axioms for a vector space as given in Cooperstein: Advanced Linear Algebra (Second Edition) are given below ...
[Image: Cooperstein - Axioms for a vector space]
Could you please help further ...

Again ... sorry if I'm missing something obvious ..

Peter
 

  • #4
I think you may be confusing the unit element of the field with the unit element of the algebra.
In a unital algebra (V, F), let I be the unit of V. A scalar multiple of that is λI, where λ∈F. Commutativity of that with x∈V would look like (λI)x=x(λI). We don't worry about λ commuting with elements of V because the scalar product is always written the same way around. I.e. xλ does not need to be defined.
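
A quick numerical illustration of this (a sketch assuming NumPy, with M_2(R) playing the role of V and the identity matrix as I):

Code:
import numpy as np

rng = np.random.default_rng(1)
lam = 3.0
I = np.eye(2)                    # the unit I of the algebra M_2(R)
x = rng.standard_normal((2, 2))  # an arbitrary element x

# (lam I) x = x (lam I): scalar multiples of the unit are central
assert np.allclose((lam * I) @ x, x @ (lam * I))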
 
  • #5
Math Amateur said:
Thanks for the help, fresh_42 ...
But ... I am not completely following you ... sorry to be slow ...
You write:
" ... ... For every vector space ##V## over a field ##\mathbb{F}## holds: ##c\cdot v = v \cdot c## for all ##c \in \mathbb{F}## and all ##v \in V##. ... ... "
But why exactly is this true ...? it does not seem to be one of the axioms ... see below ...
The axioms for a vector space as given in Cooperstein: Advanced Linear Algebra (Second Edition) are given below ...
No need for a sorry here. It is in fact a good question. As I've always thought of vectors as little arrows, where scalar multiples are only a stretch or compression of them, I never really thought about a difference between a left-stretch and a right-stretch. Unlike groups, modules, rings, and algebras, where the distinction between left and right comes along with the definition, no such distinction is built into the definition here.

This is what I have found:

van der Waerden speaks of left and right vector spaces and distinguishes two associative laws ##(M3)\; (ab)u=a(bu)## and ##(M3^*)\; u(ab)=(ua)b##. He comments:
Bartel Leendert van der Waerden - Algebra I (1) said:
"If ##\mathbb{F}## is commutative one can write ##ua## instead of ##au##. The right vector space becomes a left vector space this way. If, however, ##\mathbb{F}## is not commutative [Remark: e.g. the division algebra ##\mathbb{H}##], then it must be distinguished between left and right vector spaces."
Unfortunately he doesn't explain whether this is a convention to identify the two isomorphic vector spaces ##V_\mathbb{F}## and ##{}_\mathbb{F}V## (in which case an additional axiom would be needed), or whether it is forced by the commutativity of ##\mathbb{F}##.

Another source (on didactics) mentions that the distinction was first made by Bourbaki in 1947, but I did not track this down to a proper source.

The definitions I have found are all the same as yours above. And like me, apparently nobody really wasted a thought on left versus right, except in my quote of van der Waerden above. I suppose that because both lead to basically the same vector space (up to isomorphism), nobody regarded the distinction as necessary. I've tried to prove ##au=ua## for (commutative) fields ##\mathbb{F}## but haven't found a quick solution. It is clear that one has to be careful with noncommutative (skew) fields, for the associativity laws ##(M3)##, resp. ##(M3^*)##, would get us into trouble: identifying left and right actions would give ##u(ab)=(ab)u=a(bu)=a(ub)=(ub)a=u(ba)##, which in general differs from ##u(ab)##; for commutative fields there is no such danger (see the small quaternion sketch below). My suggestion is: as long as we don't have a better idea (or a proof, or someone who really knows), we should take it as an additional (even though unspoken) axiom, for otherwise all of linear algebra would be unnecessarily (and probably quite distractingly) overloaded with lefts and rights.
I know this isn't a satisfactory view of the issue, and I would definitely prefer a proof, but I think the idea that stretching a vector by the same factor from the left and from the right could give different results is even more troublesome.
______________________
(1) https://www.amazon.com/dp/0387406247/?tag=pfamazon01-20
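
To make the warning concrete, here is a minimal sketch in plain Python (a toy quaternion product written out by hand, not taken from any library): the right action on a one-dimensional right ##\mathbb{H}##-space satisfies ##(M3^*)##, but simply flipping the notation to manufacture a left action breaks ##(M3)##.

Code:
# Hamilton product of quaternions represented as (w, x, y, z) tuples
def qmul(p, q):
    w1, x1, y1, z1 = p
    w2, x2, y2, z2 = q
    return (w1*w2 - x1*x2 - y1*y2 - z1*z2,
            w1*x2 + x1*w2 + y1*z2 - z1*y2,
            w1*y2 - x1*z2 + y1*w2 + z1*x2,
            w1*z2 + x1*y2 - y1*x2 + z1*w2)

i, j = (0, 1, 0, 0), (0, 0, 1, 0)  # non-commuting scalars: ij = k, ji = -k
u = (1, 0, 0, 0)                   # a vector in a one-dimensional right H-space

# the right action u.a := qmul(u, a) satisfies (M3*): u(ab) = (ua)b
assert qmul(u, qmul(i, j)) == qmul(qmul(u, i), j)

# defining a "left" action by a.u := qmul(u, a) breaks (M3):
# (ab).u = u(ab) = k, but a.(b.u) = (ub)a = u(ba) = -k
assert qmul(u, qmul(i, j)) != qmul(qmul(u, j), i)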
 
  • #6
fresh_42 said:
It is in fact a good question.
Not sure whether you saw/understood my post. In the standard axioms of an algebra, right-multiplication by scalars, i.e. xλ, is not even defined.

Let the algebra be (V, F), I be a unit in V, and λ, μ∈F.
For any x∈V, I.x=x.I.
The compatibility axiom says that if y∈V then (λμ)(x.y)=(λx).(μy).
Thus (λI).x=λ(I.x)=λ(x.I)=x.(λI).
Thus λI commutes with all elements of V.

No doubt you could define right-multiplication by scalars, and there would be no requirement for it to equate to λx.
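
The same derivation can be checked symbolically (a sketch assuming SymPy, with a generic 2×2 matrix standing in for x):

Code:
from sympy import symbols, eye, Matrix

lam, a, b, c, d = symbols('lambda a b c d')
I = eye(2)                    # the unit I of the algebra
x = Matrix([[a, b], [c, d]])  # a generic element x

# (lam I) x - x (lam I) vanishes identically, so lam I is central
assert (lam * I) * x - x * (lam * I) == Matrix.zeros(2, 2)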
 
  • #7
Yes, you're right, that answers the question about the center.

Nevertheless, I found it interesting to ask why we use ##\lambda \cdot v = v \cdot \lambda## in vector spaces without ever mentioning this convention. I looked it up in two different books and it wasn't in there. But I didn't check whether it is needed elsewhere in those books. van der Waerden's and Bourbaki's remarks on the issue at least show that it is not at all self-evident.
And I was still thinking of an earlier thread, in which Bresar used the free positioning of reals in a proof about division algebras over ##\mathbb{R}## (if I remember correctly; I'm not quite sure it was really needed).
 
  • #8
haruspex said:
Not sure whether you saw/understood my post. In the standard axioms of an algebra, right-multiplication by scalars, i.e. xλ, is not even defined.

Let the algebra be (V, F), I be a unit in V, and λ, μ∈F.
For any x∈V, I.x=x.I.
The compatibility axiom says that if y∈V then (λμ)(x.y)=(λx).(μy).
Thus (λI).x=λ(I.x)=λ(x.I)=x.(λI).
Thus λI commutes with all elements of V.

No doubt you could define right-multiplication by scalars, and there would be no requirement for it to equate to λx.

Thanks for the help haruspex ... ...

Just a clarification ... you write:

" ... ... Let the algebra be ##(V, F)##, ##I## be a unit in ##V##, and ##λ, μ∈F##... "

I am assuming that you mean ##I## is the unit (or unity or multiplicative identity) in ##V## ... ... and not simply a unit in ##V## ... is that correct?

You also write:

" ... ... right-multiplication by scalars, i.e.##xλ##, is not even defined. ... "

and yet we do not speak of left and right vector spaces over fields ... ... so surely ##x \lambda = \lambda x## in some sense ...?

I looked up some relevant texts for insights on this question ... it may be that Bland's (Paul E. Bland: "Rings and Their Modules") description of how to turn a right module, where the action is a map ##M \times R \rightarrow M## with ##(x,a) \mapsto xa##, into a left module by setting ##a \cdot x = xa## ... is relevant ...

The relevant text from Paul E. Bland: "Rings and Their Modules" is as follows:
[Images: Bland - Section 1.4 - Scalars acting on module or vector space elements - Parts 1 and 2]


So my conclusion is that Bland is saying that when the ring is commutative there is essentially no difference between a right and a left module ...

I must say that I would have preferred an axiom that somehow directly implied that ##ax = xa## ...

What do you think ...?

Peter
 

  • #9
haruspex said:
Not sure whether you saw/understood my post. In the standard axioms of an algebra, right-multiplication by scalars, i.e. xλ, is not even defined.

Let the algebra be (V, F), I be a unit in V, and λ, μ∈F.
For any x∈V, I.x=x.I.
The compatibility axiom says that if y∈V then (λμ)(x.y)=(λx).(μy).
Thus (λI).x=λ(I.x)=λ(x.I)=x.(λI).
Thus λI commutes with all elements of V.

No doubt you could define right-multiplication by scalars, and there would be no requirement for it to equate to λx.
Hi haruspex ... just another few clarifications I hope you can help with ...

You write:

" ... ... The compatibility axiom says that if ##y∈V## then ##(λμ)(x.y)=(λx).(μy)##. ... ... "

In the texts I have checked lately the compatibility axiom reads something like:

##\lambda(xy) = (\lambda x)y = x(\lambda y)## for all ##\lambda\in F## and ##x,y\in V.##

In other words there is only one scalar mentioned in the axiom ... ... why have you included two scalars ...?

You also write:

" ... ...
Thus ##(λI).x=λ(I.x)=λ(x.I)=x.(λI)##.
Thus ##λI## commutes with all elements of ##V##."

Your derivation of the fact that ##λI## commutes with all elements of ##V## involves assuming that ##I.x = x.I## ... but why exactly is this true ...

Indeed ##I.x = x.I## is not an axiom ... see below ... and I cannot see how to derive this from the axioms of a vector space ...

Can you help

Peter

======================================

The axioms for a vector space ##V## over a field ##F## are given in Bruce N. Cooperstein's book "Advanced Linear Algebra" (Second Edition) as follows:
[Image: Cooperstein - Axioms for a vector space]
 

  • #10
Math Amateur said:
you mean ##I## is the unit
Yes.
Math Amateur said:
yet we do not speak of left and right vector spaces over fields ... ... so surely xλ=λx in some sense ...?
Not really. If you look through the axioms for an algebra, the product of a scalar with a vector is always written the same way around. We are accustomed to algebras in which defining the other product to be the same creates no difficulty, so writing it either way around is harmless. But I bet that we never need to write it the other way.
Math Amateur said:
Bland is saying that when the ring is commutative there is essentially no difference between a right and a left module
No, he wrote that only if it is commutative can you elect to define the product such that it commutes. This leaves open the possibility of defining left and right algebras (or modules) to be different even though the scalars commute.
 
  • #11
Math Amateur said:
In the texts I have checked lately the compatibility axiom reads something like:
There are probably several equivalent ways of writing that axiom. I picked it off a random website. In fact, I don't think I used the other scalar when I applied it.
Math Amateur said:
Indeed I.x=x.I is not an axiom ... see below .
The topic here is unital algebras, i.e. the vector product operation has a unit, I. By definition, I.x = x = x.I.
 
  • #12
Thanks haruspex ... appreciate your help ...

Peter
 
  • #13
Math Amateur said:
Thanks haruspex ... appreciate your help ...

Peter
You are welcome, and thanks for bringing up the subject. I had never heard of unital algebras. My pure maths education did not go that far.
 
  • #14
fresh_42 said:
Yes, you're right, that answers the question about the center.

Nevertheless, I found it interesting to ask why we use ##\lambda \cdot v = v \cdot \lambda## in vector spaces without ever mentioning this convention. I looked it up in two different books and it wasn't in there. But I didn't check whether it is needed elsewhere in those books. van der Waerden's and Bourbaki's remarks on the issue at least show that it is not at all self-evident.
And I was still thinking of an earlier thread, in which Bresar used the free positioning of reals in a proof about division algebras over ##\mathbb{R}## (if I remember correctly; I'm not quite sure it was really needed).
fresh_42 ... Thanks so much for all your help on this issue ...

Peter
 
  • #15
Math Amateur said:
fresh_42 ... Thanks so much for all your help on this issue ...

Peter
Hi Peter,
I have another point on the issue. Consider vectors written in some basis, so that we have coordinates. Then we write a vector ##v=(v_1,v_2, \ldots)## and ##\lambda v = (\lambda v_1,\lambda v_2, \ldots)##. The ##v_i## are all elements of the field, and therefore ##\lambda v_i = v_i \lambda## holds, which turns into ##\lambda v = v \lambda## for the entire vector. Hence any different definition of left and right multiplication by scalars would be very problematic.
 
  • #16
fresh_42 said:
Hi Peter,
I have another point on the issue. Consider vectors written in some basis, so that we have coordinates. Then we write a vector ##v=(v_1,v_2, \ldots)## and ##\lambda v = (\lambda v_1,\lambda v_2, \ldots)##. The ##v_i## are all elements of the field, and therefore ##\lambda v_i = v_i \lambda## holds, which turns into ##\lambda v = v \lambda## for the entire vector. Hence any different definition of left and right multiplication by scalars would be very problematic.
Hi fresh_42 ... well! ... most interesting ...

It explains why we don't talk about left and right vector spaces ... based on your analysis, they are both the same ...

Just a point that worried me ...

You write:

" ... ... The ##v_i## are all elements of the field, ... ... "

Rather than being elements of the field ##F##, the ##v_i## seem to me to be elements of the vector space ...

But, anyway, I agree with everything else you wrote ... and it is most illuminating ... thank you ...

Maybe your analysis should be in textbook presentations of vector spaces ... especially those books that are at senior undergraduate and beginning graduate levels ...

Peter
 
  • #17
No, I meant that the ##v_i## are the coordinates.

Let's consider for simplicity a single vector ##\vec{b_1}##. This is a basis vector for a one-dimensional vector space ##V = \mathbb{F}\cdot \vec{b_1}##. Then an arbitrary vector can be written as ##\vec{w}=v_1\cdot \vec{b_1}## with ##v_1 \in \mathbb{F}## being the coordinate with respect to the basis ##\{\vec{b_1}\}##. We usually write ##\vec{w}=(v_1)## for short.

Now let us assume for a moment that ##\lambda \vec{w} \neq \vec{w} \lambda##. Then in our (usual) coordinate notation we would have ##\lambda \vec{w} = (\lambda v_1) \neq (v_1 \lambda) = \vec{w} \lambda##, which is strange, because the numbers ##\lambda## and ##v_1## do commute as elements of ##\mathbb{F}##.

This is not a proof that ##\lambda \vec{w} = \vec{w} \lambda##, but a reason why it makes sense; otherwise we would have to say goodbye to our convenient notation, e.g. ##\vec{u}=(1,3,0,-1) \in \mathbb{R}^4##.
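
A throwaway check of this in Python (the componentwise right product is just a notational stand-in):

Code:
v = [1.0, 3.0, 0.0, -1.0]       # a vector of R^4 in coordinates
lam = 2.5

left = [lam * vi for vi in v]   # the defined scalar action  lambda * v
right = [vi * lam for vi in v]  # componentwise stand-in for  v * lambda

# the coordinates v_i and lambda commute in the field R,
# so both candidate actions give the same vector
assert left == right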
 
  • #18
Thanks for the explanation fresh_42 ...

That really clarified things ...

Thanks again ...

Peter
 

Related to Centre of an Algebra .... and Central Algebras ....

1. What is the centre of an algebra?

The centre of an algebra is the set of elements that commute with all other elements in the algebra. In other words, for any element a in the centre, a*b = b*a for all elements b in the algebra.

2. Why is the centre of an algebra important?

The centre of an algebra is important because it provides a way to identify the elements that behave like scalars. This allows for certain simplifications in calculations and can help in understanding the structure of the algebra.

3. What are central algebras?

A central algebra is one whose centre is as small as possible: a unital algebra ##A## over a field ##F## is central if its centre consists exactly of the scalar multiples of unity, i.e. ##Z(A) = F \cdot 1##. Examples of central algebras include the matrix algebras ##M_n(F)## and the quaternions ##\mathbb{H}## as an algebra over ##\mathbb{R}##.
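
For a concrete computation (a sketch assuming SymPy): solving the commutation equations in ##M_2## shows that its centre is exactly the scalar matrices, so ##M_2## is central.

Code:
from sympy import symbols, Matrix, solve

a, b, c, d = symbols('a b c d')
Z = Matrix([[a, b], [c, d]])  # candidate element of the centre of M_2

# it suffices that Z commutes with the four matrix units E_rs
basis = [Matrix(2, 2, lambda i, j: 1 if (i, j) == (r, s) else 0)
         for r in range(2) for s in range(2)]

eqs = []
for E in basis:
    eqs.extend(list(Z * E - E * Z))  # entrywise commutator equations

print(solve(eqs, [a, b, c, d], dict=True))
# expected: [{b: 0, c: 0, d: a}], i.e. Z = a*I, a scalar multiple of unity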

4. How are central algebras different from other algebras?

In a commutative algebra every element is central, so the centre is the whole algebra; a central algebra sits at the opposite extreme, where only the scalar multiples of unity commute with everything. This minimal centre is what makes central (simple) algebras the natural building blocks in the structure theory of noncommutative algebras.

5. What are some applications of central algebras?

Central simple algebras appear in number theory and representation theory via the Brauer group, in physics through the quaternions and Clifford algebras, and in coding theory and cryptography, where space-time codes are constructed from division algebras.
