Circle composition operator / Jacobson Radical

In summary, the original poster found that a proof of an algebraic identity in physics was simplified by using a "circle composition operation," a@b = a + b - ab. This operation appears in one definition of the Jacobson radical, and the replies explain why the operation and the radical are interesting; the radical is important in the basic theory of the respondent's area.
  • #1
robphy
In attempting to prove an algebraic identity in physics, I found that my proof was simplified in appearance if I used the operation
a@b = a + b - ab

By flipping through a book on ring theory, I found that this is called a "circle composition operation", and it is used in one definition of the Jacobson radical.
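For what it's worth, expanding directly, using nothing beyond the ring axioms, shows that

[tex]1 - (a@b) = 1 - a - b + ab = (1-a)(1-b),[/tex]

so the substitution x -> 1 - x turns this operation into ordinary multiplication, and a@b = 0 exactly when (1-a)(1-b) = 1.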

I'm not a mathematician, and the most advanced formal coursework I have in abstract algebra is some basic group theory.

What is special about this operation? Why is it interesting?

What is special about the Jacobson radical? Why is it interesting?
 
  • #2
The Jacobson radical is the intersection of all the maximal left ideals (equivalently, of all the maximal right ideals). It is often denoted J, and it has the property that R/J is semisimple (for Artinian rings, at least) and is the largest such quotient; such a ring is semisimple precisely when J = 0.

Here are some results:

J is the set of all x such that 1 - axb has a two-sided inverse for all a and b;

for a finitely generated R-module M, J.M = M iff M = 0 (Nakayama's lemma).


There are lots of results about the radical, and it can be seen as the obstruction that prevents the ring from being semisimple (that is, from being a direct sum of full matrix rings over division rings).

For example, take R to be the ring of upper triangular matrices over some field; then J is the set of strictly upper triangular matrices, and R/J is the diagonal matrix algebra.
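A quick numerical sanity check of that example, as a Python/NumPy sketch (the 3x3 size, the random trials, and working over the reals are all just illustrative choices):

[code]
import numpy as np

rng = np.random.default_rng(0)

def upper_triangular():
    # an element of R, the upper triangular 3x3 matrices
    return np.triu(rng.standard_normal((3, 3)))

# x: a strictly upper triangular matrix, i.e. an element of J
x = np.triu(rng.standard_normal((3, 3)), k=1)

# The "1 - axb has a two-sided inverse" criterion: I - a x b is upper triangular
# with 1s on the diagonal, so its determinant is 1 and it is invertible.
# (@ below is NumPy matrix multiplication, not the circle operation above.)
for _ in range(100):
    a, b = upper_triangular(), upper_triangular()
    assert abs(np.linalg.det(np.eye(3) - a @ x @ b) - 1.0) < 1e-9

# J is nilpotent here: any strictly upper triangular 3x3 matrix cubes to zero.
assert np.allclose(np.linalg.matrix_power(x, 3), 0)
print("all checks passed")
[/code]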


Other examples and lemmas are probably a bit too technical, but you might want to know that if b is in J, then 1 - b has a left inverse, and since a@b = a(1-b) + b,

it follows that b being in J implies there is an a such that a@b = 1 + b; a@(1+b) looks as though it might be an interesting thing to calculate.
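Spelling that out: if b is in J then 1 - b is actually invertible, and taking a = (1-b)^{-1} gives

[tex]a@b = a(1-b) + b = (1-b)^{-1}(1-b) + b = 1 + b.[/tex]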

It is also used a lot when thinking about representation theory (of groups) in non-zero characteristic, as the radical of a module (related to the radical of the group ring) is the intersection of the maximal submodules, and the corresponding quotient is the head of the module.
 
  • #3
If Matt cares to reply to this question...

That is an impressive answer. It has got me wondering how much of it is "off the top of your head" vs. looked up in some reference--not that there is anything wrong with doing it the latter way. I'm just curious.
 
  • #4
The results I mention are almost all things off the top of my head, but I did look them up to check. For instance, my first reaction was to do with nilpotency of J, but fortunately my check showed this is only true for Artinian rings. The only result in there I put in because of my looking things up was that 1 - axb result; everything else is in my head, if that's of any interest. But then the radical is important in the basic theory of my area.
 
  • #5
I see.

Thanks for the reply. I find myself wondering, when I read a book on mathematics or physics, just how much of it the author pounds out on the keyboard without having to look at any other resources he or she may have lying around the office. As an example, Steven Weinberg surely knows the results of Schur's Lemma on irreducible representations, but when he shows how the lemma is derived (and I don't know if Weinberg has ever bothered to do this, so I am pretending here for the sake of example), would he have to look it up in order to get it just right?
 
  • #6
Look it up? Of course not; Schur's lemma is in the 'bleeding obvious' category of proofs. Basically, once you know the idea behind the proof (that eigenspaces are invariant) there is nothing to that particular result. The gist of the proof is often all that you carry in your head; usually the details are obvious once you think about it. Some results, though, are complicated by the hypotheses one uses implicitly. Schur's Lemma of course requires a field with sufficiently many roots of unity in it, and you explicitly use that in the proof, hence the classification as bleeding obvious. Some proofs are certainly not carried in one's head, principally because the proof itself is not a useful one. I class Sylow's theorems in this category.
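Spelled out, the gist is this: for a finite-dimensional simple module S over an algebraically closed field, a module map [tex]\phi : S \to S[/tex] has an eigenvalue [tex]\lambda[/tex]; then [tex]\ker(\phi - \lambda)[/tex] is a nonzero submodule of S, hence all of S, so [tex]\phi = \lambda \cdot \mathrm{id}[/tex].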

Lots of other results are far too long to remember, even though, of course, they ought to be memorized in an ideal world.

Gowers said that he remembered some particular proof as: assign colours at random and do the obvious thing. That was a graph theory result. That seems a reasonable way of knowing when you've got a good grasp on it.
 
  • #7
"Schur's Lemma of course requires a field with sufficiently many roots of unity in it" - Matt Grime

Well, to a mathematics guy like you there is an "of course," but I wonder how many high-energy physicists see it that way! Even a Nobel laureate like Weinberg?
 
  • #8
matt grime said:
The results I mention are almost all things off the top of my head, but I did look them up to check.

Thanks for your response to my question. When I have more time, I'll go back and try to digest what you wrote.

Could you reveal your source and suggest other references on this topic?
 
  • #9
Maybe one of you two guys can tell me if I am being stupid. I have been reading a book on field theory. The author makes the claim that it is an "important identity" that

det M = exp(Tr ln M).

He puts no conditions on the matrix M, though of course it has to be square to have a determinant and a trace. For the two by two identity matrix,

det I = 1 and exp(Tr ln I) = exp(0+0) = 1,

so this works out, at least if I gloss over the fact that the log of the off-diagonal zeroes blows up.

Now consider the 2x2 matrix with all four entries equal to 1.

det M = 0 and exp(Tr ln M) = exp(0+0) = 1, not 0.

What is going on here?
 
  • #10
I can safely say that the book I checked the facts in is not suitable for the general reader (this is by common consent - it is a useful book that a lot of people use to look things up, but not one to learn from). You might try Jacobson's Algebra book (as opposed to Jacobson's radical).

As for the 'of course' comment - what does Schur's Lemma require? For you to pull out arbitrary eigenvalues of a matrix, hence some algebraic-closure-type requirement. The highfalutin people would point out that Hom(S,S) is a finite-dimensional division algebra over the field, and over an algebraically closed field these are all just the field itself, hence Hom(S,S) = k for S a simple module and k the underlying field.

If it's written by a physicist then perhaps you want to check all the things in that identity. ln is presumably a formal power series (log, to a mathematician), and as such wouldn't converge at the identity anyway.

I can see the idea behind the result: imagine M were invertible and diagonalizable as diag(x, y, z, ..., w);

then log(M) is diag(log x, log y, log z, ..., log w), so Tr log(M) is the sum of these, and its exp is the product of the eigenvalues x*y*z*...*w = det(M).

As he's a physicist talking about field theory, there's a chance that he's talking about unitary matrices, which are all diagonalizable (and none of their eigenvalues are zero).
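A quick numerical illustration of that argument, as a sketch in Python (NumPy plus SciPy's matrix logarithm; the particular 2x2 matrices are just illustrative choices):

[code]
import numpy as np
from scipy.linalg import logm

# An invertible, diagonalizable matrix with eigenvalues 2 and 3:
M = np.array([[2.0, 1.0],
              [0.0, 3.0]])

lhs = np.linalg.det(M)                # 6.0
rhs = np.exp(np.trace(logm(M)))       # exp(log 2 + log 3) = 6.0
assert np.isclose(lhs, rhs)

# The all-ones matrix is singular (eigenvalues 2 and 0, in some order).
# Since exp of any matrix is invertible, a singular matrix has no logarithm,
# so the identity det M = exp(Tr ln M) simply does not apply to it.
S = np.ones((2, 2))
print(np.linalg.det(S), np.linalg.eigvals(S))
[/code]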
 
  • #11
Matt as always goes the extra mile, and it is appreciated.

The author does an integral for the amplitude of a certain process, and it evaluates to a constant term times the reciprocal square root of the determinant of [tex]\partial^2 + m^2[/tex]. He then uses the identity to convert this to a constant term times exp[-(1/2)Tr ln ([tex]\partial^2 + m^2[/tex])].

The integral sign as written in the book is a simple, single, indefinite-type integral sign, but I am pretty sure it is shorthand for a quadruple integral evaluated over a block of spacetime. In accordance with that, it is probably understood that the d'Alembertian operator is to be expressed in Cartesian coordinates and multiplied by an implied 4x4 identity matrix, and that the squared-mass scalar is likewise multiplied by an implied 4x4 identity matrix. The product of a scalar with the identity matrix is diagonal, so I suppose the author doesn't need his "important identity" to extend beyond that case. Of course, there is my issue of "blowing up" due to the log of zero, which may be what you are getting at when you say it "wouldn't converge at the identity anyway."
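Granting det M = exp(Tr ln M) for M = [tex]\partial^2 + m^2[/tex] (with whatever regularization the book has in mind), the conversion itself is just

[tex](\det M)^{-1/2} = \exp\!\left(-\tfrac{1}{2}\,\mathrm{Tr}\,\ln M\right).[/tex]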
 

Related to Circle composition operator / Jacobson Radical

1. What is the circle composition operator?

In the sense used in this thread, the circle composition operator is the ring operation a ∘ b = a + b - ab (written a@b above), sometimes called quasi-multiplication. The same symbol "∘" is also used for function composition, f ∘ g = f(g(x)), but that is a different operation and is not the one at issue here.

2. How is the circle composition operator used in mathematics?

Under circle composition a ring becomes a monoid with identity element 0 (a ∘ 0 = 0 ∘ a = a, and the operation is associative). An element a is called quasi-regular when it has an inverse for this operation, i.e. when a ∘ b = b ∘ a = 0 for some b, and quasi-regularity is exactly the notion used in one definition of the Jacobson radical.
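A minimal sketch of those two facts in Python, with plain integers standing in for ring elements (the sample range is an arbitrary illustrative choice):

[code]
# circle composition: a∘b = a + b - a*b
def circ(a, b):
    return a + b - a * b

vals = range(-3, 4)

# 0 is a two-sided identity for the operation
assert all(circ(a, 0) == a and circ(0, a) == a for a in vals)

# the operation is associative (checked on these samples)
assert all(circ(circ(a, b), c) == circ(a, circ(b, c))
           for a in vals for b in vals for c in vals)

print("0 is an identity and circle composition is associative on the samples")
[/code]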

3. What is the Jacobson Radical?

The Jacobson radical, denoted J(R), is the intersection of all the maximal left ideals of a ring (equivalently, of all the maximal right ideals; for a commutative ring, of all the maximal ideals). It measures how far the ring is from being semisimple, and it is named after the mathematician Nathan Jacobson.

4. How is the Jacobson Radical related to the circle composition operator?

An element x of a ring is called quasi-regular if it has an inverse under circle composition, that is, if x ∘ y = y ∘ x = 0 for some y. The Jacobson radical can be characterized as the largest ideal all of whose elements are quasi-regular; equivalently, x lies in J(R) exactly when ax is quasi-regular for every a in the ring.
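As a concrete check of quasi-regularity (a standard example, valid in any ring): nilpotent elements are quasi-regular. If [tex]x^n = 0[/tex] and [tex]y = -(x + x^2 + \cdots + x^{n-1})[/tex], then

[tex]x \circ y = x + y - xy = -(x^2 + \cdots + x^{n-1}) + (x^2 + \cdots + x^{n-1} + x^n) = x^n = 0,[/tex]

and the same calculation gives [tex]y \circ x = 0[/tex].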

5. What is the significance of the Jacobson Radical in mathematics?

The Jacobson radical is an important concept in abstract algebra because it measures the failure of a ring to be semisimple: for an Artinian ring R, R/J(R) is the largest semisimple quotient. It is closely tied to notions such as semisimplicity and to results such as the Jacobson density theorem, and it has applications in areas such as representation theory and algebraic geometry.
