Group Theory Basics: Where Can I Learn More?

In summary, Group Theory is a fundamental mathematical subject with many applications in physics, particularly in the study of symmetry and patterns. It involves the study of groups, which are sets of elements that follow certain rules and properties when combined. Some good resources for understanding Group Theory include the books "Groups and Symmetry" by M.A. Armstrong, "An Introduction to the Theory of Groups" by J. Rotman, and "Group Theory: An Intuitive Approach" by R. Mirman. Online resources are also available, such as the website http://www.cns.gatech.edu/GroupTheory/index.html, which provides a free introductory book on Group Theory. The idea of an entry-level workshop on groups has also been proposed.
  • #36
Originally posted by Hurkyl

A Lie Algebra is simply a vector space A over a field F equipped with a bilinear operator [,] on A that satisfies [x, x] = 0 and the jacobi identity:

[[x, y], z] + [[y, z], x] + [[z, x], y] = 0


(If F does not have characteristic 2, [x, x] = 0 is a consequence of the Jacobi identity and may be dropped as an axiom)

Whoa! Is that true? I'm not so sure. I think what you want to say here is:
[x,y] = -[y,x] is a consequence of [x,x] = 0, and if the field does not have characteristic 2, then [x,y] = -[y,x] implies [x,x] = 0, but not in fields with characteristic 2, so we drop that as an axiom.
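(Spelling out the standard expansion: bilinearity gives

0 = [x+y, x+y] = [x,x] + [x,y] + [y,x] + [y,y] = [x,y] + [y,x],

so [x,y] = -[y,x] follows from [x,x] = 0 over any field. Conversely, putting y = x in [x,y] = -[y,x] only gives 2[x,x] = 0, which forces [x,x] = 0 precisely when the characteristic is not 2.)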


I would like to point out that [x, y] is not defined by:

[x, y] = xy - yx

(or various similar definitions); it is merely a bilinear form that satisfies the Jacobi identity and [x, x] = 0.


However, for any associative algebra A, one may define the lie algebra A- by defining the lie bracket as the commutator.


An example where [,] is not a commutator is (if I've done my arithmetic correctly) the real vector space R^3 where [x, y] = x * y, where * is the vector cross product.

other examples include the Poisson bracket and the Lie bracket (well, the Lie bracket does turn out to be a commutator, but it is certainly not defined that way).
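To go with the cross-product example above, a quick numerical check (a minimal NumPy sketch) that the cross-product bracket really satisfies [x,x] = 0 and the Jacobi identity:

Code:
import numpy as np

rng = np.random.default_rng(0)
x, y, z = rng.standard_normal((3, 3))   # three random vectors in R^3

# [x,x] = 0 for the cross-product bracket
print(np.allclose(np.cross(x, x), 0))

# Jacobi identity: [[x,y],z] + [[y,z],x] + [[z,x],y] = 0
J = (np.cross(np.cross(x, y), z)
     + np.cross(np.cross(y, z), x)
     + np.cross(np.cross(z, x), y))
print(np.allclose(J, 0))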
 
  • #37
You want a complete synthetic definition of a Lie algebra? Here it is: A Lie algebra L is a pair (V,t) formed by an F-module (F being a commutative ring) and an alternating tensor t of mixed type (2,1) satisfying the Jacobi identity.

A special case is F a field.
 
  • #38
Originally posted by Hurkyl
I would like to point out that [x, y] is not defined by:
[x, y] = xy - yx

For the so-called abstract Lie algebras only the bracket [,] by itself has a meaning, but it can be proven that for any (finite dimensional) Lie algebra L we can find a vector space V such that the elements of L are linear transformations of V, so that the formula above holds (that is, we can always find a faithful linear representation of L). For those interested in details, this is known as the theorem of Ado (1945).
 
  • #39
Originally posted by Tom
A Lie algebra is a nonAbelian algebra whose elements a_i satisfy the following properties:

1. [a_i, a_i] = 0 (a_i commutes with itself.)
2. [a_j + a_k, a_i] = [a_j, a_i] + [a_k, a_i] (Linearity of the commutator.)
3. [a_i, [a_j, a_k]] + [a_j, [a_k, a_i]] + [a_k, [a_i, a_j]] = 0 (Jacobi identity.)

Just a comment: you can drop the word nonabelian, since any vector space is an abelian Lie algebra simply by taking the zero bracket. Indeed, abelian algebras play a fundamental role in the theory (see for example the Cartan subalgebras). You also have to require bilinearity, otherwise the result is not necessarily a Lie algebra. The (local!) relation with the Lie groups is expressed by the Campbell-Hausdorff formula (exponentiation of elements). But this is delicate, since not every element of the Lie group need be the exp of some element of the Lie algebra (the example is well known to you!).
 
  • #40
Finally, you can also use operators to construct Lie algebras. If you take Hermitian conjugate operators B, B* (in an infinite dimensional space) with the rule [B,B*] = BB* - B*B, you obtain the Heisenberg Lie algebra, which is the basis of all classical analysis of harmonic oscillators and gave rise to the boson formalism used by Schwinger, Holstein and Primakoff in the 40's to analyze angular momentum.
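A numerical illustration (a minimal NumPy sketch; the finite truncation to N dimensions is my own artifact of the demonstration, since the true Heisenberg algebra lives in infinite dimensions):

Code:
import numpy as np

# Truncated boson ladder operators on an N-dimensional space.
N = 6
B = np.diag(np.sqrt(np.arange(1, N)), k=1)   # annihilation operator
Bstar = B.T                                   # creation operator (real entries)

comm = B @ Bstar - Bstar @ B
print(np.round(comm, 10))
# The commutator is the identity except in the last diagonal entry
# (here 1 - N = -5), which is purely an artifact of the truncation;
# in infinite dimensions [B, B*] is exactly the identity.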
 
  • #41
Originally posted by rutwig
But this is delicate, since not all elements of the Lie group must be the exp of some element of the Lie algebra (the example is well known to you!).

What? This is not known to me. I thought that any element of the Lie group could indeed be obtained by exponentiation of the Lie algebra. What is the example?
 
  • #42
Originally posted by lethe
What? This is not known to me. I thought that any element of the Lie group could indeed be obtained by exponentiation of the Lie algebra. What is the example?

If you have worked only with compact groups, then you will not have observed this; for a compact connected group, any element is the exponential of some element in the Lie algebra. But for noncompact groups this is no longer true, and we have to take products of the exponentials of finitely many elements of the Lie algebra to recover all the elements of the group.

Example: show that the element

( -a    0   )
(  0  -1/a  )

of SL(2,R) cannot be expressed as the exponential of any single element X of the Lie algebra sl(2,R) if a is different from 1.
 
  • #43
Originally posted by rutwig
If you have worked only with compact groups, then you will not have observed this; for a compact connected group, any element is the exponential of some element in the Lie algebra. But for noncompact groups this is no longer true, and we have to take products of the exponentials of finitely many elements of the Lie algebra to recover all the elements of the group.

Example: show that the element

( -a    0   )
(  0  -1/a  )

of SL(2,R) cannot be expressed as the exponential of any single element X of the Lie algebra sl(2,R) if a is different from 1.

OK, so let me see. The Lie algebra of SL(2,R) is just the set of real 2×2 traceless matrices, right? A basis for this algebra is:

( 1   0 )   ( 0   1 )   ( 0   0 )
( 0  -1 ),  ( 0   0 ),  ( 1   0 )

right? Obviously the matrix you mentioned has to be constructed from the first basis element.

what about exp( ln(a) X ), where X is that first basis element?

no wait, that will give me only

( a    0  )
( 0   1/a )

obviously, I'll never be able to get negative numbers by exponentiating these matrices, so it's impossible, as you say. Why is that? What does this have to do with compactness?
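Here is a numerical version of that check (a minimal Python/scipy sketch; the choice a = 2 is just for illustration):

Code:
import numpy as np
from scipy.linalg import expm, logm

X = np.array([[1.0, 0.0], [0.0, -1.0]])   # first basis element of sl(2,R)

a = 2.0
print(expm(np.log(a) * X))                 # diag(a, 1/a): entries stay positive

# The principal matrix logarithm of diag(-a, -1/a) is not real:
M = np.array([[-a, 0.0], [0.0, -1.0/a]])
L = logm(M)
print(L)                                   # complex entries appear
print(np.trace(L))                         # 2*pi*i, not 0
# consistent with the claim that M is not exp(X) for any real
# traceless X when a != 1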
 
  • #44
The result is not entirely obvious, but compactness ensures some properties (like the existence of an invariant integration)
that are not available otherwise. For this case, the key is that for compact (connected) groups any element is conjugate to
an element in a maximal torus (an analytic subgroup corresponding to an abelian subalgebra of the Lie algebra).
 
  • #45
These are notes taken from Marsden. Some of the proofs have been omitted, but are available in Marsden's text.

The Real General Linear Group

GL(n,R) = {A in R^(n×n) : det(A) ≠ 0}
GL+(n,R) = {A in R^(n×n) : det(A) > 0}
GL-(n,R) = {A in R^(n×n) : det(A) < 0}

where R is the set of real numbers, and R^(n×n) is the set of real n×n matrices.

GL+(n,R) is the connected component of the identity in GL(n,R), and GL(n,R) has exactly two connected components. Marsden proves this using the real polar decomposition theorem. Following the proof, the conclusion below is reached.

The real general linear group is a non-compact disconnected n^2-dimensional Lie group whose Lie algebra consists of the set of all n×n matrices with the bracket [A,B] = AB - BA.


The Special Linear Group

SL(n,R) = {A in GL(n,R) : det(A) = 1}

R\{0} is a group under multiplication, and det:GL(n,R)->R\{0} is a Lie group homomorphism since det(AB) = det(A)det(B).

The Lie algebra of SL(n,R) consists of the set of n×n matrices with trace 0 and bracket [A,B] = AB - BA.

Since trace B = 0 imposes one condition, dim[sl(n,R)] = n^2 - 1, where sl(n,R) is the Lie algebra of SL(n,R).

It is useful to introduce on the Lie algebra gl(n,R) of GL(n,R) the inner product <A,B> = trace(AB^T). Note that ||A||^2 = Σ_{i,j=1}^{n} (a_ij)^2, which shows that this norm on gl(n,R) coincides with the Euclidean norm on R^(n^2). This norm can be used to show that SL(n,R) is not compact.

Let v_1 = (1,0,...,0), v_2 = (0,1,...,0), ..., v_{n-1} = (0,...,1,0) and v_n = (t,0,...,1), where all v_i are members of R^n.

Let A = (v_1, v_2, ..., v_n) be a matrix in R^(n×n). All matrices of this form are elements of SL(n,R), whose norm is equal to √(n + t^2) for all t in R. SL(n,R) is not a bounded subset of gl(n,R), and hence SL(n,R) is not compact. SL(n,R) is also connected, but the proof has been left to Marsden due to space constraints.
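A quick numerical sketch of this unboundedness argument (NumPy; the helper name shear and the choice n = 3 are illustrative assumptions, not from Marsden):

Code:
import numpy as np

def shear(n, t):
    # Identity matrix with t in the top-right corner: the matrix
    # (v_1, ..., v_n) described above, an element of SL(n,R).
    A = np.eye(n)
    A[0, -1] = t
    return A

for t in [0.0, 10.0, 1000.0]:
    A = shear(3, t)
    # det stays 1 while the Frobenius norm sqrt(n + t^2) grows without bound
    print(np.linalg.det(A), np.linalg.norm(A))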

The section concludes with the following propostition:

The Lie group SL(n,R) is a non-compact connected (n^2 - 1)-dimensional Lie group whose Lie algebra sl(n,R) consists of the n×n matrices with trace 0 and bracket [A,B] = AB - BA.

Apologies for any typos/inaccuracies. I'm pretty new to this stuff.
 
  • #46
I could not find any typos/inaccuracies
except for the one typo on an unintended extra t in "proposition"
"The section concludes with the following propostition:"
And I could not even find any instance of lack of clarity.
This is great. We might even have a Lie groups workshop
going. If the others, like Chroot and Hurkyl, keep in touch.

I am wondering what the others think would be good to do now.

One could look at what you just said and try to
say intuitively why those things are, in fact, the Lie algebra
of GL(n,R). How gl(n,R) really does correspond to infinitesimal transformations around the identity----and how it is the tangent space at the point on GL(n, R) which is the identity matrix.
Or one could do the same kind of concrete case investigation for SL(n, R) and its Lie algebra. I mean, try out and verify a few special cases and get our hands dirty.

Or, alternatively, we could move on to some more matrix groups like O(n) and SO(n), or begin looking at their complex cousins.

Or, if enough people were involved, we could go in both directions at once. Some could proceed to describe the other classic Lie groups and their algebras while others, like me, cogitate about the very simplest examples.

Let's see what happens. I hope something does.
 
  • #47
exp(A) the exponential function of a matrix

In a previous post I was going through a section of Marsden
(pages 283-292, part of chapter 9), and it mentioned
the exponential function defined by the power series

exp(x) = 1 + x + x^2/2! + ...

and gave a case where you plug a matrix A in for x
and get a matrix exp(A)

this has always seemed to me like a cool thing to do
and I see it as illustrating a kind of umbilical connection between Lie algebra and Lie group.

The algebra element A is what gets plugged into exp() to give exp(A) which is in the group.

Or in more cagey differential geometry style----exp(tA) for t running from 0 to 1 gives you a curve down in the Lie group (the manifold) which starts out at the identity point and travels along in the manifold and reaches exp(A) as its destination. Indeed exp(tA) for t running forever gives a one-dimensional subgroup--but this is a bit too abstract for this time of morning.

What I always think is so great is that if A is a 3×3 skew-symmetric
matrix, meaning A^T = -A,
then plugging A into that good old exp() power series gives a rotation matrix, an element of the SO(3) Lie group.

More wonderful still, exp(A) is the rotation by exactly |v| radians about the vector v = (v_1, v_2, v_3) as axis, where A is
given by

(  0    -v_3    v_2 )
(  v_3    0    -v_1 )
( -v_2   v_1     0  )

any skew-symmetric matrix would have such a form for some
v_1, v_2, v_3

And we may be able to convince ourselves of this, or prove it a bit, without much effort, just by looking at the power series in A.

If I stands for the identity matrix,

B = exp(A) = I + A + A^2/2! + ...

Now consider that since A^T = -A, we can take the transpose of this whole power series and it will be as if we put a minus sign in front of A.

B^T = exp(A)^T = exp(-A)

But multiplying exp(x) and exp(-x) always gives the identity. When you multiply the two power series there is a bunch of cancellation and it boils down to the identity. So exp(-A) is the matrix INVERSE of exp(A).

B^T = exp(A)^T = exp(-A) = exp(A)^(-1) = B^(-1)

B^T = B^(-1) means that B is orthogonal


BTW one reason to think about the path exp(tA) from the identity to the endpoint exp(A) is to see clearly that exp(A) is in the same connected component of the group as the identity. O(3) is split into two pieces, one with det = 1 and one with det = -1.

The latter kind turn your shirt inside out as well as rotating it, so they are bad mothers and it is generally safer to work with the det = 1 kind which are called "special" or SO(3).

this curve going t = 0 to 1 shows that exp(A) is in the same connected component as the identity, because how could the curve ever leap the chasm between the two components?
So it shows det exp(A) = 1. But that is just mathematical monkeyshines, of course the determinant is one! :wink:

All this stuff can be written with an n sitting in for 3, but
as an inveterate skeptic I often suspect that
dimensions higher than 3 do not exist and prefer to write 3 instead of n. It looks, somehow, more definite and specific that way.

We should check out the elementary fact that [A,B] works with
skew sym matrices A and B! Why not! Maybe later today, unless someone else has already done it.

I will bring along this earlier post with an extract from pages 289-291 of the book
**************************************
SO(3) is a compact Lie group of dimension 3.

Its Lie algebra so(3) is the space of real skew-symmetric 3×3 matrices
with bracket [A,B] = AB - BA.

The Lie algebra so(3) can be identified with R^3,
the 3-tuples of real numbers, by a vector space isomorphism
called the "hat map":

v = (v_1, v_2, v_3) goes to v-hat, which is a skew-symmetric matrix,
meaning its transpose is its NEGATIVE, and you just stash the three numbers into such a matrix like:

(  0    -v_3    v_2 )
(  v_3    0    -v_1 )
( -v_2   v_1     0  )

v-hat is a matrix; apply it to any vector w and
you get v × w.

Everybody in freshman year got to play with v × w,
the cross product of real 3D vectors,
and R^3 with ordinary vector addition and cross product v × w is kind of the ancestral Lie algebra from whence all the others came.

And the hat-map is a Lie algebra isomorphism

EULER'S THEOREM

Every element A in SO(3) not equal to the identity is a rotation
through an angle φ about an axis w.

SO SO(3) IS JUST THE WAYS YOU CAN TURN A BALL---it is the group of rotations

THE EIGENVALUE LEMMA is that if A is in SO(3) one of its
eigenvalues has to be equal to 1.
The proof is just to look at the characteristic polynomial which is of degree three and consider cases.

Proof of Euler is just to look at the eigenvector with eigenvalue one----pssst! it is the axis of the rotation. Marsden takes three sentences to prove it.

A CANONICAL MATRIX FORM to write elements of SO(3) in
is

( 1      0        0    )
( 0    cos φ   -sin φ  )
( 0    sin φ    cos φ  )

EXPONENTIAL MAP
Let t be a number and w be a vector in R^3.
Let |w| be the norm of w (the square root of the sum of the squares).
Let w^ be w-hat, the hat-map image of w in so(3), the Lie algebra. Then:

exp(tw^) is a rotation about axis w by angle t|w|

It is just a recipe to cook up a matrix giving any amount of rotation around any axis you want.
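Here is a minimal numerical sketch of that recipe (Python with scipy.linalg.expm; the function name hat and the sample vector are my own illustrative choices):

Code:
import numpy as np
from scipy.linalg import expm

def hat(v):
    # Hat map R^3 -> so(3): hat(v) @ w equals the cross product v x w.
    return np.array([[ 0.0,  -v[2],  v[1]],
                     [ v[2],  0.0,  -v[0]],
                     [-v[1],  v[0],  0.0]])

v = np.array([0.3, -0.4, 1.2])
R = expm(hat(v))

print(np.allclose(R.T @ R, np.eye(3)))      # R is orthogonal
print(np.linalg.det(R))                      # det = 1, so R is in SO(3)
print(np.allclose(R @ v, v))                 # the axis v is left fixed

# Rotation angle from trace(R) = 1 + 2 cos(angle): it equals |v|
angle = np.arccos((np.trace(R) - 1) / 2)
print(angle, np.linalg.norm(v))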
 
  • #48
routine checks

sometimes just doing the routine checks is a good way to
get used to something. In the last post I was talking about
so(3), the skew-symmetric matrices that are the Lie algebra of SO(3), the rotations, and I said


"We should check out the elementary fact that [A,B] works with
skew sym matrices A and B! Why not! Maybe later today, unless someone else has already done it."

What I mean is just verify the extremely simple fact that
if you have skew-symmetric A, B then the bracket [A,B] is also skew-symmetric!

And there is also the dreaded Jacobi identity to verify, namely

[[A,B], C] + [[B,C], A] + [[C,A], B] = 0

this terrible formula can only be verified by those who have memorized the alphabet, at least up to C, and
in our culture very young children are made to recite the alphabet to ensure that when they reach maturity they will be able to
verify the Jacobi identity.

It is, you may have noticed, the main axiom of an abstract Lie algebra.

There are sort of two wrong approaches to anything, (1) purely axiomatic and (2) bloody-minded practical----really you have to do both; if one is learning about concrete examples one should occasionally look around and verify that they satisfy the axioms too.
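For anyone who would rather let the computer do the reciting, here is a quick numerical check of both facts (a minimal NumPy sketch using random skew-symmetric matrices):

Code:
import numpy as np

rng = np.random.default_rng(1)

def random_skew(rng, n=3):
    X = rng.standard_normal((n, n))
    return X - X.T                 # a matrix minus its transpose is skew-symmetric

def br(X, Y):
    return X @ Y - Y @ X           # the commutator bracket

A, B, C = (random_skew(rng) for _ in range(3))

print(np.allclose(br(A, B), -br(A, B).T))   # the bracket is again skew-symmetric
J = br(br(A, B), C) + br(br(B, C), A) + br(br(C, A), B)
print(np.allclose(J, 0))                     # the Jacobi identity holds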
 
  • #49
Originally posted by lethe
Whoa! Is that true? I'm not so sure. I think what you want to say here is:
[x,y] = -[y,x] is a consequence of [x,x] = 0, and if the field does not have characteristic 2, then [x,y] = -[y,x] implies [x,x] = 0, but not in fields with characteristic 2, so we drop that as an axiom.

Yes, that's essentially what I meant to say.
 
  • #50
I've got a proof that the bracket of two three-dimensional skew-symmetric matrices produces another three-dimensional skew-symmetric matrix, since that's our focus, but it's pretty simplistic and inelegant. Maybe someone has a better one.

Let A be defined as
Code:
(0 -a -b)
(a  0 -c)
(b  c  0)
Let B be defined as
Code:
(0 -d -e)
(d  0 -f)
(e  f  0)
Let g = -(ad+be), h = -(ad+cf), i = -(be+cf)

Then AB is
Code:
( g  -bf af)
(-ce  h -ae)
( cd -bd  i)
And BA is
Code:
( g  -ce cd)
(-bf  h -bd)
( af -ae  i)
So, [A,B]=AB-BA=
Code:
(0      ce-bf af-cd)
(bf-ce  0     bd-ae)
(cd-af  ae-bd 0    )
Which is again a skew-symmetric matrix.

EDIT: Took chroot's advice.
 
  • #51
The best way to render matrices here is to put them in a [ code ][ /code ] container, which preserves spacing:
Code:
( 0       ce-bf    af-cd )
( bf-ce     0      bd-ae )
( cd-af   ae-bd      0   )
- Warren
 
  • #52
For A and B skew symmetric matrices:

(AB - BA)^t = (AB)^t - (BA)^t
= B^t A^t - A^t B^t
= (-B)(-A) - (-A)(-B)
= BA - AB
= -(AB - BA)

So the commutator of any two skew symmetric matrices is again skew symmetric.

In general, for any involution *:

[A, B]* = (AB-BA)*
= (AB)* - (BA)*
= B*A* - A*B*
= [B*, A*]

where [,] is the commutator

edit: fixed a formatting error
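A quick numerical check of that involution identity, taking * to be the conjugate transpose (a minimal NumPy sketch; the random complex matrices are arbitrary):

Code:
import numpy as np

rng = np.random.default_rng(2)
A = rng.standard_normal((3, 3)) + 1j * rng.standard_normal((3, 3))
B = rng.standard_normal((3, 3)) + 1j * rng.standard_normal((3, 3))

def br(X, Y):
    return X @ Y - Y @ X          # the commutator bracket

def star(X):
    return X.conj().T             # conjugate transpose, one example of an involution

print(np.allclose(star(br(A, B)), br(star(B), star(A))))   # [A,B]* = [B*,A*]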
 
  • #53
two good things just happened.
Lonewolf, who is new to groups (background = one course in linear algebra), tackled it and proved it down-in-the-mud,
and then Hurkyl proved it elegantly as a special
case of a more general fact that would include
the complex case of skew-Hermitian, where you
take the transpose and then the complex conjugate of the matrix entries.
Cannot restrain a broad grin,
because both the dirty-hands approach and the elegant one
are indispensable.
great


Originally posted by Hurkyl
For A and B skew symmetric matrices:

(AB - BA)^t = (AB)^t - (BA)^t
= B^t A^t - A^t B^t
= (-B)(-A) - (-A)(-B)
= BA - AB
= -(AB - BA)

So the commutator of any two skew symmetric matrices is again skew symmetric.

In general, for any involution *:

[A, B]* = (AB-BA)*
= (AB)* - (BA)*
= B*A* - A*B*
= [B*, A*]

where [,] is the commutator
 
  • #54
you know, this thread is turning into a pretty nice Lie group/Lie algebra thread. There is the differential forms thread. Now all we need is for someone to start a representation theory thread, and we'll have all the maths we need to do modern particle physics.

who wants to volunteer?
 
  • #55
I would absolutely love a rep theory thread -- especially if we could include both the down-n-dirty and the high-level approaches. I'm reasonably competent to talk about Lie groups, but I am lost on representations.

- Warren
 
  • #56
Originally posted by chroot
I would absolutely love a rep theory thread -- especially if we could include both the down-n-dirty and the high-level approaches. I'm reasonably competent to talk about Lie groups, but I am lost on representations.

- Warren

I'm down for the high-level part.
 
  • #57
Sure, I'll have a go at representation theory. Even if I don't understand it all, I'm sure I'll get something out of it.
 
  • #58
Originally posted by Lonewolf
Sure, I'll have a go at representation theory. Even if I don't understand it all, I'm sure I'll get something out of it.

lonewolf-

how much maths do you know? I don't think representation theory is all that hard. Hang in there, I'm sure we can get through it.
 
  • #59
I've covered the basics of group theory, and completed a course in linear algebra to be concluded next academic year. I'm pretty comfortable with the prerequisites you listed in the other thread. I'm willing to learn and I've got four months to fill, so I'm prepared to put some time in.
 
  • #60
Originally posted by Lonewolf
I've covered the basics of group theory, and completed a course in linear algebra to be concluded next academic year. I'm pretty comfortable with the prerequisites you listed in the other thread. I'm willing to learn and I've got four months to fill, so I'm prepared to put some time in.
dat s good to hear!
 
  • #61
would it work to have one inclusive study group

I see several people are interested in group representations
and I'm thinking maybe we can just follow our interests.

I don't remember being part of an online study group and
don't have much idea of what works and what doesn't.

I propose chroot to be our nominal emcee or leader if we need one. But I don't care if we have a leader or are complete anarchy. And if somebody else is leader that is fine too.

Lonewolf defines the prerequisites, as I see it----one course in linear algebra and some time and willingness to work.

Why don't we see if we can get to some target in, say, the representation of some classic Lie group.

Maybe we will run out of gas halfway, but anyway we will have a destination.

What say this for a target-----classify the irreducible representations of SU(2). Can we get there from scratch?
Start with basic definitions and try to touch all the essential bases on the way?

I mention it because that target is highly visible. Maybe Hurkyl, Chroot, or Lethe can suggest a more practical goal.
Having some goal will determine for us what things we have to cover, so we won't have to decide anything.

It might not matter what order we do things either.
Lethe for example could probably say right now what all the irred. reps of SU(2) are (up to isomorphism)

oops have to go

 
  • #62


Originally posted by marcus

What say this for a target-----classify the irreducible representations of SU(2). Can we get there from scratch?
Start with basic definitions and try to touch all the essential bases on the way?

I mention it because that target is highly visible. Maybe Hurkyl, Chroot, or Lethe can suggest a more practical goal.
Having some goal will determine for us what things we have to cover, so we won't have to decide anything.

It might not matter what order we do things either.
Lethe for example could probably say right now what all the irred. reps of SU(2) are (up to isomorphism)

a slightly more ambitious goal, that I would like to suggest, is the Lorentz group SL(2,C)/Z_2. It includes the rotation group as a subgroup (and thus includes all the concepts of SU(2), which would probably be a very good starting place), but it has a less trivial algebra, it is noncompact, so we can address those issues, and it is not simply connected, so we can also address those issues.

perhaps this is too ambitious. at any rate, SU(2) is a good starting point, and if that ends up being where we finish too, so be it.
 
  • #63
I too think going for the representations of SU(2) and SO(3) would be a good first goal, if only because of the importance of those groups in physics. In any case, that's the first goal I had set myself after I read that LQG primer. :smile:
 
  • #64
Originally posted by Hurkyl
I too think going for the representations of SU(2) and SO(3) would be a good first goal, if only because of the importance of those groups in physics. In any case, that's the first goal I had set myself after I read that LQG primer. :smile:

Two online books have been mentioned.

Hurkyl I believe you indicated you were using Brian Hall
("An Elementary Introduction to Groups and Reps.")

That is 128 pages and focuses on matrix groups so it works
with a lot of concrete relevant examples. I really like it.

Earlier I was talking about Marsden's Chapter 9, and Lonewolf
extracted some stuff from that source and posted his notes;
I essentially did likewise with another patch of Marsden.

It would be helpful if we all had one online textbook to focus on.

I now think Brian Hall (your preference) is better adapted to people's interests and that maybe I goofed when I suggested Marsden.

I regret possibly causing people to waste time and printer paper printing off that long Chapter 9. I'm personally glad to have it for reference though, not the end of the world. But Brian Hall on balance seems better.

Let's see what theorems he needs to get the representations of SU(2). I mean---work backwards and figure out a route.

Brian Hall's chapter 3, especially pp 27-37, seems to me to be grand central station.

chapter 3 is "Lie algebras and the exponential mapping"

He shows how to find the *logarithm* of a matrix
and he proves the useful formula

det exp(A) = exp( trace(A) )

and he proves the "Lie product formula"

and I can honestly say to Lonewolf that there is nothing scary here----nothing (that I can see with my admittedly foggy vision) that is fundamentally hard

(except at one point he uses the Jordan canonical form of a matrix----the fact that you can put it in a specially nice upper triangular form---which is a bit tedious to prove, so nobody ever does; they just invoke it. Just one small snag or catch which we need not belabor)

It seems to me that to get where we want to go the main "base camp" destination is to show Lonewolf (our only novice and thus the most important person in a curious sense) the logarithm map that gets you from the group up into its tangent space (the algebra)
and the exponential map that gets you back down from the tangent space to the group

these are essentially the facts Brian Hall summarizes in the first 10 pages or so of Chapter 3 and then he gives a whole bunch of nice concrete examples illustrating it----pages 37-39.

Hurkyl, I am glad you mentioned Brian Hall's book

arXiv:math-ph/0005032
 
  • #65
Have to say, if where we want to go is the representations of SU(2), then we can certainly take a peek at the destination,
and it is lovely

just was glancing at Brian Hall's page 71

this being about five pages or so into his chapter 5 "Basic Representation Theory"

So simple!

SU(2) is just a nice kind of 2×2 matrices of complex numbers! We always knew that, but suddenly he does the obvious thing and uses a matrix U, or (just a slight variation on the idea) its inverse U^(-1), to STIR UP polynomials in two complex variables!

We have to be talking to an imaginary novice to define the level of explanation and for better or worse Lonewolf is standing in for that novice. I think this polynomial idea will make sense to him!

If you have a polynomial in two variables z_1 and z_2,
then you can, before plugging z_1 and z_2 into the polynomial,
operate on them with a 2×2 matrix!

This gives a new polynomial in effect. It is a sweet innocent obvious idea. Why not do this and get new polynomials?

And indeed the polynomials of any given combined degree in two variables are a vector space. So there we already have our group busily working upon some vector space and stirring the vectors around.

And to make everything as simple as possible we will consider only homogeneous polynomials of degree m,
meaning that in each term the powers to which z_1 and z_2 are raised add up to m.
It is a "uniformity" condition on the polynomial: all its terms have the same combined degree.

this must be the world's easiest way to come up with an action of SU(2) on an (m+1)-dimensional vector space. Must go back to the days of Kaiser Wilhelm.

a basis of our vector space V_m can consist of the m+1 monomials like

(z_1)^2 (z_2)^(m-2)

the coefficients can be complex numbers, it is a vector space over the complex numbers which may be somewhat less familiar than over the reals but still no big deal.

The official (possibly imaginary) novice may be wondering "what does irreducible mean". Indeed I hope Lonewolf is around and wondering this, because we really need someone to explain to.
Well there is a group
and a mapping of the group into the linear operators on a vector space (some method for the group to act on vectors, like this scheme of using matrices to stir up polynomials)

that is called a representation (speaking unrigorously)
and it is irreducible if there is no part of the vector space left unstirred:
no proper nonzero subspace of V which is left invariant by the group,
no redundant part of V which doesn't get moved somewhere by at least one element of the group.

if there were such an invariant subspace you could factor it out and,
so to speak, "reduce" the representation to a lower dimensional one.
so that's what irreducible means

it looks like these polynomials get pretty thoroughly churned around by preprocessing z_1 and z_2 with a matrix, but to be quite correct we need to check that they really are and that there is no invariant subspace.

******footnote*****

I think I said this before but just to be fully explicit about the action of the group:


If P(z_1, z_2) is the old polynomial, then the matrix U acts on it to produce a new polynomial by taking U^(-1) and acting on (z_1, z_2) to produce a new pair of complex numbers

(w_1, w_2) = U^(-1) (z_1, z_2)

and then evaluating the polynomial at (w_1, w_2):

P( U^(-1) (z_1, z_2) )

*****************
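For the concretely minded, here is a minimal sketch of this action in code (Python/NumPy; the coefficient-vector encoding and the function names are my own illustrative choices, not Hall's):

Code:
import numpy as np

# Encode a degree-k homogeneous polynomial in (z_1, z_2) by its
# coefficient vector in the basis z_1^k, z_1^(k-1) z_2, ..., z_2^k;
# multiplying two such polynomials is then convolution of coefficients.

def lin_power(lin, k):
    # Coefficients of (lin[0] z_1 + lin[1] z_2)^k, by repeated convolution.
    c = np.array([1.0 + 0j])
    for _ in range(k):
        c = np.convolve(c, lin)
    return c

def rep_matrix(U, m):
    # Matrix of U acting on V_m by (U.P)(z) = P(U^(-1) z).
    row1, row2 = np.linalg.inv(U)          # rows of U^(-1)
    M = np.zeros((m + 1, m + 1), dtype=complex)
    for j in range(m + 1):
        # Image of the basis monomial z_1^(m-j) z_2^j under z -> U^(-1) z.
        M[:, j] = np.convolve(lin_power(row1, m - j), lin_power(row2, j))
    return M

# Two sample SU(2) elements; rep_matrix should be a homomorphism:
# rep(UV) = rep(U) rep(V).
th = 0.7
U = np.array([[np.cos(th), -np.sin(th)], [np.sin(th), np.cos(th)]], dtype=complex)
V = np.array([[0, 1], [-1, 0]], dtype=complex)
m = 2                                       # the 3-dimensional representation
print(np.allclose(rep_matrix(U @ V, m), rep_matrix(U, m) @ rep_matrix(V, m)))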
hope it's not unwise to take a peek at the destination
first before trying to see how to get there
especially hope to get comments from Lethe, Chroot, Hurkyl
on how this should go, which theorems to hit, whether to have an orderly or random progression, whether Brian Hall gives a good focus etc.
 
  • #66


Originally posted by lethe
a slightly more ambitious goal, that I would like to suggest, is the Lorentz group SL(2,C)/Z_2. It includes the rotation group as a subgroup (and thus includes all the concepts of SU(2), which would probably be a very good starting place), but it has a less trivial algebra, it is noncompact, so we can address those issues, and it is not simply connected, so we can also address those issues.

perhaps this is too ambitious. at any rate, SU(2) is a good starting point, and if that ends up being where we finish too, so be it.

first off, I would love it if you would do a whole bunch of explanation and get us started moving.
I tend to talk too much so I have to shut up and wait.
But I don't want this thread to get cold!

second. I totally agree. SU(2) and SO(3) are good initial targets, but if it turns out to be fun to get to them then it would be
great to go on past to Poincaré

I am counting (hoping) on you (plural) to explain the exponential map that connects the Lie algebra to the Lie group, because that seems to be crucial to everything, including describing the reps
 
  • #67
Hey Lonewolf, is there anything you need explained?

I wish Chroot or Lethe, both of whom could take over,
would take over and move this ahead.
I tend to talk too much and would like to be quiet for a while.

It is a good thread. It should do something.
What are you up to mathwise now that it's summer vacation?
 
  • #68
Please don't slow down the threads on my behalf. I'll be around, just nodding and smiling in the background.
 
  • #69
Explaining? Only the exponential map. I can't seem to see how it relates to what it's supposed to...maybe that gets explained further along in the text than I am, or I'm just missing the point.
 
  • #70
Originally posted by Lonewolf
Please don't slow down the threads on my behalf. I'll be around, just nodding and smiling in the background.

OK, I must have said something wrong and derailed the thread.
I have this fundamental fixed opinion that in any explanation the most important person is the novice, and I cannot imagine having an explanation party about groups or Lie algebras or anything else without one person who freely confesses to not knowing the subject.

Then you focus with one eye on the target (the theorems you want to get to) and with one eye on the novice

and you try to get the novice to the target destination

and the novice is also partly imaginary----the real one may get bored and go away meanwhile.

but anyway that is how I imagine it. I can't picture doing groups with just Lethe and Chroot because they both already KNOW groups. Chroot is a Stanford tech student almost to his degree. Lethe is also clearly very capable and knowledgeable.

Don't sit in the background nodding, for heaven's sake. ASK these people to explain something to you. Well, that is how I picture things and that is my advice. But who knows, it may all work out differently.

this is a great fact:

det(exp A) = exp(trace A)

do you know what det is and what trace is, and do you know
what the exponential map e^x is? I sort of assume so.
But if not, then ask those guys and make them work; it will be good for their mathematical souls.
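And anyone who doubts that great fact can test it numerically (a minimal scipy sketch; the random 4×4 matrix is an arbitrary choice):

Code:
import numpy as np
from scipy.linalg import expm

rng = np.random.default_rng(0)
A = rng.standard_normal((4, 4))          # any square matrix will do

lhs = np.linalg.det(expm(A))
rhs = np.exp(np.trace(A))
print(lhs, rhs, np.isclose(lhs, rhs))    # equal up to floating-point error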
 

What is group theory?

Group theory is a branch of mathematics that deals with the study of groups, which are mathematical structures that consist of a set of elements and a binary operation that combines any two elements to form a third element. It is used to study symmetry and patterns in various fields such as physics, chemistry, and computer science.

What are the basic concepts of group theory?

The basic concepts of group theory include groups, subgroups, cosets, homomorphisms, and isomorphisms. Groups are sets of elements with a binary operation, subgroups are subsets of groups that also form groups, cosets are subsets of groups that are obtained by multiplying a subgroup by a fixed element, homomorphisms are functions that preserve the group structure, and isomorphisms are bijective homomorphisms.

Where can I apply group theory?

Group theory has applications in various fields such as physics, chemistry, computer science, and cryptography. In physics, it is used to study symmetries in physical systems and in particle physics. In chemistry, it is used to study molecular structures and chemical reactions. In computer science, it is used in the design and analysis of algorithms and data structures. In cryptography, it is used to design secure encryption algorithms.

What are some good resources for learning group theory?

There are many resources available for learning group theory, including textbooks, online courses, and video lectures. Some recommended textbooks include "Abstract Algebra" by Dummit and Foote, "A First Course in Abstract Algebra" by Fraleigh, and "Group Theory" by Rotman. Online courses and video lectures can be found on websites such as Coursera, Khan Academy, and YouTube.

What are some important theorems in group theory?

Some important theorems in group theory include Lagrange's theorem, which states that the order of a subgroup must divide the order of the group, the first and second isomorphism theorems, which relate the structure of a group to its subgroups and homomorphisms, and the Sylow theorems, which provide information about the number of subgroups of a given order in a finite group.
