# Category Theory - is it just "abstract nonsense"?

#### Peter

##### Well-known member
MHB Site Helper
In my reading of various texts on abstract algebra, on a number of topics including, recently, tensor products of modules, I have noticed that a number of authors tend to make use of the concepts of category theory.

Do members of MHB believe it is worth investing time in category theory? Will it bring a better understanding of algebra? Indeed, is it necessary to understand category theory in order to get a good understanding of contemporary algebra?

Further, can members inform me of any good textbook treatments or online resources that give a basic introduction to category theory?

Peter

#### Fantini

##### "Read Euler, read Euler." - Laplace
MHB Math Helper
Re: Category Theory - is it just "abstract nonsense"

It's the other way around: a good, firm grasp of algebra will help you understand category theory. This is a trap. I had a colleague who asked this same question on MathOverflow. You can check it here.

Best wishes,

Fantini.

#### ModusPonens

##### Well-known member
Re: Category Theory - is it just "abstract nonsense"

It would be crazy to study category theory before having good contact with abstract algebra.

But as for the second question: you will have to know category theory to understand modern abstract algebra, that is, if you want to go deep. This is not the case for a first course on abstract algebra, and hardly for the second. Maybe by the third you can explore more.

#### Fantini

##### "Read Euler, read Euler." - Laplace
MHB Math Helper
Re: Category Theory - is it just "abstract nonsense"

I believe the takeaway is: yes, eventually (it is worth investing the time, and it will help), but not now. Learn (a good lot) more algebra first and then jump into category theory.

#### Deveno

##### Well-known member
MHB Math Scholar
Yes! I mean, no! Um, err ... maybe!

OK, it's like this:

When one first undertakes a study of groups, for example, one looks at EXAMPLES of groups. This is to motivate the idea that groups are worth studying in and of themselves, as a unifying concept behind what at first appear to be disparate structures.

As one studies groups, or rings, or modules, or whatever, one notices that certain relationships keep coming up time and again, just with different "names". For example, there are group homomorphisms, that preserve "group-ness", or ring homomorphisms, that preserve "ring-ness", or $R$-linear maps that preserve the $R$-module structure.

Often key concepts can be described solely in terms of these "special kinds of maps", without direct reference to the INTERNAL structure that makes a group a group, or a ring a ring.

Category theory provides a framework to talk about these "generic constructions", and it turns out that this can be handy both in classifying structures and in deriving certain basic rules of said structures without knowing that much about them. For example, ring homomorphisms have kernels (ideals), so BOOM! we get an isomorphism theorem for free.

A good case in point is "products". In sets, we have the cartesian product; in groups, the direct product; in topological spaces, the topological product. In all these cases, we get a certain kind of map (respectively: surjective function, surjective homomorphism, continuous surjection) from $A \times B$ to $A$, and a similar map from $A \times B$ to $B$, which let us combine any pair of "morphisms" $f: C \to A$ and $g: C \to B$ into a uniquely defined map:

$f \times g: C \to A \times B$

We do this "in the same way" (or CANONICALLY) by setting:

$(f \times g)(c) = (f(c),g(c))$

so that if $p_1:A \times B \to A, p_2:A \times B \to B$ are our "special maps" we have:

$p_1\circ(f \times g) = f$

$p_2\circ(f \times g) = g$.

With category theory, you can make a "one size fits all" construction, without going into the specific details of each kind of structure. If one can do this for a particular kind of category, one just says: "Category $K$ has products", which saves a certain amount of time. The actual FORM the product takes may vary considerably from category to category, but often all one is interested in is some "universal property", which one can then use to PROVE things.
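To make this concrete, here is a minimal sketch in Python (my own illustration, not part of the post): the pair type plays the role of $A \times B$, and `p1`, `p2` are the projections.

```python
# A "one size fits all" product construction, sketched in Python.
# p1 and p2 play the roles of the projections A x B -> A, A x B -> B.

def pairing(f, g):
    """Given f: C -> A and g: C -> B, return the unique map
    (f x g): C -> A x B with p1 . (f x g) = f and p2 . (f x g) = g."""
    return lambda c: (f(c), g(c))

p1 = lambda pair: pair[0]   # projection A x B -> A
p2 = lambda pair: pair[1]   # projection A x B -> B

f = lambda n: n * 2         # plays the role of f: C -> A
g = str                     # plays the role of g: C -> B

h = pairing(f, g)
print(h(5))                 # (10, '5')
print(p1(h(5)) == f(5))     # True: the equation p1 . (f x g) = f
print(p2(h(5)) == g(5))     # True: the equation p2 . (f x g) = g
```

The defining equations of the product hold by construction, which is exactly the "universal property" in miniature.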

#### Evgeny.Makarov

##### Well-known member
MHB Math Scholar
I can't speak for algebra because I am more into computer science. I can say that category theory (CT) has become very important in the theory of programming languages. Even now, to some researchers, CT is "abstract nonsense", but more and more the situation is that if you want to make a contribution to the modern state of knowledge in programming languages, or even to understand it, you need CT. I would guess the situation in algebra is similar. So, if you plan to become a "working mathematician" (a nod towards the book "Categories for the Working Mathematician" by Mac Lane), you'll most likely need CT at some point.

#### Peter

##### Well-known member
MHB Site Helper
Thanks to Fantini, ModusPonens, Deveno and Evgeny for help and advice ...

As I mentioned in my post, I was interested in CT since, in my attempt/struggle to understand tensor products of modules, I found that a number of texts mentioned CT (in particular mentioning the universal mapping property, and the fact that the TP is a functor). I spent some hours studying CT, but still have only an extremely modest grasp of the basic concepts ...

(I certainly struggled with Dummit and Foote's introduction to TPs of modules, but when I went to other books, as I mentioned, I was confronted with CT concepts).

Thanks to a post by Chris L, I got onto Keith Conrad's notes on TPs and am still working through them (Part 1 has 53 pages ...). I am, however, still finding it a bit of an uphill struggle ... I think I need more knowledge of the following ...

1. Abelian groups

2. "Free" structures - groups and modules - including the notion of commutator groups and subgroups

3. Bilinear forms

(I know that Deveno in particular has often exhorted me to review my understanding of linear algebra, and he is right, but my fascination with algebraic structures tends to keep me focussed on understanding more about them without the requisite knowledge of the linear algebra machinery one often needs ... so, based on his advice, I do take detours into concepts of linear algebra every now and then ...)

Just for the information of MHB members I am a general reader who is intrigued and fascinated by the structures of algebra and the history of their discovery or creation. I study the area for its own sake and am not taking any formal courses. Hence I rely on the MHB community rather than professors or tutors ... and indeed benefit from the generous help that is available from this wonderful community ...

Peter

#### Deveno

##### Well-known member
MHB Math Scholar
One reason to study "linear algebra" is that it is a "special case" of module theory (the case where the ring $R$ is commutative with unity and the group of units is all the non-zero elements - that is, $R$ is a field). So it provides a number of possible illustrations of any given concept that applies to modules. Of course, some of linear algebra doesn't generalize to $R$-modules, but it can serve as a starting point to get a grip on what is going on.

"Free" structures are an example of a categorical construction. For many kinds of structures, we have a set with certain other structural features added to it. It is possible to just "forget" the additional structure, and consider the morphisms between two like structures as just ordinary functions. This yields a FUNCTOR (called a "forgetful functor"), which is basically a "category homomorphism" from some category to the category Set.

Free objects are this process in "reverse" (sort of). We start with a structureless something (like a set we want to embed in a group in a natural way, or a group we want to turn into an algebra in a natural way), and want to create a "more structured structure" out of it.

For example, suppose we start with a set with just one element in it:

$S = \{x\}$.

If we want to make an abelian group out of it, we'll need to define $x+x$, for example. So we just "invent" a new element $y = x+x$. Now we have:

$S' = \{x,y\}$.

Of course, we need to also create new elements:

$x+x+x$
$x+x+x+x$, etc., so we just add those in, too.

We need an identity, so we add that in, as well:

$T = \{0,x,x+x,x+x+x,\dots\}$.

Finally, we need inverses for all these, so we have to enlarge our set to include those, too:

$T' = \{\dots,(-x)+(-x)+(-x),(-x)+(-x),-x,0,x,x+x,x+x+x,\dots\}$.

Now, all we need is associativity, so we just declare that it holds (since we haven't really said what $x+x+x$ is, we just state that:

$x+x+x = (x+x)+x = x+(x+x)$, and so on for any other "3 terms").

If you were paying attention, and if you agree to abbreviate:

$nx = x + x + \cdots + x$ (n times)

you might notice that our abelian group is isomorphic to $(\Bbb Z, +)$.

In other words, we might have DEFINED integers as "elements of the free abelian group generated by one object" (which, if you think about it, is kind of what we do when doing accounting, say with: $x$ = "a penny"). This construction is CATEGORICAL, which means we can use integers to keep track of any kind of set element (apples, dollars, salaries, trucks....if someone asks you: "what is a negative truck?", you can reply: "clearly, the additive inverse of a truck!" ).
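A minimal sketch of this in Python (my own illustration; all the names are invented): representing the free abelian group on one generator by the integers, any choice of an element $a$ in any abelian group $A$ induces exactly one homomorphism sending the generator to $a$, namely $nx \mapsto na$.

```python
# The free abelian group on one generator x, represented concretely:
# every element is n*x for a unique integer n, so Python ints suffice,
# with the generator x represented by 1.

x = 1  # the generator

def induced_hom(add, neg, zero, a):
    """The unique homomorphism Z -> A sending the generator to a.
    (add/neg/zero describe the target abelian group A.)"""
    def h(n):
        result = zero
        step = a if n >= 0 else neg(a)
        for _ in range(abs(n)):
            result = add(result, step)   # a added to itself |n| times
        return result
    return h

# target group: integers mod 12 under addition, with a = 5
h = induced_hom(lambda p, q: (p + q) % 12, lambda p: (-p) % 12, 0, 5)
print(h(x))     # 5: the generator maps to a
print(h(3))     # x+x+x  |->  5+5+5 mod 12 = 3
print(h(-2))    # -2x    |->  -10 mod 12 = 2
# homomorphism law: h(m + n) == h(m) (+) h(n)
print(h(4 + 7) == (h(4) + h(7)) % 12)   # True
```

This is the universal property in action: once $a$ is chosen, there is no freedom left in defining $h$.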

As far as bilinear forms go, why stop with 2? There is a rich theory of multi-linear forms, leading to some very satisfying results.

#### Peter

##### Well-known member
MHB Site Helper
Thanks Deveno ... As usual, most informative and instructive ...

BTW, regarding linear algebra ... what texts do you recommend, especially for gaining the understanding a senior undergrad or beginning grad should have ...

Peter

#### Deveno

##### Well-known member
MHB Math Scholar
The text I learned from was Hoffman and Kunze. The chapter on linear algebra and modules in Herstein's Topics in Algebra is sort of a "reader's digest version" that mainly focuses on canonical forms and annihilating polynomials.

Some other often-recommended books:

Linear Algebra Done Right, Sheldon Axler
Introduction to Linear Algebra, Gilbert Strang
Finite-Dimensional Vector Spaces, Paul Halmos

With linear algebra there is sort of a fork in the road: as with the calculus, there are two "good" reasons for learning it:

1. As an important tool in mathematics, used as a starting point for other more in-depth areas (such as the operator algebras one of our moderators specializes in)

2. As a tool for solving "real-life" problems: the applications of linear algebra form a HUGE list; it's used in engineering, physics, economics, weather prediction, game theory, probability, coloring problems, differential equations, computer calculations/programming, navigation ... it's just *that* important.

There is a natural progression:

Numbers --> Vectors --> Matrices --> Tensors

each of which "condenses" more information in a smaller package, enabling us to encode many relationships in fewer symbols.

Often in advanced proofs you will encounter lines like "we obviously have A by linearity, so ...". In other words, most advanced texts assume that linear algebra is already well understood, and that no further explication is needed.

Personally, I believe that sort of "intuitive linear structure" is already encoded in our brains, so results in linear algebra feel as if they "ought to be true". Plus, you can make pretty pictures, always a good thing. I often wish I could thank the "apocryphal spider" on René Descartes' windowpane (in some versions of the story, it is a fly on his ceiling).

A short digression: for centuries, "real things" were abstracted into DRAWINGS, or "constructions". Turning these drawings into numbers involved a process of "commensuration": a "unit length" was chosen, and the ratio of a given length to the unit length was taken. This process of "comparing" was equivalent to our current use of rational numbers, and while it was known that some lengths were "incommensurable" (such as the diagonal of a square and a side of that square), creating a number system that included such things didn't come about until the theory of equations (what we would call polynomials) developed sufficiently.

Coordinatizing space finally allowed geometry and equations to "talk to each other". One could express the "top of a curve" by an actual number: a "local extremum". Trying to "pin down" what "continuity" meant in terms of "numbers systems" led directly to the development of the real numbers (the continuum), "filling in the gaps in between the rationals".

Often one needed to work in DIFFERENT coordinate systems: for example, a problem might become much easier in polar coordinates instead of rectangular coordinates. So you need to know how coordinate systems transform into each other (we do this every time we convert from Metric to English units or vice versa). This is equivalent to solving several linear equations simultaneously.
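For instance, here is a minimal sketch (mine, not the poster's) of one such change of coordinates, rectangular to polar and back:

```python
import math

# Converting between rectangular and polar coordinates: a concrete
# instance of knowing how coordinate systems transform into each other.

def to_polar(x, y):
    return (math.hypot(x, y), math.atan2(y, x))

def to_rect(r, theta):
    return (r * math.cos(theta), r * math.sin(theta))

x, y = 3.0, 4.0
r, theta = to_polar(x, y)
print(r)   # 5.0 -- the Pythagorean distance mentioned below
x2, y2 = to_rect(r, theta)
print(abs(x2 - x) < 1e-9 and abs(y2 - y) < 1e-9)   # True: round trip
```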

At the same time, it became clear over the centuries that the Pythagorean Theorem actually encoded a notion of "distance". Thus coordinates and distance together could be used to leverage arithmetic (what we knew about solving equations) into calculating spatial relationships. We could shift back and forth between "qualitative" and "quantitative" statements about things.

Linear algebra beautifully captures a great deal of this: you have qualitative statements like:

$T$ has a 1-dimensional kernel

side by side with quantitative statements like:

the null space of $T$ is spanned by $(1,2,1)$.

Visually, one can say: $T$ shrinks a certain line down to a point, and leaves a hyperplane behind; numerically, one can calculate the null space and range of $T$. It's a perfect balance: we can go back and forth between what we learn either way, to get useful information.
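As a concrete (hypothetical) example of this balance: the matrix below is my own choice, picked so that its null space is spanned by $(1,2,1)$; the qualitative statement "$T$ has a 1-dimensional kernel" sits beside the quantitative computation.

```python
# A 3x3 matrix of rank 2 whose kernel is the line spanned by (1, 2, 1).
T = [(1, 0, -1),
     (2, -1, 0),
     (0, 1, -2)]   # third row = 2*(first) - (second), so rank 2

def apply(matrix, v):
    """Apply the linear map given by `matrix` to the vector v."""
    return tuple(sum(row[i] * v[i] for i in range(3)) for row in matrix)

print(apply(T, (1, 2, 1)))   # (0, 0, 0): the line through (1,2,1) collapses
print(apply(T, (1, 0, 0)))   # (1, 2, 0): a vector off that line survives
```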

Abstractly, vector spaces are one answer to the question: what do we really mean by "+" if we want scaling (stretching and shrinking) as well? Concretely, the answer comes: "a lot of things!", assigning numbers (coordinates) to shapes in a Euclidean space (such as the plane) being one of those things.

#### ThePerfectHacker

##### Well-known member
It is abstract nonsense. It is best to learn it passively rather than actively. In other words, instead of jumping straight into category theory, it is best to see its language come up in other contexts. That way it will feel more natural, to use a pun, when you see the definition of a category. I first came across category theory in algebraic topology, and since then it has made more sense whenever I have seen more official definitions from category theory.

#### Deveno

##### Well-known member
MHB Math Scholar
There has been some not inconsiderable discussion on the use of category theory (or its favored son, topos theory) as a competitor to Zermelo-Fraenkel set theory as a foundational basis for mathematics. An interesting paper I found on the subject is:

http://eis.bris.ac.uk/~rp3959/papers/onlyuptoiso.pdf

The basic dilemma can be summed up thus:

Do we care what mathematical objects ARE, or do we care how they WORK? In the former view, we must assert the existence of familiar mathematical objects (up to and including such things as: the natural numbers, the real numbers, vector spaces, blah blah blah). In the latter view, we must only show the properties we wish such objects to have are not contradictory, whether or not "more than one realization" of a concept like "real numbers" is possible isn't relevant (so we get "several set theories", one of which is the usual one).

Both approaches must (I argue) be evaluated in light of what we have empirically observed in the "natural world", for we wish "actually true" to be at least some sub-collection of "mathematically true". Thus, one's views of the nature of PHYSICAL reality will undoubtedly influence one's view of MATHEMATICAL reality.

The stamp of Kantian thought is unmistakably evident in the choice of names in category theory, which presumably attempts to be "totally synthetic". Whether or not this can be reconciled with "analytic" deduction from known data is, to my knowledge, an open question.

There are large bodies of mathematics originating from the pressing need to solve a physical problem, such as: "how to faithfully transmit sound." Often, the tools developed in such a pursuit lead to ideas which have applications in areas that are far removed from the original impetus, but later prove of great utility. From a pedagogical standpoint, one must consider: what is the most efficient way to communicate the ideas? A human lifetime is, after all, of finite duration.

One thing is clear: our ability to reason purely in abstract terms matures somewhat slowly, and our grasp of particular facts is much more immediate. In light of this, it is probably best (even IF inefficient) to delay study of category theory until one comes across a NEED for it. That said, if one is to USE it at all, one must do so before one's capacity to absorb abstract information begins to diminish (that is to say, before the age of 40 or so, some place this number even lower for mathematicians due to the high rate of "burn out").

One might argue that set-theory is likewise "abstract non-sense", which makes the heavy dependence on it in, say, calculus, a bit mystifying.

#### ThePerfectHacker

##### Well-known member
Category theory will be a bad foundational start. It makes a lot more sense to work with sets. Much easier to grasp and handle. The only problem is that it is not possible to define the category of sets, of the category of modules, or the category of pointed topological spaces, and so forth, rigorously using set theory alone. So typically what one does it extend set theory to work with proper classes and then categories are phrased in terms of proper classes instead of sets. Most mathematicians do not really care, they just work with the set of all topological spaces, and do not worry about this distinction. It is needed if one wants to provide a rigorous theory. But it will over complicate things just to make the logicians happy, so it is not a good idea, it is more natural to therefore keep sticking to set theory.