Category theory is what you get when you take mappings instead of sets as the primitive objects of your universe. At first this might seem a perverse thing to do, as mappings seem more complex than sets, but that is only because mappings have traditionally been defined in terms of sets.
In set theory you can specify that two sets be equal and you can also specify that one set be an element of another.
In category theory you can specify that two mappings be equal and you can also specify that two mappings compose end to end to produce a third mapping.
Category theory can be used to express some requirements in a very concise way.
> Category theory is what you get when you take mappings instead of sets as the primitive objects of your universe.
I'm not sure about that: you still need some concept of set (or collection or class) to define a category, since you need a collection of objects and of mappings between them (technically that's a "small" category, but defining any larger category requires at least as much set-theoretic machinery).
More exactly, whereas in set theory it's the membership relation between sets and their elements that is basic, in category theory it's the mapping between objects.
Nevertheless, the basic concepts of set theory can also be defined within category theory, so in that sense they're inter-translatable. In each case though, you need some ambient idea of a collection (or class or set) of the basic objects. Tom Leinster has a brilliantly clear and succinct (8 pages) exposition of how this is done here https://arxiv.org/abs/1212.6543
The thing is, even defining first-order logic requires a (potentially infinite) collection of variables and constant terms; and set theory is embedded in first-order logic, so set theory and category theory are on the same footing in seemingly requiring a prior conception of some kind of potentially infinite "collection". To be honest I'm a bit puzzled as to how that works logically.
Defining first-order logic doesn't really require set theory, but it does require some conception of natural numbers. Instead of saying there's an infinite set of variables, you can have a single symbol x and a mark *, and then you can say a variable is any string consisting of x followed by any number of marks. You can do the same thing with constants, relation symbols and function symbols. This does mean there can only be countably many of each type of symbol but for foundational purposes that's fine since each proof will only mention finitely many variables.
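A minimal sketch of that naming scheme in Python (the helper name `variable` is mine, not from the comment): a single symbol `x` followed by any number of marks `*` yields as many distinct variables as you like, without assuming an infinite set up front.

```python
# Variables are the symbol "x" followed by any number of marks "*",
# as described above. No infinite set is assumed; we just have a rule
# that produces a fresh, distinct symbol on demand.
def variable(n):
    """The n-th variable: 'x' followed by n marks."""
    return "x" + "*" * n

# The first few variables: x, x*, x**, x***
print([variable(n) for n in range(4)])

# All distinct, and we can generate as many as a proof ever mentions:
assert len({variable(n) for n in range(1000)}) == 1000
```

The same rule works verbatim for constants, relation symbols, and function symbols by swapping the base symbol.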
Allowing uncountably many symbols can be more convenient when you apply logic in other ways, e.g. when doing model theory, but from a foundational perspective when you're doing stuff like that you're not using the "base" logic but rather using the formalized version of logic that you can define within the set theory that you defined using the base logic.
Thanks, that sharpens it then to a question about natural numbers, or at least some idea of an indefinitely extensible collection of unique elements: it seems the ordering on the numbers isn't required for a collection of variables, we just need them to be distinct.
I don't think you need set theory to define set theory (someone would have noticed that kind of circularity), but there still seems to be some sleight of hand in the usual presentations, with authors often saying, in the definition of first-order logic prior to defining set theory, that "there is an infinite set of variables". Then they define a membership relation, an empty set, and then "construct the natural numbers"... But I guess that's just sloppiness or a different concern in the particular presentation, and the seeming circularity is eliminable.
Maybe instead of saying at the outset that we require natural numbers, wouldn't it be enough to give an operation or algorithm which can be repeated indefinitely many times to give new symbols? This is effectively what you're illustrating with the x, x*, x**, etc.
If that's all we need then it seems perfectly clear, but this kind of operational or algorithmic aspect of the foundation of logic and mathematics isn't usually acknowledged, or at least the usual presentations don't put it in this way, so I'm wondering if there's some inadequacy or incompleteness about it.
So category theory is really the theory of composition of mappings. I conjecture that all programming can be seen as just the composition of mappings. If this is correct then category theory is a theory of programming.
It can very much be. Here’s one example of this phenomenon (there are many others but this is the most famous): https://wiki.haskell.org/Curry-Howard-Lambek_correspondence
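A toy illustration of "programming is just composition of mappings" (function names are mine, chosen for the example): a small program assembled purely by composing functions, with nothing else going on.

```python
# Forward composition: first f, then g (written f;g later in this thread).
def compose(f, g):
    return lambda x: g(f(x))

# Three mappings...
words   = lambda s: s.split()            # string -> list of words
lengths = lambda ws: [len(w) for w in ws]  # list of words -> list of ints
total   = sum                            # list of ints -> int

# ...and a "program" that is nothing but their composite:
word_length_sum = compose(compose(words, lengths), total)
print(word_length_sum("category theory is composition"))  # → 27
```

Associativity of composition is what makes the grouping of the two `compose` calls irrelevant, which is exactly the kind of law category theory takes as primitive.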
You don't need category theory to connect dots with arrows, graph theory is enough for this.
Category theory is actually a ‘simplified’ graph theory, i.e. you can see categories as a restricted class of graphs. E.g., ‘Category Theory for Computing Science’ introduces categories this way (a category is a directed graph with associative composition and identity; the free category on a graph is the graph with all identities and compositions filled in). But the restrictions (associative composition and identity) are harmless and natural for programming applications, where there's always a notion of ‘do nothing’ or ‘do one thing after another’, and they unlock a lot of higher structure.
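A sketch of the "free category on a graph" idea from that book (the graph and helper name are mine): the morphisms of the free category are the paths of the graph, including an empty path at each node (the identity), and composition is path concatenation, which is automatically associative with the empty path as unit.

```python
# A toy directed graph: edges are (source, target, label) triples.
edges = {("A", "B", "f"), ("B", "C", "g")}

def paths_from(node, max_len=3):
    """All paths starting at `node` up to max_len edges, as
    (tuple-of-edge-labels, endpoint) pairs. The empty tuple is the
    identity morphism at `node`."""
    result = [((), node)]
    frontier = [((), node)]
    for _ in range(max_len):
        nxt = []
        for labels, at in frontier:
            for (s, t, lab) in edges:
                if s == at:
                    nxt.append((labels + (lab,), t))
        result += nxt
        frontier = nxt
    return result

# Morphisms out of A: the identity at A, f, and the composite f;g
print(paths_from("A"))
```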
But what's the utility of this definition? Does it help solve or prove something?
It helps you build an intuition for categories, if you're used to graphs :)
If you have a working intuition for categories then in most cases the specific formulation you choose as a foundation doesn't matter, just as most mathematicians work nominally in set theory without worrying about the subtleties of ZFC.
IMO the right intuition about a tool comes from applying it in a context where it provides real leverage. In the case of category theory that would be advanced algebraic topology (not re-phrasing basic things which are easier to understand without CT).
If you allowed infinite graphs, maybe. How would you define a functor or natural transformation in graph theory? It seems like you would need to construct a conceptual system that is just equivalent to category theory.
No, but if you want to talk about composing those arrows (and a sensible notion of composition should probably be associative, and perhaps have a unit) you eventually end up reinventing category theory.
> you can also specify that two mappings compose
Two mappings with corresponding domain/codomain have to compose by definition of a category. It's not something you can specify.
Yes. When you are specifying a system you are building the category that you want it to live in.
That is probably what they mean by specifying that they compose.
If all you know is that you have two mappings, you don't know they compose until you get the additional information about their sources and targets. In a way that's what the sources and targets are: just labels of what you can compose them with.
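That point can be made concrete with a sketch in Python (class and names are mine): mappings carry source and target labels, and composition is defined exactly when one's target matches the other's source.

```python
# A mapping tagged with its source and target; composition only exists
# when the labels line up, as described above.
class Mapping:
    def __init__(self, src, tgt, fn):
        self.src, self.tgt, self.fn = src, tgt, fn

    def then(self, other):
        """Forward composition self;other, defined only when
        self.tgt == other.src."""
        if self.tgt != other.src:
            raise TypeError("target of first != source of second")
        return Mapping(self.src, other.tgt, lambda x: other.fn(self.fn(x)))

f = Mapping("Str", "Int", len)
g = Mapping("Int", "Bool", lambda n: n > 3)

print(f.then(g).fn("hello"))   # len("hello") = 5, and 5 > 3 → True
try:
    g.then(f)                  # Bool ≠ Str: composition undefined
except TypeError as e:
    print("not composable:", e)
```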
> Category theory can be used to express some requirements in a very concise way.
Could you give a programming example of something that is more easily expressed in CT than with sets and functions?
It's more that category theory foregrounds the functions themselves, and their relationships, rather than the elements of the sets which the functions operate on. This higher-level perspective is arguably the more appropriate level when thinking about the structure of programs.
For more detail, see Bartosz Milewski, Category Theory for Programmers:
"Composition is at the very root of category theory — it’s part of the definition of the category itself. And I will argue strongly that composition is the essence of programming. We’ve been composing things forever, long before some great engineer came up with the idea of a subroutine. Some time ago the principles of structured programming revolutionized programming because they made blocks of code composable. Then came object oriented programming, which is all about composing objects. Functional programming is not only about composing functions and algebraic data structures — it makes concurrency composable — something that’s virtually impossible with other programming paradigms."
https://bartoszmilewski.com/2014/10/28/category-theory-for-p...
> Category theory is what you get when you take mappings instead of sets as the primitive objects of your universe
Why have I never seen it explained like this before. Wow, thank you!
> Category theory can be used to express some requirements in a very concise way.
Can you give an example?
Take a mapping a and precompose it with the identity mapping i. By the definition of the identity mapping the resulting composition is equal to a:

i;a = a

(Here ';' represents forward composition. Mathematicians tend to use backward composition, represented by '∘', but I find backward composition awkward and error-prone and so avoid using it.)

Now, if there is another mapping j that is different from i, such that

j;a = a

then the mapping a loses information. By this I mean that if you are given the value of a(x) you cannot always determine what x was. To understand this properly you may need to work through a simple example by drawing circles, dots and arrows on a piece of paper.

If there is no such j then the mapping a is said to be a monomorphism or injection (the set-theoretic term) and it does not lose information.
This specification of the property 'loses information' only involves mapping equality and mapping composition. It does not involve sets or elements of sets.
An example of a mapping that loses information is the capitalization of strings of letters. An example of a mapping that you would not want to lose information is zip-file compression.
If you alter the above specification to use post-composition (a;i = a and a;j = a) instead of pre-composition you get epimorphisms or surjections which capture the idea that a mapping constrains all the values in its codomain. I like to think of this as the mapping does not return uninitialized values or 'junk' as it is sometimes called.
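Here is the circles-dots-and-arrows exercise as a sketch in Python on finite sets, following the criterion above (all helper names are mine). `is_mono` looks for a j ≠ i with j;a = a; `is_epi` looks for a j ≠ i with a;j = a.

```python
from itertools import product

def all_maps(dom, cod):
    """Every mapping from dom to cod, represented as a dict."""
    dom = list(dom)
    return [dict(zip(dom, vals)) for vals in product(cod, repeat=len(dom))]

def is_mono(a, dom, cod):
    """a loses no information iff no j != identity satisfies j;a = a."""
    identity = {x: x for x in dom}
    return not any(j != identity and all(a[j[x]] == a[x] for x in dom)
                   for j in all_maps(dom, dom))

def is_epi(a, dom, cod):
    """a leaves no 'junk' in its codomain iff no j != identity
    satisfies a;j = a."""
    identity = {y: y for y in cod}
    return not any(j != identity and all(j[a[x]] == a[x] for x in dom)
                   for j in all_maps(cod, cod))

# A toy "capitalization" mapping: it loses information ('a' and 'A'
# collide) but hits every codomain value (no junk).
a = {"a": "A", "A": "A", "b": "B"}
print(is_mono(a, ["a", "A", "b"], ["A", "B"]))  # False: not injective
print(is_epi(a, ["a", "A", "b"], ["A", "B"]))   # True: surjective
```

On finite sets this brute-force search over test mappings j agrees with the usual injective/surjective definitions, but note that nothing in `is_mono`/`is_epi` ever inspects an element of a set directly; only equality of composites is used.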
Bartosz Milewski works through this in more detail (including from the set-theoretic side) in the last 10 minutes of https://www.youtube.com/watch?v=O2lZkr-aAqk&list=PLbgaMIhjbm... and the first 10 minutes of https://www.youtube.com/watch?v=NcT7CGPICzo&list=PLbgaMIhjbm....