
18 September 2012

Lattice Theory for Programmers and Non Computer Scientists


Part One: Mathematical Foundations


Matt Might has a blog post, "Order Theory for Computer Scientists," in which he concisely outlines some basics of order theory.  It is a nice post, but it is targeted at computer scientists and, especially since all of his examples are in Haskell, is probably only of interest to that audience.  Not that there is anything wrong with that, but I think it is unfortunate, as these ideas are potentially interesting to working programmers as well.


In this post I am mostly talking about lattice theory as opposed to the more general order theory. While general order theory is both interesting and relevant to programming and CS, lattice theory is as well, and it comes with many interesting ideas of its own. Lattices impose more structure on orders, which is probably why there are a number of books devoted to them, such as George Grätzer’s comprehensive and recently updated Lattice Theory: Foundation.  Gian-Carlo Rota, an accomplished twentieth-century mathematician and lecturer, remarked that lattice theory has "been the object of such vociferous vituperation".  The book Combinatorics: The Rota Way, which he coauthored, gives deep insights into this subject and its relationship to other areas of math including, of course, combinatorics, but also topology, set theory, Boolean algebras, probability theory, and more.  I think Rota will be vindicated with a vengeance, as it seems to me that lattices show up almost everywhere in math.  Additionally, I feel that lattice theory will become essential in computer science and software engineering and should be mandatory math for those curricula.  I will expand on these ideas in follow-up posts.


One anecdote tells of a senior mathematician who stopped Rota in the halls of MIT, stared at him, and demanded: "Admit it! All lattice theory is trivial."  I confess I have some difficulty with lattice theory, as it seems to be both trivial and deep.  At first the ideas don’t seem to have much substance, but the deeper you dig the more you find, and lattice theory keeps showing up in other areas of math; even when it is not obvious, it often seems to be right under the surface.  I think that’s the duality here: the ideas, per se, are not that impressive; much of their importance comes from their application and their relation to other areas of math.


In the updated Lattice Theory: Foundation, Grätzer states that one difference between the new edition and the previous one is the addition of more diagrams, and he recommends that readers draw their own as well.  In my opinion he does not go far enough; one of the goals of this post is to provide pictures for the basic ideas.  I will do my best here to help the motivated learner, but this post should probably be viewed as supplemental to other resources.  You will need some knowledge of set theory, including concepts like Cartesian products; some knowledge of abstract algebra, such as group theory, may also be helpful, and some basic graph theory is needed.  You will also need some comfort with basic mathematical notation.  If you don’t have this background, hopefully you are willing to get up to speed on it; trust me, it’s worth it.  Another point to note is that everything covered here is finite: all sets, orderings, and lattices are finite.  I believe much of this extends to the countably infinite case, but I leave that for you to pursue.



Lattices Drawn as Graphs


Lattices have some connections to graph theory, and to fully understand the nuances of these ideas you need an understanding of basic graph theory, specifically the differences between directed and undirected graphs and an understanding of cycles and acyclic graphs.



Lattices can be drawn as graphs.  The main diagram used to visually represent lattices is the Hasse diagram, which depicts a lattice as an undirected graph; more on this below.  It should also be noted that issues of graph planarity are ignored: two crossing edges, that is, edges that cross with no vertex at the crossing, have no significance in terms of order.  Sometimes, for aesthetic reasons, intersecting lines are drawn as passing behind other lines; again this has no impact on the order structure.



Binary Relations


Lattices are built on the concept of a binary relation on a non-empty set. This is usually expressed in terms of the Cartesian product of the set with itself, where R is the relation and S is the set:


R ⊆ S × S


In other words, a binary relation on S is a set of ordered pairs drawn from S × S; it relates members of the set to other members of the same set.


For example, for the set S = {a,b,c} a possible relation is the set of ordered pairs R = {(a, c), (a, a), (b, c), (c, a)}.  In general a binary relation can be written as a set of ordered pairs.  When talking about orders, the key question about any two elements is whether the relationship between them exists, meaning: does the ordered pair (x,y) exist in R?  In the example above (b,c) exists, (b,c) ∈ R. Also note that the opposite of (b,c), namely (c,b), is not in R: (c,b) ∉ R.
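Since a binary relation is just a set of ordered pairs, it maps directly onto data structures in most languages. A minimal sketch in Python (the variable names are just illustrative):

```python
# The example relation R on S = {a, b, c}, encoded as a set of pairs
S = {"a", "b", "c"}
R = {("a", "c"), ("a", "a"), ("b", "c"), ("c", "a")}

# Membership tests mirror (b,c) ∈ R and (c,b) ∉ R
print(("b", "c") in R)  # True
print(("c", "b") in R)  # False

# Sanity check: every pair comes from the Cartesian product S × S
assert all(x in S and y in S for (x, y) in R)
```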


This binary relation can be seen in the diagram below; each of the ordered pairs maps to a directed edge:




Properties of Binary Relations


Binary Relations can have certain properties.  I feel this is the crux of understanding ordering relationships.  The first two properties are:


Reflexive: ∀x ∈ S, (x,x) ∈ R

Transitive: ∀x,y,z ∈ S, if (x,y) ∈ R and (y,z) ∈ R then (x,z) ∈ R


If a binary relation has these two properties, that is, it is reflexive and transitive, then it is known as a preorder or quasi-order.  This is a more general type of binary relation which we are not really interested in initially, except in how it relates to two more specific types of binary relations.


The symmetric property is defined as:


Symmetric: ∀x,y ∈ S, if (x,y) ∈ R then (y,x) ∈ R


A preorder with the symmetric property is known as an equivalence relation.  In terms of Lattice Theory this is another binary relation that we are not immediately interested in.  Equivalence relations are well worth learning more about in their own right, as they play an important role in partitioning sets which have interesting applications in abstract algebra in relation to quotients of groups and rings and homomorphisms of groups and rings.


The next property, which we are very interested in, is the antisymmetric property which is defined as:


Antisymmetric: ∀x,y ∈ S, if (x,y) ∈ R and (y,x) ∈ R then x = y


A preorder with the antisymmetric property is known as an order relation sometimes referred to as a partial order. A set with a partial order on it is called a partially ordered set, poset, or just an ordered set.  So now we have a definition of an order:


If S is a non-empty set then by an order on S we mean a binary relation on S that is reflexive, anti-symmetric, and transitive.


We will now use the less-than-or-equal symbol ≤ to denote our order relation, as follows:


(∀x ∈ S) x ≤ x;                                   (reflexive)
(∀x, y ∈ S) if x ≤ y and y ≤ x then x = y;        (antisymmetric)
(∀x, y, z ∈ S) if x ≤ y and y ≤ z then x ≤ z.     (transitive)

In general an order is written using the notational convention (S, ≤), where S is the set on which the order is defined and ≤ denotes the ordering relation.  Note: ≤ is not necessarily the familiar "less than or equal to" on numbers; it might be, but it could also be any number of other order relations.
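With a relation stored as a set of pairs, the three defining properties of an order can be checked directly. A minimal Python sketch, using divisibility on a small set of integers as the example order (the function names are my own):

```python
def is_reflexive(S, R):
    # (x, x) ∈ R for every x ∈ S
    return all((x, x) in R for x in S)

def is_antisymmetric(S, R):
    # (x, y) ∈ R and (y, x) ∈ R only when x = y
    return all(x == y for (x, y) in R if (y, x) in R)

def is_transitive(S, R):
    # (x, y) ∈ R and (y, z) ∈ R imply (x, z) ∈ R
    return all((x, z) in R
               for (x, y) in R
               for (w, z) in R if w == y)

def is_partial_order(S, R):
    return is_reflexive(S, R) and is_antisymmetric(S, R) and is_transitive(S, R)

# Divisibility on {1, 2, 3, 4, 6, 12} is a classic partial order
S = {1, 2, 3, 4, 6, 12}
R = {(x, y) for x in S for y in S if y % x == 0}
print(is_partial_order(S, R))  # True
```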



Comparability: Partial Orders, Total Orders and Antichains


One thing that I found confusing when learning about ordering relations is the difference between a total order and a partial order.  The idea of a partial order is that not all pairs of elements in the set need to be related.  Some pairs of elements are incomparable, meaning no relation exists between them.  This is denoted using the parallel-lines symbol ∥; for example, x ∥ y means (x,y) ∉ R and (y,x) ∉ R.


An example of a partial order is the set of partitions below:



This is a partial order of the set partitions of a set of three items.  For example, suppose that the set {a,b,c} is a plate containing an apple, a banana, and a cherry.  As you move down the lattice the partitions get finer: in the second row you can split the three pieces of fruit across two plates in three different ways, {a},{b,c} or {b},{a,c} or {c},{a,b}.  In the bottom row there is only one way to partition the fruit onto three different plates: {a},{b},{c}.  If you move up the lattice you reverse the operation and consolidate the set partitions (plates) in the same manner.


You should note that above, the elements a|bc, b|ac, and c|ab have no ordering relation to each other; there is no way to turn one of those partitions into another by further splitting, and rearranging isn’t allowed. So these elements are incomparable: a|bc ∥ b|ac, b|ac ∥ c|ab, and so on.


A total order is also known as a linear order or a chain. In a total order all elements are comparable, meaning the relation has the following totality property:


Total: ∀x,y ∈ S, (x,y) ∈ R or (y,x) ∈ R


A fairly simple example of a chain is the integers 1 through 5 with the ordering relation "less than or equal to", written as ({1,2,3,4,5}, ≤), which has the following diagram:



An antichain is a set where no two elements are comparable:


Antichain: ∀x,y ∈ S with x ≠ y, (x,y) ∉ R and (y,x) ∉ R


An example is the suits of a deck of cards: there are probably some exceptions, but in general a spade, a heart, a club, and a diamond have no order relation to each other. This antichain can be depicted as:



Also you should note that the elements a|bc, b|ac, and c|ab in our set partition lattice above form an antichain within that lattice.
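Chains and antichains can be tested the same way as the order properties; a small Python sketch reusing the set-of-pairs encoding (for the suit example only the reflexive pairs are kept, so no two distinct suits are comparable):

```python
def comparable(R, x, y):
    return (x, y) in R or (y, x) in R

def is_chain(S, R):
    # total order: every pair of elements is comparable
    return all(comparable(R, x, y) for x in S for y in S)

def is_antichain(S, R):
    # no two *distinct* elements are comparable
    return all(not comparable(R, x, y) for x in S for y in S if x != y)

nums = {1, 2, 3, 4, 5}
leq = {(x, y) for x in nums for y in nums if x <= y}
print(is_chain(nums, leq))  # True

suits = {"spade", "heart", "club", "diamond"}
refl = {(s, s) for s in suits}  # only the reflexive pairs
print(is_antichain(suits, refl))  # True
```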



Order Duality


Order duality is actually a fairly intuitive idea: if you have a list of words in alphabetical order, you can reverse the order. This holds for every ordering relation and is called the duality principle.  More formally, if you have an order P = (S, ≤), its dual, denoted P^d, is defined as P^d = (S, ≥).  If you take any of the lattice examples above or below and reverse them, flipping them upside down, you get the dual of the order relation.  This is one of the most seminal ideas in order theory, and it also shows up in other parts of math.



The Powerset Lattice


The powerset of a set is the set of all of the subsets of that set.  For a set of three items, S = {a,b,c}, the powerset of S, which is a set of sets, is: { {a,b,c}, {a,b}, {a,c}, {b,c}, {a}, {b}, {c}, ∅ }.  A common notation for the powerset is 2^S.  This notation draws from the fact that the cardinality of the powerset is 2 raised to the power of the cardinality (the size) of the set. Using the notation |S| for cardinality, the set above has cardinality |S| = 3, and its powerset 2^S has cardinality |2^S| = 2^3 = 8.
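The powerset and its cardinality are easy to compute; a short Python sketch using the standard itertools recipe:

```python
from itertools import chain, combinations

def powerset(S):
    """All subsets of S, as frozensets (hashable, so they can sit in a set)."""
    items = list(S)
    return {frozenset(c) for c in chain.from_iterable(
        combinations(items, r) for r in range(len(items) + 1))}

P = powerset({"a", "b", "c"})
print(len(P))  # 8, i.e. 2**3
assert frozenset() in P and frozenset({"a", "b", "c"}) in P
```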


The lattice below shows the ordering relation "is a subset of", denoted by ⊆; the order is given in notational form as (2^S, ⊆).



Its order dual, "is a superset of", is given by (2^S, ⊇):




Binary Relations Drawn as Graphs


Now that we have some definitions and diagrams in place I wanted to revisit the drawing of relation diagrams.  Our first relation diagram above has a loop for (a,a), but it is not reflexive because it is missing loops for b and c.  It is not transitive either: for example, (b,c) and (c,a) are in R but (b,a) is not.  So that relation is not a preorder, which means it is also neither an equivalence relation nor an ordering relation.


For every preorder there is a corresponding directed graph, with elements of the set corresponding to vertices and the relation between pairs of elements corresponding to directed edges between vertices.  Graphs of preorders can contain cycles.  Every vertex in the graph of a preorder will have a loop for the reflexive relation that must exist on each element.  Graphs representing equivalence relations can be drawn as undirected graphs, since these relations are symmetric, and will of course have loops on each vertex for reflexivity.  As mentioned above, the graph of a partial order will be directed and, apart from the reflexivity loops on each vertex, acyclic due to antisymmetry.


The total order on 1-5 mentioned above would look as follows for the relation ≤; it is a simple directed graph:



If we were to add all of the transitive and reflexive relations it would look like this:



As we have seen in our diagrams, the reflexivity loops are omitted since they are not needed and would increase the complexity of the diagrams.  Also, in the diagrams the transitivity can be inferred by following the paths that connect nodes through intermediate nodes.


Hasse diagrams are drawn as undirected graphs, which reflects order duality: such a graph can be decomposed into two directed graphs, as in our example above, for the two dual relations (subset ⊆ and superset ⊇) on the powerset lattice. The lattice is conventionally drawn as the following Hasse diagram:



Another interesting point here is a difference between equivalence relations and order relations: apart from the reflexive loops, no cycles occur in an ordering relation.
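The relationship between the full order relation and its Hasse diagram can also be computed: the edges of a Hasse diagram are the covering pairs, what is left after dropping the reflexive loops and all transitively implied pairs. A sketch in Python (the helper name covers is my own):

```python
def covers(S, R):
    """Covering pairs (x, y): x < y with no z strictly between them.
    These are exactly the edges drawn in a Hasse diagram."""
    lt = {(x, y) for (x, y) in R if x != y}   # strip reflexive loops
    return {(x, y) for (x, y) in lt
            if not any((x, z) in lt and (z, y) in lt for z in S)}

S = {1, 2, 3, 4, 5}
R = {(x, y) for x in S for y in S if x <= y}
print(sorted(covers(S, R)))  # [(1, 2), (2, 3), (3, 4), (4, 5)]
```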



Subsets of Posets


Since we are dealing with sets, a set with an order relation, a poset (short for partially ordered set), can have subsets.  The interesting thing is that a subset, taken as a whole, can be a different type of order than the original poset.  For example, the powerset lattice above has as a sub-poset {a,b,c}-{a,b}-{a}-∅, which is a total order.  Additionally, the antichain { {a,b}, {a,c}, {b,c} } is also a subset of the powerset poset.  These elements still have the same relations as they did in the original set, but the subsets themselves are different structures.



Upper and Lower Bounds


Since we are only considering finite sets, a finite set with an order relation implies that if you move up or down the ordering relation you will eventually hit a maximal or minimal element.  In the examples above, the total order has an upper bound of 5 and a lower bound of 1, and the powerset lattice has the set itself as an upper bound and the empty set as a lower bound.  The antichain does not have a single upper or lower bound; each of its four elements is both its own upper and lower bound.  The idea of bounds, specifically the greatest lower bound (GLB), also known as the infimum, and its dual the least upper bound (LUB), also known as the supremum, is important in order theory and shows up in other areas of math such as topology.


A poset need not have a single upper or lower bound; this is a distinction from a lattice, which is still a poset but has a single upper and a single lower bound, that is, one element that is the upper bound and one element that is the lower bound. These are sometimes denoted ⊤ (top) for the upper bound and ⊥ (bottom) for the lower bound.  This bounding feature of lattices is very important, as we will see.
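Finding the bounds of a finite order is a simple search for maximal and minimal elements; in a bounded lattice each search returns exactly one element, ⊤ and ⊥. A Python sketch under the same set-of-pairs encoding (function names are my own):

```python
def maximal(S, R):
    # elements with nothing strictly above them
    return {x for x in S if not any((x, y) in R and x != y for y in S)}

def minimal(S, R):
    # elements with nothing strictly below them
    return {x for x in S if not any((y, x) in R and x != y for y in S)}

chain5 = {1, 2, 3, 4, 5}
leq = {(x, y) for x in chain5 for y in chain5 if x <= y}
print(maximal(chain5, leq), minimal(chain5, leq))  # {5} {1}

# In an antichain with only reflexive pairs, every element is both
suits = {"spade", "heart", "club", "diamond"}
refl = {(s, s) for s in suits}
assert maximal(suits, refl) == suits and minimal(suits, refl) == suits
```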



The Samuel L Jackson Lattice (Total Order)


Since Samuel L. Jackson has a habit of showing up everywhere (he is the new Kevin Bacon), it is only fitting that he would show up in lattice theory.  Another example of a total order, or chain, is the "Samuel L. Jackson Scale of Black Emotion" developed by Larry Wilmore of The Daily Show.  It forms a total order, known here as the Samuel L Jackson Lattice, which includes the following elements:


  • Furious Anger Guy
  • Shaft
  • Changing Lanes Guy
  • Snakes on a Plane Guy
  • Jurassic Park Jackson
  • Frozone
  • Mace Windu

As you can see the sup(Samuel L. Jackson Lattice) = ⊤ = "Furious Anger Guy" and the inf(Samuel L. Jackson Lattice) = ⊥ = "Mace Windu". Now it’s pretty obvious in an intuitive human sense that the Master of the Order and Jedi Master Mace Windu is the lattice dual of Furious Anger Guy of Pulp Fiction, well up to his epiphany in the diner robbery scene anyway.


As with all lattices, this lattice has an order dual, denoted (Samuel L Jackson Lattice)^d and known as the Inverse Samuel L Jackson Lattice, which is just the reverse of the above list.


Ok, I’m having a little fun here, trying to lighten the mood a little.  However, this is a legitimate example of a lattice.



Lattices vs Posets


By definition a lattice is a poset, however, a poset is not necessarily a lattice.  A lattice is defined as, with sup being supremum and inf being infimum:

An order (S,≤) is a lattice if sup{a,b} and inf{a,b} exist ∀ a, b ∈ S.


Actually there are some distinctions between different classes of lattices:


A lattice is a poset in which all nonempty finite subsets have both a least upper bound and a greatest lower bound.

A complete lattice is a poset in which all subsets have both a least upper bound and a greatest lower bound.

A bounded lattice has a unique top element and a unique bottom element, actually there are algebraic implications that we will discuss below.

Every finite lattice is complete, and since we are limiting ourselves to finite sets we will only be dealing with complete lattices. Also, every complete lattice is a bounded lattice, so when we talk about lattices here we will mean complete lattices, aka bounded lattices.



Up and Down Sets


If you take an element of a poset or a lattice and take the subset of all elements that are greater than (or less than) that element, you get a new poset or lattice that is a subset of the original; these types of subsets are referred to as up-sets or down-sets. The up-set of an element x, denoted ↑ x, includes the element and everything above it, and is defined as:


↑ x = {y∈ P : y ≥ x}


The up-set of the element {a} in the powerset lattice above would look like (shown in blue):



Similarly, or perhaps "dually" the down set, denoted as ↓ x, is defined as:


↓ x = {y∈ P : y ≤ x}


With the down-set of the element {a,b} in the powerset lattice (in red):



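Up-sets and down-sets fall straight out of the definitions; a Python sketch over the powerset lattice, where frozenset's <= operator is exactly "is a subset of" (helper names are my own):

```python
from itertools import chain, combinations

def powerset(S):
    items = list(S)
    return {frozenset(c) for c in chain.from_iterable(
        combinations(items, r) for r in range(len(items) + 1))}

P = powerset({"a", "b", "c"})
R = {(x, y) for x in P for y in P if x <= y}  # the subset order ⊆

def up_set(P, R, x):
    # ↑x = {y ∈ P : y ≥ x}
    return {y for y in P if (x, y) in R}

def down_set(P, R, x):
    # ↓x = {y ∈ P : y ≤ x}
    return {y for y in P if (y, x) in R}

print(len(up_set(P, R, frozenset({"a"}))))         # 4: {a},{a,b},{a,c},{a,b,c}
print(len(down_set(P, R, frozenset({"a", "b"}))))  # 4: ∅,{a},{b},{a,b}
```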

Meet and Join


Meet and join are two binary operators defined on elements of posets and lattices; each yields a unique value, and they are dual operations of each other.  In a poset the meet and join of two elements may or may not exist; in a bounded lattice they always exist.  Join is usually denoted with the "or" symbol ∨ and meet with the "and" symbol ∧.  Join is equivalent to supremum and meet to infimum. They are defined as:


x ∨ y = sup{x,y}

x ∧ y = inf{x, y}


The following describes their relation with ordering:


x ≤ y  ⇔ x ∨  y = y

x ≤ y  ⇔ x ∧  y = x


As you can see above, when two elements are comparable, meet and join reduce to the appropriate minimum or maximum element. When two elements are incomparable, as in the picture of the join {a} ∨ {c} below, the lattice lines show how the join moves up to the element {a,c} that is above both elements. Not coincidentally, the join on this lattice is the set union operation ∪; this join can be expressed as {a,c} = {a} ∪ {c}, which is right out of set theory.



And the dual relation is {a} ∩ {c} = ∅:



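Join and meet can be computed directly from the definitions as least upper bound and greatest lower bound; a Python sketch over the powerset lattice, where they coincide with union and intersection (helper names are my own, and None signals a missing bound in a general poset):

```python
from itertools import chain, combinations

def powerset(S):
    items = list(S)
    return {frozenset(c) for c in chain.from_iterable(
        combinations(items, r) for r in range(len(items) + 1))}

P = powerset({"a", "b", "c"})
R = {(x, y) for x in P for y in P if x <= y}  # subset order

def join(P, R, x, y):
    """sup{x, y}: the least of the common upper bounds, if it exists."""
    uppers = [z for z in P if (x, z) in R and (y, z) in R]
    least = [u for u in uppers if all((u, v) in R for v in uppers)]
    return least[0] if least else None

def meet(P, R, x, y):
    """inf{x, y}: the greatest of the common lower bounds, if it exists."""
    lowers = [z for z in P if (z, x) in R and (z, y) in R]
    greatest = [l for l in lowers if all((v, l) in R for v in lowers)]
    return greatest[0] if greatest else None

a, c = frozenset({"a"}), frozenset({"c"})
assert join(P, R, a, c) == a | c   # {a} ∨ {c} = {a} ∪ {c} = {a,c}
assert meet(P, R, a, c) == a & c   # {a} ∧ {c} = {a} ∩ {c} = ∅
```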

Semilattices


A semilattice defines either a meet or a join operation over an order, resulting in either a meet semilattice or a join semilattice; technically a lattice is both of these combined. A good example is a tree, which has a single root node, let’s call it a lower bound, aka an infimum. A common example is the prefix tree for strings; in this case we use binary strings of 0s and 1s up to length three:



In this example the order relation is "is a prefix of".  As you can see, you can define a meet on any two elements, with the empty string λ serving as the semilattice bottom ⊥. Also, as you can see, there are no joins on this order.  Pretty much any tree can be viewed as a semilattice, with descent from the root node through all of the other nodes serving as the ordering relation.
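The meet in this prefix-order semilattice is just the longest common prefix of two strings; a small Python sketch (the function name is my own):

```python
def prefix_meet(s, t):
    """Meet in the "is a prefix of" order: the longest common prefix."""
    i = 0
    while i < min(len(s), len(t)) and s[i] == t[i]:
        i += 1
    return s[:i]

print(prefix_meet("010", "011"))    # 01
print(repr(prefix_meet("0", "1")))  # '' -- the empty string λ is the bottom ⊥
```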



Lattices are Algebraic


One of the interesting and sometimes confusing things about lattices is that, while they are structural objects, they are also algebraic.  Meet and join follow a number of algebraic rules.  In general a lattice can be defined as an algebra L = (S, ∧, ∨) satisfying the following rules for all x, y, z ∈ S:


Idempotent

x ∧ x = x

x ∨ x = x,


Commutative

x ∧ y = y ∧ x

x ∨ y = y ∨ x,


Associative


x ∧ (y ∧ z) = (x ∧ y) ∧ z

x ∨ (y ∨ z) = (x ∨ y) ∨ z,


Absorption

x ∧ (x ∨ y) = x

x ∨ (x ∧ y) = x.


The first three pairs of axioms say that L is both a meet and join semilattice. The fourth pair (the absorption laws) say that both operations induce the same order on L.


Since we are dealing with bounded lattices, we extend the above laws to include the following for the top and bottom elements; here we represent the bottom element ⊥ as 0 and the top element ⊤ as 1.  For a bounded lattice the algebra is defined as L = (S, ∧, ∨, 0, 1) with the following laws for 0 and 1:


x ∨ 0 = x

x ∧ 1 = x

x ∧ 0 = 0

x ∨ 1 = 1



For our powerset lattice above the algebra is (2^S, ∩, ∪, ∅, S), with the empty set as the identity for the union operation and the set S itself as the identity for intersection.
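These laws can be verified mechanically for a small powerset lattice; a Python sketch that brute-force checks the absorption, commutativity, and bound laws over all pairs of subsets of a three-element set:

```python
from itertools import chain, combinations

S = {"a", "b", "c"}
subsets = [set(c) for c in chain.from_iterable(
    combinations(sorted(S), r) for r in range(len(S) + 1))]

for x in subsets:
    # laws for the bounds of (2^S, ∩, ∪, ∅, S)
    assert x | set() == x and x & S == x
    assert x & set() == set() and x | S == S
    for y in subsets:
        assert x & (x | y) == x   # absorption
        assert x | (x & y) == x
        assert x & y == y & x     # commutativity
print("all laws hold")
```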



Other Lattice Properties and Operations


If you take our powerset example above, you might have noticed that there are some more algebraic properties of this lattice that are not listed above.  The first is the distributive property.  While the powerset lattice does have the distributive property, you should note that not all lattices do.  The distributive property is defined as:


Distributive Law

x ∨ (y ∧ z) = (x ∨ y) ∧ (x ∨ z)

x ∧ (y ∨ z) = (x ∧ y) ∨ (x ∧ z)


Another property which can be found in certain lattices, and which also applies to our powerset lattice, is complementation, which means that for every element x the lattice has a complement y under both the meet and join operations:


x ∧ y = 0

x ∨ y = 1


Some lattices which are complemented can have an additional structure known as orthocomplementation, which has the following laws:


Complement Law

x ∧ x^c = 0

x ∨ x^c = 1


Involution Law

(x^c)^c = x


Order Reversing

if x ≤ y then y^c ≤ x^c


Our powerset lattice, which is also a Boolean algebra, also follows De Morgan's laws:


De Morgan's Laws

(x ∨ y)^c = x^c ∧ y^c

(x ∧ y)^c = x^c ∨ y^c


Some lattices have a property known as modularity, which obeys the following law, similar to the distributive law:


Modular law

x ≤ z implies x ∨ (y ∧ z) = (x ∨ y) ∧ z




I am including modularity for completeness; I am not discussing it further or including examples.  Distributivity, modularity, and complementation have many intricacies, and there is much theory and complexity surrounding how they work and where they show up; this becomes quickly obvious if you dig into more resources on lattices.



The Algebraic Structure of the Powerset Lattice


With the above algebraic laws of complemented (orthocomplemented) bounded lattices defined, I figured it might be nice to list out these laws as they apply to our powerset lattice. If we take the powerset 2^S of any set S, the empty set ∅ is the lower bound and the set S itself is the upper bound. The powerset lattice has the following laws:


Idempotent

x ∩ x = x

x ∪ x = x


Commutative

x ∩ y = y ∩ x

x ∪ y = y ∪ x


Associative

x ∩ (y ∩ z) = (x ∩ y) ∩ z

x ∪ (y ∪ z) = (x ∪ y) ∪ z


Absorption

x ∩ (x ∪ y) = x

x ∪ (x ∩ y) = x


Distributive

x ∪ (y ∩ z) = (x ∪ y) ∩ (x ∪ z)

x ∩ (y ∪ z) = (x ∩ y) ∪ (x ∩ z)


Identity

x ∪ ∅ = x

x ∩ S = x

x ∩ ∅ = ∅

x ∪ S = S


Complement Law

x ∩ x^c = ∅

x ∪ x^c = S


Involution Law

(x^c)^c = x


Order Reversing

if x ⊆ y then y^c ⊆ x^c


De Morgan's Laws

(x ∪ y)^c = x^c ∩ y^c

(x ∩ y)^c = x^c ∪ y^c
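As with the basic lattice laws, these Boolean-algebra laws can be brute-force checked for a small powerset; a Python sketch where the complement is taken relative to S:

```python
from itertools import chain, combinations

S = frozenset({"a", "b", "c"})
subsets = [frozenset(c) for c in chain.from_iterable(
    combinations(sorted(S), r) for r in range(len(S) + 1))]

def comp(x):
    # the set complement x^c, taken relative to S
    return S - x

for x in subsets:
    assert x & comp(x) == frozenset()  # complement law
    assert x | comp(x) == S
    assert comp(comp(x)) == x          # involution
    for y in subsets:
        assert comp(x | y) == comp(x) & comp(y)  # De Morgan
        assert comp(x & y) == comp(x) | comp(y)
        if x <= y:
            assert comp(y) <= comp(x)  # order reversing
print("Boolean algebra laws verified")
```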



The D4 Dihedral Subgroup Lattice


This topic might be filed under advanced examples, so feel free to skip it if you’ve had enough for the moment.  In doing my research I came across the following lattice diagram: it is the lattice of subgroups of the dihedral group D4.  I looked at cyclic groups and subgroups, including the cyclic subgroups of D4, in a previous post on abstract algebra.  I have modified the diagram to better show the identity element, marked in blue. It also includes the names of well-known isomorphic groups that correspond to each subgroup.



This lattice shows some really interesting things about the dihedral group. The top element is the D4 group itself and the bottom element is the trivial Z1 group consisting of only the identity. As you can see, the center forms a chain consisting of the cyclic groups Z1, Z2, and Z4, topped by D4 itself (which is not cyclic). The antichain of five subgroups isomorphic to Z2 are each generated by an involution, an operation which, like negation, yields the original value if applied twice: on the left are the horizontal (H) and vertical (V) reflection involutions, in the middle is the 180-degree rotation (R2) involution, and on the right are the two diagonal (D1, D2) reflection involutions.

The vertical and horizontal reflections on the left, together with the rotation involution, form the subgroup {R0, V, H, R2}, and similarly on the right the subgroup {R0, D1, D2, R2} is formed. Both of these are isomorphic to what is known as the V group, aka the four-group, aka the Klein group, aka the Vierergruppe. I am not sure what all of this means but I find it fascinating. Also please note that there is a minor nomenclature conflict, as V refers both to the Klein group and to the vertical-reflection element of D4, but the context of each is mutually exclusive; the same goes for Dn, where D1 and D2 are elements and D4 is a group. Additionally, this lattice gives a nice visual depiction of Lagrange's theorem, which states that the order of each subgroup evenly divides the order of the group; therefore, as in the lattice, a group of order 8 only has subgroups of orders 8, 4, 2, and 1.


Some Notes on Notation


In math, notation can vary, and order theory has several variants of notation that it helps to be aware of.  First, the following sets of symbols are often used in relation to orders:



Sometimes these choices come down to a particular author's preference; other times combinations of them are used, such as when discussing things like Galois connections, which are mappings between two different orders; in such cases multiple sets of the above symbols might be needed to differentiate between specific instances of different orders.  Usually the set-notation symbols are not used for general lattices, but they sometimes are, and in our examples above both meanings apply.  Also, the following symbols get used to describe comparison as well:



Another set of symbols that shows up is used to describe covers, where one element immediately precedes or immediately succeeds another, in that there are no elements between the two items compared:




Additional Topics


The topics of lattice theory and order theory are extensive, and this post really just scratches the surface.  When you feel comfortable, or even if you don’t, depending on your learning style, you may want to explore further: there are many things that I did not cover.  There are order-specific concepts like strict orders and asymmetry. Complete partial orders show up a lot in CS as well.  For lattices there are things that I only mentioned, like modularity and Galois connections; there are functions on lattices, which include ideas like monotone, antitone, and isotone maps.  New lattices can be created by a number of techniques, like the Cartesian product of two lattices.  And of course there are many interesting results relating to fixed points; the list goes on.


In my next post I will look at some specific examples of the ideas discussed here in everyday programming, along with some more theoretical examples as well.



References and Resources


There are a number of books on the subject, including the previously mentioned and recently updated Lattice Theory: Foundation by George Grätzer and Combinatorics: The Rota Way by Joseph P. S. Kung, Gian-Carlo Rota, and Catherine H. Yan, as well as:


  • Lattices and Ordered Algebraic Structures by T.S. Blyth
  • Introduction to Lattices and Order by B. A. Davey and H. A. Priestley
  • General Lattice Theory (Second Edition) by George Grätzer, B.A. Davey, R. Freese and B. Ganter

If you are just getting into the subject, and even if you are not, I would recommend taking advantage of some good free resources, listed in no particular order:


Lattice Theory with Applications  by Vijay K. Garg

Notes on Lattice Theory[complete pdf here] by J. B. Nation of the University of Hawaii

Chapter 1 - Lattice theory in Vaughan Pratt’s CS 353: Algebraic Logic course handouts

Dave Abrams "Order I Say" blog post, some of the above images are from his post, with his permission, thanks.

Lattice Tutorial by Nenad Jovanovic of Secure Systems Lab

Lecture 7 and lecture 8 of Lionel Levine’s course on Algebraic Combinatorics

Chapter 2 - Ordered Sets and Complete Lattices, A Primer for Computer Science by Hilary A. Priestley of  the Mathematical Institute, University of Oxford

The Many Lives of Lattice Theory by Gian-Carlo Rota

Wolfram’s Mathworld on lattices

28 March 2012

Amazing Abstract Algebra: Groups and Rotations


A little while ago I noticed a very interesting group. I am sure it is well known in math circles, but I find it intriguing for a number of reasons, and I have recently learned more about abstract algebra, which ties in some more interesting ideas. The group relates to imaginary numbers: it is the set {1, -1, i, -i}, which is closed under multiplication. It has an identity, which is 1. Each element has an inverse: -1 is its own inverse, -1 · -1 = 1, and -i · i = 1. The following Cayley table shows all of this:


 ·    1    i   -1   -i
 1    1    i   -1   -i
 i    i   -1   -i    1
-1   -1   -i    1    i
-i   -i    1    i   -1

If you are new to these ideas I recommend looking at my previous post on abstract algebra, especially since we will be looking at some isomorphisms again. In order to talk about the ideas in this post we need to get some group theory concepts out of the way first; don't worry, they aren't hard, in fact they're fairly intuitive. Also, in this post I am restricting the discussion to finite groups, groups with a finite number of elements. The first idea is the concept of group order.

The order of a finite group is simply the number of elements of the set the group is defined on and is denoted as |G|.

The group above has an order of 4.

The next idea is that of exponentiation, which can be thought of as a restriction of "classical" exponentiation on the field of real numbers, see the identities here. Exponentiation on group elements extends to integer exponents only and refers to repeated application of the binary operation to the element itself. So:


a^n = a·a·a·a···a  (a "times" itself n times)

a^-n = a^-1·a^-1·a^-1·a^-1···a^-1  (a^-1 "times" itself n times, i.e. the inverse of a^n)

a^0 = e  (where e is the identity element)

The laws of exponents are:

a^m a^n = a^(m+n)

(a^m)^n = a^(mn)

a^-n = (a^-1)^n = (a^n)^-1


Just as a group has an order, an element of a group also has an order; it is related to exponentiation and is defined as:

If there exists a nonzero integer m such that a^m = e, then the order of the element is defined to be the least positive integer n such that a^n = e. If there does not exist any nonzero integer m such that a^m = e, we say that a has order infinity.

So the elements in our group (⋅, {1,-1,i,-i}) above have the following orders:

1^1 = 1 (this is the identity and has order 1) [1^1 = 1, ...]

(-1)^2 = 1 (has order 2) [(-1)^1 = -1, (-1)^2 = 1, ...]

i^4 = 1 (has order 4) [i^1 = i, i^2 = -1, i^3 = -i, i^4 = 1, ...]

(-i)^4 = 1 (has order 4) [(-i)^1 = -i, (-i)^2 = -1, (-i)^3 = i, (-i)^4 = 1, ...]

Order is also sometimes referred to as period, and you should note that each of the above elements cycles through the powers listed in its square brackets. So for both of the order-4 elements {i, -i}: a^(4n+1) = a^1, a^(4n+2) = a^2, a^(4n+3) = a^3, a^(4n+4) = a^4 for all integers n ≥ 0, and the values cycle through the list, as they do for any element of finite order.

As it turns out, elements like i and -i, which cycle through all of the elements of the group exactly once per period, have an order equal to the order of the group. Such elements define what is known as a cyclic group. So our group is cyclic, and any element that cycles through all the elements once is a generator; thus i and -i are generators of the group.
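Element order and generators can be computed mechanically. Here is a sketch (my own code, not from the post) that computes the order of each element of ({1, -1, i, -i}, ·) and identifies the generators:

```python
def element_order(a, op, identity):
    """Least positive n with a^n = identity."""
    power, n = a, 1
    while power != identity:
        power = op(power, a)
        n += 1
    return n

G = [1, -1, 1j, -1j]
op = lambda x, y: x * y
orders = {a: element_order(a, op, 1) for a in G}
assert orders == {1: 1, -1: 2, 1j: 4, -1j: 4}

# Generators are the elements whose order equals the group order |G| = 4:
generators = [a for a in G if orders[a] == len(G)]
assert set(generators) == {1j, -1j}
```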

Another interesting result from group theory is that any cyclic group of order n is isomorphic to the group (+, Zn), the integers mod n under addition, so our group above is just a renaming of Z4. See the similarity to the Cayley table below:


+ 0 1 2 3
0 0 1 2 3
1 1 2 3 0
2 2 3 0 1
3 3 0 1 2
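The isomorphism with Z4 can be checked mechanically. In this sketch (my own, not from the post) the map phi sends k to i^k, and the check verifies that addition mod 4 maps onto complex multiplication:

```python
# phi(k) = i**k carries (Z4, +) onto ({1, -1, i, -i}, *)
phi = {0: 1, 1: 1j, 2: -1, 3: -1j}

for a in range(4):
    for b in range(4):
        # phi(a + b mod 4) must equal phi(a) * phi(b) for a homomorphism
        assert phi[(a + b) % 4] == phi[a] * phi[b]
```

Since phi is also a bijection between the two four-element sets, it is an isomorphism.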

Now I want to take a look at another important group, the dihedral group. This one is known as D4 and has the Cayley table:

  R0 R1 R2 R3 V H D1 D2
R0 R0 R1 R2 R3 V H D1 D2
R1 R1 R2 R3 R0 D1 D2 H V
R2 R2 R3 R0 R1 H V D2 D1
R3 R3 R0 R1 R2 D2 D1 V H
V V D2 H D1 R0 R2 R3 R1
H H D1 V D2 R2 R0 R1 R3
D1 D1 V D2 H R1 R3 R0 R2
D2 D2 H D1 V R3 R1 R2 R0

The dihedral group is the group of symmetries of a regular polygon, including both rotations and reflections. In this case we consider the rotations and reflections of a square, denoted D4, represented visually as four counterclockwise rotations of a square:



And the following four reflections, vertical, horizontal, and two diagonal reflections:




Each image includes the corresponding permutation; in fact dihedral groups are permutation groups, and D4 is a set of 8 permutations, hence the order of the group is 8. It is a subgroup of the group of all permutations of four items, known as the Symmetric Group S4, which has order 4! = 24. The interesting thing here is that the rotations form a subgroup that is isomorphic to our group from above and to Z4. In a sense this isn't really surprising, since it is a cyclic group of order four: it cycles through 4 elements just as a square cycles through four rotations.
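Here is a sketch (my own construction, not from the post) of D4 as permutations of the square's four corners, generated from one rotation and one reflection, confirming |D4| = 8 and that the four rotations sit inside it:

```python
def compose(p, q):
    """Compose two permutations given as tuples: apply q first, then p."""
    return tuple(p[q[i]] for i in range(len(q)))

identity = (0, 1, 2, 3)
r = (1, 2, 3, 0)  # rotate the square by 90 degrees
v = (3, 2, 1, 0)  # one of the reflections

# Generate the whole group from the two generators r and v:
group = {identity}
frontier = [identity]
while frontier:
    p = frontier.pop()
    for g in (r, v):
        q = compose(g, p)
        if q not in group:
            group.add(q)
            frontier.append(q)

assert len(group) == 8  # the order of D4

# The rotations {R0, R1, R2, R3} form a subgroup of order 4:
rotations = {identity, r, compose(r, r), compose(r, compose(r, r))}
assert rotations <= group and len(rotations) == 4
```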


As you can see from the color coding, the elements {R0, R1, R2, R3} form a subgroup; a subgroup is just a subset of the elements that is still a group. These subgroup elements, forming the upper left hand corner of the table, are color coded to show the isomorphism between this subgroup, Z4, and our group (⋅, {1,-1,i,-i}). Also, R0 is the identity, which makes sense since it is a zero rotation: either you didn't rotate at all, or you rotated by 360°, aka one tau.


Our Z4-isomorphic group (⋅, {1,-1,i,-i}) also has a subgroup. Our original arrangement of the Cayley table was ordered to show the Z4 isomorphism; to show the subgroup we need to rearrange the Cayley table:


1 -1 i -i
1 1 -1 i -i
-1 -1 1 -i i
i i -i -1 1
-i -i i 1 -1

Now it should be fairly easy to see the subgroup (⋅, {1,-1}), again forming the upper left corner of the table above, or shown by itself:


1 -1
1 1 -1
-1 -1 1

Our subgroup is indeed a group: 1 is the identity and -1 is its own inverse. In fact all subgroups of cyclic groups are themselves cyclic, and since all cyclic groups are isomorphic to Zn groups, this is a group we have seen before: it's just Z2 addition, aka exclusive-or.

There are also some remarkable connections to number theory regarding the orders of subgroups and elements. For example, the order of our subgroup is 2, which evenly divides 4, the order of the group. This always holds: by Lagrange's theorem the order of a subgroup always evenly divides the order of the group, and consequently the order of each element also divides the order of the group. I suspect that these ideas have profound implications.
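The element-order half of that claim is easy to verify on our group directly (a sketch of my own, not from the post):

```python
G = [1, -1, 1j, -1j]

def element_order(a):
    """Least positive n with a^n = 1 under complex multiplication."""
    power, n = a, 1
    while power != 1:
        power, n = power * a, n + 1
    return n

# Lagrange: every element's order divides the group order |G| = 4.
# Orders here: 1 -> 1, -1 -> 2, i -> 4, -i -> 4.
assert all(len(G) % element_order(a) == 0 for a in G)
```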


Going back to our original group, there are a couple of interesting things about imaginary numbers, starting with the following equations:





If we consider the sequence of raising i to successive powers, multiplying by i each time starting with 1, we get the sequence (1, i, -1, -i). These are actually coordinates in the complex plane and can be written as (0i+1, 1i+0, 0i-1, -1i+0). This cycle (increasing powers of i) is the following (counterclockwise) rotation in the complex plane:



Conversely, if we divide, i.e. repeatedly multiply by -i, we get the sequence (1, -i, -1, i), corresponding to the complex numbers (0i+1, -1i+0, 0i-1, 1i+0), which is the opposite (clockwise) rotation:



Another interesting, seemingly related concept is the turning number or winding number of a curve: the red circle with clockwise rotation corresponds to a curve with winding number -1, and the blue curve, with positive increasing powers of i, has winding number 1.
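The two rotation sequences above can be checked directly; this little sketch (the helper name `cycle` is mine) walks through repeated multiplication by i and by -i:

```python
def cycle(step, start=1, n=4):
    """Return the first n values obtained by repeatedly multiplying by step."""
    seq, z = [], start
    for _ in range(n):
        seq.append(z)
        z *= step
    return seq

assert cycle(1j) == [1, 1j, -1, -1j]   # counterclockwise: 1, i, -1, -i
assert cycle(-1j) == [1, -1j, -1, 1j]  # clockwise: 1, -i, -1, i
```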


As we saw, our group is cyclic, isomorphic to Z4, and isomorphic to the rotation subgroup of the dihedral group D4, and both generators of our group correspond to rotations in the complex plane: counterclockwise rotation corresponds to the period of i and clockwise rotation to the period of -i.

29 January 2012

Abstract Algebra: The Logic of Odd and Even




Steven Strogatz wrote a series of math related articles on the New York Times website, one of which is called "Rock Groups". In it he looks at adding integers visually, with respect to odd and even, ventures into the territory of triangles and squares, and visually portrays the closed formula for computing triangular numbers. My previous post on triangular numbers in some ways parallels his "Rock Groups" article, at least with respect to the visual portrayal of triangular numbers. This post also parallels ideas in that article, but in contrast to his geometric observations, these are algebraic observations about, among other things, adding and multiplying even and odd numbers, which work as follows:



Addition (+):

Even + Even = Even
Odd + Even = Odd
Odd + Odd = Even

Multiplication (*):

Even * Even = Even
Odd * Even = Even
Odd * Odd = Odd




Now this doesn't really give one a sense of what is going on here. However, by using a Cayley Table, which can also be called an operation table since it shows the details of an operator, it becomes more interesting: we now have a better depiction of the Group for Addition (+) with Even as the Identity Element and the Monoid for Multiplication (*) with Odd as the Identity Element:



+     Even  Odd
Even  Even  Odd
Odd   Odd   Even

*     Even  Odd
Even  Even  Even
Odd   Even  Odd



Now that still may not be very intuitive; however, if you convert Even to Zero (0) and Odd to One (1), you get the following, possibly more familiar, Group with 0 as the identity and Monoid with 1 as the identity:



+  0  1
0  0  1
1  1  0

*  0  1
0  0  0
1  0  1



In the Group, 1 is its own inverse: 1 + 1 = 0. In general these "types" of algebraic structures consist of a binary operator and a set that is closed under the operation, so I am using the notation (Operator, {Set}) to denote them; since the sets are small I can just list the elements. You could also write (+, Z) where Z is the set of integers, and in fact this is a well known group, addition on the set of integers, although that set is infinite in contrast to our finite sets. We can describe the above as the Group (+, Z2) and the Monoid (*, Z2), where Z2 is the set {0,1} of the integers modulo 2, or simply the binary digits, base 2.
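These claims can be checked straight from the definitions. Here is a sketch (the helper names `is_monoid` and `is_group` are mine, not from the post) verifying that (+, Z2) is a group while (*, Z2) is only a monoid:

```python
Z2 = [0, 1]
add = lambda a, b: (a + b) % 2  # addition mod 2
mul = lambda a, b: (a * b) % 2  # multiplication mod 2

def is_monoid(S, op, e):
    closed = all(op(a, b) in S for a in S for b in S)
    assoc = all(op(op(a, b), c) == op(a, op(b, c)) for a in S for b in S for c in S)
    ident = all(op(e, a) == a == op(a, e) for a in S)
    return closed and assoc and ident

def is_group(S, op, e):
    has_inverses = all(any(op(a, b) == e for b in S) for a in S)
    return is_monoid(S, op, e) and has_inverses

assert is_group(Z2, add, 0)      # 1 is its own inverse: 1 + 1 = 0
assert is_monoid(Z2, mul, 1)
assert not is_group(Z2, mul, 1)  # 0 has no multiplicative inverse
```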


Now let's do another transformation on our Group and Monoid: we map Zero (0) to False (F) and One (1) to True (T). We also change our operators: + becomes Exclusive-Or and * becomes Logical Conjunction, aka Logical And.



Xor  F  T
F    F  T
T    T  F

And  F  T
F    F  F
T    F  T



All of these transformations are actually Isomorphisms, and the two structures combine to form a Ring, which is also a Field: the smallest Finite Field, GF(2). I have a follow up for this post in which I will get into these and other ideas in more detail.
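GF(2) in miniature: this sketch (mine, not from the post) checks the distributive law tying the two operations together as a ring, plus the field property that the only nonzero element is invertible:

```python
F = [0, 1]
xor = lambda a, b: a ^ b    # ring addition
land = lambda a, b: a & b   # ring multiplication (logical And on bits)

# Distributivity of And over Xor, one of the ring axioms:
assert all(land(a, xor(b, c)) == xor(land(a, b), land(a, c))
           for a in F for b in F for c in F)

# Field property: the only nonzero element, 1, is its own multiplicative inverse.
assert land(1, 1) == 1
```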


Some basic algebraic structures are:


Magma or Groupoid – A closed binary operation on a set.


Semigroup – A closed binary operation on a set, with associativity. (An associative Groupoid)


Monoid – A closed binary operation on a set, with associativity and an identity element. (A Semigroup with an identity element)


Group – A closed binary operation on a set, with associativity, an identity element, and an inverse for each element. (A Monoid with inverses)


Commutative (Abelian) – used to describe any of the above with the commutative property; the two terms are interchangeable, but the term abelian1 in abstract algebra is used mostly just for groups.




 

           Closed  Associative  Identity  Inverse
Magma        X
Semigroup    X          X
Monoid       X          X           X
Group        X          X           X         X
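The table above can be turned into a small classifier. This sketch (names mine, not from the post) tests each property in turn against a finite operation table:

```python
def classify(S, op):
    """Classify a finite (S, op) as magma, semigroup, monoid, or group."""
    if not all(op(a, b) in S for a in S for b in S):
        return "not closed"
    if not all(op(op(a, b), c) == op(a, op(b, c)) for a in S for b in S for c in S):
        return "magma"
    # Look for an identity element e with e*a = a = a*e for all a:
    e = next((x for x in S if all(op(x, a) == a == op(a, x) for a in S)), None)
    if e is None:
        return "semigroup"
    if all(any(op(a, b) == e for b in S) for a in S):
        return "group"
    return "monoid"

assert classify([0, 1], lambda a, b: (a + b) % 2) == "group"   # (+, Z2)
assert classify([0, 1], lambda a, b: a * b) == "monoid"        # (*, Z2)
assert classify([0, 1], lambda a, b: a | b) == "monoid"        # (Or, Z2)
```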



Now Commutative is more of a property of the above; as we saw previously, the String Monoid is Non-Commutative, whereas the Group (+, Z2) and the Monoid (*, Z2) are both Commutative.


Going back to Boolean Logic: if you have had any exposure to it, things may now look amiss in regards to operators, such as: what about the Or operator? Actually Xor can be defined in terms of Logical Conjunction, Logical Disjunction, and negation as follows:

A Xor B = (A Or B) And Not (A And B)
Our Cayley Table for Logical Disjunction (Or) is:



Or  F  T
F   F  T
T   T  T



Now this table is almost the same as Xor, with one exception: (T Or T) = T vs. (T Xor T) = F. We can still call F an identity, since it combines with T to give T, but you can see that with that one change we lost our inverse for T, so this is now "downgraded" to Monoid status.
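The lost inverse can be checked directly (a one-line sketch of my own): there is no value b with (T Or b) = F, so T has no inverse under Or.

```python
# No b in {True, False} satisfies (True or b) == False,
# which is exactly why (Or, {F, T}) is only a monoid, not a group.
assert not any((True or b) == False for b in (True, False))
```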


Now if we add Logical Conjunction (And) back, we get the familiar values in Cayley tables:



Or  F  T
F   F  T
T   T  T

And  F  T
F    F  F
T    F  T



Above we noted that the Group (+, Z2) and the Monoid (*, Z2) formed a Ring. Since Or is only a Monoid, these operators instead combine to form a slightly different structure, a Semiring.


Another interesting observation about these sets of operators comes from a logical standpoint. The combination of these two operators is not functionally complete, meaning there exist truth functional statements that cannot be expressed by these two operators alone. This can be fixed by adding negation, which yields a truth functionally complete set of two dyadic and one monadic2 operators: {And, Or, Not}, a pretty well known and familiar set of logical operators. The Ring operators from above, {Xor, And}, also form a functionally complete set. It turns out that just {And, Not} and just {Or, Not} are also each functionally complete, as are the single-operator sets {Nand} and {Nor}, which is why digital logic circuits can be constructed solely with Nand gates.
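To see why {Nand} alone suffices, here is a sketch of the standard constructions (not from the post) building Not, And, and Or out of Nand and checking them exhaustively:

```python
nand = lambda a, b: not (a and b)

not_ = lambda a: nand(a, a)                      # Not a  = a Nand a
and_ = lambda a, b: nand(nand(a, b), nand(a, b)) # a And b = Not(a Nand b)
or_  = lambda a, b: nand(nand(a, a), nand(b, b)) # a Or b  = (Not a) Nand (Not b)

for a in (False, True):
    assert not_(a) == (not a)
    for b in (False, True):
        assert and_(a, b) == (a and b)
        assert or_(a, b) == (a or b)
```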


The thing that bothered me the most was that And and Xor had this nice mapping from Boolean Logic on {T,F} to binary integer arithmetic on {0,1}, but Or seemed to be an oddball. Then I realized that you can remap both Logical Conjunction and Logical Disjunction to the numerical functions Min and Max, respectively:



Max(0,0) = 0
Max(0,1) = Max(1,0) = 1
Max(1,1) = 1

Min(0,0) = 0
Min(0,1) = Min(1,0) = 0
Min(1,1) = 1



In Cayley table form, which better shows the isomorphisms (≅ means isomorphic to) from our Cayley tables above: (And, {T,F}) ≅ (Min, Z2) and (Or, {T,F}) ≅ (Max, Z2):



Max  0  1
0    0  1
1    1  1

Min  0  1
0    0  0
1    0  1



Interestingly, this mapping is effectively the definition of Fuzzy Logic restricted to the discrete (two valued) case. Boolean Logic also maps to Lattices, with disjunction being join and conjunction being meet; the min and max ideas give a nice intuition for the ordering.
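The isomorphisms in the tables above can be checked mechanically (a sketch of my own): mapping F to 0 and T to 1 carries And to Min and Or to Max.

```python
to_bit = {False: 0, True: 1}

for a in (False, True):
    for b in (False, True):
        # Conjunction corresponds to Min, disjunction to Max:
        assert to_bit[a and b] == min(to_bit[a], to_bit[b])
        assert to_bit[a or b] == max(to_bit[a], to_bit[b])
```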


To recap, we saw a number of Group and Monoid isomorphisms (remember Z2 = {0, 1}):



(+, Z2) ≅ (+, {Odd, Even})


 (*, Z2) ≅ (*, {Odd, Even})


(+, Z2) ≅ (Xor, {F, T})


 (*, Z2) ≅ (And, {F, T})


(Or, {F, T}) ≅ (Max, Z2)


 (And, {F, T}) ≅ (Min, Z2)



We also saw Ring Isomorphisms and a Semiring Isomorphism:



[(+, Z2), (*, Z2)] ≅ [(+, {Odd, Even}), (*, {Odd, Even})]


[(+, Z2), (*, Z2)] ≅ [(Xor, {F, T}), (And, {F, T})]


[(Or, {F, T}), (And, {F, T})] ≅ [(Max, Z2), (Min, Z2)]



I have always felt that examples are helpful, and I think these give some sense of the diversity that can be found in just one small corner of math. In researching this I came across some interesting facts which also highlight this diversity. For example, the Nand operation is not associative but is commutative, so it's a commutative Groupoid. The Octonions, which are hypercomplex numbers, are neither associative nor commutative under multiplication but have an identity of 1, making them a non-commutative Groupoid with an identity. So my point here is that the above table of basic algebraic structures, which makes everything look so hierarchical, is deceiving: the diversity does not seem to be constrained in any way, as far as I can tell.


1 This term is named after the Norwegian mathematician Niels Henrik Abel, who developed group theory independently of Galois and, like Galois, tragically died young (before 30).


2 Dyadic means takes two operands, e.g. And and Or, and monadic means takes one operand, in this case the negation (Not) operator; this should not be confused with other uses of the term Monad.

19 June 2011

Free Math Resources You Can Use



In my opinion, this may actually be the best time in human history to learn math, or just about anything for that matter. If you are of sufficient means to have access to the internet, which unfortunately not all people are, you have access to an unprecedented amount of information. The problem is the overwhelming amount of information out there and how to find it. So I want to share some of my tricks for finding good math information, which I call "Math Mining"; of course these can be used for any academic information, so "Academic Mining" may be more apropos.

One of the reasons the present is a good time to learn math is the diversity of sources of information. Wikipedia is one of those sources (Wiki Surfing, as has been previously discussed), but that is just the tip of the math iceberg. The really cool thing is that there are many resources that can give you entirely new and fresh perspectives on things that may sometimes seem dull and obfuscated by more traditional approaches found in books, not that there aren't a lot of great books too. It really is an exciting time.

Many professors have their publications, notes, and course resources freely available on the internet, and some of these include full books in pdf, ps, or html format. That leads me to my little Google trick. Let's assume you want to find some information about a math subject; we'll use Linear Algebra as an example. Use the Google search:

"Linear Algebra" inurl:pdf

You will get a lot of hits that are academic pages; these will be a mix of publications and course related material. Once you find a document, you can use its url to find more information. For example, let's say our search leads us to the following (fictitious) url:

www.math.umars.edu/~hjfarnsworth/math420/fall2010/chapter10.pdf

After you click the link and get your reward, you should realize that this is a potential gateway to much more information. Now admittedly this might be seen as a moral gray area: sometimes I get the feeling that some of these resources are not meant to be as openly exposed as they are, and the instructors may not want to openly share their work, practicing security through obscurity. In my opinion, though, if directory browsing is enabled and/or your documents are indexed by Google, then they're fair game; so if you are someone this applies to, I suggest that you either share your work openly or lock it down. I encourage anyone who is sharing their work to do it freely and openly regardless of whether people are taking your classes. After all, it's for the greater good. "The greater good." And if you openly share it then people like me can read it, learn it, know it, and talk about how awesome you and your work are. It's a win-win.

Hack the Site

Hack #1 Url Suffix Removal

Removing "chapter10.pdf" yields www.math.umars.edu/~hjfarnsworth/math420/fall2010/, which will expose more resources if the directory has browsing enabled or a default page. You can progressively remove directories to find one that is useful; sometimes it is worth jumping directly to www.math.umars.edu/~hjfarnsworth/, which will often be a professor's home page yielding links to publications, course pages with documents, and other potentially interesting information.

Hack #2 File Name Enumeration

So you looked at chapter10.pdf and it's awesome, but Hack #1 did not yield it or the related chapters. Given the naming convention, try www.math.umars.edu/~hjfarnsworth/math420/fall2010/chapter09.pdf or www.math.umars.edu/~hjfarnsworth/math420/fall2010/chapter9.pdf; often this approach will yield other related documents.
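Hack #2 can be scripted. This sketch just generates candidate chapter URLs from the observed naming convention; the base URL and the chapter range are hypothetical, matching the fictitious example above:

```python
# Hypothetical base directory from the (fictitious) example above:
base = "www.math.umars.edu/~hjfarnsworth/math420/fall2010/"

candidates = []
for n in range(1, 16):
    candidates.append(f"{base}chapter{n:02d}.pdf")  # zero-padded: chapter09.pdf
    candidates.append(f"{base}chapter{n}.pdf")      # unpadded: chapter9.pdf

assert f"{base}chapter09.pdf" in candidates
assert f"{base}chapter9.pdf" in candidates
```

You would then try fetching each candidate and keep the ones that respond.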

Hack #3 Invoke the Power of Google

Let's say hack #1 didn't work and the resultant url had a random characteristic like:

The following Google search will ferret out those pesky hard to find pdf's:

site:www.math.umars.edu/~hjfarnsworth/ inurl:pdf

Also you can use .ps and .ps.gz in place of .pdf for file type searches. If you feel that this is crossing some kind of moral line then don't do it, but I like to say all is fair in Love and Math.

I would like to give another example of this technique. I recently came across "Mapreduce & Hadoop Algorithms in Academic Papers (4th update - May 2011)", which linked to "Max-cover algorithm in map-reduce", which caught my interest. Of course the ACM is charging for it, but no worries, there is usually no need to pay them; actually I recommend boycotting them. I employed the above tricks but they didn't work; simply Googling one of the authors did (always pick the most unique name(s)):

"Flavio Chierichetti"

This pulled up his web site, which had a free copy of the paper; now all I have to do is find the time to read it. The above techniques also yielded the paper's "cliff notes".

Of course you can just look up someone by name, for example, you can find some of Donald Knuth's publications here.

In regards to academic publications there are two excellent repositories with a wealth of information: Citeseer, out of Penn State (this site can be a little flaky in terms of availability, at least in my experience), and arXiv, run by Cornell University. These mostly contain research oriented work, but you can often find relevant information even as a neophyte; a lot of advanced papers, and books for that matter, start out with introductory sections that can be worth looking at.

Encyclopedic and other Miscellaneous Resources

Wikipedia, obviously, as previously mentioned. Also the oft controversial Stephen Wolfram provides an excellent resource called Wolfram Mathworld.

Project Euler is a site dedicated to collaboratively solving math oriented problems programmatically; more about it can be found here.

Math on the Web, by category, here provides some interesting links; I believe this is run by the American Mathematical Society, but I am not sure.


The National Institute of Standards and Technology site: NIST Digital Library of Mathematical Functions.

Also there is Mathoverflow which is a Stackoverflow type of question and answer community devoted to Math.

Blogs

There are a number of blogs about both math and programming related math. If your primary interest is machine learning, I recommend Bradford Cross's Measuring Measures blog; it is hard to find things on his site, and it was recently restyled with a magenta/maroon background which I now find a bit harder to read. The relevant links here are: Learning About Network Theory, Learning About Statistical Learning, and Learning About Machine Learning, 2nd ed. Additionally, Ravi Mohan did a follow-up: Learning about Machine Learning.

Good Math Bad Math by Mark Chu-Carroll has lots of good articles about math, including some for beginners in various areas. Catonmat by Peteris Krumins has some nice entries with notes on the online MIT courses he has worked through, currently covering Algorithms and the Linear Algebra course mentioned above. The Unapologetic Mathematician has a lot of nice articles, though it is a bit more advanced. Math-Blog has a lot of articles as well, tending to focus on more traditional areas of math. Math blogs abound and there are too many to mention; here are a few:

Lastly I will mention a blog by Jeff Moser, actually he only has a few math related posts, but his Computing your Skill on Probability and Statistics is a beautiful work of art well worth looking at.

Online Courses

The well known Khan Academy offers a number of courses including several math courses.

MIT Open Courseware has many online courses most notably for CS majors Introduction to Algorithms by the venerable Charles Leiserson and Erik Demaine videos here and Linear Algebra by Gilbert Strang.

On Stanford Engineering Everywhere the following might be of interest:

Artificial Intelligence | Machine Learning

Artificial Intelligence | Natural Language Processing

Linear Systems and Optimization | The Fourier Transform and its Applications

The Mechanical Universe is primarily dedicated to physics, but several math topics such as Calculus and Vectors are covered explicitly. It's a nice series of lectures in spite of being a little dated in production values.

Other Online Videos

Two math documentaries covered here are Fermat's Enigma: The Epic Quest to Solve the World's Greatest Mathematical Problem and the overly dramatic but still interesting Dangerous Knowledge.

The story of Maths by Marcus du Sautoy.

Keith Devlin talks about Pascal and Fermat's correspondence while working out probability in this interesting talk: Authors@Google: Keith Devlin.

Bob Franzosa - Introduction to Topology.

The Catsters videos on youtube cover various Category Theory related topics.

N J Wildberger's Algebraic Topology

Dan Spielman has a video discussing Expander Graphs.


Introduction to Game Theory by Benjamin Polak at Yale.


The site videolectures has many lectures in Computer Science and Math including:

If you find these videos too slow this might interest you.

Math Software

There are many math related software packages and libraries three of which are covered in more detail here.

Math library Sage written in Python

GNU Octave

The R project for Statistical Computing

Scilab

Maxima, a Computer Algebra System

Various Books and Academic Stuff


Here are a bunch of interesting courses and books that I have encountered during my searching which you might find interesting as well. These are presented in no particular order:




Algorithms

Algorithms by S. Dasgupta, C.H. Papadimitriou, and U.V. Vazirani.

Jeff Erickson has some Algorithms Course Materials



Steven Skiena author of the The Algorithm Design Manual offers some pretty comprehensive course notes for his cse541 LOGIC for COMPUTER SCIENCE not to mention the opportunity to learn how to bet on Jai-alai in the Cayman Islands.


Gregory Chaitin's Algorithmic Information Theory.



Computer Science


Foundations of Computer Science by Jeffrey Ullman and Al Aho.


The Haskell Road to Logic, Math and Programming by Kees Doets and Jan van Eijck



Discrete Math

Discrete Mathematics with Algorithms by M. O. Albertson and J. P. Hutchinson.




Analysis

Analysis WebNotes is a self-contained course in Mathematical Analysis for undergraduates or beginning graduate students.


Introduction to Analysis Lecture Notes by Vitali Liskevich.

Applied Analysis by John Hunter and Bruno Nachtergaele.


REAL ANALYSIS by Gabriel Nagy.



Probability Theory

Introduction to Probability Theory by Ali Ghodsi.

Introduction to Probability by Charles M. Grinstead.

The first three chapters of Probability Theory: The Logic of Science by E. T. Jaynes. Can be found here.

Think Stats: Probability and Statistics for Programmers by Allen B. Downey.


LECTURE NOTES MEASURE THEORY and PROBABILITY by Rodrigo Bañuelos.


Principles of Uncertainty, published by Chapman and Hall.



Information Theory, Inference, and Learning Algorithms by David MacKay.





Machine Learning/Data Mining

Machine Learning Module ML(M) by M. A .Girolami.

Alexander J. Smola's and and S.V.N. Vishwanathan's draft of Introduction to Machine Learning.

The Elements of Statistical Learning: Data Mining, Inference, and Prediction (Second Edition) by Trevor Hastie, Robert Tibshirani and Jerome Friedman.

Introduction to Information Retrieval by Christopher D. Manning, Prabhakar Raghavan and Hinrich Schütze.

Mining of Massive Datasets by Jeffrey Ullman.



Fourier Theory


Lecture Notes for EE 261 The Fourier Transform and its Applications pdf By Brad Osgood.



Abstract Algebra

Abstract Algebra by Thomas W. Judson.


James Milne has a number of sets of extensive notes on algebraic topics like group theory here.


ABSTRACT ALGEBRA: A STUDY GUIDE FOR BEGINNERS by John A. Beachy.


Elements of Abstract and Linear Algebra Edwin H. Connell.


Abstract Algebra by Elbert A. Walker.


A series of chapters on groups by Christopher Cooper.



Linear Algebra

Linear Algebra by Robert A. Beezer.


A course on Linear Algebra with book chapters.


Really cool interactive tutorial on Singular Value Decomposition by Todd Will.





Model Theory

Fundamentals of Model Theory pdf by William Weiss and Cherie D'Mello.



Set Theory

A book on Set Theory pdf by William Weiss.



Graph Theory

Reinhard Diestel makes his excellent and comprehensive book Graph Theory available, pdf here.



Logic

You can find Introduction to Mathematical Logic by J. Adler, J. Schmid, Model Theory, Universal Algebra and Order by J. Adler, J. Schmid, M. Sprenger and other goodies here.


Introduction to Logic by Michal Walicki.


Logic for Computer Science: Foundations of Automatic Theorem Proving by Jean Gallier.


The Novel Research Institute has a number of free academic books including: Logic and Metalogic: Logic, Metalogic, Fuzzy and Quantum Logics and Algebraic Topology, Category Theory and Higher Dimensional Algebra-Results and Applications to Quantum Physics



Category Theory

Some course notes on Category Theory by Tom Leinster.

Basic Category Theory pdf by Jaap van Oosten.

Abstract and Concrete Categories The Joy of Cats by Jiri Adámek, Horst Herrlich, George E. Strecker.

A gentle introduction to category theory --- the calculational approach pdf by Maarten M. Fokkinga.

Steve Easterbrook's An introduction to Category Theory for Software Engineers.



Algebraic Topology/Topos Theory

Eugenia Cheng of Catsters fame has a course in Algebraic Topology with some substantial notes.

The above links of Eugenia Cheng refer to Algebraic Topology by Allen Hatcher.

An informal introduction to topos theory pdf by Tom Leinster.



Topology

A free, password-protected (password available by request) e-book on topology: Topology without Tears by Sidney A. Morris.

Chapters for a topology course by Anatole Katok can be found here.



Computational Topology

Jeff Erickson has some nice notes on Computational Topology, pdf's can be found on the schedule page.

Afra Zomorodian has some nice resources on Computational Topology including a nice introductory paper.



Spectral Graph Theory

Fan Chung Graham has a lot of interesting stuff, some pretty advanced, relating to graph theory, including social graph theory and spectral graph theory.

Dan Spielman has some course notes on Spectral Graph Theory.



Expander Graphs

Avi Wigderson's Expander Graphs and their Applications.



Fractal Geometry

The Algorithmic Beauty of Plants pdf by Przemyslaw Prusinkiewicz and Aristid Lindenmayer is available on the Algorithmic Botany site.



Game Theory

Thomas S. Ferguson's course at UCLA on Game Theory also Game Theory for Statisticians.


A course in game theory by Martin J. Osborne and Ariel Rubinstein, requires registration.




Algebraic/Enumerative Combinatorics

MIT Open Courseware in Algebraic Combinatorics


An unfinished book and notes on Enumerative Combinatorics by the late Kenneth P. Bogart, also here.


Lionel Levine's notes on Algebraic Combinatorics


Richard P. Stanley new edition of Enumerative Combinatorics Volume one.


A Course in Universal Algebra by Stanley N. Burris and H.P. Sankappanavar


Misc

Pat Hanrahan's CS448B: Visualization.



Sean Luke's "Essentials of Metaheuristics".


It's all a click away

The links in this entry, especially the academic links, are susceptible to link rot; people move from institution to institution or leave academia for jobs in the private sector. I will endeavor to revisit this entry to keep the links up to date and perhaps even add to them. However, if you encounter this page and have any interest in these resources, I recommend downloading them now so that you have them.

Using the resources of this blog you should be able to get your hands on a huge amount of free material on a wide range of topics. This can be helpful if you are on a budget or just want to try before you buy an expensive book on a topic. I hope you avail yourself of some of these; there's lots of great stuff, and if you know of resources that I missed, please add them in the comments.