MRH.io

“Monads,” huh? Bro, have you even read Leibniz?

January 30, 2017

The practice of functional programming hinges on the concept of the “monad”: small functions that lift types from simple to complex. The term itself, however, originates with a 17th-century philosopher named Gottfried Wilhelm Leibniz. He had a much more generalized definition, and understanding his concept may be of use to those engaged in functional programming.

What is a Monad, according to Leibniz?

Leibniz defines a monad as follows:

[a] monad [is] just a simple substance. By calling it simple I mean that it has no parts, though it can be a part of something composite.

This echoes both Descartes’ concept of the ego and Euclid’s definition of a point from The Elements, “that which has no part.” Simply put, Leibniz’ monad is defined as an element of the universe that cannot be subdivided. A Leibniz monad can have qualities, and it must be created or destroyed as a whole, since it has no parts. He uses this to conjure up an argument for the existence of God, but this post won’t go there.

Ties Between Leibniz and Computer Science Monads

Let’s replace Leibniz’ generalized definition with a definition based on computation. Then, let’s replace the substrate of the universe with that of a silicon processor.

To understand this context, we must understand what functional programming is. Functional programming is the practice of programming in which computation is expressed primarily through functions. The idea comes from Alonzo Church’s Lambda Calculus, wherein even things like Boolean values are described as simple functions. In programming, even in “pure” languages like Haskell, we still use variables and values, but in Lambda Calculus, it’s functions all the way down.
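
To make Church’s idea concrete, here is a minimal Haskell sketch of Church Booleans. The names churchTrue and churchFalse are mine, not from any library; the underlying idea, that a Boolean is just a function choosing between two alternatives, is Church’s.

```haskell
-- A Church Boolean is a function that selects one of two alternatives.
churchTrue :: a -> a -> a
churchTrue t _ = t        -- "true" picks the first alternative

churchFalse :: a -> a -> a
churchFalse _ f = f       -- "false" picks the second alternative

-- "if/then/else" becomes plain function application.
main :: IO ()
main = do
  putStrLn (churchTrue  "it was true" "it was false")  -- prints "it was true"
  putStrLn (churchFalse "it was true" "it was false")  -- prints "it was false"
```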

[Image: Leibniz’ counting machine.]

Leibniz probably wasn’t thinking of monads of computation (or “reckoning,” as it was called in the 17th century) when he invented this machine. However, its innards certainly operate on the same functional principles as modern programs.

The monad in the realm of computer science has two definitions, separated by about 30 years:

  1. In the APL and J programming languages, monad simply means “a function that takes only one parameter.”
  2. In the Haskell language, and many other modern languages, the definition is more complex. Here, a monad is also a function, but a special type of function. Monads typically “lift” primitive types (Integer, String, etc.) to more complex types (MutableString, NullableInt); a short Haskell sketch follows below.
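
As a rough illustration of this second definition, here is a small Haskell sketch in which a plain Int is lifted into Maybe Int, Haskell’s rough analogue of the “NullableInt” mentioned above. The function safeDivide is a made-up example, not part of any library.

```haskell
-- "Lifting" a primitive value into a richer type: the result is not a bare
-- Int but a Maybe Int, which also carries the possibility of failure.
safeDivide :: Int -> Int -> Maybe Int
safeDivide _ 0 = Nothing           -- failure lives inside the lifted type
safeDivide x y = Just (x `div` y)  -- success wraps the plain value

main :: IO ()
main = do
  print (safeDivide 10 2)  -- Just 5
  print (safeDivide 10 0)  -- Nothing
```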

How these definitions relate to Leibniz

It is easy to draw a direct line from the first APL/J definition to Leibniz’ monads. However, it goes deeper. Consider this quote from Monadology:

…every momentary state of a simple substance is a natural consequence of its immediately preceding one, so that the present is pregnant with the future.

This begins to correlate with the monad from category theory, and it even sounds a lot like chaining functions. If we introduce the concept of state, as opposed to value, we can define the Haskell monad as that which takes one state and outputs another. Thus, we arrive at a definition of the computer-science monad as a fundamental unit of computation. More Leibniz:

…the only way for monads to begin or end - to come into existence or go out of existence - is instantaneously.

Most of the time, Haskell monads will take one parameter (an argument, or the object itself as a hidden argument) and act upon it, returning a new or elevated state. They are typically very small functions that, in accordance with both functional programming and category theory, only perform one action with no side effects.
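
As a rough Haskell sketch of that pattern (the functions halve and decrement are made-up examples, not from any library), each step below takes one plain value and returns an “elevated” Maybe value, and bind (>>=) feeds each result into the next step:

```haskell
-- Two tiny, side-effect-free steps, each lifting an Int into Maybe Int.
halve :: Int -> Maybe Int
halve n
  | even n    = Just (n `div` 2)
  | otherwise = Nothing

decrement :: Int -> Maybe Int
decrement n
  | n > 0     = Just (n - 1)
  | otherwise = Nothing

-- Each state follows from the one immediately before it:
-- bind (>>=) passes the previous result into the next small function.
main :: IO ()
main = do
  print (Just 10 >>= halve >>= decrement)  -- Just 4
  print (Just 7  >>= halve >>= decrement)  -- Nothing: halve fails, the chain stops
```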

On Composition

When programming functionally, the programmer chains functions together to create compositions, and these compositions are what provide the complexity that our modern programs require. Lambda Calculus can, in a mathematically proven manner, perform any calculation a Turing Machine can. Leibniz seems to have known this intuitively:

A composite thing is just a collection of simple ones that happen to have come together.

When he says “happen to have,” he is talking about his conception of God. Here, we play God by writing our programs and chaining these monads together.
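
As a small illustration of that composition (my own example, using only readMaybe from base’s Text.Read and Kleisli composition, >=>, from Control.Monad), two simple monadic functions come together into one composite validator:

```haskell
import Control.Monad ((>=>))
import Text.Read (readMaybe)

-- Two simple pieces...
parseAge :: String -> Maybe Int
parseAge = readMaybe

checkAdult :: Int -> Maybe Int
checkAdult n
  | n >= 18   = Just n
  | otherwise = Nothing

-- ...that "happen to have come together" into a composite: we did the composing.
validateAge :: String -> Maybe Int
validateAge = parseAge >=> checkAdult

main :: IO ()
main = do
  print (validateAge "42")      -- Just 42
  print (validateAge "12")      -- Nothing
  print (validateAge "banana")  -- Nothing
```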

Who was this Leibniz guy?

Gottfried Wilhelm Leibniz was a 17th century polymath. In addition to philosophy, he was a civil engineer, physicist, and inventor. He eventually died arguing with Newton and Newton’s followers about who invented calculus. His hair also put Brian May to shame.

Food for Thought: Leibniz on Perception

Mostly, we’ve taken a 17th century concept and re-applied it to a modern one. I also want to revisit Leibniz’ original writing in the modern context of Artificial Intelligence and Autonomous Systems. Leibniz uses his monadology to explore perception. Check it out:

It has to be acknowledged that perception can’t be explained by mechanical principles … suppose we could walk into a [mind] as we walk into a mill … all we would find there are cogs and levers and so on pushing one another, and never anything to account for perception. So perception must be sought in simple substances, not in composite things like machines.

What he’s likely describing are emergent properties: simple rules that come together to form complex behavior. This is everywhere in the natural and man-made world, from DNA, to the game of Go, to the very cortical structure of the part of our brains that gives birth to perception.

All in all, very advanced thinking for somebody who lived over 300 years ago.



This is the website of Mark Robert Henderson. He lives in Cape Ann, works in Cambridge, and plays with distributed apps and tech philosophy online.

Mark's social media presence is slowly and deliberately withering away, so the best way to reach him is via e-mail.