alas, poor

Conor McBride

(WG2.8 #26, Frauenchiemsee, 8-12 June 2009)



This slide is nearly ten years old. How times change!










This remains the central point of the case for dependent types. In this talk, I consider validating containers with respect to shape, syntax with respect to type, but most especially, interaction with respect to circumstances.

We still need the revolution, but it seems less controversial these days.


Do the types run off in terror?

No! They're overjoyed at their new articulacy.


Here's a picture of a list, seen as a recursive data structure.


They don't zip so well, sometimes.


But if each node makes clear what length it delivers and what length subnode it is willing to accept,...


...zipping is always spot on.
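The picture can be rendered as a short GHC Haskell sketch (the names Vec, vzip, and vtoList are chosen here for illustration):

```haskell
{-# LANGUAGE DataKinds, GADTs, KindSignatures #-}

-- Type-level natural numbers, promoted by DataKinds.
data Nat = Z | S Nat

-- A vector: a list whose type states its length, so each node
-- makes clear what length it delivers and what length tail it accepts.
data Vec (n :: Nat) a where
  VNil  :: Vec 'Z a
  VCons :: a -> Vec n a -> Vec ('S n) a

-- Zipping vectors of equal length is always spot on:
-- the mismatched cases are ruled out by the types.
vzip :: Vec n a -> Vec n b -> Vec n (a, b)
vzip VNil         VNil         = VNil
vzip (VCons a as) (VCons b bs) = VCons (a, b) (vzip as bs)

-- Forget the length again, for display.
vtoList :: Vec n a -> [a]
vtoList VNil         = []
vtoList (VCons a as) = a : vtoList as
```

Note that vzip needs no clause for a nil meeting a cons: the common length index n makes those cases unutterable.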








It's sensible, and even becoming traditional, to abstract your syntax over the type used to represent variables, pointing to the functorial structure of renaming and the monadic structure of substitution.
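A minimal untyped sketch of that abstraction, in the nested-datatype style of Bird and Paterson (the names Tm and wk are chosen here for illustration):

```haskell
import Control.Monad (ap)

-- Terms abstracted over the type v of free variables.
data Tm v
  = Var v
  | App (Tm v) (Tm v)
  | Lam (Tm (Maybe v))   -- Nothing is the bound variable; Just v, a free one
  deriving (Eq, Show)

-- Renaming: the functorial structure.
instance Functor Tm where
  fmap f (Var v)   = Var (f v)
  fmap f (App s t) = App (fmap f s) (fmap f t)
  fmap f (Lam b)   = Lam (fmap (fmap f) b)

instance Applicative Tm where
  pure  = Var
  (<*>) = ap

-- Substitution: the monadic structure.
instance Monad Tm where
  Var v   >>= s = s v
  App f a >>= s = App (f >>= s) (a >>= s)
  Lam b   >>= s = Lam (b >>= wk s)

-- Push a substitution under a binder, leaving the bound variable alone
-- and weakening the images of the free ones.
wk :: (v -> Tm w) -> Maybe v -> Tm (Maybe w)
wk _ Nothing  = Var Nothing
wk s (Just v) = fmap Just (s v)
```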

I'm just playing the same trick, with typed syntax. We're working in slightly fancier (i.e., slice) categories, but the structure is the same, so the code is the same. You'd expect that if your renaming maps each variable to another of the same (base) type, then deploying it will preserve the (arbitrary) type of any term. And that's what you get, at no extra charge.


It may help to give a candidate for σ :: {Base}→*, representing variables. A common choice is In {Γ}, where Γ :: [Base] represents a context. This definition is at least a decade old — Altenkirch and Reus used it exactly as this typed version of de Bruijn indices in their CSL 1999 paper.
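That choice of σ might be sketched as follows, rendering {Base} as an ordinary promoted kind (the constructor names Top and Pop are illustrative):

```haskell
{-# LANGUAGE DataKinds, GADTs, KindSignatures, TypeOperators, PolyKinds #-}

-- In g t: a typed de Bruijn index, witnessing that base type t
-- occurs in context g (a type-level list of base types).
data In (g :: [b]) (t :: b) where
  Top :: In (t ': g) t            -- the most recently bound variable
  Pop :: In g t -> In (s ': g) t  -- skip past one binder

-- Forget the typing, recovering the raw de Bruijn index.
toInt :: In g t -> Int
toInt Top     = 0
toInt (Pop i) = 1 + toInt i
```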



I should, of course, mention
instance IFunctor Tm where
  imap = rename

An IFunctor is, specifically, a monotone predicate transformer.
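Spelled out as a sketch, writing a plain two-argument kind where the slides write {ι} → * (the toy Tagged instance is invented here for illustration):

```haskell
{-# LANGUAGE RankNTypes, TypeOperators, PolyKinds, KindSignatures #-}

-- An index-respecting map between indexed sets: if s holds at i,
-- then t holds at the same i.
type s :-> t = forall i. s i -> t i

-- An IFunctor takes index-respecting maps to index-respecting maps:
-- a monotone predicate transformer.
class IFunctor (f :: (k -> *) -> (k -> *)) where
  imap :: (s :-> t) -> (f s :-> f t)

-- A toy IFunctor: pair an indexed set with an unindexed tag.
data Tagged s i = Tagged Int (s i)

instance IFunctor Tagged where
  imap f (Tagged n x) = Tagged n (f x)
```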




b or not b, that is the question:

Whether 'tis nobler in the mind to suffer
The slings and arrows of outrageous fortune,

Or to take arms against a sea of troubles







σ :→ φ τ   =   ∀j. σ{j} → (φ τ){j}
    is the type of an arrow of outrageous fortune.

The world gets to choose what state we're in, but must provide evidence that it satisfies the precondition σ.
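For reference, the indexed-monad interface such arrows feed might be sketched like this (an assumption on my part: skip and extend as return and Kleisli extension, with {ι} rendered as a plain kind; the Done instance is a degenerate illustration):

```haskell
{-# LANGUAGE RankNTypes, TypeOperators, PolyKinds, KindSignatures #-}

type s :-> t = forall i. s i -> t i

class IFunctor (f :: (k -> *) -> (k -> *)) where
  imap :: (s :-> t) -> (f s :-> f t)

-- An indexed monad: return and Kleisli extension, both index-respecting.
class IFunctor f => IMonad f where
  skip   :: s :-> f s                     -- do nothing, state unchanged
  extend :: (s :-> f t) -> (f s :-> f t)  -- sequence a Kleisli arrow

-- Bind at a particular start state i: whatever state each s-leaf
-- ends up in, the arrow of outrageous fortune must cope.
(?>=) :: IMonad f => f s i -> (s :-> f t) -> f t i
m ?>= k = extend k m

-- A degenerate instance, for illustration: no effect at all.
newtype Done s i = Done (s i)
instance IFunctor Done where imap f (Done s) = Done (f s)
instance IMonad Done where
  skip              = Done
  extend k (Done s) = k s
```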

This is by contrast with the “parameterised monads” of Atkey and others.

class PMonad (ψ :: {ι} → * → {ι} → *) where
  preturn :: ∀ α, i.     α → ψ {i} α {i}
  pbind :: ∀ α, β, i, j, k.
                ψ {i} α {j} →
                (α → ψ {j} β {k}) →
                ψ {i} β {k}
A PMonad does not permit outrageous fortune: in any call to pbind, the intermediate state {j} is fixed up front, and the world must deliver it. PMonads thus give access to the predictable fragment of effectful computation. Every IMonad can be specialized to the PMonad of its predictable fragment by using
data Eq :: {ι} → {ι} → * where
  Refl :: Eq {i} {i}

data K :: * → {ι} → * where
  K :: α → K α {i}

data (:∧:) :: ({ι} → *) → ({ι} → *) → {ι} → * where
  (:&:) :: σ{i} → τ{i} → (σ:∧:τ){i}

newtype Predict φ i α j =
  Ensure (φ (Eq {j} :∧: K α) {i})
That is, Predict φ i α j is the type of φ computations starting from state {i} which reach a state considered satisfactory if it happens to be the state {j} we predicted (and if we have a value in α, to boot).



Preconditions and postconditions for our file IO operations...

...uniformly determine the predicate transformer characterizing one step of file IO.
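The sort of one-step signature meant here might look as follows. This is a guess at the shape, not the talk's actual operations: FOpen, FGetC, FClose and their pre/postconditions are illustrative, as is the constant indexed set KI.

```haskell
{-# LANGUAGE DataKinds, GADTs, KindSignatures, RankNTypes #-}

data FileState = Open | Closed

-- One step of file IO, with continuations k indexed by the state that
-- holds afterwards. Preconditions appear as the start index; the
-- postconditions are the indices at which continuations are demanded.
data FileStep (k :: FileState -> *) (i :: FileState) where
  -- Opening may succeed or fail: supply a continuation for each outcome.
  FOpen  :: FilePath -> k 'Open -> k 'Closed -> FileStep k 'Closed
  -- Reading requires an open file and keeps it open.
  FGetC  :: (Maybe Char -> k 'Open) -> FileStep k 'Open
  -- Closing requires an open file and closes it.
  FClose :: k 'Closed -> FileStep k 'Open

-- Monotonicity: the one-step transformer respects index-preserving maps.
imapFile :: (forall j. u j -> v j) -> FileStep u i -> FileStep v i
imapFile f (FOpen p ko kc) = FOpen p (f ko) (f kc)
imapFile f (FGetC g)       = FGetC (f . g)
imapFile f (FClose kc)     = FClose (f kc)

-- A constant indexed set, for experiments.
newtype KI (j :: FileState) = KI Int
```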


Any monotone one-step transformer yields an IMonad exactly by closure under skip and sequential composition.
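One way to sketch that closure, under my own naming (Free, Ret, Do; the Tick step is a trivial example invented here): the free indexed monad on a monotone step functor.

```haskell
{-# LANGUAGE RankNTypes, TypeOperators, PolyKinds, GADTs, KindSignatures #-}

type s :-> t = forall i. s i -> t i

-- Close a one-step transformer 'step' under skip and sequential
-- composition: either stop now, or take one step and continue.
data Free step s i where
  Ret :: s i -> Free step s i
  Do  :: step (Free step s) i -> Free step s i

-- 'skip' of the resulting indexed monad.
ret :: s :-> Free step s
ret = Ret

-- Kleisli extension, given that 'step' is monotone (its imap).
extendF :: (forall u v. (u :-> v) -> (step u :-> step v))
        -> (s :-> Free step t) -> (Free step s :-> Free step t)
extendF _     k (Ret s) = k s
extendF imapS k (Do c)  = Do (imapS (extendF imapS k) c)

-- A trivial step that does nothing but mark time.
newtype Tick s i = Tick (s i)

imapTick :: (u :-> v) -> (Tick u :-> Tick v)
imapTick f (Tick x) = Tick (f x)

-- Count the steps in a computation.
steps :: Free Tick s i -> Int
steps (Ret _)       = 0
steps (Do (Tick m)) = 1 + steps m
```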


Conor McBride 2009