Posted: Mar 20, 2012 3:08 pm
by VazScep
Sigillum Militum wrote: Well of course not, they don't have late binding.

But you can, in some sense, "subclass" modules with a functor. I recently needed to key a hash table with pairs of big_ints. Since = chokes on big_ints for some reason, what I did was use the Hashtbl.Make functor so that equal is implemented with eq_big_int.
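For concreteness, I take that to mean something like the following. This is only a sketch, assuming the Big_int module from the nums library; keying on pairs and hashing via the string representation are just one way to set it up.

open Big_int

(* The Hashtbl.Make functor wants a type, an equality and a hash function. *)
module BigIntPair = struct
  type t = big_int * big_int
  let equal (a1, b1) (a2, b2) =
    eq_big_int a1 a2 && eq_big_int b1 b2
  (* Hash the string representations so that equal keys hash equally. *)
  let hash (a, b) =
    Hashtbl.hash (string_of_big_int a, string_of_big_int b)
end

module BigIntPairTbl = Hashtbl.Make (BigIntPair)

let () =
  let tbl = BigIntPairTbl.create 16 in
  BigIntPairTbl.add tbl (big_int_of_int 2, big_int_of_int 3) "a value";
  assert (BigIntPairTbl.mem tbl (big_int_of_int 2, big_int_of_int 3))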
In Java and C#, this general sort of behaviour is not implemented via subclassing.

Consider how ordered sets are implemented in Ocaml. We have BatSet.Make, which is a functor from a Comparable module to a Set module. In Java and C#, sets are implemented by TreeSet and SortedSet. Before generics, the element type of both collections was Object, and comparison had to be achieved via downcasts to Comparable/IComparable. This completely type-unsafe solution is basically an admission by the language designers that the object system has failed them (and because of backwards compatibility, these classes still rely on downcasts, so the code still isn't type-safe).
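To make the Ocaml side concrete, here is the shape of the thing, sketched with the stdlib's Set.Make (BatSet.Make follows the same pattern, as far as I know). The IntPair module is just an illustrative element type.

(* The functor takes a module providing a type and a total order on it... *)
module IntPair = struct
  type t = int * int
  let compare = compare   (* polymorphic compare is fine for pairs of ints *)
end

(* ...and returns a module of sets of that type. *)
module IntPairSet = Set.Make (IntPair)

let s = IntPairSet.add (1, 2) (IntPairSet.add (3, 4) IntPairSet.empty)
let () = assert (IntPairSet.mem (1, 2) s)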

With generics, the correct solution can be achieved, but that solution is the Standard ML/Ocaml solution: you make TreeSet a class/module parametrised by a type that implements the interface/signature Comparable. Of course, generics were added to Java and C# by Haskell and ML researchers.

If we want to talk about subclassing in the context of modules, we can at least talk about inheritance, which is a standard feature of SML and Ocaml via the "include" directive. We can also point out that modules can be understood as values whose types are their signatures, and that, in this way, the module system uses structural subtyping.
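A quick sketch of both points, with made-up module names:

(* "Inheritance" via include: Extended gets everything in Base, plus more. *)
module Base = struct
  type t = int
  let zero = 0
  let add = ( + )
end

module Extended = struct
  include Base
  let double x = add x x
end

(* Structural subtyping: Extended satisfies any signature asking for a
   subset of what it provides. *)
module type ADDABLE = sig
  type t
  val zero : t
  val add : t -> t -> t
end

module M : ADDABLE = Extended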

On late binding: it isn't always such a big deal when you have first-class functions. A flat class hierarchy, with various classes implementing a single interface, can be captured adequately by a record of functions. If, however, you find yourself writing transformations of these records that remove fields, you should probably be using classes. Suffice it to say, I haven't yet come across a need to do this.
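Here is roughly what I mean by a record of functions; the logger interface and its two implementations are purely illustrative.

(* One "interface"... *)
type logger = {
  log   : string -> unit;
  close : unit -> unit;
}

(* ...and two "implementations" of it. *)
let console_logger =
  { log = print_endline; close = (fun () -> ()) }

let file_logger path =
  let oc = open_out path in
  { log = (fun msg -> output_string oc (msg ^ "\n"));
    close = (fun () -> close_out oc) }

(* Code written against the record neither knows nor cares which one it got. *)
let greet l = l.log "hello"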

Macros are a shitty form of encapsulation and, from what I'm told, very difficult to debug.
I was told macros were a way to turn Common Lisp into a domain-specific language, to bring your solution domain up to meet the problem domain, to give you the ultimate abstraction mechanism, to transcend object-oriented programming, yadda, yadda, yadda. This sort of thing turned out to be bullshit, and most uses of macros that I came across were spurious.

I still consider macros an important and interesting idea, but they need to be seen for exactly what they are: a means of deriving custom syntax.

Being able to derive new syntax requires knowing your language's grammar, its lexemes and tokens, and its abstract syntax. In Lisp, this is wonderfully trivial: Lisp programmers already know their language's abstract syntax and its tokens. The abstract syntax consists of the s-expressions they write directly, and the tokens are the symbols they use as interned strings. They also know how to transform abstract syntax, because in learning Lisp, they've become expert at transforming s-expressions. Writing reader macros is similarly trivial.

In Ocaml, if I want to derive new syntax, I have to learn the language and API of Camlp4. I have yet to do this, but it looks like a headache.

This raises the question of whether we want to derive our own syntax at all. I can only say that, judging by the number of Camlp4 extensions out there, Ocaml programmers certainly do. And I personally don't want to use lwt or Ocsigen without their syntax extensions.
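For a taste of why, here is roughly what the lwt extension buys you, if I remember the pa_lwt sugar correctly. The first version only compiles with the extension loaded; the second is plain Lwt.

(* With the syntax extension:

     lwt line = Lwt_io.read_line ic in
     Lwt_io.write_line oc line

   Without it: *)
let echo ic oc =
  Lwt.bind (Lwt_io.read_line ic) (fun line ->
    Lwt_io.write_line oc line)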