I got a few responses to my call for use cases for generalized specializers; I'll try to summarize them in this post. Before I do, it's perhaps worth noting that the work on generalized specializers that Jim Newton and I did was written up at about the same time as Charlotte Herzeel, Jorge Vallejos, Theo D'Hondt and Pascal Costanza's work on filtered dispatch, and so our Custom Specializers in Object-Oriented Lisp paper doesn't refer to it. I'll talk more about filtered dispatch in a later post, when I tie all these threads together, but for those who aren't aware of it it's worth looking at, along with ContextL for dispatch that varies based on dynamic program state.
Meanwhile, some other ideas for extended specializers: James Anderson suggested duck typing, and I think working along similar lines Patrick Stein thought of specializers that could dispatch on the dimensions of array arguments (or length of sequence arguments). That feels to me as though there should be a fairly compelling use case, but the toy example of
(defmethod cross-product ((x (length 3)) (y (length 3)))
  ...)
doesn't generalize. I suppose there might be an example of selecting particular numerical algorithms based on overall matrix dimensions or properties, something like
(defmethod invert ((x (total-size> 1000000000)))
  ... invert big matrix ...)
(defmethod invert ((x (total-size<= 1000000000)))
  ... invert small matrix ...)
but it doesn't feel concrete yet.
On the other hand, just about everyone's first response to the question (Jan Moringen, and the crowd wisdom of #lisp IRC) was pattern specializers, usually based on regular expressions or on optima patterns; one concrete use case given was dispatching among handlers for URLs (see the sketch just below). This is a very interesting case; I first considered it in the early days of extended specializers, and indeed mop-27.impure.lisp from the SBCL test suite captures one other use case, in a (toy) simplifier for arithmetic expressions, redolent of implementations of symbolic differentiation from a bygone age. At the time, I took that example no further, both because I didn't have a real use case, and also because I didn't fancy writing a full-strength pattern-matching engine; now that optima exists, it's probably worth revisiting the example.
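To make the URL case a little more tangible, here is a minimal hand-rolled sketch (assuming Quicklisp and cl-ppcre) of the dispatch that regex specializers would let us express declaratively as separate methods; handle-url and the particular URL patterns are invented for illustration.
;; A hand-rolled approximation, using cl-ppcre, of the dispatch that regex
;; specializers would express as separate methods on a generic function.
;; HANDLE-URL and the URL patterns here are invented for illustration.
(ql:quickload "cl-ppcre")

(defun handle-url (url)
  (cond ((cl-ppcre:scan "^/users/[0-9]+$" url) :user-handler)
        ((cl-ppcre:scan "^/posts/[0-9]+$" url) :post-handler)
        (t :not-found)))

;; (handle-url "/users/42") => :USER-HANDLER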
In particular, it's worth considering how to handle capturing of pattern variables. To continue with the toy simplifier, the existing dispatch and specializer implementation would allow one to write
(defmethod simplify1 ((p (pattern (+ x 0))))
  (cadr p))
but what would be really neat and convenient would be to be able to write
(defmethod simplify1 ((p (pattern (+ x 0))))
  x)
and in order to do that, we need to intervene in the generation of the actual method function so as to add bindings for pattern variables (which, helpfully, optima can tell us about).
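As a rough illustration of the effect we're after (a sketch only, not the actual specializer machinery): with optima loaded, the generated method body for the second definition would need to behave roughly as if the user-supplied body had been wrapped in a match that establishes the binding of x. simplify1-sketch below is an invented stand-in for the real method function.
;; Sketch only: assuming optima is available (e.g. via (ql:quickload "optima")),
;; the binding we want the method function to establish is what MATCH does here.
(defun simplify1-sketch (p)
  (optima:match p
    ;; pattern (+ x 0): X is bound to the first operand
    ((list '+ x 0) x)
    ;; no simplification applies
    (_ p)))

;; (simplify1-sketch '(+ (* a b) 0)) => (* A B)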
The awkwardness in terms of the protocol – beyond the problems with metaprogramming at compile-time in the first place – is that make-method-lambda operates in ignorance of the specializers that will be applied to the method. In fact, this was the root cause of a recently-reported SBCL bug: SBCL itself needs to communicate the name and method lambda list from defmethod to make-method-lambda, and has no other way of doing that than special variables, which weren't being cleared to defend against nested calls to make-method-lambda through code-walking and macroexpansion.
But make-method-lambda is the function which generates the method body; it is exactly the function that we need to extend or override in order to do the desired automatic matching. So how can we expose this to the metaprogrammer? We can't change the signature of make-method-lambda; it might seem obvious to extend it by adding &optional arguments, but that changes the signature of the generic function and would break backwards compatibility, as methods must accept the same number of optional arguments as their generic function does (the congruence rule illustrated just below); similarly, adding keyword arguments to the generic function doesn't work.
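For concreteness, the congruence restriction at issue is a general CLOS rule, and it bites at defmethod time; frob here is a throwaway example, not part of any protocol.
;; Lambda-list congruence: a method must have the same number of &optional
;; parameters as its generic function, so this DEFMETHOD signals an error.
(defgeneric frob (x &optional extra))

(defmethod frob ((x integer))   ; error: lambda list not congruent with FROB's
  x)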
We could export and document the special variables that we ourselves use to propagate the information; in some ways that's cleanest, and has the virtue of at least having been tested in the wild. On the other hand, it feels unusual, at least in the context of the metaobject protocol; usually, everything of interest is an argument to a protocol function. So, we could define a new protocol function, and have make-method-lambda default to trampolining to it (say, make-method-lambda-with-specializer-specifiers; the basic idea, though not the horrible name, is due to Jan Moringen); a sketch of what that might look like follows below. But we'd need to be careful in doing that; there is some implementation-specific logic deep in PCL that performs extra optimizations to methods (the fast-method calling convention) if the system detects that only the standardized methods on make-method-lambda are applicable to a particular call; if we add any extra metaobject protocol functions in this area, we'll need to make sure that we correctly update any internal logic based on detecting the absence of metaprogrammer overrides or extensions...
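For what the trampoline might look like, here is a sketch of Jan Moringen's idea; none of these names exist today, and the hypothetical special variable *specializer-specifiers* stands in for however defmethod's expansion would convey the specializer specifiers.
;; Sketch only: a new protocol function receives the specializer specifiers
;; explicitly, alongside the usual MAKE-METHOD-LAMBDA arguments.
(defgeneric make-method-lambda-with-specializer-specifiers
    (generic-function method lambda-expression specializer-specifiers environment)
  (:documentation
   "Like MAKE-METHOD-LAMBDA, but also receives the specializer specifiers
from the DEFMETHOD form, so that extended specializers can influence the
generated method body (e.g. by adding pattern-variable bindings)."))

;; The system-supplied method on MAKE-METHOD-LAMBDA would then simply trampoline:
;;
;; (defmethod make-method-lambda ((gf standard-generic-function)
;;                                (method standard-method) lambda env)
;;   (make-method-lambda-with-specializer-specifiers
;;    gf method lambda *specializer-specifiers* env))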