Jitclass: how to get "boxing" class from "base" class

I am trying to use Numba jitclasses in conjunction with functools.singledispatch (which seems like a useful way to work around the lack of inheritance and around unlowerable functions):

from functools import singledispatch
import numba.experimental as nbx  # assuming nbx is an alias for numba.experimental

@nbx.jitclass(spec)
class Foo: ...

@nbx.jitclass(spec)
class Bar: ...

@singledispatch
def act(inst: object): ...

@act.register
def act_foo(inst: Foo): ...

@act.register
def act_bar(inst: Bar): ...

Unfortunately, when I call with a Foo or Bar instance, the registered implementations are not invoked. A look at the type of the instance vs the registered type in the singledispatch registry shows why: the class in the registry is <class 'numba.experimental.jitclass.base.Foo'>, while the instance has type <class 'numba.experimental.jitclass.boxing.Foo'>. Can I somehow get the “boxing” class object from the “base” class object, so I can register it appropriately?
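To make the mismatch concrete, here is a minimal reproduction sketch; the spec and the x field are placeholders I made up, but the prints show the same base/boxing split:

from functools import singledispatch
from numba import float64
from numba.experimental import jitclass

spec = [("x", float64)]  # hypothetical spec

@jitclass(spec)
class Foo:
    def __init__(self, x):
        self.x = x

@singledispatch
def act(inst):
    return "fallback"

@act.register
def act_foo(inst: Foo):
    return "Foo implementation"

foo = Foo(1.0)
print(type(foo))           # <class 'numba.experimental.jitclass.boxing.Foo'>
print(list(act.registry))  # contains the "base" Foo, not the boxing class
print(act(foo))            # "fallback" -- the Foo implementation is never selected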

(A more robust solution/feature would be to use ABC to register instances of “boxing” as instances of “base”.)

Aha – using the classic “10-year-old-kid” method (push buttons until everything is irrevocably broken or you get what you want) – I was able to figure out the following hack:

from numba.experimental.jitclass import boxing

...

@act.register(
    # _specialize_box builds (and caches) the boxing class for this jitclass type
    boxing._specialize_box(Foo.class_type.instance_type)
)
def act_foo(inst: Foo): ...

I presume this is not a public API. :slight_smile: … I also guess it adds some runtime cost if I don’t end up actually using any Foo objects. (?)

A soft feature request would be to provide a better way to do this.
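In the meantime I’ve been wrapping the hack in a small helper. This is purely illustrative, still leans on the non-public _specialize_box, and the helper name is my own:

from numba.experimental.jitclass import boxing

def register_jitclass(dispatcher, cls):
    # Hypothetical helper: registers an implementation for both the "base"
    # jitclass and its dynamically created "boxing" class. Relies on the
    # non-public _specialize_box, so it may break between Numba versions.
    box_cls = boxing._specialize_box(cls.class_type.instance_type)
    def decorator(func):
        dispatcher.register(cls, func)      # the class visible at definition time
        dispatcher.register(box_cls, func)  # the class actual instances have
        return func
    return decorator

# Usage, with act and Foo as defined above:
# @register_jitclass(act, Foo)
# def act_foo(inst):
#     ...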

Hi @shaunc, Numba is built around a dispatch mechanism. In fact, Numba-compiled functions are held in an object called Dispatcher. It seems odd to me that you would need to rely on singledispatch to perform dispatch.
Have you tried using Numba’s generated_jit or overload?
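For example, a rough sketch of the overload route, using your Foo and Bar from above (the .x and .y attributes are just for illustration):

from numba import njit
from numba.extending import overload

def act(inst):
    # pure-Python stub; only the overloads below run inside nopython code
    raise NotImplementedError

@overload(act)
def ol_act(inst):
    # 'inst' here is a Numba *type*; compare it against each jitclass's instance type
    if inst == Foo.class_type.instance_type:
        def impl(inst):
            return inst.x   # hypothetical Foo-specific behaviour
        return impl
    if inst == Bar.class_type.instance_type:
        def impl(inst):
            return inst.y   # hypothetical Bar-specific behaviour
        return impl

@njit
def consumer(inst):
    return act(inst)

Inside an njit function, act(inst) then resolves at compile time to whichever implementation matches the jitclass type.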

cheers
Luk

@luk-f-a Thanks for the thought. Hmm… This is a large library/application whose parts are assembled with dependency injection, and I don’t want to couple the various parts: all implementations and consumers reference the stub @singledispatch function, but implementations, and consumers that need a specific implementation, shouldn’t know about or have to load into memory the other implementations. That doesn’t seem possible with @generated_jit … is it possible with @overload?

(I also use @singledispatch in various places where Numba optimization wouldn’t be of any use, but if there were a Numba mechanism I could use everywhere with nopython=False, I would look into it. While debugging, it seemed to me that Numba uses @singledispatch itself (hmm… in numba.core.typing…) – so it can’t be that bad! :))