Jitclass: how to get "boxing" class from "base" class

I am trying to use Numba jitclasses in conjunction with `functools.singledispatch` (which seems like a useful way to work around the lack of inheritance and unlowerable functions):

```python
from functools import singledispatch
from numba.experimental import jitclass

@jitclass(...)  # spec elided
class Foo: ...

@jitclass(...)  # spec elided
class Bar: ...

@singledispatch
def act(inst: object): ...

@act.register(Foo)
def act_foo(inst: Foo): ...

@act.register(Bar)
def act_bar(inst: Bar): ...
```

Unfortunately, when I call with a Foo or Bar instance, the registered implementations are not invoked. A look at the type of the instance vs the registered type in the singledispatch registry shows why: the class in the registry is `<class 'numba.experimental.jitclass.base.Foo'>`, while the instance has type `<class 'numba.experimental.jitclass.boxing.Foo'>`. Can I somehow get a “boxing” class object from the “base” class object, so I can register it appropriately?
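The mismatch can be reproduced without numba at all, since `singledispatch` dispatches purely on the runtime class’s MRO. A minimal sketch (the class names below are stand-ins for numba’s two classes, not numba’s actual types):

```python
from functools import singledispatch

class BaseFoo:   # plays the role of numba...base.Foo (what you register)
    pass

class BoxedFoo:  # plays the role of numba...boxing.Foo (what instances carry)
    pass         # note: NOT a subclass of BaseFoo

@singledispatch
def act(inst):
    return "generic"

@act.register(BaseFoo)
def _(inst):
    return "foo"

# Dispatch walks type(arg).__mro__; the boxed class never reaches BaseFoo,
# so the call falls through to the generic implementation:
print(act(BoxedFoo()))  # -> "generic", not "foo"
```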

(A more robust solution/feature would be to register the “boxing” class as a virtual subclass of the “base” class via `abc`, so instances of “boxing” count as instances of “base”.)
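For what it’s worth, `functools.singledispatch` does consult ABC virtual-subclass registrations, so the idea is workable in principle. A stand-alone sketch with hypothetical class names (again no numba involved):

```python
import abc
from functools import singledispatch

class FooABC(abc.ABC):
    """Abstract stand-in for the 'base' side of the jitclass pair."""

class BoxedFoo:  # stand-in for the boxing class; unrelated by inheritance
    pass

# Register BoxedFoo as a virtual subclass BEFORE any dispatching happens:
FooABC.register(BoxedFoo)

@singledispatch
def act(inst):
    return "generic"

@act.register(FooABC)
def _(inst):
    return "foo"

# singledispatch honors the virtual-subclass relationship:
print(act(BoxedFoo()))  # -> "foo"
```

If numba itself did the `register` call when it created the boxing class, user code could dispatch on the class it actually wrote.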

Aha – using the classic “10-year-old-kid” method (push buttons until everything is irrevocably broken or you get what you want) – I was able to figure out the following hack:

```python
from numba.experimental.jitclass import boxing


def act_foo(inst: Foo): ...
```

I presume this is not a public API. :slight_smile: … I also guess it increases my runtime if I don’t end up actually using any Foo objects. (?)

A soft feature request would be to provide a supported way to do this.

Hi @shaunc, Numba is built around a dispatch mechanism. In fact, numba-compiled functions are held in an object called `Dispatcher`. It seems odd to me that you would need to rely on `singledispatch` to perform dispatch.
Have you tried using Numba’s `generated_jit` or `overload`?


@luk-f-a Thanks for the thought. Hmm… This is a large library/application whose parts are assembled with dependency injection. I don’t want to couple the various parts – all implementations, and consumers, reference the stub @singledispatch, but various implementations, and consumers that need a specific implementation, shouldn’t know about or have to load into memory the other implementations. That doesn’t seem possible with \@generated_jit … is it possible with \@overload? (I also use @singledispatch in various places for which numba optimization wouldn’t be of any use, but if there were a numba mechanism I could use everywhere with nopython=False, I would look into it. While debugging, it seemed to me that numba used @singledispatch itself (hmm… in numba.core.typing…) – so it can’t be that bad! :))