Ability to declare function signature at run-time

I have created some general-purpose optimization functions using njit that work great … except for one annoying warning that I get when I run them inside another njit function:

NumbaExperimentalFeatureWarning: First-class function type feature is experimental
  warnings.warn("First-class function type feature is experimental",

I totally understand why this is happening, but I don’t know how to get rid of it. I’ve tried searching this group, but I fear I just might not know the right terminology to use for an effective search.

The call to my optimization function looks something like this:

lower, middle, upper = glob_min_nb(
        criter_func,
        params,
        ivar,
        -3.0,          # low
        3.0,           # high
        15,            # npts (number of points to try)
        False,         # log_space
        0,
        *criter_func_args
    )

I know the problem is with the glob_min_nb function type signature (or lack thereof).
The parameters criter_func and criter_func_args are going to be dynamic based on the optimization problem being solved, and therefore I cannot strictly type glob_min_nb at the time of njit decoration.

Question:
Is there any way, in code, to take the static types of criter_func and criter_func_args (as well as the other parameters in the function call above) and dynamically create a function signature that I can apply to glob_min_nb at compile/run time?

Thanks in advance for any help!

Hey @dpats ,

You could use lazy JIT compilation and switch off the warnings, or you could use eager mode and specify all possible criter_func signatures up front (in case they are available).
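
For the first option, you can filter out just this one warning instead of silencing all Numba warnings. A minimal sketch, assuming a recent Numba version where the warning class is exposed in numba.core.errors:

import warnings
from numba.core.errors import NumbaExperimentalFeatureWarning

# Silence only the first-class function type warning;
# other Numba warnings are still shown.
warnings.simplefilter('ignore', category=NumbaExperimentalFeatureWarning)

For the second option, here is an example comparing the lazy and the eager variants: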

from numba import njit
from numba import types as ty

# Criterion functions, eagerly compiled with explicit signatures.
@njit(['i8(i8)'])
def fn_crit_1(a):
    return 1

@njit(['i8(i8)'])
def fn_crit_2(a):
    return 2

@njit(['i8(i8, i8)'])
def fn_crit_3(a, b):
    return 3

# Lazy optimizer: no signature at declaration; Numba infers and
# records a new signature on the first call with new argument types.
@njit
def fn_optimizer_lazy(fn_crit, *args_crit):
    return fn_crit(*args_crit)

# Eager optimizer: all admissible signatures are declared up front.
@njit([
    ty.i8(ty.FunctionType(ty.i8(ty.i8)), ty.UniTuple(ty.i8, 1)),
    ty.i8(ty.FunctionType(ty.i8(ty.i8, ty.i8)), ty.UniTuple(ty.i8, 2)),
])
def fn_optimizer_eager(fn, *args):
    return fn(*args)

print('signature evolution in lazy mode:')
print('Before function call:')
print(fn_optimizer_lazy.signatures)
print()
print('After each function call:')
for fn, args in [(fn_crit_1, (1,)), (fn_crit_2, (1,)), (fn_crit_3, (1, 2,))]:
    fn_optimizer_lazy(fn, *args)
    print()
    for sig in fn_optimizer_lazy.signatures:
        print(sig)

print()
print('signature evolution in eager mode:')
print('Before function call:')
for sig in fn_optimizer_eager.signatures:
    print(sig)

fn_optimizer_eager(fn_crit_1, 1)
fn_optimizer_eager(fn_crit_2, 1)
fn_optimizer_eager(fn_crit_3, 1, 2)

print()
print('After function call:')
for sig in fn_optimizer_eager.signatures:
    print(sig)

# signature evolution in lazy mode:
# Before function call:
# []

# After each function call:

# (type(CPUDispatcher(<function fn_crit_1 at 0x7f59b22b1080>)), UniTuple(int64, 1))

# (type(CPUDispatcher(<function fn_crit_1 at 0x7f59b22b1080>)), UniTuple(int64, 1))
# (type(CPUDispatcher(<function fn_crit_2 at 0x7f59b216b600>)), UniTuple(int64, 1))

# (type(CPUDispatcher(<function fn_crit_1 at 0x7f59b22b1080>)), UniTuple(int64, 1))
# (type(CPUDispatcher(<function fn_crit_2 at 0x7f59b216b600>)), UniTuple(int64, 1))
# (type(CPUDispatcher(<function fn_crit_3 at 0x7f59b24f00e0>)), UniTuple(int64, 2))

# signature evolution in eager mode:
# Before function call:
# (FunctionType[int64(int64)], UniTuple(int64, 1))
# (FunctionType[int64(int64, int64)], UniTuple(int64, 2))

# After function call:
# (FunctionType[int64(int64)], UniTuple(int64, 1))
# (FunctionType[int64(int64, int64)], UniTuple(int64, 2))

Thanks for the reply, @Oyibo!

I wasn’t aware of the .signatures property, which looks incredibly convenient. However, I suppose I was looking for the best of both worlds in this situation :slight_smile:.

In your example, I was hoping to take the fn_optimizer_lazy approach, where I wouldn’t have to decorate the function with a type signature at the time of declaration but could then assign the signature in code before it gets compiled, like below:

fn_optimizer_lazy.signatures = [
    ty.i8(ty.FunctionType(ty.i8(ty.i8)), ty.UniTuple(ty.i8, 1)),
    ty.i8(ty.FunctionType(ty.i8(ty.i8, ty.i8)), ty.UniTuple(ty.i8, 2)),
]

But now, as I’m having to write this out, I’m realizing it may be a bit of a chicken-and-egg issue. I wouldn’t be able to assign a type signature until the function has been evaluated for compilation, and that happens either at declaration time, when a type signature is provided, or just in time, when the compiler determines the signature from the arguments. (Apologies if I’m getting some of the terminology incorrect.)
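
Though perhaps calling njit as a plain function at run time, with a signature built in code, would sidestep that? An untested sketch, reusing the names from your example (fn_crit_1, ty):

from numba import njit, types as ty

# Keep a plain-Python version of the optimizer around, undecorated...
def fn_optimizer_py(fn, *args):
    return fn(*args)

# ...and compile it eagerly once the criterion function's type is known,
# by calling njit as a function instead of using it as a decorator.
sig = ty.i8(ty.FunctionType(ty.i8(ty.i8)), ty.UniTuple(ty.i8, 1))
fn_optimizer_rt = njit([sig])(fn_optimizer_py)

print(fn_optimizer_rt(fn_crit_1, 1))  # compiled with the signature built at run time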

Thank you again for providing some nice examples of what my options appear to be. I really appreciate it!

@dpats, providing signatures at runtime doesn’t really help here; lazy compilation already handles that, as the signature evolution example shows. When there is no suitable signature or specialization for a function, Numba compiles for the new argument types and appends a new signature to your function automatically. The trade-off is that such functions typically cannot be cached, so a fresh compilation may be necessary in every new session.
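
That said, if you do want to trigger compilation for a known signature before the first call, the dispatcher object itself has a compile method; it is the same machinery eager mode uses internally, but it is not a prominently documented API, so treat this as a sketch reusing the definitions from the example above:

# Precompile the lazy optimizer for one specific signature at run time,
# before it is ever called with matching arguments.
fn_optimizer_lazy.compile(
    ty.i8(ty.FunctionType(ty.i8(ty.i8)), ty.UniTuple(ty.i8, 1))
)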