The essential idea of memoization is so simple that it can be captured by the @lru_cache decorator. This decorator can be applied to any function whose arguments are hashable to implement memoization. In some cases, we may be able to improve on the generic idea with something more specialized. There are a large number of potentially optimizable multivalued functions. We'll pick one here and look at another in a more complex case study.
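As a quick illustration of the generic approach, here is a sketch of @lru_cache applied to a naive recursive Fibonacci function; the function name is ours, chosen only for the example:

```python
from functools import lru_cache

@lru_cache(maxsize=None)
def fibc(n: int) -> int:
    """Memoized Fibonacci: each value is computed once, then cached."""
    if n < 2:
        return n
    return fibc(n - 1) + fibc(n - 2)

# Without the cache this is exponential; with it, fibc(100) is nearly instant.
print(fibc(10))  # → 55
```

The decorator wraps the function with a dictionary keyed by the argument values, which is why those arguments must be hashable.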
The binomial, $\binom{n}{m}$, shows the number of ways $n$ different things can be chosen in groups of size $m$. The value is as follows:

$$\binom{n}{m} = \frac{n!}{m!\,(n-m)!}$$
Clearly, we should cache the individual factorial calculations rather than redo all those multiplications. However, we may also benefit from caching the overall binomial calculation, too.
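The two levels of caching described above can be sketched with a pair of @lru_cache-decorated functions; the names fact and binom are assumptions for this illustration:

```python
from functools import lru_cache

@lru_cache(maxsize=None)
def fact(n: int) -> int:
    """Cached factorial: repeated binomial calls reuse earlier products."""
    return 1 if n < 2 else n * fact(n - 1)

@lru_cache(maxsize=None)
def binom(n: int, m: int) -> int:
    """Binomial coefficient n!/(m!(n-m)!); the overall result is cached too."""
    return fact(n) // (fact(m) * fact(n - m))

# Ways to deal a 5-card poker hand from a 52-card deck.
print(binom(52, 5))  # → 2598960
```

Caching at both levels pays off when many different binomials are requested: the factorial cache shares work across calls with different arguments, while the binomial cache handles exact repeats.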
We'll create a Callable object that contains multiple internal caches. Here's a helper function that we'll need:
from functools...
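The snippet above is truncated, but a product-of-an-iterable helper built on functools.reduce fits the description; this sketch, including the name prod, is an assumption:

```python
from functools import reduce
from operator import mul
from collections.abc import Iterable

def prod(values: Iterable[int]) -> int:
    """Product of an iterable of numbers; returns 1 for an empty iterable."""
    # reduce folds mul across the values, starting from the identity 1.
    return reduce(mul, values, 1)

# For example, 5! as a product over a range.
print(prod(range(1, 6)))  # → 120
```

A helper like this lets a Callable binomial object compute partial factorials (for example, products over a range) that it can then store in its internal caches.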