I have a generator function that iterates over a large number of parameters and yields the result of another function called with each parameter. The inner function can take quite a long time to execute, so I would like to use multiprocessing to speed things up. Possibly important: I also want to be able to stop this generator in the middle of execution. I'm not sure what the right way to implement such logic is. I need something like a queue that lets me add new tasks as old ones finish and yields results as soon as they are ready. I've looked at multiprocessing.Queue, but at first glance it doesn't seem suitable for my case. Can somebody advise what I should use in such a scenario?
Here is the approximate code of my task:
def gen(**kwargs):
    # get_params() produces a long sequence of parameters
    for param in get_params():
        # inner_func(param) is slow; this is the call I want to parallelize
        yield inner_func(param)
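To make the behaviour I'm after more concrete, here is a rough sketch of what I imagine, using multiprocessing.Pool.imap_unordered purely for illustration (I haven't settled on it, and I'm not sure how well it handles stopping mid-way). get_params and inner_func are the same as above; should_stop is just a placeholder for whatever stopping condition I'd use:

import multiprocessing

def gen_parallel(processes=4):
    # Rough sketch: evaluate inner_func(param) in worker processes and
    # yield each result as soon as it is ready (not in submission order).
    # inner_func would need to be picklable, i.e. defined at module level.
    with multiprocessing.Pool(processes=processes) as pool:
        for result in pool.imap_unordered(inner_func, get_params()):
            yield result

# Intended use: stop in the middle simply by breaking out of the loop.
# for result in gen_parallel():
#     if should_stop(result):
#         break

Is something along these lines reasonable, or is there a better-suited tool for this kind of task queue with early cancellation?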