
Multiprocessing - Map Over List, Killing Processes That Stall Above Timeout Limit

I have a list of elements that I want to modify using multiprocessing. The issue is that for some particular inputs (unobservable prior to attempting), part of my function stalls indefinitely, so I want to kill any process that runs longer than a timeout limit.

Solution 1:

You can take a look at the pebble library.

from pebble import ProcessPool
from concurrent.futures import TimeoutError

def sometimes_stalling_processing(obs):
    ...
    return processed_obs

with ProcessPool() as pool:
    future = pool.map(sometimes_stalling_processing, dataset, timeout=10)

    iterator = future.result()

    while True:
        try:
            result = next(iterator)
        except StopIteration:
            break
        except TimeoutError as error:
            print("function took longer than %d seconds" % error.args[1])

More examples in the documentation.
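If you'd rather avoid an extra dependency, a similar effect can be sketched with the standard library alone: run each item in its own `multiprocessing.Process`, join with a timeout, and terminate any worker that is still alive. The helper `map_with_timeout` below is a hypothetical name, and spawning one process per item is less efficient than pebble's pooled approach, so treat this as a minimal sketch rather than a drop-in replacement:

```python
import multiprocessing as mp


def _worker(func, item, queue):
    # Run func on a single item and send the result back to the parent.
    queue.put(func(item))


def map_with_timeout(func, items, timeout):
    """Map func over items, killing any call that exceeds `timeout` seconds.

    Timed-out items yield None in the result list. Note: one process is
    started per item, so this suits modest list sizes.
    """
    results = []
    for item in items:
        queue = mp.Queue()
        proc = mp.Process(target=_worker, args=(func, item, queue))
        proc.start()
        proc.join(timeout)
        if proc.is_alive():
            # The worker stalled past the timeout: kill it and record None.
            proc.terminate()
            proc.join()
            results.append(None)
        else:
            results.append(queue.get())
    return results
```

Unlike thread-based timeouts, `terminate()` actually stops the stalled work, which is the key requirement here; the trade-off is losing any partial state the killed process held.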
