Timeout decorator on a multiprocessing function
Question:
I took this decorator straight from an example I found on the web:
import signal

class TimedOutExc(Exception):
    pass

def timeout(timeout):
    def decorate(f):
        def handler(signum, frame):
            raise TimedOutExc()
        def new_f(*args, **kwargs):
            old = signal.signal(signal.SIGALRM, handler)
            signal.alarm(timeout)
            try:
                result = f(*args, **kwargs)
            except TimedOutExc:
                return None
            finally:
                signal.signal(signal.SIGALRM, old)
                signal.alarm(0)
            return result
        new_f.func_name = f.func_name  # Python 2 way of preserving the function name
        return new_f
    return decorate
If the function f times out, it raises an exception.
Great, it works. However, when I use this decorator on a multiprocessing function and it stops because of the timeout, it does not terminate the processes involved in the computation. How can I do that?
I don't want to raise an exception and stop the program. Basically, what I want is for f, when it times out, to return None and then terminate the processes involved.
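For reference, a minimal sketch of the behaviour I currently get, assuming the decorator above is in scope (Unix only, since it relies on signal.alarm; slow_function is just a placeholder name):

import time

@timeout(2)
def slow_function():
    time.sleep(5)  # Takes longer than the 2-second timeout
    return "done"

print(slow_function())  # Prints None: the alarm fired and TimedOutExc was caught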
Answer:
While I agree with the gist of Aaron's answer, I would like to expand on it a little.
The processes started by multiprocessing must be stopped in the function being decorated; I don't think this can, in general, be done simply from the decorator itself (the decorated function is the only entity that knows which computations it launched).
Instead of having the decorated function catch SIGALRM, you can also have it catch your custom TimedOutExc exception, which may be more flexible. Your example would then become:
import signal
import functools

class TimedOutExc(Exception):
    """
    Raised when a timeout happens
    """

def timeout(timeout):
    """
    Return a decorator that raises a TimedOutExc exception
    after timeout seconds, if the decorated function did not return.
    """
    def decorate(f):
        def handler(signum, frame):
            raise TimedOutExc()

        @functools.wraps(f)  # Preserves the documentation, name, etc.
        def new_f(*args, **kwargs):
            old_handler = signal.signal(signal.SIGALRM, handler)
            signal.alarm(timeout)
            result = f(*args, **kwargs)  # f() always returns, in this scheme
            signal.signal(signal.SIGALRM, old_handler)  # Old signal handler is restored
            signal.alarm(0)  # Alarm removed
            return result

        return new_f
    return decorate
@timeout(10)
def function_that_takes_a_long_time():
    try:
        # ... long, parallel calculation ...
    except TimedOutExc:
        # ... Code that shuts down the processes ...
        # ...
        return None  # Or exception raised, which means that the calculation is not complete
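To make the shutdown step concrete, here is a minimal sketch of what function_that_takes_a_long_time could look like with a multiprocessing.Pool; the slow_worker helper and the pool size are only illustrative, and the whole scheme is Unix-only because it relies on signal.alarm:

import multiprocessing
import time

def slow_worker(seconds):
    # Hypothetical stand-in for one piece of the long, parallel calculation
    time.sleep(seconds)
    return seconds

@timeout(10)
def function_that_takes_a_long_time():
    pool = multiprocessing.Pool(processes=4)
    try:
        async_results = [pool.apply_async(slow_worker, (60,)) for _ in range(4)]
        # time.sleep() is reliably interrupted by SIGALRM, so the TimedOutExc
        # raised in the signal handler surfaces in this waiting loop:
        while not all(r.ready() for r in async_results):
            time.sleep(0.1)
        pool.close()
        pool.join()
        return [r.get() for r in async_results]
    except TimedOutExc:
        # Only this function knows about its own pool, so this is the place
        # where the child processes are terminated:
        pool.terminate()
        pool.join()
        return None

With this sketch, calling function_that_takes_a_long_time() returns None after roughly 10 seconds and the worker processes are terminated, which is the behaviour asked for in the question.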