multiprocessing.Queue hangs when process on other side dies #87971
When the child process dies unexpectedly, Queue.get waits indefinitely. Here is an example:

```python
import os
import signal
import multiprocessing

def child_func(qa, qb):
    input = qa.get()
    print('Child received: ', input)
    os.kill(os.getpid(), signal.SIGTERM)
    qb.put('B')
    exit(0)

qa = multiprocessing.Queue()
qb = multiprocessing.Queue()
process = multiprocessing.Process(target=child_func, args=(qa, qb))
process.start()
qa.put('A')
try:
    input = qb.get()
    print('Parent received: ', input)
except Exception as ex:
    print(ex)
process.join()
exit(0)
```
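A common workaround for this situation is for the parent to poll the queue with a timeout and check whether the child is still alive; a minimal sketch, with an illustrative helper name that is not part of the report above:

```python
import queue  # multiprocessing.Queue.get raises queue.Empty on timeout

def get_or_detect_dead_child(q, proc, poll_interval=1.0):
    """Return the next item from q, or raise if proc has died without sending one."""
    while True:
        try:
            return q.get(timeout=poll_interval)
        except queue.Empty:
            if not proc.is_alive():
                # Drain anything the child may have put just before exiting.
                try:
                    return q.get_nowait()
                except queue.Empty:
                    raise RuntimeError('child exited without sending a result')
```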
Possible duplicate of bpo-22393.
Somewhat related: bpo-43806, with asyncio.StreamReader.
I think this is expected. The queue itself doesn't know that one particular process is meant to put data into it. It just knows that there's no data to get, so .get() blocks as the docs say it should. This doesn't apply to bpo-22393, because the pool knows about its worker processes, so if one dies before completing a task, it can know something is wrong.

You could add a method to 'half close' a queue, so it can only be used for receiving, but not sending. If you called this in the parent process after starting the child, then if the child died, the queue would know that nothing could ever put data into it, and .get() could error. The channels API in Trio allows this, and it's the same idea I've just described at the OS level in bpo-43806.
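Something close to that OS-level idea is already possible today by using multiprocessing.Pipe instead of a Queue, since Connection objects can be closed: if the parent drops its copy of the child's end, the child's death closes the last handle on that end and the parent's recv() raises EOFError instead of blocking. A minimal sketch of that pattern, adapted from the example above (not a proposal for the Queue API itself):

```python
import os
import signal
import multiprocessing

def child_func(conn):
    print('Child received: ', conn.recv())
    os.kill(os.getpid(), signal.SIGTERM)  # simulate the child dying unexpectedly
    conn.send('B')                        # never reached

if __name__ == '__main__':
    parent_conn, child_conn = multiprocessing.Pipe()
    process = multiprocessing.Process(target=child_func, args=(child_conn,))
    process.start()
    child_conn.close()       # parent gives up its copy of the child's end
    parent_conn.send('A')
    try:
        print('Parent received: ', parent_conn.recv())
    except EOFError:
        # Once every handle on the child's end is closed, recv() fails
        # promptly instead of hanging the way Queue.get() does.
        print('Child died without sending anything')
    process.join()
```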
Yes, something like that would indeed be really helpful.
It's not my decision, so I can't really say. But the Queue API is pretty stable, and exists three times over in Python (the queue module for use with threads, in multiprocessing, and in asyncio). So I'd guess that anyone wanting to add to that API would need to make a compelling case for why it's important, and be prepared for a lot of wrangling over API details (like method names and exceptions).

If you want to push that idea, you could try the ideas board on the Python discourse forum: https://discuss.python.org/c/ideas/6. You might also want to look at previous discussions about adding a Queue.close() method: bpo-29701 and bpo-40888.