
multiprocessing.Queue hangs when process on other side dies #87971

Open
kormang mannequin opened this issue Apr 11, 2021 · 6 comments
Labels: 3.8 (EOL), stdlib, topic-multiprocessing

Comments


kormang mannequin commented Apr 11, 2021

BPO 43805
Nosy @pitrou, @takluyver, @applio, @kormang, @shnizzedy

Note: these values reflect the state of the issue at the time it was migrated and might not reflect the current state.


GitHub fields:

assignee = None
closed_at = None
created_at = <Date 2021-04-11.11:52:50.914>
labels = ['3.8', 'library']
title = 'multiprocessing.Queue hangs when process on other side dies'
updated_at = <Date 2021-12-06.10:37:11.589>
user = 'https://github.com/kormang'

bugs.python.org fields:

activity = <Date 2021-12-06.10:37:11.589>
actor = 'takluyver'
assignee = 'none'
closed = False
closed_date = None
closer = None
components = ['Library (Lib)']
creation = <Date 2021-04-11.11:52:50.914>
creator = 'kormang'
dependencies = []
files = []
hgrepos = []
issue_num = 43805
keywords = []
message_count = 6.0
messages = ['390774', '390777', '390779', '406953', '407709', '407785']
nosy_count = 6.0
nosy_names = ['pitrou', 'takluyver', 'davin', 'kormang', 'shnizzedy', 'myles.steinhauser']
pr_nums = []
priority = 'normal'
resolution = None
stage = None
status = 'open'
superseder = None
type = None
url = 'https://bugs.python.org/issue43805'
versions = ['Python 3.8']


kormang mannequin commented Apr 11, 2021

When a child process dies unexpectedly, Queue.get waits indefinitely.

Here is an example:

import os
import signal
import multiprocessing

def child_func(qa, qb):
    input = qa.get()
    print('Child received: ', input)
    # The child kills itself before putting anything on qb ...
    os.kill(os.getpid(), signal.SIGTERM)
    qb.put('B')  # never reached
    exit(0)

qa = multiprocessing.Queue()
qb = multiprocessing.Queue()
process = multiprocessing.Process(target=child_func, args=(qa, qb))
process.start()

qa.put('A')
try:
    # ... so this get() blocks forever instead of raising an error.
    input = qb.get()
    print('Parent received: ', input)
except Exception as ex:
    print(ex)
process.join()
exit(0)
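
A hedged workaround sketch, not part of the original report: with the current API the parent can poll the queue with a timeout and check Process.is_alive() between attempts, so a dead child is detected instead of blocking forever. The helper name get_or_detect_death is made up for illustration.

import queue as queue_module

def get_or_detect_death(q, proc, poll_interval=1.0):
    # Return the next item from q, or raise RuntimeError if proc exited
    # without producing one.
    while True:
        try:
            return q.get(timeout=poll_interval)
        except queue_module.Empty:
            if not proc.is_alive():
                # One last non-blocking check, in case the item arrived
                # between the timeout and the liveness test.
                try:
                    return q.get_nowait()
                except queue_module.Empty:
                    raise RuntimeError('child exited without sending a result')

In the example above, replacing "input = qb.get()" with "input = get_or_detect_death(qb, process)" would make the parent raise instead of hanging.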

kormang mannequin added the 3.8 (EOL) and stdlib labels Apr 11, 2021

kormang mannequin commented Apr 11, 2021

Possible duplicate of bpo-22393


kormang mannequin commented Apr 11, 2021

Somewhat related: bpo-43806, with asyncio.StreamReader.


takluyver mannequin commented Nov 24, 2021

I think this is expected. The queue itself doesn't know that one particular process is meant to put data into it. It just knows that there's no data to get, so .get() blocks as the docs say it should.

This doesn't apply to bpo-22393, because the pool knows about its worker processes, so if one dies before completing a task, it can know something is wrong.

You could add a method to 'half close' a queue, so it can only be used for receiving, but not sending. If you called this in the parent process after starting the child, then if the child died, the queue would know that nothing could ever put data into it, and .get() could error. The channels API in Trio allows this, and it's the same idea I've just described at the OS level in bpo-43806.
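
For illustration, a minimal sketch of that half-close behaviour using Trio's in-process memory channels (trio is a third-party package, and these channels connect tasks rather than processes, so this only demonstrates the semantics, not a multiprocessing fix): once every send side has been closed, receive() raises trio.EndOfChannel instead of blocking forever.

import trio

async def producer(send_channel):
    # Closing the send half on exit is what lets the receiver learn
    # that no more data can ever arrive.
    async with send_channel:
        await send_channel.send('A')
        # The producer "dies" here without sending anything else.

async def consumer(receive_channel):
    async with receive_channel:
        while True:
            try:
                item = await receive_channel.receive()
            except trio.EndOfChannel:
                # Every sender is gone: stop waiting instead of hanging.
                print('Producer is gone, giving up')
                break
            print('Consumer received:', item)

async def main():
    send_channel, receive_channel = trio.open_memory_channel(0)
    async with trio.open_nursery() as nursery:
        nursery.start_soon(producer, send_channel)
        nursery.start_soon(consumer, receive_channel)

trio.run(main)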


kormang mannequin commented Dec 5, 2021

Yes, something like that would indeed be really helpful.
How likely is it that something like that gets implemented?


takluyver mannequin commented Dec 6, 2021

It's not my decision, so I can't really say. But the Queue API is pretty stable, and exists 3 times over in Python (the queue module for use with threads, in multiprocessing and in asyncio). So I'd guess that anyone wanting to add to that API would need to make a compelling case for why it's important, and be prepared for a lot of wrangling over API details (like method names and exceptions).

If you want to push that idea, you could try the ideas board on the Python Discourse forum: https://discuss.python.org/c/ideas/6.

You might also want to look at previous discussions about adding a Queue.close() method: bpo-29701 and bpo-40888.

@ezio-melotti ezio-melotti transferred this issue from another repository Apr 10, 2022