SSLContext.load_verify_locations leaks memory on Linux #84904
Minimal code to reproduce:

Running this code on several Linux machines (with Python versions from 3.6.9 to 3.9.0a5, and OpenSSL versions from 1.1.1 of 11 Sep 2018 to 1.1.1g of 21 Apr 2020) causes a significant memory leak, while on Windows memory usage peaks around 1 GB but gets freed.
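The repro snippet itself was not preserved in this copy of the thread; based on the `asyncio.run(main(2000))` calls quoted later, it was presumably something along these lines (a hedged reconstruction, not the original code — the coroutine names and task fan-out are assumptions):

```python
# Hedged reconstruction of the missing repro: create many SSLContext objects
# inside asyncio tasks and load the CA bundle into each one.
import asyncio
import ssl

CAFILE = ssl.get_default_verify_paths().cafile  # may be None on some systems

async def make_context() -> None:
    ctx = ssl.SSLContext(ssl.PROTOCOL_TLS)
    if CAFILE:
        ctx.load_verify_locations(CAFILE)

async def main(n: int) -> None:
    # Fan out n tasks, each building and configuring its own context.
    await asyncio.gather(*(make_context() for _ in range(n)))

if __name__ == "__main__":
    asyncio.run(main(2000))
```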
Does it also leak without asyncio?
Removing the …
Without asyncio, memory consumption stays low and stable for me:

$ ./python -m venv venv
$ ./venv/bin/pip install psutil
$ ./venv/bin/python
>>> import psutil, ssl, os
>>> ssl.OPENSSL_VERSION
'OpenSSL 1.1.1g FIPS 21 Apr 2020'
>>> p = psutil.Process(os.getpid())
>>> cafile = ssl.get_default_verify_paths().cafile
>>> p.memory_info()
pmem(rss=14811136, vms=237223936, shared=8138752, text=2125824, lib=0, data=6701056, dirty=0)
>>> for i in range(1000):
... ssl.SSLContext(ssl.PROTOCOL_TLS).load_verify_locations(cafile)
...
>>> p.memory_info()
pmem(rss=17489920, vms=238170112, shared=9863168, text=2125824, lib=0, data=7647232, dirty=0)
>>> for i in range(1000):
... ssl.SSLContext(ssl.PROTOCOL_TLS).load_verify_locations(cafile)
...
>>> p.memory_info()
pmem(rss=17489920, vms=238170112, shared=9863168, text=2125824, lib=0, data=7647232, dirty=0)
When I run your example, RSS jumps from 20 MB to about 1,600 MB. There is almost no increase when I run the loop several more times.

>>> p.memory_info()
pmem(rss=19902464, vms=240513024, shared=10014720, text=2125824, lib=0, data=9887744, dirty=0)
>>> asyncio.run(main(2000))
<stdin>:2: DeprecationWarning: The explicit passing of coroutine objects to asyncio.wait() is deprecated since Python 3.8, and scheduled for removal in Python 3.11.
>>> p.memory_info()
pmem(rss=1608568832, vms=1829105664, shared=10014720, text=2125824, lib=0, data=1598480384, dirty=0)
>>> asyncio.run(main(2000))
>>> p.memory_info()
pmem(rss=1608835072, vms=1829367808, shared=10014720, text=2125824, lib=0, data=1598742528, dirty=0)
>>> asyncio.run(main(2000))
>>> p.memory_info()
pmem(rss=1608601600, vms=1829367808, shared=10014720, text=2125824, lib=0, data=1598742528, dirty=0)

Why are you creating so many SSLContext objects anyway? It's very inefficient and really not necessary. I recommend that you create one context in your application and reuse it for all connections. You only ever need additional contexts for different configurations (protocol, verification, trust anchors, ...).
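The reuse pattern recommended above can be sketched like this (a minimal illustration, not code from the thread; the function name and wiring are made up):

```python
# Minimal sketch of reusing one SSLContext for all connections instead of
# creating one per request.
import socket
import ssl

SHARED_CONTEXT = ssl.create_default_context()  # loads trust anchors once

def open_tls_connection(host: str, port: int = 443) -> ssl.SSLSocket:
    # Each call reuses SHARED_CONTEXT; no repeated load_verify_locations().
    raw = socket.create_connection((host, port))
    return SHARED_CONTEXT.wrap_socket(raw, server_hostname=host)
```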
Same for me
That is the memory consumption I observe as well; the issue is that it doesn't get freed on Linux.
Same for me, but of course only if I exit the "async" context between runs
The original issue was observed in a very long-running process (months) that occasionally needed a context, and it was convenient to just create one every time (actually it creates an AsyncClient each time; see encode/httpx#978). Even though that is relatively inefficient, it didn't really matter, but memory usage slowly and unexpectedly grew to 1 GB.
FWIW, I've been running into the same issue independently with pyOpenSSL. One potentially relevant observation is that calling ctypes.CDLL('libc.so.6').malloc_trim(0) returns most memory back to the OS. I suspect that …
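The malloc_trim call mentioned above looks like this in context (Linux/glibc only; the hard-coded 'libc.so.6' name assumes a glibc system):

```python
# Ask glibc to release free heap memory back to the OS.
import ctypes

libc = ctypes.CDLL("libc.so.6")
# malloc_trim(0) returns 1 if memory was actually released, 0 otherwise.
released = libc.malloc_trim(0)
print("memory released:", bool(released))
```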
Leaks happen on Python 3.11 too, but only on Linux (built on Ubuntu 22.04.1 LTS).

Ubuntu 22.04.1 LTS:

Line #    Mem usage    Increment  Occurrences   Line Contents
=============================================================
     8     21.6 MiB     21.6 MiB           1   @profile
     9                                         def leak_test():
    10     21.6 MiB      0.0 MiB           1       ca_certs = certifi.where()
    11     26.7 MiB      5.2 MiB         303       contexts = [SSLContext(PROTOCOL_TLS_CLIENT) for _ in range(300)]
    12    238.5 MiB      0.0 MiB         301       for context in contexts:
    13    238.5 MiB    211.7 MiB         300           context.load_verify_locations(ca_certs)
    14
    15    238.5 MiB      0.0 MiB           1       del contexts
    16    238.5 MiB      0.0 MiB           1       gc.collect()
    17
    18    238.5 MiB      0.0 MiB           1       print('Test complete!')

Windows 10:

Line #    Mem usage    Increment  Occurrences   Line Contents
=============================================================
     8     22.7 MiB     22.7 MiB           1   @profile
     9                                         def leak_test():
    10     22.9 MiB      0.3 MiB           1       ca_certs = certifi.where()
    11     24.6 MiB      1.6 MiB         303       contexts = [SSLContext(PROTOCOL_TLS_CLIENT) for _ in range(300)]
    12    251.8 MiB      0.0 MiB         301       for context in contexts:
    13    251.8 MiB    227.2 MiB         300           context.load_verify_locations(ca_certs)
    14
    15     35.5 MiB   -216.3 MiB           1       del contexts
    16     35.5 MiB      0.0 MiB           1       gc.collect()
    17
    18     35.5 MiB      0.0 MiB           1       print('Test complete!')

My code to reproduce this bug:

import gc
import certifi
from ssl import SSLContext, PROTOCOL_TLS_CLIENT
from memory_profiler import profile

@profile
def leak_test():
    ca_certs = certifi.where()
    contexts = [SSLContext(PROTOCOL_TLS_CLIENT) for _ in range(300)]
    for context in contexts:
        context.load_verify_locations(ca_certs)

    del contexts
    gc.collect()

    print('Test complete!')

if __name__ == "__main__":
    leak_test()
Tested on Python 3.10.6 (Ubuntu 22.04.1 LTS); it leaks too.
On macOS (Ventura 13.1) I also see most memory being returned, like on Windows 10 above. So it does seem to be Linux-specific. Since the latest repro doesn't involve async code, I removed that from the subject and removed the expert-asyncio label. @mhils may well be right that this is some other leak together with a weird allocation pattern.
This still reproduces in 3.12.2 on Ubuntu 22.04, compiled against OpenSSL 3.0.2.
What exactly did you try? |
@gvanrossum Here is the code I used. It is a variant of the original: I am holding a reference to the SSLContext objects in a dictionary, to avoid any nebulous object-lifetime issues in the original. With the …
But if …
I just wanted to report that the s/leak/fragmentation/ was still observable on Linux with 3.12.2. I don't know enough about Python memory management to say whether that behavior is a Linux-specific artifact, but it does feel a bit odd that it is only observed on that platform.
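One way to tell fragmentation from a true leak is to measure RSS before and after forcing both a GC pass and a glibc trim. This is a hedged diagnostic sketch, not from the thread (Linux-only; it assumes glibc, /proc, and a readable system CA file):

```python
# If RSS drops after gc.collect() + malloc_trim(0), the pages were merely
# retained by glibc's heap, not leaked by OpenSSL/CPython.
import ctypes
import gc
import os
import ssl

def current_rss_kib() -> int:
    # Resident set size from /proc/self/statm (field 2 is resident pages).
    with open("/proc/self/statm") as f:
        resident_pages = int(f.read().split()[1])
    return resident_pages * os.sysconf("SC_PAGE_SIZE") // 1024

cafile = ssl.get_default_verify_paths().cafile  # may be None on some systems
contexts = [ssl.SSLContext(ssl.PROTOCOL_TLS_CLIENT) for _ in range(300)]
if cafile:
    for ctx in contexts:
        ctx.load_verify_locations(cafile)

rss_loaded = current_rss_kib()
del contexts
gc.collect()
ctypes.CDLL("libc.so.6").malloc_trim(0)
rss_trimmed = current_rss_kib()
print(f"RSS after load: {rss_loaded} KiB, after gc+trim: {rss_trimmed} KiB")
```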