Refs #31949 -- Enable @sensitive_variables to work with async functions #16831

Open · wants to merge 1 commit into main
Conversation

@bigfootjon (Contributor) commented on May 7, 2023

This is v2 of my attempt to add async support to @sensitive_variables and @sensitive_post_parameters. The original version had trouble with nested async functions, which required it to be reverted. I've realized that for further work on asyncifying the auth stack (current PR, discussion) I'll need @sensitive_variables eventually.

The broad problem that this attempts to solve is that async functions do not have a proper f_back field in tracebacks. I think python/cpython#91048 is the relevant upstream Python ticket, but I'm not sure. Since we can't rely on f_back, we have to find another way to store and recover information about which variables are sensitive. Fortunately, due to the way @sensitive_post_parameters works, it is not affected by the limitations of f_back: that decorator stores its state on the request argument to the view function, so there's no need to walk the stack or use other trickery.
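
For context, here is a minimal sketch of why that decorator sidesteps f_back entirely; this is simplified and the names are illustrative, not a verbatim copy of django/views/decorators/debug.py:

```python
import functools

def sensitive_post_parameters(*parameters):
    """Mark POST parameters as sensitive by annotating the request object."""
    def decorator(view):
        @functools.wraps(view)
        def wrapper(request, *args, **kwargs):
            # The state lives on the request itself, so the error reporter can
            # read it later without ever walking a traceback's f_back chain.
            request.sensitive_post_parameters = parameters or "__ALL__"
            return view(request, *args, **kwargs)
        return wrapper
    return decorator
```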

To persist the sensitive variables list for async functions, we can use a global dict keyed by the hash of the function's filename and line number (more formally: hash(f"{file_path}:{line_number}")), with the value being the tuple of sensitive variables (or __ALL__). Then, during traceback cleanup we can read this dict by reconstructing the same key from f_code.co_filename and f_code.co_firstlineno to look up the relevant value.
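
A minimal sketch of that registry idea (the names below are illustrative, not necessarily the ones used in the patch): the decorator records the sensitive names keyed by the wrapped coroutine's code location, and traceback cleanup rebuilds the same key from the frame's code object:

```python
import inspect

# Maps hash("{filename}:{firstlineno}") -> tuple of variable names, or "__ALL__".
sensitive_variables_registry = {}

def sensitive_variables(*variables):
    def decorator(func):
        if inspect.iscoroutinefunction(func):
            code = func.__code__
            key = hash(f"{code.co_filename}:{code.co_firstlineno}")
            sensitive_variables_registry[key] = variables or "__ALL__"
        # ... the existing sync wrapping logic stays as-is ...
        return func
    return decorator

def get_sensitive_variables(frame):
    """During traceback cleanup, rebuild the key from the frame's code object."""
    code = frame.f_code
    key = hash(f"{code.co_filename}:{code.co_firstlineno}")
    return sensitive_variables_registry.get(key)
```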

We could swap sync functions over to this new and improved way of doing things, but that felt like too drastic a change to me. It wouldn't be too difficult to swap everything over later.

I've also added some more test coverage compared to v1 of this PR, which should help alleviate any concerns that this version is still fragile. (Happy to add more tests; I added cases I hypothesized this solution might not handle, and I'm open to other ideas.)

@bigfootjon (Contributor, Author) commented:

Ah shoot, looks like co_qualname was added in 3.11 :(

Maybe it should be some combination of the function name and the module. I'll leave this up for review, but this might have to wait until 3.11 is the lowest supported version of Python if there isn't a suitable analog to co_qualname.

@smithdc1 (Member) commented on May 7, 2023

> until 3.11 is the lowest supported version of python

By my workings...

  • 3.10 is supported to Oct 2026.
  • 4.2 LTS ends in Apr 2026 (i.e. before 3.10 support ends).

3.10 will therefore be supported in the 5.x series, so the first version to drop support for 3.10 will be Django 6.0, release date Jan 2026.

Co-authored-by: Mariusz Felisiak <felisiak.mariusz@gmail.com>
@bigfootjon force-pushed the ticket_34391_sensitive_variables_v2 branch from b12bea3 to d28e229 on May 20, 2023 at 18:34
@bigfootjon (Contributor, Author) commented:

Ok, I've reworked the implementation here so it no longer needs co_qualname. It's now hash(filename:line_number) as the key to the dict, which I think is a better implementation anyway.

@andrewgodwin (Member) left a comment:

So, having looked over this code, I think it makes sense, but I'm slightly concerned about memory leaks from that global dictionary storing the variables. I don't think it will happen - hash should be consistent within the same process, and the number of entries being stored should be fixed - but that doesn't stop me being a little worried.

Despite that, I can't think of anything better, so I say go with this plan and let's hope it all holds up when faced with everyone's environments.

@felixxm self-assigned this on Jun 22, 2023