bpo-36654: add example to generate token from another file #12947
Conversation
This example confuses me, because the file …
Example of tokenizing from another file::

    import tokenize
    f = open('example.py', 'rb')
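For context, a minimal sketch of how that excerpt presumably continues (the filename example.py comes from the diff hunk; the loop and print are assumptions, not part of the proposed text):

import tokenize

# Sketch (assumed continuation of the doc example): pass the readline
# method of a binary file object to tokenize.tokenize().
with open('example.py', 'rb') as f:
    for token in tokenize.tokenize(f.readline):
        print(token)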
berkerpeksag (Member) commented Apr 27, 2019
Using tokenize.open() would be better as it automatically detects the encoding of the file:

Line 449 in 5d90954
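As a rough illustration of that point (a sketch, assuming a hello.py file exists in the current directory):

import tokenize

# Sketch: tokenize.open() reads the BOM / PEP 263 coding cookie and
# returns a text stream already decoded with the detected encoding.
with tokenize.open('hello.py') as f:
    print(f.encoding)   # the encoding that was detected
    print(f.read())     # decoded source text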
berkerpeksag (Member) commented Apr 27, 2019
Also, I agree with Emmanuel that using the existing hello.py
as an example would make the example easier to follow.
Windsooon (Author, Contributor) commented Apr 28, 2019
I found that the tokenize function already calls detect_encoding().
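Indeed, tokenize.tokenize() runs detect_encoding() on the byte stream internally. A small sketch of calling it directly (hello.py is assumed to exist):

import tokenize

# Sketch: detect_encoding() is the helper tokenize.tokenize() uses on a
# bytes readline; it returns the encoding and the lines it consumed.
with open('hello.py', 'rb') as f:
    encoding, consumed = tokenize.detect_encoding(f.readline)
    print(encoding)   # e.g. 'utf-8'
    print(consumed)   # list of bytes lines read while detecting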
berkerpeksag (Member) commented Apr 28, 2019
I just realized that tokenize.open() returns a text stream, so passing it to tokenize.tokenize() wouldn't work. We can use tokenize.generate_tokens() to demonstrate how the str API works:
import tokenize

with tokenize.open('hello.py') as f:
    token_gen = tokenize.generate_tokens(f.readline)
    for token in token_gen:
        print(token)
Then fall back to your example to show the usage of the bytes API:
import tokenize

with open('hello.py', 'rb') as f:
    token_gen = tokenize.tokenize(f.readline)
    for token in token_gen:
        print(token)
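One observable difference between the two streams (a small check, assuming the same hello.py): the bytes API emits the detected encoding as its very first token, whereas the str API does not.

import tokenize

# Sketch: tokenize.tokenize() yields an ENCODING token first;
# generate_tokens() starts directly with the source tokens.
with open('hello.py', 'rb') as f:
    first = next(tokenize.tokenize(f.readline))
    print(first.type == tokenize.ENCODING)   # True
    print(first.string)                      # e.g. 'utf-8'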
bedevere-bot commented Apr 27, 2019
A Python core developer has requested some changes be made to your pull request before we can consider merging it. If you could please address their requests along with any other requests in other reviews from core developers, that would be appreciated. Once you have made the requested changes, please leave a comment on this pull request containing the phrase …
Hi @Windsooon, could you update your PR with the last comment of @berkerpeksag and with the …? Thank you
Thanks for your contribution, but you have to update your PR with the recommendations of @berkerpeksag.
Hello @matrixise, I would love to update the example if needed. However, Serhiy Storchaka didn't agree with it.
I think Serhiy's comment was about the current form of the PR. In my last comment, I was proposing to add examples for both the bytes and unicode APIs of the tokenize module. As a core developer, even I missed that …
@Windsooon, any updates? Thanks!
Sure, I will fix it in two days. Thank you for the reminder.
Hi @csabella, I just updated the PR based on @berkerpeksag's suggestion.
Looks pretty good to me, thank you!
bedevere-bot commented Jan 25, 2020
@berkerpeksag: Please replace …
miss-islington commented Jan 25, 2020
Thanks @Windsooon for the PR, and @berkerpeksag for merging it.
…honGH-12947) (cherry picked from commit 4b09dc7) Co-authored-by: Windson yang <wiwindson@outlook.com>
bedevere-bot commented Jan 25, 2020
GH-18187 is a backport of this pull request to the 3.8 branch.
…honGH-12947) (cherry picked from commit 4b09dc7) Co-authored-by: Windson yang <wiwindson@outlook.com>
bedevere-bot commented Jan 25, 2020
GH-18188 is a backport of this pull request to the 3.7 branch.
Windsooon commented Apr 25, 2019 • edited by bedevere-bot
Add example to generate token from another file.
https://bugs.python.org/issue36654