justinpryzby commented on Feb 10, 2022

inserttable() uses a fixed-size 8k buffer and raises MemoryError for any longer row.
I think the column list should also be dynamically allocated.
This isn't theoretical: we have tables with (historically) up to 1600 columns, and we just failed to insert a row with a 7800-byte column into a narrow table. I still wonder whether inserttable should be rewritten in Python; I sent a mail about this some time ago.
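For what it's worth, here is a minimal sketch of the dynamic-allocation idea in C. The names (`row_buffer`, `row_buffer_append`) are illustrative, not PyGreSQL's actual internals: the point is to grow a heap buffer with realloc while serializing the row, instead of writing into a fixed 8k array.

```c
#include <stdlib.h>
#include <string.h>

/* Hypothetical sketch: grow a heap buffer on demand instead of
 * using a fixed-size array. Returns 0 on success, -1 on OOM
 * (where the caller would raise MemoryError). */
typedef struct {
    char *data;
    size_t len;
    size_t cap;
} row_buffer;

static int
row_buffer_append(row_buffer *buf, const char *s, size_t n)
{
    if (buf->len + n > buf->cap) {
        /* Double the capacity until the new data fits. */
        size_t new_cap = buf->cap ? buf->cap : 8192;
        while (buf->len + n > new_cap)
            new_cap *= 2;
        char *p = realloc(buf->data, new_cap);
        if (!p)
            return -1;
        buf->data = p;
        buf->cap = new_cap;
    }
    memcpy(buf->data + buf->len, s, n);
    buf->len += n;
    return 0;
}
```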
Yes, maybe that would not be such a big performance loss. But we should definitely verify this with benchmarks using such huge numbers of columns and rows before we replace it.
As a quick remedy, we could increase the buffer size. Since it is taken from the heap, I think we could safely set it to 64k.
I noticed that MAX_BUFFER_SIZE is used in only one other place, in getline(), where the buffer is allocated on the stack.
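That distinction matters if the constant is raised: a 64k allocation from the heap is harmless, but a 64k array on the stack in getline() could overflow small thread stacks. A sketch of how the two uses might be split (the macro names here are assumptions, not the module's actual defines):

```c
#include <stdlib.h>

/* Sketch only; the real names in pgmodule.c may differ. */
#define LINE_BUFFER_SIZE 8192         /* getline(): buffer lives on the stack, keep it small */
#define ROW_BUFFER_SIZE (64 * 1024)   /* inserttable(): buffer comes from the heap */

static char *
alloc_row_buffer(void)
{
    /* A 64k heap allocation is cheap and safe; the same size
     * on the stack would not necessarily be. */
    return malloc(ROW_BUFFER_SIZE);
}
```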
justinpryzby changed the title from "inserttable() fails with long rows" to "inserttable() fails with wide rows" on Mar 22, 2022.