Message7282
In issue2551142, I "fixed" the problem by committing after every row created
during the CSV import. This has performance implications.
Implementing this using subtransactions in PostgreSQL (see
https://postgres-py.readthedocs.io/en/latest/#postgres.Postgres.get_cursor)
may allow us to solve the initial issue without performing a per-row commit.
I am not sure how this interacts with the Python DB-API standard or with how
Roundup talks to the database, but it is worth a look.
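For illustration, the subtransaction idea amounts to wrapping each row in a
SAVEPOINT inside one outer transaction: a bad row is rolled back to its
savepoint while the rows already processed stay pending, and a single commit
ends the import. The sketch below uses sqlite3 so it is self-contained and
runnable; PostgreSQL accepts the same SAVEPOINT/RELEASE/ROLLBACK TO syntax,
and postgres.py's get_cursor wraps this pattern. The table name, columns,
and sample rows are hypothetical, not taken from Roundup's schema.

```python
import sqlite3


def import_rows(conn, rows):
    """Import rows inside one outer transaction, using a per-row
    SAVEPOINT so a failing row can be discarded without committing
    (or losing) the rows already processed."""
    cur = conn.cursor()
    cur.execute("BEGIN")  # one outer transaction for the whole import
    imported, skipped = 0, 0
    for row in rows:
        cur.execute("SAVEPOINT row_import")
        try:
            # Hypothetical target table; a real importer would build
            # the INSERT from the CSV header.
            cur.execute("INSERT INTO items (id, name) VALUES (?, ?)", row)
            cur.execute("RELEASE SAVEPOINT row_import")
            imported += 1
        except sqlite3.DatabaseError:
            # Undo only this row; earlier rows remain pending.
            cur.execute("ROLLBACK TO SAVEPOINT row_import")
            cur.execute("RELEASE SAVEPOINT row_import")
            skipped += 1
    cur.execute("COMMIT")  # single commit instead of one per row
    return imported, skipped


# isolation_level=None gives manual transaction control, so the module
# does not issue its own implicit BEGIN/COMMIT around statements.
conn = sqlite3.connect(":memory:", isolation_level=None)
conn.execute("CREATE TABLE items (id INTEGER PRIMARY KEY, name TEXT)")
rows = [(1, "a"), (1, "duplicate id"), (2, "b")]
print(import_rows(conn, rows))  # → (2, 1)
```

The duplicate primary key on the second row raises an IntegrityError, which
is rolled back to the savepoint; the other two rows survive the one commit.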
Date                | User   | Action | Args
2021-06-13 19:42:33 | rouilj | set    | recipients: + rouilj
2021-06-13 19:42:33 | rouilj | set    | messageid: <1623613353.14.0.983541431787.issue2551144@roundup.psfhosted.org>
2021-06-13 19:42:33 | rouilj | link   | issue2551144 messages
2021-06-13 19:42:32 | rouilj | create |