
Benevity import errors
Closed, Resolved · Public · 4 Estimated Story Points

Description

As discussed in Civi Fortnightly, I've been having trouble with the Benevity import. It keeps failing with a 504 Gateway Timeout error, and the import page shows "Unknown import error: Database deadlock encountered". After a few attempts, most rows in my file are now imported, but I'll need to import more this week.

Event Timeline

Change 463152 had a related patch set uploaded (by Eileen; owner: Eileen):
[wikimedia/fundraising/crm@master] Move more variables to class properties

https://gerrit.wikimedia.org/r/463152

Change 463179 had a related patch set uploaded (by Eileen; owner: Eileen):
[wikimedia/fundraising/crm@master] Remove numSkippedRows handling

https://gerrit.wikimedia.org/r/463179

Change 463193 had a related patch set uploaded (by Eileen; owner: Eileen):
[wikimedia/fundraising/crm@master] Unwind weird row_index vs rowNum thing

https://gerrit.wikimedia.org/r/463193

Change 463152 merged by jenkins-bot:
[wikimedia/fundraising/crm@master] Move more variables to class properties

https://gerrit.wikimedia.org/r/463152

Change 463179 merged by jenkins-bot:
[wikimedia/fundraising/crm@master] Remove numSkippedRows handling

https://gerrit.wikimedia.org/r/463179

Change 463193 merged by jenkins-bot:
[wikimedia/fundraising/crm@master] Unwind weird row_index vs rowNum thing

https://gerrit.wikimedia.org/r/463193

@Eileenmcnaughton I just imported a new large file and it led to a 504 timeout error.

Change 467569 had a related patch set uploaded (by Eileen; owner: Eileen):
[wikimedia/fundraising/crm@master] Import code tweak - Use headers property rather than passing headers variable

https://gerrit.wikimedia.org/r/467569

Change 467595 had a related patch set uploaded (by Eileen; owner: Eileen):
[wikimedia/fundraising/crm@master] Extract the minor functions that mark the outcomes of the row processing.

https://gerrit.wikimedia.org/r/467595

Change 467597 had a related patch set uploaded (by Eileen; owner: Eileen):
[wikimedia/fundraising/crm@master] Handle NULL for url rather than expect a duff value

https://gerrit.wikimedia.org/r/467597

Change 467832 had a related patch set uploaded (by Eileen; owner: Eileen):
[wikimedia/fundraising/crm@master] Instantiate all output files at the start.

https://gerrit.wikimedia.org/r/467832

Change 467833 had a related patch set uploaded (by Eileen; owner: Eileen):
[wikimedia/fundraising/crm@master] Do our validation & header instantiation in construct, as long as we have the field uri

https://gerrit.wikimedia.org/r/467833

Change 467845 had a related patch set uploaded (by Eileen; owner: Eileen):
[wikimedia/fundraising/crm@master] Use function to write csv rows.

https://gerrit.wikimedia.org/r/467845

Change 467847 had a related patch set uploaded (by Eileen; owner: Eileen):
[wikimedia/fundraising/crm@master] Switch to phpleague csv Writer package.

https://gerrit.wikimedia.org/r/467847

Change 467569 merged by Eileen:
[wikimedia/fundraising/crm@master] Import code tweak - Use headers property rather than passing headers variable

https://gerrit.wikimedia.org/r/467569

Change 467595 merged by Eileen:
[wikimedia/fundraising/crm@master] Extract the minor functions that mark the outcomes of the row processing.

https://gerrit.wikimedia.org/r/467595

Change 467597 merged by Eileen:
[wikimedia/fundraising/crm@master] Handle NULL for url rather than expect a duff value

https://gerrit.wikimedia.org/r/467597

Change 467852 had a related patch set uploaded (by Eileen; owner: Eileen):
[wikimedia/fundraising/crm@master] Switch to permitting rowCount & offset in import function

https://gerrit.wikimedia.org/r/467852

Change 468174 had a related patch set uploaded (by Eileen; owner: Eileen):
[wikimedia/fundraising/crm@master] Add a statement about server load to inform import timing decisions.

https://gerrit.wikimedia.org/r/468174

Change 467833 merged by jenkins-bot:
[wikimedia/fundraising/crm@master] Do our validation & header instantiation in construct, as long as we have the field uri

https://gerrit.wikimedia.org/r/467833

Change 467832 merged by jenkins-bot:
[wikimedia/fundraising/crm@master] Instantiate all output files at the start.

https://gerrit.wikimedia.org/r/467832

Change 467845 merged by jenkins-bot:
[wikimedia/fundraising/crm@master] Use function to write csv rows.

https://gerrit.wikimedia.org/r/467845

Change 467847 merged by jenkins-bot:
[wikimedia/fundraising/crm@master] Switch to phpleague csv Writer package.

https://gerrit.wikimedia.org/r/467847
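
For context, a minimal sketch of how the league/csv Writer package is typically used to write output rows; the file name, header, and row data below are placeholders for illustration, not the actual import code.

```php
<?php
// Minimal league/csv Writer sketch; file name and rows are placeholders.
require 'vendor/autoload.php';

use League\Csv\Writer;

// Open (or create) the output file for writing.
$writer = Writer::createFromPath('skipped_rows.csv', 'w+');

// Write the header once, then append data rows as they are produced.
$writer->insertOne(['Transaction ID', 'Donor Email', 'Error']);
$writer->insertAll([
  ['TX-1001', 'donor@example.org', 'Duplicate contribution'],
  ['TX-1002', 'other@example.org', 'Missing amount'],
]);
```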

Change 467852 merged by jenkins-bot:
[wikimedia/fundraising/crm@master] Switch to permitting rowCount & offset in import function

https://gerrit.wikimedia.org/r/467852

@LeanneS I just deployed the new changes that should prevent timeouts - let me know how you go.

Will do! I should have a large file later this week.

Change 468174 merged by jenkins-bot:
[wikimedia/fundraising/crm@master] Add a statement about server load to inform import timing decisions.

https://gerrit.wikimedia.org/r/468174

@Eileenmcnaughton Is the "number of items to process each batch" the number of rows in the file? If so, does this exclude the header?

Okay, I tried importing after updating the number of items, but it timed out with my file. Is there a maximum number of rows it can accept?

Screen Shot 2018-10-30 at 10.16.22 AM.png (screenshot attachment, 41 KB)

@LeanneS - yes, it excludes the header. You're not aiming to enter the total number of rows here, but rather the number of rows to import before it 'takes a breath and then does the next lot'. How many did you try?
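
For illustration only, a generic sketch of that batch-size behaviour (not the project's actual code; function and variable names are hypothetical): the batch-size value caps how many data rows are processed per pass, and the offset advances until the file is exhausted.

```php
<?php
// Generic illustration of batch-size semantics; names are hypothetical,
// not the actual importer's API.
function importChunk(string $csvFile, int $offset, int $batchSize): int {
  $handle = fopen($csvFile, 'r');
  fgetcsv($handle);                      // skip the header row
  for ($i = 0; $i < $offset; $i++) {     // skip rows already imported
    fgetcsv($handle);
  }
  $processed = 0;
  while ($processed < $batchSize && ($row = fgetcsv($handle)) !== false) {
    processRow($row);                    // hypothetical per-row import
    $processed++;
  }
  fclose($handle);
  return $processed;                     // 0 means the file is finished
}
```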

Ah, got it. So if I leave it at a smaller number like 100, it will still import the whole file, just in pieces? I first had it on 100 but noticed that most of it wasn't showing in Civi, so I tried a second time with 1810 items.

@LeanneS the tech details: @Eileenmcnaughton's new code uploads the whole file when you hit submit, then processes it piece by piece while showing a progress bar. Your browser makes a new request for each piece, so if those requests are timing out you can try smaller pieces.

When you imported it with batch size 100, did you see the progress bar make it all the way across the screen? Or did it stop at some point?
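
To make the piece-by-piece processing described above concrete, here is a hypothetical sketch of the step the browser repeats on each request: process one chunk, advance the offset, and report progress for the bar. The endpoint shape and helper names are assumptions, not the project's actual API.

```php
<?php
// Hypothetical per-request import step; helper names are assumptions.
function handleImportStep(string $csvFile, int $offset, int $batchSize): array {
  $totalRows = countDataRows($csvFile);                // hypothetical helper
  $done = importChunk($csvFile, $offset, $batchSize);  // see sketch earlier in the thread
  $newOffset = $offset + $done;
  return [
    'offset'   => $newOffset,
    'finished' => $done === 0 || $newOffset >= $totalRows,
    'progress' => $totalRows ? min(1.0, $newOffset / $totalRows) : 1.0,
  ];
}
```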

It stopped and said that all rows were imported, but when I checked an hour or so later, it still showed only 100 records added.

@LeanneS maybe we can do a screen share in about 90 mins?