
Civi timeout
Closed, Resolved, Public

Description

I have a list of roughly 20K records that need to be updated with about six data fields, which I am trying to import. Civi keeps timing out.

I've broken the list down several times, sometimes to fewer than 2K records, and it is still timing out. PLEASE HELP.
I need to get this information into Civi ASAP.

Event Timeline

DStrine moved this task from Triage to Current Sprint on the Fundraising-Backlog board.
DStrine added a subscriber: DStrine.

I'm looking for info in the logs about this, @NNichols. Do you remember what time you started one of the imports that timed out? That will help narrow down the search.

I see a log entry with this error and your user ID:

Data too long for column 'preferred_language' at row 39

Can you check the language column and make sure the data is not too long for any entries?

Aha, I found it: Civi can't use the longer codes for the Chinese languages (zh_hans / zh_hant); instead we need to use the locales zh_CN and zh_TW for simplified and traditional. Try doing a search and replace of zh_hans with zh_CN and zh_hant with zh_TW and see if that lets you do the import.
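
For anyone hitting the same error, a minimal sketch of that search-and-replace over the CSV before re-importing. The file names are hypothetical, and the 5-character length check is only an assumption based on locale codes like en_US; adjust both to the actual export.

```python
import csv

# Hypothetical file names; adjust to the actual export/import files.
SRC, DST = "import.csv", "import_fixed.csv"

# Map the script-based codes Civi rejects to locale codes it accepts.
REPLACEMENTS = {"zh_hans": "zh_CN", "zh_hant": "zh_TW"}

with open(SRC, newline="", encoding="utf-8") as src, \
        open(DST, "w", newline="", encoding="utf-8") as dst:
    reader = csv.DictReader(src)
    writer = csv.DictWriter(dst, fieldnames=reader.fieldnames)
    writer.writeheader()
    for line_no, row in enumerate(reader, start=2):  # header is line 1
        lang = row.get("preferred_language") or ""
        if lang:
            row["preferred_language"] = REPLACEMENTS.get(lang, lang)
            # Locale codes like en_US are 5 characters; flag anything longer,
            # since that is what tripped the "Data too long" error.
            if len(row["preferred_language"]) > 5:
                print(f"line {line_no}: long value {row['preferred_language']!r}")
        writer.writerow(row)
```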

I started this morning around 4 AM EST? Then I tried multiple more times just before filing this task.

I'm not trying to import any language data.

My fields are: Affinity, Estimated Capacity, Capacity Range, Planned Giving Score, Planned Giving Segment

@NNichols I was just going to test the first 500 rows. Did you save a mapping for the import?

@NNichols oops, I was looking at errors from a few days ago; it looks like Civi might have been giving you problems when you tried to export some of these same users? Anyway, I don't actually see any crashes in that same log, and @Eileenmcnaughton has been able to import quite a few rows. Maybe you were just up against some heavy database processing?

@NNichols I plugged through the spreadsheet trying to find an issue or pattern, and found pretty consistently that I could import around 2,000 rows in a batch but not much more.

The way it works internally is that the import is batched into 50 rows per request; if those 50 rows don't process within 30 seconds, the request times out. For a reason I've never quite figured out, batches slow down as processing goes along, so somewhere around 2,000 rows in it can no longer get through 50 rows in 30 seconds. I went through the spreadsheet chunk by chunk to see whether it was struggling with any specific rows or just with the volume, and it seems to be the latter. I did import the entire spreadsheet.
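
Since roughly 2,000 rows per file was the practical ceiling found above, a sketch of splitting the spreadsheet into import-sized pieces before feeding them to the UI importer. The file names are hypothetical, and the chunk size is the empirical figure from this testing rather than any hard limit.

```python
import csv

SRC = "import.csv"   # hypothetical input file name
CHUNK_ROWS = 2000    # empirical per-file ceiling from the testing above

def write_chunk(header, rows, index):
    # Each chunk repeats the header row so it can be imported standalone.
    with open(f"import_part{index:02d}.csv", "w", newline="",
              encoding="utf-8") as dst:
        writer = csv.writer(dst)
        writer.writerow(header)
        writer.writerows(rows)

with open(SRC, newline="", encoding="utf-8") as src:
    reader = csv.reader(src)
    header = next(reader)
    chunk, index = [], 1
    for row in reader:
        chunk.append(row)
        if len(chunk) == CHUNK_ROWS:
            write_chunk(header, chunk, index)
            chunk, index = [], index + 1
    if chunk:  # write the final partial chunk
        write_chunk(header, chunk, index)
```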

@Dwisehaupt the PHP timeout is 30 seconds, I think, which is pretty low, possibly to ensure background processes aren't going into loops. Could we increase it for UI processes?

@Eileenmcnaughton Thank you for getting that in for us! We will need to make these kinds of large import updates on a regular basis in the future.

Yeah, can anyone explain what happened here? Civi users are used to importing much larger files on a regular basis. We need to support that use case.

@DStrine most of our imports are done through our custom code; this is the main Civi UI. For the custom code we can toggle the batch size right down in the UI, which is probably why it doesn't time out, although I think a file of 20K rows would probably still need to be broken up.

@Eileenmcnaughton Just for clarity, the PHP config setting that is at 30 seconds is max_execution_time (https://www.php.net/manual/en/info.configuration.php#ini.max-execution-time). Is this what you are thinking of adjusting? All the other timeout settings I see listed are at least 60 seconds.
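
For reference, the directive in question as it might look in the cgi/apache php.ini; the 60-second value matches the change applied below, and the exact file location depends on the distribution.

```ini
; Per-request limit for the web/CGI SAPI; the CLI SAPI defaults to 0 (unlimited).
max_execution_time = 60
```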

Additionally, it looks like there are a few settings that are hard-coded in the CLI config but parameterized in the cgi/apache config. I could standardize those if we see the need.

Updated the timeout for php::mod::cgi to 60 seconds.

[frack::puppet] 4fc65d09 update php::mod::cgi max_execution_time to 60 for civicrm
DStrine set Final Story Points to 4.