
Improve handling of large number of rows for SearchKit download
Open, Needs Triage, Public

Description

This hangs with a large number of rows (a couple hundred thousand). The issue is that SearchKit (SK) passes the list of ids the user has selected back to the server. When this list gets long, the server rejects the request.
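To see why a couple hundred thousand selected ids would trip a 1 MB body limit, here is a rough back-of-the-envelope estimate (the id range and payload shape are illustrative assumptions, not the actual SK request format):

```python
import json

# Assumption: ~200,000 contact ids of roughly 9 digits each,
# serialized as a JSON array inside the POST body.
ids = list(range(100_000_000, 100_200_000))
body = json.dumps({"filters": {"id": ids}})

# Each id contributes ~11 bytes (digits plus separator), so the
# body lands well above nginx's default 1 MB client_max_body_size.
print(f"POST body size: {len(body) / 1_000_000:.1f} MB")
```

Even at 10 M the limit only buys headroom; the payload still grows linearly with the number of selected rows.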

I believe the relevant limit in nginx is client_max_body_size, which I think might be set at the default of 1 MB. @Dwisehaupt, could we increase this?

But that's not an ideal fix. I think it would be better in core if, when all rows are selected and we're doing a download, we simply don't pass the filter = [ids] at all, so we get the full results of the search. That could be a bit unexpected if we select all 10 rows and get back 11 because one was added in the interim, but it seems better than the current situation. If that's a concern, we could set a minimum count below which we still pass the filter.
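The proposed client-side logic could be sketched as follows (all names here are hypothetical illustrations, not CiviCRM core API; the threshold value is an assumption):

```python
# Assumed threshold: below this row count the id list is still small
# enough to send, so keep the exact selection semantics.
MIN_UNFILTERED_COUNT = 1000

def build_download_filters(selected_ids, all_selected, total_count):
    """Return the filter dict to attach to the download request.

    When the user has chosen "select all" and the result set is large,
    omit the id filter entirely so the server re-runs the saved search
    instead of receiving a multi-megabyte id list. Trade-off: rows added
    between selection and download will be included.
    """
    if all_selected and total_count >= MIN_UNFILTERED_COUNT:
        return {}
    return {"id": selected_ids}
```

For example, `build_download_filters(ids, all_selected=True, total_count=200_000)` returns `{}` and the server streams the full search result, while a small hand-picked selection still round-trips the exact ids.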


Event Timeline

Actually, passing the filter id = [] doesn't work either (you can reproduce this by choosing download without selecting any rows; that works for a lower number of rows, but not with as many rows as in the SK in question). So there's something more going on here that needs looking into.

@Lars in nginx, client_max_body_size was already increased to 10M in T219191. Also, on the PHP side we have the following adjustments from that task:

post_max_size            => '12M',
upload_max_filesize      => '10M',
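For reference, the corresponding nginx directive looks like this (a sketch; the actual location and context in the WMF config may differ):

```nginx
# Request-body limit; must be at least as large as the expected
# id-list payload, and no larger than PHP's post_max_size allows.
client_max_body_size 10M;
```

Note that post_max_size (12M) is deliberately set above upload_max_filesize and client_max_body_size (10M), so nginx rejects oversized bodies before PHP ever sees them.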
AKanji-WMF subscribed.

@Lars this is on the list for CiviCore work consideration, right?

@AKanji-WMF Potentially. I need to dig in a little more to figure out if this is something in core or something on our end.

Thanks @Dwisehaupt, looks like that isn't the issue then.

@MBeat33 also reports failures for this SK with only 24k rows.