
server error seen when attempting to export long search kit list
Closed, Resolved · Public

Description

When using this search:
https://civicrm.wikimedia.org/civicrm/admin/search#/edit/483

As of this writing, the search produces ~8K rows. When you try to export them, you get a server error:

(Screenshot of the server error attached: image.png, 141 KB)

I was able to export a shorter list of ~45 rows just fine. I don't know whether this is due to the list length or some other factor.

Event Timeline

It turns out the list of ids is passed via the URL, and the URL length is the problem. I've created https://lab.civicrm.org/dev/core/-/issues/2736 to discuss this with Coleman & Tim.
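The general client-side remedy for this class of problem is to send a long id list in a POST body rather than the query string. A minimal sketch of that decision (a hypothetical helper for illustration, not CiviCRM's actual export code):

```python
# Hypothetical illustration: pick GET or POST based on how long the
# encoded URL would be. Not CiviCRM's actual export code.
from urllib.parse import urlencode

# Common server-side default for the whole request line (~8 KB).
MAX_URL_BYTES = 8192

def plan_request(base_url, ids):
    """Return ('GET', url) when the id list fits in the URL,
    else ('POST', base_url, body) so the list travels in the body."""
    body = urlencode({"ids": ",".join(str(i) for i in ids)})
    url = f"{base_url}?{body}"
    if len(url.encode("utf-8")) <= MAX_URL_BYTES:
        return ("GET", url)
    return ("POST", base_url, body)
```

A short list stays a GET; an 8,000-id list overflows the ~8 KB request line and falls back to POST.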

DStrine triaged this task as Medium priority.Aug 4 2021, 1:18 PM

@Jgreen @Dwisehaupt - do you know what limit we have on URL length?

"6.5.12. 414 URI Too Long

The 414 (URI Too Long) status code indicates that the server is
refusing to service the request because the request-target (Section
5.3 of [RFC7230]) is longer than the server is willing to interpret.
This rare condition is only likely to occur when a client has
improperly converted a POST request to a GET request with long query
information, when the client has descended into a "black hole" of
redirection (e.g., a redirected URI prefix that points to a suffix of
itself) or when the server is under attack by a client attempting to
exploit potential security holes."

We are using the standard nginx and apache settings for these. The relevant sections of documentation are:

Nginx:
large_client_header_buffers: https://nginx.org/en/docs/http/ngx_http_core_module.html#large_client_header_buffers

Apache:
LimitRequestFieldSize: https://httpd.apache.org/docs/2.4/mod/core.html#limitrequestfieldsize
LimitRequestLine: https://httpd.apache.org/docs/2.4/mod/core.html#limitrequestline

They all default to ~8k bytes.

@Dwisehaupt how do you feel about increasing it? I'm not 100% sure how long the URL would need to be to fit the string they want (which is basically a list of ids to export), but if you're comfortable trying, we could increase it substantially.

I guess if we look at ^^, then 8000 rows means 8000 ids times up to 8 digits each, plus a separator per id (I don't think our ids reach 9 digits) - somewhere around 64-72K bytes - so the limit would have to go up a lot to accommodate that much.
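The back-of-the-envelope arithmetic above can be sketched as follows (assuming one separator character between ids):

```python
# Estimate the query-string size for an 8,000-row export where each
# contact id can be up to 8 digits, with one separator per id.
rows = 8000
digits_per_id = 8
separator = 1  # one comma (or similar) between ids
estimated_bytes = rows * (digits_per_id + separator)
print(estimated_bytes)  # 72000
```

So a worst-case 8,000-id list needs roughly 72K bytes, comfortably under the 80K limit chosen below but far over the ~8K default.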

After discussion today, we are going to look at increasing this 10x for now to 80k.
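The 10x increase might look like the following (directive values are a sketch; the exact config file locations depend on the deployment):

```
# nginx: allow the request line / headers to use larger buffers
# (default is 4 buffers of 8k)
large_client_header_buffers 4 80k;
```

```
# Apache: raise the request-line and header-field limits
# (both default to 8190 bytes)
LimitRequestLine 81920
LimitRequestFieldSize 81920
```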

Jgreen moved this task from In Progress to Done on the fundraising-tech-ops board.

The nginx and apache limits have been raised to 80k.

I am still unable to run the Fredge export for ~10K lines.

In relation to T292784 we have bumped the buffers up to 640k. Please try this again and see if you are still encountering issues.

Tried for 17K lines and it worked. Thank you!