Templatetiger-Updating: Lost connection to MySQL server during query
Closed, Declined · Public


I want to update my Templatetiger[1] database with a long-running MySQL import query.

I use jsub for it:
"jsub -mem 1000m public_html/"

So I start "" with the following content to start MySQL:
mysql -h tools-db --local --database=s51071__templatetiger_p < ~/public_html/einspielen.sql

The SQL file, at 126 kB, is not small, but it worked in the past.

But after some time I get:
"ERROR 2013 (HY000) at line 23: Lost connection to MySQL server during query"

What can I do?


Event Timeline

It really seems related to T180380, but there is no solution there.

There is a long-running query killer on these servers that you might be running afoul of, depending on the length of time we are talking.

Importing several GB takes some time. This was no problem in the past because I used "/* SLOW_OK */" in the SQL statement[1].

What can I do now to update the service?


Taking a look at where I think the query killer lives, it seems like the comment won't have any effect. However, it could be that some exception is needed for LOAD DATA LOCAL type statements? @jcrespo Am I close to the mark here?

There is no query killer on ToolsDB; users are free to run long queries as long as they don't affect other users. In any case, a 200 KB file is unlikely to produce long queries, as it should import almost instantly (I consider it very small).

LOAD DATA from the server is restricted, as it has been a cause of vulnerabilities in the past, but I don't think that should affect LOAD DATA LOCAL at all. Maybe it is trying to execute a higher-level command like CHANGE MASTER or something else; they need to share the original .sql to know more. The point is that there is nothing limiting or preventing that, as far as we are aware. Normally those errors happen when a command is not compatible with the existing configuration, or when there is a database issue; isolating the specific command causing the error is the way to solve it.

Oh, I just saw the SQL up here. @Kolossos, please execute it line by line and tell us where it fails, and we will be able to help you better. Thank you!
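The line-by-line run suggested above can be scripted. A minimal sketch (the file name `einspielen.sql` is from this thread; the split on `;` is naive and assumes no semicolons inside string literals):

```python
def split_statements(sql_text):
    """Split an SQL script into individual statements so the failing
    one can be located and run in isolation.

    Naive split on ';' -- fine for simple import scripts, wrong if
    string literals contain semicolons.
    """
    statements = []
    for part in sql_text.split(";"):
        stmt = part.strip()
        if stmt:  # skip blank fragments between semicolons
            statements.append(stmt + ";")
    return statements

# Intended use: feed each statement to mysql separately, e.g.
#   for stmt in split_statements(open("einspielen.sql").read()):
#       run it and note which one raises ERROR 2013
```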

The line-by-line execution shows that the LOAD DATA LOCAL line is where it breaks.

Then I saw that this line was inside a

set autocommit = 0;
LOAD DATA LOCAL... ;
commit;
set autocommit = 1;

After removing this part, which I added years ago, it seems to work.

But it seems relatively slow: it needs 20 hours to import a 3 GB file.

After starting it with jsub (without autocommit=0), it loses the connection to the database during the LOAD DATA LOCAL.

Change 477137 had a related patch set uploaded (by GTirloni; owner: GTirloni):
[operations/puppet@production] labsdb: Increase net_{read,write}_timeout

Although the SQL file is small, it's actually loading multi-GB files into the database.

Issues inherent to a shared platform could mean performance is not always consistent (the host executing the script is shared with other users and is itself on a shared hypervisor, there are network limits, the database server is also shared, etc). This could cause the script to experience periods of slowdown that trigger default MySQL timeouts:

> show variables like 'net%timeout';
| Variable_name     | Value |
| net_read_timeout  | 30    |
| net_write_timeout | 60    |

I've submitted the change above to increase these values to 120 and 240 seconds, respectively.

Following some advice from @Volans, I'll try a per-session timeout setting in my own environment and see how it reacts.
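The per-session variant avoids a server-wide change: the client raises its own timeouts at the start of the session. A sketch that builds such a prelude (`net_read_timeout` and `net_write_timeout` are real MySQL/MariaDB system variables; the helper function and the 600-second values are hypothetical):

```python
def session_timeout_prelude(read_timeout=600, write_timeout=600):
    """Return SQL that raises the network timeouts for the current
    session only, leaving the server-wide defaults untouched."""
    return (
        f"SET SESSION net_read_timeout = {int(read_timeout)};\n"
        f"SET SESSION net_write_timeout = {int(write_timeout)};\n"
    )

# Prepend this to the import script before feeding it to mysql, e.g.:
#   sql = session_timeout_prelude() + open("einspielen.sql").read()
```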

@GTirloni @jcrespo The issue still exists, but now it is necessary to import the data, as the database was deleted by T216441.
I tried to import the Ukrainian Wiki, a 4.3 GB file, and lost the connection to the database during the import:

ERROR 2013 (HY000) at line 11: Lost connection to MySQL server during query

You can try the SQL commands, which you can find under:

I don't have much context for this, but I would recommend chopping the original file into smaller pieces and importing them in batches. E.g. you can read the original text file and convert it into multi-row INSERT (),(),() statements, inserting several rows at a time rather than the whole 4 GB file in a single step. I am not aware of any imposed limit that would prevent the original import, but this way you will be able to debug it and/or do it faster.
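The batching advice above can be sketched as follows: read raw tab-separated rows and emit multi-row INSERT statements of bounded size instead of one giant statement (the table name is a placeholder, not from this thread; the escaping is deliberately minimal):

```python
def batched_inserts(rows, table, batch_size=1000):
    """Yield multi-row INSERT statements, batch_size rows each.

    rows: iterable of tuples of string values.
    Escaping handles only quotes and backslashes -- a real import
    should use the client library's own escaping instead.
    """
    def quote(value):
        escaped = value.replace("\\", "\\\\").replace("'", "\\'")
        return f"'{escaped}'"

    batch = []
    for row in rows:
        batch.append("(" + ", ".join(quote(v) for v in row) + ")")
        if len(batch) == batch_size:
            yield f"INSERT INTO {table} VALUES " + ", ".join(batch) + ";"
            batch = []
    if batch:  # flush the final, possibly short, batch
        yield f"INSERT INTO {table} VALUES " + ", ".join(batch) + ";"
```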

Change 477137 abandoned by GTirloni:
labsdb: Increase net_{read,write}_timeout

No more user reports about this issue.

Closed due to inactivity.

Perhaps TemplateTiger should be turned off, as it no longer works due to this issue?