
A database query error occurred when transwiki importing a page from zhwikipedia to zhwikiversity
Closed, Duplicate · Public


A database query error has occurred. Did you forget to run your application's database schema updater after upgrading? Query: INSERT IGNORE INTO `page` (page_namespace,page_title,page_restrictions,page_is_redirect,page_is_new,page_random,page_touched,page_latest,page_len) VALUES ('0','邪典電影列表','','0','1','0.163800399756','20190608053733','0','0') Function: WikiPage::insertOn Error: 1205 Lock wait timeout exceeded; try restarting transaction (
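The error message itself suggests the remedy ("try restarting transaction"). A minimal sketch of that retry pattern in Python; `LockWaitTimeout` and `insert_page` are stand-ins for the real MySQL error 1205 and the real `WikiPage::insertOn` call, not actual MediaWiki code:

```python
import time

class LockWaitTimeout(Exception):
    """Stand-in for MySQL error 1205 (Lock wait timeout exceeded)."""

def run_with_retry(txn, attempts=3, backoff=0.01):
    """Run a transactional callable, restarting it on lock wait timeouts."""
    for attempt in range(1, attempts + 1):
        try:
            return txn()
        except LockWaitTimeout:
            if attempt == attempts:
                raise
            time.sleep(backoff * attempt)  # brief backoff before restarting

# Simulated transaction: fails twice with a lock wait timeout, then succeeds.
calls = {"n": 0}
def insert_page():
    calls["n"] += 1
    if calls["n"] < 3:
        raise LockWaitTimeout("1205: try restarting transaction")
    return "page row inserted"

result = run_with_retry(insert_page)
print(result, calls["n"])  # page row inserted 3
```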

Source wiki:w (zhwikipedia)
Source page: 邪典電影列表
Selected: Copy all history revisions for this page and Assign edits to local users where the named user exists locally

Event Timeline

Aklapper changed the task status from Open to Stalled.Jun 8 2019, 10:17 AM

@94rain: Please provide steps to reproduce and complete links (that someone can click directly).
See - thanks!

  1. Go to the link: (requires transwiki importer permission)
  2. Fill in the form with the settings below:
     Source wiki: w (the actual site is:
     Source page: 邪典電影列表 (the URL of the article is
     Selected: 1. Copy all history revisions for this page and 3. Assign edits to local users where the named user exists locally
  3. Click Import; after about 2 minutes the import failed and gave the message below:

TIM图片20190608203451.png (701×1 px, 36 KB)

Is the page history too large to import? I tried exporting the article's XML file, and it is about 16 MB.
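One cheap sanity check on dump size is to total up the per-revision text sizes in the exported XML. A toy sketch (the element names follow the shape of a MediaWiki dump, but the namespace and most fields are omitted for brevity; a real dump uses the `http://www.mediawiki.org/xml/export-0.10/` namespace):

```python
import xml.etree.ElementTree as ET

# A toy fragment in the shape of a MediaWiki XML export.
dump = """<mediawiki>
  <page>
    <title>邪典電影列表</title>
    <revision><id>1</id><text>first version</text></revision>
    <revision><id>2</id><text>second, longer version</text></revision>
  </page>
</mediawiki>"""

root = ET.fromstring(dump)
# Byte length of each revision's wikitext, in page-history order.
sizes = [len((rev.findtext("text") or "").encode("utf-8"))
         for rev in root.iter("revision")]
print(len(sizes), sum(sizes))  # revision count, total bytes
```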

Aklapper changed the task status from Stalled to Open.Jun 8 2019, 1:11 PM

What happens if you retry the import? Do you get the same error or does it succeed now?

If I retry immediately, it returns this message:

2.png (175×727 px, 21 KB)

But if I retry after a while, I still get the error above (A database query error occurred).

I've tried to import that article many times and it always fails. Importing other pages still works.

First, I can tell that the page and some revisions made it in. Comparing against the source history, the first 129 revisions were imported.

Checking the lengths of those revisions, 7 590 125 bytes made it in. That doesn't seem too huge, to be honest. All the revisions must have been dumped out to XML for the import to start properly, so we're not hitting a limit there.

The import was retried a few times and I can see batches of revisions got in, each batch in a single COMMIT and so from a single attempt to import:

  • binlog db1075-bin.004013: first 82 revisions
  • binlog db1075-bin.004020: next 26 revisions
  • binlog db1075-bin.004022: next 21 revisions
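The batching behaviour above can be sketched with SQLite standing in for MySQL: each batch goes in as a single transaction, so an import that dies part-way still leaves the earlier batches committed, matching the 82 + 26 + 21 runs of revisions seen in the binlogs. All table and column names here are simplified stand-ins:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE revision (rev_id INTEGER PRIMARY KEY, rev_len INTEGER)")

revisions = [(i, 100 + i) for i in range(1, 130)]  # 129 toy revisions
# The three runs observed in the binlogs: 82 + 26 + 21 revisions.
batches = [revisions[:82], revisions[82:108], revisions[108:129]]

for batch in batches:
    with conn:  # each batch in its own transaction; COMMIT on success
        conn.executemany("INSERT OR IGNORE INTO revision VALUES (?, ?)", batch)

count = conn.execute("SELECT COUNT(*) FROM revision").fetchone()[0]
print(count)  # 129
```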

From the first import attempt, at 5:34 UTC, we can see a couple of InnoDB slow row locks:
There's a logstash entry right around there that is probably connected somehow:

Here's a sample INSERT from the first batch that made it in so we can go back and look at it later (first revision in the batch):

INSERT /* MediaWiki\Revision\RevisionStore::insertRevisionRowOn  */  INTO `revision` (rev_page,rev_parent_id,rev_minor_edit,rev_timestamp,rev_deleted,rev_len,rev_sha1,rev_content_model,rev_content_format) VALUES ('13811','0','0','20170620065905','0','33434','of7pr8iky9ufr4cq5h546g70rkz87fu',NULL,NULL)

I wonder, if the import is tried again, whether more old revisions will show up.

I don't know if it's exactly the same, because I didn't see other long (or short) running inserts around that time. I can copy/paste the relevant part of the binlog in a file if that's useful.

Found them (thanks, tendril)!
Transaction 78697996224, run time 14s, stamp 2019-06-08 05:37:38

INSERT /* WikiPage::insertOn */ IGNORE INTO `page` (page_namespace, page_title, page_restrictions, page_is_redirect, page_is_new, page_random, page_touched, page_latest, page_len) VALUES ('0', '??????', '', '0', '1', '0.163800399756', '20190608053733', '0', '0')

Transaction 78697994684, run time 7s, stamp 2019-06-08 05:36:58

INSERT /* User::addToDatabase */ IGNORE INTO `user` (user_name, user_password, user_newpassword, user_email, user_email_authenticated, user_real_name, user_token, user_registration, user_editcount, user_touched) VALUES ('H2226', '', '', '', NULL, '', '998e17c3782d1e27598c7c15f27374af', '20190608053650', '0', '20190608053650')

One or both of these probably did us in.
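The failure mode is easy to reproduce in miniature with SQLite, where "database is locked" plays the role of MySQL error 1205: one writer holds an open write transaction, and a second writer gives up after its lock timeout expires. This is only an analogy for the lock wait, not the actual InnoDB row-lock behaviour:

```python
import os
import sqlite3
import tempfile

path = os.path.join(tempfile.mkdtemp(), "wiki.db")
a = sqlite3.connect(path, timeout=0.1)  # writer A
b = sqlite3.connect(path, timeout=0.1)  # writer B, short lock timeout
a.execute("CREATE TABLE page (page_title TEXT)")
a.commit()

a.execute("BEGIN IMMEDIATE")  # A takes the write lock...
a.execute("INSERT INTO page VALUES ('邪典電影列表')")

try:
    b.execute("BEGIN IMMEDIATE")  # ...so B waits, then times out
    b.execute("INSERT INTO page VALUES ('other')")
    timed_out = False
except sqlite3.OperationalError:  # "database is locked"
    timed_out = True

a.commit()  # A releases the lock; later retries by B would succeed
print(timed_out)  # True
```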

Links: and

Linking the ones from later in the day too; they probably line up with the retries:

So this is likely something in MediaWiki getting hung up.

I'll merge this into the other ticket.