
Log output is out of sequence when running in -async mode
Closed, ResolvedPublic


Originally from:
Reported by: yfdyh000
Created on: 2012-07-15 10:53:38
Subject: Log output is out of sequence when running in -async mode
Original description:
In asynchronous mode, I recommend waiting until the update details for one page have been fully output before inserting other messages.
"Updating links on page [[xx:XXX]]" should not be inserted into the middle of another page's continuous log.
The "Changes to be made:" output for the same page should appear as one continuous block.

Version: unspecified
Severity: enhancement
See Also:



Event Timeline

bzimport raised the priority of this task from to Needs Triage.Nov 22 2014, 2:15 AM
bzimport set Reference to bz55029.
bzimport added a subscriber: Unknown Object (????).

That does not make any sense to me. If we have to wait for the update so we can have the output from the script and from the async save routine in the same place, the entire advantage of async saving is gone!

Oh. Would the following idea be feasible?

Suppose a situation:
# max_queue_size = 10
# The current waiting put queue is 9
Getting 60 pages from wikipedia:en...
# Add 10+ pages to the put queue at the same time
# The current waiting put queue is 19+

Post-processing [[en:xxx]]

"Updating links on page [[en:xxx]]." (10+ times)
"Updating page [[en:xxx]] via API" (10+ times)
"Getting 60 pages from wikipedia:de..."
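The scenario above describes a bounded producer/consumer queue: once `max_queue_size` items are waiting, the producing thread blocks until the writer catches up. A minimal sketch of that behavior with Python's standard library (hypothetical names, not pywikibot's actual internals):

```python
import queue
import threading

# Bounded put queue: put() blocks once max_queue_size items are waiting,
# so the producing thread stalls instead of growing an unbounded backlog.
# Hypothetical names; not pywikibot's actual internals.
max_queue_size = 10
put_queue = queue.Queue(maxsize=max_queue_size)
saved = []

def writer():
    """Consume queued pages one at a time, simulating the save routine."""
    while True:
        page = put_queue.get()
        if page is None:            # sentinel: no more work
            put_queue.task_done()
            break
        saved.append(page)          # a real bot would save the page here
        put_queue.task_done()

t = threading.Thread(target=writer)
t.start()

for n in range(25):                 # producer blocks whenever 10 items wait
    put_queue.put(f"page-{n}")
put_queue.put(None)
t.join()
```

Because there is a single consumer draining a FIFO queue, pages are still saved in the order they were queued; the bound only throttles the producer.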

What you want is not really clear to me, but I suppose you want to let the bot stall when there are 10 pages waiting to be saved?

What would be the use case for that?

I mean that all pages associated with a page should be output and updated together as one unit, rather than treating each page as a separate unit.

Is this an enhancement for core or compat?

jayvdb set Security to None.

@Mpaa Could this be fixed in a similar way as T135992? Or is this even solved with T135992?

Honestly the wanted behavior is not clear to me.

As I understand this, it is a request to have a synchronous log in asynchronous mode. The request is to have 3 main threads: a read thread, a write thread, and a log thread. The log thread would group incoming log messages by page and output them to the log file once a certain final message (a post-processing-for-page-is-done hook or something similar) arrives.
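The log-thread idea described above could be sketched roughly as follows. This is a hypothetical illustration, not pywikibot's implementation: messages are buffered per page and emitted as one contiguous block only when a final "done" marker for that page arrives.

```python
import threading
from collections import defaultdict
from queue import Queue

# Hypothetical sketch of the proposed log thread: buffer messages per
# page, flush a page's buffer as one block when its "done" marker arrives.
log_queue = Queue()
STOP = object()

def log_worker(emit):
    buffers = defaultdict(list)
    while True:
        item = log_queue.get()
        if item is STOP:
            break
        page, message, final = item
        buffers[page].append(message)
        if final:                   # the per-page "done" hook fired
            for line in buffers.pop(page):
                emit(line)

lines = []
worker = threading.Thread(target=log_worker, args=(lines.append,))
worker.start()

# Interleaved messages from two pages still come out grouped per page.
log_queue.put(('A', 'Updating links on page [[en:A]]', False))
log_queue.put(('B', 'Updating links on page [[en:B]]', False))
log_queue.put(('A', 'Updating page [[en:A]] via API', True))
log_queue.put(('B', 'Updating page [[en:B]] via API', True))
log_queue.put(STOP)
worker.join()
```

After the worker finishes, both of page A's lines appear together, followed by both of page B's lines, even though the producers interleaved them.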

There is another similar request in T73646: that one suggests there should be yet another thread, a 4th: a read thread, a write thread, a log thread, and an output thread. The output thread would group incoming messages (save-success or save-failure messages, API warnings, or similar) and output them only when pywikibot.input() or similar is not currently being called (i.e. not while waiting for user interaction).
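The T73646 idea could be sketched like this, again as a hypothetical illustration with invented names: asynchronous messages are held back while user input is being awaited and flushed once the prompt completes, so the prompt is never interrupted.

```python
import threading

# Hypothetical sketch: defer asynchronous messages (save results, API
# warnings) while a prompt is active, flush them once input completes.
class DeferredOutput:
    def __init__(self, emit):
        self._lock = threading.Lock()
        self._waiting = False
        self._pending = []
        self._emit = emit

    def output(self, message):
        with self._lock:
            if self._waiting:
                self._pending.append(message)   # hold until input is done
            else:
                self._emit(message)

    def input(self, prompt, read=input):
        with self._lock:
            self._waiting = True
        try:
            return read(prompt)
        finally:
            with self._lock:
                self._waiting = False
                for message in self._pending:   # flush deferred messages
                    self._emit(message)
                self._pending.clear()
```

Any thread that calls `output()` while a prompt is open simply has its message queued; nothing is lost, only delayed.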

At least this is how I understand these two tasks.

But maybe I got it wrong. This task may be specific to some particular script, as the attached log file from the SourceForge bug report page suggests:

Xqt claimed this task.
Xqt added a subscriber: Xqt.

Cached output was ported from compat.