Fri, Sep 20
All that reminds me of T71222#4994266. We might have to add an IGNORE INDEX hint to LogPager like we did for ApiQueryLogEvents in rMWe6021abc9c16: ApiQueryLogEvents: Add IGNORE INDEX to avoid MariaDB optimizer bug.
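For anyone unfamiliar with the hint: in MySQL/MariaDB it's `IGNORE INDEX (index_name)` appended to the table reference, which tells the optimizer not to consider that index. SQLite's nearest (coarser) analogue is `NOT INDEXED`, which a quick sketch can demonstrate (schema and data here are made up for illustration):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE logging (log_type TEXT, log_timestamp TEXT)")
conn.execute("CREATE INDEX times ON logging (log_timestamp)")
conn.execute("INSERT INTO logging VALUES ('move', '20190920000000')")

# With NOT INDEXED, SQLite's planner must ignore the index and scan the
# table; MySQL's finer-grained equivalent would be IGNORE INDEX (times).
plan = conn.execute(
    "EXPLAIN QUERY PLAN "
    "SELECT * FROM logging NOT INDEXED WHERE log_timestamp > '2019'"
).fetchall()
print(plan)  # the plan shows a table scan rather than an index search
```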
Since the behavior used as justification for reopening this has been deemed a bug, I'm going to re-decline this task.
Fixed by the same patch that fixed T231582: ApiQueryRevisions.php: PHP Notice: A non well formed numeric value encountered.
Thu, Sep 19
- For various other ad-hoc logging we seem to use [ 'trace' => … ] mostly.
Looks like it did exist until very recently.
Wed, Sep 18
See T221869: Remove deprecated ApiQueryDeletedRevs for what's needed to remove the module from a Wikimedia usage perspective. The status seems unchanged since that task was created.
@Tchanders: Maybe this summary will help:
Having the language or domain in the path would be very difficult to do within MediaWiki. Configuration based on language and project has already happened by the time the REST router is called. At Wikimedia sites, this "configuration" includes selection of the particular deployment branch to use. Changing all that would be difficult, to say the least.
If the timeout errors are happening consistently rather than occasionally, you'll need to rework your page so it doesn't take so long to render.
Too bad Doxygen doesn't use wikitext ;)
Supports MySQL, SQLite, and Postgres (SQLite doesn't support regex, so that option is hidden)
Supports MySQL, SQLite, and Postgres (if there are schema changes)
Since rMWc29909e59fd8: Mostly drop old pre-actor user schemas dropped the index entirely, this is almost fixed now. https://gerrit.wikimedia.org/r/537676 is still needed to fully clean things up during update.php: thanks to the confusing history of this index, it may exist as ar_usertext_timestamp, usertext_timestamp, or both, and c29909e59 only drops one name (so it may error out if that name happens not to exist).
See T233221: update.php needs to drop both archive.usertext_timestamp and archive.ar_usertext_timestamp, if they exist, on MySQL for details on what's going on here.
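The cleanup essentially has to drop whichever of the two names actually exists without erroring when one is absent. A sketch of that idea in Python against SQLite (the real patch targets MySQL; the table and index names just mirror the ones above):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE archive (ar_user_text TEXT, ar_timestamp TEXT)")
# Depending on the wiki's upgrade history, the index may exist under
# either name (or both); here we create just one of them.
conn.execute(
    "CREATE INDEX usertext_timestamp ON archive (ar_user_text, ar_timestamp)"
)

# Drop both possible names; IF EXISTS makes each drop a no-op when that
# particular name is absent, so neither statement can error out.
for name in ("usertext_timestamp", "ar_usertext_timestamp"):
    conn.execute(f"DROP INDEX IF EXISTS {name}")

remaining = conn.execute(
    "SELECT name FROM sqlite_master WHERE type = 'index'"
).fetchall()
print(remaining)  # → []
```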
Also, see T223151: Review special replica partitioning of certain tables by `xx_user` where we discussed this before since we knew this was coming.
No, MediaWiki knows nothing about the partitioning. That's purely a Wikimedia thing.
Tue, Sep 17
Maintenance script runs completed.
So probably the most consistent thing to do here is to pass it through Title::newFromText( $value, NS_USER ) (then get the IP back out with ->getText()), like User::getCanonicalName() does for registered user names, before running the regexes to determine if it's an IP.
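A rough Python analogue of that normalize-then-check ordering (the real code would go through Title::newFromText( $value, NS_USER ) and ->getText(); the helper below is hypothetical and only mimics title-style whitespace/underscore cleanup):

```python
import ipaddress

def normalize_like_title(value: str) -> str:
    """Hypothetical stand-in for Title::newFromText()->getText():
    trim, and collapse underscores and runs of whitespace to single spaces."""
    return " ".join(value.replace("_", " ").split())

def looks_like_ip(value: str) -> bool:
    """Run the IP check only after title-style normalization, so that
    inputs like ' 127.0.0.1 ' are recognized consistently."""
    candidate = normalize_like_title(value)
    try:
        ipaddress.ip_address(candidate)
        return True
    except ValueError:
        return False

print(looks_like_ip(" 127.0.0.1 "))  # True: normalization strips padding
print(looks_like_ip("not an IP"))    # False
```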
Looks like T233004 is handling this already. Since there's more discussion on that one, I'm going to close this as the duplicate even though it's older.
Currently putting this in "Blocked Externally" on the CPT Clinic Duty board. Once the blocker is resolved, this should go to "External Code Review Needed".
After discussion in our triage meeting, we (CPT) decided this should be closed in favor of T118413: Wikimedia wikis should use https:// in $wgServer.
Status: Blocked until T184615: Once MCR is deployed, drop the rev_text_id, rev_content_model, and rev_content_format fields from the revision table is also ready to go. The intention is to combine all the alters of the revision table into one task for the DBAs.
Note the final Wikimedia production schema changes are tracked in T233135: Schema change for refactored actor and comment storage, and cleanup of revision_comment_temp in T215466: Remove revision_comment_temp and revision_actor_temp. Changes for any extensions should be tracked in their own tasks.
Note the final Wikimedia production schema changes are tracked in T233135: Schema change for refactored actor and comment storage, and cleanup of revision_actor_temp in T215466: Remove revision_comment_temp and revision_actor_temp. Changes for any extensions should be tracked in their own tasks.