Are you autoconfirmed on Wikispore? I can't imagine why captchas would show up then.
In any case, we can probably get rid of captchas entirely now that we rely on Wikimedia login.
See T243931: Allow a module to load (require) another module from a global module repository wiki for a similar discussion.
Sat, May 30
Thanks a lot for your work on this, @Pppery!
Synapse supports OpenID Connect now. We don't (filed as T254063: OAuth extension should support OpenID Connect), but that also seems like a realistic path towards identity integration with Synapse; maybe a more sustainable one.
OpenID has little support today. To some extent it has been replaced by OpenID Connect (not as a federated identity system, but at least as a standard method of proving your identity at a specific website). IMO we can decline this in favor of T254063: OAuth extension should support OpenID Connect.
Registered the site on Google Search Console as a first debugging step. Ping me if you need access.
Done. I've opted for a puppetized approach since the Interwiki extension does not support export/import. See puppet/modules/role/manifests/wikispore/interwikis.pp in c546156.
Current copyright holders:
vagrant@localhost:[wiki]> select user_name, count(*) edits
    from revision
    join revision_actor_temp on rev_id = revactor_rev
    join actor on revactor_actor = actor_id
    join user on actor_user = user_id
    left join ipblocks on user_id = ipb_user
    where ipb_user is null
    group by user_name
    order by edits desc;
+--------------------+-------+
| user_name          | edits |
+--------------------+-------+
| Pharos             |   394 |
| Koavf              |   125 |
| ChristianSW        |    49 |
| A12n               |    41 |
| Zblace             |    41 |
| Spinster           |    20 |
| Denny              |    17 |
| Ottawahitech       |    13 |
| Mvolz              |     9 |
| Sj                 |     9 |
| Perohanych         |     6 |
| Tgr                |     6 |
| PKM                |     6 |
| Funcrunch          |     6 |
| R9H9               |     6 |
| Ircpresident       |     6 |
| Sm8900             |     4 |
| 1234qwer1234qwer4  |     3 |
| DutchTreat         |     3 |
| Maintenance script |     2 |
| Sannita            |     2 |
| Bluerasberry       |     2 |
| FULBERT            |     2 |
| Killarnee          |     2 |
| Reify-tech         |     1 |
| Battleofalma       |     1 |
| TrMendenhall       |     1 |
+--------------------+-------+
We need their permission for making the switch.
Fri, May 29
Best we could do is prevent new usernames from being registered, unless someone takes it on themselves to rename all the users who have an equal sign in their name (which probably requires a global discussion first since the existing rename policy is pretty narrow).
Out of curiosity, what does it mean that Google requires us to do this? I imagine they won't skip indexing an image just because it does not have schema.org data associated with it...
Also, where do you plan to take the license metadata from? CommonsMetadata, or do you expect the data model discussions to be wrapped up by then?
Changing uploaded files is scary (data loss if anything goes wrong, changing what data needs to be added is a major PITA, duplicate checking breaks during train rollout of changes to the image modification algorithm...). Adding metadata to thumbnails is fine (to some extent we do it already), although thumbnails are preserved forever so purging all existing thumbnails would take a while. The task for it is T5361: Embed image author, description, and copyright data in file metadata fields. Note though that sometimes we display the original file, not a thumbnail. It would be nice to get rid of that for a number of reasons (T67383: Generate optimised thumbnail even when dimensions match original) but it involves some gnarly areas of legacy code so I wouldn't expect it to be easy. Although that task implies that it's already being done for Wikimedia production at least... maybe @Gilles can clarify.
This is normally done at https://meta.wikimedia.org/wiki/Talk:Interwiki_map.
I guess the question is, why is Thanks using $.cookie instead of mw.cookie?
I think the most intuitive behavior would be to only remove the innermost element and only if it actually wraps the cursor. Also it probably shouldn't remove block formatting, especially paragraphs (the current selection-based mechanism doesn't do that either). So
- <p>Foo<b>b|ar<a href="./baz">ban<i>g</i></a></b></p> -> Ctrl+M -> <p>Foob|ar<a href="./baz">ban<i>g</i></a></p> -> Ctrl+M -> (no effect)
- <p>Foo<b>bar<a href="./baz">ban|<i>g</i></a></b></p> -> Ctrl+M -> <p>Foo<b>barban|<i>g</i></b></p> -> Ctrl+M -> (no effect)
I don't see anything relevant in the logs after May 2.
Thanks for the super quick fix! Properly doing the build step fixes about 20 tests (filed T254023: MobileFrontend repo needs npm build step after code update in MediaWiki-Vagrant about that); the patch fixes 50-ish more. Only one error remains:
MobileFrontend mobile.startup/OverlayManager: #getSingleton (hash present and overlay not managed)
If a page is loaded with a hash fragment a new entry is placed before it to allow the user to go back.
Expected: true
Result: false
It passes in isolation, but only when started with a URL that has no hash fragment. So probably some other test lets state bleed into the hash fragment, and this one doesn't start from a fixture.
It seems only one test is actually broken; the other failures are a knock-on effect.
Thu, May 28
Wed, May 27
Tue, May 26
Mon, May 25
GrowthExperiments has a recommendation API where result ordering is intentionally randomized. In a browser test we'd want to test result navigation but don't care about testing the randomization (if we want to test it at all, it's easy to do in a unit test), and not knowing the order of results beforehand is unhelpful.
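If the API grew an optional seed parameter (purely hypothetical, not something the GrowthExperiments API offers), the shuffle could stay random in production but be reproducible in tests. A sketch using mulberry32, a well-known tiny seeded PRNG, driving a Fisher–Yates shuffle:

```javascript
// mulberry32: minimal seeded PRNG returning floats in [0, 1).
function mulberry32( seed ) {
	return function () {
		seed |= 0;
		seed = ( seed + 0x6D2B79F5 ) | 0;
		let t = Math.imul( seed ^ ( seed >>> 15 ), 1 | seed );
		t = ( t + Math.imul( t ^ ( t >>> 7 ), 61 | t ) ) ^ t;
		return ( ( t ^ ( t >>> 14 ) ) >>> 0 ) / 4294967296;
	};
}

// Fisher–Yates shuffle using the seeded PRNG instead of Math.random(),
// so the same seed always yields the same ordering.
function shuffled( items, seed ) {
	const rand = mulberry32( seed );
	const copy = items.slice();
	for ( let i = copy.length - 1; i > 0; i-- ) {
		const j = Math.floor( rand() * ( i + 1 ) );
		[ copy[ i ], copy[ j ] ] = [ copy[ j ], copy[ i ] ];
	}
	return copy;
}

// Same seed → same order, so a browser test can know what to expect.
console.log( shuffled( [ 'a', 'b', 'c', 'd' ], 42 ) );
```

A test would pass a fixed seed and assert on the resulting order; production simply omits the parameter.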
The common concern about @ is that it encourages misuse because it seems very lightweight but it's actually not cheap (it changes php.ini settings dynamically, all the error message generation still happens even though the result is discarded, allegedly it even impacts compiler optimizations - although those are PHP 5 era claims, not sure if they still hold up).
I filed T253568: Make the terminology of OAuth UI and documentation easy to understand although in hindsight I should have probably edited this task.
Yeah, that came up as an option (this task is the continuation of discussions from here and here), but especially for extensions, having that code in the extension repo is probably the lesser evil (easier to ensure it stays current, for example).
The page title is set in DifferenceEngine::showDiffPage, you'd have to factor out that part into something that can be sanely overridden. Or, I guess, change the title from the DifferenceEngineOldHeaderNoOldRev and DifferenceEngineOldHeader hooks, although that's a rather ugly hack.
Fixed in the latest version of the role::wikispore Vagrant patch.
Needs something like
$wgSMTP = [
	'host' => '<%= scope['::role::mediawiki::smtp'] %>',
	'IDHost' => '<%= scope['::role::mediawiki::hostname'] %>',
	'port' => 25,
	'auth' => false,
];
but only when the SMTP variable is set.
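Roughly like this as an ERB template sketch (whether the Puppet fact is falsy when unset is an assumption; the guard condition may need adjusting):

```
<% if scope['::role::mediawiki::smtp'] %>
$wgSMTP = [
	'host' => '<%= scope['::role::mediawiki::smtp'] %>',
	'IDHost' => '<%= scope['::role::mediawiki::hostname'] %>',
	'port' => 25,
	'auth' => false,
];
<% end %>
```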
Sun, May 24
The relevant docs are at Help:Email in Cloud VPS.
Can this be resolved then?
Fri, May 22
Comparing wgStableRevisionId to wgRevisionId seems like an easy and accurate way to tell whether the current version of the page is shown by default or pending review, so we can easily show different messages.
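A minimal sketch of that comparison (wgStableRevisionId and wgRevisionId are the JS config variables named above; the helper function itself is hypothetical, not FlaggedRevs code):

```javascript
// Equal IDs mean the default view already shows this revision;
// a differing stable ID means the shown revision is pending review.
function isPendingReview( stableRevId, currentRevId ) {
	return stableRevId !== null && stableRevId !== currentRevId;
}

// In the browser the inputs would come from
// mw.config.get( 'wgStableRevisionId' ) and mw.config.get( 'wgRevisionId' ).
console.log( isPendingReview( 100, 105 ) ); // true: current revision awaits review
console.log( isPendingReview( 105, 105 ) ); // false: stable version is shown
```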
It would be nice to link to, or at least name, the Gerrit repository in the closing message. (It would be even cooler to provide specific instructions on how to submit the pull request to Gerrit; not sure how feasible that is, though.)
Yeah, we are talking about the same thing, I just can't reproduce it happening.
Looking at your first example, it doesn't even seem related to the image being narrower than the card.
Thu, May 21
Yeah, we can just replace the size parameter in the URL to get what we want (unless the original image is narrower than 260px which should be very rare). The flip side is that it will load a bit slower as 260px is not a standard image width so the thumbnail is probably not pre-rendered. (Then again, I'm not sure the size offered by RESTBase is any different...)
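The URL rewrite can be as simple as this sketch (assumes the standard /thumb/…/<width>px-<name> URL shape; the helper name is made up):

```javascript
// Replace the width segment of a MediaWiki thumbnail URL.
function resizeThumbUrl( url, width ) {
	return url.replace( /\/\d+px-/, '/' + width + 'px-' );
}

console.log( resizeThumbUrl(
	'https://upload.wikimedia.org/wikipedia/commons/thumb/a/ab/X.jpg/320px-X.jpg',
	260
) );
// https://upload.wikimedia.org/wikipedia/commons/thumb/a/ab/X.jpg/260px-X.jpg
```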
@RHo looking at the app screenshots made me wonder why we don't just scale our images to full width. I originally assumed this problem occurs when the image is physically not wide enough to fill the card, but the app uses larger sizes so that cannot be the case (and I checked manually and indeed it isn't). So e.g.
[Screenshots: instead of this | like this]
Maybe that's just an artifact of what API we are using to get the image URL?
TBH I don't think it matters too much whether the post-edit notification uses the exact same wording as the editor's submit button. The MediaWiki default post-edit notice is "Your edit was saved.", for example.
In some sense you are still publishing the edit, since it will be visible to anyone, even if it does not show in the default article view. It would be nice if it said something like "submit for review" but that's a bit too long for a button. (I don't think "Submit" tells anything useful to the user.)
Wed, May 20
This is surprisingly hard to get right:
- As shown above, CSS blur spreads the image outside its original area (could be addressed in a number of ways, overflow:hidden, clip, background-clip...) and more problematically fades the edges out. Or more precisely, assumes the image is on a white background and uses that white color for the convolution when near the edges. SVG has an option to avoid this (edgeMode) but it does not seem to be implemented in any browser. I guess we could scale the background image to extend beyond the card a bit and then clip it, so this would be manageable, although it would add to the weirdness factor.
- Some images have transparent areas, where the blurred background version shows through. I don't really see a way to deal with this. If we add a white background, there will be an edge between that background and the stretched or repeated background image - the same problem this task is trying to solve.
Tue, May 19
Short description handling currently lives in Wikibase, so mw.wikibase makes sense nevertheless.
In any case, wikitext changes should probably be blocked on the parser unification for now.
As an aside, the search autocompleter honoring the namespace defaults set for advanced search seems highly counterintuitive to me. I seem to recall having reported that in the past but I'm unable to find any relevant task now.
Manually selecting the "Classic prefix search" user preference makes all browsers give the correct results.
Specifically, the working request has namespace= and the other one has namespace=0|1|2|3|4|5|6|7|8|9|10|11|12|13|14|15|90|91|92|93|100|101|828|829|2300|2301|2302|2303. And the response for the first says x-opensearch-type: prefix and the second one x-opensearch-type: comp_suggest. So maybe this is the combination of a CompletionSuggester bug and some kind of obscure user setting resulting in the use of a different search method for one account? Not sure what that would be though, both accounts have the Search completion user preference set to default.
Also, if I directly request the API URL used by the search autocompletion JS code (this for the example above), it gives the correct result in all browsers.
Mon, May 18
This is unlikely to happen for the same reasons T243931: Allow a module to load (require) another module from a global module repository wiki didn't.
Sun, May 17
[Screenshots: Topic filter, closed | Topic filter, open | Topic filter, with input field selected]
Sat, May 16
Wrt using node 10 based CI tools on vagrant, my workaround is using nodeenv. It goes something like
sudo pip install nodeenv
cd <repo>
nodeenv ./nodeenv --node=10.20.1
source ./nodeenv/bin/activate
npm install
npm test
deactivate_node
Fri, May 15
So, the general changes (and the changes needed to ensure consistency with EditAttemptStep) are handled in c593972, c594780 and c595224, those specific to the post-edit dialog in 595173; most of the help panel event logging was in place already (a little bit inconsistent, T250327: Newcomer tasks: clean up help panel EventLogging fixes that); the mobile peek is handled in c596719. All that's left is events related to guidance content; that's left for later since the functionality is still in progress.
The fundamental issue here is that the session is initialized in Setup.php, which is outside of the main try..catch block of the API and so the exception would not go through the usual error logging / output formatting logic. So one option is to make that logic not rely on the try block and register it early as the PHP error handler; but it's probably easy to get into service dependency loops that way.
I guess technically it could also happen that the user queries the task API in the short timeframe between deleting a page and the ElasticSearch index being updated, so they receive a task card about an article that does not exist anymore, and since the ID is coming from a DB lookup and not ElasticSearch, it would be missing from the task data. It's a pretty unlikely edge case though.
On a side note, the unresolved comment count (is that new, or have I just not paid attention before?) is pretty cool. Thanks for working on that!
Thu, May 14
I'll remove the broken library when I get around to it. The error doesn't really break anything though.
Wed, May 13
Note that indentation style can change on a per-page or per-section basis. * or # is common for voting-style discussions.
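For illustration, a voting-style section typically indents with list markers rather than colons (wikitext; the content is invented):

```
* '''Support.''' Seems useful. ~~~~
** Does this cover the mobile site too? ~~~~
* '''Oppose''' per above. ~~~~
```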
This went away after a restart, so I guess the real problem is the wikidiff2 role not triggering an Apache or PHP restart.