Nov 13 2017
Oct 31 2017
Oct 27 2017
Aug 26 2017
Aug 23 2017
Aug 9 2017
Can anyone comment on why the patch hasn't been merged? I can't see any outstanding review issues, but I may be missing something.
I've also been seeing this problem on Wikidata. Here's a screenshot of the issue, as an additional data point:
Aug 2 2017
Aug 1 2017
Jul 28 2017
Note that the Abuse filter was just a stop-gap measure while we finished the setup (which has been taking longer than we'd like); it was never meant to be permanent. I personally would prefer not making well-meaning editors with established accounts on other Wikimedia wikis (like MarcoAurelio :)) jump through hoops to make edits. But that's something we can look at later down the road. For now the fishbowl approach is probably the quickest solution.
Jun 18 2017
Jun 8 2017
May 26 2017
May 19 2017
May 15 2017
May 10 2017
May 2 2017
On a slight tangent: during the import, we're taking care to avoid importing all pages indiscriminately, to reduce some of the cruft (templates, redirects, images from commons, etc.) that accumulated over the years. It would be very helpful to this effect if we could run maintenance scripts on the wiki during the import process. Would it be possible to install Extension:Maintenance? If so, let us know if you'd prefer a separate issue to track that.
May 1 2017
I need to be able to get the formatted extracts either via the old api.php or the new REST service. If a new endpoint is created providing such extracts with well-formed HTML, yes, that would work for me.
Those weren't really sensitive (in e.g. the legal sense) documents, just internal organizational stuff that didn't make sense to publish, so there's no significant risk for us (in fact, most of those documents aren't even current anymore). We already have plans to deal with that content, and were just making sure there weren't new recommendations for that use case. Thanks all for confirming.
Apr 30 2017
Apr 28 2017
IIUC, that is not exactly what this request is about: currently I'm using a request like https://en.wikipedia.org/w/api.php?action=query&prop=extracts&exintro&indexpageids=true&format=json&generator=random&grnnamespace=0&format=json, which returns an HTML-formatted extract. The problem is that the HTML is not guaranteed to be well-formed, so it fails when embedded in an XHTML page (or any other context where well-formed XML is required).
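To illustrate the failure mode, here's a minimal sketch (using only the Python standard library; the sample fragment is hypothetical, not an actual API response): HTML allows constructs like an unclosed `<br>` void element, which an XML parser rejects, so an extract containing them cannot be dropped verbatim into an XHTML page.

```python
# Sketch of the problem: an HTML extract is not guaranteed to be
# well-formed XML, so embedding it in an XHTML document can fail.
import xml.etree.ElementTree as ET

def is_well_formed_xml(fragment: str) -> bool:
    """Return True if the fragment parses as XML when wrapped in a root."""
    try:
        ET.fromstring("<root>" + fragment + "</root>")
        return True
    except ET.ParseError:
        return False

# A void element without a self-closing slash is valid HTML but not XML.
html_extract = "<p>First paragraph<br>second line</p>"
xhtml_extract = "<p>First paragraph<br/>second line</p>"
```

Here `is_well_formed_xml(html_extract)` is False while `is_well_formed_xml(xhtml_extract)` is True, which is why a guarantee of well-formed output (or a new endpoint providing it) would solve the issue.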
Apr 27 2017
Thanks everyone, much appreciated!
Apr 9 2017
Just pinging to make sure this hasn't fallen off the radar :) is there anything blocking the scheduling?
Apr 7 2017
Apr 5 2017
Can someone clarify when/why this was closed as declined/wontfix? I can't see that in the activity logs of this task -- perhaps it's a detail that wasn't preserved in the Bugzilla import.
Mar 28 2017
I'm a bit confused by this issue. Judging by its title and description, isn't it the same as T62437? And if so, isn't it resolved already? If not, it probably needs clarification.
Feb 28 2017
Sorry all for the spam caused by adding & removing this task from Wikimedia-site-requests. I was trying to get a list of wikis where this extension has been/is planned to be enabled, and I came up with this query, which suggested that the combination of Wikimedia-site-requests and ArticlePlaceholder was the right way to get this information. Please let me know if such a list is being maintained elsewhere.
Feb 19 2017
OBJ would be a good format for open 3D models because it has an exclusively text-based representation, unlike STL (T143201, T132058) and PLY (T145499), which can be either ASCII or binary. What's more, STL files are often saved in the binary format by default, while providing no surface-level distinction (e.g. in the filename extension) that would allow telling them apart from the ASCII ones.
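To make the ambiguity concrete, here's a rough heuristic sketch (not a definitive detector, and the sample bytes are hand-made, not real exported files): both STL variants share the .stl extension, and even the leading bytes are ambiguous, since a binary STL's 80-byte header may itself start with the word "solid". The most reliable text-free check is the size formula of the binary layout.

```python
# Heuristic: a binary STL is an 80-byte header, a little-endian uint32
# triangle count, then exactly 50 bytes per triangle.
import struct

def looks_like_binary_stl(data: bytes) -> bool:
    """Return True if the byte length matches the binary STL layout."""
    if len(data) < 84:
        return False
    (count,) = struct.unpack("<I", data[80:84])
    return len(data) == 84 + count * 50

# An ASCII STL is plain text starting with "solid <name>".
ascii_stl = b"solid cube\nfacet normal 0 0 1\nendfacet\nendsolid cube\n"
# A minimal binary STL: 80-byte zeroed header, zero triangles.
binary_stl = b"\x00" * 80 + struct.pack("<I", 0)
```

That a size check like this is needed at all is the point: with OBJ, a consumer can simply assume text.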
Jan 19 2017
Jan 8 2017
Indeed, the WMF logo is in the bottom row of the logos grid, but still part of that grid, so the main issue remains. Considering that the page contains 16 logos in a neat 4x4 grid, we could still have a perfect 5x3 grid if we removed one logo from there, so not even aesthetics would be sacrificed with such a change.
Dec 18 2016
In case this helps, I am also using Firefox (50.1.0) and have the uBlock extension installed.
Nov 28 2016
Nov 22 2016
I don't think we should be the ones to choose where to put the emphasis. It could well be that newcomers to Wikimedia development would be interested in working on high-impact issues (the same reason why editing articles on Wikipedia and having the result immediately published for all to see is attractive), which may include submitting changes to core MediaWiki. Are there any processes through which we could find out which documentation resources people are missing the most? Maybe the Analytics team has collected, or could collect, some data in that regard?
Oct 31 2016
Oct 28 2016
Since two hours is way too little time to make a considerable dent in the actual documentation needs, I would suggest using that time to work out higher-level issues, particularly figuring out what's needed and defining immediate next steps for:
Oct 27 2016
@Qgil I'd be very interested in a proposal around this, but honestly the nature and scope of the expected activities for the summit aren't clear from the CFP page. Maybe one or two generic examples, or ballpark estimates in terms of expected duration, level of structure, and type (hands-on, discussion, presentation, etc.) would be useful. Or even some highlights among current proposals which could serve as inspiration.
Oct 1 2016
Mar 18 2016
Thanks @demon! Let us know if there's anything you need from our side.
Feb 22 2016
Thanks @Nikki, I'll mark this as a dependency.
Feb 19 2016
Feb 17 2016
Thanks for the clarifications, @Krenair. SpecialInterwiki isn't critical, just a nice-to-have. Configure, OTOH, would be really handy, but I understand there are concerns about its usage -- it was a long shot, but I thought I should at least ask.
Feb 16 2016
Oh, also, if that's still a possibility, it would be nice to use "WMPT" and "WMPT Discussão" as project / project talk namespaces.
Sorry for not commenting sooner, but if possible, we'd also like to have at least the CategoryTree, EasyTimeline, PdfHandler, ParserFunctions and SpecialInterwiki extensions installed. I'm not sure those are installed by default on all such wikis. Something like this: https://nyc.wikimedia.org/wiki/Special:Version would probably cover most of our potential needs. I would also ask for the Configure extension, if that would be ok.
Dec 19 2015
Dec 18 2015
Dec 7 2015
Nov 14 2015
I've created a category to make it easier to find examples where mask rendering fails.
Nov 4 2015
Hurray! The fix worked :) I've created Category:DjVu files with errors for the files that are themselves corrupted.
Nov 3 2015
Nov 1 2015
Oct 30 2015
Oct 29 2015
I've edited the task description to remove the examples where the files were actually corrupted (as I had previously noted above), and added the files which (to the best of my knowledge, please correct me if I'm wrong) exhibit the issue due to the same cause (large metadata field).
Oct 27 2015
@matmarex does that date correspond to the #wmf-deploy-2015-10-27_(1.27.0-wmf.4) tag? I'm asking because it's a day later. If it does, I wonder if the naming scheme of the tag here in Phabricator shouldn't include dates like that, or make it clearer it's the start of a period (maybe something like "wmf-deploy-2015-weekXX" or "wmf-deploy-2015-oct-i", -ii, etc.).