Tue, Dec 1
@tstarling any chance we can get this backported to 1.35 as well? I'm developing an extension making use of that hook and wish to support 1.35 with it (on account of it being LTS). Right now my options are either to target only 1.36+ or to use the legacy hooking system (losing all of the benefits of implementing the interface and the guarantees on what the signature looks like).
Sep 26 2020
Apr 11 2020
CookieWarning should expose custom hooks that extensions integrating with it can use:
- A way for extensions to tell CookieWarning which other cookies they are setting that would require consent, and which categories those cookies belong to. This is best accomplished by a custom hook imo
- A way for extensions to check whether consent was given for a category in question. This is best accomplished by an API that extensions can call (e.g. a public static function) imo.
Feb 18 2020
@lucamauri Sorry for the delay in response, got sidetracked with a lot of other stuff. That log is unfortunately not useful as it doesn't contain any of the API requests or responses. You can either try enabling the http debug log group, or enable the debug toolbar which also logs those requests/responses. Just be aware that the request will include things like plaintext passwords and auth tokens; you'll want to redact that from the log before posting it anywhere.
Jan 19 2020
The code is fetching the correct properties; all successful API responses have a top-level element equal to the action specified in the request.
Dec 26 2019
We now generate jobs proportional to the number of edits that need to be reattributed instead of total number of edits, which should reduce database impact by a lot. Some extra work is needed at import time to do this, however.
Aug 30 2019
Random thoughts about my experiences with the RFC process:
Jul 24 2019
It looks like the current behavior of the extension is to generate a job for each block of 300 edits in the wiki, and then attempt reattribution within each block later. So, if your wiki has 300,000 edits, each import would generate 1,000 jobs. This means that the majority of the jobs will actually be doing absolutely nothing; they just serve to bloat the job queue.
What version of MediaWiki are you importing into (target wiki where MWA is installed)?
Jul 18 2019
Jul 16 2019
I'm going to be a bit more blunt here, since it seems that is required.
Jul 15 2019
Jul 8 2019
@Krinkle Can you please point to a policy that all core features require someone at WMF to take "ownership" of it for maintenance even though there are plenty of parts in core and WMF-deployed extensions that are completely unmaintained by WMF? I am unaware of such a policy. This seems like a very odd double-standard where WMF employees can do whatever arbitrary thing they want, but volunteer developers have barrier after barrier erected to block their contributions and make them feel as unwelcome as possible.
Jun 5 2019
Abstract schemas is an argument for keeping MSSQL, Oracle, and other lightly-supported DBMSes in Core, not splitting them out. Once we have an abstract schema, the supporting code to make use of it need only be written once, and that same schema can then apply everywhere.
Apr 28 2019
Apr 17 2019
I've tried two different devices and two different libraries. Checking my PC clock (one of the devices tested), it's about 1.5 seconds behind what the time.gov website lists as the current time.
Mar 20 2019
I believe that the user stories mentioned in Goals adequately describe what this RFC is meant to tackle. If you disagree with some of the user stories, that is a valuable discussion to have. Many of these stories are things I have heard from other people across various channels. The "consistent DOM" point arose around VisualEditor rather than Gadgets, as VE expects certain elements to exist in order to power the editor interface. I mentioned Gadgets in the RFC to generalize it more as any js which would want to add things to a page needs to know where it should be adding it.
Mar 14 2019
An elephant in the room is what will happen to the DOM as a result of this RFC. One of the explicit goals of this RFC is to unify the DOM across skins so that Gadget authors don't have to do skin-specific things. This means, however, that the overall DOM of most if not all skins will change in order to make them unified.
Mar 12 2019
This can move forwards without that RFC being settled yet. The work done on this patch set would help provide some of the work that's needed for the linked RFC, but additional changes will be required should the RFC be approved and work begun on it.
This RFC aims to deprecate BaseTemplate and as such make backwards-incompatible changes to the entire skinning system as a whole. This does not open up the option of using Mustache to generate parts of skins with PHP code to glue those parts together, it makes using Mustache mandatory for skins and eliminates PHP from the presentation logic entirely.
Mar 5 2019
Mar 3 2019
Unless anyone has any lingering concerns or questions they'd like addressed before the RFC is submitted for TechCom perusal, I'm going to be moving this RFC along to the next step of the process in a few days (requesting an IRC meeting).
Feb 26 2019
I opened the RFC T217158 due to the above change in gerrit that was made and reverted. I believe that an RFC is necessary for the expanded scope of making all skins use a template engine and have a mostly-consistent DOM between them. This task fits within the scope of the RFC I opened, and if the RFC is accepted, I would love to work with @Krinkle at moving this task forward in a way that serves the short-term needs presented here while still being usable for the long-term goals mentioned in the RFC.
Dec 28 2018
Dec 18 2018
I believe that patch execution fails:
Sep 27 2018
Registering a freenode account requires email verification, and freenode is very picky about what email providers they accept (no temp/burner email accounts or email services without anti-spam measures on signup). In any case, it's a lot more work to have a botnet go through that process before spamming, so the spambots simply do not register accounts.
Sep 23 2018
Sep 14 2018
Aug 31 2018
Jul 30 2018
There is no quantitative data. The pingbacks are horribly inaccurate, especially insofar as enterprise installations of MediaWiki are represented. Corporate firewalls, as well as sysadmins' habit of not sharing installation data by default, cause installation numbers (across all database systems) to be largely underreported. In any case, I personally know of at least 20 companies and government organizations using MediaWiki with mssql, or that have used MediaWiki with mssql, and I would not be surprised if the true number were in the hundreds. Yes, this isn't much compared to MySQL installations, but it is a non-negligible number.
Declining this task, as the structure to do this isn't nearly in a workable state. Moving out the Database class is about 10% of what is required here; the installer/updater need to work (keep in mind that LocalSettings.php doesn't exist when the installer is being run, yet that needs to be able to load the abstraction and schema). And, if the support isn't in core, then people will simply start using more and more MySQL-specific features making it impossible for an extension to work anyway.
Jul 25 2018
A bot named WikimediaDevelop was recently k-lined from freenode by Sigyn due to sending a channel notice. Need to appeal the k-line (easy) and configure the bot to not send channel notices (just message the channel regularly instead).
Jun 25 2018
Jun 11 2018
Jun 2 2018
In my opinion we should pick one datatype and standardize on it. Migrating existing columns to binary or utf8mb4 should be doable with a maintenance script, and these sorts of migrations seem within scope for the schema abstraction RFC, since part of the concern there is taking the disparate environments of existing installs and standardizing them (that concern is primarily about non-MySQL DBMSes, but settling MySQL on one and exactly one schema seems good as well).
May 16 2018
Modern C++ (aka C++11 and later) can be memory safe and free from buffer overflows as long as good practices are followed -- avoid usage of raw pointers (use unique_ptr and shared_ptr instead), make good use of const reference and move semantics instead of passing things by pointer between functions, allocate on the stack instead of the heap where feasible, and use RAII patterns to encapsulate OS resources like file handles. The main issue is the layer in between the PHP API and the extension, as the PHP API is raw C. Both C++ and Rust can have issues here if that API is misused, so Rust doesn't bring you any advantages in that regard.
Apr 6 2018
Jan 30 2018
Jan 29 2018
Dec 3 2017
The user's registration date and edit count are maintained on import, and existing edits are reattributed (so the maint script which recounts edits won't wipe things). As such, if they meet autoconfirmed criteria on the new wiki, they should be receiving it automatically. I see no reason and have no desire to manually add autoconfirmed as an explicit group in the db due to this.
Added $wgMediaWikiAuthImportWatchlist config var in v1.0.0
Nov 25 2017
Nov 17 2017
New permission mwa-createlocalaccount can be leveraged for this purpose, now in 0.10.0.
Nov 15 2017
Nov 9 2017
Yep, I'm aware of the old one (I was one of the mods there towards the latter years of its life and had thousands of posts as well as wrote a few how-tos). Didn't realize SO was also CC-BY-SA so that was my bad 😊.
Nov 8 2017
As an FYI, I just launched a new site https://mwusers.org which is relevant to this discussion. It's still new and unproven, so I think marking it as the "official" site right now would be largely premature, but I wanted to throw it in for consideration down the road. Unlike Stack Overflow, my forum has the advantage of the content being licensed under CC-BY-SA so that relevant bits can be imported into mediawiki.org's documentation.
Oct 25 2017
Now uses AuthManager as of v0.9.0 (will be submitted to gerrit in the near future)
No longer requires core patches as of v0.9.0 (will be submitted to gerrit in the near future)
Jul 23 2017
Patch pending review at https://gerrit.wikimedia.org/r/#/c/367128/ -- would appreciate if someone could give it a look over
Jul 25 2016
Do not assign tasks to other people without asking them first. If you want to work on this yourself then feel free to submit a patch.
May 2 2016
As of 1.23 the mssql layer used the native driver (sqlsrv), so this has actually been fixed for a very long time :)
Apr 28 2016
Apr 25 2016
Fixed in 1.27 / https://gerrit.wikimedia.org/r/#/c/285139/ (still pending merge, so don't close this task just yet)
Mar 25 2016
Dec 14 2015
- Keep it on-wiki and don't bother having a copy in git/gerrit at all, or perhaps have a placeholder RELEASE-NOTES that offers a link to its location on-wiki
Sep 28 2015
To get a gauge on the sheer number of wikis you'd be impacting by this decision, I reached out to Softaculous to ask for statistics on how many MediaWiki installations were made by them. Softaculous is a one-click installer script found on a number of hosts, as it reduces the friction and knowledge required to install and update applications. Their support team offered this as a response:
Sep 26 2015
Cut the offensive tone @MaxSem, that was incredibly uncalled for. I spent at least 100 hours developing and testing to make sure that it worked for far more than just "basic stuff" when I submitted the patch for support in 1.23. Does it work beyond 1.23? No; I went into it only planning to release updates for LTS releases of MediaWiki, because I don't get paid for this and there are far better uses of my time. I'll update it for the next LTS version, and the LTS version after that as well, and so on until the internal politics and abrasive behavior around here drive me away from MediaWiki permanently (which may end up happening sooner rather than later at this rate; so much for that whole Code of Conduct thing).