
[Tracking] Add new sites to validation
Closed, ResolvedPublic

Description

Just using this task for tracking related work (language codes for new languages, otherProjects, etc.).

Event Timeline

Restricted Application added a project: User-DannyS712.
DannyS712 renamed this task from [Tracking] Add new languages to validation to [Tracking] Add new sites to validation. Oct 27 2019, 10:11 PM
DannyS712 updated the task description.
DannyS712 triaged this task as Medium priority. Oct 29 2019, 6:36 AM

Is there a reason you're not using the SiteMatrix API?

Didn't know about it. But it doesn't look like there is a way to exclude closed projects, only a way to filter so that closed projects alone are shown.
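Filtering closed projects out of the SiteMatrix response can be done client-side. A minimal sketch, assuming the usual action=sitematrix JSON shape, where closed wikis carry a `closed` flag on the site entry (the sample data below is illustrative, not real API output):

```javascript
// Sketch: collect open (non-closed) wiki dbnames from a SiteMatrix-shaped
// response. The real action=sitematrix response has numeric keys for each
// language group, plus "count" and a "specials" array; closed wikis are
// marked with a "closed" property on the site entry.
function openWikis( sitematrix ) {
	const open = [];
	for ( const key of Object.keys( sitematrix ) ) {
		if ( key === 'count' ) {
			continue;
		}
		// "specials" is a plain array; language groups keep their
		// sites under a "site" property.
		const sites = key === 'specials' ? sitematrix[ key ] : sitematrix[ key ].site;
		for ( const site of sites || [] ) {
			if ( site.closed === undefined ) {
				open.push( site.dbname );
			}
		}
	}
	return open;
}

// Illustrative sample shaped like a tiny slice of the real response:
const sample = {
	count: 2,
	0: {
		code: 'en',
		site: [
			{ dbname: 'enwiki', url: 'https://en.wikipedia.org' },
			{ dbname: 'enwikinews', url: 'https://en.wikinews.org', closed: '' }
		]
	},
	specials: [
		{ dbname: 'metawiki', url: 'https://meta.wikimedia.org' }
	]
};

console.log( openWikis( sample ) ); // [ 'enwiki', 'metawiki' ]
```

In the browser the response itself would come from something like `new mw.Api().get( { action: 'sitematrix' } )`; the filtering logic is the same either way.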

Or better, the globaluserinfo API which gives you only the projects for which the current user has an account.

Didn't think about it. But that requires an extra API call and processing for the validation, and it also returns a bunch of other info that I don't need. I'll take a look at integrating that instead, but using stored data (to me) appears faster.

I would use the globaluserinfo API and sort by edit count to get your "priority" wikis. Any wiki for which I have no account should not unnecessarily be queried.
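The "sort by edit count" idea can be sketched as a small pure function over the merged account list that `action=query&meta=globaluserinfo&guiprop=merged` returns (the sample entries below are made up for illustration):

```javascript
// Sketch: order a user's attached wikis by edit count, highest first, to
// get a "priority" list. Assumes the merged-list shape from
// meta=globaluserinfo&guiprop=merged, where each entry has at least
// "wiki" and "editcount".
function priorityWikis( merged ) {
	return merged
		.slice() // don't mutate the caller's array
		.sort( ( a, b ) => ( b.editcount || 0 ) - ( a.editcount || 0 ) )
		.map( ( entry ) => entry.wiki );
}

// Illustrative sample data:
const merged = [
	{ wiki: 'metawiki', editcount: 40 },
	{ wiki: 'enwiki', editcount: 12000 },
	{ wiki: 'frwiki', editcount: 3 }
];

console.log( priorityWikis( merged ) ); // [ 'enwiki', 'metawiki', 'frwiki' ]
```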

Yes, but the validation is applied when saving the settings, and the user could be planning to visit that site and have an account autocreated.

I don't quite follow. What are you validating? My point is tasks like T237380 shouldn't be necessary. New wikis are created regularly, and you shouldn't be burdened with maintaining a local copy of data you can get from existing APIs. If the user hasn't visited a wiki yet, they wouldn't have any pages watchlisted anyway.

When a user goes to save their settings (at https://meta.wikimedia.org/wiki/Special:BlankPage/GlobalWatchlistConfig), the options given are checked - trying to add foo.wikipedia or outreach.wiktionary results in an error. As new wikis are added, I update the settings to allow, e.g., szy.wikipedia.

Right, but you could validate against the globaluserinfo or sitematrix API, no? I don't see why you need to maintain your own list of wikis.
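The validation being discussed can be sketched as checking user-entered site names against a set of known sites derived from the API, rather than a hard-coded list. A minimal sketch, where the helper names and the URL-to-site-name mapping are hypothetical, not taken from the actual script:

```javascript
// Sketch: turn a list of site URLs (as returned by SiteMatrix or
// globaluserinfo) into a lookup set of "lang.project" style names, then
// flag any user-entered sites that aren't in it. Helper names here are
// hypothetical.
function buildKnownSites( urls ) {
	// 'https://en.wikipedia.org' -> 'en.wikipedia'
	return new Set(
		urls.map( ( url ) =>
			url.replace( /^https?:\/\//, '' ).replace( /\.org$/, '' )
		)
	);
}

function invalidSites( entered, known ) {
	return entered.filter( ( site ) => !known.has( site ) );
}

const known = buildKnownSites( [
	'https://en.wikipedia.org',
	'https://meta.wikimedia.org'
] );

console.log( invalidSites( [ 'en.wikipedia', 'foo.wikipedia' ], known ) );
// [ 'foo.wikipedia' ]
```

With this shape, "foo.wikipedia" fails validation automatically, and new wikis like szy.wikipedia pass as soon as the API knows about them, with no list to maintain.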

I was doing it against stored data so it would be faster (no API calls needed), but I guess I can.

Yes. The API call should be fast enough, and if you find otherwise, consider caching it with mw.storage or something (but I don't think you need to). I would also use meta.wikimedia per convention but mediawiki.org works just as well, as it's part of the farm.
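The caching suggestion can be sketched with a small wrapper over a storage-like object with `get`/`set` (in the browser, `mw.storage` has this shape; a plain object stands in here so the logic is self-contained and testable). The TTL handling is illustrative, not a claim about how `mw.storage` expiry works:

```javascript
// Sketch: cache an expensive lookup (e.g. the sitematrix result) behind a
// storage-like object. Values are stored as JSON with an expiry timestamp.
function cachedFetch( storage, key, ttlMs, fetchFn, now = Date.now() ) {
	const raw = storage.get( key );
	if ( raw ) {
		const { expires, value } = JSON.parse( raw );
		if ( expires > now ) {
			return value; // still fresh, skip the API call
		}
	}
	const value = fetchFn();
	storage.set( key, JSON.stringify( { expires: now + ttlMs, value } ) );
	return value;
}

// Stand-in for mw.storage with the same get/set shape:
const fakeStorage = {
	data: {},
	get( key ) { return this.data[ key ] || null; },
	set( key, value ) { this.data[ key ] = value; }
};

let calls = 0;
const fetchSites = () => { calls++; return [ 'enwiki', 'metawiki' ]; };

cachedFetch( fakeStorage, 'sites', 60000, fetchSites );
cachedFetch( fakeStorage, 'sites', 60000, fetchSites );
console.log( calls ); // 1 - the second call was served from the cache
```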

I was just giving mediawiki as an example here - I would use new mw.Api() to query locally.