
Review and deploy UrlShortener extension to Wikimedia wikis
Closed, Resolved (Public)

Assigned To
Authored By
Legoktm
Aug 10 2015, 6:46 AM

Description

The UrlShortener extension is designed to implement Tim's suggestion from https://www.mediawiki.org/wiki/Requests_for_comment/URL_shortener#Tim.27s_implementation_suggestion

It provides a special page, Special:UrlShortener, that accepts arbitrary links from whitelisted domains and gives them short URLs. It is intended that we will use the "w.wiki" domain for this.
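
For anyone trying it locally, a minimal configuration might look like the sketch below. The variable names $wgUrlShortenerDomainsWhitelist and $wgUrlShortenerServer are assumptions based on the extension's documentation, not something specified in this task:

```php
// Minimal local-install sketch (assumed config names, see note above).
wfLoadExtension( 'UrlShortener' );

// Regex fragments for domains whose URLs may be shortened (assumed name).
$wgUrlShortenerDomainsWhitelist = [
	'(.*\.)?wikipedia\.org',
	'(.*\.)?wikimedia\.org',
];

// The short domain that serves the redirects; "w.wiki" in production (assumed name).
$wgUrlShortenerServer = 'w.wiki';
```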

Checklist:

  • Create a tracking bug for the extension's deployment to Wikimedia wikis. This bug should only concern the deployment itself; any sub-issues (that block deployment) should be separate bugs listed under "Blocked by" on this tracking bug.
  • Create Extension: mediawiki.org page for developers and people who will install or configure the extension.
  • Create Help:Extension: mediawiki.org page for the end user documentation. Cross-link it with the above.
  • Request a project in Phabricator if none exists yet.
  • Get the extension code in Gerrit.
  • Show community support/desire for the extension to be deployed.
  • Request (and respond to) a product review, if applicable
  • Request (and respond to) a design review, if applicable. T191592: Design review for UrlShortener
  • Open (and respond to) an Application Security Review ticket blocking this.
  • Make sure the extension is automatically branched.

Test at: https://en.wikipedia.beta.wmflabs.org/ ("Get shortened URL" in the Toolbox. The Beta cluster uses a local "short" domain, e.g. https://w-beta.wmflabs.org/4; it's a hacky setup, but it should work).
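
For a quick scripted check of that beta setup, something like this sketch verifies that the short URL redirects; the example code "4" is the one mentioned above, and any existing code works:

```php
<?php
// Fetch the beta short URL without following the redirect and show where it points.
$context = stream_context_create( [
	'http' => [ 'method' => 'HEAD', 'follow_location' => 0 ],
] );
$headers = get_headers( 'https://w-beta.wmflabs.org/4', true, $context );

echo $headers[0], "\n";                             // e.g. "HTTP/1.1 301 Moved Permanently"
echo $headers['Location'] ?? '(no redirect)', "\n"; // the long URL it expands to
```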

Related Objects

Status - Assigned
Resolved - Ladsgroup
Resolved - BBlack
Resolved - Smalyshev
Resolved - Ladsgroup
Resolved - matmarex
Resolved - matmarex
Resolved - Legoktm
Resolved - matmarex
Duplicate - None
Resolved - ori
Resolved - BBlack
Resolved - BBlack
Resolved - dpatrick
Resolved - Prtksxna
Resolved - Legoktm
Resolved - Legoktm
Resolved - ArielGlenn
Resolved - Joe
Resolved - BBlack
Resolved - Smalyshev
Resolved - Ladsgroup
Resolved - Marostegui
Resolved - Legoktm
Resolved - Prtksxna
Resolved - Ladsgroup
Declined - None

Event Timeline

There are a very large number of changes, so older changes are hidden.

What is the current status? Any progress?

Sorry to ask again, but the Wikidata team would really love to know what the state here is, as this is blocking Wikidata-Query-Service development more and more (see T112715). I hear there are security issues (possibly more than one) blocking the deployment of this otherwise finished service. Can someone briefly explain what these issues are and what is needed to resolve them? That would be super helpful.

The code seems to have no ownership.

T116986: Set up UrlShortener dumps is kind of a chicken-and-egg problem: we can't create dumps until the extension is turned on, and we don't want to turn the extension on until we have dumps working... so I wrote https://gerrit.wikimedia.org/r/276826, which adds a read-only mode that prevents anyone from generating new short codes. That way we can turn on the extension in production, but not let anyone create short URLs until we have dumps and the Apache setup sorted out properly.
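
In config terms that read-only switch is just a flag, roughly like this; the exact variable name, $wgUrlShortenerReadOnly, is an assumption based on the patch, not quoted in this task:

```php
// Assumed name; existing short URLs keep resolving, but no new codes can be created.
$wgUrlShortenerReadOnly = true;
```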

The above (dumps plus the read-only mode) seems to be the outstanding TODO.

No, the outstanding TODO is having a group of people competently maintain and monitor the URL shortener for abuse/security issues.

So the issue seems to be that no one has plans to support this.

Create Help:Extension: mediawiki.org page for the end user documentation. Cross-link it with the above.

Although https://www.mediawiki.org/wiki/Extension:UrlShortener exists, the requested Help: page still seems to be missing.

I don't think a Help page is really a blocker? Anyone could help document how it works once it's out there?

I agree. We keep running into use cases for this feature - if there's anything Comms can do to help move it along, please let us know. :)

Create the page then? ;)

Seriously though, we have to be careful. The WMF wanting an extension deployed but not following the "rules"/procedure because it can't be bothered or doesn't want to... just looks bad to the community. When they want things deployed, they have to go through the same procedures. It's only fair.

It's a relatively simple task, and it doesn't have to be done by the author of the extension. Create it, and it will be collaboratively edited and improved.

Bearing in mind that the more important code stewardship question has only just been resolved.

Who decides if it needs a product/design review?

You tell us? You're the one claiming that there's actual strict rules/procedures here to follow, so your book should also tell you what the next steps are...

(Nobody wants to put pressure on an individual or a team here. We're trying to figure out where the responsibilities are, and huge props to Greg for fixing a step despite the fact that he (or his team) definitely didn't have to.)

You tell us? You're the one claiming that there's actual strict rules/procedures here to follow, so your book should also tell you what the next steps are...

Because there are. The checklist is there for a reason. It's not an "if we feel like it" thing -- if we didn't care about anything like that, we'd just end up deploying more extensions that would inevitably become tech debt. And nobody wants that.

I'm however not a product manager, nor am I a designer. These are things that often don't get done (and as such have been marked as "if applicable"), as the rules and guidelines for them are, at best, vague.

There is T133109 still outstanding too

To clarify the current status... I think the checklist is mostly accurate. I'm working on T133109: Add basic abuse prevention to UrlShortener right now, and hope to have that wrapped up by the end of the week. For the design review part, I'll ask @Prtksxna to take a look at the extension, since things have probably changed since he originally implemented it.

I like the checklist. :) I was just unaware that it's as official as you say it is. (Can you point me to where it's documented?)
If it is, then ownership of each step should be much clearer than it is. We would never achieve anything if we literally left blockers "up for grabs", hence why I argued that that specific step must not be one.

It's also ok if we keep discovering blockers. As long as we identify what the next step is, all's good.

<tangent> The process is documented at https://www.mediawiki.org/wiki/Review_queue#Checklist/Process - the person/team that is taking responsibility for the code is in charge of determining which of the optional steps need to/should be done, and getting those done by someone. </tangent>

If the screenshot on the new help page is anything to go by, the URL shortener will change URLs in the form:

https://en.wikipedia.org/wiki/Something_very_long

to something in the form:

https://en.wikipedia.org/wiki/Sthg_short

but will not change the "https://en.wikipedia.org/" domain.

Why not use shorter domains, like https://enwp.org/ ?

From the original description...

It is intended that we will use the "w.wiki" domain for this.

Why not use shorter domains, like https://enwp.org/ ?

It should be noted that the Foundation does not own enwp.org. I am not sure anyone knows who does. This is a potentially pretty serious security hole.

Just to point this out, https://dewp.org is owned by the German Wikimedia chapter (WMDE), so there shouldn't be any security problems with this one.

I don't think it was ever suggested that those were the domains we were going to use...

One would hope that the domain to be used would be universal for all 700+ wikis, not focused on enWP, or just WPs.

This shouldn't have to be said.

And I quote, from the description, again...

It is intended that we will use the "w.wiki" domain for this.

Ladsgroup changed the task status from Stalled to Open. Mar 25 2019, 11:20 PM
Ladsgroup claimed this task.

I'll do this.

The desc doesn’t read like this is ready to go

Shouldn’t the maintaining team also sign off that they’re happy for it to go ahead now/soon? Rather than you just doing it... and then leaving it to them? :)

Do we have SSL certs for the w.wiki domain now?

The desc doesn’t read like this is ready to go

I'm doing the subtasks. They have patches up and ready for merge.

Shouldn’t the maintaining team also sign off that they’re happy for it to go ahead now/soon? Rather than you just doing it... and then leaving it to them? :)

Yup, already got it from @CCicalese_WMF and @Fjalapeno

Do we have SSL certs for the w.wiki domain now?

It's not mentioned in the task. Is there a task for this? I can ping Traffic about it. (I already sent an email to ops-l a week ago.)

Do we have SSL certs for the w.wiki domain now?

https://w.wiki/3 already sends a valid certificate (the same one as for Wikipedia etc.), and w.wiki is listed as one of the canonical domains at wikitech:HTTPS as well. (It looks like this was done in T91612: acquire SSL certificate for w.wiki.)
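
If anyone wants to double-check that certificate programmatically rather than in a browser, a small sketch along these lines prints the subject and issuer of whatever w.wiki currently serves:

```php
<?php
// Connect to w.wiki over TLS, capture the peer certificate, and print its subject/issuer.
$ctx = stream_context_create( [ 'ssl' => [ 'capture_peer_cert' => true ] ] );
$sock = stream_socket_client( 'ssl://w.wiki:443', $errno, $errstr, 10, STREAM_CLIENT_CONNECT, $ctx );
$params = stream_context_get_params( $sock );
$cert = openssl_x509_parse( $params['options']['ssl']['peer_certificate'] );

echo 'Subject CN: ', $cert['subject']['CN'] ?? '?', "\n";
echo 'Issuer:     ', $cert['issuer']['O'] ?? '?', "\n";
```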

Change 499760 had a related patch set uploaded (by Ladsgroup; owner: Ladsgroup):
[mediawiki/extensions/UrlShortener@master] Put sidebar link behind a config variable

https://gerrit.wikimedia.org/r/499760

Change 499760 merged by jenkins-bot:
[mediawiki/extensions/UrlShortener@master] Put sidebar link behind a config variable

https://gerrit.wikimedia.org/r/499760
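
For context, the change above means the toolbox link can be kept off until launch with a single setting; the variable name $wgUrlShortenerEnableSidebar is assumed from the patch title rather than quoted here:

```php
// Assumed variable name; hides the "Get shortened URL" toolbox link until launch.
$wgUrlShortenerEnableSidebar = false;
```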

Change 500910 had a related patch set uploaded (by Ladsgroup; owner: Ladsgroup):
[operations/mediawiki-config@master] Enable UrlShortener in mediawikiwiki

https://gerrit.wikimedia.org/r/500910

Change 500910 merged by jenkins-bot:
[operations/mediawiki-config@master] Enable UrlShortener in mediawikiwiki

https://gerrit.wikimedia.org/r/500910
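
The usual wmf-config pattern for this kind of per-wiki rollout looks roughly like the sketch below; the setting name wmgUseUrlShortener is illustrative, and the merged change is linked above for the exact wording:

```php
// InitialiseSettings.php sketch (illustrative setting name, see note above).
'wmgUseUrlShortener' => [
	'default' => false,      // stays off everywhere else for now
	'mediawikiwiki' => true, // enabled on www.mediawiki.org first
],
```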

Should we ping archive.org to make sure they are properly archived in perpetuity, like all the bit.ly links etc. are these days?

@TheDJ The Internet Archive doesn't collect them; the Archive Team does (the data is sent to Internet Archive servers). Usually they do this by sequentially or randomly checking every possible alphanumeric combination, although perhaps the complete list of URLs could be given to them instead to save them some time, since the WMF owns the database.

Alternatively, if the list is available, someone could prefix all the URLs with https://web.archive.org/save/ and have them crawled once in a while.
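
As a sketch of that idea, a trivial script could walk a list of short URLs (taken from a published dump, hypothetically) and ask the Wayback Machine to capture each one:

```php
<?php
// Hypothetical list of short URLs, e.g. read from a dump file.
$shortUrls = [ 'https://w.wiki/3', 'https://w.wiki/4' ];

foreach ( $shortUrls as $url ) {
	// A plain GET to the /save/ endpoint asks the Wayback Machine to crawl the target.
	file_get_contents( 'https://web.archive.org/save/' . $url );
	sleep( 5 ); // stay well under any rate limits
}
```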

Should we ping archive.org to make sure they are properly archived in perpetuity, like all the bit.ly links etc. are these days?

See T116986#1764562 and the follow-up comment.

@TheDJ The Internet Archive doesn't collect them; the Archive Team does (the data is sent to Internet Archive servers). Usually they do this by sequentially or randomly checking every possible alphanumeric combination, although perhaps the complete list of URLs could be given to them instead to save them some time, since the WMF owns the database.

We are going to proactively provide dumps in the URLTeam format to avoid any reliability and archiving concerns, but will leave it to the Archive Team to decide how they want to ingest them (happy to help in any way possible, of course).

Mentioned in SAL (#wikimedia-operations) [2019-04-10T12:05:24Z] <ladsgroup@deploy1001> Synchronized wmf-config/InitialiseSettings.php: SWAT: Prep work for deploying UrlShortener extension (T108557), part I (duration: 01m 00s)

Mentioned in SAL (#wikimedia-operations) [2019-04-10T12:07:13Z] <ladsgroup@deploy1001> Synchronized wmf-config/CommonSettings.php: SWAT: Prep work for deploying UrlShortener extension (T108557), part II (duration: 01m 00s)

Change 502998 had a related patch set uploaded (by Ladsgroup; owner: Ladsgroup):
[operations/mediawiki-config@master] Deploy UrlShortener

https://gerrit.wikimedia.org/r/502998

Change 502998 merged by jenkins-bot:
[operations/mediawiki-config@master] Deploy UrlShortener

https://gerrit.wikimedia.org/r/502998