
Create URL for Mexico Awareness Campaign
Closed, ResolvedPublic

Description

We need a URL redirect setup for the Mexico Awareness Campaign. The Gerrit repo is here - https://gerrit.wikimedia.org/r/admin/projects/wikimedia/campaigns/eswiki-2018 (static single page website, no js)

Would it be possible to set up a redirect on any of the following URLs:

  • es.wikipedia.org/bienvenida
  • wikipedia.org/es/bienvenida

Event Timeline

Prtksxna created this task.Oct 24 2018, 2:37 AM
Restricted Application changed the subtype of this task from "Deadline" to "Task". · View Herald TranscriptOct 24 2018, 2:37 AM

We understand that these URLs currently take one to https://es.wikipedia.org/wiki/Bienvenida (and correctly so). If neither of these is possible, let us know and we'll try to come up with an alternative (possibly a sub-domain on wikipedia.org, like 15.wikipedia.org)

Dzahn added a comment.Oct 26 2018, 9:58 PM

I think the right place for this would be the Wikipedia namespace on es.wikipedia, so the existing page:

https://es.wikipedia.org/wiki/Wikipedia:Bienvenidos

Second best would be a new page in the Wikipedia namespace on es.wikipedia or a page on meta.

In those cases you don't even need any help from us, just create the wiki page, insert the REDIRECT line (https://www.mediawiki.org/wiki/Help:Redirects#Creating_a_redirect) and be done with it.
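For reference, such a redirect page is a single line of wikitext, e.g. pointing at the existing Wikipedia:Bienvenidos page:

```
#REDIRECT [[Wikipedia:Bienvenidos]]
```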

Both es.wikipedia.org/bienvenida and wikipedia.org/es/bienvenida as well as non-wiki subdomains of wikipedia.org should be avoided.

Another option would be to use a URL shortener.

If you think it really needs a subdomain please select something in wikimedia.org.

From my end, the URL isn't terribly important. We should not replace something that's community created/maintained, but otherwise the constraints around tracking and UX are more important than the URL.

I understand. Yes, it would be great if you can use some (new) page under es.wikipedia.org/wiki/Wikipedia:.

I don't know much about the tracking constraints, but I would expect we have existing analytics for pages on es.wiki. I am sure the Analytics team can say more about that.

@Dzahn thanks for all the suggestions! I am wondering how we'll deploy our static site by setting up a redirect on a wiki; would you be able to help with that?

Dzahn added a comment.EditedOct 26 2018, 11:38 PM

@Prtksxna Is this a dynamic page with scripting or is it a static page with just HTML/CSS and some images? Could the content as well be in a wiki page given that we can upload images and format the text? (and if you imagine it would be locked, so regular users can't edit it)

@Prtksxna Is this a dynamic page with scripting or is it a static page with just HTML/CSS and some images?

Yep. But one of the main requirements for the page is analytics on outbound links. That is something we are planning to do with Piwik (Matomo). I am not sure we can script that on the wiki (without, of course, deploying EventLogging code, which is not something we'd like to get into).
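For context, Matomo's standard JavaScript tracker handles outbound-link tracking via its enableLinkTracking call. A minimal sketch of the usual embed code follows; the matomo.example URL and site ID are placeholders, not the campaign's actual configuration:

```html
<script>
  var _paq = window._paq = window._paq || [];
  _paq.push(['trackPageView']);
  _paq.push(['enableLinkTracking']); // records clicks on links leaving the site
  (function () {
    var u = 'https://matomo.example/'; // placeholder Matomo instance URL
    _paq.push(['setTrackerUrl', u + 'matomo.php']);
    _paq.push(['setSiteId', '1']); // placeholder site ID
    var d = document, g = d.createElement('script'),
        s = d.getElementsByTagName('script')[0];
    g.async = true; g.src = u + 'matomo.js';
    s.parentNode.insertBefore(g, s);
  })();
</script>
```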

Could the content as well be in a wiki page given that we can upload images and format the text? (and if you imagine it would be locked, so regular users can't edit it)

That would be a significant change to the designs though. @Nirzar, what do you think?

Could the content as well be in a wiki page given that we can upload images and format the text? (and if you imagine it would be locked, so regular users can't edit it)

Yes, there is no requirement to edit this page. It's just a simple HTML/CSS page, and it has to be clean enough to welcome people who are not aware of Wikipedia; that was the requirement from the Communications team. This won't be a wiki page.

The URL is not as important as the cleanliness of this page, so let's go with the URL that is easier to implement.

Dzahn added a comment.EditedOct 29 2018, 7:13 PM

In this case, let's pick something in wikimedia.org and deploy it as a micro-site along other static sites like design.wikimedia.org, research.wikimedia.org etc.

Additionally you can then add redirects to it from es.wikipedia.org.

How about bienvenida.wikimedia.org ?

And yes, I can do the needed Puppet and DNS changes for that.

Dzahn claimed this task.Oct 29 2018, 7:14 PM
bd808 removed a subscriber: bd808.Oct 29 2018, 7:50 PM

Change 470531 had a related patch set uploaded (by Dzahn; owner: Dzahn):
[operations/dns@master] create bienvenida.wikimedia.org for Mexico awareness campaign

https://gerrit.wikimedia.org/r/470531

Krinkle added a subscriber: Krinkle.EditedOct 30 2018, 1:29 AM

I'll add that if this URL is meant to be typed by humans and visually transmitted in text form on social media or in images/posters, a Wikimedia subdomain may not be the most effective.

I believe people would be prone to type bienvenida.wikipedia.org (with a p), especially because the campaign explicitly focuses on Wikipedia; most viewers will have never heard of Wikimedia and will continue to not have heard of it after successfully interacting with this campaign (awareness of Wikimedia being a non-goal).

Additionally, I think needing to type a specific URL is hard in general (most people will probably google "wikipedia bienvenida"). But within the set of users willing and able to type a URL, I believe they would be far more familiar with typing a URL with a slash than one with a subdomain.

In other words, I recommend installing the redirect at wikipedia.org/bienvenida, which would forward users to this microsite - which in turn could live at bienvenida.wikimedia.org (with an m).

Bonus: If they give up at the domain, they'll end up at the localized Wikipedia.org portal (pretty good).
Bonus: If they mistype "bienvenida", they'll get a 404 served by us with links to Wikipedia (instead of an unknown domain in DNS).
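As a hypothetical sketch (not the actual portal configuration), such a redirect could be a one-line Apache rewrite rule on whatever serves the wikipedia.org portal:

```apache
RewriteEngine On
# Hypothetical: forward /bienvenida on the portal to the microsite
RewriteRule ^/bienvenida$ https://bienvenida.wikimedia.org/ [R=302,L]
```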

Dzahn added a comment.Oct 30 2018, 1:39 AM

I think we should avoid adding more subdomains to the wikipedia.org namespace that are not actual wikis. One of the concerns when doing 15.wikipedia.org was that it would set a precedent for every subsequent temporary campaign.

es.wikipedia.org can easily be used without needing any code changes or requests, just by editing the wiki. And it is what the campaign is about: the Spanish edition of Wikipedia.

Most importantly, the requestors themselves said the URL isn't that important to them, but once we create these URLs we have to support them forever.

Dzahn added a comment.Oct 30 2018, 1:41 AM

If it _is_ important because it is used in printed materials, I get the point though. In that case I am thinking "I wish the w.wiki URL shortener were enabled". Not sure about the status of https://meta.wikimedia.org/wiki/Special:UrlShortener

This will be linked from a video campaign - we won't be showing the URL that much for folks to type, but will expect people to click through to it (and from it).

How about bienvenida.wikimedia.org ?

This should be fine; as @atgo points out, people will just be clicking through.


Thanks for the patch @Dzahn

Change 470728 had a related patch set uploaded (by Dzahn; owner: Dzahn):
[operations/puppet@production] microsites: create bienvenida.wikimedia.org apache static site

https://gerrit.wikimedia.org/r/470728

Change 470728 merged by Dzahn:
[operations/puppet@production] microsites: create bienvenida.wikimedia.org apache static site

https://gerrit.wikimedia.org/r/470728

Change 470732 had a related patch set uploaded (by Dzahn; owner: Dzahn):
[operations/puppet@production] microsites::bienvenida: enable content cloning

https://gerrit.wikimedia.org/r/470732

Change 470531 merged by Dzahn:
[operations/dns@master] create bienvenida.wikimedia.org for Mexico awareness campaign

https://gerrit.wikimedia.org/r/470531

Dzahn added a comment.Nov 1 2018, 10:41 PM

Hi all,

https://bienvenida.wikimedia.org/ exists now and points to our standard page for projects yet to be created.

The Puppet part is done, except the part that actually clones the content is commented out. There is an Apache site on a pair of backend servers (the ones that also host other "misc static" sites).

One more change is needed to add it to Varnish caching so that public traffic is actually sent to the backend. Coming soon!

I think a security review might technically be required for the content before it can go live, though I understand that might sound like overkill for a site that is just static. I will ask.

Do you have a specific date to go live or just "asap" ?

Do you expect there will be changes to the code after initially going live? I guess not unless something unexpected shows up, right?

Change 471184 had a related patch set uploaded (by Dzahn; owner: Dzahn):
[operations/puppet@production] varnish/trafficserver: add bienvenida.wm.org -> bromine/vega

https://gerrit.wikimedia.org/r/471184

@Dzahn the campaign will launch as early as Nov 7, so ideally before that so we can run some clicks through it and make sure the data is flowing properly.

It's unlikely we'll make many changes, but there's a possibility that we'll want to iterate if we're not seeing traffic flowing through. In that case, we'd want to be able to make changes quickly to optimize during the campaign.

Dzahn added a comment.Nov 1 2018, 11:58 PM

I asked the Security team; they took a look at the contents and said it's fine to go ahead. We are unblocked there.

I got feedback that the Piwik logo and tracking code seem duplicated though. (@Prtksxna, wanna check that?)

@atgo Ok, no worries; it can be done as early as tomorrow at this point.

Dzahn added a comment.EditedNov 2 2018, 12:09 AM

@atgo @Prtksxna Regarding content updates there are 2 options.

Either a human step is required to git pull changes, or it is done automatically by Puppet.

The first one is secure, but you have to ask somebody on our team to do it for you. If changes are mostly expected around the launch date, I don't see that as an issue. You could ping me personally, and we also have a rotating "on duty" person, so it would not mean waiting for a long time.

The other option is that I set it to automatically pull the latest changes. That means everything you merge goes into production after a wait of at most 30 minutes. It may even be that pinging somebody on IRC is faster than this, since you would still have to wait for Puppet.

And in this case it matters a bit more who actually has +2 rights on your content repo. Everybody who can merge would effectively also become a deployer; otherwise the two steps are separated, which has advantages and disadvantages.

I am not sure yet whether anyone in "wmf" can merge there or just your team, and whether you would be concerned that a lot of people could then change the site content.

A disadvantage of the automatic method might be that you lose the option to merge content before actually "going live". Maybe you would prefer to be able to review/merge and decide separately when a change goes live.
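As a rough sketch of the automatic option, a Puppet resource using the stock vcsrepo module could look like the following. The path is illustrative, and the production manifest actually uses Wikimedia's own git::clone define rather than vcsrepo (see the clone notice later in this task):

```puppet
# Illustrative only: keep the microsite checkout at the latest master commit.
# Each Puppet agent run (roughly every 30 minutes) performs the update.
vcsrepo { '/srv/org/wikimedia/bienvenida':
  ensure   => latest,
  provider => git,
  source   => 'https://gerrit.wikimedia.org/r/wikimedia/campaigns/eswiki-2018',
  revision => 'master',
}
```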

Change 470732 merged by Dzahn:
[operations/puppet@production] microsites::bienvenida: enable content cloning

https://gerrit.wikimedia.org/r/470732

Dzahn added a subscriber: Platonides.EditedNov 2 2018, 12:44 AM

I went ahead and enabled content cloning:

Notice: /Stage[main]/Profile::Microsites::Bienvenida/Git::Clone[wikimedia/campaigns/eswiki-2018]/Exec[git_clone_wikimedia/campaigns/eswiki-2018]/returns: executed successfully

It worked fine, and the top of the git log already shows @Platonides's change to remove the duplicate Piwik code.

Puppet is set to automatically pull the latest head of master.

Dzahn added a comment.EditedNov 2 2018, 12:56 AM

If you have shell access to anything in production you could now start looking at the website before it is actually available to the public.

The servers bromine.eqiad.wmnet and vega.codfw.wmnet (one in each data center for redundancy) are now serving http://bienvenida.wikimedia.org with the content from your repo.

The only part missing is the varnish change that would send public traffic to them.

You can, for example, do "ssh -D 8080" to a prod server (a bastion, a deployment or maintenance host, the people.wikimedia.org backend rutherfordium that anyone has access to, or whatever) and then set your local browser to use a SOCKS5 proxy at localhost:8080. Additionally, you would have to edit your local /etc/hosts file and point bienvenida.wikimedia.org to the IP of one of the backend servers above.

Or you could use the FoxyProxy extension, or just work with curl or wget from a prod server.
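To sketch the two routes described above (the bastion hostname is a placeholder; bromine.eqiad.wmnet is one of the backends named earlier, and these commands only work with production access):

```shell
# Route 1: open a SOCKS proxy through a prod host (bastion name is a
# placeholder), then point your browser at a SOCKS5 proxy on
# localhost:8080 and add an /etc/hosts entry mapping
# bienvenida.wikimedia.org to a backend IP.
ssh -D 8080 bastion.example.wikimedia.org

# Route 2: query a backend directly from any prod server, sending the
# Host header so Apache selects the bienvenida virtual host.
curl -s -H 'Host: bienvenida.wikimedia.org' http://bromine.eqiad.wmnet/
```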

Dzahn added a subscriber: ema.Nov 2 2018, 1:03 AM

I am pinging @ema from Traffic to review / merge https://gerrit.wikimedia.org/r/#/c/operations/puppet/+/471184/ and then the site should work without any of the above and we are done.

Dzahn triaged this task as High priority.Nov 2 2018, 1:06 AM

Change 471184 merged by Ema:
[operations/puppet@production] varnish/trafficserver: add bienvenida.wm.org -> bromine/vega

https://gerrit.wikimedia.org/r/471184

Patch merged, the site works:

$ lynx -dump https://bienvenida.wikimedia.org | grep Bien
¡Bienvenidas y bienvenidos a Wikipedia!

Dzahn added a comment.Nov 2 2018, 4:36 PM

Thanks @ema :)

@Prtksxna @atgo @Nirzar done -> https://bienvenida.wikimedia.org/ :)

I will mark it as resolved, but please keep commenting here on the ticket nevertheless; I get notifications even before email, and it can always be reopened.

Dzahn closed this task as Resolved.Nov 2 2018, 4:36 PM

Thanks @Dzahn. Much appreciated.

Thank you so much, @Dzahn!

atgo reopened this task as Open.Nov 13 2018, 9:29 PM

@Dzahn just commented on the wrong task, whoops. Reopening because:

@Dzahn sometimes I'm getting a Bugzilla page instead of the Bienvenida page. See this screenshot:

Right now that's the only page I'm getting. Have tested across multiple devices.

atgo raised the priority of this task from High to Unbreak Now!.Nov 13 2018, 9:33 PM

Campaign is meant to launch tomorrow :|

Restricted Application added subscribers: Liuxinyu970226, TerraCodes. · View Herald TranscriptNov 13 2018, 9:33 PM
jrbs added a subscriber: jrbs.Nov 15 2018, 12:32 AM

@Dzahn just commented on the wrong task, whoops. Reopening because:
@Dzahn sometimes I'm getting a Bugzilla page instead of the Bienvenida page. See this screenshot:


Right now that's the only page I'm getting. Have tested across multiple devices.

Seems to work for me now (Chrome and Safari on macOS)

@jrbs Oh yes, it's fixed. I just failed to comment here and did so instead on T202592#4744207

Dzahn closed this task as Resolved.Nov 15 2018, 12:35 AM

It was actually fixed by Brandon by simply restarting one of the two backend servers.