
Additional Webmaster tools access
Closed, Resolved · Public

Description

Following up on a ticket from last August. I'm digging deeper into Webmaster Tools; it's probably easiest to just add stu@a8c.com to all the different WMT sites, including the https:// ones, so for example https://en.wikipedia.org. I'm trying to dig into how much of the traffic Google sends us goes to those instead of the http ones. Wes Moran should also have access, as he and I will be going through this together.

On Aug 6, 2014, at 6:23 AM, Stuart West via RT <ops-requests@wikimedia.org> wrote:

Yes that’s right. stu@a8c.com is easiest for me. For now let’s do wikipedia.org, en.wikipedia.org, de.wikipedia.org, tr.wikipedia.org, zh.wikipedia.org. That’ll give me a few different languages to drill into.

On Aug 6, 2014, at 2:19 PM, Alexandros Kosiaris via RT <ops-requests@wikimedia.org> wrote:

Hello,

So, we delegate access to individual user accounts per domain for Google Webmaster Tools. That helps improve accountability and minimizes account sharing. I assume what is being asked is access to wikipedia.org and its subdomains (en.wikipedia.org, el.wikipedia.org, etc.). Any specific subdomain preference, or all of them?

We can also do both swest@wikimedia.org and/or stu@a8c.com, whatever suits you best.

Event Timeline

Stu raised the priority of this task to Needs Triage.
Stu updated the task description. (Show Details)
Stu added a project: acl*sre-team.
Stu added subscribers: Stu, akosiaris, Wwes.
Dzahn triaged this task as Medium priority. May 6 2015, 2:05 AM
Dzahn added a project: SRE-Access-Requests.
Dzahn set Security to None.
Dzahn subscribed.

Bump on this. I'm on a hangout with @Wwes right now trying to show him some stuff but don't have access. :-(

@damons can you help unstick this blocker for Wes and me?

So it looks like we have around 980 domains, ×2 if we want https for them all. We've been looking for a way to add them that doesn't require going through the web browser one site at a time, because that would take... well, a ridiculously long time. I didn't find anything in the Google Webmaster Tools API yet.

While we wait for a simple, non-tedious alternative to the manual process, how about we start with these for the Wikipedias:
en, de, zh, ru, it, es, fr, ja, pt, tr, nl, pl, ar, ko, hi
and wikipedia.org

These represent roughly our top 20 by mobile traffic and also cover the core desktop languages and regions.
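
For concreteness, here is how that property list expands. A minimal sketch in Python: the language codes are copied from the comment above, the http/https pairing reflects the doubling discussed earlier, and everything else is illustrative:

```
# Sketch: enumerate the Webmaster Tools properties requested above.
# Language codes are copied from the comment; the http/https pairing
# reflects the doubling discussed earlier in this thread.
LANGS = ["en", "de", "zh", "ru", "it", "es", "fr", "ja", "pt", "tr",
         "nl", "pl", "ar", "ko", "hi"]

properties = [
    f"{scheme}://{lang}.wikipedia.org/"
    for scheme in ("http", "https")
    for lang in LANGS
]
# Plus wikipedia.org itself, in both schemes.
properties += ["http://wikipedia.org/", "https://wikipedia.org/"]

print(len(properties))  # 32 properties in total
```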

I'd just love access to those as a start. A longer-term solution for more granular access can be handled as individual requests later.

Thanks.

I've added you both to all of these where you weren't on them already, but only for http. Google now has a cap on the number of domains (1000), and we are over it; the cap must have been introduced recently. I'm adding @dr0ptp4kt to ask if the sitemap entries are still in use or whether we can toss some of them.

Team, I have received approval for the access request from Lila Tretikov.

@ArielGlenn, regarding the domain capping, would you create a new tracking ticket on that? I'll follow up on email about that.

Adding @Jalexander because I know he knows about the 1000-domain limit; he mentioned it to me.

@ArielGlenn, regarding the domain capping, would you create a new tracking ticket on that? I'll follow up on email about that.

Done. Please see T99132.

While we wait for a simple, non-tedious alternative to the manual process, how about we start with these for the Wikipedias:
en, de, zh, ru, it, es, fr, ja, pt, tr, nl, pl, ar, ko, hi
and wikipedia.org

Hi, I logged in to get this done, and when I checked the first two, en and de, I noticed both addresses, swest@ and stu@, had already been added. So I assume somebody has already resolved this and the ticket just did not get updated, right?

Best,

Daniel

I added them to http and not https, as I mentioned in an earlier comment. But the ticket is not 'done'.

So we need to find 15 sites we can delete to replace them with the 15 missing https versions of en, de, zh, ru, it, es, fr, ja, pt, tr, nl, pl, ar, ko, hi?

Dzahn changed the task status from Open to Stalled. May 29 2015, 5:32 AM

@ArielGlenn - sitemap.wikimedia.org/* entries can be safely removed, making room for those top 15 https://<lang>.wikipedia.org entries plus both http://www.wikipedia.org/ and https://www.wikipedia.org/ entries.

Removed all the sitemap.wikimedia.org entries. We are now down to 636 entries.
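
For the record, a cleanup like that can be scripted against the Webmaster Tools v3 API (sites.list plus sites.delete). A minimal sketch using the Python client library, assuming OAuth credentials carrying the webmasters scope already exist; the credentials.json path is a placeholder, not something from this ticket:

```
# Sketch: remove every sitemap.wikimedia.org property via the
# Webmaster Tools v3 API. Assumes google-api-python-client and
# google-auth are installed and authorized credentials exist.
from google.oauth2.credentials import Credentials
from googleapiclient.discovery import build

creds = Credentials.from_authorized_user_file(
    "credentials.json",  # placeholder path
    scopes=["https://www.googleapis.com/auth/webmasters"],
)
service = build("webmasters", "v3", credentials=creds)

for entry in service.sites().list().execute().get("siteEntry", []):
    url = entry["siteUrl"]
    if "sitemap.wikimedia.org" in url:
        service.sites().delete(siteUrl=url).execute()
        print("deleted", url)
```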

@akosiaris did you happen to find an automated way to do that, for example with https://developers.google.com/site-verification/v1/ and https://developers.google.com/webmaster-tools/v3/quickstart/quickstart-oacurl?

It seems like, for the moment, if the goal is to mass-add delegated read-only access for a particular user, the only way to automate that would be with a Selenium script or something to that effect. Any other ideas?

And meanwhile we have yet another request to get access here for @Ironholds T101157

The latter. So far it has worked for deleting domains but not for adding them, hence my comment. For some reason oacurl seems to fall into an endless loop when using PUT to add a domain.
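
For anyone retracing this later: sites.add is just an authenticated PUT against the v3 endpoint with the site URL percent-encoded into the path, so the same request can be issued without oacurl. A minimal sketch with requests; the access token is a placeholder and would come from whatever OAuth2 flow you already use:

```
# Sketch: the raw HTTP request behind sites.add, issued with requests
# instead of oacurl. ACCESS_TOKEN is a placeholder; any OAuth2 flow
# granting the webmasters scope will produce one.
import urllib.parse
import requests

ACCESS_TOKEN = "ya29.placeholder"
site = "https://en.wikipedia.org/"

resp = requests.put(
    "https://www.googleapis.com/webmasters/v3/sites/"
    + urllib.parse.quote(site, safe=""),
    headers={"Authorization": f"Bearer {ACCESS_TOKEN}"},
    timeout=30,  # a hard timeout avoids the endless hang seen with oacurl
)
resp.raise_for_status()  # empty response body on success
```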

It seems like, for the moment, if the goal is to mass-add delegated read-only access for a particular user, the only way to automate that would be with a Selenium script or something to that effect. Any other ideas?

I have found no way to delegate access to a user (read-only or otherwise). Perhaps an automation script based on Selenium or something similar would work.
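
Sketching what that would look like: since user delegation only exists in the web console, a Selenium script would have to drive the UI directly. Every locator and URL pattern below is a hypothetical placeholder; the real console markup is undocumented, changes without notice, and would need to be inspected by hand first:

```
# Very rough sketch of driving the Webmaster Tools "add user" UI with
# Selenium. ALL locators and the user-admin URL pattern are hypothetical
# placeholders. Assumes the browser profile is already signed in to the
# Google account that owns the properties.
from selenium import webdriver
from selenium.webdriver.common.by import By

SITES = ["https://en.wikipedia.org/", "https://de.wikipedia.org/"]  # etc.
USER_TO_ADD = "swest@wikimedia.org"

driver = webdriver.Firefox()
try:
    for site in SITES:
        # Hypothetical per-site user-admin page.
        driver.get("https://www.google.com/webmasters/tools/user-admin"
                   "?siteUrl=" + site)
        driver.find_element(By.ID, "add-user-button").click()    # placeholder id
        driver.find_element(By.NAME, "user-email").send_keys(USER_TO_ADD)  # placeholder name
        driver.find_element(By.ID, "restricted-access").click()  # read-only; placeholder
        driver.find_element(By.ID, "save-button").click()        # placeholder id
finally:
    driver.quit()
```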

Let's give Stu the missing 15 https sites and resolve this ticket, then discuss automating the process as a separate thing.

While we wait for a simple, non-tedious alternative to the manual process, how about we start with these for the Wikipedias:
en, de, zh, ru, it, es, fr, ja, pt, tr, nl, pl, ar, ko, hi

I added these missing ones with https:// (or thought I would). For some reason every single one already had two owners, noc@wikimedia and abaso@wikimedia.

I added swest@wikimedia to them.

Dzahn claimed this task.

Let's give Stu the missing 15 https sites and resolve this ticket, then discuss automating the process as a separate thing.

And Wes, please.

Yes, and Wes, please :)

I do recommend adding the https://www.wikipedia.org/ and http://www.wikipedia.org/ domains for the two of them as well.