User Details
- User Since
- Nov 24 2021, 5:14 PM (210 w, 3 d)
- Availability
- Available
- LDAP User
- Unknown
- MediaWiki User
- HShaikh (WMF) [ Global Accounts ]
Tue, Dec 2
Please check again and let us know if the access is not restored.
We have updated an IP whitelisting rule on our side.
Mon, Dec 1
Hey folks,
I have raised this to our SRE team and they are looking into it.
Will keep this ticket updated as we progress.
Wed, Nov 12
@FNavas-foundation Would a 10-day retention/lifecycle for content in this bucket be good?
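For reference, a 10-day retention like the one proposed above would map to an S3 lifecycle expiration rule. A minimal sketch, assuming S3 is the bucket in question (the rule ID and bucket name below are illustrative, not from the ticket):

```python
# Hypothetical sketch of a 10-day expiration rule for the bucket
# discussed above. This dict is the shape that boto3's
# put_bucket_lifecycle_configuration expects.
lifecycle_config = {
    "Rules": [
        {
            "ID": "expire-content-after-10-days",  # illustrative name
            "Status": "Enabled",
            "Filter": {"Prefix": ""},     # empty prefix = whole bucket
            "Expiration": {"Days": 10},   # objects deleted 10 days after creation
        }
    ]
}

# With AWS credentials configured, it would be applied roughly like:
# import boto3
# boto3.client("s3").put_bucket_lifecycle_configuration(
#     Bucket="example-bucket",  # placeholder, not the real bucket
#     LifecycleConfiguration=lifecycle_config,
# )
```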
Sep 15 2025
Reading the exchange above, I feel like there is a need to get the IPs whitelisted, and I can forward that request, but I also see that other mechanisms are being explored as well. Are we ready to ask for IP whitelisting, or do we want to go another route to avoid requesting IP list changes?
Aug 25 2025
Noticed that the order of events above has "add Kafka healthchecks" at the end. I believe it needs to come before the production deploy.
Jul 17 2025
Approved. Thank you
Jul 7 2025
https://gitlab.wikimedia.org/repos/wme/pageviews
The repo has been created.
Jun 17 2025
Hey Ben, the need for a new instance came from a future isolation perspective. (WME doesn't want its jobs to potentially cause resource starvation, or to be affected by other jobs' hunger.)
The Pageviews project is a starter project where we are looking to start doing some compute inside the DPE infrastructure, to benefit from access to data at an earlier stage in the pipelines, allowing us to be more timely with the signals we would like to create.
Creation of this pageviews data set is one part of the pipeline; the second part is syncing that data over to AWS. The mechanism for the syncing is still being designed to be optimal (in terms of form and frequency).
Looking at the GitLab folder structure, the team was thinking of having a high-level folder for WME jobs (which, it turns out, translates to a separate instance).
In the initial phase our compute needs will probably not be too high, and we can use the shared resources in the airflow-main instance.
Would it be possible to create a folder at the https://gitlab.wikimedia.org/repos/data-engineering/airflow-dags level but have it run on the airflow-main instance?
Jun 12 2025
As part of this work we have requested resources on the DPE team's side. Ticket is here: https://phabricator.wikimedia.org/T396672
May 12 2025
Update from Prabhat in a different channel:
Apr 23 2025
Yes, accessing these through the Enterprise APIs does require a separate login for now, and is a little more involved than how it was on the WMF dumps site. Hopefully the SDKs available on GitHub make it easier to get started.
Dec 16 2024
GitHub invitation sent.
Zendesk and PagerDuty will be done once training and a dry run are complete.
Dec 6 2024
For clarity: the request is for Chris to be able to eventually run Jupyter notebooks.
So he is requesting access to the analytics-privatedata-users group in the analytics groups, with SSH and Kerberos access.
Jun 7 2024
@Abbe98 Thanks for the explicit ask on the code snippet scenario; it clarifies the asks for me. I was under the impression this was something that could be done server-side. In the case of client-side-only scripts, the solution would not be simple to implement. Some possibilities are listed below (I understand that some of these might be options that cannot be used, or are ill-advised, for some use cases):
Jun 5 2024
Hello @Abbe98. Sorry for the delay. Thank you for your patience in waiting for my reply. I’m posting a reply here in Phabricator (and @LWyatt will link to it on the Mastodon thread where this question originated). But moreover, given the question you have asked might come up again in the future, we will add “why does the API require authentication?” to our FAQ on metawiki soon.
May 23 2024
@Abbe98 Let me look into your requests. I will get back to you in a few days.
In the meantime, a request which I think is similar to yours was previously posted on our talk page back in January. Please check if my response there addresses any of your questions/concerns:
https://meta.wikimedia.org/wiki/Talk:Wikimedia_Enterprise#c-HShaikh_(WMF)-20240112213100-LWyatt_(WMF)-20240104000900
I believe this has been solved. A delivery mechanism has been set up for Francisco, and Ehi has all the access she needs.
Feb 22 2024
Would it be possible to block signups from offending lists as part of the pre-signup Lambda?
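For context, a pre-signup trigger like the one suggested above could be sketched as a Cognito PreSignUp Lambda handler. This is a hedged sketch, not the actual implementation; `BLOCKED_DOMAINS` is a hypothetical stand-in for the "offending lists" mentioned, and the real source of that data is not specified in the ticket:

```python
# Hypothetical Cognito pre sign-up Lambda that rejects email addresses
# whose domain appears on a blocklist. In Cognito, raising an exception
# from the PreSignUp trigger causes the sign-up to be rejected.
BLOCKED_DOMAINS = {"spam.example", "disposable.example"}  # placeholder list

def lambda_handler(event, context):
    """Handle the Cognito PreSignUp trigger event.

    Returning the event unchanged lets the sign-up proceed;
    raising an exception blocks it.
    """
    email = event["request"]["userAttributes"].get("email", "")
    domain = email.rsplit("@", 1)[-1].lower()
    if domain in BLOCKED_DOMAINS:
        raise Exception(f"Sign-ups from {domain} are not allowed.")
    return event
```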
Nov 8 2023
@Mitar Preferably we would like it if you can sign up and use the trial account to see if that meets your needs. But as mentioned earlier, access to our APIs is also available through the WMCS services.
Following the Enterprise section on https://wikitech.wikimedia.org/wiki/Portal:Data_Services, you can follow the link for PAWS to access the services and create a JupyterLab instance to access the WME services.
Oct 30 2023
@Kelson pinging to see if this is still on the radar.
Oct 2 2023
Yes, this is resolved now.
Sep 29 2023
@MPhamWMF Can you confirm that you have credentials now?
Jul 6 2023
Also adding the request to give Ehi AWS and GitLab access.
Jun 7 2023
We discussed some options and will research a few more on the AWS implementation side and come back early next week.
May 4 2023
Sorry, I thought I had already responded, but it seems I forgot to hit submit on the ticket. Reuven is correct; we are fine with the current explanation. If something comes up and we need to reopen, I will let you know.
Apr 24 2023
@DAbad I think we can close this ticket. I was able to get initial results for myself. But if this is something that is going to be revisited by the Data teams, we can keep it around.
Apr 14 2023
Sample query:
SELECT * FROM mediawiki_api_request WHERE year = 2023 AND month = 01 AND day = 27 AND hour = 16 LIMIT 10;
