User Details
- User Since: Aug 10 2018, 4:17 PM (319 w, 8 h)
- Availability: Available
- LDAP User: Unknown
- MediaWiki User: Samuel (WMF) [ Global Accounts ]
Jul 16 2024
Hi @joanna_borun, tagging you here as per @acooper’s guidance. The current task is a continuation of a discussion on restricting the on-wiki visibility of Abuse filter modifications (T241667). The Security-Team recommendation on that question has traditionally been to err on the safe side and restrict Abuse Filter modifications to specific groups rather than making them available publicly to everyone by default, including anonymous users. So T284944 is to make sure that the redaction happens on replicas as well. Let me know if further clarification is necessary.
Jun 28 2024
The Cloud-Services project tag is not intended to have any tasks. Please check the list at https://phabricator.wikimedia.org/project/profile/832/ and replace it on this task with a more specific project tag. Thanks!
Jun 14 2024
Thanks for your input @Bawolff and @sbassett. So far the script outputs ~2800 potential candidates (see the table in F55305255#7434). I'll see how to leverage semgrep to improve the filtering logic and reduce false positives.
$ python wikimedia-gadgets-loading-tpr.py
To get the data on gadgets loading third-party resources in a much quicker way, I'm exploring a different approach that requires fewer steps. The idea is to have a script that can:
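For illustration only, here is a rough sketch of what such a script could look like. This is not the actual wikimedia-gadgets-loading-tpr.py: the single-wiki scope, the domain allow-list, and the URL regex are all simplifying assumptions. It lists MediaWiki-namespace JavaScript pages via the Action API and flags pages that reference non-Wikimedia URLs.

```
# Sketch only: flag MediaWiki-namespace .js pages that appear to load third-party resources.
import re
import requests

API = "https://en.wikipedia.org/w/api.php"  # assumption: a single wiki, for brevity

# Simplified heuristic: any URL whose host is not an obviously Wikimedia-owned domain.
THIRD_PARTY = re.compile(
    r"https?://(?!(?:[\w.-]+\.)?(?:wikipedia|wikimedia|wiktionary|wikidata|wikisource|"
    r"wikibooks|wikiquote|wikiversity|wikivoyage|mediawiki|wikifunctions)\.org)[\w./-]+"
)

def js_pages(session):
    """Yield titles of .js pages in the MediaWiki namespace (where gadgets live)."""
    params = {"action": "query", "list": "allpages", "apnamespace": 8,
              "aplimit": "max", "format": "json"}
    while True:
        data = session.get(API, params=params).json()
        for page in data["query"]["allpages"]:
            if page["title"].endswith(".js"):
                yield page["title"]
        if "continue" not in data:
            break
        params.update(data["continue"])

def page_text(session, title):
    """Fetch the current content of a page."""
    params = {"action": "query", "prop": "revisions", "rvprop": "content",
              "rvslots": "main", "titles": title, "formatversion": 2, "format": "json"}
    page = session.get(API, params=params).json()["query"]["pages"][0]
    return page["revisions"][0]["slots"]["main"]["content"]

if __name__ == "__main__":
    s = requests.Session()
    s.headers["User-Agent"] = "gadget-tpr-sketch/0.1 (example only)"
    for title in js_pages(s):
        hits = THIRD_PARTY.findall(page_text(s, title))
        if hits:
            print(title, "->", sorted(set(hits))[:3])
```

The real script would presumably also need to cover User:*.js pages and iterate over all Wikimedia wikis, as described in T335892.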
Dec 8 2023
I was able to include the high-level category and present the relevant data in a heatmap-like fashion. Although I did not include any percentages, to avoid visual clutter, I think the heatmap colors convey a sense of where exactly the most concerning areas are. Kindly take a look at the WIP_Matrix sheet and let me know what your thoughts are.
Dec 7 2023
Perhaps it makes sense to use Message::parse() directly in User::getRightDescription(). It could be redundant for Special:ListGrants et al., since there is also some parsing happening there, but at least it would prevent wikitext from being displayed as raw text. I'll submit a patch along these lines.
Dec 1 2023
The color-coding approach using Gsheet formulas seems reasonable and fairly straightforward to implement. The current table format is suited to color-coding each individual health-check metric row based on its score. However, if we need to include the high-level headings in the table, I'd need to find a good way to feature both the individual row color-code and the high-level heading color-code without visual confusion or clutter. I'll give it some thought and try something.
Nov 13 2023
Hi, all — I’ll share here a joint privacy review of the two proposed changes: enabling the TagManager and the Marketing Campaign Reporting plugins, as well as a succinct privacy risk assessment of the self-hosted Matomo instance, although it wasn’t specifically requested.
Oct 13 2023
I've given it a try in the sheet COPY_High-level indicators and grouped the factors under four high-level indicators: security, testability, activity, and stewardship. These indicators are mainly inspired by our MediaWiki documentation on code health[1][2] and some external resources[3].
I had another thought about this requirement. Besides the higher-level organizational header names, it would be helpful if the risk of those columns could be expressed collectively as a single value.
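Purely as a toy illustration of that idea, rolling each high-level indicator's rows up into a single value could look like the snippet below. The metric names, the 1–5 scale, and both aggregation choices (mean vs. worst-case) are invented for the example, not what the actual sheet uses.

```
# Toy roll-up of per-metric scores into one value per high-level indicator.
# Scores use a hypothetical scale: 1 = healthy ... 5 = concerning.
from statistics import mean

scores = {
    "security":    {"open CVEs": 4, "dependency freshness": 3, "last security review": 5},
    "testability": {"unit test coverage": 2, "CI enabled": 1},
    "activity":    {"commits last year": 3, "open patches": 2},
    "stewardship": {"active maintainers": 4, "bus factor": 5},
}

def rollup(metrics, how="mean"):
    """Collapse a category's metric scores into one value: average or worst-case."""
    return max(metrics.values()) if how == "worst" else round(mean(metrics.values()), 1)

for category, metrics in scores.items():
    print(f"{category:12} mean={rollup(metrics)}  worst={rollup(metrics, 'worst')}")
```

A worst-case roll-up keeps a single very poor metric visible, while a mean smooths it out; which reads better probably depends on how the heatmap is meant to be used.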
Oct 12 2023
@acooper I think I need edit privileges to modify the sheet. I'd like to make a copy of the 'Matrix' sheet and tweak it.
Jun 5 2023
The policy draft is now publicly available for feedback on meta-wiki. Hope to hear your thoughts there!
My main point is that, since we now have the stats above and they point to fonts as among the main third-party resources people load, we should make sure we try to address that in some way (find a reasonable way to enable that use case without unreasonably raising the risk). It's currently bright-line prohibited, but the fontcdn solution has properties that reduce that risk to what I assert is an acceptable level. It's just not zero, which is what the current bright-line policy requires.
May 31 2023
Hello — just a heads up that the policy draft will be released publicly for discussion next week, on June 5th, as part of the official consultation. When the policy discussion opens, there will be an announcement through the usual channels: wikimedia-l, IRC, etc. You can find more details about the upcoming steps and dates of the consultation in the subtask T337863.
May 30 2023
Hey @ldelench_wmf, the Security team's review is complete. You'll find below our feedback:
May 3 2023
Hey there -- just a heads-up that I have started compiling some data on Gadgets and User scripts loading third-party resources across Wikimedia projects in T335892. This may help get a sense of the impact of the policy. The initial data is probably off/incomplete, so any ideas for getting more accurate data are warmly welcome :)
Mar 30 2023
My recommendation from a data privacy perspective is to show aggregated data only and keep the PII in the back end for 90 days. During that window, participants can update their answers; after it, we anonymize the PII and keep only the aggregated data.
As for aggregated data, I recommend reporting out only when we have more than x persons in a [sub]category; below that, we could either not report out at all or report a combined bucket, for example "other <x".
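To make that rule concrete, here is a toy Python illustration; the counts are made up and the threshold of 10 is just an example value for x.

```
# Toy illustration: suppress small [sub]categories and report them only as a combined bucket.
from collections import Counter

MIN_COUNT = 10  # the "x" above; may be higher (e.g. 20) if statistical power matters

raw_counts = Counter({"category A": 57, "category B": 23, "category C": 6, "category D": 3})

def aggregate(counts, min_count=MIN_COUNT):
    reported, other = {}, 0
    for category, n in counts.items():
        if n >= min_count:
            reported[category] = n
        else:
            other += n  # never reported individually, only in the combined bucket
    if other:
        reported[f"other (<{min_count} each)"] = other
    return reported

print(aggregate(raw_counts))
# {'category A': 57, 'category B': 23, 'other (<10 each)': 9}
```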
Mar 28 2023
Usually, with PII data on persons, we set a minimum count for calculating averages so the data cannot be disaggregated and individual persons identified.
Essentially, such data should not be disaggregated at small numbers. A common standard is 10, though it is usually 20 when statistical power to detect differences is needed. So, for example, if you have <10 persons in a [sub]category, then you don't report out on that [sub]category.
Security and GDI teams may be able to provide additional insights and feedback.
Hey @Iflorez, how will that work if event organizers are able to view data at an individual level anyway (F35861130)? Will the detailed view of the Participants tab be unavailable once data is aggregated, after the 90-day window?
Mar 7 2023
Thanks for the ping @sbassett. We could borrow some ideas from the generic message currently displayed when logged-in users visit external links, and from a privacy notice (T65598#6914486) provided by WMF-Legal. Privacy best practices encourage both brevity and clarity of notices. So, a more privacy-conscious message could be something along these lines:
Mar 6 2023
Hey everyone, I agree that having the Security-Team review every single Gadgets and User script would not be scalable or even realistic.
Tagging Privacy Engineering for an opinion/risk rating about the following. I'm not certain there's precedent for this on Wikimedia production or that wmcs would completely satisfy any privacy concerns for proposed, embedded content like this.
Mar 3 2023
Hello @Skizzerz, is there a publicly accessible repository for the source code of https://owidm.wmcloud.org?
Nov 28 2022
@jrbs was added as a maintainer to the NDA bot, see https://toolsadmin.wikimedia.org/tools/id/tsbot. Also, the code was moved to Wikimedia's GitLab instance: https://gitlab.wikimedia.org/repos/security/tsbot-nda
The repository was imported to Wikimedia's GitLab instance: https://gitlab.wikimedia.org/repos/security/tsbot-nda
Nov 24 2022
I examined the proposed API through common privacy risk categories:
Nov 8 2022
Hey @ldelench_wmf, I have no objections to closing this one, thanks.
Nov 1 2022
Hello, some quick updates.
Oct 25 2022
@ifried, the Security-Team hasn't gotten the chance to discuss the mitigating options surfaced in the Google Docs conversation. Meanwhile, I would like to keep the ticket open and update it once we've made some progress.
Oct 20 2022
Hey @ifried, the Privacy Engineering review is complete. Could you take a look at our conclusions and address any potential misunderstanding there? https://docs.google.com/document/d/1lFeq7jtUCmXdwoKwIfqgO-74ccTU0kBtX7zkJkeMByw/edit#?
Oct 14 2022
Hello @ifried, Privacy Engineering will start looking into this as part of our current sprint. On a side note, I am aware that the previous features have been looked at by WMF-Legal. For this additional feature, are you having any conversation with Legal in parallel?
Oct 4 2022
From a privacy angle, I see no concerns here. The output of https://netbox.wikimedia.org/metrics and https://toolhub.wikimedia.org/metrics does not appear to contain any identifying information.
@sbassett, let me know if I should leave a separate comment on T318839 or the parent ticket as well.
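For illustration, a quick spot check of a Prometheus-style /metrics endpoint for obviously identifying strings could be scripted roughly as below. The patterns are crude heuristics (the IPv4 one will false-positive on version-like labels), and this is not the method used for the review above.

```
# Heuristic spot check of /metrics output for identifying data (illustrative only).
import re
import requests

PII_PATTERNS = {
    "ipv4": re.compile(r"\b(?:\d{1,3}\.){3}\d{1,3}\b"),
    "email": re.compile(r"[\w.+-]+@[\w-]+\.[\w.-]+"),
}

def scan_metrics(url):
    text = requests.get(url, timeout=10).text
    findings = []
    for lineno, line in enumerate(text.splitlines(), 1):
        if line.startswith("#"):  # skip HELP/TYPE comment lines
            continue
        for name, pattern in PII_PATTERNS.items():
            if pattern.search(line):
                findings.append((lineno, name, line[:80]))
    return findings

for endpoint in ("https://netbox.wikimedia.org/metrics",
                 "https://toolhub.wikimedia.org/metrics"):
    print(endpoint, "->", len(scan_metrics(endpoint)), "potential matches")
```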
Jul 12 2022
@Aklapper, sure thing. It used to be a private repo for which I was the sole maintainer when I was still in Trust-and-Safety. Moving it to GitLab makes sense, but I'm not sure which project would be suited for it. I don't currently see anything related to Trust & Safety. Any suggestions?