@Pcoombe my bad. done now.
May 5 2021
Apr 16 2021
Not sure why it didn't work, but thanks for walking through it with me yesterday. Added email@example.com to the domains you requested, @Pcoombe.
I'm confused by this task. There's a big leap between "see how many edits are coming from users without JS" and "create instrumentation not affected by ad blockers".
We should not work on anything that attempts to bypass ad blockers. Not only is it adversarial to users, it's also a big waste of effort: blocking is much easier than circumventing the blocking. I have no objection to measuring actions by no-JS users, e.g. to prioritize what we work on, but the way this is phrased gives me reservations.
Mar 10 2021
Mar 9 2021
Mar 5 2021
Hey folks, I just wanted to chime in here and say that from a product perspective, continuing to pay any cost or effort to support browsers that we don't support on our flagship products feels like something we should stop. I understand that third-party wikis might choose to make different decisions about TLS support, but the value of increasing development velocity and improving the user experience on Wikipedia (visited by 1.5B devices a month) vastly outweighs the negative consequences. Let me know how I can help communicate this if you feel a communication is necessary.
Feb 18 2021
Added the need for this to be split by country. Not an absolute requirement, but a pretty important one. 1% of edits coming from no-JS users is low overall, but if it's 50% in Afghanistan, then that makes a difference.
Reopening because we really do need an answer to this question (and the previous answer was marred by high ad blocker penetration). Let me know if that's not the right protocol.
@sdkim @kzimmerman this is the first step towards really answering T240697. Since the answer to T240697 was undermined by the ad blockers, I'm going to reopen that ticket for now rather than open a new one.
Feb 8 2021
Feb 2 2021
Jan 22 2021
Dec 17 2020
@RLazarus approved, thanks!
Dec 4 2020
Dec 2 2020
@Krinkle Thanks for clarifying. The platform breakout explains the discrepancy we had. Sorry the Superset link didn't work; it works for me, so it might be an auth issue with Superset.
Nov 30 2020
@Krinkle quick note re: the above. When you filter out bots and spiders by setting the "agent" dimension to users, the IE11 percentages are actually significantly lower. Which is good! https://w.wiki/oqf. When you open it up to the top 1000 versions, which is possible in Superset, the overall IE11 % for last quarter seems to be 0.93% (superset link)
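For anyone reproducing this, the agent filter described above can be sketched like this. The column names and sample numbers here are illustrative assumptions, not the actual Superset/pageviews schema:

```python
import pandas as pd

# Illustrative sample only; the real data lives in the pageviews dataset
# in Superset, and these column names are assumptions, not the schema.
df = pd.DataFrame({
    "browser_family": ["IE", "Chrome", "Firefox", "IE"],
    "agent": ["user", "user", "user", "spider"],
    "view_count": [93, 8000, 1907, 500],
})

# Mirror the "agent" dimension filter: keep human traffic only.
users = df[df["agent"] == "user"]

ie_share = (
    users.loc[users["browser_family"] == "IE", "view_count"].sum()
    / users["view_count"].sum()
)
print(f"{ie_share:.2%}")  # 0.93% with this sample
```

The point is just that bot/spider rows inflate the denominator unless they are filtered out first.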
Sep 28 2020
Sep 23 2020
Jul 21 2020
Jul 13 2020
Jul 6 2020
Jun 9 2020
Jun 8 2020
@elukey Ahh, bummer! Thanks for looking into it.
Jun 5 2020
@elukey I seem to have lost the ability to un-pin items from showing up in the graph. It's hard to describe, but using the colored squares seen in this screenshot, I could filter items in/out of the chart without using the formal filter function. This was incredibly handy when looking at large breakdowns like the one shown. I'm hoping it's a config setting that you can change.
Apr 20 2020
Apr 14 2020
Mar 19 2020
this was resolved during the OCG replacement
the reading team is no more
this came out of an old strategy session
Mar 17 2020
Mar 5 2020
@kzimmerman Just closing the loop here publicly. Kate and I resolved this offline the week of Feb 17th.
Feb 18 2020
@kzimmerman and @cchen Thank you for this incredible summary!! 12% is huge. Before deciding whether or not to table this, can someone on the team run through this list of the top 30 and decide whether they are bots? Judging from the first username, which has 10M edits, more than half of the edits (or 6% of the total) are unidentified bot edits. If this is the only bot on the list, then the short-term solution is simply registering it as a bot. But if there are more in that top 30, then we might need something more. Does that sound fair? Here is a sheet where I started; it's probably an hour of work for someone familiar with edit histories.
Feb 13 2020
Dec 14 2019
Nov 25 2019
Nov 1 2019
@kzimmerman pinging @dr0ptp4kt and @kaldari, as my understanding was that, unfortunately, we do care about people who have turned off JS.
We anticipate equal or greater concern about (and from) folks who turn off JS intentionally, so getting the edit % (in particular) from that group was primarily to understand that impact.
Oct 25 2019
Oct 24 2019
Whether it is on by default or not, we should probably turn one off if we turn the other on. I just had the confusing experience of turning on reference previews and seeing both versions appear. It is ugly. I started writing up a bug report before I figured out what was going on.
Oct 11 2019
@elukey Thanks for making this adjustment! That makes a huge difference.
Oct 5 2019
Oct 4 2019
Reopening. This is happening for other metrics now, @Nuria.
Sep 19 2019
Great data. Thanks for sharing!
Sep 6 2019
@Nuria Apparently it's too long for a shortener (hah!). The name of the chart is "Pageviews by browser family" and I am the creator. You should be able to find it from that, but let me know.
Sep 4 2019
@Nuria it's not working even on 1 year. I should add that it used to work.
Sep 3 2019
Aug 29 2019
Aug 26 2019
@Nuria It's fixed for me! 😍 Thanks!
Jul 2 2019
I'm on Mac OS 10.4 (Kate's out this week).
Jul 1 2019
Still happening. This is me:
Jun 25 2019
Jun 3 2019
@Nuria Thanks! Please let me know if/when the +10000 edits fix gets scheduled or done.
May 29 2019
Quick questions regarding the user tenure bucket.
May 22 2019
May 6 2019
May 3 2019
@chelsyx I don't know what wonderful deed I did to deserve the insights derived here. This is gold. Thank you. It actually upends some previous 3rd party data I had been using about the relative traffic on "geography" pages.
Apr 25 2019
Mar 22 2019
Congratulations to Noe and the other French Wiktionarians on your huge birthday!!!
Jan 24 2019
Jan 22 2019
- I saw in the ticket about the bug that hundreds of duplicate events were being sent.
- If we can detect them, is there a way to correct for them?
- If there are hundreds, what % of total events is that? The data doesn't have to be perfect, so if it's less than 5% of events of any given event type, I wouldn't let it get in the way of analysis.
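The correction asked about above could be as simple as deduplicating on a per-event identifier. A minimal sketch, assuming a unique `id` field exists on each event (the real event schema may differ):

```python
# Sketch of correcting for duplicate sends by deduplicating on a unique
# event id. Field names here are hypothetical, not the real schema.
events = [
    {"id": "a1", "action": "click"},
    {"id": "a1", "action": "click"},  # duplicate send
    {"id": "b2", "action": "click"},
    {"id": "c3", "action": "hover"},
]

seen = set()
deduped = []
for e in events:
    if e["id"] not in seen:
        seen.add(e["id"])
        deduped.append(e)

# This also answers the "what % of total events" question directly.
dup_rate = 1 - len(deduped) / len(events)
print(f"{dup_rate:.0%} of events were duplicates")  # 25% in this sample
```

The same pass that drops duplicates yields the duplicate rate, so both questions can be answered at once.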
Jan 18 2019
Jan 9 2019
Dec 7 2018
Yeah, I've only gotten it to work well by looking at 1 year with the previous year overlaid. I tried to create a "periodicity pivot" chart but haven't managed it (or learned what that means); it sounds cool, though.
Makes sense. Thanks for clarifying!
@Nuria while bot detection certainly plays a role, I am nervous about classifying this as an issue that can be more or less fixed with better bot detection. Other sites have something like 5% of their external referrals coming from none + unknown (which includes direct), and we have 29% (in the last year). Our "referrals" from other sites are 14% of external traffic, for comparison. Even if 10% of TOTAL traffic were undetected bots (yikes), it wouldn't make up this difference. Also, the % of "none" traffic seems to have been dropping over the last few years, while one would expect undetected bots to inflate over time (assuming a static bot definition).
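A back-of-envelope check of the claim above, using only the figures quoted in this comment and treating both percentages as shares of total external traffic for simplicity (an assumption):

```python
# Figures quoted in the comment above.
none_unknown_share = 0.29   # our none + unknown referrals
peer_share = 0.05           # comparable sites
bot_share = 0.10            # pessimistic undetected-bot estimate

# Worst case for the bot explanation: assume every undetected bot lands
# in the none/unknown bucket, remove them from bucket and total alike.
adjusted = (none_unknown_share - bot_share) / (1 - bot_share)
print(f"{adjusted:.1%}")  # ~21.1%, still far above the ~5% peer figure
```

Even under the most generous bot assumption, the adjusted share stays several times the peer benchmark, which is the gap the comment is pointing at.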
Dec 5 2018
Nov 1 2018
Oct 18 2018
Oct 5 2018
@AfroThundr3007730 Hi! I don't think we've crossed paths yet, but you are clearly very active on both the wikis and phab. I'm a director of product at the foundation and the interim manager for the product analytics team. Thanks for reaching out and for drafting such a well-thought-out request. As you likely suspected, the team is incredibly overloaded right now: our current mandate is to provide guidance to the product development teams here at the Wikimedia Foundation, and all of them have unanswered questions and would like more support from us.