
[SDS 1.2.3] Develop a working definition for moderation activity and moderator
Closed, Resolved · Public

Description

Develop:

  • A working definition for "moderation activity". Based on a comprehensive literature review that includes academic and product-related work, we operationalize this definition in a detailed list of moderation actions.
  • A working definition for "moderator", based on the "moderation activity", that meets essential metric criteria. We define Moderators as the human actors responsible for the social, technical and governance work needed to sustain an online community, including the creation, revision and enforcement of community values, rules, and norms.

Scope for this task

  • Only consider Wikipedia moderation activity.
  • It is ideal that the definition works for all Wikipedia languages. However, if you run into challenges that make scaling to all languages in one quarter hard, it is okay to scale down. Follow the escalation process below.
  • Output will be based on processing offline data (dumps) and will not be in production at delivery time (Decision made on October 9th)
  • Main namespace only (per November 22nd update by Diego and a slack thread between Diego, Marshall and Leila)

Output
If a change in expectation with regard to outputs is needed, @diego can discuss it with Leila and Marshall. As the team works on the project, it may want to propose other outputs that are better than what we imagined below. This is very much welcome. Diego will let Marshall and Leila know about such proposals.

  • Report: available in this Google Doc.
  • Data:
    • Classified revision IDs (examples). Ideally: a research tool that, given a revision ID, outputs the classification of relevant edit types. If this is not possible: at least a spreadsheet that lists hundreds of diffs and their classifications, spanning different kinds of diffs and users.
    • Aggregates. A spreadsheet that we can pivot to count the number of moderators and the number of each type of moderation activity against dimensions such as editor tenure, extended rights, edit count, language, and year. Additionally, it would be good to be able to pivot at the user level to pull data such as "for a random sample of 1000 users, how many edits of each type have the users made over a span of time".
      • Spreadsheet: aggregate data broken out by editor cohort (year of registration, edit count bucket), whether the edit was counted as moderation or not, and whether the edit was a revert or itself reverted.
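The kind of pivoting described for the aggregates can be prototyped in plain Python before committing to a spreadsheet layout; the sketch below uses an illustrative record shape (field names are assumptions, not the project's actual schema):

```python
from collections import Counter

# Each record is one classified edit. The field names are illustrative
# assumptions, not the project's actual schema.
edits = [
    {"user": "A", "wiki": "eswiki", "year": 2024, "edit_type": "revert", "is_moderation": True},
    {"user": "A", "wiki": "eswiki", "year": 2024, "edit_type": "copyedit", "is_moderation": False},
    {"user": "B", "wiki": "enwiki", "year": 2024, "edit_type": "template_patrol", "is_moderation": True},
]

def pivot(records, *dims):
    """Count records grouped by the given dimension names."""
    return Counter(tuple(r[d] for d in dims) for r in records)

# Moderation edits per wiki
mod_by_wiki = pivot([r for r in edits if r["is_moderation"]], "wiki")

# Per-user breakdown of edit types ("how many of each edit type has a user done")
by_user_type = pivot(edits, "user", "edit_type")
```

The same `pivot` helper answers both spreadsheet questions: counts per dimension, and per-user breakdowns of edit types.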

At delivery time, and for the languages for which you can offer the working definitions, we should be able to know:

  • if an edit is a moderation-related edit and, if so, what kind of moderation it is.
  • who the moderators are, using the previous information.

How to escalate
If you are blocked, please escalate to @leila and keep Kenyatta (our program manager) in the loop. If you need additional resources, same. If you need to further escalate, please escalate to Kate Zimmerman. Please don't be shy.

Stakeholder
There are multiple groups and people who will benefit from this work and who can/should be consulted for it. The stakeholder whose input is needed for scoping and who will need to sign off on the work at the end of the quarter is @MMiller_WMF, who has asked for this work to advance WE 1.3 (in this FY and future ones).

Confirmed list of direct contributors
This list can expand as the project progresses and more needs are identified.
@cwylo, @Isaac, @Pablo,
Research Engineering: the support is cleared by @XiaoXiao-WMF. As soon as you know more specifics about your needs and timelines, @diego, please request support directly from Xiao, and she will assign Fabian or Muniza.

Confirmed list of folks available for consulting
@YLiou_WMF: for sampling (if more survey is needed), creating connections between T370439 and possible survey needs for this task, sharing learnings from T368791
@Easikingarmager: for sharing learnings from T368791 and drawing connections between the two projects as relevant
@KCVelaga_WMF: for supporting Isaac on the essential metric piece.
@Samwalton9-WMF: for helping us to produce outputs relevant for product
@OTichonova: supporting on the deliverables

Details

Due Date
Dec 17 2024, 12:00 AM

Related Objects

Event Timeline

leila triaged this task as High priority.
leila moved this task from Backlog to FY2024-25-Research-October-December on the Research board.
leila set Due Date to Dec 17 2024, 12:00 AM.
leila renamed this task from Develop a working definition for moderation activity and moderator to [SDS 1.2.3] Develop a working definition for moderation activity and moderator. (Oct 8 2024, 11:32 PM)
leila updated the task description.
leila added a subscriber: YLiou_WMF.

Progress update on the hypothesis for the week

  • We have defined the list of participants in this project. Apart from myself, this includes three people from research (Isaac, Pablo and Yu-Ming), two from design research (Claudia and Eli), one from the Moderation Tools team (Sam), one from Product Analytics (KC) and one from Product Design (Olga T.)
  • Given the size of the team, we decided to split the work into two branches: a qualitative piece led by Claudia, and a quantitative one led by Isaac.
  • Together with the KR owner (Leila), we defined the hypothesis as: If we combine existing knowledge about moderators with quantitative methods for detecting moderation activity, we can systematically define and identify Wikipedia moderators.

Any new metrics related to the hypothesis

  • Not yet.

Any emerging blockers or risks

  • I need to talk with Claudia Lo to clarify whether she would need Design Researcher / Contractor support.
  • Time is super short considering this is a short quarter and we have the offsite. Our approach will be to come up with a working definition that allows us to identify and measure moderators in offline data (dumps) as the goal for Q2.

Any unresolved dependencies

  • No

Have there been any new learning from the hypothesis?

  • We are starting from relevant work done in the past by different teams, especially research and design research. We think that by combining user studies of admins and patrollers with the edit types work and Knowledge Observatory learnings, we are going to be able to come up with a working definition for moderators.

Have there been any changes to the hypothesis scope or timeline?
No

Next Steps

  • I am going to meet the stakeholder (Marshall) to set expectations and deliverables.
  • Meet with subteam leaders (Claudia and Isaac) to discuss the timeline and milestones.

@diego thank you to you and everyone in the team, as well as Marshall, Mikhail, Miriam, Rita and Xiao who worked behind the scenes to secure the resourcing for this hypothesis. I appreciate everyone's proactive approach in offering time and resources.

In your weekly update you mention that time is "super short". I am with you: given that you all have roughly 8 remaining weeks to work on this particular hypothesis, and that this is not the only hypothesis many of you are working towards, scoping is important. I'm in support of your decision to focus on offline data. If you need to make other scoping decisions (which I suspect you will run into), please bring them up and I'm happy to make decisions on your proposals. I also want to ask that you monitor stress in yourself and the team for this project (in addition to our managers checking in with folks on energy and stress). We need to scope the work in a way that is reasonable to deliver in 8 weeks. So if ever you and the team feel very stressed, let's talk. There are different ways we can explore (the least of which is scoping down).

And with no intention of creating stress and only to help you believe in your power: I trust in the collective power of all of you who are working on this hypothesis for the coming 8 weeks. There is a lot of talent, knowledge and drive in this group. I am looking forward to learning with you all and supporting you in this hypothesis. This hypothesis is coming to life at an important stage for the organization. We know we need to invest more in supporting moderators and patrollers and we cannot be effective in that work if we can't even identify these folks across languages in a systematic and scalable way.

Thank you @diego and the team. <3

Hi @leila, thanks for your words.
I'm optimistic about the project finishing on time; as you said, we have a great team. I just highlighted the time constraints to explain why we are focusing on offline data this quarter. However, depending on how these definitions are going to be used in the future, it would be interesting to think about how they can work with live data (like real-time monitoring); for now, this is out of scope for this quarter.

Weekly update:

Progress update on the hypothesis for the week

  • Working together with the KR owner (Leila) and stakeholder (Marshall), we have clarified the expected outputs for this hypothesis:
    • Report: A Research page on meta-wiki that contains important information about the project and routes to other important project assets (spreadsheets, presentations, Phabricator ticket, etc.).
    • Data:
      • Classified revision IDs (examples). Ideally: a research tool that, given a revision ID, outputs the classification of relevant edit types. If this is not possible: at least a spreadsheet that lists hundreds of diffs and their classifications, spanning different kinds of diffs and users.
      • Aggregates. A spreadsheet that we can pivot to count the number of moderators and the number of each type of moderation activity against dimensions such as editor tenure, extended rights, edit count, language, and year. Additionally, it would be good to be able to pivot at the user level to pull data such as "for a random sample of 1000 users, how many edits of each type have the users made over a span of time".
  • We are going to work with the same list of wikis as SDS 1.2.2: enwiki, eswiki, frwiki, idwiki, ruwiki and arwiki.
  • We have defined a timeline and responsible parties for the deliverables:
Deliverable | Date | Responsible
State of research (qualitative) | Nov 1, 2024 | @cwylo
State of research (data) | Nov 1, 2024 | @Isaac
Moderators definition proposal | Nov 22, 2024 | @cwylo
Metrics stats | Dec 2, 2024 | @Isaac
Report proposal | Dec 6, 2024 | @diego
Discussions and improvements | Dec 6–12, 2024 | Team + KR owner + Stakeholder
Final report | Dec 17, 2024 | @diego
  • Quantitative work highlights: the sub-team has collected an initial set of packages and features that have been used to measure activities related to moderation (e.g. patrolling, admin work). Details can be found in T377324#10233727.
  • Qualitative work highlights: Claudia Lo and Olga Tichonova have started collecting related work.

Weekly update:

  • Qualitative report (T376945#10263033): we are moving from compiling sources to writing the report, in the form of a wikitext article.
  • To note - we are trying to reconcile the broader use of the term "moderator" as well as academic definitions of "volunteer community moderator", with the goals of this project; most of the extant qualitative work on the topic tends not to clearly define "moderator work" or "moderator actions" outside of very broad general actions (i.e. content removals and user removals, equivalent to deletes and blocks).
  • Quantitative work highlights are in T377324#10258839 - in short, explorations of the logging table and edit actions for the chosen wikis, as well as looking at log actions that appear outside that table, such as flaggedrevs.
  • Potential future blocker from the previous item:
    • "initial indications is that doing any useful analysis of edit actions will depend on overcoming two major technical limitations"
  • We're considering a multi-stage definition, where we define a "core" of moderator-type users and moderator actions, but also a periphery of related user types/user groups and actions. This may give us more flexibility given the many permutations of user groups, group rights, and the contextual nature of actions that are surfaced in various log actions. While such a definition might be more complicated to present, it will let us better describe the realities of our users while providing enough structure for its intended purposes.
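The "core plus periphery" idea in the last bullet could be encoded as a small classifier. A minimal sketch — the group names and activity threshold below are purely illustrative assumptions, not the definition the team settled on:

```python
# Illustrative "core vs. periphery" classification. The group names and
# the activity threshold are assumptions for the example only, not the
# project's actual definition.
CORE_GROUPS = {"sysop", "oversight", "checkuser"}
PERIPHERY_GROUPS = {"patroller", "rollbacker", "abusefilter"}

def classify_user(user_groups, moderation_edit_count, threshold=10):
    """Place a user in the core, the periphery, or neither."""
    groups = set(user_groups)
    if groups & CORE_GROUPS:
        return "core"
    if groups & PERIPHERY_GROUPS or moderation_edit_count >= threshold:
        return "periphery"
    return "none"
```

A staged lookup like this keeps the flexibility mentioned above: membership in a core group is decisive, while periphery status can come either from a group right or from observed activity.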

Progress update on the hypothesis for the week

  • We have finalized the first sprint of this project, reaching the two milestones we set as goals for this period:
    • State of Research (Qualitative): a summary of the qualitative related work relevant to this project. This report provides a first approach to describing moderation activities on Wikipedia.
    • State of Research (data) spreadsheet: This is a spreadsheet that summarizes existing structured data that can be used to measure moderation actions. The spreadsheet has tabs broken out for log actions, edit actions (namespace 0), edit actions (other namespaces), and edit metadata. Details can be found in T377324#10281670
  • With these deliverables we have started the work of understanding how the moderation activities described in the qualitative research can be measured with the existing data and which data or models we are missing for this purpose.
  • Our main conclusion is that, although existing data allows us to give a general picture of some moderation activities, if we want to be able to measure moderation at the revision level (i.e., classify each revision as moderation or not), we need to extend the edittypes library to understand more moderation actions. The specific requirements for this work are described in T378617. I'm going to use that task to coordinate this work, which requires the support of research engineers.

Any new metrics related to the hypothesis

  • Nothing beyond the list of potential metrics described on the State of Research (data) spreadsheet.

Any emerging blockers or risks

  • We urgently need to unblock the creation of HTML diffs (T360794).
  • We are working on a transitory solution (T378761), but the sustainability of this work relies on unblocking the first one.

Any unresolved dependencies

  • No

Have there been any new learning from the hypothesis?

  • We have learnt that we need HTML dumps to be able to capture all moderation activities, due to the complexities of parsing transclusions (templates) in the existing wikitext dumps.

Have there been any changes to the hypothesis scope or timeline?

  • Not at this moment.

Next steps

  • We need to move forward with T378617.
  • The quant and qual teams need to keep discussing the best ways to implement the qualitative outputs in quantitative metrics.
  • We are going to have a meeting with the KR owner and stakeholder.

@diego Thank you for the update.

Re T378617: Do you have dedicated research engineering resources for this? If yes, can you assign the task to the person and also set the priority of the task to high? Otherwise, @XiaoXiao-WMF, please help.

Re T360794: Please ping @VirginiaPoundstone and let her know that you have a dependency. Also let her know who on your end is the point of contact for scoping or follow-up questions. My ask is to be very clear about your need (through the poc) and the deadline by which you need a solution (your responsibility:). I won't immediately take action on my end given that resourcing is the hypothesis owner's job. Escalate to me if you need help.

Progress update on the hypothesis for the week

  • We met with stakeholder and KR owner to show current progress, and discuss about next steps:
    • We are going to improve the edittypes library to collect most of the edit actions listed on this spreadsheet. (Please read the second bullet point of the blockers section below to understand the limitations on this.)
    • The qual and quant teams are going to discuss and prioritize the most relevant actions (e.g. adding a certain template).
    • The first week of December, I'm going to share these results with Marshall and give a week for feedback (e.g. prioritizing certain actions).
    • We are going to implement the feedback and produce the final data outcomes.
  • Pablo continued reviewing extensions to determine if there were any major sources of log data that we were missing. He identified four major ones, confirmed with Sam Walton to make sure they are included (sheet):
    • abuse_filter_history table (AbuseFilter)
    • cu_log table (CheckUser)
    • flaggedrevs table (FlaggedRevs)
    • pagetriage_log (PageTriage)
  • Olga has extended the State of Research (Qualitative).
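The four extension log sources above can be kept behind a single lookup, so downstream queries know which tables exist on a given wiki. A minimal sketch — the table names come from the list above, but the helper itself is hypothetical:

```python
# Maps each MediaWiki extension to the extra log table it writes,
# per the list above. The helper is a hypothetical convenience,
# not an existing API.
EXTENSION_LOG_TABLES = {
    "AbuseFilter": "abuse_filter_history",
    "CheckUser": "cu_log",
    "FlaggedRevs": "flaggedrevs",
    "PageTriage": "pagetriage_log",
}

def extra_log_tables(installed_extensions):
    """Return the extension log tables available on a given wiki."""
    return sorted(
        EXTENSION_LOG_TABLES[ext]
        for ext in installed_extensions
        if ext in EXTENSION_LOG_TABLES
    )
```

This matters because different wikis run different extensions, so the set of queryable log tables varies per language edition.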

Any new metrics related to the hypothesis

No

Any emerging blockers or risks

  • We decided to create an ad-hoc dataset for this quarter (T378761), so I'm removing T360794 as a blocker. However, it is important to call out that the sustainability of this work relies on unblocking that task; without it, it will be difficult to continue producing metrics for future iterations of this project.
  • The complexities around getting HTML data are making us rely more heavily on logged actions (see above). The completeness of the data we will be able to extract from edit actions will depend on how fast we are able to proceed with T378617 and T378761, which is difficult to estimate given this is the first time we are working with this data. However, as a backup solution, we can at least cover the actions that the current edittypes library is capturing. I'm going to talk with Isaac and Claudia to estimate the coverage (of the total moderation actions) that this solution would provide.

Any unresolved dependencies

  • No

Have there been any new learning from the hypothesis?

  • No

Have there been any changes to the hypothesis scope or timeline?

  • Not at this moment.

Progress update on the hypothesis for the week

  • The whole team has met and made some relevant decisions, reflections and definitions:
    • As part of the discussion, we realized that there are a good number of actions that we have very limited ability to measure. A good example of this is checking diffs to determine if an edit should be reverted or otherwise moderated. On wikis with FlaggedRevs or similar tooling for marking edits as "patrolled", we'd expect that diffs are relatively unlikely to be viewed after they have been marked.
    • The addition/removal/changing of patrolling templates are the main edit change we're looking to detect in diffs (reverts are handled via edit metadata).
    • We explicitly chose to exclude a few things that might be seen as borderline patrolling -- e.g., category changes, reference changes. These feel more like administrative/maintenance work.
    • We also are not looking at pages outside of the main namespace.
  • Qualitative work:
    • We have differentiated between "moderation" and "maintenance".
      • Moderation is focused on the social and governance work needed to sustain an online community. This entails the creation and revision of community values, rules, and norms, and the social work required to support this (e.g. guiding discussion, modeling norms) in addition to the technical work of enforcing the space’s boundaries (by removing content or users that fall outside of these boundaries). Moderators are the human actors that carry out moderation actions with the intent of performing moderation work. Note that non-human actors (such as bots) can carry out moderation actions, but we restrict the definition of "moderator" for our purposes to human actors since it is reliant on the subjective intention of the human taking the action, or creating/directing/modifying the non-human actor that takes the action.
      • Maintenance is the technical activity that allows the community space to exist in its current or desired form, focused on the creation and ongoing maintenance of the infrastructure that facilitates regular activity in the community. Good examples of maintenance work that is also moderation work include: the creation of templates or bots that facilitate a policy on a wiki (e.g. archival bots, Articles-for-Deletion templates, creation of maintenance categories). Non-moderator maintenance work might include things like renaming pages in accordance with the Manual of Style, gadget maintenance, contributions to MediaWiki, etc.
    • These definitions help us narrow our focus to moderation actions. We are currently working on finalizing this list of actions. The current list can be found here.
  • Quantitative work:
    • We are working on building a dashboard to visualize moderation actions, aggregated by project, covering what can be captured with the existing logged actions.
    • We have started working on producing the HTML dataset to capture certain actions (for details, check T378761).
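Detecting the addition or removal of patrolling templates, the main diff-level signal mentioned above, amounts to comparing the template names found in two revisions. A simplified sketch that ignores nesting and template redirects, using a made-up watchlist (the real list is the one the team maintains):

```python
import re

# Hypothetical watchlist of patrolling-related templates; the real list
# is the one the team maintains in its spreadsheet.
PATROL_TEMPLATES = {"POV", "Citation needed", "Speedy deletion"}

def template_names(wikitext):
    """Extract top-level template names (ignores nesting and redirects)."""
    return {m.strip() for m in re.findall(r"\{\{\s*([^|{}]+?)\s*[|}]", wikitext)}

def patrol_template_changes(old_text, new_text):
    """Report patrolling templates added or removed between two revisions."""
    old, new = template_names(old_text), template_names(new_text)
    return {
        "added": (new - old) & PATROL_TEMPLATES,
        "removed": (old - new) & PATROL_TEMPLATES,
    }
```

A production version would need to handle nested templates, template redirects, and localized template names per wiki, which is where parsing the HTML dataset becomes useful.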

Any new metrics related to the hypothesis
No

Any emerging blockers or risks

  • As mentioned in previous updates, we are operating with tight deadlines. There is no buffer time in case of problems (technical or in the team).
  • In order to meet the deadlines, we are approaching the work as an iterative process, where we first focus on moderation actions that are easy to capture from metadata (e.g. the logging table), and then move to more complex signals (e.g. usage of certain templates).
  • The subteams (qual and quant) are working in parallel. This is not ideal, because feedback between teams is important to improve both outcomes, but it allows us to deal with limited time and mitigate potential blockers that may appear. This might require future iterations to improve and harmonize their outcomes.

Any unresolved dependencies

  • No

Have there been any new learning from the hypothesis?

  • No

Have there been any changes to the hypothesis scope or timeline?

  • See the risks section

@diego thanks to you and the team for the work and the update.

  • Further scoping. I updated the task description to reflect the further focusing on "main namespace" for this quarter's work. It would be really helpful if in the hypothesis report at the end of the quarter, you can comment on what it takes to expand the definition beyond the main namespace if your group already has insights about it.
  • Your last update has surfaced some decisions or points of alignment between you, Marshall, and me. I have opened a thread (as you're aware) between the three of us and I expect that we can make further updates in this task in the week of December 2nd as a result of those conversations.
  • I appreciate you flagging the risks. We further talked about this and we made a couple of adjustments in the overall KR (SDS 1.2) planning to give you some more buffer for this hypothesis. You also have a plan B for this project. If more adjustments are needed, please let Kenyatta or myself know.

Thanks.

Progress update on the hypothesis for the week

  • This week most of the quantitative team was OoO (as planned), and it was also a short week (due to US holidays) for the qualitative team. However, we have made some relevant progress:
    • We have made significant progress on creating the HTML dataset to analyze moderation actions. This dataset will include revisions made during October 2024. The data collection process has started and we expect to have it ready this Monday.
    • We are iteratively refining the moderation action list. We have received inputs from Sam Walton on specific templates we should consider.
    • We have coordinated between teams and organized the sprint for the next week (see next steps).

Any new metrics related to the hypothesis

No

Any emerging blockers or risks
No

Any unresolved dependencies

No

Have there been any new learning from the hypothesis?

No

Have there been any changes to the hypothesis scope or timeline?

No

Next Steps

  • Refine the moderation action list.
  • Finalize the HTML dump.
  • Produce a draft of the data outcomes, including a summary of logged moderation actions per project and a spreadsheet listing revisions with semi-structured moderation actions (i.e., coming from the edittypes analysis).

Briefly describe what was accomplished over the course of the hypothesis work

  • For the first time, we established a formal definition of “moderators”: We define Moderators as the human actors responsible for social, technical and governance work needed to sustain an online community, including the creation, revision and enforcement of community values, rules, and norms.
  • To measure moderator activity, we drew on our prior qualitative knowledge of patrolling and admins work and conducted an extensive review of research literature and internal reports, resulting in a comprehensive list of 81 traceable moderation actions. We classified these actions based on their relevance to moderation, measurability, availability, and other dimensions.
  • We assessed the feasibility of measuring each moderation activity and decided to focus on 12 key actions for this hypothesis, measuring them across 13 different Wikipedia language editions: dewiki, arzwiki, plwiki, nlwiki, itwiki, frwiki, eswiki, svwiki, zhwiki, enwiki, jawiki, and ruwiki.
  • To measure the 12 key actions, we leveraged and expanded our previous work on edit classification (a.k.a. edit types) to distinguish between moderator and non-moderator edits.
  • Additionally, we needed to create ad-hoc datasets of HTML article versions (T380871) to capture complex moderation activities that are difficult or impossible to detect with existing data. To achieve this within a short timeframe, we leveraged previous work by research engineers.
  • Based on the work described above, we developed an initial approach to measure moderation activities, focusing on the 12 key actions and 13 language editions previously mentioned:
    • As a preliminary result, we found that moderation-related edits range from less than 1% in some editions, such as German (0.09%) and Polish (0.53%), to nearly 10% in others, like Russian (9.6%).
    • We also developed a prototype dashboard to track logged moderation activities, demonstrating its potential for monitoring moderation efforts within our infrastructure.
    • These results are a proof of concept, demonstrating the potential of measuring and tracking moderation activities. However, they should not be considered final, due to the limited number of actions tracked and the reliance on ad-hoc data, which is not available in our infrastructure.
  • More details can be found in the final report.

Major lessons

  • We observed promising results, demonstrating that measuring moderation activity is feasible. However, greater investment is needed to scale this work for production and achieve broader coverage of moderator actions.
  • There were several key components that had to come together in one place at one time to successfully work on this hypothesis in under 10 weeks:
    • Essential past work was critical. The achievements and limitations of this hypothesis are directly tied to the essential work our teams have been doing over recent years. For example, the basic research on edit types and the fundamental engineering work on HTML article versions were crucial for capturing moderation actions and delivering results within a short timeframe. Investment in basic and fundamental research is what will enable us to answer new and complex questions in the future.
    • Collaboration across teams and expertise was key. The successful delivery of relevant results relied on strong collaboration across diverse teams. The interaction of design and scientific researchers, data scientists, strategists, and research engineers—with product specialists previously exposed to research projects—enabled us to understand and model a complex process like moderation on Wikipedia in a short span of time, producing valuable outcomes for this project.

Potential next steps

  • The product stakeholder decides whether further tuning and work is needed for their applications based on this research. If the decision is to invest further, we need to:
    • Implement a stream of HTML content in production (T360794).
    • Potentially expand the edittypes library to include more moderation actions.
diego updated the task description.

@diego Thanks for this summary! I have a few questions:

  • The definition includes "creation, revision, and enforcement" - on the quantitative side of this work it seems like enforcement was the focus. I don't know if this is something you explored at all, but do you think there's a good way for us to track creation and revision of values, rules, and norms too? My initial reaction is to think about cataloguing policy and guideline pages and looking at substantive edits to those pages, but perhaps there's a better method.
  • In the moderation activities dashboard you include upload as a log type indicative of moderation activity - I wondered if you could explain the thought process on this one, because it's not immediately obvious to me that file uploads would be moderation.

As a preliminary result, we found that moderation-related edits range from less than 1% in some editions, such as German (0.09%) and Polish (0.53%), to nearly 10% in others, like Russian (9.6%).

  • Does this include bot edits? Just curious, as I notice that a substantial % of the Russian Wikipedia moderation activity comes from bots.

From the report:

eswiki % Moderation (considering revert-related): 35.71%

This seems huge! Especially compared to the values for other wikis. Do we have any insight on what this is driven by?

Some moderation actions may be very popular in some wikis, but marginal or even nonexistent in others, e.g., review in ruwiki (link), patrol in enwiki and frwiki (link).

I wonder if it's worth noting that this is in part a software-driven result. review is used heavily on ruwiki because it is a FlaggedRevs log, which many other wikis do not have. I believe patrol shows up for enwiki because of PageTriage, and frwiki because it's one of the only wikis in your sample using RCPatrol (each edit can be marked as patrolled). So popularity in these examples seems less about community/cultural decisions and more about what software a community has configured.

  • The definition includes "creation, revision, and enforcement" - on the quantitative side of this work it seems like enforcement was the focus. I don't know if this is something you explored at all, but do you think there's a good way for us to track creation and revision of values, rules, and norms too? My initial reaction is to think about cataloguing policy and guideline pages and looking at substantive edits to those pages, but perhaps there's a better method.

I think it would be possible to design some methods to get signals for these numbers; following your suggestion would be one approach. However, it is difficult to assess the relevance/impact of those edits. Probably a mix of metrics plus some (permanent) qualitative analysis would be required.

  • In the moderation activities dashboard you include upload as a log type indicative of moderation activity - I wondered if you could explain the thought process on this one, because it's not immediately obvious to me that file uploads would be moderation.

@Pablo please can you explain this?

As a preliminary result, we found that moderation-related edits range from less than 1% in some editions, such as German (0.09%) and Polish (0.53%), to nearly 10% in others, like Russian (9.6%).

  • Does this include bot edits? Just curious, as I notice that a substantial % of the Russian Wikipedia moderation activity comes from bots.

Those numbers were for human edits only.

From the report:

eswiki % Moderation (considering revert-related): 35.71%

This seems huge! Especially compared to the values for other wikis. Do we have any insight on what this is driven by?

We noticed this, but didn't have time during this work to analyze specific cases. Given that we considered just one month of data (October 2024), results might be affected by specific (exogenous or endogenous) events or edit wars. It is important to highlight that our goal here was to understand which actions were measurable, and to show how stable or sparse those numbers were. To get actionable insights about a specific project, it would be necessary to apply these methods to larger data.

  • In the moderation activities dashboard you include upload as a log type indicative of moderation activity - I wondered if you could explain the thought process on this one, because it's not immediately obvious to me that file uploads would be moderation.

@Pablo please can you explain this?

This is because of logs with log_type='upload' and log_action='revert', i.e., reverts of file uploads (a sample can be explored at https://superset.wikimedia.org/superset/dashboard/p/KawrLyKO29R/).

  • In the moderation activities dashboard you include upload as a log type indicative of moderation activity - I wondered if you could explain the thought process on this one, because it's not immediately obvious to me that file uploads would be moderation.

@Pablo please can you explain this?

This is because of logs with log_type='upload' and log_action='revert', i.e., reverts of file uploads (a sample can be explored at https://superset.wikimedia.org/superset/dashboard/p/KawrLyKO29R/).

Ahh yes I missed that, thank you!