
Workflow engagement check: Replying v1.0 & v2.0
Closed, ResolvedPublic

Description

Purpose

This task is about analyzing how people are engaging with the new Replying tool to:

Help answer these questions:

  1. Are people having success using the feature?
  2. Are people using the tool in ways that negatively impact the experience of others?

So we can decide:
What – if any – changes should be made to the workflow before deploying the feature to more people and, with it, beginning an A/B test to determine its impact?

Workflow engagement metrics

This section is still being drafted. Once complete, it will contain the information needed to make the two decisions listed above.

META

  • For each metric, we'd like to see a breakdown by wiki where the Reply Tool is currently deployed. You can see where and how the Reply Tool is deployed here: https://www.mediawiki.org/wiki/Talk_pages_project#Active_initiatives .
  • "Experience level": we'd like to see experience level expressed using the following buckets:
    • 0 cumulative edits
    • 1-4 cumulative edits
    • 5-99 cumulative edits
    • 100-999 cumulative edits
    • 1000+ cumulative edits
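As an illustration, this bucketing can be expressed as a small helper. This is a Python sketch; the function name and labels are ours, not part of any existing analysis code:

```python
def experience_bucket(cumulative_edits):
    """Map a user's cumulative edit count to one of the five
    experience-level buckets listed above."""
    if cumulative_edits == 0:
        return "0 cumulative edits"
    if cumulative_edits <= 4:
        return "1-4 cumulative edits"
    if cumulative_edits <= 99:
        return "5-99 cumulative edits"
    if cumulative_edits <= 999:
        return "100-999 cumulative edits"
    return "1000+ cumulative edits"
```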

1. Are people publishing the replies they start?

  • Reply Tool funnel (grouped by experience level and the input mode – source / visual – they are shown by default):
    • Of the people who click a [ reply ] link (action = 'init'), what % of people see the interface ready (action = 'ready')?
    • Of the people who see the interface is ready (action = 'ready'), what % of people start typing a comment (action = 'firstChange')?
    • Of the people who start typing a comment (action = 'firstChange'), what % of people click the Reply button (action = 'saveIntent')?
    • Of the people who click the Reply button (action = 'saveIntent'), what % of people successfully publish the comment they were drafting (action = 'saveSuccess')?
  • Reply Tool comment completion (grouped by experience level and the input mode – source / visual – they are shown by default):
    • Of the people who click the Reply link (action = 'init'), what % of people successfully publish the comment they were drafting (action = 'saveSuccess')?
  • Full-page wikitext talk page edit completion rate (grouped by experience level):
    • Of the people who click an Edit link on a talk page (action = 'init'), what % of people successfully save an edit (action = 'saveSuccess')?
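The step-to-step percentages above can be sketched as a funnel computation over (user, action) pairs. This is an illustrative Python sketch over hypothetical pre-extracted event tuples, not the actual analysis code; in practice the EditAttemptStep events would first be queried and grouped by wiki, experience level, and default input mode:

```python
from collections import defaultdict

# Funnel steps in order, as recorded in the `action` field.
FUNNEL_STEPS = ["init", "ready", "firstChange", "saveIntent", "saveSuccess"]

def funnel_rates(events):
    """events: iterable of (user_id, action) pairs.
    Returns, for each adjacent pair of steps, the share of distinct
    users who reached the earlier step and also reached the later one."""
    users_at = defaultdict(set)
    for user_id, action in events:
        if action in FUNNEL_STEPS:
            users_at[action].add(user_id)
    rates = {}
    for earlier, later in zip(FUNNEL_STEPS, FUNNEL_STEPS[1:]):
        reached = users_at[earlier]
        rates[f"{earlier} -> {later}"] = (
            len(users_at[later] & reached) / len(reached) if reached else 0.0
        )
    return rates
```

The comment completion rate in the second bullet is then just the share of distinct 'init' users who also appear among the 'saveSuccess' users.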

2. Are people satisfied with the feature?

  • % of people who publish at least one comment and turn off the Reply Tool preference (name: Enable quick replying) in Special:Preferences.
  • Number of people who posted a comment with the tool on only one day
  • Distribution of people who posted a comment with the tool on ___ (see grouping below) distinct days:
    • 1-5 days
    • 6-10 days
    • 11-15 days
    • 16-20 days
    • 21-25 days
    • 26-30 days
    • etc.
  • Distribution of people who have posted the following number of comments with the Reply Tool:
    • 1-5 comments
    • 6-10 comments
    • 11-15 comments
    • 16-20 comments
    • 21-25 comments
    • 26-30 comments
    • 31-35 comments
    • 50+ comments
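Both distributions above are width-5 bins over per-user counts (with the comment list jumping from 31-35 to a 50+ tail). A hedged sketch of the binning, assuming per-user counts (comments posted, or distinct active days) have already been computed:

```python
from collections import Counter

def bin_distribution(per_user_counts, width=5, top=50):
    """Group per-user counts into width-5 buckets, with a single
    `top`+ bucket at the tail. Labels such as "1-5" match the
    groupings listed above."""
    dist = Counter()
    for count in per_user_counts:
        if count >= top:
            dist[f"{top}+"] += 1
        else:
            low = ((count - 1) // width) * width + 1
            dist[f"{low}-{low + width - 1}"] += 1
    return dict(dist)
```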

3. Are people posting disruptive comments?

  • % of comments posted using the Reply Tool that are reverted within 48 hours
  • # of people blocked after posting a reply using DiscussionTools

Metric refinements

The items below are ordered by priority, where "1." is the highest priority and "3." is the lowest priority.

  • 1. Where applicable, can you please add charts that aggregate Junior Contributors and Senior Contributors into two separate "bins"? Here, "Junior Contributors" are people who have made <100 cumulative edits across namespaces and "Senior Contributors" are people who have made >500 cumulative edits across namespaces.
  • 2. Can we see companion charts to the already-existing "Reply Tool Wikitext vs Full Talk Page Wikitext Completion Rates" chart that shows "Reply Tool visual vs Full Talk Page Wikitext Completion Rates" and "Reply Tool (visual and source) vs. Full Talk page Wikitext"?
  • 3. Filter out section=new edits from the "Reply Tool Wikitext/Visual/Both vs Full Talk Page Wikitext Completion Rates" charts.
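Refinement 1 is effectively a coarser re-binning of the experience buckets. A sketch, assuming per-user cumulative edit counts are available; note that, as written, the <100 and >500 thresholds leave 100-500 cumulative edits outside both bins, so the sketch returns None there:

```python
def contributor_bin(cumulative_edits):
    """Aggregate experience into the two bins from refinement 1."""
    if cumulative_edits < 100:
        return "Junior Contributors"
    if cumulative_edits > 500:
        return "Senior Contributors"
    # 100-500 cumulative edits falls outside both definitions as written.
    return None
```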

Open questions

  • How long after v2.0 is released should this analysis happen?
    • This analysis is scheduled to begin the week of 2-November per T263050#6565226.
  • Is new information required to know what text input mode (source or visual) a user is in when they take a certain action (e.g. switching text input modes, abandoning/publishing a comment)?
    • No. For any event in EditAttemptStep we will be able to know what editing input mode (source or visual) the user was in when they took the action.
    • Caveat: the switching event (between source and visual) has not yet been implemented. See: T247139#6114499.

Done

  • Analysis is completed that contains the ===Workflow engagement metrics and ===Metric refinements listed above

i. The pinging feature became available on 17-June-2020: https://www.mediawiki.org/wiki/Talk_pages_project/replying#17_June_2020


This task/analysis had previously been scoped as a follow-up to a then-planned analysis of engagement with Version 1.0: T247138.

Related Objects

Event Timeline


@Mayakp.wiki and I will discuss the scope of this ticket before work on it begins.

SNowick_WMF moved this task from Triage to Backlog on the Product-Analytics board.

Next steps
Maya and I talked; next steps are:

Measurement scope changes
I've made the following updates to the task description:

  • REMOVE: "Are people taking a long time to publish their replies?"
  • REMOVE: "Are people needing to correct mistakes the tool produced?"
  • REMOVE: "Are people able to successfully edit their comments?"
    • Reason: for the time being, we are pausing development on the functionality for editing single comments. See: T242562#5977993.

Rationale

Revisions to engagement metrics
During our meeting on Wednesday, 1-April, @Mayakp.wiki and I confirmed the following changes to the metrics we will calculate to help evaluate how people are using the new Replying tool:

  • REMOVE: Distribution of elapsed times between when a person starts typing their comment and when they publish their comment
    • Rationale: we think this metric is likely to be noisy.
  • REMOVE: "Are people needing to correct mistakes the tool produced?"
    • Reasons: we think this metric is likely to be noisy as well (e.g. what if someone is just correcting a typo?). Also: at this stage, we can more reliably determine this by manually reviewing edits made using the DiscussionTools via change tags.

Change to task description

  • CHANGED question #5 from "5. What text input mode (source or visual) should be shown to people who do not already have a preference set?" to "5. What text input mode (source or visual) are people using to publish comments?"
    • Reason: as @Mayakp.wiki and I discussed, we will consider the question of which text input area should be shown to people who do not already have a preference set in this task: T250523

Next steps
Documenting the next steps from the meeting @Mayakp.wiki and I had on 17-April:

  • @Mayakp.wiki to review the events below to determine what would be involved in tracking them. Specifically, would these events require a new schema, per T243363? Source: Talk pages/Replying/Instrumentation spec > "Events"
    • When someone clicks a "Reply" link on the talk page, what the text input mode – source or visual – do they see?
    • When someone starts typing a comment, what text input mode are they in?
    • When someone clicks "Cancel" after starting to type a comment, what text input mode are/were they in?
ppelberg renamed this task from Workflow engagement check: Replying v2.0 to Workflow engagement check: Replying v1.0 & v2.0. May 2 2020, 12:51 AM
ppelberg updated the task description.

Next steps
Documenting the next steps from the meeting @Mayakp.wiki and I had on 17-April:

  • @Mayakp.wiki to review the events below to determine what would be involved in tracking them. Specifically, would these events require a new schema, per T243363? Source: Talk pages/Replying/Instrumentation spec > "Events"
    • When someone clicks a "Reply" link on the talk page, what the text input mode – source or visual – do they see?
    • When someone starts typing a comment, what text input mode are they in?
    • When someone clicks "Cancel" after starting to type a comment, what text input mode are/were they in?

We are going to answer the above with @DLynch in the meeting scheduled for this afternoon.


Confirmed: for any event in EditAttemptStep we will be able to know what editing input mode (source or visual) the user was in when they took the action.

Caveat: the switching event (between source and visual) has not yet been implemented.

Change 594809 had a related patch set uploaded (by DLynch; owner: DLynch):
[mediawiki/extensions/DiscussionTools@master] Hook up VisualEditorFeatureUse logging

https://gerrit.wikimedia.org/r/594809

Mayakp.wiki moved this task from Backlog to Upcoming Quarter on the Product-Analytics board.

Assigning task to Megan and moving it to Upcoming Quarter of the Product Analytics board as confirmed in weekly 1:1 with Peter and Megan.

Task description update
I've added the following to the task description.

⚠️ Note: @MNeisler and I need to discuss this before we can consider it within the scope of this task.

6. Of the people who have made at least 1 edit with the Reply Tool after 17-June-2020 [i], after how many edits do people use the tool's Pinging Feature for the first time?
This question is relevant to us deciding whether the existence of the tool's pinging feature ought to be communicated more explicitly via something like an onboarding experience per T253434#6334269.

Opt-out rates
As part of T249386, we calculated the number of people who published at least 1 comment with the Reply Tool and then turned off the DiscussionTools Beta Feature.

This count is/was a good leading indicator for whether people were having a particularly bad experience with the tool.

@MNeisler and I had talked about calculating the same metric as part of this analysis. Unfortunately, T260867 prevents us from doing this.

The above is the outcome of a conversation we had on 9-Sep-2020.

This analysis is currently predicted to begin in early October following the deployment of the ReplyTool as opt-out preference at the Arabic, Czech and Hungarian Wikipedias (T249394) and confirmation of the number of events being logged (T263050).

Task description update
I've added the below to the task description:

7. Are Junior Contributors becoming confused/distracted by the Advanced affordance?

ppelberg updated the task description.

@MNeisler and I just talked through the "REFINEMENTS" below.

Next step:

  • @MNeisler to review the "REFINEMENTS" below and ensure:
    • The metrics fit within the scope of this task.
    • The metrics are defined clearly enough.
  • @ppelberg to update task description once Megan reviews the below.

REFINEMENTS

  1. Are people publishing the replies they start?
  • CHANGE how we are measuring the comment completion rate so we can understand the full comment funnel. Said another way, we'd like to know:
    • Of the people who click a [ reply ] link, what % of people click the Reply button?
      • Of the people who see the interface is ready, what % of people click the Reply button?
        • Of the people who start typing a comment, what % of people click the Reply button?
          • Of the people who click the Reply button, what % of people successfully publish the comment they were drafting?
  • ADD grouping by experience level for each of the above
  • ADD grouping by editing interface – source and visual – for each of the above
  • ADD a baseline of the above for full-page wikitext editing so we have a baseline to compare to.
  2. Are people abandoning the replies they start?
  • REMOVE
    • Reason: I think we can explore this question in the event that we notice a notable % of people not publishing comments with the Reply Tool which we'll learn from: "1. Are people publishing the replies they start?"
  3. Are people satisfied with the feature?

The thinking below is based on the thinking we did in T249386#6283682 .

  • ADD how many people posted a comment with the tool on only one day?
  • ADD how many people posted a comment with the tool on ___ (see grouping below) distinct days:
    • 1-5 days
    • 6-10 days
    • 11-15 days
    • 16-20 days
    • 21-25 days
    • 26-30 days
    • etc.
    • Note: these groupings depend on when this analysis is done and subsequently, for how many days the tool has been available.
  • ADD % of talk page edits people make with the Reply Tool, grouped by experience level and platform. Read: we'd like to know: 1) of the talk page edits people make across platforms, what percentage of those edits do people use the Reply Tool to make AND 2) of the talk page edits people make on desktop, what percentage of those edits do people use the Reply Tool to make.
    • 1-10 percent
    • 11-20 percent
    • 21-30 percent
    • 31-40 percent
    • 41-50 percent
    • 51-60 percent
    • 61-70 percent
    • 71-80 percent
    • 81-90 percent
    • 91-100 percent
  • ADD the number of people who are posting varying numbers of comments with the Reply Tool:
    • 1-5 comments
    • 6-10 comments
    • 11-15 comments
    • 16-20 comments
    • 21-25 comments
    • 26-30 comments
    • 31-35 comments
    • 50+ comments
  • ADD how many distinct people have explicitly turned off the Reply Tool preference (name: Enable quick replying) in Special:Preferences
  5. What text input mode (source or visual) are people using to publish comments?
  • REMOVE. This should be covered by: 1. Are people publishing the replies they start?
  6. Of the people who have made at least 1 edit with the Reply Tool after 17-June-2020 [i], after how many edits do people use the tool's Pinging Feature for the first time?
  • REMOVE: this is out of scope for this task.
ppelberg updated the task description.

@ppelberg - I reviewed the task scope and suggested refinements.

For ease of editing, I copied the workflow scope and refinements into a google doc. My revisions are noted in my comments and suggested changes in the doc. Let me know if you have any questions.

Great idea – thank you for doing this, @MNeisler. I'll pose any comments/questions in the doc and then we can reflect the decisions we come to in a single comment here and then any necessary changes to the task description.

In T247139#6568676, @ppelberg wrote:
...we can reflect the decisions we come to in a single comment here and then any necessary changes to the task description.

The decisions @MNeisler and I came to are summarized below; the task description has been updated to reflect the changes resulting from these decisions.

REFINEMENTS

  1. Are people publishing the replies they start?
  • CHANGE how we are measuring the comment completion rate so we can understand the full comment funnel. Said another way, we'd like to know:

✅ We're going to measure the edit completion rate such that we can see at what steps in comment funnel people might be dropping out.

  • ADD grouping by experience level for each of the above

✅ We're going to measure this.

  • ADD grouping by editing interface – source and visual – for each of the above
  • ADD a baseline of the above for full-page wikitext editing so we have a baseline to compare to.

✅ We're going to measure this.

  2. Are people abandoning the replies they start?
  • REMOVE

❗️We're not going to measure this.

  3. Are people satisfied with the feature?

The thinking below is based on the thinking we did in T249386#6283682 .

  • ADD how many people posted a comment with the tool on only one day?

✅ We're going to measure this.

  • ADD how many people posted a comment with the tool on ___ (see grouping below) distinct days:

✅ We're going to measure this.

  • ADD % of talk page edits people make with the Reply Tool, grouped by experience level and platform.

❗️We're not going to measure this.

  • ADD the number of people who are posting varying numbers of comments with the Reply Tool:

✅ We're going to measure this.

  • ADD how many distinct people have explicitly turned off the Reply Tool preference (name: Enable quick replying) in Special:Preferences

✅ We're going to measure this now that T260867 is resolved.

  5. What text input mode (source or visual) are people using to publish comments?
  • REMOVE. This should be covered by: 1. Are people publishing the replies they start?

❗️We're not going to measure this.

  6. Of the people who have made at least 1 edit with the Reply Tool after 17-June-2020 [i], after how many edits do people use the tool's Pinging Feature for the first time?
  • REMOVE: this is out of scope for this task.

❗️We're not going to measure this.

@ppelberg Update re timeline: I have finished calculating all the metrics identified above but am going to use today to do a final QA and summarize key findings. I'll have the finished report by tomorrow (8 December) at the latest. Let me know if you have any concerns with this timeline or need any data earlier.

@ppelberg Here is the first draft for the workflow engagement report. I've provided a summary of key findings below and you can find more details about the code and copies of the charts in the repo. Sorry for the delay! Let me know if you have any questions or suggested modifications.

Summary of Key Findings
Data reflect Reply Tool usage on the Arabic, Czech and Hungarian Wikipedias following the deployment of the tool as an opt-out feature (24 September 2020 through 30 November 2020).

  1. Are people publishing the replies they start?
    • For junior contributors (under 4 cumulative edits), there is a drop-off in the number of Reply Tool users from ready (editing interface is loaded) to firstChange (the user begins typing a comment). However, the majority of junior contributors who started typing a comment were able to successfully publish their comment across all three opt-out wikis. For example, 100% of first-time contributors who started typing in the Reply Tool using VE successfully published their edit, as did 96% of contributors with 1-4 edits.
    • Reply Tool completion rates range from a low of 45% (Reply Tool users with 1 to 4 edits using wikitext) to a high of 85% (Reply Tool users with over 1000 edits using VE) across the various edit groups and editing interfaces. Reply comments made with VisualEditor had a higher completion rate across all contributor experience levels, and the completion rate generally increases with experience level.
  2. Are people satisfied with the feature?
    • Over 80% of Reply Tool users on each of the three opt-out wikis made between 1 and 5 comments. Hungarian Wikipedia had the highest percentage of users (19%) that made over 5 comments using the Reply Tool. The number of comments posted by each user increases with experience level.
    • On the three opt-out wikis, the majority of Reply Tool users (58% to 67%) only used the tool on one day within the reviewed timeframe (24 September 2020 through 30 November 2020). Hungarian Wikipedia had the highest overall percentage (42%) of Reply Tool users that used the tool on 2 or more distinct days.
  3. Are people posting disruptive comments?
    • On Czech and Hungarian Wikipedia, under 1 percent of Reply Tool comments posted by users during the reviewed time period were reverted within 48 hours. Arabic Wikipedia had a significantly higher revert rate in comparison (6.8%). However, this revert rate is similar to the desktop revert rate on content namespaces across all Wikipedia projects, which was 6.7% during the same time period, and reverts were mainly associated with editors with over 1000 edits.
    • Only 1.45% of all reply tool users were blocked after posting a comment across all three target wikis.

Also, please note that I made one change to the following metric identified in the task description.

Distribution of people who posted a comment with the tool on ___ (see grouping below) distinct days:
1-5 days
6-10 days
11-15 days
16-20 days
21-25 days
26-30 days
etc.

I changed the bin width to just 1 day instead of 5 since a large majority of reply tool users in each wiki only used the reply tool between 1 to 5 distinct days. With this change, we can see more detail about reply tool usage at this level.

@ppelberg Here is the first draft for the workflow engagement report. I've provided a summary of key findings below and you can find more details about the code and copies of the charts in the repo.

  1. This looks great, @MNeisler – thank you for putting this together.
  2. Changing the distinct tool day use bin width to 1 day instead of 5 days makes a lot of sense; I'm glad you did this.

Before posting reactions to this data, I'd like to clarify a few things through the questions below.

Note: because some of the questions below are ones I'd like to explore and others are questions I think you'd be best positioned to explore, I've been quite liberal with "@'ing" you (e.g. @MNeisler) so you can [hopefully] spend less time wondering, "Is Peter expecting me to think about this?"

Clarifying questions

  • @MNeisler: would it be accurate for me to understand the below as meaning: one user, across all wikis, in the ~67 days this analysis covered, turned off the Reply Tool after publishing at least one comment?
    • "There has only been one user on Hungarian Wikipedia who has turned off the Reply Tool preference in Special:Preferences after making a comment using the reply tool since it was deployed as an opt-out feature"
  • @MNeisler: would it be accurate for me to understand the below as meaning: these are counts of the unique people who, at some point during the 67 days between 24-Sep and 30-Nov, reached a step (e.g. init, ready, firstChange, etc.) in the Reply Tool comment "funnel"?
    • "The data below reflects the number of users not sessions..." in Reply Tool Funnel.
  • @MNeisler: would it be accurate for me to assume that people who show up in the Comments made with VisualEditor chart could also show up in the Comments made with Wikitext chart and vice versa?

Refinements

  • @MNeisler: where applicable, can you please add charts that aggregate Junior Contributors and Senior Contributors into two separate "bins"? Here, "Junior Contributors" are people who have made <100 cumulative edits across namespaces and "Senior Contributors" are people who have made >500 cumulative edits across namespaces.
  • @MNeisler: can we see companion charts to the already-existing "Reply Tool Wikitext vs Full Talk Page Wikitext Completion Rates" chart that shows "Reply Tool visual vs Full Talk Page Wikitext Completion Rates" and "Reply Tool (visual and source) vs. Full Talk page Wikitext"?
  • @MNeisler: can we extend the revert rate time window from 48 hours to 96 hours? Thinking about this again now: I wonder whether English Wikipedia is the best benchmark here considering I assume its "revert speed" will likely be faster than any other wiki.

Curiosities

  • @MNeisler: How do you interpret the Distribution of people who posted a comment with the tool on distinct days "By wiki" chart? My instinct: we should be encouraged by the non-trivial number of people who use the tool on 10+ days for it suggests there is the potential for people to become heavy users of it.
  • What could explain why relatively few people (28% at ar.wiki vs. 58% avg. across all wikis) who have made 1-4 total edits are completing the comments they start at ar.wiki using the Reply Tool's visual mode? See: "Reply Tool Workflow Funnel on Arabic Wikipedia using Visual Editor" chart in the Workflow funnel for each opt-out wiki by user's experience level section.
  • Related: what could explain people with 0 edits publishing comments with the Reply Tool at a higher rate than people who have made 1-4 edits?
  • At ar.wiki, why might we be seeing almost the same number of people using the Reply Tool's visual and source modes in all user groups, save for the 0 edit group? See: "Reply Tool Workflow Funnel on Arabic Wikipedia using..." "Visual Editor" and "Wikitext" charts in the Workflow funnel for each opt-out wiki by user's experience level section.

Future explorations

  • What is the distribution of load times and how do they vary between the Reply Tool's source and visual modes?
  • What could be contributing to comments made with the Reply Tool being reverted at higher than expected rates, especially among editors with 1,000+ edits? Have people at ar.wiki noticed this?
  • What could be contributing to the drop-off between the saveIntent and saveSuccess steps in the Reply Tool comment funnels? See the Comments made with Wikitext and the Comments made with VisualEditor charts.

Notes from the conversation @MNeisler and I had earlier today...

Clarifying questions

  • @MNeisler: would it be accurate for me to understand the below as meaning: one user, across all wikis, in the ~67 days this analysis covered, turned off the Reply Tool after publishing at least one comment?
    • "There has only been one user on Hungarian Wikipedia who has turned off the Reply Tool preference in Special:Preferences after making a comment using the reply tool since it was deployed as an opt-out feature"

CONFIRMED: one person, across ar.wiki, cs.wiki and hu.wiki, turned the Reply Tool off after posting at least one comment.

  • @MNeisler: would it be accurate for me to understand the below as meaning: these are counts of the unique people who, at some point during the 67 days between 24-Sep and 30-Nov, reached a step (e.g. init, ready, firstChange, etc.) in the Reply Tool comment "funnel"?
    • "The data below reflects the number of users not sessions..." in Reply Tool Funnel.

CONFIRMED: the above is accurate.

CONFIRMED: yes, if someone interacted with the tool in both source and visual mode they would be represented in both of the charts mentioned above.

Refinements

  • @MNeisler: can we extend the revert rate time window from 48 hours to 96 hours? Thinking about this again now: I wonder whether English Wikipedia is the best benchmark here considering I assume its "revert speed" will likely be faster than any other wiki.

DECIDED: we're not going to pursue this refinement right now. Instead, we'll continue to rely on the revert window Megan cited in the report: https://meta.wikimedia.org/wiki/Research:Revert.

Curiosities

DISCUSSED: Megan shares a similar optimism and added that it is worth noting the non-trivial percentage of people (11% at ar.wiki, 18% at cs.wiki and 15% at hu.wiki) who used the Reply Tool on 2 distinct days.

  • What could explain why relatively few people (28% at ar.wiki vs. 58% avg. across all wikis) who have made 1-4 total edits are completing the comments they start at ar.wiki using the Reply Tool's visual mode? See: "Reply Tool Workflow Funnel on Arabic Wikipedia using Visual Editor" chart in the Workflow funnel for each opt-out wiki by user's experience level section.

Megan noted that we should hesitate to draw conclusions about the 0 edit count group considering how few people were in it (a maximum of 18 people).

  • Related: what could explain people with 0 edits publishing comments with the Reply Tool at a higher rate than people who have made 1-4 edits?

See above.

Task description update

Refinements...

The task description now contains the "refinements" Megan and I agreed to make to the report [i] shared in T247139#6682288. See the task description's ===Metric refinements section for more details.


i. https://nbviewer.jupyter.org/github/wikimedia-research/Discussion-tools-analysis-2020/blob/master/Engagement-Metrics/Reply-Tool-Workflow-Engagement-Metrics.ipynb

Refinement update
ADDED the refinement below to the task description's ===Metric refinements section after discussing with @MNeisler.

  • Filter out section=new edits from the "Reply Tool Wikitext/Visual/Both vs Full Talk Page Wikitext Completion Rates" charts.
    • Rationale: these charts are intended to help us approximate how much more or less likely people are to successfully publish replies with the Reply Tool vs. full page wikitext editing. While people may be using full page wikitext editing for a variety of purposes, beyond replying, we can be relatively sure they are not using the section=new form/workflow for replying and thus think we can safely exclude this from this particular analysis/dataset.

@ppelberg

I've updated the engagement metrics report with the following change:

1. Where applicable, can you please add charts that aggregate Junior Contributors and Senior Contributors into two separate "bins"? Here, "Junior Contributors" are people who have made <100 cumulative edits across namespaces and "Senior Contributors" are people who have made >500 cumulative edits across namespaces.

Here is a link to the updated report and repo.

The above change took a little more time than anticipated, so I have not yet updated the report with the other two refinements. I'll prioritize the remaining two metric refinements as well as do a final clean-up of the report as soon as I'm back from break. Please let me know if you have any concerns or questions.

@ppelberg I've completed revising the engagement report to address all of the identified refinements. Here is the updated report for review. Please let me know if you have additional questions or revisions before finalizing.

This report looks great and delivers answers to the questions we were seeking. Nicely done, @MNeisler.

I will publish the key findings from this report on the Reply Tool project page. This work will happen in T271391.