This ticket is meant to gather the work required to build a system that generates actionable edit suggestions that can be presented in an open-ended set of moments/places within the interface. Think: Newcomer homepage, read mode, visual editor, WikiProjects, events, etc.
Stories
- As a campaign, WikiProject, event, etc. organizer, I'd like to generate a manageable on-wiki overview of actionable edits from a defined article set (e.g., manually curated, articles that belong to certain categories, etc.) so that I can easily identify "low-hanging fruit" suggestions to present to participants and track our progress on completing them
- See en:Wikipedia:Cleanup. Thank you to @Sj for making us aware of this.
- Talk to: @ifried, @SEgt-WMF, TBD
- Currently, organizers are using tools like PetScan to generate lists and Google Sheets to build task overviews. Also see: the CEE Spring tables, which Netha Hussain used for work on Women's Health articles, and Integraality, which @GFontenelle_WMF used to create the International Museum Day data challenge in 2023.
- T404467: Design exploration: Collaborative tracking against a worklist
- As an engaged reader who is in the midst of reading an article on Wikipedia, I'd value knowing if there are improvements I can consider making that are a) a fit for the limited (if any) experience I have editing Wikipedia and b) small/light, specific, actionable, and structured/constrained enough for me to feel like they won't "pull me out" of the reading/learning I'm doing, so that I can engage more deeply with the material I am curious about/interested in with the added benefit of feeling proud for contributing to the resource (Wikipedia) I value
- Talk to @KStoller-WMF ("Add a link" in read mode experiment, non-editing participation, etc.), @ovasileva, TBD...
- As a new account holder who is motivated to edit/contribute to Wikipedia and is viewing suggested edits on the Newcomer homepage, I'd value being able to browse a broad range of compelling edit suggestions [i], so that when I feel moved to edit Wikipedia I can be relatively certain that I can quickly and easily find something to do that will align with what I'm interested in and motivated by in that particular moment
- Think about the "Add reference" Suggested Edit and the extent to which the feedback some volunteers at es.wiki shared about the task has to do with the suggestion not pointing people to the specific part of an article that needs a reference.
- As someone who curiously clicked/tapped to edit an article I'm reading, I'd value being able to easily see edits aligned with my interests and experience that I could consider making, so that I can engage more deeply with the material I am curious about/interested in, with the added benefit of feeling proud for contributing to the resource (Wikipedia) I value
Components
A WIP list of the various components of the user experience...
- Showing / hiding suggestions
- Treatment: how "suggestions" are visually presented
- Rationale: why this suggestion
- Actions: what choices the suggestion presents to you
- Navigation: moving between suggestions
- State: various states a suggestion can take on
- Source: who/what is making the suggestion (a volunteer, a machine, etc.)
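To make the components above concrete, here is a minimal sketch of how they might hang together as a data shape. This is illustrative only — the names and field types are assumptions for discussion, not a proposed schema:

```typescript
// Hypothetical shape for a single edit suggestion, tying together the
// WIP components listed above. All names here are illustrative.
type SuggestionSource = { kind: "volunteer" | "machine" | "platform"; id?: string };
type SuggestionState = "unseen" | "shown" | "accepted" | "rejected" | "deferred";

interface EditSuggestion {
  treatment: string;        // how the suggestion is visually presented
  rationale: string;        // why this suggestion is being made
  actions: string[];        // choices the suggestion presents to the user
  state: SuggestionState;   // lifecycle state of the suggestion
  source: SuggestionSource; // who/what made the suggestion
}

// Example instance: a machine-generated "needs a citation" suggestion
const example: EditSuggestion = {
  treatment: "inline-highlight",
  rationale: "This claim lacks a citation.",
  actions: ["add-citation", "dismiss"],
  state: "unseen",
  source: { kind: "machine", id: "reference-check" },
};
```

A sketch like this also makes the "showing/hiding" and "navigation" components easier to discuss: both would operate over a collection of these objects and their `state` transitions.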
Data/Evidence
- T398250: Understanding newcomer onboarding experiences - mentorship, relatedness, and competence
- "Curious clicks" research from User:KKhan-WMF via Mobile web editing research
References
- User:Phlsph7/SourceVerificationAIAssistant: Source Verification AI Assistant is a user script to help editors verify whether a reliable source supports a claim. See: T399642.
- User:Polygnotus/Scripts/TypoFixer: this script, in combination with {{Verify spelling}}, allows users to fix typos or reject proposed fixes in one click. via @Polygnotus
- en:Wikipedia:Backlog: This backlog page lists tasks that should be done to improve Wikipedia (assuming the cleanup templates were placed correctly). via @Sj
- Writing articles with large language models/Archive 1
- Understanding the use of maintenance templates to develop models to support editors by @MGerlach via @Miriam
- en:Template:Refideas editnotice: this edit notice is automatically added to articles which have {{Refideas}} on their talk page, linking users to the {{Refideas}} listing on the article talk page so that they can be aware of the sources listed there. via @Samwalton9-WMF and @AAlhazwani-WMF
- "I've been experimenting with using LLMs and web searching to do multi-step systematic review to find errors in articles, with very promising results..." | via User:The Anome at en.wiki, via @Samwalton9-WMF
- "Grokipedia not only copies WP articles, it also 'fact-checks' them. Now what really surprised me is that in the few articles I checked which were originally co-written by me, the corrections made by Grokipedia were actually on point! After diving into the sources, I even corrected the WP articles accordingly. I recommend everyone to check the Grokipedia versions of articles they have worked on and to click on the 'See Edits' button in the top right corner. It gives you a succinct description of the 'issue', the 'fix', and the 'supporting evidence' Grok seems to have used. You of course need to check everything in the sources, but as an error-detector for WP articles it works beautifully." | via User:Apaugasma at en.wiki
- GPT-5-powered article bias analysis: a WikiTool proposal to use an AI model to suggest ways to improve an article
- Examples in the proposer's sandbox here and here
- Thank you to @EBlackorby-WMF for spotting this.
- @derenrich prompts Gemini to find and fix errors within the Wikipedia article he is editing:
- WikiFix: uses LLMs to find inconsistencies in Wikipedia. via @Pablo
- Noam Brown on X/Twitter showing how they prompt ChatGPT to identify potential errors within the Wikipedia articles they're reading:
- SuggestBot via @nettrom_WMF
- Seeks to help Wikipedia contributors find articles to edit.
- WikiProject X
- At one point, @nettrom_WMF + @Harej were thinking about extending SuggestBot to offer WikiProject-specific suggestions
- https://bambots.brucemyers.com/cwb/bycat/Africa.html via @CMyrick-WMF
- T404467: Design exploration: Collaborative tracking against a worklist via @ifried
- The suggestions we (Editing et al.) end up creating to populate suggestion mode could very well be used to help volunteers/organizers create lists of immediately actionable edits
- Automate the discovery of structured improvement tasks for Vital Knowledge articles via @SEgt-WMF
- Wish: Apply AI to article improvement suggestions on the homepage: "A much better AI system than the previous one that recommends articles for improvement would help editors going through this, including new and veteran users."
- WikiVault (ko, en) by @Ykhwong: an integrated AI machine translation, research, and writing tool. The tool currently has three main functions:
- Translation: Using AI to provide more accurate translations
- Writing: Using AI to quickly draft articles
- Quick running: Quickly access AI features on any screen using shortcut keys
- "AI that helps editors address problems with Wikipedia is welcome (if it works). We already use it, in various places, to varying degrees of success, and to relatively little pushback." | via @asilvering (source)
- "A next-level deployment would be for AI to read the sources of the article, summarize those, and then compare its summary to the written article, and post suggestions for changes on the talk page. A next-next-level deployment would be for AI to suggest new sources for the articles, or to read and summarize sources that aren't in the article, and then post edit suggestions to the talk page. AFAIK AI isn't ready for this level yet, but the first two suggestions above could probably be done today with reasonable accuracy using large and small language models. I hope the WMF keeps developing, experimenting, testing, and iterating on these approaches." | via @Levivich (source)
- "This is what we want. AI that supports editors. The human makes the decisions, the AI can only propose improvements but is incapable of doing anything." | via @Polygnotus (source)
- Ported from Anthropic Claude to free-as-in-beer Gemini Flash 2.5 by @Cramulator
- "A few months ago I obtained an AI generated list of typos on Wikipedia. I went through much of it manually, fixed a bunch of typos, made some suggestions for additional searches for AWB typo fixing, but ignored a whole bunch of AI errors that were either wrong or Americanisations." | WereSpielChequers describing how they're using AI to identify ways to improve existing Wikipedia content.
- For volunteers to be motivated to author Check suggestions, they'll need evidence that doing so results in people seeing said Checks and ultimately, acting upon them
- We ought to be open-minded about what can potentially cause a suggestion to be shown. E.g., do suggestions need to be gated behind someone explicitly asking for them to be revealed? Might suggestions be shown in an existing or yet-to-be-introduced moment (e.g. post-save)?
- What do we mean by "Suggestions" in this context? More broadly, we need to become clear about the vocabulary around this system.
- At least three internal meanings: 1) "Show me Checks applied to content I didn't write.", 2) "Show me low-priority Checks for content I did write.", and 3) the moment in which the Check is shown (mid-edit, pre-save, etc.)
- Need to avert confusion with Growth's existing work on Suggested Edits.
- To what extent does it need to be clear who authored the Check? An AI, a volunteer, Wikipedia/the platform...
- Screenshot showing how @Polygnotus uses a userscript while viewing Wikipedia to identify ways to improve the Wikipedia articles they are viewing:
- WP:AutoWikiBrowser
- 11 April 2025 via @DLynch: “Suggestion mode” — existing edit checks applied to the entire article, not just your own changes.
i. "Compelling edit suggestions" = aligned with my experience level, availability (read: time), and access (read: device(s))