
Proposal: Automatic editing suggestions and feedback for articles in the Wiki Ed Dashboard (GSoC/Outreachy Summer 2017)
Closed, Resolved · Public

Description

Profile

Name: Keerthana S
Email: keerukeerthana8@gmail.com
IRC nick: keerthana
Blog: https://developerbytes.wordpress.com/
GitHub: https://github.com/keer25
Location: India (UTC+5:30)
Typical working hours: between 2 pm and 3 am UTC+5:30

Synopsis

The Wiki Ed Dashboard / Programs & Events Dashboard is a Ruby on Rails + JavaScript application that helps people organize groups of newcomers to contribute to Wikipedia. This project aims to use ORES (Objective Revision Evaluation Service) to provide specific, useful editing suggestions to newcomers about how they can improve existing Wikipedia articles or article drafts they are working on. This gives them an idea of what kinds of improvements an article needs and a jumpstart on their contributions.

Mentors: @Ragesoss, @Capt_Swing
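
For context on what the feedback builds on, here is a minimal sketch of querying ORES for an article quality prediction. It assumes the ORES v3 REST API and the English Wikipedia "wp10" quality model as they existed around 2017; the response shape described in the comments is an assumption based on that API, so treat the details as illustrative:

```ruby
require 'net/http'
require 'json'

# Fetch the ORES article quality prediction for a single enwiki revision.
# Assumes the ORES v3 endpoint and the "wp10" model; the response is expected
# to look like {"enwiki"=>{"scores"=>{"<rev_id>"=>{"wp10"=>{"score"=>{...}}}}}}.
def ores_quality(rev_id)
  uri = URI("https://ores.wikimedia.org/v3/scores/enwiki?models=wp10&revids=#{rev_id}")
  response = JSON.parse(Net::HTTP.get(uri))
  response.dig('enwiki', 'scores', rev_id.to_s, 'wp10', 'score')
end

score = ores_quality(783_456_789) # any enwiki revision ID; this one is a placeholder
puts score['prediction']          # e.g. "Start"
puts score['probability']         # per-grade probabilities, e.g. {"Stub"=>0.41, "Start"=>0.38, ...}
```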

Final Summary

  • After the project, an automatic feedback feature is successfully included in the Wiki Ed Dashboard. It appears in the MyArticles component, which lists the articles edited by a particular user, and in the Articles Edited section, where instructors can review the suggestions.
  • There is a feature for users to add their own suggestions on the edits, in addition to the automatic feedback.
  • An easy way was added for users to give feedback on this feature.
  • Although the automatic suggestions could not be made smarter as part of my project, I hope the work done to document the ORES features will encourage such improvements.
  • Compiling a spreadsheet of the ORES suggestions revealed some blatant errors in ORES predictions caused by redirects; the issue is reported in this task.

Next Steps

  • I couldn't contribute to making the automatic suggestions smarter, as I lacked experience with Wikipedia editing; this is a major area of improvement for the project.
  • Add the feature for custom user suggestions on edits in the Article-Viewer component, where it could be more relevant.

Timeline

Period | Task
May 4 to May 30 | Community bonding period. UI design for displaying the feedback messages with links to provide feedback on the suggestions. Designing UI mockups. Framing questions for the feedback form.
May 30 to June 12 | Create buttons for pulling feedback in the articles section under all subcategories (for example, Available Articles) and for pulling feedback in every row of contributions under an editor in the Editors tab (visible only on logged-in editors' contributions). This ensures only the respective editors receive feedback for their sandbox drafts. Add a toggle/filter UI in the Activity tab to switch to only My Contributions, with a feedback button present in each row.
June 13 to June 19 | Design the feedback form (linked from the modal opened by the feedback button). UI for admin users to view the collected feedback responses. Announce the feature to the community via mailing lists or other channels.
June 20 to June 26 | Improve the feedback built on ORES features with the help of the feedback received, and find other features that can help identify areas of improvement in ORES.
June 26 to June 30 | Phase I evaluation.
June 30 to July 7 | Design onboarding UX to introduce users to the feature, which can improve feedback.
July 8 to July 14 | Write feedback based on the article grades predicted by ORES, using https://en.wikipedia.org/wiki/Template:Grading_scheme (a sketch of such a mapping follows the timeline).
July 15 to July 31 | Design a method to measure how the feedback affects the following edit and collect data (needs more investigation); improve feedback messages based on the feedback responses received.
July 28 | Phase II evaluation.
August 1 to August 8 | Improve feedback based on the data collected on the effectiveness of the feedback messages.
August 9 to August 15 | Improvements based on the feedback responses received; find and document useful features that could be added to ORES.
August 16 to August 28 | Bug fixes, writing documentation, and updating appropriate guides. Code cleanup for submission.
August 29 to September 5 | Mentors submit final student evaluations.
September 6 | Final results of Google Summer of Code 2017 announced.
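
As a rough illustration of the July 8 to July 14 item, here is a minimal sketch of mapping an ORES-predicted grade to a feedback message. The grade labels follow the enwiki grading scheme linked above; the message text is purely illustrative, not the dashboard's actual copy:

```ruby
# Illustrative mapping from an ORES-predicted grade to a feedback message,
# loosely following https://en.wikipedia.org/wiki/Template:Grading_scheme.
GRADE_FEEDBACK = {
  'Stub'  => 'This draft is still a stub: try expanding it with more sourced content.',
  'Start' => 'A good start. Add citations and cover the major aspects of the topic.',
  'C'     => 'Substantial content, but sections or references may still be missing.',
  'B'     => 'Mostly complete. Polish the prose and double-check citation coverage.',
  'GA'    => 'Near Good Article quality; consider requesting a peer review.',
  'FA'    => 'Excellent work! This is close to Featured Article quality.'
}.freeze

def feedback_for(prediction)
  GRADE_FEEDBACK.fetch(prediction, 'No feedback is available for this grade yet.')
end

# Combined with the ORES query sketch above:
#   feedback_for(ores_quality(rev_id)['prediction'])
```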

Deliverables

  • Feedback buttons in the articles/editors/activity sections.
  • Collecting and viewing feedback responses.
  • Improving the feedback messages using the feedback responses together with ORES features.
  • Identify areas of improvement for ORES and make feature requests to the community.

Phase I evaluation

  • Onboarding UX for the automatic feedback feature.
  • Feedback based on article grades (ORES-predicted).
  • Data based on the feedback on the previous version and the subsequent edit (to identify target users, articles, etc.).
  • Improving the feedback messages using the feedback responses together with ORES features.
  • (Optional) Including in the feedback messages how the edit has improved since the previously given feedback.

Phase II evaluation

  • Improve feedback based on the data collected on the effectiveness of the feedback messages.
  • Document useful features that can be added to ORES.
  • (Optional) Thumbs up/down UI to mark feedback as helpful.
  • Write documentation and update guides.

Final evaluation

Participation

  • Work on a separate branch in git and upload code to the forked repo almost daily. Create pull requests as and when a complete feature is done.
  • Online on IRC during my working hours (2 pm to 3 am UTC+5:30).
  • Communication on tasks will be through comments on subtasks of the project created on Phabricator.
  • Weekly reports will be published on my Meta-Wiki user page.
  • A summary of each task will be published on my blog (https://developerbytes.wordpress.com/) at the end of its period in the timeline above.

About me

I am currently in the sophomore year of a B.Tech in Engineering Physics at the Indian Institute of Technology, Madras. I heard about this program at a campus Wikimedia hack session. The odd semester starts in the second week of August, but I will be able to commit enough time to the project as there are no exams during that period.

I am eligible for both Outreachy and GSoC and am applying to both programs with the same project. I hope this will be a kickstart into open source development: a great way to get exposure to real-world applications, code collaboration, and interaction with the open source community.

What inspired me to work on this project is the concept of letting students submit assignments as Wikipedia content that the whole world can see and review, rather than as a term paper handed to a professor. I would like to continue contributing to this project (the Wiki Ed Dashboard), and it would be great to see it implemented in more universities worldwide.

Past Experience

I have gained experience with Ruby on Rails and VCS (specifically git) by working in college techops teams to design portals for several campus organizations.
I like to participate in appathons (I develop Android apps) and have created several Android apps.
I developed a game, Quantum Chess (hosted at http://qchess.bitballoon.com/), with Unity3D (UnityScript).

Microtasks completed

Event Timeline

Keer25 renamed this task from Automatic editing suggestions and feedbacks for articles in Wiki Ed Dashboard to Proposal : Automatic editing suggestions and feedbacks for articles in Wiki Ed Dashboard (GSoC/Outreachy Summer 2017). (Mar 18 2017, 8:40 PM)

This is an excellent start, @Keer25.

One of the most important things will be to get it into a state that we can do user tests with as soon as possible. Since the internship period is pretty short, and it occurs during the US summer period, we're probably not going to get to a point where we can do large-scale analysis of how work improves after feedback. We're going to want to focus on getting detailed feedback from a small number of users through user tests. To that end, I suggest a little bit of reordering: before diving into ORES and the article grading scheme, we should do a round of user tests as soon as we have a UI for delivering feedback (which we can use with the simple proof-of-concept feedback service that is already in place). Seeing it in action, and seeing the kinds of drafts that dashboard users have, will help guide what kinds of feedback we should build.

I think the item you marked as optional for providing feedback on sandbox work will actually be very important, although we can probably include that easily in the initial UI. For most student editors using the dashboard, that sandbox stage will be where automatic feedback will be most useful.

Another minor change is that we should plan to write tests as we go, rather than at the end. In general, we'll want to try to build out the feedback system in steps as small as possible — lots of PRs that build upon one another — and we'll include tests with each new bit of functionality.

@Capt_Swing Please have a look at this when you get a chance.

@Keer25 I think this is a strong project plan. A lot of the details will inevitably change along the way, but it demonstrates a realistic picture of the overall project and shows you've thought it through well.

I think this will be a competitive application. Additional code contributions would be my top suggestion at this point to improve your chances. This may be helpful, if you haven't read it: https://www.mediawiki.org/wiki/Outreach_programs/Selection_process

@Ragesoss @Keer25 this looks like a good proposal overall. However, I honestly don't know enough about the project yet to make a judgement about whether the evaluation component ("Designing method to measure how the feedback affects the following edit") is realistic. Depending on what kind of evaluation is required and how much experience Keerthana has with that method, evaluation could end up being a substantial project all on its own.

@Capt_Swing "Designing method to measure how the feedback affects the following edit" I need to investigate more into how to do it but the end deliverable as of now is that data on subsequent revision and the corresponding feedback on the previous revision is compared and collected.

@Ragesoss Thanks for the suggestions. I am looking to make more code contributions this weekend. I was held up with exams for a while.

+1 @Capt_Swing To an extent, evaluation will likely highlight currently hidden issues in the ORES articlequality-modeling prediction model too. I'll be on the lookout for those kinds of insights to come through this project, and I'll help out where I can.

Just a quick note that I spoke with @Halfak and I understand better what "evaluation" means in the context of this project, so I am no longer worried that it will be too complex/too time-consuming to perform a useful evaluation.

Is there a reason to keep this proposal task open, now that GSoC 2017 / Outreachy 14 are done, and as parent task T158679 exists?

No reason to keep it open.