
Wikimedia GSoD 2020 Proposal: Audience research and user experience
Closed, Declined · Public

Description

Personal Information
Name: Vrushti Mody
MediaWiki user: VrushtiMody
Location: India
Time Zone: UTC+5:30
Typical working hours: 17:00 - 1:00
Technical Blog: https://blog.vrushtimody.me/

Motivation
I was recently introduced to open source software and felt it was the best way to learn, explore, and contribute to the community. On digging deeper, I learnt about several open source initiatives, Google Season of Docs being one of them.
Upon reviewing the organisations, the project ‘Audience research and user experience’, which aims at understanding the wants and needs of the community, struck a chord with me. The Wikimedia community has been really supportive ever since I joined the chat and sent my first email.
I want to set foot in this amazing open source culture and hopefully learn and grow as a technical writer in the process.

A Pilot Survey to Define Utility of Technical Documentation for Wikimedia Users and Creating a Set of Recommendations for Better User Experience

Introduction
While we know that good technical documentation is essential to the community of volunteer developers, there is still a lot that we do not know about our audiences and their needs. Consumer behaviour has changed over time: today, users often search Google or watch YouTube videos instead of reading technical documentation. This project is meant to help us gain a better understanding of who is using technical documentation on MediaWiki and Wikitech, how they are responding to the current documentation, and what their wants and needs are for better technical support. The results of this project will help us build a technical content strategy that works for the users of Wikimedia and supports their requirements.

Objective
To understand the utility of the documentation.
To identify enhancements needed to make it more useful and robust.

Review of Literature
A preliminary review of the literature on technical documentation suggests:
Studying issues reported by clients
Support team members reference technical documents all the time, and they are also the first to receive user complaints. A cross-team communication channel therefore needs to stay open and allow quick back-and-forth data exchange.
Conducting periodic surveys to obtain user feedback

Methodology
The methodology I will follow has three phases.
Phase 1:
Identify a survey tool for the randomised trial
Define the sample size for the pilot study
Identify methods for data collection and for meeting the sample size
Identify inclusion/exclusion criteria for survey participants
Define valid and invalid survey responses
Phase 2:
Categorise the subjective data into broad groups so that the user experience can be improved for each category of user
Phase 3:

  • Analysis and observations

Standard statistical tools will be used for studying the outcomes

  • Discussion

Comparing the current findings with those available from the previous literature review will yield better insights and outcomes from the study.
Limitations:
The sample population is one of the biggest limitations of this survey. Other limitations may include biased questions in the survey, lack of data, etc.

Deliverables

Review Literature available
Identify survey tool to best meet the study objective
Define ways of reaching out to the target population to improve survey validity
Conducting a survey (following up with the users to get responses)
Observation and analysis of the data
Statistical evaluation of the data as applicable
Discussion and summary of the study

Mentors
Sarah R. Rodlund
Alex Paskulin

Zulip will be the primary mode of communication with my mentors. Wikimedia’s IRC channels and Email will be used for discussions with the community.
Discussions about specific tasks will happen in the comments section of the Phabricator tasks.

Discussion
This project is broadly divided into two phases:
Creating a survey for understanding the user preferences
Obtaining insights from the survey

  1. Creating a survey for understanding the user preferences

Why is it needed?
A technical documentation survey is integral to obtaining customer feedback about a project's technical documents. Not only does it provide an in-depth understanding of how the product is used, it also helps point out things that might not be evident initially. When making a product, we often get so involved in it that we are unable to look at it from different perspectives. This is where a technical documentation survey makes the job easier. Insights from customers are essential to determine the areas where improvement is needed. I plan on using one of the available questionnaires that best suits our needs, from sites like SurveyMonkey, QuestionPro, etc.

  2. Obtaining insights from the survey

Understanding Participant Demographics
Understanding and categorising the sample population based on demographics will help identify which categories of people need what kind of help with respect to the documentation.
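As a small illustration of this categorisation step (the role labels below are hypothetical, not actual survey data), responses to a demographic question can be tallied with Python's collections.Counter before any deeper analysis:

```python
from collections import Counter

# Hypothetical answers to "What is your primary role in the community?"
roles = ["extension developer", "bot operator", "extension developer",
         "template editor", "bot operator", "extension developer"]

# Tally responses, most frequent category first
print(Counter(roles).most_common())
```

Each category's share of the sample can then guide which documentation areas to prioritise.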
Preliminary Analysis
Preliminary analyses on any data set include checking the reliability of measures, evaluating the effectiveness of any manipulations, examining the distributions of individual variables, and identifying outliers.
This will be helpful for getting an initial understanding of the problems and identifying where existing solutions apply.
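One common outlier check that could be applied during preliminary analysis is Tukey's 1.5 × IQR rule. A minimal sketch in Python with only the standard library, over hypothetical response times:

```python
import statistics

def iqr_outliers(values):
    """Flag values outside 1.5 * IQR of the quartiles (Tukey's rule)."""
    q1, _, q3 = statistics.quantiles(values, n=4)  # lower and upper quartiles
    iqr = q3 - q1
    low, high = q1 - 1.5 * iqr, q3 + 1.5 * iqr
    return [v for v in values if v < low or v > high]

# Hypothetical "minutes spent searching the docs" responses
times = [4, 5, 5, 6, 7, 7, 8, 9, 10, 55]
print(iqr_outliers(times))  # the 55-minute response is flagged
```

Flagged responses would be reviewed by hand rather than discarded automatically, since an extreme answer can still be a valid one.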
Secondary Analysis
The secondary analysis will help give a broader perspective on the results of the survey. Two of the many tests that could be used are hypothesis testing and correlation analysis.
Hypothesis Testing: Hypothesis testing is a statistical way to check whether the results of a survey or experiment are meaningful: it estimates the odds that the observed results occurred by chance. If the results may well have happened by chance, the experiment is unlikely to be repeatable and so has little use. In the real world it is tough to get complete information about a population, so we draw a sample from it (here, the survey responses) and derive the same statistical measures from the sample. It is therefore important to determine whether there is enough statistical evidence in favour of a certain belief, or hypothesis, about a parameter. Hypothesis testing will be used to determine whether the survey results are statistically significant enough to confidently say that they are indicative of the user experience of the product in general.
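A minimal sketch of such a test, using only the Python standard library and hypothetical numbers (130 of 200 respondents rating the docs "useful", tested against a 50% baseline):

```python
import math

def proportion_z_test(successes, n, p0):
    """Two-sided one-sample z-test: is the observed proportion
    significantly different from the hypothesised proportion p0?"""
    p_hat = successes / n
    se = math.sqrt(p0 * (1 - p0) / n)  # standard error under the null hypothesis
    z = (p_hat - p0) / se
    # Two-sided p-value from the standard normal CDF (via math.erf)
    p_value = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))
    return z, p_value

# Hypothetical survey result: 130 of 200 respondents find the docs useful
z, p = proportion_z_test(130, 200, 0.50)
print(f"z = {z:.2f}, p = {p:.5f}")
```

A small p-value (conventionally below 0.05) would indicate that the observed proportion is unlikely to be due to chance alone.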
Correlational Analysis: Correlation analysis is a statistical method used to discover whether there is a relationship between two variables or datasets, and how strong that relationship is. It will be used to analyse the quantitative data gathered from the survey, to identify whether there are any significant connections, patterns, or trends between the variables.
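As an illustration, the following sketch computes Pearson's correlation coefficient in plain Python over hypothetical paired responses (years of MediaWiki experience vs. self-reported ease of finding information, on a 1–5 scale):

```python
import math

def pearson_r(xs, ys):
    """Pearson correlation coefficient for two equal-length samples."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Hypothetical paired survey responses
experience = [1, 2, 3, 5, 8, 10]   # years using MediaWiki
ease       = [2, 2, 3, 4, 4, 5]    # ease of finding information, 1-5
print(round(pearson_r(experience, ease), 2))
```

A value near +1 or -1 suggests a strong linear relationship; values near 0 suggest none. For the real survey, significance would also be checked before drawing conclusions.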

Tentative timeline

Community bonding period (7th - 25th August)
Analyze the project in detail with my mentors.
Discuss:
How often the tasks should be reviewed.
Share schedules and decide on a weekly/daily workflow.
Tools and resources that can be used.
Bi-weekly and daily project reports.
Create the required tasks and subtasks

Week 1 (26th August - 1st September)
Identify a survey tool for randomised trial (Phase 1)

Week 2 (2nd - 8th September)
Define the sample size for the pilot study (Phase 1)
Identify methods for data collection and meeting sample size (Phase 1)

Week 3 (9th - 15th September)
Identify inclusion/exclusion criteria for survey participants (Phase 1)
Define valid and invalid survey responses (Phase 1)

Week 4 (16th - 22nd September)
Categorise the subjective data into broad groups so that the user experience can be improved for each category of user (Phase 2)

Week 5 (30th September - 6th October)
Identify portals such as Slack channels, Google Groups, etc., through which to send the survey to relevant users (Phase 2)

Week 6 (7th - 13th October)
Host the survey (Phase 2)

Week 7 (14th- 20th October)
Follow up with the users. Collect information from sources other than the survey (Phase 2)

Week 8 (21st - 27th October)
Follow up with the users. Collect information from sources other than the survey (Phase 2)

Week 9 (28th October - 3rd November)
Follow up with the users. Collect information from sources other than the survey (Phase 2)

Week 10 (4th - 10th November)
Use standard statistical tools to study the outcomes of the survey. Conduct primary and secondary analyses (Phase 3)
Week 11 (11th - 17th November)
Use standard statistical tools to study the outcomes of the survey. Conduct primary and secondary analyses (Phase 3)

Week 12 (18th - 24th November)
Compare the current findings with those available from the previous literature review to derive better insights and outcomes from the study (Phase 3)
Week 13 (25th - 29th November)
Work on the project report

Post GSoD
Work as a volunteer for Wikimedia

Past Contributions
T181946 (Update Screenshots for the "Huggle" software in the user manual on mediawiki.org)
T29650: Update Install/Upgrade documents on mw.org: Manual:Upgrading and Manual

About Me
I am a final-year undergraduate student pursuing a B.Tech. in Computer Engineering at NMIMS University, Mumbai, India. I believe that open source projects are a great way of developing quality products by working together with other developers. I have always wanted to contribute to open source projects to give back to the community. Wikimedia supports many projects, such as Wikipedia, Wikibooks, Wiktionary, and Wikiquote, which I have been using for a long time. So while going through the GSoD organisations list, when I came across Wikimedia, I knew it was the organisation I wanted to apply to. Participating in GSoD will help me meet the community, exchange ideas, learn new skills, and grow as a technical writer. I see GSoD as an opportunity to play a small role in the open source community and contribute to the organisation that I love.
I am a curious person who enjoys figuring out the building blocks of the world and rearranging them to build something even better. I have experience working with Markdown, Git, GitHub, etc.
Contributing to this project would help me improve my technical writing skills and create better, more usable documentation for the community at large.

Other Information
To know more about me, you can visit my personal site: www.vrushtimody.me
All my projects are hosted on https://github.com/vrushti-mody

Bibliography:
[1] https://medium.com/level-up-web/technical-writing-as-a-part-of-user-experience-2cfd97554d09
[2] https://comtechp7.hypotheses.org/files/2017/10/Compelling-documentation-a-way-to-improve-the-user-experience-1.pdf (accessed 01/06/2020)
[3] https://comtechp7.hypotheses.org/files/2015/11/2013-OceaneFrancCeliaGoiset.pdf (accessed 09/06/2020)
[4] https://clickhelp.com/clickhelp-technical-writing-blog/usability-of-technical-documentation/
[5] https://www.deque.com/blog/user-documentation-important/ (accessed 12/06/2020)
[6] https://www.uxmatters.com/mt/archives/2016/08/creating-user-friendly-documentation.php (accessed 16/06/2020)
[7] https://medium.com/@tonichang03/making-a-better-user-experience-for-using-technical-documentation-9e8d2b38d392#:~:text=It%20provides%20an%20insight%20into,of%20their%20length%20and%20scope. (accessed 22/06/2020)
[8] https://infopros.com/software-documentation-and-training-case-study/ (accessed 24/06/2020)
[9] https://infopros.com/public-utility-case-study/ (accessed 29/06/2020)
[10] https://pressbooks.bccampus.ca/technicalwriting/chapter/casestudy-costpoorcommunication/ (accessed 30/06/2020)
[11] https://www.surveymonkey.com/r/SL3PGY2 (accessed 01/07/2020)
[12] https://www.questionpro.com/survey-templates/technical-documentation/ (accessed 04/07/2020)

Event Timeline

Nice work, @VrushtiMody! I think you’ve provided a good level of detail here, and I like that you’ve included information about your methodology. Here are a few suggestions:

The items listed under “Deliverables” seem to be a mix of goals, deliverables, and strategies. For example: “Understanding loopholes in existing documentation” seems like more of a goal, while “Use of wikitech-I for the survey” would fit better under strategy. Make sure the things listed as deliverables are concrete.

Additionally, I see that you’ve included a few quotes. Make sure to cite your sources.

Work on pop-ups such as rating the documentation on clicking the back or close button

We currently have a similar gadget enabled in the API namespace on mediawiki.org. See the bottom of https://www.mediawiki.org/wiki/API:Main_page for an example. T248892 discusses the data from that gadget and how we might improve participation. If you’re interested, I could see adding an element to your proposal about working on improvements to this gadget, but keep in mind that it’s important to keep your proposal focused and achievable.

Thanks Alex! Will make the changes accordingly

I have made the necessary changes

VrushtiMody renamed this task from Wikimedia GSoD 2020 Proposal to Wikimedia GSoD 2020 Proposal: Audience research and user experience.Jul 7 2020, 1:55 PM
srodlund subscribed.

@VrushtiMody Thank you for applying for GSOD and sharing your proposal with us. This year we were only able to accept one intern, so I am setting this task as declined for now.