
Wikimedia Technical Conference 2019 Session: Standardizing QA best practices
Closed, ResolvedPublic

Assigned To: Etonkovidova
Authored By: debt, Oct 4 2019

Description

Session

  • Track: Standardization Decisions
  • Topic: Standardizing QA best practices

Description

This session will be a short presentation of how QA processes are currently integrated into development workflows, followed by an open discussion.

  1. What makes software good, and why QA support is important
    • Three practical principles of good software (understanding the problem, simple and intuitive design, seeing users' perspective)
    • Context-specific testing approach
  2. Hands-on exploratory testing exercise (Special:RecentChanges page)
    • Evaluate what should be tested
    • Outline how the features may be tested
  3. Current practices
    • What are QA processes?
      • Support for a team's workflow
      • A daily track of QA work
  4. How QA processes are integrated into the development cycle (and why)
    • Planning/Design
    • Implementation
    • Deployment
    • Maintenance

Questions to answer and discuss

Question: How can we ensure QA processes are consistently and effectively integrated into development workflows?

Question: What are our current best practices?

Related Issues

  • What is missing in team processes regarding QA support?
  • Get feedback on questions that should have been discussed but weren't.

Pre-reading for all Participants

  • None

Notes document(s)

Slides: Exploratory testing

https://etherpad.wikimedia.org/p/WMTC19-T234653

Notes and Facilitation guidance

https://www.mediawiki.org/wiki/Wikimedia_Technical_Conference/2019/NotesandFacilitation


Session Leader(s)

Session Scribes

  • Brennen

Session Facilitator

  • Aubrey

Session Style / Format

  • Presentation/demo of QA practices and hands-on exercise

Session Leaders please:

  • Add more details to this task description.
  • Coordinate any pre-event discussions (here on Phab, IRC, email, hangout, etc).
  • Outline the plan for discussing this topic at the event.
  • Optionally, include what this session will not try to solve.
  • Update this task with summaries of any pre-event discussions.
  • Include ways for people not attending to be involved in discussions before the event and afterwards.

Post-event summary:

  • Shared understanding on key issues related to current QA practices
  • Should UI automation testing be part of exploratory testing?

Post-event action items:

  • Continue discussion of QA practices in a broader context of Engineering productivity team
  • Start a discussion of specialized types of testing (e.g. accessibility): how can they be made part of QA testing?
  • Document what instrumentation/tools/skills/practices are critical for efficient exploratory testing

Event Timeline

debt created this task.Oct 4 2019, 3:42 PM
kaldari updated the task description. (Show Details)Oct 5 2019, 1:16 AM
kostajh added a subscriber: kostajh.

I am interested in us establishing some kind of guidelines, and eventually having a tool one can run against a code base to check for those guidelines and receive recommendations and links to tutorials/docs on how to meet our expectations. For example, for a MediaWiki extension, a guideline would be to use JSDoc for JavaScript documentation. A tool could detect whether JSDoc is in use in the repository; if not, it could automagically add it to the dependencies, add some basic configuration, and point to a document on how to use JSDoc.

Or is the session more on the workflows and social aspect of development?
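The guideline-checker idea above could look roughly like this minimal sketch. The file names checked (package.json, jsdoc.json, .jsdoc.json) and the recommendation wording are illustrative assumptions, not an actual Wikimedia tool:

```python
# Hypothetical sketch of a guideline checker: scan a repository for a
# JSDoc setup and return recommendations if anything is missing.
# File names and recommendation text are assumptions for illustration.
import json
from pathlib import Path


def check_jsdoc(repo: Path) -> list[str]:
    """Return a list of recommendations for the given repository."""
    recommendations = []
    pkg_file = repo / "package.json"
    if not pkg_file.is_file():
        return ["No package.json found; skipping JavaScript checks."]
    pkg = json.loads(pkg_file.read_text())
    # Merge runtime and dev dependencies to see whether jsdoc is declared.
    deps = {**pkg.get("dependencies", {}), **pkg.get("devDependencies", {})}
    if "jsdoc" not in deps:
        recommendations.append(
            "Add jsdoc to devDependencies; see the JSDoc documentation."
        )
    if not any((repo / name).is_file() for name in ("jsdoc.json", ".jsdoc.json")):
        recommendations.append("Add a basic JSDoc configuration file.")
    return recommendations
```

A real tool would presumably also offer to apply the fixes (add the dependency, write a starter config), as described above.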

@Volans, @Etonkovidova - Would either (or both) of you be interested in leading this session?

kaldari updated the task description. (Show Details)Oct 9 2019, 3:40 AM
Volans added a comment.Oct 9 2019, 1:27 PM

@kaldari thanks for the offer, but I think this deserves some preparatory work that I cannot commit to at the moment; I'm sorry.

@Volans, @Etonkovidova - Would either (or both) of you be interested in leading this session?

Sure, it could be a short presentation of how QA processes are integrated into teams' workflow now, and then, an open discussion.

@Etonkovidova - That sounds great. I'll change the title and description to focus on the QA aspect. Please add yourself as the session leader above and feel free to add more details to the description (or change it completely).


Thanks!

kaldari renamed this task from Wikimedia Technical Conference 2019 Session: Standardizing best practices to Wikimedia Technical Conference 2019 Session: Standardizing QA best practices.Oct 10 2019, 10:42 PM
kaldari reassigned this task from kaldari to Etonkovidova.
kaldari updated the task description. (Show Details)
kaldari updated the task description. (Show Details)Oct 11 2019, 2:36 PM

I am interested in us establishing some kind of guidelines, and eventually having a tool one can run against a code base to check for those guidelines and receive recommendations and links to tutorials/docs on how to meet our expectations. For example, for a MediaWiki extension, a guideline would be to use JSDoc for JavaScript documentation. A tool could detect whether JSDoc is in use in the repository; if not, it could automagically add it to the dependencies, add some basic configuration, and point to a document on how to use JSDoc.
Or is the session more on the workflows and social aspect of development?

As far as I understand it, this session is more about high level QA practices, not about tooling. Some tooling could be a part of the recommended practices, of course.

debt triaged this task as Medium priority.Oct 22 2019, 7:03 PM
greg added a comment.Oct 23 2019, 9:39 PM

(Programming note)

This session was accepted and will be scheduled.

Notes to the session leader

  • Please continue to scope this session and post the session's goals and main questions into the task description.
    • If your topic is too big for one session, work with your Program Committee contact to break it down even further.
    • Session descriptions need to be completely finalized by November 1, 2019.
  • Please build your session collaboratively!
    • You should consider breakout groups with report-backs, using posters / post-its to visualize thoughts and themes, or any other collaborative meeting method you like.
    • If you need to have any large group discussions they must be planned out, specific, and focused.
    • A brief summary of your session format will need to go in the associated Phabricator task.
    • Some ideas from the old WMF Team Practices Group.
  • If you have any pre-session suggested reading or any specific ideas that you would like your attendees to think about in advance of your session, please state that explicitly in your session’s task.
    • Please put this at the top of your Phabricator task under the label “Pre-reading for all Participants.”

Notes to those interested in attending this session

(or those wanting to engage before the event because they are not attending)

  • If the session leader is asking for feedback, please engage!
  • Please do any pre-session reading that the leader would like you to do.
debt updated the task description. (Show Details)Oct 25 2019, 9:18 PM
Etonkovidova updated the task description. (Show Details)Nov 8 2019, 8:05 PM
kaldari updated the task description. (Show Details)Nov 11 2019, 5:08 PM
Etonkovidova updated the task description. (Show Details)Nov 13 2019, 2:10 PM

Notes from Etherpad:

Wikimedia Technical Conference
Atlanta, GA USA
November 12 - 15, 2019

Session Name / Topic
Standardizing QA best practices
Session Leader: Elena; Facilitator: Aubrey; Scribe: Brennen
https://phabricator.wikimedia.org/T234653

Session Attendees:
Niharika Kohli, Željko Filipin, Volker Eckl

Notes:

  • Elena: QA engineer at the Foundation, working at WMF for 5 years.  Every project and every team has different deadlines and different processes
  • Exploratory testing: Opinions differ widely on exploratory testing - some think it's manual repetition of checklists, boring work that nobody wants to do. Others think that exploratory testing is somewhat randomly going from page to page and clicking around. It's neither of those things.
  • [2] The mystery of exploratory testing
  • What needs to be tested?
    • Functionality
    • Languages
    • Accessibility
    • Performance
    • Browsers/devices...
  • [3] When QA support is needed
    • Planning
    • Design
    • Implementation
    • Deployment
    • Maintenance
  • Why is QA needed in planning?
    • Volker: For the MVP to define what the most important things are
    • E: Not enough to have UX designer involved?
    • V: There's still uncharted territory in the product that a QA person has an easier time identifying.  A QA person comes more from a user perspective and less from a planner's or designer's view.
    • E: And that user perspective would be different from UX designer because QA from previous experience has dealt with many bugs from user feedback.  It's also beneficial for QA to evaluate the scope of future testing and risks involved in the product.
  • Design
    • Everything from above
    • Mockups
    • Ž: In a previous life I did a lot of exploratory testing - the value of a tester in the design phase is that devs and PMs are focused on making things work, and testers will have experience in the way that things tend to break.
    • E: How things fit in the broader picture of features, etc.  Software (especially at WMF) tends to grow organically, historically, and naturally.  Things should fit.  You can lose track of where you were (example: Kaiser Permanente).
  • Implementation & Deployment
    • E: Involvement of QA is less controversial here.
  • Maintenance
  • [4] E: When we have a total absence of QA support [tree swing cartoon, "How the customer explained it...", etc.]
  • [5] What makes software good?
  • [6] 3 principles
    • Understand the underlying problem before attempting to solve it
    • Things should be simple and intuitive
    • Acknowledge that a user is not like you
    • E: Important to have a broader perspective
    • There will be an exercise in exploratory testing.
  • [7/8] "Happy families are all alike; every unhappy family is unhappy in its own way"
    • "Anna Karenina principle"
    • Above, but s/families/software/g
  • [9] How we communicate failures to users
  • [10] Cheerfully informing users about new features
  • [11] No matter what...
    • Internal error
  • [12] "Something went wrong"
  • ...more slides...
  • Discussion of maybe-googleable error messages and their relative utility
  • These slides cover a spectrum of quality in communicating errors to users
  • Ž: Do you have a collection of these?
  • E: Yes I do.
  • [Exploratory testing exercise]
    • Special:RecentChanges ( https://en.wikipedia.beta.wmflabs.org/wiki/Special:RecentChanges )
    • Any language wiki you're familiar with
    • No need to log in
    • Look at page from POV of exploratory testing.  Especially look at functionality of features.
    • Write 3 or 4 points of what should be tested.
    • Try to be in the mindset of exploratory testing.
    • Niharika: I found a bug.
      • Scrollbars under Safari.
    • E: What if something would cancel the whole implementation of the feature?
    • Ž: Can you give us more guidance?
    • E: I have 3 points I will present in 5 minutes.
    • E: This is a typical situation - you're presented with redesigned filters here - what do you think we should test?
    • Ž: I'm lacking context to be useful.
    • E: So that's the first question - would you need documentation, specs?  Very often we're lacking both.  This is why it's important to participate in planning and design so you have context.
    • [discussion of filter features]
    • [discussion of RecentChanges use cases in finding vandals, etc.]
    • Ž: [running brain dump on testing process]
    • N: Different clients...
    • E: Natural testing.
    • V: I just found an error in highlighting.
  • E: So we see here the enormous set of possibilities for testing.
  • [slide] What needs to be tested...
    • ORES filters - thresholds
    • Filter combos - do we communicate sufficiently that some filter combinations don't make sense?  Some filters shouldn't produce any results.
      • Combination of ORES filters and logged actions
      • [demo of this and resultant error message]
    • The old filter functionality - all previous functionality is preserved?
  • [slide] The mystery of exploratory testing
  • [slide] Phabricator screenshot - workboard including Code Review, QA, Design Review, ... columns
  • [slide] Task with specs - checkboxes for various components of design
  • [slide] Daily tracking spreadsheet with tasks and needs / actions / followups
  • [slide] Deployments calendar
  • E: How is exploratory testing done in regards to deployments?
    • [slide] Trains
    • [slide] Changelog - auto-generated for each release
    • [slide] Logstash / Kibana fatal monitor
    • [slide] Phabricator bug triage
      • Especially important to have regular bug triage meetings - highly structured feedback
  • [slide] QA engineer needs
    • provide helpful perspective
    • be satisfied as an information provider, not a gatekeeper
    • to adapt to an iterative methodology
    • to function with a minimum of formal specs
  • [slide] Context driven testing
    • start from project's needs
    • design to fit project
    • some practices might be irrelevant or counterproductive
    • test practices can't be independent of context
    • takes needs of the specific project / situation into account
    • context-driven-testing.com - "there're no best practices" - flexibility is key
  • E: Last, what didn't I talk about?
  • Ž: 2 comments
    • Context-driven testing is a school of software testing - there are 4 or 5 of these, you would have gotten different stuff from different schools of thought
      • Factory school, quality assurance school, analytical school - all have different ideas about software testing
    • Keep in mind that this is just part of what software testers think
    • Exploratory testing is just one of the tools -  a big practice of the context-driven school
  • N: Would also depend on the project
  • Ž: Yeah, or the industry - NASA for example with very rigorous needs or something blows up
  • V: Agile?
  • V: Would be interested in perspective on tester needs - we have very little insights about how QAs are going about testing - would love to see accessibility testing part of QA
  • [discussion of accessibility, utility of human perspective in workflow]
  • Ž: Usability testing is a specialization.
  • E: We should think about it.
  • E: 2 things:
    • Should automation be part of exploratory testing?
    • Instrumentation / tools
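The filter-combination point from the exercise above (some combinations of ORES and logged-action filters can never match the same change, and the UI should communicate that) lends itself to a small test-matrix sketch. The filter names and the incompatibility pairs here are illustrative assumptions, not the actual Special:RecentChanges filters:

```python
# Sketch of enumerating filter combinations for exploratory testing of
# RecentChanges-style filters. Filter names and the incompatibility
# pairs below are assumptions for illustration only.
from itertools import combinations

FILTERS = ["ores:likely-bad", "ores:likely-good", "logged-actions", "newcomers"]

# Pairs that, by design, can never match the same change.
INCOMPATIBLE = {
    frozenset(["ores:likely-bad", "logged-actions"]),
    frozenset(["ores:likely-good", "logged-actions"]),
}


def expected_empty(combo):
    """True if this filter combination should return no results."""
    return any(pair <= set(combo) for pair in INCOMPATIBLE)


# Flag every two-filter combination a tester should expect to come back
# empty - and then check that the UI communicates why it is empty.
matrix = [(combo, expected_empty(combo)) for combo in combinations(FILTERS, 2)]
```

Enumerating the matrix up front makes the "enormous set of possibilities" concrete and lets the tester prioritize which combinations to exercise by hand.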

Context-driven testing and testing best practices - the discussion topic

Why QA should be involved in Planning & Design

greg closed this task as Resolved.Dec 17 2019, 10:58 PM

Thanks for making this a good session at TechConf this year. Follow-up actions are recorded in a central planning spreadsheet (owned by me) and I'll begin farming them out to responsible parties in January 2020.