
[Epic] Define manual testing & QA strategy in Codex
Closed, Resolved · Public

Description

Background

We should define our approach to conducting manual testing in Codex.

User stories

As a Codex designer and engineer, I want to rely on a set of manual testing principles and define concrete browser/OS testing combinations in order to ensure that we offer adequate testing coverage.

Draft Documentation link

Manual testing is documented in this Notion doc

Acceptance criteria

Details

Other Assignee
Sarai-WMDE

Related Objects

Event Timeline

STH changed the task status from Open to In Progress. May 26 2022, 12:31 AM

The latest version of the Manual Testing Guidelines is available and up for review in Notion. @EUdoh-WMF already provided very useful answers to the questions shared in the doc: thank you! I'd like to open the write-up for feedback from the rest of the team, especially @bmartinezcalvo and @Volker_E 🙏🏻

@Sarai-WMDE you've done great work documenting all the steps and processes to test our components; everything is well structured and documented in detail. I've provided some feedback via Notion comments, but those are small things to update. Great documentation ✨

Very much agreed, a great rundown @Sarai-WMDE, that should live in a public location.

A big thanks to the team for meeting up to discuss some open questions about the types of testing performed at different stages of design and development, who carries out these tests, and proactive ways of discussing quality from the initiation of an epic/task.

Now that we have a strategy, it will be great to see it in action.

Hey all, great work on this! Just had a couple of questions before we close/resolve:

NBaca-WMF subscribed.

Setting Owner to Desiree for Product Sign-off

Hi @ldelench_wmf ,

  • I see only @Sarai-WMDE's question about having a test environment for testing the languages identified by Nat and Volker. Is that what you are referring to?
  • It would be fantastic to put this up on MediaWiki, in my opinion. Any further changes/additions to this can be done on the wiki. Would be nice to get other opinions about this.

Almost there. Has the whole team signed off on this yet? Can we all sign off before I close?

May need a spin-off task for a smaller checklist of activities that designers always have to do when signing off tasks.

Can we get feedback from other teams here as well? This is something that we likely need to align on across teams.

Reviewed & added one tiny comment to update a Phab link. Ticket representing feedback gathering from other teams coming presently (out of scope for this ticket).

@EUdoh-WMF sorry I was vague; I was referring to these questions (sounds like we just want to use BrowserStack, but I wanted to confirm there were no other open questions to be captured in follow-up tasks):

  • Coverage: In order to simplify, I reduced the original matrix (which combined several browsers, such as Opera and Edge, with several operating systems) to the minimum, leaving one WebKit, one Chromium, and one Quantum browser representative and not specifying further: would this be enough? Opera and Edge were left out, although they are explicitly supported by Wikimedia projects (see compatibility matrix).
    • If we can use BrowserStack to check this, it would be great.
  • Which tool should we use to test in different browser versions? (can only think of BrowserStack)
    • BrowserStack is fine.
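As a rough illustration of the reduced matrix idea, the coverage could be captured as data from which a BrowserStack configuration or test-plan checklist is generated. The specific browser names below are placeholders for illustration, not a decided policy:

```javascript
// Hypothetical sketch of a reduced browser coverage matrix: one
// representative per rendering engine, as discussed above. The exact
// browser choices are placeholders, not an agreed-upon policy.
const coverageMatrix = [
  { engine: 'WebKit', browser: 'Safari' },
  { engine: 'Chromium', browser: 'Chrome' },
  { engine: 'Quantum', browser: 'Firefox' },
];

// "One representative per engine" means no engine appears twice.
function hasOneRepresentativePerEngine(matrix) {
  const engines = matrix.map((entry) => entry.engine);
  return new Set(engines).size === engines.length;
}

console.log(hasOneRepresentativePerEngine(coverageMatrix)); // true
```

Keeping the matrix as data would also make it easy to add Opera or Edge back later without changing the surrounding tooling.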

Reviewed Manual Testing and added some small things to fix in the Notion page. But, in general, all the info makes sense to me, so +1 on moving forward.

(Added Notion page link in the task description to find it easily in the task)

+1 on moving forward with the manual testing process as outlined on Notion

I'd like to propose breaking our manual testing guidelines into two parts:

  • testing Codex components in isolation, using the Codex docs site (via Netlify deploy previews)
  • testing Codex components within MediaWiki, using the VueTest extension (via MW-Docker and @Mhurd's build script)

I think most of the steps outlined in the Notion doc would apply to both types of tests. But there would also be some differences:

  • Testing in isolation could happen as part of the initial review for a component patch, before that patch gets merged. This is hopefully where most issues would be caught.
  • After a component is merged but before it is included in the next release, we could do a second round of testing inside a MediaWiki environment. The tester would need to spin up a local MW environment that pulled from the main branch of Codex as opposed to the current release; Monte's script can automate this.

This second round of testing would hopefully be quicker than the first one, because at this point most functionality has already been tested. During the second stage, testing should focus on topics such as:

  • How does the component fit into existing skin styles (desktop, mobile, etc.)?
  • How does the component function with internationalized messages? (These can be easily provided inside a MW environment using the existing internationalization tools.)
  • Does the component conflict in any way with other UI elements present on a typical MW page (sidebar or navigation elements, etc.)?

Once a component passes this second round of tests (the "embedded in MW" tests), it could safely be included in the next release and used freely inside of MW.
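The two-round flow above amounts to a release gate: a component is release-eligible only once both rounds have passed. A minimal sketch, with invented field names for illustration:

```javascript
// Hypothetical model of the proposed two-stage testing gate: a component
// is eligible for the next Codex release only after both the isolated
// (docs-site) round and the embedded-in-MediaWiki round have passed.
function readyForRelease(component) {
  return component.isolatedTestsPassed && component.mwEmbeddedTestsPassed;
}

console.log(readyForRelease({
  name: 'ExampleComponent',
  isolatedTestsPassed: true,
  mwEmbeddedTestsPassed: true,
})); // true

// Passing only the first round is not enough:
console.log(readyForRelease({
  name: 'ExampleComponent',
  isolatedTestsPassed: true,
  mwEmbeddedTestsPassed: false,
})); // false
```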

This is an interesting idea @egardner, thanks for sharing.

Considering the integration of a component in a MW environment, and simply because of the "unknown" variables that are present, I would imagine the tests performed at this second stage may be more extensive than at the first stage! Just thinking about it, I would want to carry out pretty much the same tests as I did in isolation (to verify the component still plays nicely in its new environment), as well as the other tests you have rightly mentioned that cannot be tested on the demo site.

Notwithstanding, this will indeed give us some extra confidence for the component to make the next release and I am really looking forward to it.

Thanks for this plan @egardner! At a high level it looks good to me, but I would add two things:

First, we now have support for Codex in PatchDemo, which allows you to create a test MediaWiki instance with the latest main branch of Codex, or with an unmerged patch in Codex, or even with an unmerged Codex patch and an unmerged VueTest patch applied at the same time (e.g. a Codex patch adding a new component, and a VueTest patch adding a demo for that component). This means that we could also do testing in the MediaWiki environment before a patch is merged, if we want to do that (or we could decide that we only want to do this second round of testing once the component is merged and the dust has settled on follow-up changes, which PatchDemo also supports).

Setting up a test wiki on PatchDemo is easier than setting up a local MW-Docker environment, since it can be done through the browser, but it's a bit less flexible when it comes to testing different MW configuration settings, since there's no built-in way to change config.
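PatchDemo's ability to combine unmerged patches comes with a constraint (stated in the guide below as well): each combined change must come from a different repository. A small sketch of that rule, with a hypothetical patch shape for illustration:

```javascript
// Sketch of PatchDemo's constraint that multiple unmerged patches can be
// combined only when each comes from a different repository. The patch
// object shape and change numbers here are invented for illustration.
function canCombinePatches(patches) {
  const repos = patches.map((patch) => patch.repo);
  return new Set(repos).size === repos.length;
}

// A Codex patch plus a VueTest patch is fine:
console.log(canCombinePatches([
  { repo: 'design/codex', change: 123456 },
  { repo: 'VueTest', change: 654321 },
])); // true

// Two Codex patches cannot be applied together:
console.log(canCombinePatches([
  { repo: 'design/codex', change: 123456 },
  { repo: 'design/codex', change: 234567 },
])); // false
```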

Second, I think we should do a general round of pre-release testing in an MW environment right before we do a release, to ensure that the new release doesn't introduce regressions. This would involve testing existing uses of Codex in MediaWiki extensions/skins and (re-)testing existing components in VueTest, rather than newly created ones.

A quick guide to using PatchDemo:

Go to https://patchdemo.wmflabs.org/. If you're not already logged in, click the "Sign in with OAuth" link, then click "Allow" (if you're also not logged in to the wikis, you'll be prompted to log in there first). Once you do that, you should see this screen:

Screenshot from 2022-07-21 14-10-53.png (926×1 px, 131 KB)

Expand the "Choose included repos" dropdown, and make sure the boxes for "design/codex" and "VueTest" are checked:
Screenshot from 2022-07-21 14-14-56.png (1×1 px, 210 KB)

If all you want to do is create a test wiki for the main branch of Codex (with all merged but unreleased changes, but without any unmerged changes), you can now click "Create demo" and skip the next step.
If you want to test an unmerged patch in Codex, paste the Gerrit link or the 6-digit Gerrit change number into the "Then, apply patches" text box. The text box should populate with the description of the Gerrit change. The tool will also detect which Phabricator tasks are associated with the patch, and will automatically post a link to your test wiki on all of those tasks unless you uncheck the checkbox for that task next to "Announce wiki on Phabricator".
Screenshot from 2022-07-21 14-18-49.png (362×1 px, 54 KB)

If you also need to test an unmerged VueTest change at the same time (e.g. because the Codex change you're testing creates a new component, and you also need to pull in the VueTest change that adds that component to the demo page), you can paste the Gerrit link for that change as well. PatchDemo lets you apply multiple unmerged changes, as long as each change is in a different repository (so you can test a Codex change together with a VueTest change, but you can't combine two Codex changes).
Once you're done, click the "Create demo" button near the bottom. This will take you to a progress screen that looks like this:
Screenshot from 2022-07-21 14-25-51.png (883×1 px, 192 KB)

Wait patiently for your wiki to be created; it takes about 4-5 minutes. You'll see this screen when it's done:
Screenshot from 2022-07-21 14-30-32.png (356×1 px, 37 KB)

Click the "Open wiki" button, which will take you to the main page of your test wiki:
Screenshot from 2022-07-21 14-31-44.png (563×914 px, 85 KB)

Finally, navigate to Special:VueTest, either by editing the URL (replacing Main_Page with Special:VueTest), or by typing Special:VueTest into the search bar in the top right and pressing enter.
Screenshot from 2022-07-21 14-32-51.png (548×1 px, 77 KB)

You can share the URL in the address bar with others, and a bot may already have posted the URL to the test wiki on the related Phabricator task (if there is an associated task, and if you didn't uncheck the box). The test wiki I created in this example is here: https://patchdemo.wmflabs.org/wikis/bc7eb58fc8/wiki/Special:VueTest
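The URL-editing step above is a simple substitution of the page title. A sketch, using the example wiki URL from this thread:

```javascript
// Turn a PatchDemo test wiki's main page URL into its Special:VueTest URL
// by swapping the page title, as described in the steps above.
function vueTestUrl(mainPageUrl) {
  return mainPageUrl.replace('Main_Page', 'Special:VueTest');
}

console.log(vueTestUrl(
  'https://patchdemo.wmflabs.org/wikis/bc7eb58fc8/wiki/Main_Page'
));
// → https://patchdemo.wmflabs.org/wikis/bc7eb58fc8/wiki/Special:VueTest
```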

> Second, I think we should do a general round of pre-release testing in an MW environment right before we do a release, to ensure that the new release doesn't introduce regressions. This would involve testing existing uses of Codex in MediaWiki extensions/skins and (re-)testing existing components in VueTest, rather than newly created ones.

Maybe we can batch up components for testing inside of MW before they get included in a release. Then only a single update would need to be made to VueTest to showcase all the new components being tested, and that setup could either be deployed to the cloud or spun up locally (and I'm all for anything that makes life easier for folks who are testing, so cloud deployment of the environment sounds great).

Ideally this workflow could be reflected in our Phabricator process; a set of "ready to release" tickets would accumulate, these could periodically be batched for testing inside of MW, and once all tests pass we cut a new Codex release and close out the tasks.

I'd like to propose that we add a new "acceptance criteria" item: whatever guidelines we settle on should live in a public and permanent location – either on one of the team's MediaWiki pages or in the Codex docs site.

I think in general we should try to consolidate our documentation in these two places.

> I'd like to propose that we add a new "acceptance criteria" item: whatever guidelines we settle on should live in a public and permanent location – either on one of the team's MediaWiki pages or in the Codex docs site.
>
> I think in general we should try to consolidate our documentation in these two places.

In general, I would suggest that anything that is specific to Codex live on the Codex docs site (which means it also lives in the Codex code, for folks who prefer to read documentation there), and anything that's general for the Design System Team's work lives on mediawiki.org. If we need to mention, say, something about Codex on mediawiki.org, we should mention it and link to the Codex docs site, rather than duplicating information.

There will probably be some circumstances where we need to duplicate information (e.g. the task tracking info), but perhaps we could follow the above rule in most cases.

egardner renamed this task from Define manual testing strategy for visual/design QA of Codex to Draft documentation for manual testing strategy in Codex. Aug 1 2022, 8:56 PM
egardner renamed this task from Draft documentation for manual testing strategy in Codex to [Epic] Define manual testing & QA strategy in Codex .

Moving out from under https://phabricator.wikimedia.org/T314082, which is meant to house spike tickets for future capabilities. Will be taking a closer look at the subtasks of this and reconfiguring.

Once T318842: Enable embedding of Codex component demos inside of VueTest MediaWiki extension is complete, I will update some documentation here with the updated workflow for testing a new Codex component inside a MW environment (using PatchDemo to avoid having to maintain a local wiki instance). Then I think we can consider this task to be complete.

I will make another pass over the current guidelines now that T318842: Enable embedding of Codex component demos inside of VueTest MediaWiki extension has been resolved. This means that anyone who wants to test out a new component inside of MediaWiki will be able to spin up a PatchDemo instance where the component demos from Codex show up on the Special:VueTest wiki page. DST engineers will have to do a small amount of work to add new component demos as they are introduced upstream, but the code from Codex can be re-used here without modification.

I've added a new section to the testing guide: Testing in MediaWiki. @EUdoh-WMF can you take a look and let me know if anything here is unclear? If that looks good to you, then I think we can try to pick up T312113: QTE team review & publishing: manual testing guidelines (unless we no longer think that step is necessary). Then we can publish these guidelines somewhere and finally close this task!

@egardner This is super clear. I wouldn't change a thing! Great work.

Moving this to blocked until it's clear that all necessary cross-team review / approval is complete.

ldelench_wmf claimed this task.
ldelench_wmf added a subscriber: Jrbranaa.

V1 of the manual testing guidelines can be found here: https://www.mediawiki.org/wiki/Design_Systems_Team/Codex_Manual_Testing_Guidelines.

We will create followup tasks to address 1) the low-lift cleanup items, such as fixing errors; 2) the more fundamental concerns surfaced in T312113#8478193, in partnership with @Jrbranaa + QTE team.

Hello there! The manual testing guidelines were originally drafted to support all system contributors. In order to keep these accessible for everyone, we might want to provide them (or a concise version, even perhaps just a link to them) from Codex's demo site. What do you all think?

My apologies if this has already been discussed/decided internally. Thanks, DST!

> Hello there! The manual testing guidelines were originally drafted to support all system contributors. In order to keep these accessible for everyone, we might want to provide them (or a concise version, even perhaps just a link to them) from Codex's demo site. What do you all think?
>
> My apologies if this has already been discussed/decided internally. Thanks, DST!

Totally agree. I think the Manual testing guidelines should be added to the Codex demo site too, as we have the Contributing guidelines there. Maybe we should add a "Testing components" section.