Breakpoint at BrowserStack Conferences (2020-2021) - Part 1: Focus on Automation

“Improving daily work is even more important than doing daily work.”
― Gene Kim, The Phoenix Project: A Novel About IT, DevOps, and Helping Your Business Win

I attended two Breakpoint (BrowserStack) conferences - the first in July 2020 and the second in March 2021. The list below includes 6 of the 18 presentations that I watched at the Breakpoint conference in July. The choice of which presentations to attend was based mostly on my time availability, so there could be some really interesting presentations that I missed.

Looking at the presentations' titles, it's easy to figure out that the conference's main theme was test automation. Under the motto "Delivering quality software at speed", the presenters shared their understanding of why test automation is important and gave some insights into valuable lessons they learned from implementing it in their organizations.

  • "Transforming Test methodologies: from manual to robust strategies" - Todd Eaton, Head of DevOps, The Weather Company
  • "WebdriverIO: The next gen Automation Test framework you should know about" - Kevin Lamping, Senior Front-End Engineer, WebdriverIO
  • "Automating to augment testing" - Alan Richardson (Evil Tester), Coach & Trainer, Agile Testing and Development
  • "Firefox and the push for simpler cross browser testing" - Maja Frydrychowicz, Senior Software Engineer, Mozilla
  • "How we reconstructed our releases at Trivago" - Benjamin Bischoff, Test Automation Engineer, Trivago
  • "Modern automation approaches" - Priyanka Halder, Sr. Manager, Quality Engineering, GoodRx

Although each presentation could probably be the topic of a separate blog post, I'll focus here on the ideas that resonated with my own thoughts on QA processes and on the role automation testing plays in delivering quality software products.

Todd Eaton ("Transforming Test methodologies: from manual to robust strategies") strongly advocated QA-DevOps integration, stating that a QA engineer should be "a release boss" and "the one who pushes the deployment button". Such integration would help create an optimal framework for increasing test coverage which, in turn, would give immediate feedback on what state the software is in. Alan Richardson ("Automating to augment testing") fully supported that point of view. For him, automated testing is a subset of more general automation workflows - basically, a part of the dev tooling for build processes, pipelines and other scripting activities.

Re-thinking the place of QA automation, Alan Richardson showed a classic Agile test pyramid with "Manual testing" completely removed, replaced by a small cloud of "Exploratory testing" that crowns a pyramid whose layers are all dedicated to automated testing.

[Screenshot: Alan Richardson's version of the test pyramid, with "Exploratory testing" at the top]

What about a real-life success story of automation testing? Priyanka Halder ("Modern automation approaches") gave a detailed overview of how automated testing at GoodRx gradually evolved to play a vital role in ensuring the quality of software products. She mentioned 2,000 tests run for regression and about 600 tests that run whenever someone pushes a commit to the pipeline. That variety of tests allows QA engineers there to run A/B testing and campaign-based testing by quickly creating test suites based on features, devices, or even specific user audiences.
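Halder's idea of quickly assembling suites by feature, device, or audience can be sketched as simple tag filtering. The snippet below is a minimal illustration of the pattern only, not GoodRx's actual framework; the test names and tags are made up:

```python
from dataclasses import dataclass, field


@dataclass
class TestCase:
    name: str
    tags: set = field(default_factory=set)


# A hypothetical registry of tests, each tagged by feature, device, or campaign.
REGISTRY = [
    TestCase("test_price_lookup_desktop", {"feature:pricing", "device:desktop"}),
    TestCase("test_coupon_flow_mobile", {"feature:coupons", "device:mobile"}),
    TestCase("test_signup_campaign_mobile", {"campaign:spring", "device:mobile"}),
]


def build_suite(*required_tags):
    """Return the names of tests whose tags include every requested tag."""
    wanted = set(required_tags)
    return [t.name for t in REGISTRY if wanted <= t.tags]


# A device-based suite: every test tagged "device:mobile".
print(build_suite("device:mobile"))
# → ['test_coupon_flow_mobile', 'test_signup_campaign_mobile']
```

In practice the same effect is usually achieved with a test runner's built-in tagging (e.g. pytest markers or TestNG groups) rather than a hand-rolled registry; the point is that well-tagged tests make feature-, device-, or audience-specific suites a one-line selection.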

Given the massive evidence of automation testing's successes, two questions remain:

  • what to automate (and how)?
  • what communication processes should a QA team implement to be proactive in automation testing?

Surprisingly, the two questions above are closely related. Benjamin Bischoff ("How we reconstructed our releases at Trivago") described a disconnected triangle or, in his words, "a triangle of doom", where the QA team functions separately from the developers and from the test automation team. He argued that "it's not about automation per se - it's about the QA process and the people involved in it".
Bischoff's practical advice - "Make your test automation user friendly; that would break the 'triangle of doom'" - finds support in Todd Eaton's presentation. Eaton described an automation architect role that would define the priorities for developing automation testing tools.
As for effective communication, Todd Eaton also emphasized that test management with an efficient and user-friendly reporting system must be in place to give immediate, easily digestible feedback to everyone - managers, DevOps engineers, developers and, of course, QA engineers. He demonstrated a QA dashboard with a sweeping overview of the status of all tests and the ability to drill down into a group of tests, or into a single test, to see detailed info.

Alan Richardson added to the above a checklist with important questions that should be asked before starting automation testing:

  • What can we automate?
  • What processes do I do, and would it help to automate some of them?
  • What am I not doing?
  • How do I balance the time spent on automation against the risk being mitigated?

Each of the questions requires a good analysis to grasp what's needed most at the moment and what's needed in the long term.

The lessons learned? As I watched the presentations and reflected on the most important takeaways, the two questions I wrote earlier remained valid, but they may be re-prioritized (and rephrased) by putting the communication processes (including a test management reporting system) first and adding "why (to automate)?" as the first question to ask:

  • what communication processes should a QA team implement to be proactive in automation testing?
  • why/what/how to automate?

And I completely agree with what Alan Richardson said in concluding his presentation: "Automating is driven by ideas. Don't let tools dictate what you automate."

Written by Etonkovidova on Mar 31 2021, 4:15 PM.
