
[EPIC] FY 24/25 SDS 2.1.1 POC Integration tests of 3rd Party experimentation engine solutions
Status: Open · Priority: High · Public

Description

This epic encompasses the work required to perform Proof of Concept (PoC) integration tests for various third-party experimentation engine solutions. The goal is to evaluate the feasibility, performance, and compatibility of these solutions within our existing infrastructure. The integration tests will ensure we can confidently make a build, buy, or install recommendation (T366191: [Sprint 14 GOAL] SDS 2.5.6 Make build, install or buy recommendation for experimentation engine license for review by Tajh; T335482: Investigate feature flagging/experimentation platforms) for an experimentation solution that meets our technical and business requirements and provides the desired experimentation capabilities.

This work supports the goals outlined in the FY 24/25 annual plan:

SDS Objective 2: Product managers can quickly, easily, and confidently evaluate the impacts of product features.

Key Result 2.1: By the end of Q2, we can support 1 product team to evaluate a feature or product via basic split A/B testing that reduces their time to logged-in user interaction data by 50%.
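The "basic split A/B testing" named in KR 2.1 ultimately rests on deterministic assignment of users to variants. As an illustrative sketch only (stdlib Python, not part of either vendor's SDK; the function and experiment names are hypothetical), hashing a stable user identifier together with the experiment name yields sticky, roughly even buckets:

```python
import hashlib

def assign_variant(user_id: str, experiment: str,
                   variants=("control", "treatment")) -> str:
    """Deterministically bucket a user: the same (user, experiment)
    pair always lands in the same variant, and buckets split ~evenly."""
    digest = hashlib.sha256(f"{experiment}:{user_id}".encode()).hexdigest()
    return variants[int(digest, 16) % len(variants)]

# Sticky assignment: repeated calls for the same user agree.
assert assign_variant("user-42", "new-banner") == assign_variant("user-42", "new-banner")
```

Both candidate engines implement this idea (plus targeting, rollout percentages, and exposure logging) behind their SDKs; the sketch is only meant to ground what the integration tests need to verify.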

Based on our preliminary research, we have selected two third-party options, GrowthBook and Statsig, to advance to integration testing.

KR/Hypothesis (Initiative)

This work began in Q3 of 23/24 with the SDS 2.5.6 Hypothesis:

If we establish the user and technical requirements, we can analyze available 3rd party experimentation solutions and custom build requirements in order to determine the best use of resources to implement an experiment flagging engine.

and will continue into FY 24/25 under the SDS 2.1.1 hypothesis:

If we create an integration test environment for the proposed 3rd party experimentation solution, we can collaborate practically with Data SRE, SRE, and QTE to evaluate the solution's viability within WMF infrastructure.

Goals

  • Evaluate at least two third-party experimentation engines to identify the most suitable solution.
  • Demonstrate that the selected engines can be integrated seamlessly with our existing infrastructure.
  • Verify the functionality and performance of the integrated solutions.
  • Document the integration process, challenges, and solutions for future reference.
  • Provide recommendations based on the test results for the final selection of the experimentation engine.

User Stories

Setting Up Testing Environments

As a <DevOps? SRE? Staff?> Engineer
I want to set up isolated testing environments for each selected experimentation engine
So that we can perform integration tests without affecting the production environment.
(Open question: is this possible without affecting the production environment?)

Acceptance Criteria:
  • Separate testing environments configured and operational.
  • Documentation of the setup process for each environment.

Integration of Experimentation Engine A (GrowthBook)

As a Software Engineer
I want to integrate GrowthBook with our existing infrastructure
So that we can evaluate its compatibility and functionality.

Acceptance Criteria:
  • Successful integration.
  • Basic functionality tests passed.
  • Initial performance benchmarks recorded.

Integration of Experimentation Engine B (Statsig)

As a Software Engineer
I want to integrate Statsig with our existing infrastructure
So that we can evaluate its compatibility and functionality.

Acceptance Criteria:
  • Successful integration.
  • Basic functionality tests passed.
  • Initial performance benchmarks recorded.

Functional Testing of Integrated Solutions

As a QA Engineer
I want to conduct comprehensive functional tests on each integrated experimentation engine
So that we can ensure they meet our requirements.

Acceptance Criteria:
  • Detailed functional test cases developed and executed.
  • Test results documented and analyzed.

Performance and Scalability Testing

As a QA Engineer
I want to perform performance and scalability tests on each integrated experimentation engine
So that we can evaluate their capabilities under load.

Acceptance Criteria:
  • Performance and scalability test plans developed and executed.
  • Test results documented and analyzed.
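The performance side of this testing can start very simply: time repeated flag evaluations and report latency percentiles. A minimal stdlib-only sketch, where `StubFlagClient` is a hypothetical stand-in for either vendor's SDK client (the real benchmark would call the actual SDK):

```python
import statistics
import time

class StubFlagClient:
    """Hypothetical stand-in for a vendor SDK's flag-evaluation call."""
    def is_enabled(self, flag: str, user_id: str) -> bool:
        return hash((flag, user_id)) % 2 == 0

def benchmark(client, n: int = 10_000) -> dict:
    """Time n flag evaluations and report latency percentiles in ms."""
    samples = []
    for i in range(n):
        start = time.perf_counter()
        client.is_enabled("new-banner", f"user-{i}")
        samples.append((time.perf_counter() - start) * 1000)
    samples.sort()
    return {
        "p50_ms": statistics.median(samples),
        "p99_ms": samples[int(n * 0.99)],
    }

results = benchmark(StubFlagClient())
```

Recording p50/p99 per engine in this form would make the "initial performance benchmarks" from the integration stories directly comparable in the final analysis.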

User Acceptance Testing

As a UX Designer
I want to perform user acceptance tests on each integrated experimentation engine's UI with key stakeholders from Product Analytics and Product Leadership
So that we can evaluate end-user satisfaction with the two options.

Acceptance Criteria:
  • User acceptance test plans developed and executed.
  • Test results documented and analyzed.

Documentation of Integration Processes

As a (Technical Writer?) Tech Lead
I want to document the integration processes, challenges, and solutions for each experimentation engine
So that we have a reference for future integrations and troubleshooting.

Acceptance Criteria:
  • Comprehensive documentation created for each integration process.
  • Documentation reviewed and validated by the engineering team.

Comparative Analysis and Recommendations

As a Product Manager
I want to create a comparative analysis report of the tested experimentation engines
So that we can make an informed decision on the best solution.

Acceptance Criteria:
  • Comparative analysis report completed.
  • Recommendations provided based on test results and analysis, in the form of a Decision Brief.

Overall Acceptance Criteria for Epic

  • Successful integration of at least two different third-party experimentation engines.
  • Comprehensive testing of each integrated solution, including functionality, performance, and scalability.
  • Detailed documentation of the integration process for each engine.
  • Comparative analysis report highlighting the strengths and weaknesses of each solution.
  • Maintenance plans drafted for each solution so that cost can be included in the analysis.
  • Supplier risk assessment completed so it can be included in the analysis.
  • Decision brief that makes a recommendation for the best-suited experimentation engine based on the PoC results.

Success metrics

  • All work is completed by August 20th, 2024.
  • Two successful integrations.
  • At least 90% of functional tests pass.
  • Response times within acceptable limits and throughput meeting or exceeding current infrastructure capabilities.
  • Each solution handles at least 1.5 times the expected peak load.
  • Two-week integration timeline per solution.
  • Decision brief has a high satisfaction score from stakeholders (e.g., at least 4 out of 5) and includes actionable recommendations.
  • At least 80% positive feedback from stakeholders about the integration process and outcomes.
  • Clear identification of the most cost-effective solution.

In scope

  • Isolated testing environment setup for each solution
  • Configuration of necessary infrastructure and tools to support the integration testing
  • Integration of each solution within existing infrastructure
  • Functional tests
  • Performance and scalability tests
  • Documentation of the integration process and learnings
  • Detailed reports on test cases, results, and analysis of learnings
  • Comparative analysis report that highlights the strengths and weaknesses of both options
  • Confident recommendation
  • Stakeholder feedback gathered throughout the process at a regular cadence
  • Key stakeholder user testing of integrated solutions
  • Removal of integrations from infrastructure after testing

Out of Scope

  • Production deployment (I think?)
  • User training
  • Long-term maintenance beyond the proof-of-concept phase
  • Developing custom features or significant modifications
  • Full-scale user testing
  • Cost negotiations with vendors
  • Addressing dependencies or integrations with systems beyond the PoC
  • Comprehensive security audit

Artifacts & Resources

FY24-25 SDS 2.1 KR Centralized Experimentation Capabilities
Link to diagrams and architecture and design docs (@phuedx please add links here)
