In a recent thread, @jdlrobson suggested running a survey to further explore why maintenance of browser-test suites is waning among teams.
> would it make sense to do a survey as you did with Vagrant to understand how our developers think of these? Such as who owns them... who is responsible for a test failing... who writes them... who doesn't understand them.. why they don't understand them etc...?
Some other questions I can think of:
- How valuable are unit tests to the health/quality of a software project?
- How valuable are browser tests to the health/quality of a software project?
- How much experience do you have with test-driven development (TDD)?
- Would you like more time to learn or practice TDD?
- How often do you write tests when developing a new feature?
  - What kinds of tests? (% of unit tests vs. browser tests)
- How often do you write tests to verify a bugfix?
  - What kinds of tests? (% of unit tests vs. browser tests)
- When would you typically write a unit test?
  - Before implementation
  - After implementation
  - When stuff breaks
- When would you typically write a browser test?
  - During conception
  - Before implementation
  - After implementation
  - When stuff breaks
- What are the largest barriers to writing/running unit tests?
  - Test framework
  - Documentation/examples
  - Execution time
  - CI
  - Structure of my code
  - Structure of code I depend on
- What are the largest barriers to writing/running browser tests?
  - Test framework
  - Documentation/examples
  - Execution time
  - CI
- What are the largest barriers to debugging test failures?
  - Test framework
  - Confusing errors/stack traces
  - Documentation/examples
  - Debugging tools