Writing Tests that Work
Your team realizes that testing is valuable and starts by creating tests for already completed features. The results are encouraging, and the team even finds a few bugs hiding in the application. The team then returns to feature development and adds new tests along the way. Features take longer to complete because of the corresponding tests, but the team remains committed to testing. However, some tests keep breaking, and some aspects of the application seem too difficult to test. The test suite begins to be neglected, and trust in the tests fades. What went wrong? (Be sure to also catch part one of this two-part automated testing series – Why Automated Testing Matters.)
Just like poorly written code, poorly written tests become an impediment to progress. In fact, bad tests can be worse than no tests at all, because they make changes harder rather than easier. Tests should facilitate faster development, so let's look at some keys to writing effective and maintainable tests.
- Tests Should Pass
- Tests Should Always Pass
- Tests Should Be Fast
- Add New Tests When Adding New Code
- Tests Should Be Independent of Implementation Details
- Tests Should Be Independent of Each Other
- Track Your Testing Code Coverage
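To make one of these principles concrete, here is a minimal sketch of a test that is independent of implementation details. The `Stack` class is hypothetical, invented only for illustration; the point is that the test asserts on public behavior rather than internal state.

```python
class Stack:
    """Toy stack used only for illustration."""

    def __init__(self):
        self._items = []  # internal detail; tests should not inspect it

    def push(self, item):
        self._items.append(item)

    def pop(self):
        return self._items.pop()

    def __len__(self):
        return len(self._items)


def test_pop_returns_last_pushed_item():
    # Good: asserts only on the public interface (push, pop, len).
    stack = Stack()
    stack.push("a")
    stack.push("b")
    assert stack.pop() == "b"
    assert len(stack) == 1


test_pop_returns_last_pushed_item()
```

A brittle alternative would assert directly on `stack._items`; that test would break if the internal list were swapped for a linked list, even though the stack's behavior had not changed.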
It is important to remember that working software is the goal of software development; testing is one practice that helps achieve it. Teams should regularly consider how well their test suite is helping them deliver a quality product to users. When testing is done effectively, the benefits should far outweigh the costs in the long term.
To read more about how to test well, check out my full post on the SitePen blog.