Microservices automated testing principles:
- The test automation pyramid (tests at different levels of granularity, with fewer tests the higher the level) can be used as a model for microservice test design: https://www.mountaingoatsoftware.com/blog/the-forgotten-layer-of-the-test-automation-pyramid
- The majority of automated tests should be at the unit test level
- Integration tests are executed at the API or service level
- A minimum set of automated E2E tests completes the automation test suite
- Smoke tests are run after deployment to the dev environment
- Regression tests are run in the QA environment
Unit test guidelines (solitary and/or social):
- Spring Boot microservices use Mockito, JMockit, WireMock and/or JUnit
- Unit tests should be small, specific and fast
- All external service calls and calls to the database should be mocked or stubbed
- Unit tests should test business logic and cover at least 75% of the code (JaCoCo could be used to measure coverage: https://www.mkyong.com/maven/maven-jacoco-code-coverage-example/)
- Model and controller classes could be excluded from coverage
- Unit tests should be automated and included in the build pipeline (all tests must pass, otherwise the build should fail)
- New features should not be delivered without unit tests
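As a sketch of a solitary unit test, the plain-Java example below stubs the database-facing dependency so only business logic is exercised. The `PriceService`, `PriceRepository` and SKU are hypothetical names invented for the illustration; in a real Spring Boot project the hand-rolled stub would typically be a Mockito mock and the check a JUnit assertion.

```java
import java.util.Map;

public class PriceServiceTest {

    // Dependency that would normally hit the database.
    interface PriceRepository {
        double basePrice(String sku);
    }

    // Business logic under test: applies a percentage discount to a base price.
    static class PriceService {
        private final PriceRepository repository;

        PriceService(PriceRepository repository) {
            this.repository = repository;
        }

        double discountedPrice(String sku, double discountPercent) {
            return repository.basePrice(sku) * (1 - discountPercent / 100.0);
        }
    }

    public static double runScenario() {
        // Stub the repository so the test never touches a real database
        // (with Mockito this would be mock(PriceRepository.class) + when/thenReturn).
        PriceRepository stub = sku -> Map.of("book-1", 100.0).getOrDefault(sku, 0.0);
        PriceService service = new PriceService(stub);
        return service.discountedPrice("book-1", 25.0);
    }

    public static void main(String[] args) {
        double price = runScenario();
        System.out.println(price == 75.0 ? "unit test passed" : "unit test FAILED");
    }
}
```

Because the dependency is stubbed, the test is deterministic and fast, which is what allows it to run on every build in the pipeline.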
UI-specific unit test guidelines:
- As the UI is usually a SPA, it can have automated unit tests against a stubbed backend (covering behavior, layout, usability, adherence to the corporate design, etc.)
- Automated and included in the pipeline
- Katalon Studio or Selenium could be used as tools
Integration tests guidelines (service level):
- Integration tests should test one or multiple components (microservices) at the API level
- The tooling could be Postman + Newman (SoapUI is an option, too), or Java code within the microservices
- Include all external API calls and database operations – slower than unit tests
- Include tests against boundary conditions, edge cases, and error conditions
- Test full business logic without UI
- Could be included in the deployment pipeline and triggered when a new change is introduced.
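As an illustration of an API-level test with no UI involved, the sketch below starts a stand-in HTTP endpoint using the JDK's built-in `com.sun.net.httpserver` and asserts on the response. The `/orders/42` path and its JSON payload are invented for the example; in practice this role is played by a Postman + Newman collection or by test code calling the real service.

```java
import com.sun.net.httpserver.HttpServer;
import java.io.IOException;
import java.io.OutputStream;
import java.net.InetSocketAddress;
import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;

public class OrderApiIT {

    // Starts a stand-in for the order service; port 0 picks a free port.
    public static HttpServer startStubService(int port) throws IOException {
        HttpServer server = HttpServer.create(new InetSocketAddress(port), 0);
        server.createContext("/orders/42", exchange -> {
            byte[] body = "{\"id\":42,\"status\":\"SHIPPED\"}".getBytes();
            exchange.getResponseHeaders().add("Content-Type", "application/json");
            exchange.sendResponseHeaders(200, body.length);
            try (OutputStream os = exchange.getResponseBody()) {
                os.write(body);
            }
        });
        server.start();
        return server;
    }

    // The integration check: call the API over HTTP and verify the response.
    public static String fetchOrder(int port) throws Exception {
        HttpClient client = HttpClient.newHttpClient();
        HttpRequest request = HttpRequest.newBuilder(
                URI.create("http://localhost:" + port + "/orders/42")).GET().build();
        HttpResponse<String> response =
                client.send(request, HttpResponse.BodyHandlers.ofString());
        if (response.statusCode() != 200) {
            throw new IllegalStateException("unexpected status " + response.statusCode());
        }
        return response.body();
    }

    public static void main(String[] args) throws Exception {
        HttpServer server = startStubService(0);
        int port = server.getAddress().getPort();
        try {
            String body = fetchOrder(port);
            System.out.println(body.contains("SHIPPED")
                    ? "integration test passed" : "integration test FAILED");
        } finally {
            server.stop(0);
        }
    }
}
```

Note that the test goes through real HTTP serialization and status-code handling, which is exactly the extra cost (and extra coverage) that makes integration tests slower than unit tests.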
E2E test guidelines (API and UI):
- Selenium, Katalon for UI and Postman, SoapUI for API
- Should be minimal as they are the slowest (the full E2E suite should not take more than 3 hours)
- Automated and included in the pipeline
Good read on test automation common mistakes: https://content.microfocus.com/software-test-automation-tb/top-7-test-automation-mistakes?lx=wYdl7c&utm_source=techbeacon&utm_medium=techbeacon&utm_campaign=00134846
Good read on test automation for microservices: https://martinfowler.com/articles/practical-test-pyramid.html#TheTestPyramid
Tests, Bitbucket repositories and CI/CD pipeline guidelines:
- Unit tests should be in the same repository as a microservice (component) that they test
- Integration (cross component) tests could be in a different repository than the microservices.
- E2E tests could live in a separate repository from the UI code (they exercise multiple microservices during execution)
- The CI/CD build of a microservice should run all of its unit tests – if any fail, the build should fail
- If the build and unit tests succeed, they should trigger the automated integration tests. If any of those fail, the deployment should fail.
- If the integration tests succeed, E2E tests could be triggered in the pipeline. If they pass, the code can be deployed to the development environment.
- Additional smoke tests could be run on the dev environment to make sure that the development environment is stable.
- If the smoke tests are successful, code should be promoted to QA. The testing environment should be different for developers and QAs. The development environment should not be used by the QA for testing.
- Run regression tests in the QA environment to test all other business scenarios which are not covered in the lower part of the test pyramid.
- Run performance tests and cross-platform tests (9.x, 10.x) to ensure the software is performant and works as expected on other platforms or versions.
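The pipeline stages above could be sketched in a `bitbucket-pipelines.yml` along these lines; the step names, scripts, Maven profile and deploy script are assumptions for illustration, not a prescribed setup:

```yaml
# Illustrative sketch only – adapt step names, commands and targets to your project.
pipelines:
  default:
    - step:
        name: Build and unit tests      # any failing unit test fails the build
        script:
          - mvn verify                  # runs unit tests and the JaCoCo coverage check
    - step:
        name: Integration tests         # any failure here fails the deployment
        script:
          - newman run integration-tests.postman_collection.json
    - step:
        name: E2E tests
        script:
          - mvn test -Pe2e              # hypothetical Maven profile for the E2E suite
    - step:
        name: Deploy to dev and smoke test
        deployment: dev
        script:
          - ./deploy.sh dev             # hypothetical deploy script
          - newman run smoke-tests.postman_collection.json
```

Each stage acts as a gate for the next, which is what keeps unstable code from reaching the dev and QA environments.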
Automation testing good practices:
- Avoid duplicating test scenarios across multiple test suites, as duplication slows down the regression/automation pipeline
- Automate as many tests as possible
- Developers and QA should both own and share the test pipelines
- Address failing tests by failing the build and/or deployment