These are the Definitions of Done (DoD) for ITE. They may be revised over time to reflect new methods, standards, or best practices, or as the development team evolves and refines its process.
For a User Story (a feature):
- Is the defect/bug fixed?
- Were meaningful comments added?
- Are there unit tests implemented that cover the fix's functionality?
- Are there functional tests implemented to cover the bug (performed by a person other than the person who fixed it)?
- Have automated tests been created?
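As a minimal sketch of the unit-test item above (assuming a Python codebase; the `average` helper, the bug, and the test names are hypothetical examples, not part of the ITE codebase):

```python
def average(values):
    """Hypothetical helper whose bug (ZeroDivisionError on an
    empty list) was fixed by adding the guard below."""
    if not values:  # the fix: handle the empty-list case
        return 0.0
    return sum(values) / len(values)


# Regression tests pinning the fixed behaviour (pytest-style).
def test_average_of_empty_list_is_zero():
    # Before the fix, this call raised ZeroDivisionError.
    assert average([]) == 0.0


def test_average_of_values_unchanged():
    # The fix must not change behaviour for normal input.
    assert average([2, 4, 6]) == 4.0
```

A test like this both covers the fix's functionality and serves as an automated guard against the bug reappearing in later sprints.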
For a Sprint:
- Are all the User Stories included in the Sprint closed and do they meet the ITE DoD?
- Have all the automated tests (Unit, Acceptance, E2E, Integration) passed successfully?
- Has regression testing been performed on the product after any changes were made?
For a Release:
- Are all acceptance criteria met?
- Does the code adhere to coding standards?
- Is the code documented and were meaningful comments added?
- Do all the automated tests pass on the staging environment?
- Did all the unit tests pass?
- Did all the integration tests pass?
- Did all the Acceptance, End-to-End, and Edge-to-Edge tests pass?
- Has test coverage been measured, and is all our TDD code covered?
- Was the code peer reviewed, including its documentation and comments?
- Was the code built into a final deployable format?
- Was the code merged and committed into the production branch?
- Was the documentation/readme updated/created?
- Did the PO/Client accept it as done?