Type | Description |
---|---|
Unit Testing | A programmatic test that tests the internal working of a unit of code, such as a method or a function. |
Integration Testing | Ensures that multiple components of systems work as expected when they are combined to produce a result. |
Regression Testing | Ensures that existing features/functionality that used to work are not broken due to new code changes. |
System Testing | End-to-end testing performed on the complete software to make sure the whole system works as expected. |
Smoke Testing | A quick test performed to ensure that the software works at the most basic level and doesn’t crash when it’s started. Its name originates from hardware testing, where you plug in the device and see if smoke comes out. |
Performance Testing | Ensures that the software performs according to the user’s expectations by checking the response time and throughput under specific load and environment. |
User-Acceptance Testing | Ensures the software meets the requirements of the clients or users. This is typically the last step before the software is live, i.e. it goes to production. |
Stress Testing | Ensures that the performance of the software doesn’t degrade when the load increases. In stress testing, the tester subjects the software to heavy loads, such as a high number of requests or stringent memory conditions, to verify that it works well. |
Usability Testing | Measures how usable the software is. This is typically performed with a sample set of end-users, who use the software and provide feedback on how easy or complicated it is to use the software. |
Security Testing | Now more important than ever. Security testing tries to break a software’s security checks, to gain access to confidential data. Security testing is crucial for web-based applications or any applications that involve money. |
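To make the first row of the table concrete, here is a minimal sketch of a unit test in Python using the standard `unittest` module. The `add` function is a hypothetical unit under test, not something from this document:

```python
import unittest

# Hypothetical unit under test: a single, isolated function.
def add(a, b):
    return a + b

class TestAdd(unittest.TestCase):
    # Unit test: exercises the internal working of one function,
    # independent of the rest of the system.
    def test_add_positive_numbers(self):
        self.assertEqual(add(2, 3), 5)

    def test_add_negative_numbers(self):
        self.assertEqual(add(-1, -1), -2)

# Run the test case programmatically (a quick check without the CLI runner).
suite = unittest.defaultTestLoader.loadTestsFromTestCase(TestAdd)
result = unittest.TextTestRunner(verbosity=0).run(suite)
```

In a real project these tests would typically live in their own module and be run with `python -m unittest`.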
Black box testing | Gray box testing | White box testing |
---|---|---|
Black box testing does not require knowledge of a program's implementation. | Gray box testing requires only limited knowledge of a program's internals. | In white box testing, the implementation details of the program are fully known to the tester. |
It has a low granularity. | It has a medium granularity. | It has a high granularity. |
It is also known as opaque box testing, closed box testing, input-output testing, data-driven testing, behavioral testing and functional testing. | It is also known as translucent testing. | It is also known as glass box testing, clear box testing. |
It is a form of user acceptance testing, i.e., it is done by end users. | It can also serve as user acceptance testing. | It is mainly done by testers and programmers. |
Test cases are derived from the functional specifications, as internal details are not known. | Test cases are derived from high-level design documents and partial knowledge of the internal details. | Test cases are derived from the internal details of the program. |
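The difference in how test cases are derived can be sketched with a small example. The `clamp_score` function below is hypothetical, chosen only so the contrast between specification-driven and implementation-driven cases is visible:

```python
# Hypothetical function under test: clamps a score into the range 0-100.
def clamp_score(score):
    if score < 0:
        return 0
    if score > 100:
        return 100
    return score

# Black-box test cases: derived only from the functional specification
# ("the result is always between 0 and 100"); internals are unknown.
assert clamp_score(50) == 50
assert clamp_score(-10) == 0
assert clamp_score(250) == 100

# White-box test cases: derived from the internal details -- one case
# per branch in the implementation, including the exact boundaries.
assert clamp_score(-1) == 0     # exercises the `score < 0` branch
assert clamp_score(101) == 100  # exercises the `score > 100` branch
assert clamp_score(0) == 0      # boundary: falls through both guards
assert clamp_score(100) == 100  # boundary: falls through both guards
```

Note the white-box cases target every branch and boundary of the code, something a tester without access to the implementation could only hit by accident.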
# | Verification | Validation |
---|---|---|
1. | Verification is the process of evaluating the artifacts and processes of software development to ensure that the product being developed will comply with the standards. | Validation is the process of checking that the developed software product conforms to the specified business requirements. |
2. | It is a static process of analyzing the documents and not the actual end product. | It involves dynamic testing of a software product by running it. |
3. | Verification is a process-oriented approach. | Validation is a product-oriented approach. |
4. | Answers the question – “Are we building the product right?” | Answers the question – “Are we building the right product?” |
5. | Errors found during verification cost fewer resources to fix than those found during validation. | Errors found during validation require more cost/resources; the later an error is discovered, the higher the cost to fix it. |
Positive Testing | Negative Testing |
---|---|
Positive testing ensures that your software performs as expected. The test fails if an error occurs during positive testing. | Negative testing guarantees that your app can gracefully deal with unexpected user behaviour or incorrect input. |
In positive testing, the tester uses only valid data sets. | Testers use as much ingenuity as possible, validating the app against invalid and unexpected data. |
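The two columns above can be sketched side by side. The `parse_age` function is a hypothetical example, assumed here only to show one positive case and several negative cases against the same code:

```python
# Hypothetical function under test: parses a plausible human age.
def parse_age(value):
    age = int(value)  # raises ValueError on non-numeric input
    if age < 0 or age > 150:
        raise ValueError("age out of range")
    return age

# Positive testing: valid input, the software performs as expected.
assert parse_age("30") == 30

# Negative testing: invalid input must be rejected gracefully with a
# clear error instead of crashing or returning garbage.
for bad in ("-5", "abc", "999"):
    try:
        parse_age(bad)
    except ValueError:
        pass  # expected: the app handles incorrect input gracefully
    else:
        raise AssertionError(f"{bad!r} should have been rejected")
```

A positive test fails if any error occurs; a negative test fails if the expected error does *not* occur.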
Phases | Explanation |
---|---|
Requirement Analysis | The QA team understands the requirements in terms of what will be tested and identifies the testable requirements. |
Test Planning | In this phase, the test strategy is defined. The objective and scope of the project are determined. |
Test Case Development | Here, detailed test cases are defined and developed. The testing team also prepares the test data for testing. |
Test Environment Setup | It is a setup of software and hardware for the testing teams to execute test cases. |
Test Execution | It is the process of executing the code and comparing the expected and actual results. |
Test Cycle Closure | It involves a testing team meeting to evaluate cycle-completion criteria based on test coverage, quality, cost, time, critical business objectives, and the software itself. |