Lecture overview
Lecture 1: Course introduction
Lecture 2: Introduction to testing and testing principles
Lecture 3: Specification-based and Boundary Testing
Lecture 4: Model- and State-based testing
Lecture 5: Structural testing
Lecture 6: Pragmatic exploratory testing
Lecture 7: Testability and Mock Objects
Lecture 8: Security testing (Part I)
Lecture 9: Security testing (Part II)
Lecture 10: Test-driven development (TDD)
Lecture 11: Web testing
Lecture 12: Design by Contract and Property-based testing
Lecture 13: Search-based software testing
Lecture 2: Introduction to testing and testing principles
▪ Material: chapters 1, 2, and 3
▪ Faults and failures
▪ Principles of testing
▪ Testing in the software development life cycle
▪ Validation and verification
▪ V-Model
▪ Static analysis
Failure: a component or system behaves in a way that is not expected.
Defect/Bug/Fault: a flaw in a component that can cause the system to behave incorrectly. A
defect, if encountered during execution, may cause a failure.
Error/Mistake: a human action that produces an incorrect result.
Testing is an attempt to trigger failures; debugging is an attempt to find the fault behind an observed failure.
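A minimal Java sketch (hypothetical isPrime code, not from the lecture) of how the three terms relate: a programmer's mistake (error) introduces a fault in the code, and executing that fault can produce a failure.

```java
import static org.junit.jupiter.api.Assertions.assertFalse;

import org.junit.jupiter.api.Test;

class MathUtils {
    // Error: the programmer typed < instead of <=, introducing a
    // fault: isPrime(1) wrongly returns true.
    static boolean isPrime(int n) {
        if (n < 1) {
            return false;
        }
        for (int d = 2; d * d <= n; d++) {
            if (n % d == 0) {
                return false;
            }
        }
        return true;
    }
}

class MathUtilsTest {
    @Test
    void oneIsNotPrime() {
        // Executing the faulty code with input 1 triggers a failure:
        // the observed result (true) deviates from the expected one (false).
        assertFalse(MathUtils.isPrime(1));
    }
}
```

Running the test exposes the failure; debugging then traces it back to the faulty condition.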
Principles of testing:
1. Testing shows the presence of defects, not the absence of them.
2. Exhaustive testing is impossible.
3. Testing needs to start early:
a. To let tests guide design.
b. To get feedback as early as possible.
c. To find bugs when they are cheapest to fix.
d. To find bugs when they have caused the least damage. Faults can be introduced at any moment in the software development process; early discovery of faults pays off.
4. Defects are likely to be clustered.
5. Pesticide paradox: test methods wear out. Re-running the same test suite again and again on a changing program gives a false sense of security; we need variation in testing.
6. Testing is context-dependent.
7. Absence-of-errors fallacy: there is more to success/quality than absence of errors.
Software life cycle: period of time that begins when a software system is conceived and ends
when the system is no longer available for use.
Phases: concept development, requirements, design, implementation, test, installation, retirement.
Phases may overlap and be performed iteratively.
[Figure: Waterfall model (left) vs. V-model (right)]
Validation: are we building the right system? Is this what the user wants?
Verification: are we building the system right (“no” bugs)? Does the software system meet the
requirements specifications?
Test levels (a short example follows the list):
• Component (unit) testing: units in isolation.
• Integration testing: interaction between units.
• System testing: system-level properties.
• Acceptance testing: focus on user needs.
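As a rough illustration of the first two levels (hypothetical DiscountCalculator and Checkout classes, JUnit 5 assumed):

```java
import static org.junit.jupiter.api.Assertions.assertEquals;

import org.junit.jupiter.api.Test;

class DiscountCalculator {
    // 10% discount on orders of 100 or more (hypothetical business rule).
    double apply(double amount) {
        return amount >= 100 ? amount * 0.9 : amount;
    }
}

class Checkout {
    private final DiscountCalculator calculator;

    Checkout(DiscountCalculator calculator) {
        this.calculator = calculator;
    }

    double total(double amount) {
        return calculator.apply(amount);
    }
}

// Component (unit) test: DiscountCalculator in isolation.
class DiscountCalculatorTest {
    @Test
    void appliesTenPercentDiscountOnLargeOrders() {
        assertEquals(90.0, new DiscountCalculator().apply(100.0), 0.001);
    }
}

// Integration test: Checkout and DiscountCalculator exercised together.
class CheckoutIntegrationTest {
    @Test
    void totalIncludesDiscount() {
        assertEquals(90.0, new Checkout(new DiscountCalculator()).total(100.0), 0.001);
    }
}
```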
Test type: a group of test activities aimed at testing a component or system, focused on a specific test objective. Different test types target different test objectives.
• Testing of function: functional testing, black box testing.
• Testing of software product characteristics: non-functional testing (performance).
• Testing of software structure/architecture: structural testing, white box testing.
• Testing related to changes: confirmation vs. regression testing.
Confirmation testing: a test failure indicates a defect; after the developer fixes the defect, the exact same test is run again to confirm that the defect has indeed been fixed.
Regression testing: testing of a previously tested program to detect ‘unexpected side-effects’ of
changes. Done after modification to ensure that defects have not been introduced in unchanged
areas of the software as a result of the changes made. Performed when the software or its
environment is changed. Automated.
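A small hypothetical JUnit 5 example of how the same test can serve both purposes:

```java
import static org.junit.jupiter.api.Assertions.assertEquals;

import java.math.BigDecimal;
import java.math.RoundingMode;

import org.junit.jupiter.api.Test;

class Order {
    // Hypothetical fix for a rounding defect: totals are now
    // rounded to two decimals.
    static double total(int quantity, double unitPrice) {
        return BigDecimal.valueOf(quantity * unitPrice)
                .setScale(2, RoundingMode.HALF_UP)
                .doubleValue();
    }
}

class OrderTotalTest {
    // This test once failed and revealed the rounding defect. Re-running
    // it after the fix is confirmation testing; keeping it in the
    // automated suite and re-running it after every later change is
    // regression testing.
    @Test
    void totalIsRoundedToTwoDecimals() {
        assertEquals(10.35, Order.total(3, 3.4499), 0.001);
    }
}
```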
Maintenance testing: testing after a system is stable and deployed. Tests the impact of a changed environment on an operational system (e.g., security updates of libraries used).
Impact analysis: determine which parts of the system are affected by a change and conduct regression testing for those.
Static testing: testing of a component or system at specification or implementation level without
execution of software (e.g., reviews or static analysis).
Dynamic testing: testing that involves executing the component or system under test.
Formal review phases: planning, kick-off, preparation, review meeting, rework, follow-up.
Formal review roles: moderator, author, scribe, reviewers, managers.
Types of review:
• Walkthrough: author in the lead.
• Technical review: technical meeting to achieve consensus.
• Inspection: peer review of documents, relies on visual inspection to detect defects.
Static analysis tools: Checkstyle (formatting), PMD (design faults), SpotBugs (design faults).
Many static analysis tools are based on heuristics, so each warning (or its absence) falls into one of four categories (see the sketch after this list):
• Correct positive: warning, and a problem (let’s fix it!)
• Correct negative: no warning, no problem (no need to act)
• False positive: warning, but not a problem (annoying)
• False negative: problem, but no warning (possibly dangerous)
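A hypothetical, tool-agnostic Java sketch of why heuristics produce these four outcomes:

```java
class Config {
    private String name;

    String describe(boolean hasName) {
        if (hasName) {
            name = "configured";
        }
        // A heuristic null-dereference checker may warn on the line below
        // because it cannot prove that 'name' was assigned. If every caller
        // passes hasName == true, that warning is a false positive; if some
        // caller passes false, it is a correct positive and this line throws
        // a NullPointerException. A tool that stays silent while such a
        // caller exists produces a false negative.
        return name.toUpperCase();
    }
}
```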