Testing

Objectives of testing

  • to find defects
  • to bring the tested software to an acceptable level of quality
  • to perform the required tests efficiently and effectively
  • to compile a record of software errors

Testing Axioms

It's impossible to test a program completely

  • software testing is risk-based
    • need to balance the cost of testing against the risk of missing defects
  • testing can't prove the absence of defects
  • the more defects you find, the more defects are likely to remain
  • the pesticide paradox
    • a system tends to build resistance to a particular testing technique
  • not all defects found are fixed
  • defects are not always obvious
  • product specifications are never final
  • software testers aren't the most popular members of a project team
  • software testing is a disciplined technical profession that requires training

Also:

  • large input space
  • large output space
  • large state space
  • large number of possible execution paths
  • subjectivity of specifications

Testing process

  • Planning
    • includes completion criteria (coverage goal)
  • Design
    • approaches for test case selection to achieve coverage goal
  • Implementation
    • scripting of test cases (see the sketch after this list)
      • input/output data
      • state before/after
      • test procedure
  • Execution
    • run tests
      • check results - pass or fail?
      • coverage?
  • Test Management
    • defect tracking
    • maintain relationships among tests, requirements, and defects
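
As a concrete illustration of the Implementation step, a scripted test case can capture the input/output data, the state before and after, and the test procedure in code. Below is a minimal sketch using JUnit 5; the BankAccount class is a hypothetical stand-in, defined inline so the example is self-contained.

    import static org.junit.jupiter.api.Assertions.assertEquals;
    import org.junit.jupiter.api.BeforeEach;
    import org.junit.jupiter.api.Test;

    // Stand-in class under test, defined inline for self-containment.
    class BankAccount {
        private int balance;
        BankAccount(int opening) { balance = opening; }
        void withdraw(int amount) { balance -= amount; }
        int getBalance() { return balance; }
    }

    class WithdrawTest {
        private BankAccount account;

        @BeforeEach
        void setUp() {
            account = new BankAccount(100);       // state before the test
        }

        @Test
        void withdrawalReducesBalance() {
            account.withdraw(30);                 // test procedure with input data
            assertEquals(70, account.getBalance());  // expected output / state after
        }
    }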

Levels of Testing

  • Unit Testing
    • individual program units, such as procedures or methods, tested in isolation
  • integration testing
    • modules are assembled to construct larger subsystems, which are then tested
  • system testing
    • includes a wide spectrum of testing, such as functionality and load
  • acceptance testing
    • verifies that the system meets the customer's expectations

Test Design Approaches

  • Black box testing
    • test design based on the specification (input/output); contrasted with white box in the sketch below
  • White box testing
    • test design based on code structure
  • Grey box testing
    • test design based on the design model
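
To make the black box / white box contrast concrete, here is a hedged sketch: one test derived purely from a stated specification, and one chosen to exercise both branches of the code. The Grader class and the boundary at 50 are illustrative assumptions.

    import static org.junit.jupiter.api.Assertions.assertEquals;
    import org.junit.jupiter.api.Test;

    class Grader {
        // Specification: return "pass" for scores of 50 or more, else "fail".
        static String grade(int score) {
            if (score >= 50) return "pass";
            return "fail";
        }
    }

    class GraderTest {
        @Test
        void blackBoxBoundaryFromSpec() {
            // Black box: derived from the specification's boundary at 50,
            // without looking at the implementation.
            assertEquals("pass", Grader.grade(50));
            assertEquals("fail", Grader.grade(49));
        }

        @Test
        void whiteBoxBranchCoverage() {
            // White box: inputs chosen to execute both branches of the if.
            assertEquals("pass", Grader.grade(100));
            assertEquals("fail", Grader.grade(0));
        }
    }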

Agile Testing

Customer tests (functional tests, acceptance tests, end-user tests)

  • verify the behavior of the application from the point of view of Customers
  • developed based on requirements (e.g. user stories) by specifying acceptance criteria
  • developed in collaboration by Customers, Users, Testers, Developers
  • written in a language understandable by Customers

Unit tests

  • verify the behavior of a program unit (e.g. single class, method, function) that is a consequence of a design decision
  • written by Developers
  • summarize the behavior of the unit in the form of tests

Component tests (integration tests)

  • verify components consisting of groups of units that collectively provide some service
  • written by developers (derived from customer tests)

Property tests (nonfunctional, cross-functional)

  • verify nonfunctional requirements (e.g. response time, capacity, stress, security)
  • automation is essential for most of these tests (see the sketch after this list)
  • testing must start as soon as possible (once the basic architecture and a skeleton of functionality exist) and run continuously
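
A minimal sketch of what such an automated property test can look like, using Java's built-in java.net.http client; the URL and the 500 ms response-time budget are assumptions for illustration.

    import java.net.URI;
    import java.net.http.HttpClient;
    import java.net.http.HttpRequest;
    import java.net.http.HttpResponse;

    public class ResponseTimeCheck {
        public static void main(String[] args) throws Exception {
            HttpClient client = HttpClient.newHttpClient();
            HttpRequest request = HttpRequest
                    .newBuilder(URI.create("https://example.com/api/health"))
                    .build();
            long start = System.nanoTime();
            HttpResponse<String> response =
                    client.send(request, HttpResponse.BodyHandlers.ofString());
            long elapsedMs = (System.nanoTime() - start) / 1_000_000;
            // Fail if the nonfunctional budget is exceeded.
            if (response.statusCode() != 200 || elapsedMs > 500) {
                throw new AssertionError("Budget exceeded: " + elapsedMs + " ms");
            }
            System.out.println("OK in " + elapsedMs + " ms");
        }
    }

Run continuously (e.g. in the build pipeline), such a check flags response-time regressions as soon as they are introduced.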

Usability tests

  • verify fitness for purpose
  • can real users use the software application to achieve the stated goals?

Exploratory tests

  • to determine whether the product is self-consistent
  • testers use the product, observe its behavior, form hypotheses, design tests to verify those hypotheses, and exercise the product with them

Test Driven Development (TDD)

  • software development approach based on writing tests first

Steps of TDD

  1. Quickly add a test
  2. Run all the tests
  3. Update the functional code
  4. Repeat from step 2 until all tests pass
  5. Refactor the code as needed
  6. Repeat from step 1 if the coding is not done
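
A worked micro-example of the cycle above, sketched with JUnit 5; the Calculator class and its test are hypothetical.

    import static org.junit.jupiter.api.Assertions.assertEquals;
    import org.junit.jupiter.api.Test;

    // Step 1: quickly add a test. At this point Calculator.add does not
    // exist yet, so running the tests (step 2) fails.
    class CalculatorTest {
        @Test
        void addsTwoNumbers() {
            assertEquals(5, Calculator.add(2, 3));
        }
    }

    // Step 3: update the functional code with just enough to pass.
    class Calculator {
        static int add(int a, int b) {
            return a + b;
        }
    }

    // Steps 4-6: rerun the tests until they pass, refactor as needed,
    // then start the cycle again with the next test.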

Acceptance TDD

Development starts with writing an acceptance test (customer test)

  • Expected system behavior from users' point of view
  • Free of technical details
  • written in collaboration with experts, customers, users, developers
  • Expressed in simple understandable language
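
Even when expressed in simple language, such a test can be automated; a hedged sketch in JUnit 5, where the Given/When/Then comments carry the customer-facing wording and the ShoppingCart class is a hypothetical system interface.

    import static org.junit.jupiter.api.Assertions.assertEquals;
    import org.junit.jupiter.api.Test;

    // Hypothetical, simplified interface to the system under test.
    class ShoppingCart {
        private int items;
        void add(int n) { items += n; }
        int itemCount() { return items; }
    }

    class CheckoutAcceptanceTest {
        @Test
        void customerSeesAddedItemsInCart() {
            // Given an empty shopping cart
            ShoppingCart cart = new ShoppingCart();
            // When the customer adds two items
            cart.add(2);
            // Then the cart shows two items
            assertEquals(2, cart.itemCount());
        }
    }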

Testing Automation

Use of automated tools is essential

  • provides speed, efficiency, accuracy, precision, etc.
  • allows repeatability (regression testing)

Types of tools:

  • viewers and monitors (e.g. code coverage tools, debuggers)
  • drivers and stubs
  • stress and load tools
  • analysis tools (e.g. file comparison, screen capture and comparison)
  • random testing tools (monkeys)
  • defect tracking

Test Planning and Management

  • Create/maintain test plans
    • integrate with project plan
  • Maintain links to Requirements/Specification
    • generate Requirements Test Matrix Reports
    • metrics on test case execution
  • Tracking of history/status of test cases
    • defect tracking

Test Execution

  • Test Drivers and Execution Frameworks
    • Run test scripts and report results
    • e.g. JUnit
  • Runtime test execution assistance
    • comparators

Test performance assessment

  • Analysis of the effectiveness of test cases for the extent of the system covered
    • coverage analyzers
    • report on various levels of coverage
  • Analysis of the effectiveness of test cases for defect detection
    • mutation testing
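
As a concrete illustration of mutation testing: a tool makes a small syntactic change to the code (a mutant) and checks whether any test fails ("kills" the mutant); surviving mutants expose weak spots in the suite. A hedged sketch in JUnit 5:

    import static org.junit.jupiter.api.Assertions.assertFalse;
    import static org.junit.jupiter.api.Assertions.assertTrue;
    import org.junit.jupiter.api.Test;

    class Sign {
        // Original code under test.
        static boolean isPositive(int x) { return x > 0; }

        // A mutant a tool might generate: '>' replaced by '>='.
        static boolean isPositiveMutant(int x) { return x >= 0; }
    }

    class SignTest {
        @Test
        void weakSuiteLetsTheMutantSurvive() {
            // These assertions hold for both versions, so they cannot
            // distinguish the original from the mutant.
            assertTrue(Sign.isPositive(5));
            assertFalse(Sign.isPositive(-5));
        }

        @Test
        void boundaryTestKillsTheMutant() {
            // This assertion would fail on the mutant (isPositiveMutant(0)
            // returns true), so it "kills" it: evidence of a more
            // effective test suite at the x == 0 boundary.
            assertFalse(Sign.isPositive(0));
        }
    }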

Specialized testing

  • Security testing tools
    • password crackers
    • vulnerability scanners
    • packet crafters
  • Performance / Load testing tools
    • performance monitors
    • load generators

Capture and Replay

  • For testing from user interface (GUI, Web)
  • Records a manual test session in a script
    • user inputs and “capture” of system responses
  • Then it “plays back” the recorded user input and checks whether the observed responses match those stored in the captured script
  • Benefits: relatively simple approach, easy to use, little/no scripting involved
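
The played-back session typically amounts to a script like the following; a hedged sketch using Selenium WebDriver, where the URL, element IDs, and expected title are hypothetical (real capture-and-replay tools generate such scripts from the recorded session).

    import org.openqa.selenium.By;
    import org.openqa.selenium.WebDriver;
    import org.openqa.selenium.chrome.ChromeDriver;

    public class ReplayLogin {
        public static void main(String[] args) {
            WebDriver driver = new ChromeDriver();
            try {
                // Play back the recorded user inputs...
                driver.get("https://example.com/login");
                driver.findElement(By.id("user")).sendKeys("alice");
                driver.findElement(By.id("password")).sendKeys("secret");
                driver.findElement(By.id("submit")).click();
                // ...and check against the captured response.
                if (!"Dashboard".equals(driver.getTitle())) {
                    throw new AssertionError("Response differs from captured script");
                }
            } finally {
                driver.quit();
            }
        }
    }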

Test case

  • for a stateless system (the outcome depends solely on the current input)
    • a pair of <input, expected outcome>
  • for a state-oriented system (the outcome depends on both the current state of the system and the current input)
    • a sequence of <input, expected outcome> pairs
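
A hedged sketch of both kinds in JUnit 5, using Math.abs as the stateless example and a stack as the state-oriented one.

    import static org.junit.jupiter.api.Assertions.assertEquals;
    import java.util.ArrayDeque;
    import java.util.Deque;
    import org.junit.jupiter.api.Test;

    class TestCaseShapes {
        @Test
        void statelessPair() {
            // A single <input, expected outcome> pair: the result of
            // Math.abs depends only on its input.
            assertEquals(5, Math.abs(-5));
        }

        @Test
        void stateOrientedSequence() {
            // A sequence of <input, expected outcome> pairs: each outcome
            // depends on the stack's current state.
            Deque<Integer> stack = new ArrayDeque<>();
            stack.push(1);                 // <push 1, ->
            stack.push(2);                 // <push 2, ->
            assertEquals(2, stack.pop());  // <pop, 2>
            assertEquals(1, stack.pop());  // <pop, 1>
        }
    }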

Expected outcome

An outcome of program execution may include:

  • value produced by the program

  • state change

  • sequence of values which must be interpreted together for the outcome to be valid

  • determination of the expected outcome is not always straightforward

  • Test oracle

    • a mechanism that verifies the correctness of program output
    • generates expected results from test inputs
    • compares the expected results with the actual results of execution
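
One common way to build an oracle is to generate the expected result with a trusted reference implementation; a hedged sketch where mySort stands in for a hypothetical routine under test and java.util.Arrays.sort serves as the oracle.

    import static org.junit.jupiter.api.Assertions.assertArrayEquals;
    import java.util.Arrays;
    import java.util.Random;
    import org.junit.jupiter.api.Test;

    class SortOracleTest {
        // Stand-in for a hypothetical sorting routine under test.
        static int[] mySort(int[] a) {
            int[] copy = a.clone();
            Arrays.sort(copy);  // imagine a custom algorithm here
            return copy;
        }

        @Test
        void agreesWithReferenceOnRandomInputs() {
            Random rnd = new Random(42);  // fixed seed for repeatability
            for (int run = 0; run < 100; run++) {
                int[] input = rnd.ints(20, -50, 50).toArray();
                // Oracle: compute the expected result with the trusted
                // implementation, then compare with the actual result.
                int[] expected = input.clone();
                Arrays.sort(expected);
                assertArrayEquals(expected, mySort(input));
            }
        }
    }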

Points of Control and Observation

  • Direct method call (e.g. JUnit)
  • User input / output
  • Data file input / output
  • Network ports / interfaces
  • Windows registry / configuration files
  • Log files
  • Pipes / shared memory

3rd party component interfaces:

  • Lookup facilities:
    • network: Domain Name System (DNS), Lightweight Directory Access Protocol (LDAP), etc.
    • local / server: database lookup, Java Naming and Directory Interface (JNDI), etc.
  • Calls to:
    • remote methods (e.g. RPC, Services)
    • Operating System
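
In tests, these interfaces become points of control and observation when hidden behind an abstraction that the test can replace with a stub; a hedged sketch where the Lookup interface and Greeter class are hypothetical.

    import static org.junit.jupiter.api.Assertions.assertEquals;
    import org.junit.jupiter.api.Test;

    // Hypothetical abstraction over an external lookup facility
    // (DNS, LDAP, database, JNDI, ...).
    interface Lookup {
        String resolve(String name);
    }

    // Hypothetical component that depends on the lookup.
    class Greeter {
        private final Lookup lookup;
        Greeter(Lookup lookup) { this.lookup = lookup; }
        String greet(String user) { return "Hello, " + lookup.resolve(user); }
    }

    class GreeterTest {
        @Test
        void stubGivesControlOverTheExternalInterface() {
            // Point of control: the stub returns a canned answer, so the
            // test needs no real directory service and is fully repeatable.
            Lookup stub = name -> "Alice Smith";
            assertEquals("Hello, Alice Smith", new Greeter(stub).greet("asmith"));
        }
    }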