LCG-Application Area
SPI: Software Testing 

POOL project Test Plan

June 5, 2003




Table of Contents:

  1. - Title 
  2. - Overview 
  3. - Test Methodologies and Phases 
    • Testing Approach and Phases 
    • Test documents 
    • Work products: test deliverables 
    • Testing procedure 
    • Requirements Validation 
  4. - Test Scope
    • Features to be Tested 
    • Features Not to be Tested 
  5. - Test Environment 
    • Hardware 
    • Software 
  6. - Schedule and Responsibilities 
  7. - Software Testing Policies
  8. - Glossary 



1.- Title

This document was generated by the POOL project team and will be further developed within the LCG AppArea computing project.


Document Name:  POOL Project Test Plan
Short Name:  POOL-testplan.html
Version:  0.1
Publication Date:  July 4, 2003
Author(s):  Giacomo Govi
e-mail(s) contacts:  giacomo.govi@cern.ch
Status:  approved

Document Status Sheet

The following information is used to control and track modifications made to this document. It is the reader's responsibility to ensure they have the latest version of this document, published at http://pool.cern.ch/infrastructure/index.html and stored in the "project/doc/developer/" CVS directory.


Version  Date      Author           Section(s)  Reason
0.1      15/05/03  SPI (M. Gallas)  All         Initial template



2.- Overview

This document describes the "Test Plan" for the project mentioned above and addresses the specification of the test methodologies and phases, the test scope, the test environment, and the schedule and responsibilities of the testing activities.

It is intended that all LCG AppArea projects share as much as possible the "Test Plan" described here. Of course, there are items that are project specific and that should be discussed internally within each project. The elaboration of this Test Plan is one of the Sw-Testing Policies already agreed for the LCG AppArea projects, and it is done in conformance with them.

The Sw-Testing policies reflected in this document are labeled with the word "Mandatory" throughout the document.

Only those "italic" sections/items marked with a (*) can and should be modified by the projects by adding information. If an item does not apply to a specific project, write "does not apply"; do not delete it.



3.- Test Methodologies and Phases

  3.1- Testing Approach and Phases

Multi-level sw-testing will be performed on the software product with the purpose of finding faults and of making sure that faults already corrected do not appear again. In a bottom-up approach, sw-testing starts at the level of classes and small groups of collaborating classes; when no more defects can be found at that level, the units are put together and the next level is tested, and so on.

The goal is to automate the testing process as much as possible. Because correcting detected faults can introduce new ones, the system should be retested (regression testing) to verify that the old functionality remains. For this reason all levels of tests should be automated and run by the nightly (and pre-release) building system (Mandatory). Test cases based on bugs found during the pre-release stage and on user-detected problems must also be included in a complete regression test suite (Mandatory).

A test log should be kept during the entire test effort, especially during the nightly and pre-release builds (Mandatory). The test log must be connected to the version of the system.

This test plan addresses the requirements, schedule and responsibilities for the four typical sw-testing phases:

Unit testing:

Unit tests should be written with the code (Mandatory). Developers are responsible for creating unit tests that at least check the expected functionality (black-box testing) of classes and small groups of collaborating classes. The tests should also come with a test case specification and be integrated with the software configuration tool and test frameworks selected by SPI. The tests should be produced according to the SPI sw-testing policies regarding test directory structure, test naming, testing tool integration, test documentation and test output messages. A minimal sketch of such a unit test is given below.
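
For illustration only, a unit test written with CppUnit (one of the frameworks proposed by SPI, see section 5.2) could look as follows. The Cache class shown here is a hypothetical placeholder, inlined to keep the sketch self-contained; it is not the real POOL interface.

    #include <cppunit/extensions/HelperMacros.h>
    #include <map>
    #include <string>

    // Hypothetical class under test, inlined so the sketch compiles
    // on its own; the real POOL cache interface is different.
    class Cache {
    public:
      void insert( const std::string& key, int value ) { m_data[key] = value; }
      int  find( const std::string& key ) const {
        std::map<std::string,int>::const_iterator i = m_data.find( key );
        return i != m_data.end() ? i->second : -1;
      }
    private:
      std::map<std::string,int> m_data;
    };

    class CacheTest : public CppUnit::TestFixture {
      CPPUNIT_TEST_SUITE( CacheTest );
      CPPUNIT_TEST( testInsertAndFind );
      CPPUNIT_TEST_SUITE_END();
    public:
      // Black-box check of the expected functionality.
      void testInsertAndFind() {
        Cache cache;
        cache.insert( "key", 42 );
        CPPUNIT_ASSERT_EQUAL( 42, cache.find( "key" ) );
      }
    };

    // Register the suite so a generic test driver can pick it up.
    CPPUNIT_TEST_SUITE_REGISTRATION( CacheTest );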

Integration testing:

Integration tests are needed to check whether different units that have been developed and tested separately work together properly. At the end of the integration tests, all the use cases of the interface of each subsystem have to be matched with test cases, and the subsystem must have all its components implemented. Integration testing starts as soon as all the components needed for a use case are implemented, and it evolves until all use cases are implemented (Mandatory). Integration checks should include checks for memory leaks at predictable places or where problems were found in the past. A sketch of the stub technique used at this level follows.
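
The stubs listed in section 4.1 (e.g. "DataSvc functionality - Stub for PersistencySvc") replace a real dependency with canned behaviour, so that a unit can be tested before its collaborators are complete. The sketch below uses a deliberately simplified, hypothetical interface; the real PersistencySvc API is different.

    #include <cppunit/extensions/HelperMacros.h>
    #include <string>

    // Hypothetical, simplified storage interface.
    struct IPersistency {
      virtual ~IPersistency() {}
      virtual std::string read( const std::string& token ) = 0;
    };

    // Stub returning canned data instead of touching real storage.
    struct PersistencyStub : public IPersistency {
      std::string read( const std::string& ) { return "canned-object"; }
    };

    class DataSvcTest : public CppUnit::TestFixture {
      CPPUNIT_TEST_SUITE( DataSvcTest );
      CPPUNIT_TEST( testReadViaStub );
      CPPUNIT_TEST_SUITE_END();
    public:
      void testReadViaStub() {
        PersistencyStub stub;
        // A real test would hand the stub to DataSvc and run a use case.
        CPPUNIT_ASSERT_EQUAL( std::string( "canned-object" ),
                              stub.read( "any-token" ) );
      }
    };

    CPPUNIT_TEST_SUITE_REGISTRATION( DataSvcTest );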

System testing:

A set of black-box tests should be run in order to validate the entire sw-product against its requirements. This job is the responsibility of the Sw-infrastructure & testing team within the project. The tests must systematically validate each requirement. If the requirements are organized by use cases, testing the use cases is simpler than testing the individual requirements one by one. The types of system tests foreseen here are functionality and performance tests; to check which ones are already implemented, see section 4.1.

Acceptance testing:

Acceptance testing is intended to compare the end sw-product with the current needs of the end users. As a first step (internal acceptance tests), it should include all the examples offered to the users (Mandatory). The key persons in the experiments who will try to use the sw-product should be identified (in the schedule and responsibilities section) and their own examples should be included in the acceptance test suite. Standard inputs and reference outputs for comparison must also be provided with these validation tests. These validation examples should perform progressive testing (to check the new features of the new sw-product) and regressive testing (to check that what was there in the past still works and that backward compatibility holds). A sketch of a reference-output comparison is given below.
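
The standard inputs and reference outputs mentioned above can be handled, for instance, by a small helper that compares the test output with a stored reference file line by line. A sketch in plain C++; the file names are hypothetical examples.

    #include <fstream>
    #include <iostream>
    #include <string>

    // Compare a test's output file with its stored reference, line by
    // line. Returns true when they match exactly.
    bool matchesReference( const std::string& outputFile,
                           const std::string& referenceFile )
    {
      std::ifstream out( outputFile.c_str() );
      std::ifstream ref( referenceFile.c_str() );
      if ( !out || !ref ) return false;

      std::string outLine, refLine;
      while ( std::getline( ref, refLine ) ) {
        if ( !std::getline( out, outLine ) || outLine != refLine ) return false;
      }
      // The output must not contain extra lines beyond the reference.
      return !std::getline( out, outLine );
    }

    int main()
    {
      bool ok = matchesReference( "writeReadTest.out", "writeReadTest.ref" );
      std::cout << ( ok ? "PASS" : "FAIL" ) << std::endl;
      return ok ? 0 : 1;
    }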

  3.2- Test Documents

The sw-testing activity should be documented through the following documents: this test plan and, for each test, a test case specification (see section 3.3 and the glossary).

  3.3- Work Products: Test deliverables

Apart from the present test plan document, a test deliverable consists of: an executable (binary or script), a set of input data (optional), a reference output for the test validation, and a clear description (using the proposed template) of the aim of the test together with the specific environment needed to run it. The deliverable must work within the sw-configuration management tool, sw-testing frameworks and nightly building systems proposed by SPI (Mandatory).

  3.4- Testing procedure: suspension/resumption criteria

Tests should be run from bottom to top in the test phase hierarchy. As soon as a test fails in one phase, the following phases are invalidated until the fault is fixed. It is up to the sw-infrastructure and testing team to decide whether the test procedure continues. Within a test phase, the most critical tests should be run first. The testing tools proposed by SPI offer mail alerts if desired. The suspension rule is sketched below.
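
A sketch of this suspension rule as a simple driver (the phase and test names are hypothetical; in practice the runs are steered by the SPI nightly building system):

    #include <cstddef>
    #include <iostream>
    #include <string>
    #include <vector>

    struct Test { std::string name; bool (*run)(); };

    // Phases are ordered bottom-up; tests within a phase are ordered
    // most-critical first.
    bool runPhases( const std::vector< std::vector<Test> >& phases )
    {
      for ( std::size_t p = 0; p < phases.size(); ++p )
        for ( std::size_t t = 0; t < phases[p].size(); ++t )
          if ( !phases[p][t].run() ) {
            // Suspend: the remaining phases are invalidated
            // until the fault is fixed.
            std::cout << "FAIL: " << phases[p][t].name << std::endl;
            return false;
          }
      return true;
    }

    bool cacheTest() { return true; }   // hypothetical placeholder

    int main()
    {
      std::vector< std::vector<Test> > phases( 1 );
      Test t = { "unit/Cache", &cacheTest };
      phases[0].push_back( t );
      return runPhases( phases ) ? 0 : 1;
    }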

  3.5- Requirements Validation

A document describing the requirements of the POOL framework is in preparation. Nevertheless, a full set of system requirements can be extracted from the description documents of the components whose interfaces are exposed to the framework clients: DataSvc, FileCatalog and Collection. The collected requirements are mapped to the implemented test cases.

A Traceability Matrix can be used to perform the mapping between software design requirements and software test cases, as illustrated below.
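
For illustration, a fragment of such a matrix might look as follows; the requirement identifiers are hypothetical, while the test cases are taken from section 4.1.

    Requirement                           Component   Test case
    R-01  Write and read back an object   DataSvc     Storage operation: Write/Read
    R-02  Update a stored object          DataSvc     Storage operation: Write/Update
    R-03  Update a collection             Collection  Collection Update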



4.- Test Scope

  4.1- Features to be Tested

In order to achieve a comprehensive and cumulative regression test suite, all the test cases should be documented as explained in the test deliverables item. A list of all the test cases implemented, by testing phase and/or by component, is needed. This information can be stored in an external document or in this section.

Unit tests
    AttributeList:
    Collection:
    DataSvc:
       Cache functionality
       DataSvc functionality - Stub for PersistencySvc
       DataSvc storage operation - Stub for PersistencySvc
       Ref functionality - Stub for PersistencySvc
   EDGCatalog:
   FileCatalog:
   ImplicitCollection:
   MySQLCatalog:
   MySQLCollection:
   POOLCore:
   PersistencySvc:
   RootCollection:
   RootStorageSvc:
   StorageSvc:
   XMLCatalog:
 
Functionality tests
    Ref and Cache:
       Functionality of a generic Ref object
       Functionality of a Ref object qualified as a const
       Functionality of a static Ref object
       Functionality of a polymorphic Ref object
       Storage operation: Write/Read
       Storage operation: Write/Update
       Storage operation: Write/Delete
       Object navigation: homogeneous tree
       Append new items (no association) on an existing database
       Append new items referencing existing items
       Append new items (associated to other new items) on existing databases
       Storage of classes related through inheritance
       Storage of classes referencing external object through pointers
       Storage of classes using some STL containers
       Handling of related and unrelated classes through Refs
       Pointers ownership and cache clean up policy
    Persistency service:
       General functionality
    File catalogue:
    Collection:
       Collection Write
       Collection Read
       Collection Update
       Collection multi file Write
       Collection multi file Update
       Collection file info retrieve

Performance tests
    Ref and Cache:
    Persistency service:
    File catalogue:
    Collection:
 

  4.2- Features Not to be Tested

 


5.- Test Environment 

  5.1- Hardware

As LCG AppArea sw-products will be delivered for different types of platforms, sw-testing should run on the same platforms for which the sw-product is delivered. Check the SPI Web pages for the currently supported platforms.

The tests are executed on all the supported platforms.

  5.2- Software

Sw-testing code should be produced in accordance with the rules and using the test frameworks proposed by SPI (Mandatory). The rules are written as Sw-testing policies for the coding phase of the sw-testing procedure. These rules include test directory structure, test naming conventions, conventions for message output, etc., and can be found at http://spi.cern.ch/software_development.html. The proposed test frameworks are: CppUnit, PyUnit, ... (X-Unit family) and Oval. The updated list of these tools (together with HowTos and examples) can be found at the SPI Sw-Testing pages; the tools are provided by the SPI external software service. A minimal CppUnit driver is sketched below.
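
As an example of standard CppUnit usage, a minimal driver that runs every registered test suite could look as follows; the integration with the SPI test frameworks and the nightly builds adds project-specific wrapping around such a driver.

    #include <cppunit/extensions/TestFactoryRegistry.h>
    #include <cppunit/ui/text/TestRunner.h>

    // Collect every suite registered via CPPUNIT_TEST_SUITE_REGISTRATION
    // and run it, printing the results to stdout.
    int main()
    {
      CppUnit::TextUi::TestRunner runner;
      runner.addTest( CppUnit::TestFactoryRegistry::getRegistry().makeTest() );
      bool wasSuccessful = runner.run();
      return wasSuccessful ? 0 : 1;
    }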

The tests will be completely integrated with the proposed SPI nightly building system.



6.- Schedule and Responsibilities

6.1 - Unit or Component test

For all projects within the LCG AppArea, unit tests should be written as the code is written (Mandatory). The owner of the package is responsible for providing these unit tests, together with their test case specifications, as described in section 3.1.

6.2 - Integration test

The owner of the TOP package in the architecture hierarchy has to provide tests proving the functionality of the integrated set of software layers.
Integration tests in POOL are provided for the following components:

PersistencySvc - depends on StorageSvc, FileCatalog
DataSvc - depends on PersistencySvc, FileCatalog
Collection - depends on DataSvc and/or PersistencySvc

6.3 - System test

The infrastructure WP is responsible for the system testing phase. System tests in POOL are provided for the following components:

FileCatalog - depends on external packages
DataSvc - depends on PersistencySvc, FileCatalog
Collection - depends on DataSvc and/or PersistencySvc

6.4 - Acceptance test

The infrastructure WP is responsible for the acceptance testing phase. The integration of POOL in the experiment frameworks has recently started. The developers involved in the integration act as contact persons for the requirements and for the validation of the released versions.

ATLAS: Valeri Fine
CMS: William Tanenbaum
LHCb: Markus Frank

6.5 - Test plan document and test case documents

The infrastructure WP is responsible for this test plan document. As part of the test deliverable, the authors of the tests should provide a test case specification (Mandatory).

6.6 - Run the tests

The infrastructure WP is responsible for running the complete test suite in the nightly and pre-release builds and for keeping the corresponding test log (see section 3.1).

6.7 - Bug tracking

It is the responsibility of the developers and of the Sw-infrastructure and testing team (depending on the type of bug) to perform the bug tracking and to convert into tests all those bugs suitable for use in the regression testing scheme (Mandatory). One possible convention is sketched below.
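
One common convention, assumed here for illustration rather than prescribed by SPI, is to turn each fixed bug into a test named after its report, so that the regression suite documents its own origin. The bug number and scenario below are hypothetical.

    #include <cppunit/extensions/HelperMacros.h>

    // Regression test reproducing a (hypothetical) fixed bug report:
    // it fails while the fault is present and passes once corrected,
    // guarding against the fault reappearing in later releases.
    class Bug00123RegressionTest : public CppUnit::TestFixture {
      CPPUNIT_TEST_SUITE( Bug00123RegressionTest );
      CPPUNIT_TEST( testFaultStaysFixed );
      CPPUNIT_TEST_SUITE_END();
    public:
      void testFaultStaysFixed() {
        // Re-run here the exact scenario reported in the bug and
        // assert the corrected behaviour.
        CPPUNIT_ASSERT( true );  // placeholder for the real scenario
      }
    };

    CPPUNIT_TEST_SUITE_REGISTRATION( Bug00123RegressionTest );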


7.- Software Testing Policies

Apart from the mandatory testing actions and schedule already defined in this document, the sw-testing procedure must be performed in full accordance with the Sw-Testing Policies defined within the LCG AppArea. These Sw-Testing Policies can be found at the SPI Sw-development Web pages.



8.- Glossary




Acceptance testing  is the process of comparing the end product to the current needs of its end users. It is normally performed by the organization ordering the sw-product or by the end users. The system is tested with real data and in the real environment. This is often the validation of the system. Syn: alpha and beta testing.
Integration testing  involves tests with the purpose of verifying that the units work together correctly. It is an orderly progression of testing in which tested software components are combined and tested to evaluate their interactions. The unit/component under test uses all the other components it depends on. The use cases of the interface have to be matched with the test cases.
Test case documentation specifying inputs, predicted results, and a set of execution conditions for a test item. Syn: test case specification. 
Test deliverable  a test deliverable consists of: an executable (binary or script), a set of input data (optional), a reference output for the test validation, and a clear description (using the proposed template) of the aim of the test together with the specific environment needed to run it. The deliverable must work within the sw-configuration management tool, sw-testing frameworks and nightly building systems proposed by SPI.
System testing is the process of testing an integrated hardware and software system to verify that the system meets its specified requirements. 
Unit testing  means that one and only one unit is tested as such. This test requires that the unit is independent of the other units. The unit can be a class, but it can also be a small group of collaborating classes. Syn: component testing.

 


Template version: TEST_testplan-template-00.01 (Draft)
Thu May 15 17:19:36 CEST 2003