ARC1/Testing

From NorduGrid


NOTE: This page is out of date and is kept for purely historical reasons.
See Testing

This page is dedicated to test planning and detailed specifications. For now it contains just an outline of the main components and functionality of ARC to be tested. More detailed input will be added continuously. Any input is welcome.

An outline of the testing process

The testing can be divided into two parts: unit testing and system/integration testing. The former concentrates on the lowest-level components of the software, while the latter probes larger subsystems or the system as a whole. A schematic outline of the testing procedure is shown in the figure below.

Figure: schematic outline of the testing procedure (Test flow.png)

Unit tests

If developers adhere to this approach of component testing (unit testing), the low-level tests can be performed automatically after each successful build. In any case, some form of transparent testing of the lowest-level components can be very helpful in building confidence in the software.

System/integration tests

I have divided system/integration tests into three subgroups; accordingly, three test suites will be developed. The test suites should run automatically after a successful installation of ARC.

Black box testing

The first and most important one is the "alien" layer, which comprises all functional requirements an end user could place on ARC.

Functional tests:

  • Site registration & authorization.
  • Monitoring and logging.
  • Single/bulk job submission.
  • Job management (control, kill, clean, restart).
  • Storage elements (SE). Copy/move files to/from/between SE.

All of the standard commands (apclean, apecho, apget, apinfo, apkill, approxy, apsstat, apstat, apsub, arccp, arcls, arcrm and arcstat) will be tested during this step.
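A black-box suite like this boils down to running client commands and checking their exit status and output. The following sketch shows one possible harness shape; the `run` helper is an assumption for illustration, and plain `echo` stands in for the real client tools, which the actual suite would invoke instead:

```python
import subprocess

def run(cmd):
    """Run one client command; a non-zero exit code marks the
    corresponding functional test as failed."""
    proc = subprocess.run(cmd, capture_output=True, text=True)
    return proc.returncode, proc.stdout.strip()

# Stand-in command for illustration; a real suite would invoke
# apsub, apstat, apget, apkill, arccp, arcls, ... in sequence.
code, out = run(["echo", "job submitted"])
submission_ok = (code == 0)
```

The same `run` helper can drive every command in the list above, with each step's pass/fail result collected into a report.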

Glass (White) box testing

Finer-grained testing of distinct components of ARC, based on the specifications in the design document (D1.1).

  • Hosting environment.
  • Information system.
  • Execution management capability.
  • Data management.
  • Resource management.
  • Security.
  • Self Management.

Performance testing

Tests from deliverable D5.4:

  • CPU and MEM usage under different conditions.
  • Job submission component (jobs submitted per minute, job ratio ...).
  • Information system response time.
  • Resource discovery.
  • Data staging.
  • Data copy.

We will track performance changes between subsequent builds of ARC.
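Tracking performance between builds amounts to comparing metric snapshots and flagging regressions beyond some tolerance. A minimal sketch, with hypothetical metric names and numbers (higher values are assumed to mean worse performance):

```python
def regressions(prev, curr, tolerance=0.10):
    """Compare metric dictionaries from two builds; report metrics where
    the new build is more than `tolerance` (fraction) worse."""
    worse = {}
    for name, old in prev.items():
        new = curr.get(name)
        if new is not None and old > 0 and (new - old) / old > tolerance:
            worse[name] = (old, new)
    return worse

# Hypothetical measurements from two subsequent builds:
build_a = {"submit_s_per_job": 1.2, "infosys_response_s": 0.30}
build_b = {"submit_s_per_job": 1.5, "infosys_response_s": 0.31}
flagged = regressions(build_a, build_b)
```

Here job submission slowed by 25% and would be flagged, while the information-system response time stays within the 10% tolerance.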

Test specifications

The following text specifies tests that are run automatically upon subsequent revisions of the ARC1 trunk repository. The test results can be viewed at the dedicated vls.grid.upjs.sk/testing webpage.

AREX

A-REX is tested by submitting, controlling, downloading, and cleaning the following jobs:

  • get hostname. If the job finishes successfully, the hostname of the ARC1 server shall be present in the out.txt file.
  • shell http. This job stages in an input file which is then processed.
  • http stage in. This job stages in an input file which is then processed.
  • http stage out. This job produces a file which is then staged out to a remote HTTP server.
  • ftp stage in. This job stages in an input file from a remote FTP server, which is then processed.
  • ftp stage out. This job produces a file which is then staged out to a remote FTP server.
  • gsi stage in. This job stages in a Python script from the client and an input file from a remote GSI server. The Python script uses the input file to create output, which is then staged out to the client.
  • gsi stage out. This job stages in a Python script from the client. The script generates an output file which is then staged out to a remote GSI server.
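Each of these jobs ends with a check on the downloaded output. For the "get hostname" job, that check is a simple substring test on out.txt; a sketch of the verification step (the local temporary file merely simulates a downloaded out.txt for illustration):

```python
import os
import socket
import tempfile

def check_hostname_job(out_txt_path, expected_host):
    """After the 'get hostname' job finishes and out.txt is downloaded,
    the file should contain the hostname of the ARC1 server."""
    with open(out_txt_path) as f:
        return expected_host in f.read()

# Simulate a downloaded out.txt using the local machine's hostname:
tmp = tempfile.mkdtemp()
path = os.path.join(tmp, "out.txt")
host = socket.gethostname()
with open(path, "w") as f:
    f.write(host + "\n")
ok = check_hostname_job(path, host)
```

The staging jobs follow the same pattern: compare the staged-out file's content against what the job script was expected to produce.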


AREX SECURITY

The A-REX security feature is tested by submitting the same set of jobs to A-REX while A-REX runs with several different policies, one after another.

ECHO

There are currently three forms of ECHO service provided by ARC1. ECHO is available from C++, Python and Java.

Tests for ECHO C++

  1. Using the arcecho client tool. The expected output is <message> enclosed in the wrapper specified in the configuration file. Usage:
     $ arcecho service_url <message> 
  2. Using the curl GNU utility. Usage:
     $ curl -d <message> service_url 
     where <message> is XML-formatted text. For example:
'<?xml version="1.0"?><soap-env:Envelope xmlns:soap-enc="http://schemas.xmlsoap.org/soap/encoding/"
xmlns:soap-env="http://schemas.xmlsoap.org/soap/envelope/" 
xmlns:xsd="http://www.w3.org/2001/XMLSchema" 
xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xmlns:star="urn:echo">
<soap-env:Body>
<star:echo>
<star:say>HELLO</star:say>
</star:echo>
</soap-env:Body>
</soap-env:Envelope>'

The expected output is XML-formatted text with the HELLO string enclosed in the wrapper specified in the configuration file.
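Checking that the echoed string survives the round trip can be done mechanically by parsing the SOAP envelope. The sketch below parses the request envelope shown above to extract the message; a real test would parse the service's reply the same way, but the response element names depend on the service and are not shown in this document, so they are not assumed here:

```python
import xml.etree.ElementTree as ET

# The request body from the curl example above (wiki markup removed):
ENVELOPE = """<?xml version="1.0"?>
<soap-env:Envelope
    xmlns:soap-env="http://schemas.xmlsoap.org/soap/envelope/"
    xmlns:star="urn:echo">
  <soap-env:Body>
    <star:echo>
      <star:say>HELLO</star:say>
    </star:echo>
  </soap-env:Body>
</soap-env:Envelope>"""

NS = {"soap": "http://schemas.xmlsoap.org/soap/envelope/",
      "star": "urn:echo"}
root = ET.fromstring(ENVELOPE)
say = root.find("soap:Body/star:echo/star:say", NS)
message = say.text
```

The same check applies unchanged to the C++, Python and Java incarnations of the service, since only the transport endpoint differs.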

Tests for ECHO python

  1. Using the curl GNU utility. Usage:
     $ curl -d <message> service_url 
     where <message> is XML-formatted text. For example:
'<?xml version="1.0"?><soap-env:Envelope xmlns:soap-enc="http://schemas.xmlsoap.org/soap/encoding/"
xmlns:soap-env="http://schemas.xmlsoap.org/soap/envelope/" 
xmlns:xsd="http://www.w3.org/2001/XMLSchema" 
xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xmlns:star="urn:echo">
<soap-env:Body>
<star:echo>
<star:say>HELLO</star:say>
</star:echo>
</soap-env:Body>
</soap-env:Envelope>'

The expected output is XML-formatted text with the HELLO string enclosed in the wrapper specified in the configuration file.

Tests for ECHO Java (not currently available)

  1. Using the curl GNU utility. Usage:
     $ curl -d <message> service_url 
     where <message> is XML-formatted text. For example:
'<?xml version="1.0"?><soap-env:Envelope xmlns:soap-enc="http://schemas.xmlsoap.org/soap/encoding/"
xmlns:soap-env="http://schemas.xmlsoap.org/soap/envelope/" 
xmlns:xsd="http://www.w3.org/2001/XMLSchema" 
xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xmlns:star="urn:echo">
<soap-env:Body>
<star:echo>
<star:say>HELLO</star:say>
</star:echo>
</soap-env:Body>
</soap-env:Envelope>'

The expected output is XML-formatted text with the HELLO string enclosed in the wrapper specified in the configuration file.

HOPI (previously named HTTPD)

The HOPI service is tested by listing the content of the published directory and with the wget GNU utility:

Listing

  1. The published HTML page is checked to verify that every file and directory within the published directory is present and accessible via a hyperlink.

wget

  1. The published directory is recursively downloaded, and the MD5 checksums of the downloaded files are compared to the checksums of the local files in the published directory.
  2. The structure of the downloaded directory is compared to the structure of the published directory.
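Both wget checks can be expressed as one comparison: map every relative file path to its MD5 digest, then compare the two maps. Equal key sets imply equal directory structure, equal digests imply equal content. A runnable sketch (the two local temporary trees merely stand in for the published and downloaded directories):

```python
import hashlib
import os
import shutil
import tempfile

def md5sums(top):
    """Map relative path -> MD5 digest for every file below `top`."""
    sums = {}
    for dirpath, _dirs, files in os.walk(top):
        for name in files:
            full = os.path.join(dirpath, name)
            with open(full, "rb") as f:
                sums[os.path.relpath(full, top)] = hashlib.md5(f.read()).hexdigest()
    return sums

def mirrors_match(published, downloaded):
    """Steps 1 and 2 at once: identical checksum maps mean identical
    content and identical directory structure."""
    return md5sums(published) == md5sums(downloaded)

# Illustrative run on two local temporary trees:
src = tempfile.mkdtemp()
os.makedirs(os.path.join(src, "sub"))
with open(os.path.join(src, "a.txt"), "w") as f:
    f.write("hello")
with open(os.path.join(src, "sub", "b.txt"), "w") as f:
    f.write("world")
dst = os.path.join(tempfile.mkdtemp(), "mirror")
shutil.copytree(src, dst)
same = mirrors_match(src, dst)
```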

STORAGE

The STORAGE service is tested through its client, arc_storage_cli. All of its methods (make, stat, list, put, get, del, move, unmake) are tested.

MAKE

  1. root
  2. some collection in root
  3. recursive collections up to 100
  4. some collection within recursive tree
  5. without target

STAT

  1. empty root
  2. file
  3. non empty collection
  4. empty collection
  5. bunch of files within collection tree
  6. non existing path
  7. root
  8. without target

LIST

  1. root
  2. file
  3. non empty collection
  4. empty collection
  5. non existing path
  6. without target

PUT

  1. to root
  2. to existing collection
  3. to non existing collection
  4. to existing file
  5. bunch of files somewhere in the collection tree
  6. without target

GET

  1. bunch of files from the collection tree
  2. to non existing local path
  3. to new name
  4. collection
  5. without source

DEL

  1. non existing path
  2. collection
  3. file
  4. bunch of files from the collection tree
  5. without target

MOVE

  1. to new name
  2. non existing file
  3. to existing file
  4. to existing collection
  5. to non existing path
  6. without target

UNMAKE

  1. collection in root
  2. collection within recursive tree
  3. recursive collection up to 100
  4. file
  5. non empty collection
  6. non existing path
  7. root
  8. without target
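Case lists like the ones above lend themselves to table-driven tests: each entry is a (method, target, expected result) tuple run through the client. The sketch below demonstrates the idea against a tiny in-memory stand-in; the FakeStorage class and its exact failure rules are assumptions for illustration, while the real suite drives arc_storage_cli itself:

```python
class FakeStorage:
    """Tiny in-memory stand-in for arc_storage_cli (illustrative only)."""
    def __init__(self):
        self.collections = {"/"}   # known collection paths
        self.files = set()         # known file paths

    def _parent(self, path):
        return path.rsplit("/", 1)[0] or "/"

    def make(self, path):
        # MAKE fails for an existing path or a missing parent collection.
        if path in self.collections or self._parent(path) not in self.collections:
            return False
        self.collections.add(path)
        return True

    def put(self, path):
        # PUT fails when the target collection does not exist.
        if self._parent(path) not in self.collections:
            return False
        self.files.add(path)
        return True

    def stat(self, path):
        return path in self.collections or path in self.files

s = FakeStorage()
results = {
    "make in root": s.make("/col"),           # expected: success
    "put to existing": s.put("/col/f"),       # expected: success
    "put to non existing": s.put("/nope/f"),  # expected: failure
    "stat file": s.stat("/col/f"),            # expected: success
    "stat non existing": s.stat("/ghost"),    # expected: failure
}
```

Extending the table to the remaining cases (list, get, del, move, unmake, "without target", recursion up to 100) keeps the whole STORAGE suite in one data structure.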

CHARON (previously named PDP, the Policy Decision Point)

For now the test only verifies that the service runs with the example configuration file from the trunk (after trivial modifications), and that it returns the correct answer based on a predefined policy and a preformatted request sent through its client (arcdecision).

Operating systems

Based on discussion during WP5 meeting the following platforms were selected for integration testing:

  • Fedora 8
  • OpenSuSE 10.3
  • Ubuntu 8.04
  • CentOS 5.1

Snapshots and the relevant .rpm packages for these operating systems are downloaded, compiled (in the case of installation from sources), installed, and tested daily. The test results, the actual test scripts, and the test jobs can be found on the dedicated site.

Bugtracking system

Results of all tests are archived and accessible on the testing site. Discovered bugs will be filed in Bugzilla, and/or the responsible developer will be contacted directly if one can be identified easily.