Running tests

There are two basic categories of tests in TARDIS: 1) unit tests and 2) integration tests. Unit tests check the outputs of individual functions, while integration tests check entire runs for different setups of TARDIS.

The unit tests run very quickly and are therefore executed after every suggested change to TARDIS. The integration tests are much more costly and are therefore only executed every few days on a dedicated server.
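To illustrate the distinction, a unit test is simply a function that pytest collects and runs, asserting on the output of one function in isolation. The function below is a toy stand-in for illustration, not actual TARDIS code:

```python
# Toy illustration of a unit test (not actual TARDIS code): pytest
# collects any function whose name starts with "test_" and runs the
# assertions inside it.

def doppler_factor(beta):
    # Hypothetical stand-in for a small physics helper: first-order
    # Doppler factor for a velocity given as a fraction of c.
    return 1.0 - beta

def test_doppler_factor():
    # A unit test checks the output of one function for known inputs.
    assert doppler_factor(0.0) == 1.0
    assert doppler_factor(0.5) == 0.5
```

An integration test, by contrast, runs a full TARDIS simulation and compares the result against stored reference files.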

Both test suites are based on pytest and the excellent astropy-setup-helpers package.

Running the unit tests

This is very straightforward to run on your own machine. For the very simple unit tests you can run:

> python setup.py test

Running the more advanced unit tests requires atomic data that can be downloaded (atom_data):

> python setup.py test --args="--atomic-dataset=kurucz_cd23_chianti_H_He.h5"

Running the integration tests

These tests require reference files against which the results of the various TARDIS runs are compared. So you first need to either download the current reference files or generate new ones.

Both of these require a configuration file for the integration tests:

atom_data_path: "~/projects/tardis/integration/"

# This section holds information about mechanism of saving the HTML
# report of the tests.
# "save_mode" is mandatory: It can be either "local" or "remote".
report:
  save_mode: "local"

  # This section contains credentials for dokuwiki instance.
  # It is mandatory if "save_mode" is "remote", else can be removed.
  dokuwiki:
    url: ""
    username: "private"
    password: "private"

  # If "save_mode" is "local", a sub directory will be made in this
  # directory according to commit hash (shortened), and it will contain
  # the complete report content.
  reportpath: "~/projects/tardis/integration"

# Path to directory containing reference HDF files.
reference: "~/projects/tardis/integration/"

# Path to directory where reference HDF files will be generated and
# saved during the test run. Use "--generate-reference" flag in command
# line args for the purpose, for other cases this will be simply ignored.
generate_reference: "~/projects/tardis/integration/"

# Speeds up test execution by reducing amount of packets per iteration,
# useful for debugging problems in testing infrastructure itself.
# Use "--less-packets" in command line args, for other cases this will be
# simply ignored. This section is not mandatory.
less_packets:
  no_of_packets: 400
  last_no_of_packets: 1000
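The mandatory/optional rules described in the comments above can be sketched as a small validation step. The section names (report, dokuwiki, less_packets) follow the example file, but the function itself is an illustration, not the actual test-suite code:

```python
# Hypothetical sketch of the checks implied by the comments in the
# integration-test configuration file above. The real test suite may
# validate these keys differently.

def validate_config(config):
    report = config.get("report", {})
    save_mode = report.get("save_mode")
    # "save_mode" is mandatory and must be "local" or "remote".
    if save_mode not in ("local", "remote"):
        raise ValueError('"save_mode" is mandatory: "local" or "remote"')
    # The dokuwiki credentials are only mandatory for "remote".
    if save_mode == "remote" and "dokuwiki" not in report:
        raise ValueError('"remote" save_mode requires dokuwiki credentials')
    # "less_packets" is optional and only honoured with --less-packets.
    return True

# A minimal "local" configuration passes without dokuwiki credentials.
config = {
    "atom_data_path": "~/projects/tardis/integration/",
    "report": {"save_mode": "local",
               "reportpath": "~/projects/tardis/integration"},
    "reference": "~/projects/tardis/integration/",
}
validate_config(config)
```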

Inside the atomic data directory there needs to be atomic data for each of the setups provided in the test_integration folder. If no reference files are available, the first step is to generate them. The --less-packets option is useful for debugging: it uses very few packets to generate the references and thus makes the process much faster - THIS IS ONLY FOR DEBUGGING PURPOSES. The -s option ensures that TARDIS prints out its progress.
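Put together, the generation step plausibly looks like the following. The --integration-tests flag name pointing at the YAML configuration file is an assumption for illustration; --generate-reference, --less-packets and -s are the options described above:

```shell
# Hedged sketch: generate new reference files (flag name for the
# configuration file is hypothetical).
python setup.py test --args="--integration-tests=integration.yml \
    --generate-reference --less-packets -s"
```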

To run the tests after the references have been generated with the --generate-reference flag, all that is needed is:
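A hedged sketch of that invocation, again assuming a hypothetical --integration-tests flag name for pointing at the configuration file:

```shell
# Hedged sketch: run the integration tests against the existing
# reference files (same invocation, without --generate-reference).
python setup.py test --args="--integration-tests=integration.yml -s"
```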

Setting up the Dokuwiki report

A normal DokuWiki installation is performed on the required server. Before the connection works, one is required to enable the remote access option in the settings. If this is not done, the dokuwiki python plugin will not connect and fails with the warning DokuWikiError: syntax error: line 1, column 0. One also has to enable remote access for the relevant users (the remoteuser option); otherwise the error ProtocolError for xmlrpc.php?p=xxxxxx&u=tardistester: 403 Forbidden will appear.

Another important configuration option is htmlok, which enables embedded HTML; without it the HTML page reports will not render properly.
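These server-side settings live in DokuWiki's conf/local.php. A sketch of the relevant lines, where the remoteuser value is a placeholder taken from the error message above:

```php
<?php
// conf/local.php -- settings needed before report uploads can work.
$conf['remote']     = 1;              // enable the XML-RPC remote API
$conf['remoteuser'] = 'tardistester'; // users allowed to use the API (placeholder)
$conf['htmlok']     = 1;              // allow embedded HTML in pages
```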

Finally, one has to run the tests with the --remote-data option to allow posting to an external DokuWiki server.