Fix testing documentation

refs #4639
This commit is contained in:
Johannes Meyer 2014-04-10 15:54:50 +02:00
parent 1db9247d0d
commit 7903d44af9
7 changed files with 10 additions and 519 deletions

View File

@ -1,78 +0,0 @@
# Frontend component tests
Frontend tests test your code from the user's perspective: by opening a specific URL, executing a few clicks, key strokes, etc.
and expecting something to happen. We use [CasperJS](http://casperjs.org/) for frontend testing, which is basically a
headless WebKit browser.
**NOTE**: The 1.1.0DEV version does *NOT* work at this time as the api changed. Use the stable 1.0.3 branch instead.
In order to be able to run the frontend tests, you need a running instance of icingaweb. Make sure that you don't depend
on this instance's state after running the tests, as they could change preferences or configuration.
## Writing tests
### Test bootstrap - icingawebtest.js module
The icingawebtest.js module is required for proper testing, as this module eases casperjs usage. After importing the
module with:
var icingawebtest = require('./icingawebtest');
You only need two methods for testing:
* *getTestEnv()*: This method returns a modified casperjs test environment. The difference to the normal casperjs object
is that all methods which take a URL are overloaded so you can pass a relative URL if you want to (and
normally you don't want to hardcode your test URLs)
Example:
var casper = icingawebtest.getTestEnv();
* *performLogin()*: This calls the login page of your icingaweb instance and tries to log in with the supplied credentials
icinga.performLogin();
Login is performed with the credentials from the CASPERJS_USER/CASPERJS_PASS environment variables (these can be set with the
./runtest --user %user% --pass %pass% arguments). The host, port and path are likewise available as the
CASPERJS_HOST, CASPERJS_PORT and CASPERJS_PATH environment settings. The defaults in runtest resemble the setup that
works best in the vagrant development environment:
* The default user is 'jdoe'
* The default password is 'password'
* The host and port are localhost:80
* The default path is icinga2-web
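Putting these pieces together, the defaults could be resolved roughly like this (a sketch, not part of the test suite; the variable names come from the text above, the fallback values are the documented vagrant defaults):

```javascript
// Resolve the CASPERJS_* settings with the documented vagrant defaults
var env = process.env;

var config = {
    user: env.CASPERJS_USER || 'jdoe',
    pass: env.CASPERJS_PASS || 'password',
    host: env.CASPERJS_HOST || 'localhost',
    port: env.CASPERJS_PORT || '80',
    path: env.CASPERJS_PATH || 'icinga2-web'
};

// The base URL every relative test URL would be resolved against
var baseUrl = 'http://' + config.host + ':' + config.port + '/' + config.path;

console.log(baseUrl);
```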
### Writing the test code
Most tests will require you to log in with the supplied credentials; this can be done with a simple call:
icinga.performLogin();
You can then start the test by calling casper.thenOpen with the page you want to test:
casper.thenOpen("/mysite", function() {
// perform tests
});
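The URL overloading mentioned above can be sketched as follows (resolveUrl is a hypothetical helper illustrating the idea, not part of icingawebtest.js):

```javascript
// Relative URLs are resolved against the instance's base URL,
// absolute URLs pass through untouched
var baseUrl = 'http://localhost:80/icinga2-web';

function resolveUrl(url) {
    if (/^https?:\/\//.test(url)) {
        return url;             // already absolute, leave untouched
    }
    if (url.charAt(0) === '/') {
        return baseUrl + url;   // relative to the icingaweb instance
    }
    return baseUrl + '/' + url;
}

console.log(resolveUrl('/mysite'));  // http://localhost:80/icinga2-web/mysite
```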
### Testing
Afterwards, everything is like a normal CasperJS test, so you can wrap your assertions in a casper.then method:
// assert a specific title
casper.then(function() {
this.test.assertTitle("Just an empty page");
});
Note that asynchronous calls require you to wait, or to provide a callback that runs once the resource is loaded:
// waitForSelector calls callbackFn as soon as the selector returns a non-empty set
casper.waitForSelector("div#icinga-main a", callbackFn);
At the end of your test, you have to provide
casper.run(function() {
this.test.done();
});
Otherwise the tests won't be executed.

View File

@ -1,88 +0,0 @@
# Writing JavaScript tests
JavaScript tests are executed using [mocha](http://visionmedia.github.io/mocha/) as a test framework and
[should.js](https://github.com/visionmedia/should.js/) as the assertion framework.
## Mocking require.js
As we use require.js for asynchronous dependency resolution in JavaScript, this can lead to problems in our node.js
environment. In order to prevent requirejs calls from causing issues, it has been mocked in the testlib/requiremock.js class
and should be required in every testcase:
var rjsmock = require("requiremock.js");
rjsmock makes dependency management comfortable and provides the following most important methods:
// remove all registered dependencies from the rjsmock cache
rjsmock.purgeDependencies();
// register the following objects underneath the following requirejs paths:
rjsmock.registerDependencies({
'icinga/container' : {
updateContainer : function() {},
createPopupContainer: function() {}
}
});
// in your js code a require(['icinga/container'], function(container) {}) would now have the above mock
// object in the container variable
requireNew("icinga/util/async.js"); // requires icinga/util/async.js file - and ignores the requirejs cache
var async = rjsmock.getDefine(); // returns the last define; this way you can retrieve a specific JavaScript file
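Conceptually, the mock can be pictured as a registry mapping requirejs paths to objects, plus a fake define() that records its result (an assumed simplification for illustration, not the real requiremock.js implementation):

```javascript
// A registry mapping requirejs paths to mock objects
var registry = {};
var lastDefine = null;

function purgeDependencies() {
    registry = {};
}

function registerDependencies(deps) {
    for (var path in deps) {
        registry[path] = deps[path];
    }
}

// A fake define(): resolves the listed paths from the registry and
// stores whatever the factory returns, so getDefine() can hand it out
function define(paths, factory) {
    var resolved = paths.map(function (p) { return registry[p]; });
    lastDefine = factory.apply(null, resolved);
}

function getDefine() {
    return lastDefine;
}

// Usage: register a mock, then "load" a module that depends on it
registerDependencies({ 'icinga/container': { updateContainer: function () {} } });
define(['icinga/container'], function (container) {
    return { hasContainer: typeof container.updateContainer === 'function' };
});
console.log(getDefine().hasContainer); // true
```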
## Faking async responses
As we currently use the icinga/util/async.js class for all asynchronous requests, it's easy to fake responses. The asyncmock.js
class provides functions for this. To use it in your test, you first have to require it:
var asyncMock = require("asyncmock.js");
You now can use asyncMock.setNextAsyncResult((async) asyncManager, (string) resultString, (bool) fails, (object) headers) to
let the next request of the passed asyncManager object return resultString as the response, with the headers provided as the
last parameter. If fails = true, the error callback of the request will be called.
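The mechanism can be sketched like this (an assumed simplification of asyncmock.js, with a toy manager object standing in for the real async manager):

```javascript
// Store a canned response on the manager; the next request is answered
// with it instead of performing a real XHR
function setNextAsyncResult(manager, resultString, fails, headers) {
    manager.nextResponse = {
        body: resultString,
        fails: !!fails,
        headers: headers || {}
    };
}

// A toy async manager: createRequest() "performs" a request and feeds
// the canned response to the success or error callback
var manager = {
    createRequest: function (onSuccess, onError) {
        var response = this.nextResponse;
        if (response.fails) {
            onError(response);
        } else {
            onSuccess(response);
        }
        return response;
    }
};

setNextAsyncResult(manager, 'result', false, { 'X-Test-Header': 'Testme123' });
manager.createRequest(
    function (response) { console.log(response.headers['X-Test-Header']); }, // Testme123
    function () { console.log('error'); }
);
```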
## Example test
The following example describes a complete test (which checks whether the registerHeaderListener method in the async class works):
var should = require("should"); // require should.js for assertions
var rjsmock = require("requiremock.js"); // use the requiremock described above
var asyncMock = require("asyncmock.js"); // Helper class to fake async requests
GLOBAL.document = $('body'); // needed when our test accesses window.document
describe('The async module', function() { // start the test scenario
it("Allows to react on specific headers", function(done) { // Start a test case - when done is called it is finished
rjsmock.purgeDependencies(); // Remove any dependency previously declared
rjsmock.registerDependencies({ // Mock icinga/container, as this is a dependency for the following include
'icinga/container' : {
updateContainer : function() {},
createPopupContainer: function() {}
}
});
requireNew("icinga/util/async.js"); // This is the file we want to test; load it and all of its dependencies
var async = rjsmock.getDefine(); // Retrieve a reference to the loaded file
// Use asyncMock.setNextAsyncResult to let the next async request return 'result' without failing and set
// the response headers 'X-Dont-Care' and 'X-Test-Header'
asyncMock.setNextAsyncResult(async, "result", false, {
'X-Dont-Care' : 'Ignore-me',
'X-Test-Header' : 'Testme123'
});
// register a listener for results with the X-Test-Header response
async.registerHeaderListener("X-Test-Header", function(value, header) {
// test for the correct header
should.equal("Testme123", value);
// call done to mark this test as succeeded
done();
},this);
// run the faked request
var test = async.createRequest();
});
});
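The header-listener mechanism the test above exercises can be sketched as follows (an assumed simplification of icinga/util/async.js, not the real code): listeners registered for a header name are called with its value whenever a response carrying that header arrives.

```javascript
// Map of header name -> list of registered listeners
var headerListeners = {};

function registerHeaderListener(header, callback, scope) {
    (headerListeners[header] = headerListeners[header] || []).push({
        fn: callback,
        scope: scope
    });
}

// Dispatch each response header to the listeners registered for it
function handleResponse(headers) {
    for (var name in headers) {
        (headerListeners[name] || []).forEach(function (listener) {
            listener.fn.call(listener.scope, headers[name], name);
        });
    }
}

// Usage mirroring the test case above
var seen = null;
registerHeaderListener('X-Test-Header', function (value) { seen = value; }, null);
handleResponse({ 'X-Dont-Care': 'Ignore-me', 'X-Test-Header': 'Testme123' });
console.log(seen); // Testme123
```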

View File

@ -4,7 +4,7 @@
The path where you should put your PHPUnit tests should reflect the path in the source tree, with test/php/ prepended. So
if you're testing a file library/Icinga/My/File.php the test file should be at test/php/library/Icinga/My/File.php. This
also applies for modules, where the test folder is underneath modules/myModule/test/php
also applies for modules, where the tests are underneath modules/myModule/test/php
## Example test skeleton
@ -12,13 +12,8 @@ Let's assume you're testing a class MyClass underneath the MyModule module and t
modules/mymodule/library/MyModule/Helper/MyClass.php.
<?php
// The namespace is the same namespace as the file to test has, but with 'Test' prepended
namespace Test\Modules\MyModule\Helper;
// Require the file and maybe others. The start point is always the application's
// tests/php/ folder where the runtest executable can be found
require_once '../../mymodule/library/MyModule/Helper/MyClass.php';
// The namespace is the same namespace as the file to test has, but with 'Tests' prepended
namespace Tests\Module\MyModule\Helper;
class MyClassTest extends \PHPUnit_Framework_TestCase
{
@ -58,80 +53,3 @@ should **document** that the test interacts with static attributes:
// ...
}
}
## Requirements and the dependency mess
### spl_autoload_register vs. require
When looking at our test classes, you'll notice that we don't use PHP's autoloader to automatically load dependencies, but
write 'require_once' ourselves. This has the following reasons:
- When writing tests, you need to be aware of every dependency your test class includes. With autoloading, it's not directly
obvious which classes are included during runtime.
- When mocking classes, you don't need to tell your autoloader to use this class instead of the one used in production
- Tests can't be run in isolation without a bootstrap class initializing the autoloader
### How to avoid require_once massacres: LibraryLoader
The downside of this approach is obvious: especially when writing component tests you end up writing a lot of 'require'
calls. In the worst case, PHP's require_once doesn't recognize a path as already included and you end up
with a 'Cannot redeclare class XY' error.
To avoid this, you should implement a LibraryLoader class for your component that handles the require_once calls.
For example, the status.dat component tests have a StatusdatTestLoader class that includes all dependencies of the component:
namespace Tests\Icinga\Protocol\Statusdat;
use Test\Icinga\LibraryLoader;
require_once('library/Icinga/LibraryLoader.php');
/**
* Load all required libraries to use the statusdat
* component in integration tests
*
**/
class StatusdatTestLoader extends LibraryLoader
{
/**
* @see LibraryLoader::requireLibrary
*
**/
public static function requireLibrary()
{
// include Zend requirements
require_once 'Zend/Config.php';
require_once 'Zend/Cache.php';
require_once 'Zend/Log.php';
// retrieve the path to the icinga library
$libPath = self::getLibraryPath();
// require library dependencies
require_once($libPath."/Data/AbstractQuery.php");
require_once($libPath."/Application/Logger.php");
require_once($libPath."/Data/DatasourceInterface.php");
// shorthand for the folder where the statusdat component can be found
$statusdat = realpath($libPath."/Protocol/Statusdat/");
require_once($statusdat."/View/AccessorStrategy.php");
// ... a few more requires ...
require_once($statusdat."/Query/Group.php");
}
}
Now a component test (like tests/php/library/Icinga/Protocol/Statusdat/ReaderTest.php) can avoid the require calls and
just use the requireLibrary method:
use Icinga\Protocol\Statusdat\Reader as Reader;
// Load library at once
require_once("StatusdatTestLoader.php");
StatusdatTestLoader::requireLibrary();
**Note**: This should be used for component tests, where you want to test the combination of your classes. When testing
a single execution unit, like a method, it's often better to explicitly write your dependencies.
If you compare the first approach with the last one you will notice that, even if we produced more code in the end, our
test is more explicit about what it is doing. When someone updates your test, they should easily see which tests exist
and which scenarios are missing.

View File

@ -6,11 +6,10 @@ Tests for the application can be found underneath the test folder:
test/
php/ PHPUnit tests for backend code
js/ mocha tests for JavaScript frontend code unittests
frontend/ Integration tests for the frontend using casperjs
regression/ PHPUnit regression tests
The same structure applies for modules, which also contain a toplevel test folder and suitable subtests. When you fix
a bug and write a regression test for it, put the test in the 'regression' and name it %DESCRIPTION%%TicketNumber% (myBug1234.js)
a bug and write a regression test for it, put it in 'regression' and name it %DESCRIPTION%%TicketNumber% (myBug1234.php)
## Running tests

View File

@ -18,11 +18,8 @@ This list summarizes what will be described in the next few chapters:
- Your assertions should reflect one test scenario, i.e. don't write one test method that tests if something works **and**
if it correctly detects errors after it works. Write one test to determine the behaviour with correct input and one that
tests the behaviour with invalid input.
- When writing unit-tests (like function level tests), try to keep your dependencies as low as possible (the best indicator here
is the number of require calls in your test). Mock external components and inject them into the class you want to test. If
your test subject is not able to use mocked dependencies, it's often a design flaw and should be considered a bug
(and be fixed)
- When writing component tests with a lot of dependencies, wrap the require calls in a LibraryLoader subclass
- Mock external components and inject them into the class you want to test. If your test subject is not able to use mocked
dependencies, it's often a design flaw and should be considered a bug (and be fixed)
## What should be tested
@ -163,12 +160,6 @@ need much dependency handling. An example for a unittest would be to test the fo
A unit test for this UserManager could, but should not, look like this (we'll explain why):
require_once "../../library/Icinga/MyLibrary/UserManager.php";
// needed by UserManager
require_once "../../library/Icinga/Authentication/Manager.php";
require_once "../../library/Icinga/Authentication/User.php";
// .. imagine a few more require_once
use Icinga\MyLibrary\UserManager;
class UserManagerTest extends \PHPUnit_Framework_TestCase
@ -196,7 +187,7 @@ A unit test for this user could, but should not look like this (we'll explain wh
$this->assertTrue($mgr->isCorrectPassword("hans", "validpassword"));
}
This test has obviously a few issues:
This test has a few issues:
- First, it requires a precondition to apply: a database must exist with the users jdoe and jsmith, and the credentials
must match the ones provided in the test
@ -238,7 +229,6 @@ It would of course be best to create an Interface like UserSource which the Auth
we trust our Programmer to provide a suitable object. We now can eliminate all the AuthManager dependencies by mocking the
AuthManager (let's dumb it down to just providing an array of users):
require_once "../../library/Icinga/MyLibrary/UserManager.php";
use Icinga\MyLibrary\UserManager;
class AuthManagerMock
@ -286,7 +276,6 @@ AuthManager (lets dumb it down to just providing an array of users):
Ok, we might have more code here than before, but our test is now less prone to fail:
- Our test doesn't assume any preconditions to apply, like having a db server with correct users
- The require call to the AuthManager is gone, so if there's a bug in the AuthManager implementation our test is not affected
@ -368,84 +357,3 @@ Also, the assertions should get an error message that will be printed on failure
Now if something fails, we see what has been tested via the test method and what caused the test to fail via the
assertion error message. You could also keep the comments so everybody knows what you are doing.
## Testing PHP
## Requirements and the dependency mess
### spl_autoload_register vs. require
When looking at our test classes, you'll notice that we don't use PHP's autoloader to automatically load dependencies, but
write 'require_once' ourselves. This has the following reasons:
- When writing tests, you need to be aware of every dependency your test class includes. With autoloading, it's not directly
obvious which classes are included during runtime.
- When mocking classes, you don't need to tell your autoloader to use this class instead of the one used in production
- Tests can't be run in isolation without a bootstrap class initializing the autoloader
### How to avoid require_once massacres: LibraryLoader
The downside of this approach is obvious: especially when writing component tests you end up writing a lot of 'require'
calls. In the worst case, PHP's require_once doesn't recognize a path as already included and you end up
with a 'Cannot redeclare class XY' error.
To avoid this, you should implement a LibraryLoader class for your component that handles the require_once calls.
For example, the status.dat component tests have a StatusdatTestLoader class that includes all dependencies of the component:
namespace Tests\Icinga\Protocol\Statusdat;
use Test\Icinga\LibraryLoader;
require_once('library/Icinga/LibraryLoader.php');
/**
* Load all required libraries to use the statusdat
* component in integration tests
*
**/
class StatusdatTestLoader extends LibraryLoader
{
/**
* @see LibraryLoader::requireLibrary
*
**/
public static function requireLibrary()
{
// include Zend requirements
require_once 'Zend/Config.php';
require_once 'Zend/Cache.php';
require_once 'Zend/Log.php';
// retrieve the path to the icinga library
$libPath = self::getLibraryPath();
// require library dependencies
require_once($libPath."/Data/AbstractQuery.php");
require_once($libPath."/Application/Logger.php");
require_once($libPath."/Data/DatasourceInterface.php");
// shorthand for the folder where the statusdat component can be found
$statusdat = realpath($libPath."/Protocol/Statusdat/");
require_once($statusdat."/View/AccessorStrategy.php");
// ... a few more requires ...
require_once($statusdat."/Query/Group.php");
}
}
Now a component test (like tests/php/library/Icinga/Protocol/Statusdat/ReaderTest.php) can avoid the require calls and
just use the requireLibrary method:
use Icinga\Protocol\Statusdat\Reader as Reader;
// Load library at once
require_once("StatusdatTestLoader.php");
StatusdatTestLoader::requireLibrary();
**Note**: This should be used for component tests, where you want to test the combination of your classes. When testing
a single execution unit, like a method, it's often better to explicitly write your dependencies.
If you compare the first approach with the last one you will notice that, even if we produced more code in the end, our
test is more explicit about what it is doing. When someone updates your test, they should easily see which tests exist
and which scenarios are missing.

View File

@ -50,29 +50,14 @@ password is queried when connecting from the local machine:
Icinga has its own base test class which lets you easily require libraries and test database and form functionality. The class resides in
library/Icinga/Test. If you write a test, just subclass BaseTestCase.
### Default test header
Before writing a test you should include the base test first
// @codingStandardsIgnoreStart
require_once realpath(__DIR__ . '/../../../../../library/Icinga/Test/BaseTestCase.php');
// @codingStandardsIgnoreEnd
Now you can simply include dependencies with predefined properties:
require_once BaseTestCase::$libDir . '/Web/Form.php';
require_once BaseTestCase::$appDir . '/forms/Config/AuthenticationForm.php';
BaseTestCase provides static variables for every directory in the project.
### Writing database tests
The base test uses the PHPUnit dataProvider annotation system to create Zend Database Adapters. Typically a
The base test uses the PHPUnit dataProvider annotation system to create database connections. Typically a
database test looks like this:
/**
* @dataProvider mysqlDb
* @param Zend_Db_Adapter_PDO_Abstract $mysqlDb
* @param Icinga\Data\Db\Connection $mysqlDb
*/
public function testSomethingWithMySql($mysqlDb)
{
@ -106,58 +91,3 @@ BaseTestCase holds methods to require form libraries and create form classes base
The second parameter of createForm() can be omitted. You can set initial post request data as
an array if needed.
## Writing tests for controllers
When writing tests for controllers, you can subclass the MonitoringControllerTest class underneath monitoring/test/php/testlib:
class MyTestclass extends MonitoringControllerTest
{
// test stuff
}
This class handles a lot of dependency resolving and controller mocking. In order to test your action correctly and
without side effects, the TestFixture class allows you to define and set up your faked monitoring results in the backend
you want to test:
use Test\Monitoring\Testlib\Datasource\TestFixture;
class MyTestclass extends MonitoringControllerTest
{
public function testSomething()
{
$fixture = new TestFixture();
// adding a new critical, but acknowledged host
$fixture->addHost("hostname", 1, ObjectFlags::ACKNOWLEDGED())
// add a comment to the host (this has to be done before adding services)
->addComment("author", "comment text")
// assign to hostgroup
->addToHostgroup("myHosts")
// and add three services to this host
->addService("svc1", 0) // Service is ok
->addService("svc2", 1, ObjectFlags::PASSIVE) // service is warning and passive
->addService("svc3", 2, null, array("notes_url" => "test.html")) // critical with notes url
->addComment("author", "what a nice service comment") // add a comment to the service
->addToServicegroup("alwaysdown"); // add svc3 to servicegroup
// Create the datasource from this fixture, here in MySQL
$this->setupFixture($fixture, "mysql");
// ... do the actual testing (discussed now)
}
}
After the call to setupFixture() your backend should be ready to be tested. Setting up the controller manually would
force you to go through the whole bootstrap. To avoid this, the MonitoringControllerTest class provides a 'requireController'
method which returns the controller with an already set up backend using your previously defined test data:
$controller = $this->requireController('MyController', 'mysql');
// controller is now the Zend controller instance, perform an action
$controller->myAction();
$result = $controller->view->hosts->fetchAll();
This example assumes that the controller populates the 'hosts' variable in the view, so now you can assert the state of
the result according to your test plan.

View File

@ -1,98 +0,0 @@
# Testing controllers with different backends
Icingaweb's monitoring controllers support a variety of different backends (IDO, Statusdat, Livestatus) and are
therefore hard to test against every backend. To make life a little bit easier, test fixtures allow you to set up
a backend with specific monitoring data and test it afterwards by running the controller's action.
## Example
It's best to subclass MonitoringControllerTest (found in modules/monitoring/test/testlib), as this handles dependency resolution
and setup for you:
// assume our test is underneath the test/application/controllers folder in the monitoring module
require_once(dirname(__FILE__).'/../../testlib/MonitoringControllerTest.php');
use Test\Monitoring\Testlib\MonitoringControllerTest;
use Test\Monitoring\Testlib\Datasource\TestFixture;
use Test\Monitoring\Testlib\Datasource\ObjectFlags;
class MyControllerTest extends MonitoringControllerTest
{
public function testSomething()
{
// Create a test fixture
$fixture = new TestFixture();
$fixture->addHost('host', 0) // Add a host with state OK
->addToHostgroup('myHosts') // Add host to hostgroup
->addService('svc1', 1) // Add a warning service 'svc1' underneath host
->addToServiceGroup('svc_warning') // add this service to a servicegroup svc_warning
->addService('svc2', 2, ObjectFlags::ACKNOWLEDGED()) // Add a critical, but acknowledged service to this host
->addService(
'svc3',
1,
new ObjectFlags(),
array("customvariables" =>
array("customer" => "myCustomer")
)
); // add a warning service with a customvariable
$this->setupFixture($fixture, "mysql"); // setup the fixture for MySQL, so the backend is populated with the data set above
// backends can be mysql, pgsql, statusdat (and in the future livestatus)
$controller = $this->requireController('MyController', 'mysql'); // request a controller with the given backend injected
$controller->myAction(); // controller is now the Zend controller instance, perform an action
$result = $controller->view->hosts->fetchAll(); // get the result of the query
// and assert stuff
$this->assertEquals(1, count($result), "Asserting one host being retrieved in the controller");
}
}
## The Test-fixture API
In order to populate your backend with specific monitoring objects, you have to create a TestFixture class. This class
allows you to setup the monitoring objects in a backend independent way using a few methods:
### TestFixture::addHost($name, $state, [ObjectFlags $flags], [array $properties])
The addHost method adds a host with the name $name and the status $state (0-2) to your test fixture. When no ObjectFlags
object is provided, the default flags are used (not flapping, notifications enabled, active and passive checks enabled, not
acknowledged, not in downtime and not pending). The $properties array can contain additional settings like 'address',
a 'customvariables' array, 'notes_url', 'action_url' or 'icon_image'.
Subsequent addToHostgroup and addService calls will affect this host (so services will be added to this host)
### TestFixture::addService($name, $state, [ObjectFlags $flags], [array $properties])
The addService method adds a service with the name $name and the status $state (0-3) to your test fixture. When no ObjectFlags
object is provided, the default flags are used (not flapping, notifications enabled, active and passive checks enabled, not
acknowledged, not in downtime and not pending). The $properties array can contain additional settings like a
'customvariables' array, 'notes_url', 'action_url' or 'icon_image'.
Subsequent addToServicegroup calls will affect this service.
### ObjectFlags
The ObjectFlags object encapsulates the following monitoring states with the following default values:
public $flapping = 0; // Whether this host is flapping
public $notifications = 1; // Whether notifications are enabled
public $active_checks = 1; // Whether active checks are enabled
public $passive_checks = 1; // Whether passive checks are enabled
public $acknowledged = 0; // Whether this object has been acknowledged
public $in_downtime = 0; // Whether this object is in a scheduled downtime
public $is_pending = 0; // Whether this object is currently pending
public $time = time(); // The last check and statechange time
ObjectFlags can be created either with new ObjectFlags([$ageInSeconds]), modifying the attributes directly, or by
calling one of the following factory methods:
ObjectFlags::FLAPPING()
ObjectFlags::PENDING()
ObjectFlags::DISABLE_NOTIFICATIONS()
ObjectFlags::PASSIVE_ONLY()
ObjectFlags::ACTIVE_ONLY()
ObjectFlags::DISABLED()
ObjectFlags::ACKNOWLEDGED()
ObjectFlags::IN_DOWNTIME()