Merge branch 'feature/testdoc-4263'

Eric Lippmann 2013-08-02 13:14:43 +02:00
commit e24d3ca031
6 changed files with 892 additions and 0 deletions

# Frontend component tests
Frontend tests test your code from the user's perspective: they open a specific URL, execute a few clicks, keystrokes, etc.,
and expect something to happen. We use [CasperJS](http://casperjs.org/) for frontend testing, which is essentially a
headless WebKit browser.
## The current state of frontend testing
Currently, frontend tests are not very advanced: we spawn a small (non-PHP) server on port 12999 to test static files
and JavaScript behaviour. This will change in the future, when we are going to test a full installation (or use PHP 5.4's
standalone server).
## Writing tests
### Test bootstrap
In order to make testing more comfortable, the i2w-config module provides a few helpers.
In general you start your test by including i2w-config:

    var i2w = require('./i2w-config');

and afterwards create a test environment with the getTestEnv() method:

    var casper = i2w.getTestEnv();

You can then start the test by calling casper.start with the start page (the server's root is always frontend/static, where
public is a symlink to the icingaweb public folder):

    casper.start("http://localhost:12999/generic.html");
As we use RequireJS, it has to be set up for our test cases. i2w provides a setupRequireJs function that does everything for you.
You just have to run this method on your test page (note that the tested JavaScript is isolated from your test case's JavaScript;
if you want to execute JavaScript in the page, you must use the casper.page.evaluate method):

    casper.then(function() {
        // Setup requirejs
        casper.page.evaluate(i2w.setupRequireJs, {icinga: true});
    });
### Testing
Afterwards, everything is like a normal CasperJS test, so you can wrap your assertions in a casper.then method:

    // assert a specific title
    casper.then(function() {
        this.test.assertTitle("Just an empty page");
    });
Note that asynchronous calls require you to wait until the resource is loaded, or to provide a callback:

    // waitForSelector calls callbackFn as soon as the selector returns a non-empty set
    casper.waitForSelector("div#icinga-main a", callbackFn);
At the end of your test, you have to call casper.run, otherwise the tests won't be executed:

    casper.run(function() {
        this.test.done();
    });

doc/test/js_tests.md
# Writing JavaScript tests
JavaScript tests are executed using [mocha](http://visionmedia.github.io/mocha/) as the test framework and
[should.js](https://github.com/visionmedia/should.js/) as the assertion framework.
## Mocking require.js
As we use require.js for asynchronous dependency resolution in JavaScript, it can cause problems in our node.js
environment. To keep requirejs calls from causing issues, it has been mocked in testlib/requiremock.js,
which should be required in every test case:

    var rjsmock = require("requiremock.js");

rjsmock makes dependency management comfortable; these are its most important methods:
    // remove all registered dependencies from the rjsmock cache
    rjsmock.purgeDependencies();

    // register the following objects underneath the following requirejs paths:
    rjsmock.registerDependencies({
        'icinga/container' : {
            updateContainer : function() {},
            createPopupContainer: function() {}
        }
    });
    // in your js code a require(['icinga/container'], function(container) {}) would now have the above mock
    // object in the container variable

    requireNew("icinga/util/async.js"); // requires the icinga/util/async.js file - and ignores the requirejs cache
    var async = rjsmock.getDefine();    // returns the last define; this way you can retrieve a specific JavaScript file
## Faking async responses
As we currently use the icinga/util/async.js class for all asynchronous requests, it's easy to fake responses. The asyncmock.js
module provides functions for this. To use it in your test, you first have to require it:

    var asyncMock = require("asyncmock.js");

You can now use asyncMock.setNextAsyncResult((async) asyncManager, (string) resultString, (bool) fails, (object) headers) to
let the next request of the passed asyncManager object return resultString as the response, with the headers provided as the
last parameter. If fails = true, the error callback of the request will be called.
## Example test
The following example shows a complete test, which checks whether the registerHeaderListener method of the async class works:

    var should = require("should");           // require should.js for assertions
    var rjsmock = require("requiremock.js");  // use the requiremock described above
    var asyncMock = require("asyncmock.js");  // helper class to fake async requests

    GLOBAL.document = $('body');              // needed when our test accesses window.document

    describe('The async module', function() { // start the test scenario
        it("Allows to react on specific headers", function(done) { // start a test case - when done is called it is finished
            rjsmock.purgeDependencies();      // remove any dependency previously declared
            rjsmock.registerDependencies({    // mock icinga/container, as this is a dependency of the following include
                'icinga/container' : {
                    updateContainer : function() {},
                    createPopupContainer: function() {}
                }
            });
            requireNew("icinga/util/async.js"); // this is the file we want to test; load it and all of its dependencies
            var async = rjsmock.getDefine();    // retrieve a reference to the loaded file

            // use asyncMock.setNextAsyncResult to let the next async request return 'result' without failing and set
            // the response headers 'X-Dont-Care' and 'X-Test-Header'
            asyncMock.setNextAsyncResult(async, "result", false, {
                'X-Dont-Care' : 'Ignore-me',
                'X-Test-Header' : 'Testme123'
            });
            // register a listener for results with the X-Test-Header response header
            async.registerHeaderListener("X-Test-Header", function(value, header) {
                // test for the correct header value
                should.equal("Testme123", value);
                // call done to mark this test as succeeded
                done();
            }, this);
            // run the faked request
            var test = async.createRequest();
        });
    });

doc/test/php_tests.md
# Writing PHPUnit tests
## Test path and filename
The path of your PHPUnit tests should reflect the path in the source tree, with test/php/ prepended. So
if you're testing the file library/Icinga/My/File.php, the test file should be at test/php/library/Icinga/My/File.php. This
also applies to modules, where the test folder is underneath modules/myModule/test/php.
## Example test skeleton
Let's assume you're testing a class MyClass underneath the MyModule module and the file can be found at
modules/mymodule/library/MyModule/Helper/MyClass.php:

    <?php
    // The namespace is the same namespace as the file to test has, but with 'Test' prepended
    namespace Test\Modules\MyModule\Helper;

    // Require the file and maybe others. The starting point is always the application's
    // test/php/ folder, where the runtests executable can be found
    require_once '../../mymodule/library/MyModule/Helper/MyClass.php';

    class MyClassTest extends \PHPUnit_Framework_TestCase
    {
        public function testSomething()
        {
            $this->assertTrue(true, "Asserting that the world didn't end yet");
        }
    }
## Requirements and the dependency mess
### spl_autoload_register vs. require
When looking at our test classes, you'll notice that we don't use PHP's autoloader to load dependencies automatically, but
write 'require_once' calls ourselves. This has the following reasons:
- When writing tests, you want to be aware of every dependency your test class includes. With autoloading, it's not directly
  obvious which classes are included at runtime.
- When mocking classes, you don't need to tell your autoloader to use the mock instead of the class used in production (see
  the sketch below).
- Tests can't be run in isolation without a bootstrap class initializing the autoloader.
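As a minimal sketch of the mocking point above (all paths and class names here are hypothetical): because the test spells
out every require_once itself, it can load a hand-written mock under the production class name before the class under test,
and no autoloader has to be reconfigured for the swap.

    <?php
    namespace Test\Icinga\MyLibrary;

    // Load a hand-written mock that declares Icinga\MyLibrary\Backend with the
    // same interface as the production class (hypothetical path) ...
    require_once 'mocks/BackendMock.php';
    // ... and only then the class under test, which uses Backend internally
    require_once '../../library/Icinga/MyLibrary/ReportBuilder.php';

    class ReportBuilderTest extends \PHPUnit_Framework_TestCase
    {
        public function testEmptyBackendYieldsEmptyReport()
        {
            $builder = new \Icinga\MyLibrary\ReportBuilder();
            $this->assertEquals(
                0,
                count($builder->buildReport()),
                "Asserting that an empty (mocked) backend yields an empty report"
            );
        }
    }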
### How to avoid require_once massacres: LibraryLoader
The downside of this approach is obvious: especially when writing component tests, you end up writing a lot of require_once
calls. In the worst case, PHP's require_once doesn't recognize that a path has already been included and you end up
with a 'Cannot redeclare class XY' error.
To avoid this, you should implement a LibraryLoader class for your component that handles the require_once calls.
For example, the status.dat component tests have a StatusdatTestLoader class that includes all dependencies of the component:
    namespace Tests\Icinga\Protocol\Statusdat;

    use Test\Icinga\LibraryLoader;

    require_once('library/Icinga/LibraryLoader.php');

    /**
     * Load all required libraries to use the statusdat
     * component in integration tests
     *
     **/
    class StatusdatTestLoader extends LibraryLoader
    {
        /**
         * @see LibraryLoader::requireLibrary
         *
         **/
        public static function requireLibrary()
        {
            // include Zend requirements
            require_once 'Zend/Config.php';
            require_once 'Zend/Cache.php';
            require_once 'Zend/Log.php';

            // retrieve the path to the icinga library
            $libPath = self::getLibraryPath();

            // require library dependencies
            require_once($libPath."/Data/AbstractQuery.php");
            require_once($libPath."/Application/Logger.php");
            require_once($libPath."/Data/DatasourceInterface.php");

            // shorthand for the folder where the statusdat component can be found
            $statusdat = realpath($libPath."/Protocol/Statusdat/");
            require_once($statusdat."/View/AccessorStrategy.php");
            // ... a few more requires ...
            require_once($statusdat."/Query/Group.php");
        }
    }
Now a component test (like tests/php/library/Icinga/Protocol/Statusdat/ReaderTest.php) can avoid the individual require
calls and just use the requireLibrary method:

    use Icinga\Protocol\Statusdat\Reader as Reader;

    // Load the whole library at once
    require_once("StatusdatTestLoader.php");
    StatusdatTestLoader::requireLibrary();
**Note**: This approach should be used for component tests, where you want to test the interplay of your classes. When testing
a single execution unit, like a method, it's often better to write your require calls explicitly.
If you compare the first approach with the last one, you will notice that even though we produced more code in the end, our
test is more explicit about what it is doing. When someone updates your test, they can easily see which tests exist
and which scenarios are missing.

doc/test/running_tests.md
# Test organization
## Test folders
Tests for the application can be found underneath the test folder:

    test/
      php/       PHPUnit tests for backend code
      js/        mocha tests for the JavaScript frontend code
      frontend/  Integration tests for the frontend, using CasperJS

The same structure applies to modules, which also contain a top-level test folder and suitable subfolders. When you fix
a bug and write a regression test for it, put the test in the 'regression' folder and name it %DESCRIPTION%%TicketNumber%
(e.g. myBug1234.js).
## Running tests
The tests can be run in the specific folder using the runtests script.
Running the PHP tests, for example:

    cd test/php
    ./runtests

In this case, all application and all module tests will be executed. The test runners also support additional flags, which
affect the way the tests are executed:
    Options:
      -h, --help            show this help message and exit
      -b, --build           Enable reporting.
      -v, --verbose         Be more verbose.
      -i PATTERN, --include=PATTERN
                            Include only specific files/test cases.
      -V, --vagrant         Run in vagrant VM
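For example, a verbose run restricted to test cases matching a pattern might look like this (the pattern here is only an
illustration):

    cd test/php
    ./runtests -v -i Statusdat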
Some test runners also support an --exclude flag; it's best to use the --help option to see which flags are supported.
## Setting up databases
Although most of the tests should work out of the box, a few specific cases require some setup.
At the moment, only database tests require additional setup: they expect an icinga_unittest user and an icinga_unittest
database to exist, with the user having rights on that database.
### The database test procedure
When testing PostgreSQL and MySQL databases, the test library (normally) executes the following procedure for every
test case:
- Log in to the RDBMS as the user icinga_unittest with password icinga_unittest
- Use the icinga_unittest database (which must exist)
- **Drop all tables** in the icinga_unittest database
- Create a new, clean database schema
If anything goes wrong during this procedure, the test will be skipped (because maybe you don't have a PostgreSQL database
but want to test MySQL, for example).
### Setting up a test user and database in MySQL
In MySQL, it's best to create a user icinga_unittest@localhost, a database icinga_unittest and grant all privileges on
this database:

    mysql -u root -p
    mysql> CREATE USER `icinga_unittest`@`localhost` IDENTIFIED BY 'icinga_unittest';
    mysql> CREATE DATABASE `icinga_unittest`;
    mysql> GRANT ALL PRIVILEGES ON `icinga_unittest`.* TO `icinga_unittest`@`localhost`;
    mysql> FLUSH PRIVILEGES;
    mysql> quit
### Setting up a test user and database in PostgreSQL
In PostgreSQL, you have to modify pg_hba.conf if you don't have password authentication set up (which is often
the case). In this setup the icinga_unittest user is set to trust authentication on localhost, which means that no
password is queried when connecting from the local machine:

    sudo su postgres
    psql
    postgres=# CREATE USER icinga_unittest WITH PASSWORD 'icinga_unittest';
    postgres=# CREATE DATABASE icinga_unittest;
    postgres=# \q
    bash$ createlang plpgsql icinga;

Add the following lines to your pg_hba.conf (/etc/postgresql/X.x/main/pg_hba.conf under Debian,
/var/lib/pgsql/data/pg_hba.conf for Red Hat/Fedora) to enable trust authentication for the icinga_unittest user when
connecting from localhost:

    local   icinga_unittest   icinga_unittest                  trust
    host    icinga_unittest   icinga_unittest   127.0.0.1/32   trust
    host    icinga_unittest   icinga_unittest   ::1/128        trust

doc/test/styleguide.md
# General testing guidelines
## The short summary
This list summarizes what is described in the next few chapters:
- You really should write tests for your code
- Think about what you want to test and how you can assert the behaviour correctly.
- Isolate your tests and start without any assumptions about your test environment (like a specific user existing). The
  only exception is assuming a working login if you need a database for testing (but not the tables/content of the database!)
- Don't just test correct behaviour - test border cases and invalid input or assumptions to make sure the application handles
  cases that might not occur in a normal test environment
- Use description strings in your assertions; this makes it easy to detect what's going wrong when they fail, and it's easier
  to follow what *exactly* you are asserting
- A test method should cover one scenario, and the method name should describe this scenario.
  *testLogin*, for example, is a bad name (do you test whether the login fails? Or whether it succeeds?);
  *testLoginWithCorrectCredentials* or *testLoginWithWrongPassword* are far better names
- Your assertions should reflect one test scenario, i.e. don't write one test method that tests if something works **and**
  if it correctly detects errors after it works. Write one test to determine the behaviour with correct input and one that
  tests the behaviour with invalid input.
- When writing unit tests (function-level tests), try to keep your dependencies as low as possible (the best indicator for
  this is the number of require calls in your test). Mock external components and inject them into the class you want to
  test. If your test subject is not able to use mocked dependencies, that's often a design flaw and should be considered
  a bug (and be fixed)
- When writing component tests with a lot of dependencies, wrap the require calls in a LibraryLoader subclass
## What should be tested
### Writing meaningful tests
Writing tests doesn't ensure that your code is free of errors, but it can test whether, in a specific scenario, the behaviour
of your code is as expected. This means that you have to think about your test scenario before writing a test. This involves
three steps:
- Determine what you want to test: which errors can occur, what is the normal input and what are the border cases?
- Define a test scenario: what datasets make sense for the test?
- Write your assertions so they are really meaningful about the correctness of the test case
The border cases are especially important; they often lie inside try/catch blocks or error-detection routines. When
looking at your code coverage, these blocks should be covered.
### Example
Let's say you have the following function (the examples in this section should be considered PHP-like pseudocode):

    function isValidName($name)
    {
        if (!hasCorrectFormat($name)) {
            return false;
        }
        if (nameExistsInDatabase($name)) {
            return false;
        }
        return true;
    }
#### Determine what to test
At first glance there are 3 scenarios:
1. The username is unique and valid
2. The username has the wrong format
3. The username has the correct format, but exists in the database
But - what happens if the database is down? Should an exception be thrown, or should the function return false? This case
has to be added to your tests, so you have (at least!) the following scenarios:
1. The username is unique and valid
2. The username has the wrong format
3. The username has the correct format, but exists in the database
4. The username has the correct format, but access to the database fails.
#### Determine meaningful test scenarios
When it comes to creating your test scenarios, we start with the easiest one: testing the wrongly formatted username. This
should be pretty straightforward, as we never reach the code where we need the database:

    function testWrongFormat()
    {
        assertFalse(isValidName("$$% 1_"), "Assert a name with special characters to be considered an invalid name");
    }
The first and third scenario are more sophisticated (as always when a database comes into play). There are two ways
to test this:
- Either you create an empty table for each test, fill it with the users and run the test,
- or you mock the database call with a class that behaves like the user query and returns true or false accordingly.
You **shouldn't** create a static database for all tests and assume it to exist - such a database is decoupled from the
actual test, soon gets outdated, and makes it difficult to reflect all scenarios. It's also not considered good practice to
create a precondition your tests have to rely on. In this case the second approach makes sense, as the mock class can be
rather simple:
    function testCorrectUserName()
    {
        // set no users in the database mock
        $db = new UserDatabaseMock(array());
        setupUserDatabaseMock($db);
        assertTrue(isValidName("hans"), "Assert a correctly formatted and unique name to be considered valid");
    }

    function testNonUniqueUserName()
    {
        // put an existing user, pete, into the database mock
        $db = new UserDatabaseMock(array("pete"));
        setupUserDatabaseMock($db);
        assertFalse(isValidName("pete"), "Assert a correctly formatted, but existing name to be considered invalid");
    }
The exception can be tested by providing invalid db credentials when using a real database, or by extending the mock (which
we will do here):

    function testDatabaseError()
    {
        // create an empty database mock and make it throw on access
        $db = new UserDatabaseMock(array());
        $db->setThrowsException(true);
        setupUserDatabaseMock($db);
        assertFalse(isValidName("hans"), "Assert a correct, unique user to be considered invalid when the database doesn't work");
    }
This, of course, depends on how you want the behaviour to be when the db is down: do you want to log the error and proceed
as usual, or do you want the error to bubble up via an exception?
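If you decide to let the error bubble up instead, the same scenario can be asserted in the pseudocode style used above
(DatabaseException is a hypothetical exception type; use whatever your database layer actually throws):

    function testDatabaseErrorBubblesUp()
    {
        // create an empty database mock and make it throw on access
        $db = new UserDatabaseMock(array());
        $db->setThrowsException(true);
        setupUserDatabaseMock($db);
        try {
            isValidName("hans");
            fail("Asserted that a database error bubbles up as an exception");
        } catch (DatabaseException $e) {
            // expected: the database error reaches the caller
        }
    }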
#### Writing sensible assertions
It's crucial to write sensible assertions in your test classes: you can write a perfect test that covers the right scenario,
but fail to catch errors because you aren't asking the right questions.
- Write assertions that cover your scenario - if you test correct behaviour, don't test what happens if something goes wrong
  (that is a separate scenario)
- While you should try to avoid redundant assertions, it's better to assert too much than to miss an assertion
- A failed assertion should mean that the implementation is incorrect; other kinds of assertions are to be avoided (like
  testing whether a precondition applies)
- When testing one function, you have to be naive and assume that everything else is bug-free; testing whether other parts
  worked correctly beforehand belongs in a different test.
## How to test
Unit tests should only test an isolated, atomic part of your code - in theory just one single function - and shouldn't
need much dependency handling. An example of a unit test would be testing the following (hypothetical) class method:

    class UserManager
    {
        /**
         * Returns true when a user with this name exists and the
         * password for this user is correct
         **/
        public function isCorrectPassword($name, $password)
        {
            // You don't need to know the implementation.
        }
    }
### The wrong way
A unit test for this method could, but should not, look like this (we'll explain why):

    require_once "../../library/Icinga/MyLibrary/UserManager.php";
    // needed by UserManager
    require_once "../../library/Icinga/Authentication/Manager.php";
    require_once "../../library/Icinga/Authentication/User.php";
    // .. imagine a few more require_once calls

    use Icinga\MyLibrary\UserManager;

    class UserManagerTest extends \PHPUnit_Framework_TestCase
    {
        /**
         * Test whether a user is correctly recognized by the UserManager
         *
         **/
        public function testUserManager()
        {
            // Connect to the test database that contains jdoe and jsmith
            $mgrConfig = new \Zend_Config(array(
                "backend" => "db",
                "user"    => "dbuser..",
                "pass"    => "..."
                // all the other db credentials
            ));
            $mgr = new UserManager($mgrConfig);

            $this->assertTrue($mgr->isCorrectPassword("jdoe", "validpassword"));
            $this->assertTrue($mgr->isCorrectPassword("jsmith", "validpassword"));
            $this->assertFalse($mgr->isCorrectPassword("jdoe", "nonvalidpassword"));
            $this->assertFalse($mgr->isCorrectPassword("jsmith", "nonvalidpassword"));
            $this->assertFalse($mgr->isCorrectPassword("hans", "validpassword"));
        }
    }
This test obviously has a few issues:
- First, it requires a precondition to apply: a database must exist with the users jdoe and jsmith, and the credentials
  must match the ones provided in the test
- There are a lot of dependencies in this code; almost the complete authentication code must exist. Our test
  will fail if this code changes or contains bugs, even if the isCorrectPassword method is correct.
### Reducing dependencies
To avoid these issues, you need to code your classes with testing in mind. Maybe now you're screaming *"Tests shouldn't
affect my code, just test if it is correct!"*, but it's not that easy. Testability should be considered a quality aspect
of your code, just like commenting, keeping functions coherent, etc. Non-testable code should be considered a bug, or
at least a design flaw.
Two big buzzwords in development right now are Inversion of Control and Dependency Injection. You can google the details,
but in our case it basically means: instead of your class (in this case UserManager) setting up its dependencies (creating
an Authentication Manager and then fetching users from it), the dependencies are given to the class. This has the advantage
that you can mock the dependencies in your test classes, which heavily reduces test complexity. On the downside this can
lead to more complicated APIs, as you have to know the dependencies of your object when creating it. Therefore we often
allow a dependency to be provided from the outside (when testing), but create the dependencies ourselves when nothing is
provided (normal use).
In our case we could say that we allow our UserManager to use a provided set of users instead of fetching them from the
AuthManager:
    class UserManager
    {
        public function __construct($config, $authMgr = null)
        {
            if ($authMgr === null) {
                // no AuthManager provided, resolve the dependency ourselves
                $this->authMgr = new AuthManager($config);
            } else {
                $this->authMgr = $authMgr;
            }
        }
    }
It would of course be best to create an interface like UserSource which the AuthManager implements, but in this example
we trust our programmer to provide a suitable object. We can now eliminate all the AuthManager dependencies by mocking the
AuthManager (let's dumb it down to just providing an array of users):
require_once "../../library/Icinga/MyLibrary/UserManager.php";
use Icinga/MyLibrary/UserManager
class AuthManagerMock
{
public $users;
/**
* Create a new mock classw with the provided users and their credentials
*
* @param array $userPasswordCombinations The users and password combinations to use
**/
public function __construct(array $userPasswordCombinations)
{
$this->users = $userPasswordCombinations;
}
public function getUsers()
{
return $this->users;
}
}
class UserManagerTest extends \PHPUnit_Framework_TestCase
{
/**
* Test whether an user is correctly recognized by the UserManager
*
**/
public function testUserManager()
{
$authMock = new AuthManagerMock(array(
"jdoe" => "validpassword",
"jsmith" => "validpassword"
));
$mgrConfg = new \Zend_Config(array(), $authMock);
$mgr = new UserManager($mgrConfig);
$this->assertTrue($mgr->isCorrectPassword("jdoe", "validpassword"));
$this->assertTrue($mgr->isCorrectPassword("jsmith", "validpassword"));
$this->assertFalse($mgr->isCorrectPassword("jdoe", "nonvalidpassword"));
$this->assertFalse($mgr->isCorrectPassword("jsmith", "nonvalidpassword"));
$this->assertFalse($mgr->isCorrectPassword("hans", "validpassword"));
}
Ok, we might have more code here than before, but our test is now less prone to fail:
- Our test doesn't assume any preconditions to apply, like having a db server with the correct users
- The require call to the AuthManager is gone, so if there's a bug in the AuthManager implementation our test is not affected
### Splitting up assertions
The test is now not that bad, but it still has a few issues:
- If an assertion fails, we don't know which one, as the message will be rather generic ("Failed asserting that false is true")
- In this case it might be obvious what we test, but someone who only sees the class and the assertions doesn't know what
  assumptions are made
To fix those issues, we have to split up our big test method into several smaller ones and give the test methods **talking
names**. Also, the assertions should get an error message that will be printed on failure:
    /**
     * Testcases for the UserManager class
     *
     **/
    class UserManagerTest extends \PHPUnit_Framework_TestCase
    {
        /**
         * Creates a new UserManager with a mocked AuthManager
         *
         * @return UserManager
         **/
        public function getUserManager()
        {
            $authMock = new AuthManagerMock(array(
                "jdoe"   => "validpassword",
                "jsmith" => "validpassword"
            ));
            $mgrConfig = new \Zend_Config(array());
            return new UserManager($mgrConfig, $authMock);
        }

        /**
         * Tests whether correct user/password combinations are considered valid
         *
         **/
        public function testCorrectUserPasswordCombinations()
        {
            $mgr = $this->getUserManager();
            $this->assertTrue(
                $mgr->isCorrectPassword("jdoe", "validpassword"),
                "Asserted that a correct user/password combination is considered valid for jdoe"
            );
            $this->assertTrue(
                $mgr->isCorrectPassword("jsmith", "validpassword"),
                "Asserted that a correct user/password combination is considered valid for jsmith"
            );
        }

        /**
         * Tests whether invalid names are rejected
         *
         **/
        public function testInvalidUsernameRecognition()
        {
            $mgr = $this->getUserManager();
            $this->assertFalse(
                $mgr->isCorrectPassword("hans", "validpassword"),
                "Asserted a non-existing user to be considered invalid"
            );
        }

        /**
         * Tests whether invalid passwords for existing users are rejected
         *
         **/
        public function testInvalidPasswordRecognition()
        {
            $mgr = $this->getUserManager();
            $this->assertFalse(
                $mgr->isCorrectPassword("jsmith", "nonvalidpassword"),
                "Asserted that an invalid password for an existing user is considered invalid"
            );
        }
    }
Now if something fails, we see what has been tested via the test method name, and the assertion error message tells us what
caused the failure. You can also keep the comments, so everybody knows what you are doing.
## Testing PHP
## Requirements and the dependency mess
### spl_autoload_register vs. require
When looking at our test classes, you'll notice that we don't use PHP's autoloader to load dependencies automatically, but
write 'require_once' calls ourselves. This has the following reasons:
- When writing tests, you want to be aware of every dependency your test class includes. With autoloading, it's not directly
  obvious which classes are included at runtime.
- When mocking classes, you don't need to tell your autoloader to use the mock instead of the class used in production
- Tests can't be run in isolation without a bootstrap class initializing the autoloader.
### How to avoid require_once massacres: LibraryLoader
The downside of this approach is obvious: especially when writing component tests, you end up writing a lot of require_once
calls. In the worst case, PHP's require_once doesn't recognize that a path has already been included and you end up
with a 'Cannot redeclare class XY' error.
To avoid this, you should implement a LibraryLoader class for your component that handles the require_once calls.
For example, the status.dat component tests have a StatusdatTestLoader class that includes all dependencies of the component:
    namespace Tests\Icinga\Protocol\Statusdat;

    use Test\Icinga\LibraryLoader;

    require_once('library/Icinga/LibraryLoader.php');

    /**
     * Load all required libraries to use the statusdat
     * component in integration tests
     *
     **/
    class StatusdatTestLoader extends LibraryLoader
    {
        /**
         * @see LibraryLoader::requireLibrary
         *
         **/
        public static function requireLibrary()
        {
            // include Zend requirements
            require_once 'Zend/Config.php';
            require_once 'Zend/Cache.php';
            require_once 'Zend/Log.php';

            // retrieve the path to the icinga library
            $libPath = self::getLibraryPath();

            // require library dependencies
            require_once($libPath."/Data/AbstractQuery.php");
            require_once($libPath."/Application/Logger.php");
            require_once($libPath."/Data/DatasourceInterface.php");

            // shorthand for the folder where the statusdat component can be found
            $statusdat = realpath($libPath."/Protocol/Statusdat/");
            require_once($statusdat."/View/AccessorStrategy.php");
            // ... a few more requires ...
            require_once($statusdat."/Query/Group.php");
        }
    }
Now a component test (like tests/php/library/Icinga/Protocol/Statusdat/ReaderTest.php) can avoid the individual require
calls and just use the requireLibrary method:

    use Icinga\Protocol\Statusdat\Reader as Reader;

    // Load the whole library at once
    require_once("StatusdatTestLoader.php");
    StatusdatTestLoader::requireLibrary();
**Note**: This approach should be used for component tests, where you want to test the interplay of your classes. When testing
a single execution unit, like a method, it's often better to write your require calls explicitly.
If you compare the first approach with the last one, you will notice that even though we produced more code in the end, our
test is more explicit about what it is doing. When someone updates your test, they can easily see which tests exist
and which scenarios are missing.

# Testing controllers with different backends
Icingaweb's monitoring controllers support a variety of backends (IDO, Statusdat, Livestatus), which makes it
hard to test every backend. In order to make life a little easier, test fixtures allow you to set up
a backend with specific monitoring data and then test it by running the controller's actions.
## Example
It's best to subclass MonitoringControllerTest (found in modules/monitoring/test/testlib), as this handles dependency
resolution and setup for you:

    // assume our test is underneath the test/application/controllers folder in the monitoring module
    require_once(dirname(__FILE__).'/../../testlib/MonitoringControllerTest.php');

    use Test\Monitoring\Testlib\MonitoringControllerTest;
    use Test\Monitoring\Testlib\Datasource\TestFixture;
    use Test\Monitoring\Testlib\Datasource\ObjectFlags;

    class MyControllerTest extends MonitoringControllerTest
    {
        public function testSomething()
        {
            // Create a test fixture
            $fixture = new TestFixture();
            $fixture->addHost('host', 0)                             // add a host with state OK
                ->addToHostgroup('myHosts')                          // add the host to the hostgroup myHosts
                ->addService('svc1', 1)                              // add a warning service 'svc1' underneath the host
                ->addToServiceGroup('svc_warning')                   // add this service to the servicegroup svc_warning
                ->addService('svc2', 2, ObjectFlags::ACKNOWLEDGED()) // add a critical, but acknowledged service to this host
                ->addService(
                    'svc3',
                    1,
                    new ObjectFlags(),
                    array("customvariables" =>
                        array("customer" => "myCustomer")
                    )
                );                                                   // add a warning service with a custom variable

            // Set up the fixture for MySQL, so the backend is populated with the data set above;
            // backends can be mysql, pgsql, statusdat (and in the future livestatus)
            $this->setupFixture($fixture, "mysql");

            // Request a controller with the given backend injected
            $controller = $this->requireController('MyController', 'mysql');
            // The controller is now the Zend controller instance; perform an action
            $controller->myAction();
            $result = $controller->view->hosts->fetchAll();          // get the result of the query

            // ... and assert stuff
            $this->assertEquals(1, count($result), "Asserting one host being retrieved in the controller");
        }
    }
## The Test-fixture API
In order to populate your backend with specific monitoring objects, you have to create a TestFixture instance. It
allows you to set up the monitoring objects in a backend-independent way using a few methods:
### TestFixture::addHost($name, $state, [ObjectFlags $flags], [array $properties])
The addHost method adds a host with the name $name and the status $state (0-2) to your test fixture. When no ObjectFlags
object is provided, the default flags are used (not flapping, notifications enabled, active and passive checks enabled, not
acknowledged, not in downtime and not pending). The $properties array can contain additional settings like 'address',
a 'customvariables' array, 'notes_url', 'action_url' or 'icon_image'.
Subsequent addToHostgroup and addService calls will affect this host (so the service will be added to this host).
### TestFixture::addService($name, $state, [ObjectFlags $flags], [array $properties])
The addService method adds a service with the name $name and the status $state (0-3) to your test fixture. When no
ObjectFlags object is provided, the default flags are used (not flapping, notifications enabled, active and passive checks
enabled, not acknowledged, not in downtime and not pending). The $properties array can contain additional settings like a
'customvariables' array, 'notes_url', 'action_url' or 'icon_image'.
Subsequent addToServicegroup calls will affect this service.
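Putting both methods together, a short sketch (the host and service names and all property values here are purely
illustrative):

    $fixture = new TestFixture();
    $fixture->addHost('web01', 2, ObjectFlags::FLAPPING(), array(
            'address'         => '192.168.1.10',
            'notes_url'       => 'http://wiki.example.com/web01',
            'customvariables' => array('contact' => 'ops-team')
        ))
        ->addService('http', 1, new ObjectFlags(), array(
            'customvariables' => array('port' => '8080')
        )); // the service is attached to web01, the most recently added host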
### ObjectFlags
The ObjectFlags object encapsulates the following monitoring states with the following default values:

    public $flapping = 0;       // Whether this host is flapping
    public $notifications = 1;  // Whether notifications are enabled
    public $active_checks = 1;  // Whether active checks are enabled
    public $passive_checks = 1; // Whether passive checks are enabled
    public $acknowledged = 0;   // Whether this object has been acknowledged
    public $in_downtime = 0;    // Whether this object is in a scheduled downtime
    public $is_pending = 0;     // Whether this object is currently pending
    public $time = time();      // The last check and state change time
An ObjectFlags instance can either be created with new ObjectFlags([$ageInSeconds]), after which you can modify the
attributes directly, or by calling one of the following factory methods:

    ObjectFlags::FLAPPING()
    ObjectFlags::PENDING()
    ObjectFlags::DISABLE_NOTIFICATIONS()
    ObjectFlags::PASSIVE_ONLY()
    ObjectFlags::ACTIVE_ONLY()
    ObjectFlags::DISABLED()
    ObjectFlags::ACKNOWLEDGED()
    ObjectFlags::IN_DOWNTIME()
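Both creation styles side by side, as a short hypothetical sketch:

    // Factory style: an acknowledged object with default timestamps
    $flags = ObjectFlags::ACKNOWLEDGED();

    // Constructor style: a state from ten minutes ago, tweaked directly
    // (the attributes are public, as listed above)
    $flags = new ObjectFlags(600);
    $flags->flapping = 1;

    $fixture->addHost('db01', 1, $flags);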