Ruling Database Testing with DBUnit Rules

In this post I am going to talk about DBUnit Rules (since renamed to Database Rider), a small open source project I maintain which aims to simplify database testing[1].

For a better reading experience, access the asciidoc-based version of this post in HTML and PDF.

Introduction

DBUnit Rules integrates JUnit and DBUnit through JUnit rules and, in the case of CDI based tests, a CDI interceptor. This powerful combination lets you easily prepare the database state for testing through XML, JSON, XLS or YAML files.

Most of the inspiration for DBUnit Rules came from Arquillian Persistence Extension, a library for in-container database integration tests.

Source code for the upcoming examples can be found on GitHub: https://github.com/rmpestano/dbunit-rules-sample

Setup DBUnit Rules

The first thing to do is to add the DBUnit Rules core module to your test classpath:

        <dependency>
            <groupId>com.github.dbunit-rules</groupId>
            <artifactId>core</artifactId>
            <version>0.9.0</version>
            <scope>test</scope>
        </dependency>

Secondly we need a database. For testing I recommend HSQLDB, a very fast in-memory database; here is its Maven dependency:

        <dependency>
            <groupId>org.hsqldb</groupId>
            <artifactId>hsqldb</artifactId>
            <version>2.3.3</version>
            <scope>test</scope>
        </dependency>

Later a JPA provider will be needed; in this case Hibernate will be used:

        <dependency>
            <groupId>org.hibernate</groupId>
            <artifactId>hibernate-core</artifactId>
            <version>4.2.8.Final</version>
            <scope>provided</scope>
        </dependency>
        <dependency>
            <groupId>org.hibernate</groupId>
            <artifactId>hibernate-entitymanager</artifactId>
            <version>4.2.8.Final</version>
            <scope>test</scope>
        </dependency>

And the entity manager's persistence.xml (src/test/resources/META-INF/persistence.xml):

<?xml version="1.0" encoding="UTF-8"?>
<persistence version="2.0" xmlns="http://java.sun.com/xml/ns/persistence" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xsi:schemaLocation="http://java.sun.com/xml/ns/persistence http://java.sun.com/xml/ns/persistence/persistence_2_0.xsd">
    <persistence-unit name="rulesDB" transaction-type="RESOURCE_LOCAL">

            <provider>org.hibernate.ejb.HibernatePersistence</provider>
            <class>com.github.dbunit.rules.sample.User</class>

            <properties>
                <property name="hibernate.dialect" value="org.hibernate.dialect.HSQLDialect" />
                <property name="javax.persistence.jdbc.driver" value="org.hsqldb.jdbcDriver" />
                <property name="javax.persistence.jdbc.url" value="jdbc:hsqldb:mem:test;DB_CLOSE_DELAY=-1" />
                <property name="javax.persistence.jdbc.user" value="sa" />
                <property name="javax.persistence.jdbc.password" value="" />
                <property name="hibernate.hbm2ddl.auto" value="create-drop" /> <1>
                <property name="hibernate.show_sql" value="true" />
            </properties>

    </persistence-unit>

</persistence>
  1. We’re creating the database from our JPA entities, but we could use a database migration tool like Flyway to do this work; see an example here.

And finally the JPA entity which our tests will work on:

@Entity
public class User {

    @Id
    @GeneratedValue
    private long id;

    private String name;

    // getters and setters omitted for brevity
}

Now we are ready to rule our database tests!

Example

Create a yaml file, with two users, which will be used to prepare the database before the test:

src/test/resources/dataset/users.yml

 
user:
  - id: 1
    name: "@realpestano"
  - id: 2
    name: "@dbunit"

And the JUnit test:

@RunWith(JUnit4.class)
public class DBUnitRulesCoreTest {

    @Rule
    public EntityManagerProvider emProvider = EntityManagerProvider.instance("rulesDB"); <1> 

    @Rule
    public DBUnitRule dbUnitRule = DBUnitRule.instance(emProvider.connection()); <2>


    @Test
    @DataSet("users.yml") <3>
    public void shouldListUsers() {
        List<User> users = em(). <4>
                createQuery("select u from User u").
                getResultList();
        assertThat(users).
                isNotNull().
                isNotEmpty().
                hasSize(2);
    }
}
  1. EntityManagerProvider is a JUnit rule that initializes a JPA entity manager before each test class. rulesDB is the name of the persistence unit;
  2. The DBUnit rule reads @DataSet annotations and initializes the database before each test method. This rule only needs a JDBC connection to be created.
  3. The dataset configuration itself; see here for all available configuration options.
  4. em() is a shortcut (import static com.github.dbunit.rules.util.EntityManagerProvider.em;) for the EntityManager that was initialized by EntityManagerProvider rule.

Transactions

The EntityManagerProvider rule provides entity manager transactions so you can insert/delete entities in your tests:

    @Test
    @DataSet("users.yml")
    public void shouldUpdateUser() {
        User user = (User) em().
                createQuery("select u from User u  where u.id = 1").
                getSingleResult();
        assertThat(user).isNotNull();
        assertThat(user.getName()).isEqualTo("@realpestano");
        tx().begin(); <1>
        user.setName("@rmpestano");
        em().merge(user);
        tx().commit();
        assertThat(user.getName()).isEqualTo("@rmpestano");
    }

    @Test
    @DataSet("users.yml")
    public void shouldDeleteUser() {
        User user = (User) em().
                createQuery("select u from User u  where u.id = 1").
                getSingleResult();
        assertThat(user).isNotNull();
        assertThat(user.getName()).isEqualTo("@realpestano");
        tx().begin();
        em().remove(user);
        tx().commit();
        List<User> users = em().
                createQuery("select u from User u ").
                getResultList();
        assertThat(users).
                hasSize(1);
    }
  1. tx() is a shortcut for the entity manager transaction provided by EntityManagerProvider.

Database assertion with ExpectedDataSet

Consider the following datasets:

src/test/resources/dataset/users.yml

user:
  - id: 1
    name: "@realpestano"
  - id: 2
    name: "@dbunit"

and expected dataset:

src/test/resources/dataset/expectedUser.yml

user:
  - id: 2
    name: "@dbunit"

And the following test:

    @Test
    @DataSet("users.yml")
    @ExpectedDataSet(value = "expectedUser.yml",ignoreCols = "id") <1>
    public void shouldAssertDatabaseUsingExpectedDataSet() {
        User user = (User) em().
                createQuery("select u from User u  where u.id = 1").
                getSingleResult();
        assertThat(user).isNotNull();
        tx().begin();
        em().remove(user);
        tx().commit();
    }
  1.  Database state after test will be compared with dataset provided by @ExpectedDataSet.

NOTE that if the database state does not match the expected dataset, an assertion error is thrown. For example, imagine that in the test above we had deleted the user with id=2; the error would be:

junit.framework.ComparisonFailure: value (table=USER, row=0, col=name)
Expected :@dbunit
Actual   :@realpestano
	at org.dbunit.assertion.JUnitFailureFactory.createFailure(JUnitFailureFactory.java:39)
	at org.dbunit.assertion.DefaultFailureHandler.createFailure(DefaultFailureHandler.java:97)
	at org.dbunit.assertion.DefaultFailureHandler.handle(DefaultFailureHandler.java:223)
	at com.github.dbunit.rules.assertion.DataSetAssert.compareData(DataSetAssert.java:94)

OBS: Since version 0.9.0 (to be released at the time of writing), transactions can be managed automatically at the test level (useful for expected datasets); see an example here.

Regular Expressions

Expected datasets also allow regular expressions:

src/test/resources/dataset/expectedUsersRegex.yml

user:
  - id: "regex:\\d+"
    name: regex:^expected user.* #expected user1
  - id: "regex:\\d+"
    name: regex:.*user2$ #expected user2

And the test:

    @Test
    @DataSet(cleanBefore = true) <1>
    @ExpectedDataSet("expectedUsersRegex.yml")
    public void shouldAssertDatabaseUsingRegex() {
        User u = new User();
        u.setName("expected user1");
        User u2 = new User();
        u2.setName("expected user2");
        tx().begin();
        em().persist(u);
        em().persist(u2);
        tx().commit();
    }
  1. You don’t need to initialize a dataset, but you can use cleanBefore to clear the database before testing.
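Under the hood, regex: prefixed values are matched with standard java.util.regex patterns against the actual column values. The following self-contained sketch illustrates the kind of comparison performed; the prefix handling here is an illustration of the idea, not the library's actual code:

```java
import java.util.regex.Pattern;

public class RegexMatchSketch {

    // Hypothetical helper mirroring how a "regex:" prefixed expected value
    // could be compared against an actual column value
    static boolean matches(String expected, String actual) {
        String prefix = "regex:";
        if (expected.startsWith(prefix)) {
            return Pattern.matches(expected.substring(prefix.length()).trim(), actual);
        }
        return expected.equals(actual); // plain values fall back to equality
    }

    public static void main(String[] args) {
        System.out.println(matches("regex:\\d+", "42"));                         // true
        System.out.println(matches("regex:^expected user.*", "expected user1")); // true
        System.out.println(matches("regex:.*user2$", "expected user2"));         // true
    }
}
```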

IMPORTANT: When you use a dataset like users.yml in @DataSet, DBUnit will use the CLEAN_INSERT seeding strategy (by default) for all tables declared in the dataset. This is why we didn’t need cleanBefore in any of the other example tests.

Scriptable datasets

DBUnit Rules enables scripting in datasets for languages that implement JSR 223 – Scripting for the Java Platform; see this article for more information.
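JSR 223 engines are looked up through the javax.script API. The following self-contained sketch shows how such an expression can be evaluated; it is illustrative only, not the library's actual code. Note that the bundled JavaScript engine (Nashorn) ships with JDK 8; on JDK 15 and later an engine must be added to the classpath, in which case the lookup returns null:

```java
import javax.script.ScriptEngine;
import javax.script.ScriptEngineManager;
import javax.script.ScriptException;

public class ScriptEvalSketch {

    // Evaluates an expression with the named JSR 223 engine,
    // returning null when no such engine is on the classpath
    static Object eval(String engineName, String expression) {
        ScriptEngine engine = new ScriptEngineManager().getEngineByName(engineName);
        if (engine == null) {
            return null;
        }
        try {
            return engine.eval(expression);
        } catch (ScriptException e) {
            throw new IllegalStateException(e);
        }
    }

    public static void main(String[] args) {
        // Same expression as the js: dataset value; yields 50 on a Nashorn-capable JDK
        System.out.println(eval("JavaScript", "(5+5)*10/2"));
    }
}
```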

For this example we will introduce another JPA entity:

@Entity
public class Tweet {

    @Id
    @GeneratedValue
    private long id;

    @Size(min = 1, max = 140)
    private String content;

    private Integer likes;

    @Temporal(TemporalType.DATE)
    private Date date;

    @ManyToOne(fetch = FetchType.LAZY)
    User user;

    // getters and setters omitted for brevity
}

Javascript Scriptable dataset

Following is a dataset which uses Javascript:

src/test/resources/datasets/dataset-with-javascript.yml

tweet:
  - id: 1
    content: "dbunit rules!"
    likes: "js:(5+5)*10/2" <1>
    user_id: 1
  1. The js: prefix enables JavaScript in datasets.

And the JUnit test:

    @Test
    @DataSet(value = "dataset-with-javascript.yml",
            cleanBefore = true, <1> 
            disableConstraints = true)  <2>
    public void shouldSeedDatabaseUsingJavaScriptInDataset() {
        Tweet tweet = (Tweet) emProvider.em().createQuery("select t from Tweet t where t.id = 1").getSingleResult();
        assertThat(tweet).isNotNull();
        assertThat(tweet.getLikes()).isEqualTo(50);
    }
    
  1. As we haven’t declared the User table in the dataset, it will not be cleared by the CLEAN_INSERT seeding strategy, so we need cleanBefore to avoid conflicts with other tests that insert users.
  2. Disabling constraints is necessary because the Tweet table depends on User.

If we do not disable constraints, we will receive the error below on dataset creation:

Caused by: org.dbunit.DatabaseUnitException: Exception processing table name='TWEET'
	at org.dbunit.operation.AbstractBatchOperation.execute(AbstractBatchOperation.java:232)
	at org.dbunit.operation.CompositeOperation.execute(CompositeOperation.java:79)
	at com.github.dbunit.rules.dataset.DataSetExecutorImpl.createDataSet(DataSetExecutorImpl.java:127)
	... 21 more
Caused by: java.sql.SQLIntegrityConstraintViolationException: integrity constraint violation: foreign key no parent; FK_OH8MF7R69JSK6IISPTIAOCC6L table: TWEET
	at org.hsqldb.jdbc.JDBCUtil.sqlException(Unknown Source)

TIP: If we declare the User table in the dataset-with-javascript.yml dataset, we can remove the cleanBefore and disableConstraints attributes.

Groovy scriptable dataset

JavaScript comes with the JDK by default, but you can use other scripting languages such as Groovy; to do so you need to add it to the test classpath:

        <dependency>
            <groupId>org.codehaus.groovy</groupId>
            <artifactId>groovy-all</artifactId>
            <version>2.4.6</version>
            <scope>test</scope>
        </dependency>

If Groovy is not present in the classpath we’ll receive a warning message (maybe we should fail instead, what do you think?):

WARNING: Could not find script engine with name groovy in classpath

Here’s our Groovy-based dataset, dataset-with-groovy.yml:

tweet:
  - id: "1"
    content: "dbunit rules!"
    date: "groovy:new Date()" <1>
    user_id: 1
  1. The groovy: prefix enables Groovy in datasets.

And here is the test:

    @Test
    @DataSet(value = "dataset-with-groovy.yml",
            cleanBefore = true,
            disableConstraints = true)
    public void shouldSeedDatabaseUsingGroovyInDataset() throws ParseException {
        Tweet tweet = (Tweet) emProvider.em().createQuery("select t from Tweet t where t.id = '1'").getSingleResult();
        assertThat(tweet).isNotNull();
        SimpleDateFormat sdf = new SimpleDateFormat("yyyy-MM-dd");//remove time
        Date now = sdf.parse(sdf.format(new Date()));
        assertThat(tweet.getDate()).isEqualTo(now);
    }

Multiple databases

Multiple databases can be tested by using multiple DBUnit rules and entity manager providers:

package com.github.dbunit.rules.sample;

import com.github.dbunit.rules.DBUnitRule;
import com.github.dbunit.rules.api.dataset.DataSet;
import com.github.dbunit.rules.api.dataset.DataSetExecutor;
import com.github.dbunit.rules.api.dataset.DataSetModel;
import com.github.dbunit.rules.connection.ConnectionHolderImpl;
import com.github.dbunit.rules.dataset.DataSetExecutorImpl;
import com.github.dbunit.rules.util.EntityManagerProvider;
import org.junit.Rule;
import org.junit.Test;
import org.junit.runner.RunWith;
import org.junit.runners.JUnit4;

import static org.assertj.core.api.Assertions.assertThat;

/**
 * Created by pestano on 23/07/15.
 */

@RunWith(JUnit4.class)
public class MultipleDataBasesTest {

    @Rule
    public EntityManagerProvider emProvider = EntityManagerProvider.instance("pu1");

    @Rule
    public EntityManagerProvider emProvider2 = EntityManagerProvider.instance("pu2");

    @Rule
    public DBUnitRule rule1 = DBUnitRule.instance("rule1",emProvider.connection()); <1>

    @Rule
    public DBUnitRule rule2 = DBUnitRule.instance("rule2",emProvider2.connection());


    @Test
    @DataSet(value = "users.yml", executorId = "rule1") <2>
    public void shouldSeedDatabaseUsingPu1() {
        User user = (User) emProvider.em().
                createQuery("select u from User u where u.id = 1").getSingleResult();
        assertThat(user).isNotNull();
        assertThat(user.getId()).isEqualTo(1);
    }

    @Test
    @DataSet(value = "users.yml", executorId = "rule2")
    public void shouldSeedDatabaseUsingPu2() {
        User user = (User) emProvider2.em().
                createQuery("select u from User u where u.id = 1").getSingleResult();
        assertThat(user).isNotNull();
        assertThat(user.getId()).isEqualTo(1);
    }

    @Test <3>
    public void shouldSeedDatabaseUsingMultiplePus() {
        DataSetExecutor exec1 = DataSetExecutorImpl.
                instance("exec1", new ConnectionHolderImpl(emProvider.connection()));
        DataSetExecutor exec2 = DataSetExecutorImpl.
                instance("exec2", new ConnectionHolderImpl(emProvider2.connection()));

        //programmatic seed db1
        exec1.createDataSet(new DataSetModel("users.yml"));

        exec2.createDataSet(new DataSetModel("dataset-with-javascript.yml"));//seed db2

        //user comes from database represented by pu1
        User user = (User) emProvider.em().
                createQuery("select u from User u where u.id = 1").getSingleResult();
        assertThat(user).isNotNull();
        assertThat(user.getId()).isEqualTo(1);

        //tweet comes from database represented by pu2
        Tweet tweet = (Tweet) emProvider2.em().createQuery("select t from Tweet t where t.id = 1").getSingleResult();
        assertThat(tweet).isNotNull();
        assertThat(tweet.getLikes()).isEqualTo(50);
    }

}
  1. rule1 is the id of the DataSetExecutor, the component responsible for database initialization in DBUnit Rules.
  2. Here we match the dataset executor id in the @DataSet annotation, so in this test we are going to use the database from pu1.
  3. For multiple databases in the same test we need to initialize the database state programmatically.

Ruling database in CDI tests

For CDI based tests we are going to use the DeltaSpike test control module and DBUnit Rules CDI.

The first enables CDI in JUnit tests and the second enables DBUnit through a CDI interceptor.

Classpath dependencies

First we need DBUnit CDI:

       <dependency>
            <groupId>com.github.dbunit-rules</groupId>
            <artifactId>cdi</artifactId>
            <version>0.9.0</version>
            <scope>test</scope>
        </dependency>

And also DeltaSpike control module:

        <dependency> <1>
            <groupId>org.apache.deltaspike.core</groupId>
            <artifactId>deltaspike-core-impl</artifactId>
            <version>1.6.1</version>
            <scope>test</scope>
        </dependency>

        <dependency> <2>
            <groupId>org.apache.deltaspike.modules</groupId>
            <artifactId>deltaspike-test-control-module-api</artifactId>
            <version>1.6.1</version>
            <scope>test</scope>
        </dependency>

        <dependency> <2>
            <groupId>org.apache.deltaspike.modules</groupId>
            <artifactId>deltaspike-test-control-module-impl</artifactId>
            <version>1.6.1</version>
            <scope>test</scope>
        </dependency>

        <dependency> <3>
            <groupId>org.apache.deltaspike.cdictrl</groupId>
            <artifactId>deltaspike-cdictrl-owb</artifactId>
            <version>1.6.1</version>
            <scope>test</scope>
        </dependency>

        <dependency> <4>
            <groupId>org.apache.openwebbeans</groupId>
            <artifactId>openwebbeans-impl</artifactId>
            <version>1.6.2</version>
            <scope>test</scope>
        </dependency>
  1. DeltaSpike core module is base of all DeltaSpike modules
  2. Test control module api and impl
  3. CDI control OWB dependency, which is responsible for bootstrapping the CDI container
  4. OpenWebBeans as CDI implementation

Configuration

For configuration we will need a beans.xml which enables DBUnit CDI interceptor:

/src/test/resources/META-INF/beans.xml

<?xml version="1.0" encoding="UTF-8"?>
<beans xmlns="http://java.sun.com/xml/ns/javaee" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xsi:schemaLocation="http://java.sun.com/xml/ns/javaee http://java.sun.com/xml/ns/javaee/beans_1_0.xsd">

       <interceptors>
              <class>com.github.dbunit.rules.cdi.DBUnitInterceptor</class>
       </interceptors>
</beans>

And apache-deltaspike.properties to set our tests as CDI beans:

/src/test/resources/META-INF/apache-deltaspike.properties

deltaspike.testcontrol.use_test_class_as_cdi_bean=true 

The test itself must be a CDI bean so DBUnit Rules can intercept it.

The last configuration needed is to produce an EntityManager for tests:

package com.github.dbunit.rules.sample.cdi;


import com.github.dbunit.rules.util.EntityManagerProvider;

import javax.enterprise.context.ApplicationScoped;
import javax.enterprise.inject.Produces;
import javax.persistence.EntityManager;

/**
 * Created by pestano on 09/10/15.
 */
@ApplicationScoped
public class EntityManagerProducer {

    private EntityManager em;


    @Produces
    public EntityManager produce() {
        return EntityManagerProvider.instance("rulesDB").em();
    }

}

This entity manager will be used as a bridge to the JDBC connection needed by DBUnit Rules.

Example

@RunWith(CdiTestRunner.class) <1>
public class DBUnitRulesCDITest {

    @Inject
    EntityManager em; <2>


    @Test
    @UsingDataSet("users.yml") <3>
    public void shouldListUsers() {
        List<User> users = em.
                createQuery("select u from User u").
                getResultList();
        assertThat(users).
                isNotNull().
                isNotEmpty().
                hasSize(2);
    }
}
  1. DeltaSpike JUnit runner that enables CDI in tests;
  2. The EntityManager we produced in previous step;
  3. This annotation enables DBUnit CDI interceptor which will prepare database state before the test execution.

All other features presented earlier, except multiple databases, are supported by DBUnit CDI.

Here is an ExpectedDataSet example:

src/test/resources/datasets/expectedUsers.yml

user:
  - id: 1
    name: "expected user1"
  - id: 2
    name: "expected user2"

And the test:

    @Test
    @UsingDataSet(cleanBefore = true) //needed to activate interceptor (can be at class level)
    @ExpectedDataSet(value = "expectedUsers.yml",ignoreCols = "id")
    public void shouldMatchExpectedDataSet() {
        User u = new User();
        u.setName("expected user1");
        User u2 = new User();
        u2.setName("expected user2");
        em.getTransaction().begin();
        em.persist(u);
        em.persist(u2);
        em.getTransaction().commit();
    }

Ruling database in BDD tests

BDD and DBUnit are integrated by DBUnit Rules Cucumber. It’s a Cucumber runner which is CDI aware.

Configuration

Just add the following dependency to your classpath:

       <dependency>
            <groupId>com.github.dbunit-rules</groupId>
            <artifactId>cucumber</artifactId>
            <version>0.9.0</version>
            <scope>test</scope>
        </dependency>

Now you just need to use CdiCucumberTestRunner to have Cucumber, CDI and DBUnit in your BDD tests.

Example

First we need a feature file:

src/test/resources/features/search-users.feature

Feature: Search users
In order to find users quickly
As a recruiter
I want to be able to query users by their tweets.

Scenario Outline: Search users by tweet content

Given We have two users that have tweets in our database

When I search them by tweet content <value>

Then I should find <number> users
Examples:
| value    | number |
| "dbunit" | 1      |
| "rules"  | 2      |

Then a dataset to prepare our database:

src/test/resources/datasets/usersWithTweet.json

{
  "USER": [
    {
      "id": 1,
      "name": "@realpestano"
    },
    {
      "id": 2,
      "name": "@dbunit"
    }
  ],
  "TWEET": [
    {
      "id": 1,
      "content": "dbunit rules json example",
      "date": "2013-01-20",
      "user_id": 1
    },
    {
      "id": 2,
      "content": "CDI rules",
      "date": "2016-06-20",
      "user_id": 2
    }
  ]
}

Now a Cucumber runner test entry point:

package com.github.dbunit.rules.sample.bdd;

import com.github.dbunit.rules.cucumber.CdiCucumberTestRunner;
import cucumber.api.CucumberOptions;
import org.junit.runner.RunWith;

/**
 * Created by rmpestano on 4/17/16.
 */
@RunWith(CdiCucumberTestRunner.class)
@CucumberOptions(features ="src/test/resources/features/search-users.feature")
public class DBUnitRulesBddTest {
}

And finally our cucumber step definitions:

package com.github.dbunit.rules.sample.bdd;

import com.github.dbunit.rules.cdi.api.UsingDataSet;
import com.github.dbunit.rules.sample.User;
import cucumber.api.PendingException;
import cucumber.api.java.en.Given;
import cucumber.api.java.en.Then;
import cucumber.api.java.en.When;
import org.hibernate.Criteria;
import org.hibernate.Session;
import org.hibernate.criterion.DetachedCriteria;
import org.hibernate.criterion.MatchMode;
import org.hibernate.criterion.Restrictions;
import org.hibernate.sql.JoinType;

import javax.inject.Inject;
import javax.persistence.EntityManager;
import java.util.List;

import static org.assertj.core.api.Assertions.assertThat;

/**
 * Created by pestano on 20/06/16.
 */
public class SearchUsersSteps {

    @Inject
    EntityManager entityManager;

    List<User> usersFound;

    @Given("^We have two users that have tweets in our database$")
    @UsingDataSet("usersWithTweet.json")
    public void We_have_two_users_in_our_database() throws Throwable {
    }

    @When("^I search them by tweet content \"([^\"]*)\"$")
    public void I_search_them_by_tweet_content_value(String tweetContent) throws Throwable {
        Session session = entityManager.unwrap(Session.class);
        usersFound = session.createCriteria(User.class).
        createAlias("tweets","tweets", JoinType.LEFT_OUTER_JOIN).
        add(Restrictions.ilike("tweets.content",tweetContent, MatchMode.ANYWHERE)).list();
    }

    @Then("^I should find (\\d+) users$")
    public void I_should_find_number_users(int numberOfUsersFound) throws Throwable {
        assertThat(usersFound).
                isNotNull().
                hasSize(numberOfUsersFound).
                contains(new User(1));//examples contains user with id=1
    }


}

Living documentation of DBUnit Rules is based on its BDD tests, you can access it here: http://rmpestano.github.io/dbunit-rules/documentation.html

 

Ruling Database in JUnit 5 tests

JUnit 5 is the new version of JUnit and comes with a new extension model, so instead of rules you will use extensions in your tests. DBUnit Rules comes with a JUnit 5 extension which enables DBUnit.

Configuration

Just add the following dependency to your classpath:

<dependency>
   <groupId>com.github.dbunit-rules</groupId>
   <artifactId>junit5</artifactId>
   <version>0.12.1-SNAPSHOT</version>
   <scope>test</scope>
</dependency>

Example

@ExtendWith(DBUnitExtension.class) <1>
@RunWith(JUnitPlatform.class) <2>
public class DBUnitJUnit5Test {

    private ConnectionHolder connectionHolder = () -> <3>
            instance("junit5-pu").connection(); <4>

    @Test
    @DataSet("users.yml")
    public void shouldListUsers() {
        List<User> users = em().createQuery("select u from User u").getResultList();
        assertThat(users).isNotNull().isNotEmpty().hasSize(2);
    }
}
  1. Enables DBUnit;
  2. JUnit 5 runner;
  3. As JUnit 5 requires Java 8, you can use lambdas in your tests;
  4. DBUnitExtension will get connection by reflection so just declare a field or a method with ConnectionHolder as return type.
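The lambda in the test above works because ConnectionHolder is, presumably, a single-abstract-method (functional) interface, so a lambda assigned to a field implements it. A minimal self-contained illustration of the pattern; Holder here is a stand-in, not the library's actual type:

```java
public class LambdaFieldSketch {

    // Stand-in for a single-abstract-method interface such as ConnectionHolder
    interface Holder<T> {
        T get();
    }

    // A field initialized with a lambda, mirroring the JUnit 5 test style
    final Holder<String> holder = () -> "jdbc-connection";

    public static void main(String[] args) {
        System.out.println(new LambdaFieldSketch().holder.get());
    }
}
```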
[1]. In the context of this article, database testing stands for integration tests which depend on a relational database, so that application business logic that depends on a database can be tested without mocking.

Some Words on JavaEE, REST and Swagger

Introduction

In this post I will cover the following topics:

  1. Create a simple JavaEE REST application using JBoss Forge
  2. Add some Rest authentication and authorization
  3. Create some Arquillian tests for the created Rest endpoint
  4. Use Swagger to generate Rest API documentation

I will enhance the CDI Crud application presented in previous posts: CDI Generic Dao, CDI Crud Multi “Tenancy” and Arquillian + DBUnit + Cucumber. All source code is available here: https://github.com/rmpestano/cdi-crud

Creating the REST Endpoint

I used JBoss Forge to execute this task. As Forge can be used to evolve an application, I have just executed the Rest setup command:

(screenshot: running the Rest setup command in Forge)

I have chosen to use JAX-RS 1.1 because I want to run the app on JBoss AS and WildFly:

(screenshot: choosing the JAX-RS 1.1 version)

As we already have our JPA entities, creating the endpoint is done with the generate endpoints from entities command:

(screenshot: generating endpoints from entities)

And the CarEndpoint is created and ready to CRUD cars via REST:

@Stateless
@Path("/cars")
public class CarEndpoint {
    @Inject
    CarService carService;

    @POST
    @Consumes("application/json")
    public Response create(Car entity) {
        carService.insert(entity);
        return Response.created(UriBuilder.fromResource(CarEndpoint.class).path(String.valueOf(entity.getId())).build()).build();
    }

    @DELETE
    @Path("/{id:[0-9][0-9]*}")
    public Response deleteById(@PathParam("id") Integer id) {
        Car entity = carService.findById(id);
        if (entity == null) {
            return Response.status(Status.NOT_FOUND).build();
        }
        carService.remove(entity);
        return Response.noContent().build();
    }

    @GET
    @Path("/{id:[0-9][0-9]*}")
    @Produces("application/json")
    public Response findById(@PathParam("id") Integer id) {
        Car entity;
        try {
            entity = carService.findById(id);
        } catch (NoResultException nre) {
            entity = null;
        }

        if (entity == null) {
            return Response.status(Status.NOT_FOUND).build();
        }
        return Response.ok(entity).build();
    }

    @GET
    @Produces("application/json")
    @Path("list")
    public List<Car> listAll(@QueryParam("start") @DefaultValue("0") Integer startPosition, @QueryParam("max") @DefaultValue("10") Integer maxResult) {
        Filter<Car> filter = new Filter<>();
        filter.setFirst(startPosition).setPageSize(maxResult);
        final List<Car> results = carService.paginate(filter);
        return results;
    }

    @PUT
    @Path("/{id:[0-9][0-9]*}")
    @Consumes("application/json")
    public Response update(@PathParam("id") Integer id,  Car entity) {
        if (entity == null) {
            return Response.status(Status.BAD_REQUEST).build();
        }
        if (!id.equals(entity.getId())) {
            return Response.status(Status.CONFLICT).entity(entity).build();
        }
        if (carService.crud().eq("id",id).count() == 0) {
            return Response.status(Status.NOT_FOUND).build();
        }
        try {
            carService.update(entity);
        } catch (OptimisticLockException e) {
            return Response.status(Response.Status.CONFLICT).entity(e.getEntity()).build();
        }

        return Response.noContent().build();
    }
}

I have only replaced the EntityManager used by Forge with CarService, which was created in previous posts; the rest of the code was generated by Forge. This step was really straightforward, thanks to Forge.

REST Authentication

To authenticate the client before calling the REST endpoint, I’ve created a CDI interceptor:

@RestSecured
@Interceptor
public class RestSecuredImpl implements Serializable{

    @Inject
    CustomAuthorizer authorizer;

    @Inject
    Instance<HttpServletRequest> request;

    @AroundInvoke
    public Object invoke(InvocationContext context) throws Exception {
        String currentUser = request.get().getHeader("user");
         if( currentUser != null){
             authorizer.login(currentUser);
         } else{
             throw new CustomException("Access forbidden");
         }
        return context.proceed();
    }

}

 

So for this app we are getting the current user from an HTTP header named user. If the interceptor doesn’t find the header it will throw a CustomException, which will be explained in the next section. Note that only endpoints annotated with @RestSecured will be intercepted. Authorization is done by CustomAuthorizer.

Verifying Authorization

Authorization is performed by CustomAuthorizer, which is based on the DeltaSpike security module. A very simple authorizer was created; it is based on the username and stores the logged-in user in a HashMap:

@ApplicationScoped
public class CustomAuthorizer implements Serializable {

    Map<String, String> currentUser = new HashMap<>();

    @Secures
    @Admin
    public boolean doAdminCheck(InvocationContext invocationContext, BeanManager manager) throws Exception {
        boolean allowed = currentUser.containsKey("user") && currentUser.get("user").equals("admin");
        if(!allowed){
            throw new CustomException("Access denied");
        }
        return allowed;
    }

   
    public void login(String username) {
        currentUser.put("user", username);
    }
}

When authorization fails (the check method returns false) we throw another CustomException.

I have created a REST provider to map CustomException into Response types:

@Provider
public class CustomExceptionMapper implements ExceptionMapper<CustomException> {

    @Override
    public Response toResponse(CustomException e) {
        Map map = new HashMap();
        map.put("message", e.getMessage());

        if (e.getMessage().equals("Access forbidden")) {//TODO create specific exception and its mapper
            return Response.status(Response.Status.FORBIDDEN).type(MediaType.APPLICATION_JSON).entity(map).build();
        }
        if (e.getMessage().equals("Access denied")) {//TODO create specific exception and its mapper
            return Response.status(Response.Status.UNAUTHORIZED).type(MediaType.APPLICATION_JSON).entity(map).build();
        }
        return Response.status(Response.Status.BAD_REQUEST).type(MediaType.APPLICATION_JSON).entity(map).build();
    }
}

I have only added authentication to the delete endpoint:

    @DELETE
    @Path("/{id:[0-9][0-9]*}")
    @RestSecured
    public Response deleteById(@PathParam("id") Integer id) {
        Car entity = carService.findById(id);
        if (entity == null) {
            return Response.status(Status.NOT_FOUND).build();
        }
        carService.remove(entity);
        return Response.noContent().build();
    }

Basically I added the @RestSecured annotation. It means that if a client fires a request to this endpoint without providing a user in the HTTP header, the method will not be called and the response will be 403. If the client provides a user but that user is not allowed, the HTTP response will be 401.
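The decision just described can be condensed into a small plain-Java sketch (the class and method names below are mine, not from the project; the real flow goes through the interceptor and CustomAuthorizer):

```java
public class RestSecuredSketch {

    // Mirrors the behaviour described above: no "user" header -> 403,
    // a user other than "admin" -> 401, admin -> delete succeeds with 204.
    static int statusForDelete(String userHeader) {
        if (userHeader == null) {
            return 403; // not authenticated: header missing
        }
        if (!"admin".equals(userHeader)) {
            return 401; // authenticated but not authorized
        }
        return 204; // authorized: car deleted, no content
    }

    public static void main(String[] args) {
        System.out.println(statusForDelete(null));    // 403
        System.out.println(statusForDelete("guest")); // 401
        System.out.println(statusForDelete("admin")); // 204
    }
}
```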

For authorization we use @Admin in the service method:

@Stateless
public class CarService extends CrudService<Car> {

    @Override
    @Admin
    public void remove(Car car) {
        super.remove(car);
    }
}

@Admin activates our CustomAuthorizer, which verifies whether the current user is authorized to execute the annotated method.

Testing the REST Endpoint

To test CarEndpoint I have used Arquillian, REST Assured and DBUnit. Before each test our database is populated with 4 cars described in the car.yml dataset:

car:
  - id: 1
    model: "Ferrari"
    price: 2450.8
  - id: 2
    model: "Mustang"
    price: 12999.0
  - id: 3
    model: "Porche"
    price: 1390.3
  - id: 4
    model: "Porche274"
    price: 18990.23

I’ve implemented tests for all CRUD and HTTP operations. I will show only the List and DELETE tests; the other tests can be found in CrudRest.java. Here is what the list-all-cars test looks like:


    @Test
    public void shouldListCars() {
        given().
                queryParam("start",0).queryParam("max", 10).
        when().
                get(basePath + "rest/cars/list").
        then().
                statusCode(Response.Status.OK.getStatusCode()).
                body("", hasSize(4)).//dataset has 4 cars
                body("model", hasItem("Ferrari")).
                body("price", hasItem(2450.8f)).
                body(containsString("Porche274"));
    }

For the DELETE method I have one test that fails authentication, another that fails authorization, and one that successfully deletes a car:


    @Test
    public void shouldFailToDeleteCarWithoutAuthentication() {
        given().
                contentType(ContentType.JSON).
                when().
                delete(basePath + "rest/cars/1").  //dataset has car with id =1
                then().
                statusCode(Response.Status.FORBIDDEN.getStatusCode());
    }

    @Test
    public void shouldFailToDeleteCarWithoutAuthorization() {
        given().
                contentType(ContentType.JSON).
                header("user", "guest"). //only admin can delete
                when().
                delete(basePath + "rest/cars/1").  //dataset has car with id =1
                then().
                statusCode(Response.Status.UNAUTHORIZED.getStatusCode());
    }

    @Test
    public void shouldDeleteCar() {
        given().
                contentType(ContentType.JSON).
                header("user","admin").
        when().
                delete(basePath + "rest/cars/1").  //dataset has car with id =1
        then().
                statusCode(Response.Status.NO_CONTENT.getStatusCode());

        //ferrari should not be in db anymore
        given().
        when().
                get(basePath + "rest/cars/list").
         then().
                statusCode(Response.Status.OK.getStatusCode()).
                body("", hasSize(3)).
                body("model", not(hasItem("Ferrari")));
    }

Generating the REST API Documentation

To generate the API documentation I will use Swagger, a specification for REST APIs. Swagger is composed of various components; the main ones are:

  • swagger-spec: describes the format of REST APIs
  • swagger-codegen: generates REST clients based on the swagger spec
  • swagger-ui: generates web pages describing the API based on the swagger spec
  • swagger-editor: design swagger specifications from scratch using a simple YAML structure

Instead of using “pure” Swagger, which requires its own annotations, I will use swagger-jaxrs-doclet, which is based on Javadoc and leverages JAX-RS annotations.

The first thing to do is to copy the swagger-ui distribution (the swagger-ui I’ve used can be found here; I’ve made minor changes to index.html) to your application in webapp/apidocs, as in the image below:

img04

Now we just have to generate the swagger spec files based on our REST endpoints. This is done by the doclet Maven plugin:


     <plugin>
                <groupId>org.apache.maven.plugins</groupId>
                <artifactId>maven-javadoc-plugin</artifactId>
                <version>2.9.1</version>
                <executions>
                    <execution>
                        <id>generate-service-docs</id>
                        <phase>generate-resources</phase>
                        <configuration>
                            <doclet>com.carma.swagger.doclet.ServiceDoclet</doclet>
                            <docletArtifact>
                                <groupId>com.carma</groupId>
                                <artifactId>swagger-doclet</artifactId>
                                <version>1.0.2</version>
                            </docletArtifact>
                            <reportOutputDirectory>src/main/webapp</reportOutputDirectory>
                            <useStandardDocletOptions>false</useStandardDocletOptions>
                            <additionalparam>-apiVersion 1 -docBasePath /cdi-crud/apidocs
                                -apiBasePath /cdi-crud/rest
                                -swaggerUiPath ${project.build.directory}/
                            </additionalparam>
                        </configuration>
                        <goals>
                            <goal>javadoc</goal>
                        </goals>
                    </execution>
                </executions>
            </plugin>

The plugin has 3 main configurations:

  • docBasePath: where the swagger spec files (.json) will be generated
  • apiBasePath: path used in the calls made from the API documentation (swagger-ui generates executable documentation)
  • swaggerUiPath: the plugin can generate the swagger-ui distribution. As I am copying the dist manually I do not use this option and point it to the target folder (in fact I could not get it working well, so I need to play more with this option).

With this configuration the swagger spec files will be generated on each build, which keeps the documentation and the API synchronized; see the .json spec files (in red):

img05

Now you can access your REST API documentation at the /apidocs URL:

img06

You can also fire REST requests through the API docs:

img12

We can also enhance the documentation via Javadoc; for example, we can add response types to describe the response codes. See the modified delete method:


   /**
     * @description deletes a car based on its ID
     * @param user name of the user to log in
     * @param id car ID
     * @status 401 only authorized users can access this resource
     * @status 403 only authenticated users can access this resource
     * @status 404 car not found
     * @status 204 Car deleted successfully
     */
    @DELETE
    @Path("/{id:[0-9][0-9]*}")
    @RestSecured
    public Response deleteById(@HeaderParam("user") String user, @PathParam("id") Integer id) {
        Car entity = carService.findById(id);
        if (entity == null) {
            return Response.status(Status.NOT_FOUND).build();
        }
        carService.remove(entity);
        return Response.noContent().build();
    }

And here is the generated doc (note that I’ve added @HeaderParam so we can authenticate through the documentation page):

img11

All supported doclet annotations can be found here.

This app and its REST documentation are available on OpenShift here: http://cdicrud-rpestano.rhcloud.com/cdi-crud/apidocs. There is also a simple car CRUD app. To see a more elaborate documentation generated by swagger and doclet, see the Carma REST API.

Arquillian + Cucumber + DBUnit

Arquillian and Cucumber integration is provided by the Cukespace project, a very active project with nice examples.

One issue I’ve found with Cukespace (also with Arquillian JBehave) is that I can’t use the arquillian-persistence extension to initialize my scenario datasets. The main cause is that the Cucumber test life cycle is different from JUnit/TestNG, so Arquillian isn’t aware when a test or Cucumber event is occurring; e.g., the Arquillian org.jboss.arquillian.test.spi.event.suite.* events aren’t triggered during Cucumber test execution.

There is an issue filed at the Cukespace project.

As arquillian-persistence uses DBUnit behind the scenes, I worked around that limitation (while the issue isn’t solved) using the pure DBUnit API. Here is some sample code; it can be found here.

CrudBdd.java

@RunWith(ArquillianCucumber.class)
@Features("features/search-cars.feature")
@Tags("@whitebox")
public class CrudBdd {

    @Inject
    CarService carService;

    Car carFound;

    int numCarsFound;

    @Deployment(name = "cdi-crud.war")
    public static Archive<?> createDeployment() {
        WebArchive war = Deployments.getBaseDeployment();
        war.addAsResource("datasets/car.yml", "car.yml").//needed by DBUnitUtils
                addClass(DBUnitUtils.class);
        System.out.println(war.toString(true));
        return war;
    }

    @Before
    public void initDataset() {
        DBUnitUtils.createDataset("car.yml");
    }

    @After
    public void clear() {
        DBUnitUtils.deleteDataset("car.yml");
    }

    @Given("^search car with model \"([^\"]*)\"$")
    public void searchCarWithModel(String model) {
        Car carExample = new Car();
        carExample.setModel(model);
        carFound = carService.findByExample(carExample);
        assertNotNull(carFound);
    }

    @When("^update model to \"([^\"]*)\"$")
    public void updateModel(String model) {
        carFound.setModel(model);
        carService.update(carFound);
    }

    @Then("^searching car by model \"([^\"]*)\" must return (\\d+) of records$")
    public void searchingCarByModel(final String model, final int result) {
        Car carExample = new Car();
        carExample.setModel(model);
        assertEquals(result, carService.crud().example(carExample).count());
    }

    @When("^search car with price less than (.+)$")
    public void searchCarWithPrice(final double price) {
        numCarsFound = carService.crud().initCriteria().le("price", price).count();
    }

    @Then("^must return (\\d+) cars")
    public void mustReturnCars(final int result) {
        assertEquals(result, numCarsFound);
    }

}

DBUnitUtils is a simple class that deals with the DBUnit API. The idea is to initialize the database on the Before and After events, which are triggered on each scenario execution; they are also triggered for each ‘example‘, which is very important so we can have a clean database on each execution.

Here is my feature file:

Feature: Search cars

@whitebox
Scenario Outline: simple search and update
Given search car with model "Ferrari"
When update model to "Audi"
Then searching car by model "<model>" must return <number> of records
Examples:
| model | number |
| Audi  | 1      |
| outro | 0      |

@whitebox
Scenario Outline: search car by price
When search car with price less than <price>
Then must return <number> cars
Examples:
| price | number |
| 1390.2 | 0 |
| 1390.3 | 1 |
| 10000.0 | 2 |
| 13000.0 | 3 |

@blackbox
Scenario Outline: search car by id
When search car by id <id>
Then must find car with model "<model>" and price <price>
Examples:
| id | model | price |
| 1 | Ferrari | 2450.8 |
| 2 | Mustang | 12999.0 |
| 3 | Porche | 1390.3 |
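A note on the price scenario above: the step definition uses a less-than-or-equal criteria (le), which is why a threshold of 1390.3 matches the Porche row exactly. That counting logic can be sketched in plain Java against the car.yml values (hardcoded here just for illustration; the class name is mine):

```java
import java.util.List;

public class PriceCountSketch {

    // prices from the car.yml dataset shown earlier
    static final List<Double> PRICES = List.of(2450.8, 12999.0, 1390.3, 18990.23);

    // mirrors the le("price", max) criteria used in the step definition
    static long countWithPriceUpTo(double max) {
        return PRICES.stream().filter(p -> p <= max).count();
    }

    public static void main(String[] args) {
        System.out.println(countWithPriceUpTo(1390.2));  // 0
        System.out.println(countWithPriceUpTo(1390.3));  // 1
        System.out.println(countWithPriceUpTo(13000.0)); // 3
    }
}
```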

DBUnitUtils.java

public class DBUnitUtils {

    private static DataSource ds;
    private static DatabaseConnection databaseConnection;

    public static void createDataset(String dataset) {

        if (!dataset.startsWith("/")) {
            dataset = "/" + dataset;
        }
        try {
            initConn();
            DatabaseOperation.CLEAN_INSERT.execute(databaseConnection,
new YamlDataSet(DBUnitUtils.class.getResourceAsStream(dataset)));
        } catch (Exception e) {
            e.printStackTrace();
            throw new RuntimeException("could not initialize dataset:" + dataset +
" \nmessage: " + e.getMessage());
        } finally {
            closeConn();
        }
    }

    public static void deleteDataset(String dataset) {
        if (!dataset.startsWith("/")) {
            dataset = "/" + dataset;
        }
        try {
            initConn();
            DatabaseOperation.DELETE_ALL.execute(databaseConnection,
new YamlDataSet(DBUnitUtils.class.getResourceAsStream(dataset)));
        } catch (Exception e) {
            e.printStackTrace();
            throw new RuntimeException("could not delete dataset:" + dataset +
" \nmessage: " + e.getMessage());
        } finally {
            closeConn();
        }
    }

    private static void closeConn() {
        try {
            if (databaseConnection != null && !databaseConnection.getConnection().isClosed()) {
                databaseConnection.getConnection().close();
            }
        } catch (SQLException e) {
            e.printStackTrace();
            throw new RuntimeException("could not close connection \nmessage: " + e.getMessage());
        }

    }

    private static void initConn() throws SQLException, NamingException, DatabaseUnitException {
        if (ds == null) {
            ds = (DataSource) new InitialContext()
.lookup("java:jboss/datasources/ExampleDS");
        }
        databaseConnection = new DatabaseConnection(ds.getConnection());
    }
}

and car.yml:

car:
  - id: 1
    model: "Ferrari"
    price: 2450.8
  - id: 2
    model: "Mustang"
    price: 12999.0
  - id: 3
    model: "Porche"
    price: 1390.3
  - id: 4
    model: "Porche274"
    price: 18990.23


DBUnit REST Endpoint

Another limitation of the persistence extension is its integration with functional tests (blackbox/RunAsClient/testable=false); see this issue. Basically, arquillian-persistence needs server-side resources such as the datasource to work, but blackbox tests run outside the container, in a separate JVM.

To overcome that limitation I’ve created a DBUnit REST endpoint and deployed it within my test, so I can make REST calls to the server and create the dataset there, where I have all the needed resources. Here is DBUnitUtils with REST calls:



public class DBUnitUtils {

    private static DataSource ds;
    private static DatabaseConnection databaseConnection;

  public static void createRemoteDataset(URL context, String dataset) {
        HttpURLConnection con = null;
        try {
            URL obj = new URL(context + "rest/dbunit/create/" + dataset);
            con = (HttpURLConnection) obj.openConnection();
            con.setRequestMethod("GET");
            con.setDoOutput(true);
            int responseCode = con.getResponseCode();
            if (responseCode != 200) {
                throw new RuntimeException("Could not create remote dataset\nstatus:" + responseCode + "\nerror:" + con.getResponseMessage());
            }
        } catch (Exception e) {
            e.printStackTrace();
        } finally {
            if (con != null) {
                con.disconnect();
            }
        }
    }

    public static void deleteRemoteDataset(URL context, String dataset) {
        HttpURLConnection con = null;
        try {
            URL obj = new URL(context + "rest/dbunit/delete/" + dataset);
            con = (HttpURLConnection) obj.openConnection();
            con.setRequestMethod("GET");
            con.setDoOutput(true);

            int responseCode = con.getResponseCode();
            if (responseCode != 200) {
                throw new RuntimeException("Could not delete remote dataset\nstatus:" + responseCode + "\nerror:" + con.getResponseMessage());
            }
        } catch (Exception e) {
            e.printStackTrace();
        } finally {
            if (con != null) {
                con.disconnect();
            }
        }
    }

}

and DBUnitRest.java

@Path("/dbunit")
public class DBUnitRest {

    @GET
    @Path("create/{dataset}")
    public Response createDataset(@PathParam("dataset") String dataset) {
        try {
            DBUnitUtils.createDataset(dataset);//i feel like going in circles
        } catch (Exception e) {
            return Response.status(Status.BAD_REQUEST).entity("Could not create dataset.\nmessage:" + e.getMessage() + "\ncause:" + e.getCause()).build();
        }
        return Response.ok("dataset created successfully").build();
    }

    @GET
    @Path("delete/{dataset}")
    public Response deleteDataset(@PathParam("dataset") String dataset) {
        try {
            DBUnitUtils.deleteDataset(dataset);//i feel like going in circles
        } catch (Exception e) {
            return Response.status(Status.BAD_REQUEST).entity("Could not delete dataset.\nmessage:" + e.getMessage() + "\ncause:" + e.getCause()).build();
        }
        return Response.ok("dataset deleted successfully").build();
    }

}


And here is my functional test, which uses Drone and Graphene:


@RunWith(ArquillianCucumber.class)
@Features("features/search-cars.feature")
@Tags("@blackbox")
public class CrudAt {

  @Deployment(name = "cdi-crud.war", testable=false)
  public static Archive<?> createDeployment() {
    WebArchive war = Deployments.getBaseDeployment();
        war.addAsResource("datasets/car.yml","car.yml").//needed by DBUnitUtils
                addPackage(DBUnitUtils.class.getPackage()).addClass(CrudBean.class).addClass(YamlDataSet.class).
                addClass(YamlDataSetProducer.class).
                addClass(Row.class).addClass(Table.class).addClass(DBUnitRest.class);

        war.merge(ShrinkWrap.create(GenericArchive.class).as(ExplodedImporter.class).importDirectory("src/main/webapp").as(GenericArchive.class), "/", Filters.include(".*\\.(xhtml|html|css|js|png|gif)$"));
        MavenResolverSystem resolver = Maven.resolver();
        war.addAsLibraries(resolver.loadPomFromFile("pom.xml").resolve("org.dbunit:dbunit:2.5.0").withoutTransitivity().asSingleFile());
        war.addAsLibraries(resolver.loadPomFromFile("pom.xml").resolve("org.yaml:snakeyaml:1.10").withoutTransitivity().asSingleFile());
    System.out.println(war.toString(true));
    return war;
  }

  @ArquillianResource
  URL url;

  @Drone
  WebDriver webDriver;

  @Page
  IndexPage index;

  @Before
  public void initDataset() {
      DBUnitUtils.createRemoteDataset(url,"car.yml");
  }

  @After
  public void clear(){
      DBUnitUtils.deleteRemoteDataset(url,"car.yml");
   }

  @When("^search car by id (\\d+)$")
  public void searchCarById(int id){
      Graphene.goTo(IndexPage.class);
      index.findById(""+id);
  }

  @Then("^must find car with model \"([^\"]*)\" and price (.+)$")
  public void returnCarsWithModel(String model, final double price){
    assertEquals(model,index.getInputModel().getAttribute("value"));
    assertEquals(price,Double.parseDouble(index.getInputPrice().getAttribute("value")),0);
  }

}

Here are the Cucumber reports:

whitebox

blackbox

That’s all folks.


Arquillian and Mocks

Although it’s a bit contradictory to talk about Arquillian and mocks, sometimes you may need to skip operations that are not relevant to your tests, for example:

  • Replicating data to legacy databases. Here there is a limitation of the arquillian-persistence extension, where you can’t have multiple datasources; in general you don’t want to test data replication anyway, although if you have the legacy datasource configured in the test server you can simply fire a native query to see if the data was replicated there.
  • If you use in-memory databases such as H2 (which is itself a form of fake, but we use it a lot ;)), you may not have some operations, functions or procedures that are available in the real application database, so you may want to skip those calls in tests.
  • Skip CMS calls (such as Alfresco) in tests.
  • Avoid starting a mail service during tests.
  • Skip in-memory data grid (such as Hazelcast) initialization during tests.

In my case most of the issues are due to the fact that I can’t replicate all my infrastructure in tests; maybe with Arquillian Docker integration they can be solved without mocks. Another relevant fact is that the components I’m faking slow down my tests, and sometimes their relevance to my business logic is low.

So if you have faced any situation where you need to mock components in Arquillian integration tests, here is how I am dealing with that. The source code is available here: https://github.com/rmpestano/arquillian-mocks. The examples are quite simple; if you know alternatives, don’t hesitate to comment here or send a pull request.

Here is some code:

MyBeanImpl.java


@RequestScoped
public class MyBeanImpl implements MyBean {

    private boolean alive = false;

    @PostConstruct
    public void init(){
       alive = true;
    }

    @Override
    public boolean someSlowOperation() {
        try {
            Thread.sleep(10000);
        } catch (InterruptedException e) {
            e.printStackTrace();
        }
        return true;
    }

    @Override
    public boolean isAlive() {
        return alive;
    }
}

and the integration test:


@RunWith(Arquillian.class)
public class MyBeanTest {

  @Inject
  MyBeanImpl realBean;

  @Deployment
  public static WebArchive createDeployment() {
        WebArchive war = ShrinkWrap.create(WebArchive.class, "test.war")
                .addClasses(MyBeanImpl.class, MyBean.class)
                .addAsWebInfResource("test-beans.xml", "beans.xml");
        return war;
    }

  @Test
  public void shouldRunSlowWithRealBean() {
        long start = System.currentTimeMillis();
        assertTrue(realBean.someSlowOperation());
        long executionTime = (System.currentTimeMillis() - start);
        assertTrue(executionTime >= 10000);//should take at least 10000ms
    }

}

Now here are some ways to mock the bean slow call.

1 – Using a mock framework:
You can use a mocking framework like Mockito to fake method calls. To do that you need to add Mockito to the test deployment, and then you can mock the slow method call:


@RunWith(Arquillian.class)
public class MyBeanTest {

    //@Mock //to use @Mock see auto discover extension: https://github.com/arquillian/arquillian-showcase/blob/master/extensions/autodiscover
    @Inject
    MyBeanImpl realMockedBean;

    @Before
    public void setupMock() {
        realMockedBean = Mockito.mock(MyBeanImpl.class);//real bean replaced by mocked one
        Mockito.when(realMockedBean.someSlowOperation()).thenReturn(true);
    }

    @Deployment
    public static WebArchive createDeployment() {
        WebArchive war = ShrinkWrap.create(WebArchive.class, "test.war")
                .addClasses(MyBeanImpl.class, MyBean.class)
                .addAsWebInfResource("test-beans.xml", "beans.xml");
         MavenResolverSystem resolver = Maven.resolver();
         war.addAsLibraries(resolver.loadPomFromFile("pom.xml").resolve("org.mockito:mockito-all:1.10.8").withTransitivity().asSingleFile());
        return war;
    }
 
    @Test
    public void shouldRunFastWithMockedBean() {
        long start = System.currentTimeMillis();
        assertTrue(realMockedBean.someSlowOperation());
        assertTrue((System.currentTimeMillis() - start) < 10000);
    }
}

Note that this approach has some limitations:

  1. If the mocked bean is injected into another (real) bean, you will have to set the mocked bean into the real one yourself.
  2. When you use Mockito.mock(SomeBean.class), CDI doesn’t manage the bean anymore, meaning that you will have to fake all the behaviour you need in tests.

2 – CDI alternatives:

Just create an alternative implementation of the real bean and enable it in test-beans.xml:

MyBeanAlternative.java


@Alternative
public class MyBeanAlternative implements MyBean {

    @Override
    public boolean someSlowOperation() {
         return true;
    }
}
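The alternative is then activated in the beans.xml packaged with the test (test-beans.xml in this project). A sketch of the entry follows; the package name below is hypothetical, use the alternative’s fully qualified class name:

```xml
<beans xmlns="http://java.sun.com/xml/ns/javaee">
    <alternatives>
        <!-- fully qualified class name; the package here is illustrative -->
        <class>com.example.MyBeanAlternative</class>
    </alternatives>
</beans>
```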


Add the alternative bean to the test deployment and inject the MyBean interface:


@RunWith(Arquillian.class)
public class MyBeanTest {

    @Inject
    MyBean myAlternativebean;

    @Deployment
    public static WebArchive createDeployment() {
        WebArchive war = ShrinkWrap.create(WebArchive.class, "test.war")
                .addClasses(MyBeanImpl.class, MyBean.class, MyBeanAlternative.class)
                .addAsWebInfResource("test-beans.xml", "beans.xml");

        return war;
    }

    @Test
    public void shouldRunFastWithAlternativeBean() {
        long start = System.currentTimeMillis();
        assertTrue(myAlternativebean.someSlowOperation());
        assertTrue((System.currentTimeMillis() - start) < 5000);//should run faster than 10000
    }
}

This is a good approach that solves the first limitation of the previous approach, but it still doesn’t solve the second one.

3 – CDI Bean Specialization

For bean specialization you don’t need to implement an interface; just extend the real bean and provide a new implementation for the method you want to fake:


@Specializes
public class MyBeanSpecialization extends MyBeanImpl {

    @Override
    public boolean someSlowOperation()  {
        return true;
    }
}


And now in the test you can inject the real bean, but don’t forget to add the bean specialization to the test deployment:

@RunWith(Arquillian.class)
public class MyBeanTest {

    @Inject
    MyBeanImpl realBean;

    @Deployment
    public static WebArchive createDeployment() {
        WebArchive war = ShrinkWrap.create(WebArchive.class, "test.war")
                .addClasses(MyBeanImpl.class, MyBean.class, MyBeanSpecialization.class)
                .addAsWebInfResource("test-beans.xml", "beans.xml");
        return war;
    }

    @Test
    public void shouldRunFastWithSpecializedBean() {
        long start = System.currentTimeMillis();
        assertTrue(realBean.someSlowOperation());
        long executionTime = (System.currentTimeMillis() - start);
        assertTrue(executionTime < 5000);//should run faster than 10000
    }
}

This approach solves both of the mentioned limitations.

4 – Bytecode manipulation

For bytecode manipulation I will use the arquillian-byteman extension, where you can override bean methods through bytecode weaving:

@RunWith(Arquillian.class)
public class MyBeanTest {

   @Inject
   MyBeanImpl realBean;

   @Deployment
   public static WebArchive createDeployment() {
       WebArchive war = ShrinkWrap.create(WebArchive.class, "test.war")
              .addClasses(MyBeanImpl.class, MyBean.class)
              .addAsWebInfResource("test-beans.xml", "beans.xml");
       return war;
    }

    @Test
    @BMRule(
      name = "fake method call", targetClass = "MyBeanImpl",
      targetMethod = "someSlowOperation",
      action = "return true;")
    public void shouldRunFastWithBytecodeManipulatedBean() {
        long start = System.currentTimeMillis();
        assertTrue(realBean.someSlowOperation());
        assertTrue(realBean.isAlive());
        assertTrue((System.currentTimeMillis() - start) < 5000);
    }
}

I made it work only in the managed container; you need to add these VM arguments to arquillian.xml:

<container qualifier="wildfly-managed">
        <configuration>
            <property name="outputToConsole">true</property>
            <property name="javaVmArguments">-Xmx512m -XX:MaxPermSize=256m
                -Djboss.modules.system.pkgs=com.sun.tools.attach,org.jboss.byteman -Xbootclasspath/a:${java.home}/../lib/tools.jar
            </property>
            <property name="allowConnectingToRunningServer">true</property>
    </configuration>
</container>
<extension qualifier="byteman">
    <property name="autoInstallAgent">true</property>
    <property name="agentProperties">org.jboss.byteman.verbose=false</property>
</extension>

I have tested this only with the WildFly application server.

Conclusion

If you do need mocks in your integration tests, I recommend option 3 – bean specialization – because with options 1 and 2 you’ll need to reimplement all the bean methods/behaviour, and option 4 doesn’t work on all containers yet.

Arquillian – The aliens are invading

I dedicate this post to my father; most of it was written while I was taking care of him at the hospital. Rest in peace, my best friend!

The idea behind this post is to share my experience with Arquillian[1], which is becoming the de facto standard framework for writing (real) tests in the Java EE environment (hence the blog title).

We will use Arquillian to test a Java EE 6 (compatible with EE 7) application; all sources are available at github here. It’s a simple User, Group and Role management application. Also, as usual, there is a video showing what we will see in this entry: http://youtu.be/iGkCcK1EwAQ

Introduction

Arquillian is a testing platform which brings the power of real tests to Java enterprise applications by enabling the easy creation of integration, functional and behaviour tests, among others.

One of the main characteristics of tests written with Arquillian is that they run inside a container (a servlet container, CDI, a Java EE 6+ server, mobile and so on), the so-called in-container testing[2]. With tests running inside a container, the developer or test engineer doesn’t need to be concerned with the server environment (EJB, JPA, CDI and JMS infrastructure) and can focus on the tests themselves.

To be able to run tests inside a container, it is necessary to specify which classes and resources will be part of the tests. This is called a “micro-deployment” because it is usually a subset of all the application resources.

Micro deployment

Creating micro-deployments is eased by the ShrinkWrap library; here is an example:

     @Deployment//tells arquillian which method generates the deployment
     public static Archive<?> createDeployment() {
	  WebArchive war = ShrinkWrap.create(WebArchive.class);

          //adding classes
	  war.addClass(MyClass.class).addClass(AnotherClass.class)
          //adding all classes in com.mypackage and its subpackages
          .addPackages(true, "com.mypackage");
          //adding libs
          MavenResolverSystem resolver = Maven.resolver();
	  //adds primefaces4.0.jar in web-inf/libs
          war.addAsLibraries(resolver.loadPomFromFile("pom.xml").resolve("org.primefaces:primefaces:4.0").withoutTransitivity().asFile());

	  //web-inf resources

	  //copies src/main/webapp/WEB-INF/beans.xml to micro-deployment WEB-INF folder
          war.addAsWebInfResource("src/main/webapp/WEB-INF/beans.xml", "beans.xml");

	 //resources

	 //copies src/test/resources/test-persistence.xml to META-INF/persistence.xml in the micro-deployment
         war.addAsResource("test-persistence.xml", "META-INF/persistence.xml");

	return war;
     }

Execution modes

To perform in-container tests, Arquillian can be configured to operate in three modes:

  • Embedded: an embedded version of the container is downloaded via Maven at execution time and the tests run on top of it.
  • Remote: Arquillian connects to a container running on a local or remote machine.
  • Managed: Arquillian manages the container startup/shutdown using a local container installation. Note that if the container is already running locally, Arquillian will connect to it as in remote mode.
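The mode is selected by the container adapter on the test classpath and configured in arquillian.xml. As a sketch (property names vary per container adapter; the ones below are used by the WildFly remote adapter, and the values are illustrative defaults):

```xml
<arquillian xmlns="http://jboss.org/schema/arquillian">
    <!-- remote mode: connect to an already running server -->
    <container qualifier="wildfly-remote" default="true">
        <configuration>
            <property name="managementAddress">127.0.0.1</property>
            <property name="managementPort">9990</property>
        </configuration>
    </container>
</arquillian>
```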

Test run modes

There are two ways tests can run with Arquillian: in-container and as client.

Running tests in-container means you have full control over the artifacts deployed by the micro-deployment: you can inject beans, access the persistence context, EJBs and so on. This is how we do integration/white-box tests.

Running as client is the opposite: we cannot access any deployed resource. This mode simulates a client accessing our application from outside and denotes black-box testing.

There is also a mixed mode where some tests run as client and others inside the container.

Here is a detailed explanation of the Arquillian test run modes.

Test lifecycle

Every Arquillian test execution goes through the following steps:

  1. Micro-deployment creation; it can be a jar, war or ear. If a dependency cannot be collected (resource not found), the process is aborted.
  2. Container startup (embedded and managed modes) or connection to a running container (remote mode) where the tests will run. The process is aborted if the server doesn’t start within 60 seconds or isn’t found.
  3. Deployment of the archive generated in step 1. If a dependency is missing (e.g. CDI bean dependent beans not deployed), Arquillian will not run the tests related to this micro-deployment.
  4. Execution of the tests related to the micro-deployment from step 3.
  5. Undeployment of the archive from step 3.
  6. Container shutdown (managed or embedded) or disconnection (remote).

Note: steps 1, 3, 4 and 5 repeat for each @Deployment – @RunWith(Arquillian.class) present in the test classpath.

Extensions:

The framework is composed of various extensions; here are the ones we will use:

  • Core: the base of all other extensions; it is responsible for deployment/undeployment, container startup and shutdown, and managing the test life cycle.
  • Drone: integration with Selenium, such as WebDriver management/injection.
  • Graphene: provides a layer of abstraction on top of WebDriver, extending its functionality to ease the creation of functional tests.
  • Persistence: brings database management to tests via DBUnit.
  • JBehave: enables BDD tests in Arquillian through JBehave.
  • Warp: enables gray box test creation.
  • Recorder: allows test recording through video and images.
  • Rest: enables RESTful endpoint testing.
  • JaCoCo: test coverage metrics.

For more extensions refer to the Arquillian GitHub organization and reference guide.

Configuring a project to use Arquillian

The following are the main steps to configure Arquillian.

Dependencies

Below is a basic Arquillian pom.xml:

 <dependencies>
      <!-- Test dependencies -->
       <dependency>
            <groupId>junit</groupId> <!--testNG is also supported-->
            <artifactId>junit</artifactId>
            <version>4.8.2</version>
            <scope>test</scope>
        </dependency>
        <!-- arquillian test framework set to junit, could be testNG -->
        <dependency>
            <groupId>org.jboss.arquillian.junit</groupId>
            <artifactId>arquillian-junit-container</artifactId>
            <scope>test</scope>
        </dependency>
       <!-- shrinkWrap resolvers -->
        <dependency>
            <groupId>org.jboss.shrinkwrap.resolver</groupId>
            <artifactId>shrinkwrap-resolver-depchain</artifactId>
            <scope>test</scope>
            <type>pom</type>
        </dependency>
	 <dependency>
            <groupId>org.jboss.spec.javax.annotation</groupId>
            <artifactId>jboss-annotations-api_1.1_spec</artifactId>
            <version>1.0.1.Final</version>
            <scope>provided</scope>
        </dependency>
        <dependency>
            <groupId>org.jboss.spec.javax.ejb</groupId>
            <artifactId>jboss-ejb-api_3.1_spec</artifactId>
            <version>1.0.2.Final</version>
            <scope>provided</scope>
        </dependency>
        <dependency>
            <groupId>org.jboss.arquillian.protocol</groupId>
            <artifactId>arquillian-protocol-servlet</artifactId>
            <scope>test</scope>
        </dependency>
		<!-- container adapter -->
	 <dependency><!-- server and mode(managed)-->
            <groupId>org.jboss.as</groupId>
            <artifactId>jboss-as-arquillian-container-managed</artifactId>
            <scope>test</scope>
        </dependency>
	  <!-- end test dependencies -->
</dependencies>

Arquillian uses the concept of a Maven BOM (bill of materials), where a dependency of type pom dictates the recommended versions (which can be overridden) of declared dependencies; this is why we didn't declare the version of most of the dependencies above. Here is the Arquillian core BOM, which must be declared in the dependencyManagement section:

   <dependencyManagement>
        <dependencies>
            <dependency>
                <groupId>org.jboss.arquillian</groupId>
                <artifactId>arquillian-bom</artifactId>
                <version>1.1.4.Final</version>
                <type>pom</type>
                <scope>import</scope>
            </dependency>
        </dependencies>
    </dependencyManagement>

Configuration file

Arquillian centralizes its configuration in arquillian.xml, which contains the container adapter and extension configuration. It must be located at src/test/resources; here is an example:

<arquillian xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
	xmlns="http://jboss.org/schema/arquillian"
	xsi:schemaLocation="
        http://jboss.org/schema/arquillian
        http://jboss.org/schema/arquillian/arquillian_1_0.xsd">

	<!-- Force the use of the Servlet 3.0 protocol with all containers, as it is the most mature -->
	<defaultProtocol type="Servlet 3.0" />

	<container qualifier="jboss-remote" >
		<configuration>
			<property name="managementAddress">127.0.0.1</property>
			<property name="managementPort">9999</property>

		</configuration>
	</container>
	<container qualifier="jboss-managed"  default="true" >
		<configuration>
		   <!-- jbossHome can be replaced by JBOSS_HOME maven environmentVariable-->
	    	   <property name="jbossHome">#{arquillian.serverHome}</property>
	           <property name="outputToConsole">true</property>
                   <property name="javaVmArguments">-Xmx512m -XX:MaxPermSize=256m -Djboss.bind.address=localhost</property>
		   <!-- makes it behave like the remote adapter if the container is already started -->
		   <property name="allowConnectingToRunningServer">true</property>
                </configuration>
	</container>
</arquillian>
   

Basically what we have here is adapter configuration; in this example the JBoss managed adapter is active by default. Note that the container activated in arquillian.xml must have its dependency present in the test classpath when running the tests, in our case jboss-as-arquillian-container-managed.

To switch between adapters you can either hardcode the default property (as above) or use an arquillian.launch file containing the name of the qualifier to be used by the tests.

To make arquillian.launch dynamic you can use a Maven property instead of a constant adapter qualifier, but a better approach is to use a system property in the Maven Surefire plugin. For example, the jboss-managed Maven profile below sets arquillian.launch to jboss-managed:

	<profile>
            <id>jboss-managed</id>
             <dependencies>
                <dependency>
                    <groupId>org.jboss.as</groupId>
                    <artifactId>jboss-as-arquillian-container-managed</artifactId>
                    <scope>test</scope>
                    <version>${jboss.version}</version>
                </dependency>
            </dependencies>
            <build>
                <plugins>
                    <plugin>
                        <groupId>org.apache.maven.plugins</groupId>
                        <artifactId>maven-surefire-plugin</artifactId>
                        <version>2.16</version>
                        <configuration>
                            <systemPropertyVariables>
                                <arquillian.launch>jboss-managed</arquillian.launch>
                            </systemPropertyVariables>
                            <environmentVariables>
                                <JBOSS_HOME>${arquillian.serverHome}</JBOSS_HOME>
                            </environmentVariables>
                        </configuration>
                    </plugin>
           </plugins>
         </build>
     </profile>

Using this approach we guarantee that the adapter dependency will be present in the classpath. Also note the JBOSS_HOME environment variable, which specifies the container location (needed by the managed adapter); we are using ${arquillian.serverHome} so it can be overridden on the Maven command line when executing on CI.

Hello Arquillian

With the Maven dependencies and arquillian.xml configured to use JBoss as the test container in managed mode, we can now create an Arquillian (integration) test.

@RunWith(Arquillian.class)
public class HelloArquillianIt {//the 'It' suffix is used to identify different types of test in pom.xml; we'll detail this in the tips section

    @Deployment
    public static WebArchive createDeployment() {
        WebArchive war = ShrinkWrap.create(WebArchive.class);
        war.addPackages(true, "org.conventions.archetype.model");//entities
        war.addClasses(RoleService.class, RoleServiceImpl.class) //we will test RoleService
        .addClasses(UserService.class, UserServiceImpl.class)//used by SecurityInterceptorImpl.java
        .addPackages(true, "org.conventions.archetype.qualifier")//@ListToUpdate, @see roleServiceImpl
        .addClass(TestService.class);
        war.addPackages(true, "org.conventions.archetype.security");//security interceptor @see beans.xml
        war.addPackages(true, "org.conventions.archetype.event");//UpdateListEvent @see RoleServiceImpl#afterStore
        war.addPackages(true, "org.conventions.archetype.util");
        //LIBS
        MavenResolverSystem resolver = Maven.resolver();
        war.addAsLibraries(resolver.loadPomFromFile("pom.xml").resolve("org.conventionsframework:conventions-core:1.1.2").withTransitivity().asFile());//conventions is an experimental framework to enable some JavaEE utilities
        war.addAsLibraries(resolver.loadPomFromFile("pom.xml").resolve("org.primefaces:primefaces:4.0").withoutTransitivity().asSingleFile());

        //WEB-INF
        war.addAsWebInfResource(new File(WEB_INF,"beans.xml"), "beans.xml");//same app beans.xml
        war.addAsWebInfResource(new File(WEB_INF,"web.xml"), "web.xml");//same app web.xml
        war.addAsWebInfResource(new File(WEB_INF,"faces-config.xml"), "faces-config.xml");

        war.addAsWebInfResource("jbossas-ds.xml", "jbossas-ds.xml");//datasource

        //resources
        war.addAsResource("test-persistence.xml", "META-INF/persistence.xml");//customized persistence.xml to use a different database 

        return war;
    }

    @Inject
    RoleService roleService;// service real instance

    @Test
    public void shouldListRolesWithSuccess(){
        assertEquals(roleService.crud().countAll(),???????????);// how many roles?
    }

}

As you can see, the most difficult part is creating the test deployment: you need to discover each component's dependencies and will usually face some ‘NoClassDefFound’ error or ‘unsatisfied dependency’ exception, but after that you can move most deployment entries to a utility class and reuse them between your tests; see Deployments.java.

Also note that we could not make the assertion in the test method because we don't know how many roles there are in the database. By the way, which database is the test using? It's using the one declared in test-persistence.xml (added to the micro-deployment), which uses a Maven property ${datasource} defined in pom.xml. For tests using JBoss or WildFly we are going to use ‘exampleDS’ as the datasource.

The test datasource uses exampleDS2 because it's good practice for tests to have their own database. As exampleDS2 is not configured in the server, we add it on demand in our micro-deployment via jbossas-ds.xml.
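For reference, a deployable *-ds.xml is just a small XML file; a hypothetical definition of exampleDS2 backed by an in-memory H2 database might look like this (the JNDI name, connection URL and credentials below are assumptions for illustration, not taken from the project):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<datasources xmlns="http://www.jboss.org/ironjacamar/schema">
    <!-- hypothetical test-only datasource, deployed together with the test war -->
    <datasource jndi-name="java:jboss/datasources/ExampleDS2" pool-name="ExampleDS2">
        <connection-url>jdbc:h2:mem:exampleDS2;DB_CLOSE_DELAY=-1</connection-url>
        <driver>h2</driver>
        <security>
            <user-name>sa</user-name>
            <password>sa</password>
        </security>
    </datasource>
</datasources>
```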

OK, we answered which database, but we still don't know how to populate the test database so we can make assertions on top of it.

One way is to initialize the database before each test using a utility class; here is TestService:

@Named
@Stateless
public class TestService implements Serializable {

    @PersistenceContext(unitName = "archetypeTestPU")
    EntityManager em;	

    @Inject
    RoleService roleService;

    public void createRoleDataset(){
        clearDatabase();

        Role roleArch = new Role("architect");
        Role roleAdmin = new Role("administrator");

        em.persist(roleAdmin);
        em.persist(roleArch);
        em.flush();

    }

    public void clearDatabase() {

        em.createNativeQuery("delete from group__role_").executeUpdate();//intermediate tables
        em.flush();
        em.createNativeQuery("delete from user__group_").executeUpdate(); //intermediate tables
        em.flush();
        em.createNativeQuery("delete from role_").executeUpdate();//role
        em.flush();
        em.createNativeQuery("delete from group_").executeUpdate();//group
        em.flush();
        em.createNativeQuery("delete from user_").executeUpdate();//user
        em.flush();
    }

}

Here is our integration test using TestService to initialize the database:

    //testService needs to be added to the deployment with .addClass(TestService.class) so it can be injected into the test.

    @Inject
    TestService testService;

    @Test
    public void shouldListRolesWithSuccess(){
        testService.createRoleDataset();
        int numRoles = roleService.crud().countAll();
        assertEquals(numRoles, 2);
    }

Running the test

To run the test from the IDE just right-click HelloArquillianIt.java and choose run/debug as JUnit test (don't forget to activate the adapter Maven profile in the IDE).

Also be careful with the JBOSS_HOME environment variable: in Eclipse it must also be set (in IntelliJ it is set automatically for you).

To run via Maven, the Surefire plugin must be configured:


	<plugin>
		<groupId>org.apache.maven.plugins</groupId>
		<artifactId>maven-surefire-plugin</artifactId>
		<version>2.16</version>
		<configuration>
			<skipTests>false</skipTests>
			<includes>
				<include>HelloArquillianIt.java</include>
			</includes>
		</configuration>
	</plugin>

 
If Surefire is defined in the main build tag in pom.xml then the tests will be executed on each build (mvn package, install or test). We will see in the tips section how to separate tests into specific Maven profiles.
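As a sketch of that idea, a hypothetical it-tests profile could bind the integration tests (the *It.java suffix) to their own Surefire configuration; the profile id and include pattern below are illustrative, not taken from the project:

```xml
<profile>
    <id>it-tests</id>
    <build>
        <plugins>
            <plugin>
                <groupId>org.apache.maven.plugins</groupId>
                <artifactId>maven-surefire-plugin</artifactId>
                <version>2.16</version>
                <configuration>
                    <includes>
                        <!-- only classes following the integration test naming convention -->
                        <include>**/*It.java</include>
                    </includes>
                </configuration>
            </plugin>
        </plugins>
    </build>
</profile>
```

Activating the profile (mvn test -Pit-tests) then runs only that slice of the test suite.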

Another way to initialize the database is to use DBUnit through the Arquillian persistence extension.

Managing test database with arquillian persistence

The persistence extension helps with writing tests where the persistence layer is involved, such as preparing a test dataset, making dataset assertions and so on. It is based on the DBUnit framework.

Dependencies

<!--arquillian persistence(dbunit) -->
		<dependency>
			<groupId>org.jboss.arquillian.extension</groupId>
			<artifactId>arquillian-persistence-api</artifactId>
			<version>1.0.0.Alpha7</version>
			<scope>test</scope>
		</dependency>
		<dependency>
			<groupId>org.jboss.arquillian.extension</groupId>
			<artifactId>arquillian-persistence-dbunit</artifactId>
			<version>1.0.0.Alpha7</version>
			<scope>test</scope>
		</dependency>

Configuration file

    <extension qualifier="persistence">
        <property name="defaultDataSource">${datasource}</property>
        <!--<property name="defaultDataSeedStrategy">CLEAN_INSERT</property>-->
    </extension>

${datasource} is set via a Maven property in pom.xml so we can have a dynamic datasource to run tests on different servers; for example, on JBoss AS tests must use the datasource java:jboss/datasources/ExampleDS.

Example

    @Test
    @UsingDataSet("role.yml")
    @Cleanup(phase = TestExecutionPhase.BEFORE)
    public void shouldListRolesUsingDataset(){
        int numRoles = roleService.crud().countAll();
        log.info("COUNT:"+numRoles);
        assertEquals(numRoles, 3);
    }

role.yml is a file located in the src/test/resources/datasets folder with the following content describing 3 roles:


role_:
  - id: 1
    name: "role"
    version: 1
    createDate: 2014-01-01
    updateDate: 2014-01-01
  - id: 2
    name: "role2"
    version: 1
    createDate: 2014-01-01
    updateDate: 2014-01-01
  - id: 3
    name: "role3"
    version: 1
    createDate: 2014-01-01
    updateDate: 2014-01-01

Other formats like JSON and XML are supported.
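For instance, a JSON version of the same dataset would look roughly like this (a sketch; consult the persistence extension documentation for the exact expected layout):

```json
{
    "role_": [
        {"id": 1, "name": "role",  "version": 1, "createDate": "2014-01-01", "updateDate": "2014-01-01"},
        {"id": 2, "name": "role2", "version": 1, "createDate": "2014-01-01", "updateDate": "2014-01-01"},
        {"id": 3, "name": "role3", "version": 1, "createDate": "2014-01-01", "updateDate": "2014-01-01"}
    ]
}
```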

It's also possible to execute scripts before/after each test with @ApplyScriptBefore/@ApplyScriptAfter and make dataset assertions with @ShouldMatchDataSet.

Some limitations:

  • hard to maintain large datasets
  • dataset values are static, but in the next version there will be a way to feed datasets with expressions
  • can degrade test performance (it looks like this was fixed for the next version, Alpha7)
  • doesn’t work with client tests (black box)

Behaviour driven tests(BDD)

BDD[4] with Arquillian is implemented in the arquillian-jbehave extension through the JBehave[5] framework, where a story file (.story) dictates the behaviour of a functionality using natural language with ‘given/when/then’ clauses, linking the story with the tests that will execute based on the described behaviour.

Dependencies


	<dependency>
		<groupId>org.jboss.arquillian.jbehave</groupId>
		<artifactId>arquillian-jbehave-core</artifactId>
		<version>1.0.2</version>
		<scope>test</scope>
	</dependency>

	<!-- although the jbehave extension works great and integrates well with most other extensions it doesn't have an official release: https://community.jboss.org/message/865003#865003 -->
        <repository>
            <id>arquillian jbehave UNofficial maven repo</id>
            <url>http://arquillian-jbehave-repo.googlecode.com/git/</url>
            <layout>default</layout>
        </repository>

Example

Here follows a BDD example; all BDD-related tests can be found here.

@RunWith(Arquillian.class)
@Steps(UserSteps.class)
public class UserBdd extends BaseBdd {

    @Deployment
    public static WebArchive createDeployment()
    {
        WebArchive archive = Deployments.getBaseDeployment() //same deployment we saw before
                .addPackage(BaseBdd.class.getPackage())
                .addClass(UserSteps.class);
        return archive;
    }

}

The superclass BaseBdd is used to configure JBehave, for example to customize the test output report and initialize the Steps, the classes that execute the test behaviour (given/when/then).

The mapping between step class and story file works in the following way:

  • Arquillian looks for the story file in the same package as the class which extends JUnitStory, in this case UserBdd, since it extends BaseBdd which in turn extends JUnitStory.
  • by convention the story file must have the same name as the class which extends JUnitStory, but with ‘_’ (underscore) separators instead of camel case; so for example the UserBdd story file is user_bdd.story. This convention can be configured in BaseBdd, or you can override the configuration() method.
  • to execute the behaviour JBehave uses one or more step classes provided in the @Steps(MyStepClass.class) annotation
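The camel case to underscore convention can be sketched in plain Java (an illustration only, not JBehave's actual resolver, which is configurable):

```java
public class StoryNameResolver {

    // Converts a JUnitStory subclass name to its story file name,
    // e.g. "UserBdd" -> "user_bdd.story"
    public static String storyFileFor(String className) {
        StringBuilder sb = new StringBuilder();
        for (int i = 0; i < className.length(); i++) {
            char c = className.charAt(i);
            if (Character.isUpperCase(c)) {
                if (i > 0) {
                    sb.append('_'); // word boundary before each capital
                }
                sb.append(Character.toLowerCase(c));
            } else {
                sb.append(c);
            }
        }
        return sb.append(".story").toString();
    }

    public static void main(String[] args) {
        System.out.println(storyFileFor("UserBdd")); // user_bdd.story
    }
}
```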
Story: manage users

Scenario: listing users by role

When i search users with role [name]

Then users found is equal to [total]

Examples:
|name|total|
|developer|2|
|administrator|1|
|secret|0|

//other scenarios related to user

And the step class that executes the behaviour, UserSteps.java:

public class UserSteps extends BaseStep implements Serializable {

    private Integer totalUsersFound;

    private String message;

    @BeforeStory
    public void setUpStory(){
	//initialize story
        testService.initDatabaseWithUserAndGroups();
    }

    @BeforeScenario
    public void setUpScenario(){
        //initialize scenario
    }

    @When("i search users with role $name")
    public void searchUserByRole(@Named("name") String roleName){
           totalUsersFound = userService.findUserByRole(new Role(roleName)).size();
    }

    @Then("users found is equal to $total")
    public void usersFoundEqualTo(@Named("total") Integer total){
        assertEquals(totalUsersFound,total);
    }
}

The matching between story and test (step) is done by exact string comparison, but you can use the @Alias annotation to match multiple strings for a step.
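Conceptually, the annotated step text becomes a pattern where each $placeholder captures a value from the story line; here is a rough plain Java illustration of the idea (not JBehave's actual implementation):

```java
import java.util.regex.Matcher;
import java.util.regex.Pattern;

public class StepMatcher {

    // Turns "i search users with role $name" into a regex with a capture
    // group per $placeholder, then extracts the value from a story line.
    // Step texts with regex metacharacters are out of scope for this sketch.
    public static String extractParam(String stepText, String storyLine) {
        String regex = stepText.replaceAll("\\$\\w+", "(.+)");
        Matcher m = Pattern.compile(regex).matcher(storyLine);
        return m.matches() ? m.group(1) : null;
    }

    public static void main(String[] args) {
        String param = extractParam("i search users with role $name",
                                    "i search users with role developer");
        System.out.println(param); // developer
    }
}
```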

For more information about JBehave see its (great) documentation at jbehave.org.

JBehave enables acceptance tests, which can be white box (sometimes called system acceptance tests, usually defined by the development team because they cover internal system logic) or black box (called user acceptance tests, usually written by or with the client/end user). We will cover black box BDD tests later.

To execute BDD tests in our app use:

mvn clean test -Pwildfly-managed -Pbdd-tests

Functional Tests

The tests we have seen until now were white box, in other words executed in the same process (JVM) where the container runs, so we have direct access (through injection) to the objects deployed by the micro-deployment. Functional tests are black box and run “from outside” the container in a different JVM, simulating a client accessing the system through the user interface.

Arquillian enables functional tests through the Drone and Graphene extensions. Both work on top of Selenium: the first manages WebDriver lifecycle and injection, and the second extends Selenium functionality with PageObjects[6], jQuery selectors, page fragments, simplified waits and so on.

Dependencies

		<dependency>
			<groupId>org.jboss.arquillian.graphene</groupId>
			<artifactId>graphene-webdriver</artifactId>
			<type>pom</type>
			<scope>test</scope>
			<version>${version.graphene}</version>
		</dependency>

<dependencyManagement>
		<dependencies>
			<dependency>
				<groupId>org.jboss.arquillian.selenium</groupId>
				<artifactId>selenium-bom</artifactId>
				<version>${version.selenium}</version>
				<type>pom</type>
				<scope>import</scope>
			</dependency>
			<dependency>
				<groupId>org.jboss.arquillian.extension</groupId>
				<artifactId>arquillian-drone-bom</artifactId>
				<version>${version.drone}</version>
				<type>pom</type>
				<scope>import</scope>
			</dependency>
		</dependencies>
</dependencyManagement>

Note that the extension uses two BOMs: the Selenium BOM updates the WebDriver version regardless of the WebDriver version that comes with the Drone BOM, but it needs to be declared first because, in case of identical dependencies (one from each BOM), the first declaration takes precedence in the dependencyManagement section.

Configuration file

<extension qualifier="graphene">
	   <property name="waitGuiInterval">3</property>
	   <property name="waitAjaxInterval">4</property>
	   <property name="waitModelInterval">5</property>

	</extension>
	 <extension qualifier="webdriver">
<!--         <property name="browser">firefox</property> -->
        <property name="browser">${arquillian.browser}</property>
        <property name="dimensions">1280x1024</property>
        <property name="remoteReusable">${arquillian.remoteReusable}</property>
 		<property name="remote">${arquillian.remote}</property>
        <property name="remoteAddress">${arquillian.seleniumGrid}</property>
        <property name="chromeDriverBinary">${arquillian.chromeDriver}</property>
        <!-- property to run with chrome locally; download the driver at http://code.google.com/p/chromedriver/downloads/list -->
    </extension>

For more configuration details see the Drone and Graphene documentation.

Example

Here follows a functional testing example; all the functional-test-related source can be found here.

The first step is the micro-deployment:

    @Deployment(testable = false)
    public static Archive<?> createDeployment() {
        WebArchive war = Deployments.getBaseDeployment()
        .addPackages(true, UserMBean.class.getPackage()) //managed beans
        .addPackages(true,"org.conventions.archetype.converter");//faces converters

        //web resources (pages, js, css etc...)
        war.merge(ShrinkWrap.create(GenericArchive.class).as(ExplodedImporter.class).importDirectory(WEBAPP_SRC).as(GenericArchive.class), "/", Filters.include(".*\\.(xhtml|html|css|js|png)$"));
        war.addAsWebResource(new File(TEST_RESOURCES, "/pages/test-logon.xhtml"), "/templates/logon.xhtml");//test logon initialize database for black box tests on each logon, we will detail it later in TIPS section
        System.out.println(war.toString(true));
        return war;
    }

The main difference is the property testable = false, which indicates that the test will run as client (separate JVM). This is necessary for Drone to get into action because it executes as a separate process.

We also need to add managed beans (the bean package), converters and web-related resources
such as pages and js files to the test deployment. That is done via ShrinkWrap Filters, which accept regular expressions, keeping this task simple and generic; but be careful, because you are adding all files of a given type (e.g. .js), and if the application is big this can slow down the deployment process.

Here is logon example:

    @Drone
    WebDriver browser;

    @Test
    @InSequence(1)
    public void shouldLogonWithSuccess(@InitialPage HomePage home){
        assertTrue(home.getLogonDialog().isPresent());
        home.getLogonDialog().doLogon("admin", "admin");
        home.verifyMessage(resourceBundle.getString("logon.info.successful"));//asserts primefaces growl message
    }

Observations

– WebDriver is injected by Drone, and it knows which browser to use through the arquillian.xml property ${arquillian.browser}, which is defined in pom.xml.
– @InitialPage denotes the page that WebDriver must navigate to before executing the test.
– HomePage is a PageObject[6]; here is its code:

@Location("home.faces")
public class HomePage extends BasePage{

    @FindByJQuery("div[id$=logonDialog]")
    private LogonDialog logonDialog;

    public LogonDialog getLogonDialog() {
        return logonDialog;
    }
}

Also note:
– the @Location annotation helps Drone navigate via @InitialPage
– BasePage has some utilities for page objects
– @FindByJQuery is an extension of Selenium's @FindBy selector and is based on jQuery selectors
– LogonDialog is a page fragment; here is its code:

public class LogonDialog {

    @Root
    private GrapheneElement dialog;

    @FindByJQuery("input[id$=inptUser]")
    private GrapheneElement username;

    @FindByJQuery("input[id$=inptPass]")
    private GrapheneElement password;

    @FindByJQuery("button[id$=btLogon]")
    private GrapheneElement btLogon;

    public void doLogon(String username, String password){
        this.username.clear();
        this.username.sendKeys(username);
        this.password.clear();
        this.password.sendKeys(password);
        guardHttp(btLogon).click();
    }

    public boolean isPresent(){
        return this.username.isPresent();
    }
}

About the LogonDialog fragment:

– GrapheneElement is an extension of Selenium's WebElement
– guardHttp is a Graphene implicit wait that blocks the test until the HTTP request finishes; there is also an ajax guard which is very useful
– there are also explicit waits like Graphene.waitModel(), whose timeouts are configured in arquillian.xml
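Conceptually, an explicit wait is just a polling loop with a timeout; here is a minimal plain Java sketch of the idea (Graphene's real implementation is much richer, with configurable intervals and element-aware conditions):

```java
import java.util.function.BooleanSupplier;

public class WaitSketch {

    // Polls the condition until it returns true or the timeout expires,
    // instead of sleeping for a fixed amount of time.
    public static boolean waitUntil(BooleanSupplier condition, long timeoutMillis)
            throws InterruptedException {
        long deadline = System.currentTimeMillis() + timeoutMillis;
        while (System.currentTimeMillis() < deadline) {
            if (condition.getAsBoolean()) {
                return true;
            }
            Thread.sleep(50); // polling interval
        }
        return condition.getAsBoolean(); // last chance after the deadline
    }

    public static void main(String[] args) throws InterruptedException {
        long start = System.currentTimeMillis();
        // condition becomes true after ~200ms
        boolean ok = waitUntil(() -> System.currentTimeMillis() - start > 200, 1000);
        System.out.println(ok);
    }
}
```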

To execute functional tests in our app use:

mvn clean test -Pwildfly-managed -Pft-tests

User Acceptance tests

In this article I will call user acceptance tests[7] the combination of functional tests with JBehave, in other words black box behaviour-driven tests. It's important to separate the different types of tests because you will probably want to execute them at different moments, e.g. faster (white box) tests on each commit and slower, resource-consuming ones at the end of the day.

As this kind of test is a combination of the other kinds, we won't have any specific configuration in arquillian.xml nor a dependency in the pom.

Example

@Steps({ RoleStep.class, LogonStep.class })
@RunWith(Arquillian.class)
public class RoleAt extends BaseAt {

    @Deployment(testable = false)
    public static WebArchive createDeployment()
    {
        WebArchive archive = createBaseDeployment();

        System.out.println(archive.toString(true));
        return archive;
    }

}

BaseAt combines elements of the functional tests (testable = false) and BDD tests we saw earlier.

role_at.story (remember the naming convention):


Story: manage user roles

GivenStories: org/conventions/archetype/test/at/logon/logon_at.story

Scenario: insert new role

Given user go to role home

When user clicks in new button

Then should insert role with name new role

Scenario: search roles

Given user go to role home

When user filter role by name [name]

Then should list only roles with name [name]

And return [total] rows

Examples:
|name|total|
|developer|1|
|admin|1|
|a|2|

public class RoleStep implements Serializable {

    @Drone
    private WebDriver browser;

    @ArquillianResource
    private URL baseUrl;

    @Page
    private RoleHome roleHome;

    @FindByJQuery("div[id$=menuBar]")
    private Menu menu;

    @Given("user go to role home")
    public void goToRoleHome() {
        menu.gotoRoleHome();
    }

    @When("user clicks in new button")
    public void userClickInNewButton() {
        WebElement footer = roleHome.getFooter();
        assertTrue(footer.isDisplayed());
        WebElement newButton = footer.findElement(By.xpath("//button"));
        assertTrue(newButton.isDisplayed());
        guardHttp(newButton).click();
    }

    @Then("should insert role with name $name")
    public void shouldInsertRole(String name) {
        Assert.assertTrue(roleHome.isFormPage());
        roleHome.insertRole(name);

    }

    @When("user filter role by name $name")
    public void userFilterRolesBy(@Named("name") String name) {
        roleHome.filterByName(name);
    }

    @Then("should list only roles with name $name")
    public void shouldListRolesWith(@Named("name") String name) {
        for (WebElement row : roleHome.getTableRows("table")) {
            assertTrue(row.findElement(By.xpath("//td[@role='gridcell']//span[contains(@id,'name')]")).getText().contains(name));
        }
    }

    @Then("return $total rows")
    public void shouldReturn(@Named("total") Integer total){
         assertEquals(roleHome.getTableRows("table").size(), total.intValue());
    }

}

RoleHome is a page object that controls the role form (roleHome.xhtml); its source can be found here.

logon_at.story

 
Story: logon as user

Scenario: user should logon successfully

Given i am at logon screen

When i logon providing credentials admin, admin

Then i should be logged in

public class LogonStep extends BaseAtStep implements Serializable {

  @Drone
  private WebDriver       browser;

  @ArquillianResource
  private URL             baseUrl;

  @FindByJQuery("div[id$=menuBar]")
  private Menu menu;

  @Page
  private HomePage home;

  @Given("i am at logon screen")
  public void imAtLogon() {
    if (!home.getLogonDialog().isPresent()) {
      //if is already logged in, do logout
        super.goToPage(home);
      }
  }

  @When("i logon providing credentials $username, $password")
  public void loginWithCredentials(String username, String password) {
    home.getLogonDialog().doLogon(username,password);
  }

  @Then("i should be logged in")
  public void shouldBeAt() {
    home.verifyMessage(resourceBundle.getString("logon.info.successful"));
  }

}

To execute user acceptance tests in our example project use this maven command:

mvn clean test -Pwildfly-managed -Pat-tests

Gray box tests with Warp

With gray box testing[3] we can fire a request as client (e.g. via WebDriver) and inspect internal objects (as in white box tests) such as the HTTP session, FacesContext and so on.

Warp tests of our example app can be found here.

Dependencies

	<dependency>
		<groupId>org.jboss.arquillian.extension</groupId>
		<artifactId>arquillian-warp</artifactId>
		<type>pom</type>
		<scope>test</scope>
		<version>1.0.0.Alpha7</version>
	</dependency>
	<dependency>
		<groupId>org.jboss.arquillian.extension</groupId>
		<artifactId>arquillian-warp-jsf</artifactId>
		<version>1.0.0.Alpha7</version>
	</dependency>

Example


@RunWith(Arquillian.class)
@WarpTest
@RunAsClient
public class LogonWarp {

    protected static final String WEBAPP_SRC = "src/main/webapp";

    protected static final String TEST_RESOURCES = "src/test/resources";

    @Drone
    protected WebDriver browser;

    @ArquillianResource
    protected URL baseUrl;

    @Deployment(testable = true)
    public static Archive<?> createDeployment() {
        WebArchive war = Deployments.getBaseDeployment()
                .addPackages(true, UserMBean.class.getPackage()) //managed beans
                .addPackages(true,"org.conventions.archetype.converter");

        //web resources
        war.merge(ShrinkWrap.create(GenericArchive.class).as(ExplodedImporter.class).importDirectory(WEBAPP_SRC).as(GenericArchive.class), "/", Filters.include(".*\\.(xhtml|html|css|js|png)$"));
        war.addAsWebResource(new File(TEST_RESOURCES, "/pages/test-logon.xhtml"), "/templates/logon.xhtml");//test logon clears the database on each logon
        System.out.println(war.toString(true));
        return war;
    }

        @Test
        @InSequence(1)
        public void shouldLogonWithSuccess(@InitialPage final HomePage home){
            assertTrue(home.getLogonDialog().isPresent());
            Warp.initiate(new Activity() {

                @Override
                public void perform() {
                    home.getLogonDialog().doLogon("admin", "admin");

                }
            })
                    .observe(request().method().equal(HttpMethod.POST).uri().contains("home.faces"))
                    .inspect(new Inspection() {
                        private static final long serialVersionUID = 1L;

                        @Inject
                        SecurityContext securityContext;

                        @Inject
                        ResourceBundle resourceBundle;

                        @BeforePhase(Phase.INVOKE_APPLICATION)
                        public void shouldNotBeLoggedIn() {
                            System.out.println("shouldNotBeLoggedIn:"+securityContext.loggedIn());
                            assertFalse(securityContext.loggedIn());
                        }

                        @AfterPhase(Phase.INVOKE_APPLICATION)
                        public void shouldBeLoggedIn(@ArquillianResource FacesContext context) {
                            System.out.println("shouldBeLoggedIn:"+securityContext.loggedIn());
                            assertTrue(securityContext.loggedIn());
                            boolean loggedInMessage = false;
                            for (FacesMessage facesMessage : context.getMessageList()) {
                                  if(facesMessage.getSummary().equals(resourceBundle.getString("logon.info.successful"))){
                                      loggedInMessage = true;
                                  }
                            }
                            assertTrue(loggedInMessage);
                        }

                    });
        }
}

Gray box tests mix elements from black and white box tests. This is achieved with arquillian mixed mode, where we have a @Deployment(testable = true), which denotes white box testing, and at the same time we use @RunAsClient for the black box part.

The black box part is represented by the Activity interface, whose perform method is responsible for firing client requests.

The white part (server side execution) is represented by the Inspection interface; within an inspection you can do anything you would do in white box testing, such as accessing CDI beans through injection.

Also note that you can observe specific requests (an Activity may start multiple requests) with the observe method.

In our example we first log on in the application via the user interface with Drone/Graphene, and later we access an internal system object, in this case SecurityContext, to check whether the user is in fact logged in.

Tips and recommendations

Notation

I’m using the following suffix convention to differentiate tests:

it: Integration (white box) tests (eg: UserIt.java)
ft: Functional (black box) tests (eg: UserFt.java)
bdd: white box (system) acceptance tests (eg: UserBdd.java)
at: black box (user) acceptance tests (eg: UserAt.java)
warp: gray box tests (eg: LogonWarp.java)

This way we ease test profile management.
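To make the convention concrete, here is a small illustrative sketch (plain Java, not project code; the helper and category labels are my own) that classifies a test class name by its suffix, the same way the surefire include patterns below do:

```java
// Illustrative sketch: classify test classes by the suffix convention above.
// This helper is hypothetical, not part of the example project.
public class TestSuffixDemo {

    static String categoryOf(String className) {
        if (className.endsWith("It")) return "integration (white box)";
        if (className.endsWith("Ft")) return "functional (black box)";
        if (className.endsWith("Bdd")) return "white box acceptance";
        if (className.endsWith("At")) return "black box acceptance";
        if (className.endsWith("Warp")) return "gray box";
        return "unit";
    }

    public static void main(String[] args) {
        System.out.println(categoryOf("UserIt"));    // integration (white box)
        System.out.println(categoryOf("LogonWarp")); // gray box
    }
}
```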

Test profiles

Test profiles are used to separate different types of tests so you can run them at different moments (eg: run lightweight tests more frequently or earlier).

With the suffix notation we can separate tests in the surefire plugin as follows:

		<profile>
			<!-- all tests -->
			<id>all-tests</id>
			<build>
				<plugins>
					<plugin>
						<groupId>org.apache.maven.plugins</groupId>
						<artifactId>maven-surefire-plugin</artifactId>
						<version>2.16</version>
						<configuration>
							<skipTests>false</skipTests>
							<runOrder>reversealphabetical</runOrder>
							<includes>
								<include>**/*Ft.java</include><!--functional -->
								<include>**/*It.java</include><!--integration -->
								<include>**/*UnitTests.java</include><!--unit -->
								<include>**/*Bdd.java</include><!--white acceptance -->
								<include>**/*At.java</include><!--black acceptance -->
								<include>**/*Warp.java</include><!--gray box -->
							</includes>
							<excludes>
<!-- avoid execution of test superclasses -->
								<exclude>**/*BaseFt.java</exclude>
								<exclude>**/*BaseBdd.java</exclude>
								<exclude>**/*BaseIt.java</exclude>
								<exclude>**/*BaseAt.java</exclude>
								<exclude>**/*BaseWarp.java</exclude>
							</excludes>

						</configuration>
					</plugin>
				</plugins>
			</build>
		</profile>

		<profile>
			<!-- only integration tests and bdd(white box) -->
			<id>it-tests</id>
			<build>
				<plugins>
					<plugin>
						<groupId>org.apache.maven.plugins</groupId>
						<artifactId>maven-surefire-plugin</artifactId>
						<version>2.16</version>
						<configuration>
							<skipTests>false</skipTests>
							<runOrder>reversealphabetical</runOrder>
							<includes>
								<include>**/*It.java</include>
								<include>**/*Bdd.java</include>
							</includes>
							<excludes>
								<exclude>**/*BaseBdd.java</exclude>
								<exclude>**/*BaseIt.java</exclude>
							</excludes>

						</configuration>
					</plugin>
				</plugins>
			</build>
		</profile>

		<profile>
			<!-- only functional and user acceptance tests (black box) -->
			<id>ft-tests</id>
			<build>
				<plugins>
					<plugin>
						<groupId>org.apache.maven.plugins</groupId>
						<artifactId>maven-surefire-plugin</artifactId>
						<version>2.16</version>
						<configuration>
							<skipTests>false</skipTests>
							<runOrder>reversealphabetical</runOrder>
							<includes>
								<include>**/*Ft.java</include>
								<include>**/*At.java</include>
							</includes>
							<excludes>
								<exclude>**/*BaseFt.java</exclude>
								<exclude>**/*BaseAt.java</exclude>
							</excludes>
						</configuration>
					</plugin>
				</plugins>
			</build>
		</profile>

Container profiles(adapters)

We talked about three execution modes to run arquillian tests inside a container. To switch between adapters easily we can define maven profiles:

	<profile>
			<id>jboss-remote</id>
			<dependencies>
				<dependency>
					<groupId>org.jboss.as</groupId>
					<artifactId>jboss-as-arquillian-container-remote</artifactId>
					<scope>test</scope>
					<version>7.2.0.Final</version>
				</dependency>
			</dependencies>
                        <build>
				<plugins>
					<plugin>
						<groupId>org.apache.maven.plugins</groupId>
						<artifactId>maven-surefire-plugin</artifactId>
						<version>2.16</version>
						<configuration>
							<systemPropertyVariables>
								<arquillian.launch>jboss-remote</arquillian.launch>
							</systemPropertyVariables>

						</configuration>
					</plugin>
				</plugins>
			</build>
		</profile>

		<profile>
			<id>jboss-managed</id>
			<activation>
				<activeByDefault>true</activeByDefault>
			</activation>
			<properties>
				<arquillian.serverHome>/home/jboss-eap-6.2</arquillian.serverHome>
                        </properties>
			<dependencies>
				<dependency>
					<groupId>org.jboss.as</groupId>
					<artifactId>jboss-as-arquillian-container-managed</artifactId>
					<scope>test</scope>
					<version>7.2.0.Final</version>
				</dependency>
			</dependencies>

			<build>
				<plugins>
					<plugin>
						<groupId>org.apache.maven.plugins</groupId>
						<artifactId>maven-surefire-plugin</artifactId>
						<version>2.16</version>
						<configuration>
							<systemPropertyVariables>
								<arquillian.launch>jboss-managed</arquillian.launch>
							</systemPropertyVariables>
							<environmentVariables>
								<JBOSS_HOME>${arquillian.serverHome}</JBOSS_HOME>
							</environmentVariables>
						</configuration>
					</plugin>
				</plugins>
			</build>
		</profile>

Note that in this case we are using surefire only to define properties like:

– JBOSS_HOME provides the container path (in the managed case). We hold the container path in the
arquillian.serverHome maven property so it can be overridden by CI (Jenkins) later.
– arquillian.launch informs which adapter must be activated in arquillian.xml

You can also use the container maven dependency to download the container automatically during tests, as follows:

  	<profile>
            <id>wildfly-managed</id>
            <properties>
                <arquillian.serverHome>${project.build.directory}/wildfly-${wildfly.version}</arquillian.serverHome>
            </properties>
            <dependencies>
                <dependency>
                    <groupId>org.wildfly</groupId>
                    <artifactId>wildfly-arquillian-container-managed</artifactId>
                    <version>${wildfly.version}</version>
                    <scope>test</scope>
                </dependency>
            </dependencies>
            <build>
                <plugins>
                    <plugin>
                        <groupId>org.apache.maven.plugins</groupId>
                        <artifactId>maven-dependency-plugin</artifactId>
                        <executions>
                            <execution>
                                <id>unpack</id>
                                <phase>process-test-classes</phase>
                                <goals>
                                    <goal>unpack</goal>
                                </goals>
                                <configuration>
                                    <artifactItems>
                                        <artifactItem>
                                            <groupId>org.wildfly</groupId>
                                            <artifactId>wildfly-dist</artifactId>
                                            <version>${wildfly.version}</version>
                                            <type>zip</type>
                                            <overWrite>false</overWrite>
                                            <outputDirectory>${project.build.directory}</outputDirectory>
                                        </artifactItem>
                                    </artifactItems>
                                </configuration>
                            </execution>
                        </executions>
                    </plugin>
                    <plugin>
                        <groupId>org.apache.maven.plugins</groupId>
                        <artifactId>maven-surefire-plugin</artifactId>
                        <version>2.16</version>
                        <configuration>
                            <systemPropertyVariables>
                                <arquillian.launch>jboss-managed</arquillian.launch>
                            </systemPropertyVariables>
                            <environmentVariables>
                                <JBOSS_HOME>${arquillian.serverHome}</JBOSS_HOME>
                            </environmentVariables>
                        </configuration>
                    </plugin>
                </plugins>
            </build>
	</profile>

The maven dependency plugin downloads the container (if it is not in the local maven repo) and unpacks it inside the target directory. The surefire plugin uses the
arquillian.serverHome property, which points to the server inside the target dir.

Faster test execution

When we are developing tests we tend to execute them a lot, which can be very time consuming; imagine executing them inside a container which needs to be started and stopped. One way to minimize that is to use the remote container adapter: the developer starts the container once and then executes tests on top of this running container, so the only time consuming task (aside from the test itself) is the micro-deployment deploy/undeploy. Here is a remote adapter maven profile:

	<profile>
			<id>jboss-remote</id>

			<dependencies>
				<dependency>
					<groupId>org.jboss.as</groupId>
					<artifactId>jboss-as-arquillian-container-remote</artifactId>
					<scope>test</scope>
					<version>7.2.0.Final</version>
				</dependency>
			</dependencies>
            <build>
                <plugins>
                    <plugin>
                        <groupId>org.apache.maven.plugins</groupId>
                        <artifactId>maven-surefire-plugin</artifactId>
                        <version>2.16</version>
                        <configuration>
                            <systemPropertyVariables>
                                <arquillian.launch>jboss-remote</arquillian.launch>
                            </systemPropertyVariables>
                        </configuration>
                    </plugin>
                </plugins>
            </build>
		</profile>

The system property arquillian.launch selects the container to be activated in arquillian.xml:

<container qualifier="jboss-remote" >
		<configuration>
			<property name="managementAddress">127.0.0.1</property>
			<property name="managementPort">9999</property>
		</configuration>
	</container>

OBS: As you usually will deploy a datasource within your test deployment, be careful with the infamous DuplicateServiceException.

Avoiding multiple deployments

In white box tests it is possible to avoid multiple deployments (@Deployment) through test dependency injection. To achieve that you just need to separate your test classes and inject them in a common class which holds the arquillian deployment; see ArchetypeIt.java, where we inject UserIt and RoleIt to leverage the ArchetypeIt deployment.

Manipulating test database in functional tests

As we don’t have access to deployed classes during black box tests, we can’t use our TestService (nor arquillian persistence) to initialize/prepare the database for tests. So what we do is deploy a customized logon page: each time a login is done we init the database. This is done in BaseFt as follows:

war.addAsWebResource(new File(TEST_RESOURCES, "/pages/test-logon.xhtml"), "/templates/logon.xhtml");

//test logon clears and initializes test database on each logon

The customized logon fires the initialization method via a primefaces remote command each time the logon dialog is opened; it is the same application logon dialog with the additional code shown below:

 <h:form id="loginForm">
            <p:remoteCommand name="init" rendered="#{not loggedIn}" autoRun="true" immediate="true" process="@this" update="@none" partialSubmit="true" actionListener="#{testService.initDatabaseWithUserAndGroups}"/>
</h:form>

This way we guarantee a specific dataset for the tests.

Rest testing

For testing restful web services we run arquillian as a client (black box) and deploy our Rest endpoint in the micro-deployment.

When running as client we can inject the application url (which is dynamically generated by arquillian) in tests via @ArquillianResource; that’s all we need to fire rest requests as a client.
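As an illustration, here is a minimal sketch (plain JDK; the context path and the "rest/users" endpoint are hypothetical, not necessarily the sample app's real ones) of how an injected base URL can be resolved into a rest endpoint address:

```java
import java.net.URI;

// Minimal sketch: derive a rest endpoint address from the base URL that
// Arquillian injects via @ArquillianResource.
public class RestUrlDemo {

    static URI endpoint(URI baseUrl, String path) {
        // the injected base URL ends with "/", so resolving a relative
        // path appends it to the application context root
        return baseUrl.resolve(path);
    }

    public static void main(String[] args) {
        URI base = URI.create("http://localhost:8080/archetype-test/");
        System.out.println(endpoint(base, "rest/users"));
        // -> http://localhost:8080/archetype-test/rest/users
    }
}
```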

Rest test in our example app can be found here.

Also note that we are deploying a RestDataset class which is responsible for initializing our database before the tests.

For more advanced restful tests like testing endpoints from the server side refer to rest extension.

Running tests in CI server(Jenkins)

Running tests in Jenkins is the same as locally; the only problem you may face is with functional tests, because CI servers usually don’t have a graphical interface and so can’t open a web browser to run the tests. A way to solve that is to run functional tests remotely on a machine with a
graphics card and a selenium grid up and running. If you have that, you just need to pass some maven parameters to your test command to enable remote functional tests:

mvn clean test -Pjboss-managed -Darquillian.remote=true -Darquillian.seleniumGrid=http://remote-machine-ip:1044/wd/hub -Darquillian.localAddress=jenkins-ip

Note that:

arquillian.remote, arquillian.seleniumGrid and arquillian.localAddress are properties defined in arquillian.xml, used by the webdriver extension to enable remote functional testing.

http://remote-machine-ip:1044/wd/hub is a selenium grid waiting for connections at port 1044.

OBS: you can encapsulate the above command in a maven profile:

<profile>
	<id>jenkins</id>
	<properties><!-- profile default properties, can be overridden with -Dproperty-name=value -->
		<arquillian.remote>true</arquillian.remote>
		<arquillian.serverHome>/opt/jboss-eap</arquillian.serverHome>
		<arquillian.seleniumGrid>http://remote-machine-ip:1044/wd/hub</arquillian.seleniumGrid>
		<arquillian.localAddress>jenkins-ip</arquillian.localAddress>
	</properties>
</profile>

Another important aspect when running tests in continuous integration is where the container lives. For that you can either have a container installation on the CI machine and use jboss managed pointing to that installation, or use the container as a maven dependency as we explained here.

Yet another addendum: when you have multiple projects using arquillian tests and these tests run concurrently on CI, you may have conflicts because arquillian will try to start multiple containers on the same machine. One way to solve that on JBossAS is using a port offset in arquillian.xml, as shown below:

	<container qualifier="jboss-managed"  default="true" >
	   <configuration>
	     <property name="jbossHome">${arquillian.jbossHome}</property>
	     <property name="outputToConsole">true</property>
            <property name="javaVmArguments">-Xmx512m -XX:MaxPermSize=512m  -Djboss.bind.address=localhost
                -Djboss.socket.binding.port-offset=100
                -Djboss.management.native.port=9054            
            </property>
	    <property name="allowConnectingToRunningServer">false</property>
            <property name="managementPort">9154</property>
        </configuration>
	</container>

The parameters -Djboss.socket.binding.port-offset=100 and -Djboss.management.native.port=9054 work in conjunction with the managementPort property: 9054 + 100 = 9154.

You just need to configure different offsets in your applications so they can run concurrently on the CI server. For more details see [11].
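To make the port arithmetic explicit, here is a tiny sketch (plain Java, values taken from the configuration above; the helper itself is mine, not part of JBoss):

```java
// Sketch of the port arithmetic behind the configuration above: the socket
// binding offset is added to each configured port, so the adapter's
// managementPort must be the configured native port plus the offset.
public class PortOffsetDemo {

    static int effectivePort(int configuredPort, int offset) {
        return configuredPort + offset;
    }

    public static void main(String[] args) {
        int nativePort = 9054; // -Djboss.management.native.port
        int offset = 100;      // -Djboss.socket.binding.port-offset
        System.out.println(effectivePort(nativePort, offset)); // 9154, the managementPort above
    }
}
```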

DuplicateServiceException

A problem you may face from time to time is a DuplicateServiceException during test deployment. It usually happens due to the dynamic datasource inclusion inside the test deployment: Arquillian generates an entry in standalone.xml (in case of jbossas/wildfly) for each test deployment and undeploys it after test execution, but sometimes it can’t undeploy it, so the next time you execute the test the datasource (added by the micro deployment) will be registered again, resulting in the exception below:

Caused by: org.jboss.msc.service.DuplicateServiceException: Service jboss.data-source.java:jboss/datasources/ArchetypeTestDS is already registered

When that happens you must remove the deployment entry in standalone.xml in order to run the tests again.

Note that this exception won’t happen when using the adapter as a maven dependency, as we saw in the container profiles section.

Publishing test metrics

The generation of test metrics like % of test success/fails, tests executed and coverage is done by arquillian-jacoco extension:

JaCoCo works through byte code instrumentation and is activated by a maven plugin:

		<profile>
			<id>jacoco</id>
			<properties>
				<jacoco.version>0.7.0.201403182114</jacoco.version>
			</properties>
			<dependencies>
				<dependency>
					<groupId>org.jboss.arquillian.extension</groupId>
					<artifactId>arquillian-jacoco</artifactId>
					<scope>test</scope>
					<version>1.0.0.Alpha6</version>
				</dependency>
				<dependency>
					<groupId>org.jacoco</groupId>
					<artifactId>org.jacoco.core</artifactId>
					<scope>test</scope>
					<version>${jacoco.version}</version>
				</dependency>
			</dependencies>
			<build>
				<plugins>
					<plugin>
						<groupId>org.jacoco</groupId>
						<artifactId>jacoco-maven-plugin</artifactId>
						<version>${jacoco.version}</version>
						<executions>
							<execution>
								<goals>
									<goal>prepare-agent</goal>
								</goals>
							</execution>
							<execution>
								<id>report</id>
								<phase>prepare-package</phase>
								<goals>
									<goal>report</goal>
								</goals>
							</execution>
						</executions>
					</plugin>
				</plugins>
			</build>
		</profile>

It is also necessary to pass the following properties in pom.xml (or use the -D option in the maven command) so sonar can read the JaCoCo reports:

   <properties>
		<sonar.core.codeCoveragePlugin>jacoco</sonar.core.codeCoveragePlugin>
		<sonar.dynamicAnalysis>reuseReports</sonar.dynamicAnalysis>
   </properties>

then you can use the command:

mvn clean install -Pjboss-managed -Pall-tests -Pjacoco

Note that you need to use ‘install’ so the report is generated in the target folder, and also note that the coverage report will only take into account white box tests.

Arquillian all dependencies

Arquillian All is an all-in-one maven dependency for arquillian. Its main objective is to help beginners set up arquillian dependencies; if you have good knowledge of the arquillian platform then preferably declare each dependency you need, so you can leverage arquillian modularity. Here is a before-after arquillian-all pom.xml:

Before:

<dependencies>
        <!-- Test dependencies -->
       <dependency>
            <groupId>junit</groupId>
            <artifactId>junit</artifactId>
            <version>4.11</version>
            <scope>test</scope>
        </dependency>
 
        <!-- arquillian -->
        <dependency>
            <groupId>org.jboss.arquillian.junit</groupId>
            <artifactId>arquillian-junit-container</artifactId>
            <scope>test</scope>
        </dependency>
 
        <!--arquillian persistence(dbunit) -->
        <dependency>
            <groupId>org.jboss.arquillian.extension</groupId>
            <artifactId>arquillian-persistence-api</artifactId>
            <version>1.0.0.Alpha7</version>
            <scope>test</scope>
        </dependency>
        <dependency>
            <groupId>org.jboss.arquillian.extension</groupId>
            <artifactId>arquillian-persistence-dbunit</artifactId>
            <version>1.0.0.Alpha7</version>
            <scope>test</scope>
        </dependency>
 
        <!-- warp -->
 
        <dependency>
            <groupId>org.jboss.arquillian.extension</groupId>
            <artifactId>arquillian-warp</artifactId>
            <type>pom</type>
            <scope>test</scope>
            <version>${warp.version}</version>
        </dependency>
        <dependency>
            <groupId>org.jboss.arquillian.extension</groupId>
            <artifactId>arquillian-warp-jsf</artifactId>
            <version>${warp.version}</version>
 
        </dependency>
 
        <!-- shrinkWrap resolvers -->
        <dependency>
            <groupId>org.jboss.shrinkwrap.resolver</groupId>
            <artifactId>shrinkwrap-resolver-depchain</artifactId>
            <scope>test</scope>
            <type>pom</type>
        </dependency>
 
        <dependency>
            <groupId>org.jboss.arquillian.graphene</groupId>
            <artifactId>graphene-webdriver</artifactId>
            <type>pom</type>
            <scope>test</scope>
            <version>${version.graphene}</version>
        </dependency>
 
          <dependency>
             <groupId>org.jboss.arquillian.graphene</groupId>
             <artifactId>arquillian-browser-screenshooter</artifactId>
             <version>2.1.0.Alpha1</version>
             <scope>test</scope>
          </dependency>
 
         <!-- REST -->
 
         <dependency>
             <groupId>org.jboss.arquillian.extension</groupId>
             <artifactId>arquillian-rest-client-api</artifactId>
             <version>1.0.0.Alpha3</version>
         </dependency>
         <dependency>
             <groupId>org.jboss.arquillian.extension</groupId>
             <artifactId>arquillian-rest-client-impl-2x</artifactId>
             <version>1.0.0.Alpha3</version>
         </dependency>
 
         <dependency>
             <groupId>org.jboss.arquillian.extension</groupId>
             <artifactId>arquillian-rest-warp-impl-resteasy</artifactId>
             <version>1.0.0.Alpha3</version>
         </dependency>
 
        <!-- arquillian bdd -->
 
         <!-- jbehave -->
        <dependency>
            <groupId>org.jboss.arquillian.jbehave</groupId>
            <artifactId>arquillian-jbehave-core</artifactId>
            <version>1.0.2</version>
            <scope>test</scope>
        </dependency>
 
        <dependency>
            <groupId>org.jboss.spec.javax.annotation</groupId>
            <artifactId>jboss-annotations-api_1.1_spec</artifactId>
            <version>1.0.1.Final</version>
            <scope>provided</scope>
        </dependency>
        <dependency>
            <groupId>org.jboss.spec.javax.ejb</groupId>
            <artifactId>jboss-ejb-api_3.1_spec</artifactId>
            <version>1.0.2.Final</version>
            <scope>provided</scope>
        </dependency>
        <dependency>
            <groupId>org.jboss.arquillian.protocol</groupId>
            <artifactId>arquillian-protocol-servlet</artifactId>
            <scope>test</scope>
        </dependency>
 
        <dependency>
            <groupId>org.apache.httpcomponents</groupId>
            <artifactId>httpcore</artifactId>
            <version>4.2.5</version>
            <scope>test</scope>
        </dependency>
        <dependency>
            <groupId>commons-collections</groupId>
            <artifactId>commons-collections</artifactId>
            <version>3.2.1</version>
        </dependency>
        <dependency>
            <groupId>xml-apis</groupId>
            <artifactId>xml-apis</artifactId>
            <version>1.4.01</version>
            <scope>test</scope>
        </dependency>
 
        <dependency>
            <groupId>org.slf4j</groupId>
            <artifactId>slf4j-log4j12</artifactId>
            <version>1.7.5</version>
            <scope>test</scope>
        </dependency>
 
        <dependency>
            <groupId>org.codehaus.jackson</groupId>
            <artifactId>jackson-core-lgpl</artifactId>
            <version>1.9.13</version>
            <scope>test</scope>
        </dependency>
 
    </dependencies>
 
    <dependencyManagement>
        <dependencies>
            <dependency>
                <groupId>org.jboss.arquillian</groupId>
                <artifactId>arquillian-bom</artifactId>
                <version>${version.arquillian}</version>
                <type>pom</type>
                <scope>import</scope>
            </dependency>
            <dependency>
                <groupId>org.jboss.arquillian.selenium</groupId>
                <artifactId>selenium-bom</artifactId>
                <version>${version.selenium}</version>
                <type>pom</type>
                <scope>import</scope>
            </dependency>
            <dependency>
                <groupId>org.jboss.arquillian.extension</groupId>
                <artifactId>arquillian-drone-bom</artifactId>
                <version>${version.drone}</version>
                <type>pom</type>
                <scope>import</scope>
            </dependency>
        </dependencies>
    </dependencyManagement>

After:

<dependencies>
        <!-- Test dependencies -->
       <dependency>
            <groupId>junit</groupId>
            <artifactId>junit</artifactId>
            <version>4.11</version>
            <scope>test</scope>
        </dependency>
 
        <!-- arquillian -->
         <dependency>
            <groupId>org.jboss.arquillian</groupId>
            <artifactId>arquillian-all</artifactId>
            <version>1.0.1</version>
            <scope>test</scope>
        </dependency>
        <dependency>
            <groupId>xml-apis</groupId>
            <artifactId>xml-apis</artifactId>
            <version>1.4.01</version>
            <scope>test</scope>
        </dependency>
        <dependency>
            <groupId>org.jboss.shrinkwrap</groupId>
            <artifactId>shrinkwrap-api</artifactId>
            <version>1.2.2</version>
          <scope>test</scope>
    </dependency>
 
</dependencies>

Functional tests on each commit?

As ftests tend to be slower than white box tests, it is unbearable to execute them on each commit in continuous integration. A way to work around that is to create a dedicated jenkins job to run black box tests (the ft-tests profile we showed before) and schedule it to run at the end of the day, for example.

Another possibility is to execute functional tests in a headless browser[8] so test execution time is faster and viable on each commit (or every 10 minutes at least).

Phantomjs is the most recommended for this kind of test; it simulates a web browser through a headless WebKit engine.

To execute tests using phantomjs in our sample application, just activate the ‘webdriver-phantomjs’ profile or pass -Darquillian.browser=phantomjs in the maven command.

Another option of headless webdriver is HtmlUnit, also supported by Drone.

Prime arquillian

Prime-Arquillian is an opensource project which aims at testing the Primefaces showcase; the output of this project will be Arquillian graphene page fragments representing primefaces components, to ease functional testing of primefaces based applications.

The project is in its initial state and has a draft of a primefaces datatable fragment that can be found in the components module.

The alien build pipeline

A good testing suite is the basis of a deployment pipeline where we can promote a build across multiple stages, this process is well explained by Martin Fowler here.

For our application I’ve created a simple pipeline with only automated steps that uses the arquillian tests of this post, orchestrated by Jenkins, with metrics published in Sonar, ending with a deployed version in the Wildfly application server.

I have not detailed the pipeline here so as not to extend this post even more; instead I’ve published a video on youtube. If you have any question you can post a comment here.

Conclusion

In this article we saw a powerful testing platform that facilitates the development of tests in the JavaEE ecosystem, and we also saw it working in practice on a JavaEE6 application.

We could notice that arquillian provides support for the most used kinds of tests and can easily be used in continuous integration. Its initial setup/configuration, mainly in existing projects, is not trivial, but once done it can bring numerous advantages.

References