Ruling Database Testing with DBUnit Rules

In this post I am going to talk about DBUnit Rules (renamed to Database Rider), a small open source project I maintain which aims to simplify database testing[1].

For a better reading experience, access the asciidoc based version of this post in HTML and PDF.

Introduction

DBUnit Rules integrates JUnit and DBUnit through JUnit rules and, in case of CDI based tests, a CDI interceptor. This powerful combination lets you easily prepare the database state for testing through xml, json, xls or yaml files.

Most of the inspiration for DBUnit Rules comes from Arquillian Persistence Extension, a library for in-container database integration tests.

Source code for the upcoming examples can be found at github here: https://github.com/rmpestano/dbunit-rules-sample

Setup DBUnit Rules

The first thing to do is to add the DBUnit Rules core module to your test classpath:

        <dependency>
            <groupId>com.github.dbunit-rules</groupId>
            <artifactId>core</artifactId>
            <version>0.9.0</version>
            <scope>test</scope>
        </dependency>

Secondly we need a database. For testing I recommend HSQLDB, a very fast in-memory database; here is its Maven dependency:

        <dependency>
            <groupId>org.hsqldb</groupId>
            <artifactId>hsqldb</artifactId>
            <version>2.3.3</version>
            <scope>test</scope>
        </dependency>

Later a JPA provider will be needed; in this case Hibernate will be used:

        <dependency>
            <groupId>org.hibernate</groupId>
            <artifactId>hibernate-core</artifactId>
            <version>4.2.8.Final</version>
            <scope>provided</scope>
        </dependency>
        <dependency>
            <groupId>org.hibernate</groupId>
            <artifactId>hibernate-entitymanager</artifactId>
            <version>4.2.8.Final</version>
            <scope>test</scope>
        </dependency>

And the entity manager persistence.xml (src/test/resources/META-INF/persistence.xml):

<?xml version="1.0" encoding="UTF-8"?>
<persistence version="2.0" xmlns="http://java.sun.com/xml/ns/persistence" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xsi:schemaLocation="http://java.sun.com/xml/ns/persistence http://java.sun.com/xml/ns/persistence/persistence_2_0.xsd">
    <persistence-unit name="rulesDB" transaction-type="RESOURCE_LOCAL">

            <provider>org.hibernate.ejb.HibernatePersistence</provider>
            <class>com.github.dbunit.rules.sample.User</class>

            <properties>
                <property name="hibernate.dialect" value="org.hibernate.dialect.HSQLDialect" />
                <property name="javax.persistence.jdbc.driver" value="org.hsqldb.jdbcDriver" />
                <property name="javax.persistence.jdbc.url" value="jdbc:hsqldb:mem:test;DB_CLOSE_DELAY=-1" />
                <property name="javax.persistence.jdbc.user" value="sa" />
                <property name="javax.persistence.jdbc.password" value="" />
                <property name="hibernate.hbm2ddl.auto" value="create-drop" /> <1>
                <property name="hibernate.show_sql" value="true" />
            </properties>

    </persistence-unit>

</persistence>
  1. We’re creating the database from our JPA entities, but we could use a database migration tool like Flyway to do this work; see an example here.

And finally, the JPA entity our tests will work on:

@Entity
public class User {

    @Id
    @GeneratedValue
    private long id;

    private String name;

    // getters, setters, constructors and equals/hashCode omitted for brevity
}

Now we are ready to rule our database tests!

Example

Create a YAML file which will be used to prepare the database state, with two users, before the test:

src/test/resources/dataset/users.yml

 
user: 
  - id: 1
    name: "@realpestano" 
  - id: 2 
    name: "@dbunit"

And the JUnit test:

@RunWith(JUnit4.class)
public class DBUnitRulesCoreTest {

    @Rule
    public EntityManagerProvider emProvider = EntityManagerProvider.instance("rulesDB"); <1> 

    @Rule
    public DBUnitRule dbUnitRule = DBUnitRule.instance(emProvider.connection()); <2>


    @Test
    @DataSet("users.yml") <3>
    public void shouldListUsers() {
        List<User> users = em(). <4>
                createQuery("select u from User u").
                getResultList();
        assertThat(users).
                isNotNull().
                isNotEmpty().
                hasSize(2);
    }
}
  1. EntityManagerProvider is a JUnit rule that initializes a JPA entity manager for the test class. rulesDB is the name of the persistence unit;
  2. DBUnit rule reads @DataSet annotations and initializes the database before each test method. This rule only needs a JDBC connection to be created.
  3. The dataSet configuration itself, see here for all available configuration options.
  4. em() is a shortcut (import static com.github.dbunit.rules.util.EntityManagerProvider.em;) for the EntityManager that was initialized by the EntityManagerProvider rule (the static imports used by these tests are shown below).
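For completeness, these are the static imports the test above relies on: the em() shortcut from callout 4 and AssertJ's assertThat, which the later examples in this post import explicitly:

import static com.github.dbunit.rules.util.EntityManagerProvider.em;
import static org.assertj.core.api.Assertions.assertThat;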

Transactions

EntityManagerProvider rule provides entity manager transactions so you can insert/delete entities in your tests:

    @Test
    @DataSet("users.yml")
    public void shouldUpdateUser() {
        User user = (User) em().
                createQuery("select u from User u  where u.id = 1").
                getSingleResult();
        assertThat(user).isNotNull();
        assertThat(user.getName()).isEqualTo("@realpestano");
        tx().begin(); <1>
        user.setName("@rmpestano");
        em().merge(user);
        tx().commit();
        assertThat(user.getName()).isEqualTo("@rmpestano");
    }

    @Test
    @DataSet("users.yml")
    public void shouldDeleteUser() {
        User user = (User) em().
                createQuery("select u from User u  where u.id = 1").
                getSingleResult();
        assertThat(user).isNotNull();
        assertThat(user.getName()).isEqualTo("@realpestano");
        tx().begin();
        em().remove(user);
        tx().commit();
        List<User> users = em().
                createQuery("select u from User u ").
                getResultList();
        assertThat(users).
                hasSize(1);
    }
  1. tx() is a shortcut for the entity manager transaction provided by EntityManagerProvider (see the import snippet below).
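The tx() shortcut can be statically imported in the same way; the exact location below is an assumption based on the em() import shown earlier:

import static com.github.dbunit.rules.util.EntityManagerProvider.tx;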

Database assertion with ExpectedDataSet

Consider the following datasets:

src/test/resources/dataset/users.yml

user:
  - id: 1
    name: "@realpestano"
  - id: 2
    name: "@dbunit"

and expected dataset:

src/test/resources/dataset/expectedUser.yml

user:
  - id: 2
    name: "@dbunit"

And the following test:

    @Test
    @DataSet("users.yml")
    @ExpectedDataSet(value = "expectedUser.yml",ignoreCols = "id") <1>
    public void shouldAssertDatabaseUsingExpectedDataSet() {
        User user = (User) em().
                createQuery("select u from User u  where u.id = 1").
                getSingleResult();
        assertThat(user).isNotNull();
        tx().begin();
        em().remove(user);
        tx().commit();
    }
  1.  Database state after test will be compared with dataset provided by @ExpectedDataSet.

NOTE: if the database state is not equal, an assertion error is thrown. For example, imagine that in the test above we had deleted the user with id=2; the error would be:

junit.framework.ComparisonFailure: value (table=USER, row=0, col=name)
Expected :@dbunit
Actual   :@realpestano
	at org.dbunit.assertion.JUnitFailureFactory.createFailure(JUnitFailureFactory.java:39)
	at org.dbunit.assertion.DefaultFailureHandler.createFailure(DefaultFailureHandler.java:97)
	at org.dbunit.assertion.DefaultFailureHandler.handle(DefaultFailureHandler.java:223)
	at com.github.dbunit.rules.assertion.DataSetAssert.compareData(DataSetAssert.java:94)

Note: since version 0.9.0 (to be released at the time of writing) transactions can be managed automatically at the test level (useful for expected datasets), see example here.
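As a hedged sketch of that feature, assuming the @DataSet annotation exposes a transactional attribute as in later Database Rider releases (the attribute name may differ in 0.9.0), the delete example above could drop the explicit transaction handling:

    @Test
    @DataSet(value = "users.yml", transactional = true)
    @ExpectedDataSet(value = "expectedUser.yml", ignoreCols = "id")
    public void shouldAssertDatabaseWithManagedTransaction() {
        User user = (User) em().
                createQuery("select u from User u where u.id = 1").
                getSingleResult();
        em().remove(user); // no tx().begin()/tx().commit() needed here
    }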

Regular Expressions

Expected datasets also allow regular expressions:

src/test/resources/dataset/expectedUsersRegex.yml

user:
  - id: "regex:\\d+"
    name: regex:^expected user.* #expected user1
  - id: "regex:\\d+"
    name: regex:.*user2$ #expected user2

And the test:

    @Test
    @DataSet(cleanBefore = true) <1>
    @ExpectedDataSet("expectedUsersRegex.yml")
    public void shouldAssertDatabaseUsingRegex() {
        User u = new User();
        u.setName("expected user1");
        User u2 = new User();
        u2.setName("expected user2");
        tx().begin();
        em().persist(u);
        em().persist(u2);
        tx().commit();
    }
  1. You don’t need to initialize a dataset, but you can use cleanBefore to clear the database before the test.

IMPORTANT: When you use a dataset like users.yml in @DataSet, DBUnit will use the CLEAN_INSERT seeding strategy (by default) for all tables declared in the dataset. This is why we didn’t need cleanBefore in any of the other example tests.

Scriptable datasets

DBUnit Rules enables scripting in datasets for languages that implement JSR 223 (Scripting for the Java Platform); see this article for more information.
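To illustrate what these script prefixes do conceptually, here is a plain JSR 223 snippet evaluating the same kind of expression used in the dataset below; this only illustrates the scripting API, not DBUnit Rules internals:

import javax.script.ScriptEngine;
import javax.script.ScriptEngineManager;

public class ScriptedValueIllustration {

    public static void main(String[] args) throws Exception {
        // look up a JSR 223 engine by name, the same way a "js:" prefixed value is resolved
        // (requires a JavaScript engine on the JDK, e.g. Rhino or Nashorn)
        ScriptEngine js = new ScriptEngineManager().getEngineByName("javascript");
        Object likes = js.eval("(5+5)*10/2");
        System.out.println(likes); // prints 50 (or 50.0, depending on the engine)
    }
}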

For this example we will introduce another JPA entity:

@Entity
public class Tweet {

    @Id
    @GeneratedValue
    private long id;

    @Size(min = 1, max = 140)
    private String content;

    private Integer likes;

    @Temporal(TemporalType.DATE)
    private Date date;

    @ManyToOne(fetch = FetchType.LAZY)
    User user;

    // getters and setters omitted for brevity
}

Javascript Scriptable dataset

Following is a dataset which uses Javascript:

src/test/resources/datasets/dataset-with-javascript.yml

tweet:
  - id: 1
    content: "dbunit rules!"
    likes: "js:(5+5)*10/2" <1>
    user_id: 1
  1. js: prefix enables javascript in datasets.

and the junit test:

    @Test
    @DataSet(value = "dataset-with-javascript.yml",
            cleanBefore = true, <1> 
            disableConstraints = true)  <2>
    public void shouldSeedDatabaseUsingJavaScriptInDataset() {
        Tweet tweet = (Tweet) emProvider.em().createQuery("select t from Tweet t where t.id = 1").getSingleResult();
        assertThat(tweet).isNotNull();
        assertThat(tweet.getLikes()).isEqualTo(50);
    }
    
}
  1. As we didn’t declare the User table in the dataset, it will not be cleared by the CLEAN_INSERT seeding strategy, so we need cleanBefore to avoid conflicts with other tests that insert users.
  2. Disabling constraints is necessary because the Tweet table depends on User.

If we do not disable constraints we will receive the error below on dataset creation:

Caused by: org.dbunit.DatabaseUnitException: Exception processing table name='TWEET'
	at org.dbunit.operation.AbstractBatchOperation.execute(AbstractBatchOperation.java:232)
	at org.dbunit.operation.CompositeOperation.execute(CompositeOperation.java:79)
	at com.github.dbunit.rules.dataset.DataSetExecutorImpl.createDataSet(DataSetExecutorImpl.java:127)
	... 21 more
Caused by: java.sql.SQLIntegrityConstraintViolationException: integrity constraint violation: foreign key no parent; FK_OH8MF7R69JSK6IISPTIAOCC6L table: TWEET
	at org.hsqldb.jdbc.JDBCUtil.sqlException(Unknown Source)

TIP: If we declare User table in dataset-with-javascript.yml dataset we can remove cleanBefore and disableConstraints attributes.

Groovy scriptable dataset

JavaScript comes by default with the JDK, but you can use other scripting languages like Groovy; to do so you need to add it to the test classpath:

        <dependency>
            <groupId>org.codehaus.groovy</groupId>
            <artifactId>groovy-all</artifactId>
            <version>2.4.6</version>
            <scope>test</scope>
        </dependency>

If Groovy is not present in the classpath we’ll receive a warning message (maybe it should fail instead, what do you think?):

WARNING: Could not find script engine with name groovy in classpath

Here’s our Groovy based dataset (dataset-with-groovy.yml):

tweet:
  - id: "1"
    content: "dbunit rules!"
    date: "groovy:new Date()" <1>
    user_id: 1
  1. groovy: prefix enables Groovy in datasets.

And here is the test:

    @Test
    @DataSet(value = "dataset-with-groovy.yml",
            cleanBefore = true,
            disableConstraints = true)
    public void shouldSeedDatabaseUsingGroovyInDataset() throws ParseException {
        Tweet tweet = (Tweet) emProvider.em().createQuery("select t from Tweet t where t.id = '1'").getSingleResult();
        assertThat(tweet).isNotNull();
        SimpleDateFormat sdf = new SimpleDateFormat("yyyy-MM-dd");//remove time
        Date now = sdf.parse(sdf.format(new Date()));
        assertThat(tweet.getDate()).isEqualTo(now);
    }

Multiple databases

Multiple databases can be tested by using multiple DBUnit rules and EntityManager providers:

package com.github.dbunit.rules.sample;

import com.github.dbunit.rules.DBUnitRule;
import com.github.dbunit.rules.api.dataset.DataSet;
import com.github.dbunit.rules.api.dataset.DataSetExecutor;
import com.github.dbunit.rules.api.dataset.DataSetModel;
import com.github.dbunit.rules.connection.ConnectionHolderImpl;
import com.github.dbunit.rules.dataset.DataSetExecutorImpl;
import com.github.dbunit.rules.util.EntityManagerProvider;
import org.junit.Rule;
import org.junit.Test;
import org.junit.runner.RunWith;
import org.junit.runners.JUnit4;

import static org.assertj.core.api.Assertions.assertThat;

/**
 * Created by pestano on 23/07/15.
 */

@RunWith(JUnit4.class)
public class MultipleDataBasesTest {

    @Rule
    public EntityManagerProvider emProvider = EntityManagerProvider.instance("pu1");

    @Rule
    public EntityManagerProvider emProvider2 = EntityManagerProvider.instance("pu2");

    @Rule
    public DBUnitRule rule1 = DBUnitRule.instance("rule1",emProvider.connection()); <1>

    @Rule
    public DBUnitRule rule2 = DBUnitRule.instance("rule2",emProvider2.connection());


    @Test
    @DataSet(value = "users.yml", executorId = "rule1") <2>
    public void shouldSeedDatabaseUsingPu1() {
        User user = (User) emProvider.em().
                createQuery("select u from User u where u.id = 1").getSingleResult();
        assertThat(user).isNotNull();
        assertThat(user.getId()).isEqualTo(1);
    }

    @Test
    @DataSet(value = "users.yml", executorId = "rule2")
    public void shouldSeedDatabaseUsingPu2() {
        User user = (User) emProvider2.em().
                createQuery("select u from User u where u.id = 1").getSingleResult();
        assertThat(user).isNotNull();
        assertThat(user.getId()).isEqualTo(1);
    }

    @Test <3>
    public void shouldSeedDatabaseUsingMultiplePus() {
        DataSetExecutor exec1 = DataSetExecutorImpl.
                instance("exec1", new ConnectionHolderImpl(emProvider.connection()));
        DataSetExecutor exec2 = DataSetExecutorImpl.
                instance("exec2", new ConnectionHolderImpl(emProvider2.connection()));

        //programmatic seed db1
        exec1.createDataSet(new DataSetModel("users.yml"));

        exec2.createDataSet(new DataSetModel("dataset-with-javascript.yml"));//seed db2

        //user comes from database represented by pu1
        User user = (User) emProvider.em().
                createQuery("select u from User u where u.id = 1").getSingleResult();
        assertThat(user).isNotNull();
        assertThat(user.getId()).isEqualTo(1);

        //tweets comes from pu2
        Tweet tweet = (Tweet) emProvider2.em().createQuery("select t from Tweet t where t.id = 1").getSingleResult();
        assertThat(tweet).isNotNull();
        assertThat(tweet.getLikes()).isEqualTo(50);
    }

}
  1. rule1 is the id of DataSetExecutor, the component responsible for database initialization in DBUnit Rules.
  2. Here we match the dataset executor id in the @DataSet annotation, so this test uses the database from pu1.
  3. For multiple databases in the same test we need to initialize the database state programmatically.

Ruling database in CDI tests

For CDI based tests we are going to use the DeltaSpike test control module and the DBUnit Rules CDI module.

The first enables CDI in JUnit tests and the second enables DBUnit through a CDI interceptor.

Classpath dependencies

First we need DBUnit CDI:

       <dependency>
            <groupId>com.github.dbunit-rules</groupId>
            <artifactId>cdi</artifactId>
            <version>0.9.0</version>
            <scope>test</scope>
        </dependency>

And also DeltaSpike control module:

        <dependency> <1>
            <groupId>org.apache.deltaspike.core</groupId>
            <artifactId>deltaspike-core-impl</artifactId>
            <version>1.6.1</version>
            <scope>test</scope>
        </dependency>

        <dependency> <2>
            <groupId>org.apache.deltaspike.modules</groupId>
            <artifactId>deltaspike-test-control-module-api</artifactId>
            <version>1.6.1</version>
            <scope>test</scope>
        </dependency>

        <dependency> <2>
            <groupId>org.apache.deltaspike.modules</groupId>
            <artifactId>deltaspike-test-control-module-impl</artifactId>
            <version>1.6.1</version>
            <scope>test</scope>
        </dependency>

        <dependency> <3>
            <groupId>org.apache.deltaspike.cdictrl</groupId>
            <artifactId>deltaspike-cdictrl-owb</artifactId>
            <version>1.6.1</version>
            <scope>test</scope>
        </dependency>

        <dependency> <4>
            <groupId>org.apache.openwebbeans</groupId>
            <artifactId>openwebbeans-impl</artifactId>
            <version>1.6.2</version>
            <scope>test</scope>
        </dependency>
  1. The DeltaSpike core module is the base of all DeltaSpike modules;
  2. Test control module API and implementation;
  3. CDI control OWB dependency; it is responsible for bootstrapping the CDI container;
  4. OpenWebBeans as the CDI implementation.

Configuration

For configuration we will need a beans.xml which enables DBUnit CDI interceptor:

/src/test/resources/META-INF/beans.xml

<?xml version="1.0" encoding="UTF-8"?>
<beans xmlns="http://java.sun.com/xml/ns/javaee" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xsi:schemaLocation="http://java.sun.com/xml/ns/javaee http://java.sun.com/xml/ns/javaee/beans_1_0.xsd">

       <interceptors>
              <class>com.github.dbunit.rules.cdi.DBUnitInterceptor</class>
       </interceptors>
</beans>

And apache-deltaspike.properties to set our tests as CDI beans:

/src/test/resources/META-INF/apache-deltaspike.properties

deltaspike.testcontrol.use_test_class_as_cdi_bean=true 

The test itself must be a CDI bean so DBUnit Rules can intercept it.

The last configuration needed is to produce an EntityManager for the tests:

package com.github.dbunit.rules.sample.cdi;


import com.github.dbunit.rules.util.EntityManagerProvider;

import javax.enterprise.context.ApplicationScoped;
import javax.enterprise.inject.Produces;
import javax.persistence.EntityManager;

/**
 * Created by pestano on 09/10/15.
 */
@ApplicationScoped
public class EntityManagerProducer {

    private EntityManager em;


    @Produces
    public EntityManager produce() {
        return EntityManagerProvider.instance("rulesDB").em();
    }

}

This EntityManager will be used as a bridge to the JDBC connection needed by DBUnit Rules.

Example

@RunWith(CdiTestRunner.class) <1>
public class DBUnitRulesCDITest {

    @Inject
    EntityManager em; <2>


    @Test
    @UsingDataSet("users.yml") <3>
    public void shouldListUsers() {
        List<User> users = em.
                createQuery("select u from User u").
                getResultList();
        assertThat(users).
                isNotNull().
                isNotEmpty().
                hasSize(2);
    }
}
  1. DeltaSpike JUnit runner that enables CDI in tests;
  2. The EntityManager we produced in previous step;
  3. This annotation enables DBUnit CDI interceptor which will prepare database state before the test execution.

All other features presented earlier, except multiple databases, are supported by DBUnit CDI.

Here is ExpectedDataSet example:

src/test/resources/datasets/expectedUsers.yml

user:
  - id: 1
    name: "expected user1"
  - id: 2
    name: "expected user2"

And the test:

    @Test
    @UsingDataSet(cleanBefore = true) //needed to activate interceptor (can be at class level)
    @ExpectedDataSet(value = "expectedUsers.yml",ignoreCols = "id")
    public void shouldMatchExpectedDataSet() {
        User u = new User();
        u.setName("expected user1");
        User u2 = new User();
        u2.setName("expected user2");
        em.getTransaction().begin();
        em.persist(u);
        em.persist(u2);
        em.getTransaction().commit();
    }

Ruling database in BDD tests

BDD and DBUnit are integrated by DBUnit Rules Cucumber. It’s a Cucumber runner which is CDI aware.

Configuration

Just add following dependency to your classpath:

       <dependency>
            <groupId>com.github.dbunit-rules</groupId>
            <artifactId>cucumber</artifactId>
            <version>0.9.0</version>
            <scope>test</scope>
        </dependency>

Now you just need to use CdiCucumberTestRunner to have Cucumber, CDI and DBUnit on your BDD tests.

Example

First we need a feature file:

src/test/resources/features/search-users.feature

Feature: Search users
In order to find users quickly
As a recruiter
I want to be able to query users by its tweets.

Scenario Outline: Search users by tweet content

Given We have two users that have tweets in our database

When I search them by tweet content <value>

Then I should find <number> users
Examples:
| value    | number |
| "dbunit" | 1      |
| "rules"  | 2      |

Then a dataset to prepare our database:

src/test/resources/datasets/usersWithTweet.json

{
  "USER": [
    {
      "id": 1,
      "name": "@realpestano"
    },
    {
      "id": 2,
      "name": "@dbunit"
    }
  ],
  "TWEET": [
    {
      "id": 1,
      "content": "dbunit rules json example",
      "date": "2013-01-20",
      "user_id": 1
    },
    {
      "id": 2,
      "content": "CDI rules",
      "date": "2016-06-20",
      "user_id": 2
    }
  ]
}

Now a Cucumber runner test entry point:

package com.github.dbunit.rules.sample.bdd;

import com.github.dbunit.rules.cucumber.CdiCucumberTestRunner;
import cucumber.api.CucumberOptions;
import org.junit.runner.RunWith;

/**
 * Created by rmpestano on 4/17/16.
 */
@RunWith(CdiCucumberTestRunner.class)
@CucumberOptions(features ="src/test/resources/features/search-users.feature")
public class DBUnitRulesBddTest {
}

And finally our cucumber step definitions:

package com.github.dbunit.rules.sample.bdd;

import com.github.dbunit.rules.cdi.api.UsingDataSet;
import com.github.dbunit.rules.sample.User;
import cucumber.api.PendingException;
import cucumber.api.java.en.Given;
import cucumber.api.java.en.Then;
import cucumber.api.java.en.When;
import org.hibernate.Criteria;
import org.hibernate.Session;
import org.hibernate.criterion.DetachedCriteria;
import org.hibernate.criterion.MatchMode;
import org.hibernate.criterion.Restrictions;
import org.hibernate.sql.JoinType;

import javax.inject.Inject;
import javax.persistence.EntityManager;
import java.util.List;

import static org.assertj.core.api.Assertions.assertThat;

/**
 * Created by pestano on 20/06/16.
 */
public class SearchUsersSteps {

    @Inject
    EntityManager entityManager;

    List<User> usersFound;

    @Given("^We have two users that have tweets in our database$")
    @UsingDataSet("usersWithTweet.json")
    public void We_have_two_users_in_our_database() throws Throwable {
    }

    @When("^I search them by tweet content \"([^\"]*)\"$")
    public void I_search_them_by_tweet_content_value(String tweetContent) throws Throwable {
        Session session = entityManager.unwrap(Session.class);
        usersFound = session.createCriteria(User.class).
        createAlias("tweets","tweets", JoinType.LEFT_OUTER_JOIN).
        add(Restrictions.ilike("tweets.content",tweetContent, MatchMode.ANYWHERE)).list();
    }

    @Then("^I should find (\\d+) users$")
    public void I_should_find_number_users(int numberOfUsersFound) throws Throwable {
        assertThat(usersFound).
                isNotNull().
                hasSize(numberOfUsersFound).
                contains(new User(1));//examples contains user with id=1
    }


}

Living documentation of DBUnit Rules is based on its BDD tests, you can access it here: http://rmpestano.github.io/dbunit-rules/documentation.html

 

Ruling Database in JUnit 5 tests

JUnit 5 is the new version of JUnit and comes with a new extension model, so instead of rules you will use extensions in your tests. DBUnit Rules comes with a JUnit 5 extension which enables DBUnit.

Configuration

Just add following dependency to your classpath:

<dependency>
   <groupId>com.github.dbunit-rules</groupId>
   <artifactId>junit5</artifactId>
   <version>0.12.1-SNAPSHOT</version>
   <scope>test</scope>
</dependency>

Example

@ExtendWith(DBUnitExtension.class) <1>
@RunWith(JUnitPlatform.class) <2>
public class DBUnitJUnit5Test {

    private ConnectionHolder connectionHolder = () -> <3>
            instance("junit5-pu").connection(); <4>

    @Test
    @DataSet("users.yml")
    public void shouldListUsers() {
        List<User> users = em().createQuery("select u from User u").getResultList();
        assertThat(users).isNotNull().isNotEmpty().hasSize(2);
    }
}
  1. Enables DBUnit;
  2. JUnit 5 runner;
  3. As JUnit 5 requires Java 8, you can use lambdas in your tests;
  4. DBUnitExtension will get the connection by reflection, so just declare a field or a method with ConnectionHolder as the return type (see the sketch below).
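A minimal sketch of the method alternative mentioned in callout 4, reusing the same EntityManagerProvider helper as above:

    // instead of the lambda field, a method with ConnectionHolder as return type also works,
    // since DBUnitExtension looks the connection up by reflection
    private ConnectionHolder connectionHolder() {
        return () -> EntityManagerProvider.instance("junit5-pu").connection();
    }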
[1]. In the context of this article, database testing stands for integration tests which depend on a relational database, so that application business logic that depends on a database can be tested without mocking.

The simplest “micro” deployment (ArqTip #2)

read the asciidoc based version of this post here.

The second Arquillian tip is the simplest “micro” deployment. It’s an Arquillian deployment that uses the whole project as the deployment, with no need to add individual classes, packages or libraries:

simplest-deployment

@RunWith(Arquillian.class)
public class SimplestDeployment {

    @Deployment
    public static Archive<?> createDeployment() {
        WebArchive war = ShrinkWrap.create(ZipImporter.class, "cdi-crud.war").
                importFrom(new File("target/cdi-crud.war")).as(WebArchive.class);
        war.addAsResource("persistence.xml", "META-INF/persistence.xml");//replace with test persistence
        return war;
    }

    @Inject
    CarService carService;


    @Test
    @UsingDataSet("car.yml")
    public void shouldCountCars() {
        assertNotNull(carService);
        assertEquals(carService.crud().count(), 4);
    }
}

This basically uses a previously built project as the deployment and just replaces its persistence.xml to use a test database.

Compare it with a traditional deployment here.

Of course this simplicity comes with a price:

1 – It’s not a true micro deployment because it uses the whole application. If your application is big the deployment can take considerable time (seconds);

2 – You need to build the application before running the test. Here you lose a big advantage of Arquillian, which is that the application doesn’t get built at all if a test (even a functional test) fails.

To overcome problem #2 you can execute the tests in the Maven integration-test phase using the Surefire plugin:

<profile>
    <id>simple-deployment</id>
    <build>
        <plugins>
            <plugin>
                <artifactId>maven-surefire-plugin</artifactId>
                <version>2.16</version>
                <executions>
                    <execution>
                        <id>after-package</id>
                        <phase>integration-test</phase>
                        <goals>
                            <goal>test</goal>
                        </goals>
                        <configuration>
                            <skipTests>false</skipTests>
                            <includes>
                                <include>**/*SimplestDeployment.java</include>
                            </includes>
                        </configuration>
                    </execution>
                </executions>
            </plugin>
        </plugins>
    </build>
</profile>

There is an issue from 2012 in the Arquillian issue tracker which addresses this feature of a “simplest deployment” using a single annotation; see the issue here: https://issues.jboss.org/browse/ARQ-74.

Source code of this post can be found here: https://github.com/rmpestano/cdi-crud/blob/cdi-crud-universe/src/test/java/com/cdi/crud/it/SimplestDeployment.java

 

Test your REST endpoints inside the container (ArqTip #1)

 

read the asciidoc based version of this post here.

 

Since Arquillian 1.1.9.Final it is possible to get deployment URL even in in-container tests. This enables REST endpoints testing inside the container.

The main advantage of running this kind of test inside the container (same JVM) is that you can call any service/method of your application before making the (test) REST call.
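As a hypothetical illustration (CarService and Car come from the cdi-crud application used throughout these posts, but the setter and the inserted value below are assumptions), such a test could seed state through a service and then assert on it through the REST API:

    @Inject
    CarService carService; // in-container, so application beans can be injected directly

    @Test
    public void shouldFindCarInsertedThroughTheService() {
        Car car = new Car();
        car.setModel("Fusca"); // hypothetical setter, shown only to illustrate the idea
        carService.insert(car); // prepare state programmatically before the REST call

        given().
        when().
                get(basePath + "rest/cars").
        then().
                statusCode(Status.OK.getStatusCode()).
                body(containsString("Fusca"));
    }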

Even better, you can prepare your database or whatever configuration you need before running the test. Here is an example using Arquillian Persistence (which doesn’t work outside the container; see Arq1077):

 

 

@RunWith(Arquillian.class)
public class CrudRestIt {

    @Deployment(name = "cdi-rest.war")
    public static Archive<?> createDeployment() {
        WebArchive war = Deployments.getBaseDeployment();
        MavenResolverSystem resolver = Maven.resolver();
        war.addAsLibraries(resolver.loadPomFromFile("pom.xml").resolve("com.jayway.restassured:rest-assured").withTransitivity().asFile());
        war.addAsLibraries(resolver.loadPomFromFile("pom.xml").resolve("com.google.code.gson:gson:2.4").withoutTransitivity().asSingleFile());
        System.out.println(war.toString(true));
        return war;
    }

    @ArquillianResource
    URL basePath;


    @Test
    @UsingDataSet("car.yml")
    public void shouldListCars() {
        given().
                queryParam("start", 0).queryParam("max", 10).
        when().
                get(basePath + "rest/cars").
        then().
                statusCode(Status.OK.getStatusCode()).
                body("", hasSize(4)).//dataset has 4 cars
                body("model", hasItem("Ferrari")).
                body("price", hasItem(2450.8f)).
                body(containsString("Porche274"));
    }

}

Note that I have included two libraries in the deployment, REST Assured and Gson, because both are used inside the test.

As a bonus you can get code coverage of your REST endpoints, something you don’t have when running as a client (testable = false, a.k.a. black box).

 


 

Test source code can be found here: https://github.com/rmpestano/cdi-crud/blob/cdi-crud-universe/src/test/java/com/cdi/crud/it/CrudRestIt.java

A Simple Java EE Docker Example

read the asciidoc based version of this post here.

 

In this post we will play a bit with docker in the context of Java EE. Here is what we will do:

  • Create, build and run a docker image;
  • the image will start a WildFly server with a Java EE sample application deployed;
  • show some docker commands;
  • start multiple containers to see the same app running on different ports.

Introduction

I will not introduce docker as there are already many good references on the subject. To create this post I’ve read the following tutorials:

  1.  Docker userguide;
  2. Working with docker images;
  3.  this great post: Docker beginners tutorial;
  4. Arun Gupta’s tech tips: #39, #57, #61 and #65.

Pre requisites

To run this tutorial you will need:

  • A docker daemon running on your host machine
    •  after installing docker, add this line to the ‘/etc/default/docker‘ file: DOCKER_OPTS="-H tcp://127.0.0.1:2375 -H unix:///var/run/docker.sock"
    •  after that, restart your machine and try to run the command: docker -H tcp://127.0.0.1:2375 --version
      the output must be something like: Docker version 1.4.1, build 5bc2ff8
  • A WildFly 8.2.0 installation (unzipped);
  • jdk-8u25-linux-x64.tar.gz file;
  • car-service.war available here;
  • Dockerfile available here.

Creating the Docker image

Docker images represent/describe the container itself. As I had limited internet access (3G from my cell phone) I have created an image using resources from my local machine. So the image will only work if it is built in a directory containing the following files:

  • wildfly-8.2.0.Final: the application server
  • car-service.war: the app we will deploy
  • Dockerfile: the file describing this container
  • jdk-8u25-linux-x64.tar.gz: the java version we will install in the container

NOTE: It is not good practice to use fixed local resources in a Docker image, as it will only work if the files are present during the image build. The best approach is to install everything from scratch and download the necessary files. Here is an example of a Dockerfile that downloads, installs and deploys an app into WildFly 10 without using local files.

Here is the Dockerfile content:

FROM ubuntu
MAINTAINER Rafael Pestano <rmpestano@gmail.com>

# setup WildFly
COPY wildfly-8.2.0.Final /opt/wildfly

# install example app on wildfly
COPY car-service.war /opt/wildfly/standalone/deployments/

# setup Java

RUN mkdir /opt/java

COPY jdk-8u25-linux-x64.tar.gz /opt/java/

# change dir to Java installation dir

WORKDIR /opt/java/

RUN tar -zxf jdk-8u25-linux-x64.tar.gz

# setup environment variables

RUN update-alternatives --install /usr/bin/javac javac /opt/java/jdk1.8.0_25/bin/javac 100

RUN update-alternatives --install /usr/bin/java java /opt/java/jdk1.8.0_25/bin/java 100

RUN update-alternatives --display java

RUN java -version

# Expose the ports we're interested in
EXPOSE 8080 9990

# Set the default command to run on boot
# This will boot WildFly in the standalone mode and bind to all interface
CMD ["/opt/wildfly/bin/standalone.sh", "-c", "standalone-full.xml", "-b", "0.0.0.0"]

The image inherits from ubuntu, an image which installs the Ubuntu OS. The ubuntu image is installed when you follow the docker installation tutorial.

Next we copy the server to the folder /opt/wildfly inside the container we are creating. COPY is a command available in Dockerfile DSL. All commands can be found here.

Next we copy our app war inside the server with: COPY car-service.war /opt/wildfly/standalone/deployments/.

Afterwards, we set up Java by unzipping it to /opt/java inside the container and setting up environment variables. A better approach would be apt-get, but it requires (good) internet access, which I didn’t have at the time of writing. I used the RUN command to execute java -version, which will print the version during the image build (if Java is correctly installed).

Later I use EXPOSE 8080 9990 to tell docker the ports that can be exposed by the container. A container is the instantiation of a Docker image. When we run an image (docker run) we can specify which ports are accessible to the host machine.

Finally we specify the default command using CMD [“/opt/wildfly/bin/standalone.sh”, “-c”, “standalone-full.xml”, “-b”, “0.0.0.0”]. This command will be fired every time our container is started.

Building the image

After describing our image we have to build it. Run the following command from the parent folder containing the Dockerfile:

docker -H tcp://127.0.0.1:2375 build -t javaee_sample java_ee/

  • the -H flag specifies the docker daemon address (we are using TCP to communicate with the daemon);
  • build is the command itself;
  • -t specifies the tag name that identifies the image (javaee_sample in this case);
  • java_ee/ is the folder containing the Dockerfile describing our image.

More docker commands can be found here. Here is the output of the command:

After that we can see the created image by listing the installed images: docker -H tcp://127.0.0.1:2375 images:

 

Starting the container

The container can be started with the command: docker -H tcp://127.0.0.1:2375 run -p 8180:8080 javaee_sample

  • -p maps the container port (8080, the one declared with the EXPOSE Dockerfile instruction) to a port on the host machine, 8180 in this case;
  • run is the command itself;
  • javaee_sample is the name of the image.

The output of the command is WildFly starting, because we set it as the initial command (CMD Dockerfile instruction).

Running multiple containers

We can instantiate as many containers as we want as long as their ports don’t conflict on the host machine. I will start two more containers mapping port 8080 to 8280 and 8380 respectively:

docker -H tcp://127.0.0.1:2375 run -p 8280:8080 javaee_sample
docker -H tcp://127.0.0.1:2375 run -p 8380:8080 javaee_sample

To list started containers we can use the command docker -H tcp://127.0.0.1:2375 ps, here is the output:

rmpestano@rmpestano-ubuntu:~/docker /images$ docker -H tcp://127.0.0.1:2375 ps
CONTAINER ID        IMAGE                  COMMAND                CREATED             STATUS              PORTS                              NAMES
7b9079806e69        javaee_sample:latest   "/opt/wildfly/bin/st   27 seconds ago      Up 27 seconds       9990/tcp, 0.0.0.0:8280->8080/tcp   suspicious_lovelace
d4975e825751        javaee_sample:latest   "/opt/wildfly/bin/st   28 seconds ago      Up 28 seconds       9990/tcp, 0.0.0.0:8380->8080/tcp   loving_hopper
96e58eb65126        javaee_sample:latest   "/opt/wildfly/bin/st   42 seconds ago      Up 42 seconds       9990/tcp, 0.0.0.0:8180->8080/tcp   clever_cori

And now we can access the three apps in the browser at the same time.

You can stop the container by its ID or by name with docker -H tcp://127.0.0.1:2375 stop suspicious_lovelace

Remember that all data will be lost when the container is stopped. Use Docker volumes for persistent data.
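For example, a host directory can be mounted into the container with Docker’s standard -v flag (the WildFly data path below is only an illustration of something you might want to persist):

docker -H tcp://127.0.0.1:2375 run -p 8180:8080 -v /home/user/wildfly-data:/opt/wildfly/standalone/data javaee_sample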

RESTFul Java with JAX-RS 2.0 Book Review

During my vacation I’ve read RESTful Java with JAX-RS 2.0, second edition, by Bill Burke.

This review is just some notes I made while visiting each chapter. It mainly lists the topics I found important and interesting in the book. I’ve also made some comments.

Positive aspects:

  • Very practical with plenty of nice examples using latest JAX-RS version;
  • a really easy to read and understand book;
  • interesting topics were covered.

Negative Aspects:

  • Missing a REST API documentation chapter;
  • not enough attention to testing; maybe a dedicated chapter with best practices;
  • it could use more JSON in favor of XML;
  • Jersey was not mentioned.

Overall it is a great book and fully recommended, even for those already working with REST.

Here are my notes on each chapter, a narrative on the most important topics covered, so you can get an idea of the content of the book:

Chapter 1 – Introduction to REST

Although it is very objective and succinct, this chapter goes directly to the heart of REST. Compares with CORBA, SOAP and WS-* standards. How REST and HTTP are related. A bit of SOA. Refers to Roy Fielding’s PhD thesis[LINK], a must read article. Finally describes the five RESTful architectural principles: Addressability, Constrained Interface, Representation-Oriented, Communicate Statelessly and HATEOAS.
An excellent overview.

Chapter 2 – Designing RESTful Services

This chapter presents a RESTful order entry system interface (a.k.a. endpoint) of a hypothetical e-commerce application. It shows the concepts described in the first chapter, explaining them in an HTTP oriented way, with no Java code yet. There is an interesting discussion about “State vs Operation” and best practices to model REST resources. The data format chosen for the model (XML) was not the best option I think; in my opinion JSON would be a better approach both for exemplifying and for best practices (I don’t buy that XML is for Java and JSON is for web related technologies such as Ajax). This chapter accomplishes well its objective, which is to illustrate RESTful concepts in practice.

Chapter 3 – Your First JAX-RS Service

It starts by talking a bit about servlets, then jumps to JAX-RS and summarizes well the framework for writing RESTful services in Java.
Next, the order system designed in chapter 2 is implemented in Java using JAX-RS. For those already working with REST in Java it does not add much to the table, but it is a necessary step so the application can evolve during the book. Maybe it’s personal taste, but I have to say again that “application/xml” was not a good choice. In my opinion the examples would be simpler with JSON, for example:


return new StreamingOutput() {
    public void write(OutputStream outputStream)
            throws IOException, WebApplicationException {
        outputCustomer(outputStream, customer);
    }
};

Maybe introducing JAXB in this chapter could be an option to avoid “streams” and inner classes, and could simplify the client example.

Chapter 4 – HTTP Method and URI Matching

Details the @Path annotation and its matching rules, a bit of subresources and dispatching, matrix vs. query params and finally some gotchas in request matching.

Chapter 5 – JAX-RS Injection

Interesting hint on field injection versus singleton resources. @PathParam injection is revisited and more examples are presented. PathSegment and UriInfo are introduced. MatrixParam, QueryParam, FormParam, HeaderParam, CookieParam and BeanParam are detailed with nice examples. BeanParam is new in JAX-RS 2.0, a very useful feature. Next it talks about automatic type conversion: how JAX-RS can map request Strings to primitives, enums, lists and objects. Later it goes into details about ParamConverter, so JAX-RS can convert HTTP request Strings into Java objects. Finally the chapter ends explaining the @Encoded annotation.

Chapter 6 – JAX-RS Content Handlers

The chapter starts with content marshalling and built-in providers (maybe here is the motivation for the streams in previous chapters). Some byte and File related examples are presented. Next there is an example of posting a form with MultivaluedMap<String, String>. Next the chapter focuses on JAXB. There is a small intro and some examples. There is an interesting section about JAXB and JSON and how they integrate. Later the chapter details JSON objects. Finally it talks about custom marshalling and exemplifies message body readers and writers.

Chapter 7 – Server Responses and Exception Handling

It starts talking about successful and error responses. The next topic is how to create responses with ResponseBuilder. A bit on cookies and later status codes. Next, GenericEntity is presented to deal with generics. Finally exception handling is detailed by showing WebApplicationException and exception mappers. It ends explaining error codes and built-in JAX-RS exceptions.

Chapter 8 – JAX-RS Client API

A very good introduction to the Client API that comes with JAX-RS 2.0. It really does its job in a very practical way with nice examples.

Chapter 9 – HTTP Content Negotiation

A nice overview of how JAX-RS supports the Conneg protocol to ease the integration with heterogeneous clients and the evolution of the system. It starts explaining the negotiation protocol with media type examples, language negotiation and encoding. Next, examples with JAX-RS are presented. The chapter ends with Variants (multiple types of response for the same URI), URI negotiation, new media types (for versioning) and flexible schemas using content negotiation.

Chapter 10 – HATEOAS

A little introduction to the concept and how it can be applied to web services. Atom links are presented. Next, the advantages of HATEOAS are explained. Later JAX-RS and HATEOAS plus UriBuilder and UriInfo are presented with examples. Finally building links and link headers is presented.

Chapter 11 – Scaling JAX-RS Applications

The chapter begins talking about the web and mechanisms that help it scale. It talks about caching (browser, proxy and CDN). After introducing caching it explores the HTTP caching with JAX-RS examples. Cache revalidation is visited, again with nice examples. Next topic is concurrency with conditional PUT and POST followed by JAX-RS examples.

Chapter 12 – Filters and Interceptors

Server-side filters are presented first, with request and response filter examples like cache control and authorization. Next, reader and writer interceptors with a GZIP example. Client filters are presented using the JAX-RS client API. A cache control filter example is explained; it basically caches some requests and manipulates the “If-None-Match” and “If-Modified-Since” headers. Deploying (@Provider) and ordering (@Priority) of filters and interceptors are visited. Method filters and interceptors are exemplified with DynamicFeature and NameBinding. Finally there is a note on exceptions in filters or interceptors.

Chapter 13 – Asynchronous JAX-RS

It starts with the Client API and AsyncInvoker using futures. Next, callbacks are presented with nice examples. Server-side asynchronous responses are introduced. The Internet HTTP request/response thread model and its challenges are explained. Next, the AsyncResponse API is presented with JAX-RS examples.
The chapter makes it clear that asynchronous responses are for specific applications and that most of the time the “normal” request/response paradigm is sufficient. Later, exception handling and responding with resume and cancel are explained. Timeouts and response callbacks are explained. Use cases like server push and publish/subscribe (chat) are presented and exemplified. There is a note on WebSockets and Server-Sent Events versus pure HTTP server push apps. Finally scheduling using executors is presented.

Chapter 14 – Deployment and Integration

It starts by registering REST resources by extending the Application class, initializing singletons and classes. The difference between servlet container and Java EE JAX-RS deployments is explained. web.xml configuration is presented. The next topics are EJB 3.1 and Spring integration. A pretty simple but useful chapter.

Chapter 15 – Securing JAX-RS

A small introduction to security in the web and Java EE: authentication, authorization and encryption. It dives into the servlet authentication and authorization mechanisms, followed by encryption. Next, authorization annotations like @RolesAllowed and @PermitAll are presented. The next topic is programmatic security with SecurityContext. A JAX-RS request filter for authorization is exemplified. Next is client side security using the JAX-RS client API. OAuth is the next topic; the CNN case is presented as an example of OAuth. Signing and encrypting message bodies is the next security topic. It is basically concerned with security in intermediary REST services (a.k.a. integrations); Twitter is used as an example. Later digital signatures are introduced. DKIM and JOSE JSON Signature (JWS) are exemplified. The last topic is the encryption of representations (the message body). JOSE JSON Encryption (JWE) is used as an example. The chapter is more conceptual; the majority of the security examples are in Chapter 29.

In chapter 29 there is an interesting OTPAuthenticated request filter (one-time password). @AllowedPerDay is introduced in the security chain, a nice example of limiting the number of accesses to a resource per user. It is also a ContainerRequestFilter but with lower priority, meaning that it runs after the OTPAuthenticated filter.

Chapter 16 – Alternative Java Clients

Besides JAX-RS client API other clients are presented. It starts with “pure Java” and HttpURLConnection examples. There is a small note on caching and authentication using java.net classes. Standard Java certificate auth using “keytool” command is introduced. Next topic is HttpClient examples. Client authentication with HttpClient is presented. RESTEasy client proxy is introduced.

CHAPTER 17 – Workbook Introduction

This chapter is a step-by-step tutorial on how to set up your environment with RESTEasy 3.x (an implementation of the JAX-RS 2.0 spec). It uses JDK 6, Maven 3, Jetty 8.1 as the servlet container and WildFly 8.0 for examples that require Java EE 7. It creates the project and illustrates its directory structure.

Chapters 18 to 29 are reserved for more elaborate and complete examples for each chapter. This is the great thing about this book: it is example oriented and has dedicated chapters for examples. I will not comment on the examples, but I recommend you try them as you read; it is great for learning.

Some Words on JavaEE, REST and Swagger

Introduction

In this post I will cover the following topics:

  1. Create a simple JavaEE REST application using JBoss Forge
  2. Add some Rest authentication and authorization
  3. Create some Arquillian tests for the created Rest endpoint
  4. Use Swagger to generate Rest API documentation

I will enhance the CDI Crud application presented in previous posts: CDI Generic Dao, CDI Crud Multi “Tenancy” and Arquillian + DBUnit + Cucumber. All source code is available here: https://github.com/rmpestano/cdi-crud

 Creating the REST Endpoint

I used JBoss Forge to execute this task. As Forge can be used to evolve an application, I have just executed the Rest setup command.

I have chosen to use JAX-RS 1.1 because I want to run the app in both JBoss AS and WildFly.

As we already have our JPA entities, creating the endpoint is done with the generate endpoints from entities command.

And the CarEndpoint is created and ready to CRUD cars via REST:

@Stateless
@Path("/cars")
public class CarEndpoint {
    @Inject
    CarService carService;

    @POST
    @Consumes("application/json")
    public Response create(Car entity) {
        carService.insert(entity);
        return Response.created(UriBuilder.fromResource(CarEndpoint.class).path(String.valueOf(entity.getId())).build()).build();
    }

    @DELETE
    @Path("/{id:[0-9][0-9]*}")
    public Response deleteById(@PathParam("id") Integer id) {
        Car entity = carService.findById(id);
        if (entity == null) {
            return Response.status(Status.NOT_FOUND).build();
        }
        carService.remove(entity);
        return Response.noContent().build();
    }

    @GET
    @Path("/{id:[0-9][0-9]*}")
    @Produces("application/json")
    public Response findById(@PathParam("id") Integer id) {
        Car entity;
        try {
            entity = carService.findById(id);
        } catch (NoResultException nre) {
            entity = null;
        }

        if (entity == null) {
            return Response.status(Status.NOT_FOUND).build();
        }
        return Response.ok(entity).build();
    }

    @GET
    @Produces("application/json")
    @Path("list")
    public List<Car> listAll(@QueryParam("start") @DefaultValue("0") Integer startPosition, @QueryParam("max") @DefaultValue("10") Integer maxResult) {
        Filter<Car> filter = new Filter<>();
        filter.setFirst(startPosition).setPageSize(maxResult);
        final List<Car> results = carService.paginate(filter);
        return results;
    }

    @PUT
    @Path("/{id:[0-9][0-9]*}")
    @Consumes("application/json")
    public Response update(@PathParam("id") Integer id,  Car entity) {
        if (entity == null) {
            return Response.status(Status.BAD_REQUEST).build();
        }
        if (!id.equals(entity.getId())) {
            return Response.status(Status.CONFLICT).entity(entity).build();
        }
        if (carService.crud().eq("id",id).count() == 0) {
            return Response.status(Status.NOT_FOUND).build();
        }
        try {
            carService.update(entity);
        } catch (OptimisticLockException e) {
            return Response.status(Response.Status.CONFLICT).entity(e.getEntity()).build();
        }

        return Response.noContent().build();
    }
}

I have only replaced the EntityManager used by Forge with CarService, which was created in previous posts; the rest of the code was generated by Forge. This step was really straightforward, thanks to Forge.

REST Authentication

To authenticate the client before calling the REST endpoint I’ve created a CDI interceptor:

@RestSecured
@Interceptor
public class RestSecuredImpl implements Serializable{

    @Inject
    CustomAuthorizer authorizer;

    @Inject
    Instance<HttpServletRequest> request;

    @AroundInvoke
    public Object invoke(InvocationContext context) throws Exception {
        String currentUser = request.get().getHeader("user");
         if( currentUser != null){
             authorizer.login(currentUser);
         } else{
             throw new CustomException("Access forbidden");
         }
        return context.proceed();
    }

}

 

So for this app we are getting the current user from an HTTP header named user. If the interceptor doesn’t find the header it will throw a CustomException, which is explained in the next section. Note that only endpoints annotated with @RestSecured will be intercepted. Authorization is done by CustomAuthorizer.
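The @RestSecured annotation itself is not shown in the post; a minimal sketch of it, following the standard CDI interceptor binding pattern, would look like this (remember that the interceptor also has to be enabled, e.g. in beans.xml):

import java.lang.annotation.ElementType;
import java.lang.annotation.Retention;
import java.lang.annotation.RetentionPolicy;
import java.lang.annotation.Target;

import javax.interceptor.InterceptorBinding;

// interceptor binding that ties endpoint methods to RestSecuredImpl
@InterceptorBinding
@Retention(RetentionPolicy.RUNTIME)
@Target({ElementType.TYPE, ElementType.METHOD})
public @interface RestSecured {
}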

Verifying Authorization

Authorization is performed by CustomAuthorizer, which is based on the DeltaSpike security module. A very simple authorizer was created; it is based on the username and stores the logged-in user in a HashMap:

@ApplicationScoped
public class CustomAuthorizer implements Serializable {

    Map<String, String> currentUser = new HashMap<>();

    @Secures
    @Admin
    public boolean doAdminCheck(InvocationContext invocationContext, BeanManager manager) throws Exception {
        boolean allowed = currentUser.containsKey("user") && currentUser.get("user").equals("admin");
        if(!allowed){
            throw new CustomException("Access denied");
        }
        return allowed;
    }

   
    public void login(String username) {
        currentUser.put("user", username);
    }
}

When authorization fails (check method returns false) we are throwing another CustomException.

I have created a Rest Provider to map CustomException into Response types:

@Provider
public class CustomExceptionMapper implements ExceptionMapper<CustomException> {

    @Override
    public Response toResponse(CustomException e) {
        Map map = new HashMap();
        map.put("message", e.getMessage());

        if (e.getMessage().equals("Access forbidden")) {//TODO create specific exception and its mapper
            return Response.status(Response.Status.FORBIDDEN).type(MediaType.APPLICATION_JSON).entity(map).build();
        }
        if (e.getMessage().equals("Access denied")) {//TODO create specific exception and its mapper
            return Response.status(Response.Status.UNAUTHORIZED).type(MediaType.APPLICATION_JSON).entity(map).build();
        }
        return Response.status(Response.Status.BAD_REQUEST).type(MediaType.APPLICATION_JSON).entity(map).build();
    }
}

I have only added authentication to delete endpoint:

    @DELETE
    @Path("/{id:[0-9][0-9]*}")
    @RestSecured
    public Response deleteById(@PathParam("id") Integer id) {
        Car entity = carService.findById(id);
        if (entity == null) {
            return Response.status(Status.NOT_FOUND).build();
        }
        carService.remove(entity);
        return Response.noContent().build();
    }

Basically I added the @RestSecured annotation. It means that if a client fires a request to this endpoint without providing a user in the HTTP header, the method will not be called and the response will be 403. If the client provides a user but it is not allowed, then the HTTP response will be 401.

For authorization we use @Admin in the service method:

@Stateless
public class CarService extends CrudService<Car> {

    @Override
    @Admin
    public void remove(Car car) {
        super.remove(car);
    }
}

@Admin activates our CustomAuthorizer, which verifies whether the current user has authorization to execute the annotated method.
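The @Admin annotation definition is not shown in the post either; a minimal sketch, assuming the usual DeltaSpike @SecurityBindingType pattern consumed by the @Secures method in CustomAuthorizer, would be:

import java.lang.annotation.ElementType;
import java.lang.annotation.Retention;
import java.lang.annotation.RetentionPolicy;
import java.lang.annotation.Target;

import org.apache.deltaspike.security.api.authorization.SecurityBindingType;

// security binding checked by CustomAuthorizer.doAdminCheck()
@SecurityBindingType
@Retention(RetentionPolicy.RUNTIME)
@Target({ElementType.TYPE, ElementType.METHOD})
public @interface Admin {
}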

Testing the REST Endpoint

To test CarEndpoint I have used Arquillian, REST Assured and DBUnit. Before the tests our database is populated with 4 cars described in the car.yml dataset:

car:
  - id: 1
    model: "Ferrari"
    price: 2450.8
  - id: 2
    model: "Mustang"
    price: 12999.0
  - id: 3
    model: "Porche"
    price: 1390.3
  - id: 4
    model: "Porche274"
    price: 18990.23

I’ve implemented tests for all CRUD and HTTP operations. I will show only the List and DELETE tests; other tests can be found in CrudRest.java. Here is what the list all cars test looks like:


    @Test
    public void shouldListCars() {
        given().
                queryParam("start",0).queryParam("max", 10).
        when().
                get(basePath + "rest/cars/list").
        then().
                statusCode(Response.Status.OK.getStatusCode()).
                body("", hasSize(4)).//dataset has 4 cars
                body("model", hasItem("Ferrari")).
                body("price", hasItem(2450.8f)).
                body(containsString("Porche274"));
    }

For the DELETE method I have one test that fails authentication, another that fails authorization and one which can delete a car:


    @Test
    public void shouldFailToDeleteCarWithoutAuthentication() {
        given().
                contentType(ContentType.JSON).
                when().
                delete(basePath + "rest/cars/1").  //dataset has car with id =1
                then().
                statusCode(Response.Status.FORBIDDEN.getStatusCode());
    }

    @Test
    public void shouldFailToDeleteCarWithoutAuthorization() {
        given().
                contentType(ContentType.JSON).
                header("user", "guest"). //only admin can delete
                when().
                delete(basePath + "rest/cars/1").  //dataset has car with id =1
                then().
                statusCode(Response.Status.UNAUTHORIZED.getStatusCode());
    }

    @Test
    public void shouldDeleteCar() {
        given().
                contentType(ContentType.JSON).
                header("user","admin").
        when().
                delete(basePath + "rest/cars/1").  //dataset has car with id =1
        then().
                statusCode(Response.Status.NO_CONTENT.getStatusCode());

        //ferrari should not be in db anymore
        given().
        when().
                get(basePath + "rest/cars/list").
         then().
                statusCode(Response.Status.OK.getStatusCode()).
                body("", hasSize(3)).
                body("model", not(hasItem("Ferrari")));
    }

Generating the REST API Documentation

To generate the API documentation I will use Swagger, which is a specification for REST APIs. Swagger is composed of various components; the main ones are:

  • swagger-spec: describes the format of REST APIs
  • swagger-codegen: generates REST clients based on the swagger spec
  • swagger-ui: generates web pages describing the API based on the swagger spec
  • swagger-editor: lets you design swagger specifications from scratch, using a simple YAML structure

Instead of using “pure” Swagger, which requires its own annotations, I will use swagger-jaxrs-doclet, which is based on Javadoc and leverages JAX-RS annotations.

The first thing to do is to copy the swagger-ui distribution (the swagger-ui I’ve used can be found here; I’ve made minor changes to index.html) to your application in webapp/apidocs as in the image below:

img04

Now we just have to generate the swagger spec files based on our REST endpoints. This is done by the doclet Maven plugin:


            <plugin>
                <groupId>org.apache.maven.plugins</groupId>
                <artifactId>maven-javadoc-plugin</artifactId>
                <version>2.9.1</version>
                <executions>
                    <execution>
                        <id>generate-service-docs</id>
                        <phase>generate-resources</phase>
                        <configuration>
                            <doclet>com.carma.swagger.doclet.ServiceDoclet</doclet>
                            <docletArtifact>
                                <groupId>com.carma</groupId>
                                <artifactId>swagger-doclet</artifactId>
                                <version>1.0.2</version>
                            </docletArtifact>
                            <reportOutputDirectory>src/main/webapp</reportOutputDirectory>
                            <useStandardDocletOptions>false</useStandardDocletOptions>
                            <additionalparam>-apiVersion 1 -docBasePath /cdi-crud/apidocs
                                -apiBasePath /cdi-crud/rest
                                -swaggerUiPath ${project.build.directory}/
                            </additionalparam>
                        </configuration>
                        <goals>
                            <goal>javadoc</goal>
                        </goals>
                    </execution>
                </executions>
            </plugin>

The plugin has 3 main configurations:

  • docBasePath: where the swagger spec files (.json) will be generated
  • apiBasePath: path used in the calls made from the API documentation (swagger-ui generates executable documentation)
  • swaggerUiPath: the plugin can also generate the swagger-ui distribution. As I am copying the dist manually I do not use this option and point it to the target folder (in fact I could not get it working well, so I need to play more with this option).

With this configuration the swagger spec files will be generated on each build, which keeps the documentation and the API synchronized; see the .json spec files (in red):

img05

Now you can access your REST API documentation at the /apidocs URL:

img06

You can also fire REST requests through the API docs:

img12

We can also enhance the documentation via Javadoc; for example, we can add @status tags to describe the response codes. See the modified delete method:


   /**
     * @description deletes a car based on its ID
     * @param user name of the user to log in
     * @param id car ID
     * @status 401 only authorized users can access this resource
     * @status 403 only authenticated users can access this resource
     * @status 404 car not found
     * @status 204 Car deleted successfully
     */
    @DELETE
    @Path("/{id:[0-9][0-9]*}")
    @RestSecured
    public Response deleteById(@HeaderParam("user") String user, @PathParam("id") Integer id) {
        Car entity = carService.findById(id);
        if (entity == null) {
            return Response.status(Status.NOT_FOUND).build();
        }
        carService.remove(entity);
        return Response.noContent().build();
    }

And here is the generated doc (note that I’ve added @HeaderParam so we can authenticate through the documentation page):

img11

All supported doclet annotations can be found here.

This app and its REST documentation are available at OpenShift here: http://cdicrud-rpestano.rhcloud.com/cdi-crud/apidocs. There is also a simple car CRUD app. To see a “bit more elaborate” documentation generated by Swagger and the doclet, see the Carma REST API.

Arquillian + Cucumber + DBUnit

Arquillian and Cucumber integration is provided by the Cukespace project, a very active project with nice examples.

One issue I’ve found with Cukespace (also with Arquillian JBehave) is that I can’t use the arquillian-persistence extension to initialize my scenario datasets. The main cause is that the Cucumber test lifecycle is different from JUnit/TestNG, so Arquillian isn’t aware when a test or Cucumber event is occurring; e.g. the Arquillian org.jboss.arquillian.test.spi.event.suite.* events aren’t triggered during Cucumber test execution.

There is an open issue about this in the Cukespace project.

As Arquillian Persistence uses DBUnit behind the scenes, I worked around that limitation (while the issue isn’t solved) using the pure DBUnit API. Here is some sample code; it can be found here.

CrudBdd.java

@RunWith(ArquillianCucumber.class)
@Features("features/search-cars.feature")
@Tags("@whitebox")
public class CrudBdd {

    @Inject
    CarService carService;

    Car carFound;

    int numCarsFound;

    @Deployment(name = "cdi-crud.war")
    public static Archive<?> createDeployment() {
        WebArchive war = Deployments.getBaseDeployment();
        war.addAsResource("datasets/car.yml", "car.yml").//needed by DBUnitUtils
                addClass(DBUnitUtils.class);
        System.out.println(war.toString(true));
        return war;
    }

    @Before
    public void initDataset() {
        DBUnitUtils.createDataset("car.yml");
    }

    @After
    public void clear() {
        DBUnitUtils.deleteDataset("car.yml");
    }

    @Given("^search car with model \"([^\"]*)\"$")
    public void searchCarWithModel(String model) {
        Car carExample = new Car();
        carExample.setModel(model);
        carFound = carService.findByExample(carExample);
        assertNotNull(carFound);
    }

    @When("^update model to \"([^\"]*)\"$")
    public void updateModel(String model) {
        carFound.setModel(model);
        carService.update(carFound);
    }

    @Then("^searching car by model \"([^\"]*)\" must return (\\d+) of records$")
    public void searchingCarByModel(final String model, final int result) {
        Car carExample = new Car();
        carExample.setModel(model);
        assertEquals(result, carService.crud().example(carExample).count());
    }

    @When("^search car with price less than (.+)$")
    public void searchCarWithPrice(final double price) {
        numCarsFound = carService.crud().initCriteria().le("price", price).count();
    }

    @Then("^must return (\\d+) cars")
    public void mustReturnCars(final int result) {
        assertEquals(result, numCarsFound);
    }

}

DBUnitUtils is a simple class that deals with the DBUnit API. The idea is to initialize the database in the Before and After hooks, which are triggered on each scenario execution. They are also triggered for each ‘example’, which is very important so we can have a clean database on each execution.

Here is my feature file:

Feature: Search cars

@whitebox
Scenario Outline: simple search and update
Given search car with model "Ferrari"
When update model to "Audi"
Then searching car by model "<model>" must return <number> of records
Examples:
| model | number |
| Audi  | 1      |
| outro | 0      |

@whitebox
Scenario Outline: search car by price
When search car with price less than <price>
Then must return <number> cars
Examples:
| price   | number |
| 1390.2  | 0      |
| 1390.3  | 1      |
| 10000.0 | 2      |
| 13000.0 | 3      |

@blackbox
Scenario Outline: search car by id
When search car by id <id>
Then must find car with model "<model>" and price <price>
Examples:
| id | model   | price   |
| 1  | Ferrari | 2450.8  |
| 2  | Mustang | 12999.0 |
| 3  | Porche  | 1390.3  |

DBUnitUtils.java

public class DBUnitUtils {

    private static DataSource ds;
    private static DatabaseConnection databaseConnection;

    public static void createDataset(String dataset) {

        if (!dataset.startsWith("/")) {
            dataset = "/" + dataset;
        }
        try {
            initConn();
            DatabaseOperation.CLEAN_INSERT.execute(databaseConnection,
                    new YamlDataSet(DBUnitUtils.class.getResourceAsStream(dataset)));
        } catch (Exception e) {
            e.printStackTrace();
            throw new RuntimeException("could not initialize dataset:" + dataset +
                    " \nmessage: " + e.getMessage());
        } finally {
            closeConn();
        }
    }

    public static void deleteDataset(String dataset) {
        if (!dataset.startsWith("/")) {
            dataset = "/" + dataset;
        }
        try {
            initConn();
            DatabaseOperation.DELETE_ALL.execute(databaseConnection,
                    new YamlDataSet(DBUnitUtils.class.getResourceAsStream(dataset)));
        } catch (Exception e) {
            e.printStackTrace();
            throw new RuntimeException("could not delete dataset:" + dataset +
                    " \nmessage: " + e.getMessage());
        } finally {
            closeConn();
        }
    }

    private static void closeConn() {
        try {
            if (databaseConnection != null && !databaseConnection.getConnection().isClosed()) {
                databaseConnection.getConnection().close();
            }
        } catch (SQLException e) {
            e.printStackTrace();
            throw new RuntimeException("could not close connection \nmessage: " + e.getMessage());
        }

    }

    private static void initConn() throws SQLException, NamingException, DatabaseUnitException {
        if (ds == null) {
            ds = (DataSource) new InitialContext()
                    .lookup("java:jboss/datasources/ExampleDS");
        }
        databaseConnection = new DatabaseConnection(ds.getConnection());
    }
}

And car.yml:

car:
  - id: 1
    model: "Ferrari"
    price: 2450.8
  - id: 2
    model: "Mustang"
    price: 12999.0
  - id: 3
    model: "Porche"
    price: 1390.3
  - id: 4
    model: "Porche274"
    price: 18990.23


DBUnit REST Endpoint

Another limitation of the persistence extension is its integration with functional tests (blackbox/RunAsClient/testable=false); see this issue. Basically Arquillian Persistence needs server-side resources like the datasource to work, but blackbox tests run outside the container, in a separate JVM.

To overcome that limitation I’ve created a DBUnit REST endpoint and deployed it within my test, so I can make REST calls to the server and create the dataset there, where I have all the needed resources. Here is the DBUnitUtils with REST calls:



public class DBUnitUtils {

    private static DataSource ds;
    private static DatabaseConnection databaseConnection;

  public static void createRemoteDataset(URL context, String dataset) {
        HttpURLConnection con = null;
        try {
            URL obj = new URL(context + "rest/dbunit/create/" + dataset);
            con = (HttpURLConnection) obj.openConnection();
            con.setRequestMethod("GET");
            con.setDoOutput(true);
            int responseCode = con.getResponseCode();
            if (responseCode != 200) {
                throw new RuntimeException("Could not create remote dataset\nstatus:" + responseCode + "\nerror:" + con.getResponseMessage());
            }
        } catch (Exception e) {
            e.printStackTrace();
        } finally {
            if (con != null) {
                con.disconnect();
            }
        }
    }

    public static void deleteRemoteDataset(URL context, String dataset) {
        HttpURLConnection con = null;
        try {
            URL obj = new URL(context + "rest/dbunit/delete/" + dataset);
            con = (HttpURLConnection) obj.openConnection();
            con.setRequestMethod("GET");
            con.setDoOutput(true);

            int responseCode = con.getResponseCode();
            if (responseCode != 200) {
                throw new RuntimeException("Could not delete remote dataset\nstatus:" + responseCode + "\nerror:" + con.getResponseMessage());
            }
        } catch (Exception e) {
            e.printStackTrace();
        } finally {
            if (con != null) {
                con.disconnect();
            }
        }
    }

}

And DBUnitRest.java:

@Path("/dbunit")
public class DBUnitRest {

    @GET
    @Path("create/{dataset}")
    public Response createDataset(@PathParam("dataset") String dataset) {
        try {
            DBUnitUtils.createDataset(dataset);//i feel like going in circles
        } catch (Exception e) {
            return Response.status(Status.BAD_REQUEST).entity("Could not create dataset.\nmessage:" + e.getMessage() + "\ncause:" + e.getCause()).build();
        }
        return Response.ok("dataset created successfully").build();
    }

    @GET
    @Path("delete/{dataset}")
    public Response deleteDataset(@PathParam("dataset") String dataset) {
        try {
            DBUnitUtils.deleteDataset(dataset);//i feel like going in circles
        } catch (Exception e) {
            return Response.status(Status.BAD_REQUEST).entity("Could not delete dataset.\nmessage:" + e.getMessage() + "\ncause:" + e.getCause()).build();
        }
        return Response.ok("dataset deleted successfully").build();
    }

}


And here is my functional test, which uses Drone and Graphene:


@RunWith(ArquillianCucumber.class)
@Features("features/search-cars.feature")
@Tags("@blackbox")
public class CrudAt {

  @Deployment(name = "cdi-crud.war", testable=false)
  public static Archive<?> createDeployment() {
    WebArchive war = Deployments.getBaseDeployment();
        war.addAsResource("datasets/car.yml","car.yml").//needed by DBUnitUtils
                addPackage(DBUnitUtils.class.getPackage()).addClass(CrudBean.class).addClass(YamlDataSet.class).
                addClass(YamlDataSetProducer.class).
                addClass(Row.class).addClass(Table.class).addClass(DBUnitRest.class);

        war.merge(ShrinkWrap.create(GenericArchive.class).as(ExplodedImporter.class).importDirectory("src/main/webapp").as(GenericArchive.class), "/", Filters.include(".*\\.(xhtml|html|css|js|png|gif)$"));
        MavenResolverSystem resolver = Maven.resolver();
        war.addAsLibraries(resolver.loadPomFromFile("pom.xml").resolve("org.dbunit:dbunit:2.5.0").withoutTransitivity().asSingleFile());
        war.addAsLibraries(resolver.loadPomFromFile("pom.xml").resolve("org.yaml:snakeyaml:1.10").withoutTransitivity().asSingleFile());
    System.out.println(war.toString(true));
    return war;
  }

  @ArquillianResource
  URL url;

  @Drone
  WebDriver webDriver;

  @Page
  IndexPage index;

  @Before
  public void initDataset() {
      DBUnitUtils.createRemoteDataset(url,"car.yml");
  }

  @After
  public void clear(){
      DBUnitUtils.deleteRemoteDataset(url,"car.yml");
   }

  @When("^search car by id (\\d+)$")
  public void searchCarById(int id){
      Graphene.goTo(IndexPage.class);
      index.findById(""+id);
  }

  @Then("^must find car with model \"([^\"]*)\" and price (.+)$")
  public void returnCarsWithModel(String model, final double price){
    assertEquals(model,index.getInputModel().getAttribute("value"));
    assertEquals(price,Double.parseDouble(index.getInputPrice().getAttribute("value")),0);
  }

}
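For completeness, here is a minimal sketch of what the IndexPage page object used above could look like (the element locators and the @Location path are assumptions; the real page object is in the sample project):

import org.jboss.arquillian.graphene.Graphene;
import org.jboss.arquillian.graphene.page.Location;
import org.openqa.selenium.WebElement;
import org.openqa.selenium.support.FindBy;

@Location("index.xhtml") //page opened by Graphene.goTo(IndexPage.class)
public class IndexPage {

    @FindBy(id = "inputId")
    private WebElement inputId;

    @FindBy(id = "inputModel")
    private WebElement inputModel;

    @FindBy(id = "inputPrice")
    private WebElement inputPrice;

    @FindBy(id = "findButton")
    private WebElement findButton;

    public void findById(String id) {
        inputId.clear();
        inputId.sendKeys(id);
        Graphene.guardAjax(findButton).click(); //waits for the ajax request triggered by the click
    }

    public WebElement getInputModel() {
        return inputModel;
    }

    public WebElement getInputPrice() {
        return inputPrice;
    }
}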

Here are the Cucumber reports:

whitebox

blackbox

That’s all folks.