Arquillian + Cucumber + DBUnit

Arquillian and Cucumber integration is provided by the Cukespace project, a very active project with nice examples.

One issue I’ve found with cukespace (and also with arquillian jbehave) is that I can’t use the arquillian-persistence extension to initialize my scenario datasets. The main cause is that the cucumber test lifecycle is different from junit/testng, so arquillian isn’t aware when a test or cucumber event is occurring; e.g. the arquillian org.jboss.arquillian.test.spi.event.suite.* events aren’t triggered during cucumber test execution.

There is an open issue about this at the cukespace project.

As arquillian persistence uses DBUnit behind the scenes, I worked around that limitation (while the issue isn’t solved) using the pure DBUnit API. Here is some sample code; it can be found here.

CrudBdd.java

@RunWith(ArquillianCucumber.class)
@Features("features/search-cars.feature")
@Tags("@whitebox")
public class CrudBdd {

    @Inject
    CarService carService;

    Car carFound;

    int numCarsFound;

    @Deployment(name = "cdi-crud.war")
    public static Archive<?> createDeployment() {
        WebArchive war = Deployments.getBaseDeployment();
        war.addAsResource("datasets/car.yml", "car.yml").//needed by DBUnitUtils
                addClass(DBUnitUtils.class);
        System.out.println(war.toString(true));
        return war;
    }

    @Before
    public void initDataset() {
        DBUnitUtils.createDataset("car.yml");
    }

    @After
    public void clear() {
        DBUnitUtils.deleteDataset("car.yml");
    }

    @Given("^search car with model \"([^\"]*)\"$")
    public void searchCarWithModel(String model) {
        Car carExample = new Car();
        carExample.setModel(model);
        carFound = carService.findByExample(carExample);
        assertNotNull(carFound);
    }

    @When("^update model to \"([^\"]*)\"$")
    public void updateModel(String model) {
        carFound.setModel(model);
        carService.update(carFound);
    }

    @Then("^searching car by model \"([^\"]*)\" must return (\\d+) of records$")
    public void searchingCarByModel(final String model, final int result) {
        Car carExample = new Car();
        carExample.setModel(model);
        assertEquals(result, carService.crud().example(carExample).count());
    }

    @When("^search car with price less than (.+)$")
    public void searchCarWithPrice(final double price) {
        numCarsFound = carService.crud().initCriteria().le("price", price).count();
    }

    @Then("^must return (\\d+) cars")
    public void mustReturnCars(final int result) {
        assertEquals(result, numCarsFound);
    }

}

DBUnitUtils is a simple class that deals with the DBUnit API. The idea is to create and delete the dataset in the Before and After events, which are triggered on each scenario execution; they are also triggered for each ‘example’, which is very important so we have a clean database on each execution.

Here is my feature file:

Feature: Search cars

@whitebox
Scenario Outline: simple search and update
Given search car with model "Ferrari"
When update model to "Audi"
Then searching car by model "<model>" must return <number> of records
Examples:
| model | number |
| Audi  | 1      |
| outro | 0      |

@whitebox
Scenario Outline: search car by price
When search car with price less than <price>
Then must return <number> cars
Examples:
| price   | number |
| 1390.2  | 0      |
| 1390.3  | 1      |
| 10000.0 | 2      |
| 13000.0 | 3      |

@blackbox
Scenario Outline: search car by id
When search car by id <id>
Then must find car with model "<model>" and price <price>
Examples:
| id | model   | price   |
| 1  | Ferrari | 2450.8  |
| 2  | Mustang | 12999.0 |
| 3  | Porche  | 1390.3  |

DBUnitUtils.java

public class DBUnitUtils {

    private static DataSource ds;
    private static DatabaseConnection databaseConnection;

    public static void createDataset(String dataset) {

        if (!dataset.startsWith("/")) {
            dataset = "/" + dataset;
        }
        try {
            initConn();
            DatabaseOperation.CLEAN_INSERT.execute(databaseConnection,
new YamlDataSet(DBUnitUtils.class.getResourceAsStream(dataset)));
        } catch (Exception e) {
            e.printStackTrace();
            throw new RuntimeException("could not initialize dataset:" + dataset +
" \nmessage: " + e.getMessage());
        } finally {
            closeConn();
        }
    }

    public static void deleteDataset(String dataset) {
        if (!dataset.startsWith("/")) {
            dataset = "/" + dataset;
        }
        try {
            initConn();
            DatabaseOperation.DELETE_ALL.execute(databaseConnection,
new YamlDataSet(DBUnitUtils.class.getResourceAsStream(dataset)));
        } catch (Exception e) {
            e.printStackTrace();
            throw new RuntimeException("could not delete dataset dataset:" + dataset +
" \nmessage: " + e.getMessage());
        } finally {
            closeConn();
        }
    }

    private static void closeConn() {
        try {
            if (databaseConnection != null && !databaseConnection.getConnection().isClosed()) {
                databaseConnection.getConnection().close();
            }
        } catch (SQLException e) {
            e.printStackTrace();
            throw new RuntimeException("could not close conection \nmessage: " + e.getMessage());
        }

    }

    private static void initConn() throws SQLException, NamingException, DatabaseUnitException {
        if (ds == null) {
            ds = (DataSource) new InitialContext()
.lookup("java:jboss/datasources/ExampleDS");
        }
        databaseConnection = new DatabaseConnection(ds.getConnection());
    }
}

and car.yml

car:
  - id: 1
    model: "Ferrari"
    price: 2450.8
  - id: 2
    model: "Mustang"
    price: 12999.0
  - id: 3
    model: "Porche"
    price: 1390.3
  - id: 4
    model: "Porche274"
    price: 18990.23

 

DBUnit REST endpoint

Another limitation of the persistence extension is its integration with functional tests (blackbox/RunAsClient/testable=false), see this issue. Basically arquillian persistence needs server side resources, like the datasource, to work, but blackbox tests run outside the container, in a separate JVM.

To overcome that limitation I’ve created a DBUnit REST endpoint and deployed it within my test, so I can make REST calls to the server and create the dataset there, where I have all the needed resources. Here is the DBUnitUtils with REST calls:

 


public class DBUnitUtils {

    private static DataSource ds;
    private static DatabaseConnection databaseConnection;

  public static void createRemoteDataset(URL context, String dataset) {
        HttpURLConnection con = null;
        try {
            URL obj = new URL(context + "rest/dbunit/create/" + dataset);
            con = (HttpURLConnection) obj.openConnection();
            con.setRequestMethod("GET");
            con.setDoOutput(true);
            int responseCode = con.getResponseCode();
            if (responseCode != 200) {
                throw new RuntimeException("Could not create remote dataset\nstatus:" + responseCode + "\nerror:" + con.getResponseMessage());
            }
        } catch (Exception e) {
            e.printStackTrace();
        } finally {
            if (con != null) {
                con.disconnect();
            }
        }
    }

    public static void deleteRemoteDataset(URL context, String dataset) {
        HttpURLConnection con = null;
        try {
            URL obj = new URL(context + "rest/dbunit/delete/" + dataset);
            con = (HttpURLConnection) obj.openConnection();
            con.setRequestMethod("GET");
            con.setDoOutput(true);

            int responseCode = con.getResponseCode();
            if (responseCode != 200) {
                throw new RuntimeException("Could not create remote dataset\nstatus:" + responseCode + "\nerror:" + con.getResponseMessage());
            }
        } catch (Exception e) {
            e.printStackTrace();
        } finally {
            if (con != null) {
                con.disconnect();
            }
        }
    }

}

and DBUnitRest.java

@Path("/dbunit")
public class DBUnitRest {

    @GET
    @Path("create/{dataset}")
    public Response createDataset(@PathParam("dataset") String dataset) {
        try {
            DBUnitUtils.createDataset(dataset);//i feel like going in circles
        } catch (Exception e) {
            return Response.status(Status.BAD_REQUEST).entity("Could not create dataset.\nmessage:" + e.getMessage() + "\ncause:" + e.getCause()).build();
        }
        return Response.ok("dataset created sucessfully").build();
    }

    @GET
    @Path("delete/{dataset}")
    public Response deleteDataset(@PathParam("dataset") String dataset) {
        try {
            DBUnitUtils.deleteDataset(dataset);//i feel like going in circles
        } catch (Exception e) {
            return Response.status(Status.BAD_REQUEST).entity("Could not delete dataset.\nmessage:" + e.getMessage() + "\ncause:" + e.getCause()).build();
        }
        return Response.ok("dataset deleted sucessfully").build();
    }

}

 

and here is my functional test which uses Drone and Graphene:


@RunWith(ArquillianCucumber.class)
@Features("features/search-cars.feature")
@Tags("@blackbox")
public class CrudAt {

  @Deployment(name = "cdi-crud.war", testable=false)
  public static Archive<?> createDeployment() {
    WebArchive war = Deployments.getBaseDeployment();
        war.addAsResource("datasets/car.yml","car.yml").//needed by DBUnitUtils
                addPackage(DBUnitUtils.class.getPackage()).addClass(CrudBean.class).addClass(YamlDataSet.class).
                addClass(YamlDataSetProducer.class).
                addClass(Row.class).addClass(Table.class).addClass(DBUnitRest.class);

        war.merge(ShrinkWrap.create(GenericArchive.class).as(ExplodedImporter.class).importDirectory("src/main/webapp").as(GenericArchive.class), "/", Filters.include(".*\\.(xhtml|html|css|js|png|gif)$"));
        MavenResolverSystem resolver = Maven.resolver();
        war.addAsLibraries(resolver.loadPomFromFile("pom.xml").resolve("org.dbunit:dbunit:2.5.0").withoutTransitivity().asSingleFile());
        war.addAsLibraries(resolver.loadPomFromFile("pom.xml").resolve("org.yaml:snakeyaml:1.10").withoutTransitivity().asSingleFile());
    System.out.println(war.toString(true));
    return war;
  }

  @ArquillianResource
  URL url;

  @Drone
  WebDriver webDriver;

  @Page
  IndexPage index;

  @Before
  public void initDataset() {
      DBUnitUtils.createRemoteDataset(url,"car.yml");
  }

  @After
  public void clear(){
      DBUnitUtils.deleteRemoteDataset(url,"car.yml");
   }

  @When("^search car by id (\\d+)$")
  public void searchCarById(int id){
      Graphene.goTo(IndexPage.class);
      index.findById(""+id);
  }

  @Then("^must find car with model \"([^\"]*)\" and price (.+)$")
  public void returnCarsWithModel(String model, final double price){
    assertEquals(model,index.getInputModel().getAttribute("value"));
    assertEquals(price,Double.parseDouble(index.getInputPrice().getAttribute("value")),0);
  }

}

Here are the cucumber reports:

[Cucumber report: whitebox scenarios]

[Cucumber report: blackbox scenarios]

That’s all folks.

 


CDI Crud Multi “Tenancy”

I’ve just added a multi tenancy example to our cdi-crud example: now the Car and Movie tables are in different datasources. In fact our example is not true multi tenancy, where you have the same boundaries but different schemas so you can have multiple clients on the same application, each one using its own database; see [1] and [2] for EclipseLink and Hibernate native support for multi tenancy, respectively. The example below is a simple approach to switch between datasources using CDI, although some changes to our TenantController, e.g. using a system property to define the tenant or using alternatives like [3], could add some more ‘tenantness’.

As usual there are some arquillian integration tests here.

I will summarize the idea, as I’m out of time (also as usual); here are the steps:

Changed entityManager resolution in our Crud/GenericDao/NanoService/whatever class FROM:

@PersistenceContext
EntityManager em;

public EntityManager getEntityManager(){
    return em;
}

TO:

@Inject
TenantController tenantController;
TenantType tenantType;

public EntityManager getEntityManager(){
    return tenantController.getTenant(tenantType);
}

WHERE TenantController simply has all entityManagers injected and decides which one to return based on the TenantType:

public class TenantController {

	@PersistenceContext(unitName="CarPU")
	EntityManager carEm;

	@PersistenceContext(unitName="MoviePU")
	EntityManager movieEm;

	public EntityManager getTenant(TenantType type){
		switch (type) {
		case CAR:
			return carEm;
		case MOVIE:
			return movieEm;
			default: {
				Logger.getLogger(getClass().getCanonicalName()).info("no tenant provided, resolving to CarPU");
				return carEm;//default tenant; consider throwing instead so a tenant is never resolved by "accident"
			}
		}
	}

}

TenantType is passed via Annotation to each service:

@Stateless
@Tenant(TenantType.MOVIE)
public class MovieService extends CrudService<Movie> {

}

or via an injection point into the generic Crud:

@Stateless
@Tenant(TenantType.MOVIE)//not a qualifier just an inherited annotation
public class MovieService extends CrudService<Movie> {

@Inject
@Tenant(TenantType.CAR)
Crud<Car> carCrud;//becareful to not pass wrong entity in generics, see this test:<a href="https://github.com/rmpestano/cdi-crud/blob/master/src/test/java/com/cdi/crud/test/MultiTenantIt.java#L93" target="_blank">https://github.com/rmpestano/cdi-crud/blob/master/src/test/java/com/cdi/crud/test/MultiTenantIt.java#L93</a>

}

Or programmatically:

@Stateless
public class MovieService extends CrudService<Movie> {

public void someMethod(){
   super.crud(TenantType.CAR).listAll();
  }
}

I’m not producing qualified entityManagers so that I don’t end up with a qualified Crud.java or qualified services.
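For contrast, the qualified producer approach would look something like the sketch below; it is hypothetical and not used in cdi-crud, and it assumes @Tenant were redefined as a CDI @Qualifier. Every Crud and service injection point would then need the qualifier too, which is exactly what I wanted to avoid:

//hypothetical alternative: qualified resource producers (assumes @Tenant is a CDI qualifier)
public class EntityManagerProducer {

    @Produces
    @Tenant(TenantType.CAR)
    @PersistenceContext(unitName = "CarPU")
    EntityManager carEm;

    @Produces
    @Tenant(TenantType.MOVIE)
    @PersistenceContext(unitName = "MoviePU")
    EntityManager movieEm;
}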

That’s it. And how about you, how do you deal with multiple databases in your applications?

[1] http://wiki.eclipse.org/EclipseLink/Development/Indigo/Multi-Tenancy
[2] http://docs.jboss.org/hibernate/orm/4.2/devguide/en-US/html/ch16.html
[3] http://antoniogoncalves.org/2014/05/25/switch-datasource-with-cdi-alternatives-and-stereotypes/
[4] http://lambda-et-al.eu/multi-tenancy-with-jee-and-jboss/

Arquillian and Mocks

Although it’s a bit contradictory to talk about Arquillian and mocks, sometimes you may need to skip operations which are not relevant to your tests, for example:

  • Replicating data in legacy databases: this is somewhat a limitation of the arquillian persistence extension, where you can’t have multiple datasources, but in general you don’t want to test data replication; besides, if you have the legacy datasource configured in the test server you can simply fire a native query to check that the data was replicated there.
  • If you use in-memory databases such as H2 (which is somewhat a form of fake, but we use it a lot ;)) you may not have some operations, functions or procedures which are available in the real application database, so you may want to skip those calls in tests.
  • Skip CMS calls, such as Alfresco, in tests.
  • Avoid starting a mail service during tests.
  • Skip initialization of in-memory data grids, such as Hazelcast, during tests.

In my case most of these issues come from the fact that I can’t replicate all my infrastructure in tests; maybe with the arquillian docker integration they could be solved without mocks. Another relevant point is that the components I’m faking slow down my tests, and sometimes their relevance to my business logic is low.

So if you have faced any situation where you need to mock components in arquillian integration tests, here is how I am dealing with that; the source code is available here: https://github.com/rmpestano/arquillian-mocks. The examples are quite simple; if you know alternatives, don’t hesitate to comment here or send a pull request.

Here is some code:

MyBeanImpl.java


@RequestScoped
public class MyBeanImpl implements MyBean {

    private boolean alive = false;

    @PostConstruct
    public void init(){
       alive = true;
    }

    @Override
    public boolean someSlowOperation() {
        try {
            Thread.sleep(10000);
        } catch (InterruptedException e) {
            e.printStackTrace();
        }
        return true;
    }

    @Override
    public boolean isAlive() {
        return alive;
    }
}

and the integration test:


@RunWith(Arquillian.class)
public class MyBeanTest {

  @Inject
  MyBeanImpl realBean;

  @Deployment
  public static WebArchive createDeployment() {
        WebArchive war = ShrinkWrap.create(WebArchive.class, "test.war")
                .addClasses(MyBeanImpl.class, MyBean.class)
                .addAsWebInfResource("test-beans.xml", "beans.xml");
        return war;
    }

  @Test
  public void shouldRunSlowWithRealBean() {
        long start = System.currentTimeMillis();
        assertTrue(realBean.someSlowOperation());
        long executionTime = (System.currentTimeMillis() - start);
        assertTrue(executionTime >= 10000);//should take at least 10000ms
    }

}

Now here are some ways to mock the bean slow call.

1 – Using a mock framework:
You can use a mocking framework like Mockito to fake method calls. To do that you need to add mockito to the test deployment, and then you can mock the slow method call:


@RunWith(Arquillian.class)
public class MyBeanTest {

    //@Mock //to use @Mock see auto discover extension: https://github.com/arquillian/arquillian-showcase/blob/master/extensions/autodiscover
    @Inject
    MyBeanImpl realMockedBean;

    @Before
    public void setupMock() {
        realMockedBean = Mockito.mock(MyBeanImpl.class);//real bean replaced by mocked one
        Mockito.when(realMockedBean.someSlowOperation()).thenReturn(true);
    }

    @Deployment
    public static WebArchive createDeployment() {
        WebArchive war = ShrinkWrap.create(WebArchive.class, "test.war")
                .addClasses(MyBeanImpl.class, MyBean.class)
                .addAsWebInfResource("test-beans.xml", "beans.xml");
         MavenResolverSystem resolver = Maven.resolver();
         war.addAsLibraries(resolver.loadPomFromFile("pom.xml").resolve("org.mockito:mockito-all:1.10.8").withTransitivity().asSingleFile());
        return war;
    }
 
    @Test
    public void shouldRunFastWithMockedBean() {
        long start = System.currentTimeMillis();
        assertTrue(realMockedBean.someSlowOperation());
        assertTrue((System.currentTimeMillis() - start) < 10000);
    }
}

Note that this approach has some limitations:

  1. If the mocked bean is injected into another (real) bean you will have to set the mocked bean into the real one yourself (see the sketch below).
  2. When you use Mockito.mock(SomeBean.class) CDI doesn’t manage the bean anymore, meaning that you will have to fake all the behaviour you need in tests.
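To illustrate the first limitation, here is a minimal sketch (OtherBean and its setter are hypothetical names) of wiring the mock into a real bean by hand:

    @Inject
    OtherBean otherBean;//real CDI-managed bean that depends on MyBean

    @Before
    public void setupMock() {
        MyBean mocked = Mockito.mock(MyBeanImpl.class);
        Mockito.when(mocked.someSlowOperation()).thenReturn(true);
        otherBean.setMyBean(mocked);//manual wiring: CDI will not inject the mock for you
    }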

2 – CDI alternatives:

Just create an alternative implementation of the real bean and enable it in test-beans.xml:

MyBeanAlternative.java


@Alternative
public class MyBeanAlternative implements MyBean {

    @Override
    public boolean someSlowOperation() {
         return true;
    }
}

 

add the alternative bean to the test deployment and inject MyBean interface:


@RunWith(Arquillian.class)
public class MyBeanTest {

    @Inject
    MyBean myAlternativebean;

    @Deployment
    public static WebArchive createDeployment() {
        WebArchive war = ShrinkWrap.create(WebArchive.class, "test.war")
                .addClasses(MyBeanImpl.class, MyBean.class, MyBeanAlternative.class)
                .addAsWebInfResource("test-beans.xml", "beans.xml");

        return war;
    }

    @Test
    public void shouldRunFastWithAlternativeBean() {
        long start = System.currentTimeMillis();
        assertTrue(myAlternativebean.someSlowOperation());
        assertTrue((System.currentTimeMillis() - start) < 5000);//should run much faster than the real 10000ms sleep
    }
}

This is a good approach that solves the first limitation of the previous approach, but it still doesn’t solve the second one.

3 – CDI Bean Specialization

For bean specialization you don’t need to implement an interface; just extend the real bean and override the method you want to fake:


@Specializes
public class MyBeanSpecialization extends MyBeanImpl {

    @Override
    public boolean someSlowOperation()  {
        return true;
    }
}

 

And now in the test you can inject the real bean, but don’t forget to add the bean specialization to the test deployment:

@RunWith(Arquillian.class)
public class MyBeanTest {

@Inject
MyBeanImpl realBean;

 @Deployment
 public static WebArchive createDeployment() {
     WebArchive war = ShrinkWrap.create(WebArchive.class, "test.war")
             .addClasses(MyBeanImpl.class, MyBean.class, MyBeanSpecialization.class)
             .addAsWebInfResource("test-beans.xml", "beans.xml");
    return war;
    }

 @Test
 public void shouldRunFastWithSpecializedBean() {
        long start = System.currentTimeMillis();
        assertTrue(realBean.someSlowOperation());
        long executionTime = (System.currentTimeMillis() - start);
        assertTrue(executionTime < 5000);//should run much faster than the real 10000ms sleep
    }
}

This approach solves both of the mentioned limitations.

4 – Bytecode manipulation

For bytecode manipulation I will use the arquillian-byteman extension, where you can override bean methods through bytecode weaving:

@RunWith(Arquillian.class)
public class MyBeanTest {

   @Inject
   MyBeanImpl realBean;

   @Deployment
   public static WebArchive createDeployment() {
       WebArchive war = ShrinkWrap.create(WebArchive.class, "test.war")
              .addClasses(MyBeanImpl.class, MyBean.class)
              .addAsWebInfResource("test-beans.xml", "beans.xml");
       return war;
    }

    @Test
    @BMRule(
      name = "fake method call", targetClass = "MyBeanImpl",
      targetMethod = "someSlowOperation",
      action = "return true;")
    public void shouldRunFastWithBytecodeManipulatedBean() {
        long start = System.currentTimeMillis();
        assertTrue(realBean.someSlowOperation());
        assertTrue(realBean.isAlive());
        assertTrue((System.currentTimeMillis() - start) < 5000);
    }
}

 

I made it work only in the managed container; you need to add this vmArg to arquillian.xml:

<container qualifier="wildfly-managed">
        <configuration>
            <property name="outputToConsole">true</property>
            <property name="javaVmArguments">-Xmx512m -XX:MaxPermSize=256m
                -Djboss.modules.system.pkgs=com.sun.tools.attach,org.jboss.byteman -Xbootclasspath/a:${java.home}/../lib/tools.jar
            </property>
            <property name="allowConnectingToRunningServer">true</property>
    </configuration>
   <extension qualifier="byteman">
        <property name="autoInstallAgent">true</property>
        <property name="agentProperties">org.jboss.byteman.verbose=false</property>
    </extension>

I have tested it only with the wildfly application server.

Conclusion

If you somehow need mocks in your integration tests I recommend option 3, bean specialization, because with options 1 and 2 you’ll need to reimplement all the bean methods/behaviour, and option 4 doesn’t work on all containers yet.

Arquillian – The aliens are invading

I dedicate this post to my father; most of it was written while I was taking care of him at the hospital. Rest in peace, my best friend!

The idea behind this post is to share my experience with Arquillian[1], which is becoming the de facto standard framework for writing (real) tests in the JavaEE environment (hence the blog title).

We will use arquillian to test a JavaEE6 (compatible with EE7) application; all sources are available at github here. It’s a simple user, group and role management application. Also, as usual, there is a video showing what we will see in this entry: http://youtu.be/iGkCcK1EwAQ

Introduction

Arquillian is a testing platform which brings the power of real tests to Java enterprise applications by enabling the easy creation of integration, functional and behaviour tests, among others.

One of the main characteristics of tests written with arquillian is that they run inside a container (a servlet container, CDI container, JavaEE (6+) server, mobile device and so on), the so-called in-container testing[2]. With tests running inside a container the developer or test engineer doesn’t need to be concerned with server infrastructure such as EJB, JPA, CDI and JMS and can focus on the test itself.

To be able to run tests inside a container it’s necessary to specify which classes and resources will be part of the tests; this is called a “micro-deployment” because it’s usually a subset of all application resources.

Micro deployment

Creating micro-deployments is made easy by the ShrinkWrap library; here is an example:

     @Deployment//tells arquillian which method generates the deployment
     public static Archive<?> createDeployment() {
	  WebArchive war = ShrinkWrap.create(WebArchive.class);

          //adding classes
	  war.addClass(MyClass.class).addClass(AnotherClass.class)
          //adding all classes in com.mypackage and in its subpackages
          .addPackages(true, "com.mypackage");
          //adding libs
          MavenResolverSystem resolver = Maven.resolver();
	  //adds primefaces4.0.jar in web-inf/libs
          war.addAsLibraries(resolver.loadPomFromFile("pom.xml").resolve("org.primefaces:primefaces:4.0").withoutTransitivity().asFile());

	  //web-inf resources

	  //copies src/main/webapp/WEB-INF/beans.xml to micro-deployment WEB-INF folder
          war.addAsWebInfResource("src/main/webapp/WEB-INF/beans.xml", "beans.xml");

	 //resources

	 //copies src/test/resources/test-persistence.xml to META-INF/persistence.xml in the micro-deployment
         war.addAsResource("test-persistence.xml", "META-INF/persistence.xml");

	return war;
     }

Execution modes

To perform in-container tests arquillian can be configured to operate in three ways:

  • Embedded: an embedded container version is downloaded via maven at execution time and tests run on top of it.
  • Remote: arquillian connects to a container running on a local or remote machine.
  • Managed: in this mode arquillian manages the container startup/shutdown using a container installation. Note that if the container is already running locally, arquillian will connect to it like in remote mode.

Test run modes

There are two ways tests can run with arquillian: in-container and as client.

Running tests in-container means you have full control of the artifacts deployed in the micro-deployment: you can inject beans, access the persistence context, EJBs and so on. This is how we do integration/white box tests.

Running as client is the opposite: we cannot access any deployed resource. This mode simulates a client accessing our application from outside and denotes black box testing.

There is also a mixed mode, where some tests run as client and others inside the container.

Here is a detailed explanation of the arquillian test run modes.
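As a minimal sketch of the mixed mode (class and bean names are illustrative, not from the example app): the deployment stays testable, methods run in-container by default, and @RunAsClient switches a single method to client mode:

@RunWith(Arquillian.class)
public class MixedModeIt {

    @Deployment(testable = true)//testable deployment so in-container methods can inject beans
    public static WebArchive createDeployment() {
        return ShrinkWrap.create(WebArchive.class, "mixed.war")
                .addClass(MyService.class)
                .addAsWebInfResource(EmptyAsset.INSTANCE, "beans.xml");
    }

    @Inject
    MyService myService;//only enriched when running inside the container

    @Test
    public void shouldRunInsideContainer() {
        assertNotNull(myService);//white box: direct access to deployed beans
    }

    @Test
    @RunAsClient
    public void shouldRunAsClient(@ArquillianResource URL deploymentUrl) {
        assertNotNull(deploymentUrl);//black box: only the deployment URL is visible
    }
}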

Test lifecycle

Every arquillian test execution follows these steps:

  1.  Micro-deployment creation; it can be a jar, war or ear. If a dependency cannot be collected (resource not found), the process is aborted.
  2.  Startup of the container (embedded and managed modes), or arquillian connects to a running container (remote mode) where tests will run. The process is aborted if the server doesn’t start within 60 seconds or isn’t found.
  3. Deploy of the micro-deployment generated in step 1. If a dependency is missing, e.g. beans that a CDI bean depends on were not deployed, arquillian will not run the tests related to this micro-deployment.
  4. Tests related to the step 3 micro-deployment are executed.
  5. Undeploy of the step 3 micro-deployment.
  6. Container shutdown (managed or embedded) or container disconnect (remote).

OBS: steps 1, 3, 4 and 5 will repeat for each @Deployment – @RunWith(Arquillian.class) present in the test classpath.

Extensions:

The framework is composed of various extensions; here are the ones we will use:

  • Core: base of all other extensions; it’s responsible for deploy/undeploy, container startup and shutdown, and also manages the test lifecycle.
  • Drone: integration with selenium, like webdriver management/injection.
  • Graphene: provides a layer of abstraction on top of webdriver, extending its functionality to ease functional test creation.
  • Persistence: brings database management to tests via DBUnit.
  • JBehave: enables BDD tests in arquillian through JBehave.
  • Warp: enables gray box testing.
  • Recorder: allows test recording through video and images.
  • Rest: enables restful endpoint testing.
  • JaCoCo: test coverage metrics.

For more extensions refer to the arquillian github organization and the reference guide.

Configuring a project to use Arquillian

Following are the main steps to configure arquillian.

Dependencies

Below is a basic arquillian pom.xml

 <dependencies>
      <!-- Test dependencies -->
       <dependency>
            <groupId>junit</groupId> <!--testNG is also supported-->
            <artifactId>junit</artifactId>
            <version>4.8.2</version>
            <scope>test</scope>
        </dependency>
        <!-- arquillian test framework set to junit, could be testNG -->
        <dependency>
            <groupId>org.jboss.arquillian.junit</groupId>
            <artifactId>arquillian-junit-container</artifactId>
            <scope>test</scope>
        </dependency>
       <!-- shrinkWrap resolvers -->
        <dependency>
            <groupId>org.jboss.shrinkwrap.resolver</groupId>
            <artifactId>shrinkwrap-resolver-depchain</artifactId>
            <scope>test</scope>
            <type>pom</type>
        </dependency>
	 <dependency>
            <groupId>org.jboss.spec.javax.annotation</groupId>
            <artifactId>jboss-annotations-api_1.1_spec</artifactId>
            <version>1.0.1.Final</version>
            <scope>provided</scope>
        </dependency>
        <dependency>
            <groupId>org.jboss.spec.javax.ejb</groupId>
            <artifactId>jboss-ejb-api_3.1_spec</artifactId>
            <version>1.0.2.Final</version>
            <scope>provided</scope>
        </dependency>
        <dependency>
            <groupId>org.jboss.arquillian.protocol</groupId>
            <artifactId>arquillian-protocol-servlet</artifactId>
            <scope>test</scope>
        </dependency>
		<!-- container adapter -->
	 <dependency><!-- server and mode(managed)-->
            <groupId>org.jboss.as</groupId>
            <artifactId>jboss-as-arquillian-container-managed</artifactId>
            <scope>test</scope>
        </dependency>
	  <!-- end test dependencies -->
</dependencies>

Arquillian uses the concept of a maven bom (bill of materials), where a dependency of type pom dictates the recommended versions (which can be overridden) of declared dependencies; because of this we didn’t declare the version of most of the dependencies above. Here follows the arquillian core bom, which must be declared in the dependencyManagement section:

   <dependencyManagement>
        <dependencies>
            <dependency>
                <groupId>org.jboss.arquillian</groupId>
                <artifactId>arquillian-bom</artifactId>
                <version>1.1.4.Final</version>
                <type>pom</type>
                <scope>import</scope>
            </dependency>
        </dependencies>
    </dependencyManagement>

Configuration file

Arquillian centralizes its configuration in arquillian.xml, which contains the container adapter and extension configuration; it must be located in src/test/resources. Here is an example:

<arquillian xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
	xmlns="http://jboss.org/schema/arquillian"
	xsi:schemaLocation="
        http://jboss.org/schema/arquillian
        http://jboss.org/schema/arquillian/arquillian_1_0.xsd">

	<!-- Force the use of the Servlet 3.0 protocol with all containers, as it is the most mature -->
	<defaultProtocol type="Servlet 3.0" />

	<container qualifier="jboss-remote" >
		<configuration>
			<property name="managementAddress">127.0.0.1</property>
			<property name="managementPort">9999</property>

		</configuration>
	</container>
	<container qualifier="jboss-managed"  default="true" >
		<configuration>
		   <!-- jbossHome can be replaced by JBOSS_HOME maven environmentVariable-->
	    	   <property name="jbossHome">#{arquillian.serverHome}</property>
	           <property name="outputToConsole">true</property>
                   <property name="javaVmArguments">-Xmx512m -XX:MaxPermSize=256m -Djboss.bind.address=localhost</property>
//makes it behave like remote adapter if container is already started
		   <property name="allowConnectingToRunningServer">true</property>
                </configuration>
	</container>
</arquillian>
   

Basically what we have here are the adapter configurations; in this example jboss managed is active by default. Note that the container activated in arquillian.xml must have its dependency present in the test classpath when running the tests, in our case jboss-as-arquillian-container-managed.

To switch between adapters you can either hardcode the default property (as above) or use an arquillian.launch file containing the name of the qualifier to be used by the tests.
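The arquillian.launch file lives in src/test/resources and contains just the qualifier name; for our example it would be the single line:

jboss-managed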

To make arquillian.launch dynamic you could use a maven property instead of a constant adapter qualifier, but a better approach is to set a system property in the maven surefire plugin; for example the jboss-managed maven profile sets arquillian.launch to jboss-managed as below:

	<profile>
            <id>jboss-managed</id>
             <dependencies>
                <dependency>
                    <groupId>org.jboss.as</groupId>
                    <artifactId>jboss-as-arquillian-container-managed</artifactId>
                    <scope>test</scope>
                    <version>${jboss.version}</version>
                </dependency>
            </dependencies>
            <build>
                <plugins>
                    <plugin>
                        <groupId>org.apache.maven.plugins</groupId>
                        <artifactId>maven-surefire-plugin</artifactId>
                        <version>2.16</version>
                        <configuration>
                            <systemPropertyVariables>
                                <arquillian.launch>jboss-managed</arquillian.launch>
                            </systemPropertyVariables>
                            <environmentVariables>
                                <JBOSS_HOME>${arquillian.serverHome}</JBOSS_HOME>
                            </environmentVariables>
                        </configuration>
                    </plugin>
           </plugins>
         </build>
     </profile>

Using this approach we guarantee that the adapter dependency will be present in the classpath. Also note the JBOSS_HOME environment variable is there to specify the container location (needed by the managed adapter); we use ${arquillian.serverHome} so it can be overridden via the maven command line when executed by CI.

Hello Arquillian

With the maven dependencies and arquillian.xml configured to use jboss as the test container in managed mode, we can already create an arquillian (integration) test.

@RunWith(Arquillian.class)
public class HelloArquillianIt {//the 'It' suffix is used to identify different types of tests in pom.xml, we'll detail it in the TIPS section

    @Deployment
    public static WebArchive createDeployment() {
        WebArchive war = ShrinkWrap.create(WebArchive.class);
        war.addPackages(true, "org.conventions.archetype.model");//entities
        war.addClasses(RoleService.class, RoleServiceImpl.class) //we will test RoleService
        .addClasses(UserService.class, UserServiceImpl.class)//used by SecurityInterceptorImpl.java
        .addPackages(true, "org.conventions.archetype.qualifier")//@ListToUpdate, @see roleServiceImpl
        .addClass(TestService.class);
        war.addPackages(true, "org.conventions.archetype.security");//security interceptor @see beans.xml
        war.addPackages(true, "org.conventions.archetype.event");//UpdateListEvent @see RoleServiceImpl#afterStore
        war.addPackages(true, "org.conventions.archetype.util");
        //LIBS
        MavenResolverSystem resolver = Maven.resolver();
        war.addAsLibraries(resolver.loadPomFromFile("pom.xml").resolve("org.conventionsframework:conventions-core:1.1.2").withTransitivity().asFile());//convention is a experimental framework to enable some JavaEE utilities
        war.addAsLibraries(resolver.loadPomFromFile("pom.xml").resolve("org.primefaces:primefaces:4.0").withoutTransitivity().asSingleFile());

        //WEB-INF
        war.addAsWebInfResource(new File(WEB_INF,"beans.xml"), "beans.xml");//same app beans.xml
        war.addAsWebInfResource(new File(WEB_INF,"web.xml"), "web.xml");same app web.xml
        war.addAsWebInfResource(new File(WEB_INF,"faces-config.xml"), "faces-config.xml");

        war.addAsWebInfResource("jbossas-ds.xml", "jbossas-ds.xml");//datasource

        //resources
        war.addAsResource("test-persistence.xml", "META-INF/persistence.xml");//customized persistence.xml to use a different database 

        return war;
    }

    @Inject
    RoleService roleService;// service real instance

    @Test
    public void shouldListRolesWithSuccess(){
        assertEquals(roleService.crud().countAll(),???????????);// how many roles?
    }

}

As you can see the most difficult part is creating the test deployment: you need to discover each component's dependencies and you usually face some
'NoClassDefFound' error or 'UnsatisfiedDependency' exception, but after that you can move most deployment entries to a utility class and reuse them between your tests, see Deployments.java.

Also note that we could not make the assertion in the test method because we don’t know how many roles there are in the database. By the way, which database is the test using? It’s using the one declared in test-persistence.xml (added to the micro-deployment), which uses a maven property ${datasource} that is
defined in pom.xml. For tests using jboss or wildfly we are going to use ‘exampleDS’ as the datasource.

The test datasource uses exampleDS2 because it’s good practice for tests to have their own database; as exampleDS2 is not configured in the server, we add it on demand in our micro-deployment via jbossas-ds.xml.
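For reference, a minimal jbossas-ds.xml sketch (the JNDI name and H2 connection URL are illustrative, not necessarily the exact file from the project):

<?xml version="1.0" encoding="UTF-8"?>
<datasources>
    <datasource jndi-name="java:jboss/datasources/ExampleDS2" pool-name="ExampleDS2">
        <connection-url>jdbc:h2:mem:test;DB_CLOSE_DELAY=-1</connection-url>
        <driver>h2</driver>
        <security>
            <user-name>sa</user-name>
            <password>sa</password>
        </security>
    </datasource>
</datasources>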

Ok, we answered which database, but we still don’t know how to populate the test database so we can make assertions on top of it.

One way is to initialize the database before each test using a utility class; here is TestService:

@Named
@Stateless
public class TestService implements Serializable {

    @PersistenceContext(unitName = "archetypeTestPU")
    EntityManager em;	

    @Inject
    RoleService roleService;

    public void createRoleDataset(){
        clearDatabase();

        Role roleArch = new Role("architect");
        Role roleAdmin = new Role("administrator");

        em.persist(roleAdmin);
        em.persist(roleArch);
        em.flush();

    }

    public void clearDatabase() {

        em.createNativeQuery("delete from group__role_").executeUpdate();//intermediate tables
        em.flush();
        em.createNativeQuery("delete from user__group_").executeUpdate(); //intermediate tables
        em.flush();
        em.createNativeQuery("delete from role_").executeUpdate();//role
        em.flush();
        em.createNativeQuery("delete from group_").executeUpdate();//group
        em.flush();
        em.createNativeQuery("delete from user_").executeUpdate();//user
        em.flush();
    }

}

Here is our integration test using TestService to initialize the database:

    //testService needs to be added to deployment with .addClass(TestService.class); so it can be injected into test.

    @Inject
    TestService testService;

    @Test
    public void shouldListRolesWithSuccess(){
        testService.createRoleDataset();
        int numRoles = roleService.crud().countAll();
        assertEquals(numRoles, 2);
    }

Running the test

To run the test from the IDE just right click HelloArquillianIt.java and choose run/debug as junit test (don’t forget to activate the adapter maven profile in the IDE).

Also be careful with the JBOSS_HOME environment variable: in eclipse it must also be set (in IntelliJ it is set automatically for you).

To run via maven, surefire plugin must be configured:


	<plugin>
		<groupId>org.apache.maven.plugins</groupId>
		<artifactId>maven-surefire-plugin</artifactId>
		<version>2.16</version>
		<configuration>
			<skipTests>false</skipTests>
			<includes>
				<include>HelloArquillianIt.java</include>
			</includes>
		</configuration>
	</plugin>

 
In case surefire is defined in the main build tag in pom.xml, tests will be executed on each build (mvn package, install or test). We will see in the
TIPS section how to separate tests into specific maven profiles.
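As a sketch of that separation (the profile id and include pattern are illustrative), a profile could pick up only the classes with the 'It' suffix:

	<profile>
		<id>it-tests</id>
		<build>
			<plugins>
				<plugin>
					<groupId>org.apache.maven.plugins</groupId>
					<artifactId>maven-surefire-plugin</artifactId>
					<version>2.16</version>
					<configuration>
						<includes>
							<include>**/*It.java</include><!-- only integration tests run in this profile -->
						</includes>
					</configuration>
				</plugin>
			</plugins>
		</build>
	</profile>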

Another way to initialize the database is to use DBUnit through the arquillian persistence extension.

Managing test database with arquillian persistence

The persistence extension helps with writing tests where the persistence layer is involved, like preparing a test dataset, making dataset assertions and so on; it’s based on the DBUnit framework.

Dependencies

<!--arquillian persistence(dbunit) -->
		<dependency>
			<groupId>org.jboss.arquillian.extension</groupId>
			<artifactId>arquillian-persistence-api</artifactId>
			<version>1.0.0.Alpha7</version>
			<scope>test</scope>
		</dependency>
		<dependency>
			<groupId>org.jboss.arquillian.extension</groupId>
			<artifactId>arquillian-persistence-dbunit</artifactId>
			<version>1.0.0.Alpha7</version>
			<scope>test</scope>
		</dependency>

Configuration file

    <extension qualifier="persistence">
        <property name="defaultDataSource">${datasource}</property>
        <!--<property name="defaultDataSeedStrategy">CLEAN_INSERT</property>-->
    </extension>

${datasource} is set via a maven property in pom.xml so we can have a dynamic datasource and run the tests on different servers; for example on JBossAS tests must use the datasource java:jboss/datasources/ExampleDS.
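In pom.xml that boils down to a property like this (value shown for JBoss/WildFly):

    <properties>
        <datasource>java:jboss/datasources/ExampleDS</datasource>
    </properties>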

Example

    @Test
    @UsingDataSet("role.yml")
    @Cleanup(phase = TestExecutionPhase.BEFORE)
    public void shouldListRolesUsingDataset(){
        int numRoles = roleService.crud().countAll();
        log.info("COUNT:"+numRoles);
        assertEquals(numRoles, 3);
    }

role.yml is a file located in the src/test/resources/datasets folder with the following content describing 3 roles:


role_:
  - id: 1
    name: "role"
    version: 1
    createDate: 2014-01-01
    updateDate: 2014-01-01
  - id: 2
    name: "role2"
    version: 1
    createDate: 2014-01-01
    updateDate: 2014-01-01
  - id: 3
    name: "role3"
    version: 1
    createDate: 2014-01-01
    updateDate: 2014-01-01

Other formats like json and xml are supported.

It’s also possible to execute scripts before/after each test with @ApplyScriptBefore/@ApplyScriptAfter and make dataset assertions with @ShouldMatchDataSet.
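A sketch of these annotations combined (the script and dataset file names are illustrative):

    @Test
    @ApplyScriptBefore("cleanup.sql")//executes this SQL script before the test
    @UsingDataSet("role.yml")//seeds the database from the dataset
    @ShouldMatchDataSet("expected-roles.yml")//compares database state against this dataset after the test
    public void shouldStoreRole() {
        roleService.store(new Role("new role"));
    }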

Some limitations:

  • hard to maintain large datasets
  • dataset values are static, but in the next version there will be a way to feed datasets with expressions
  • can degrade test performance (this looks like it was fixed for the next version, alpha7)
  • doesn’t work with client tests (black box)

Behaviour driven tests(BDD)

BDD[4] with arquillian is implemented by the arquillian-jbehave extension through the jbehave[5] framework, where a story file (.story) dictates the behaviour of a functionality using natural language, with the ‘given when then’ tuple linking the story to the tests that execute the described behaviour.

Dependencies


	<dependency>
		<groupId>org.jboss.arquillian.jbehave</groupId>
		<artifactId>arquillian-jbehave-core</artifactId>
		<version>1.0.2</version>
		<scope>test</scope>
	</dependency>

	<!-- although the jbehave extension works great and integrates well with most other extensions, it doesn't have an official release: https://community.jboss.org/message/865003#865003 -->
        <repository>
            <id>arquillian jbehave UNofficial maven repo</id>
            <url>http://arquillian-jbehave-repo.googlecode.com/git/</url>
            <layout>default</layout>
        </repository>

Example

Here follows a BDD example; all BDD related tests can be found here.

@RunWith(Arquillian.class)
@Steps(UserSteps.class)
public class UserBdd extends BaseBdd {

    @Deployment
    public static WebArchive createDeployment()
    {
        WebArchive archive = Deployments.getBaseDeployment() //same deployment we saw before
                .addPackage(BaseBdd.class.getPackage())
                .addClass(UserSteps.class);
        return archive;
    }

}

The superclass BaseBdd is used to configure jbehave, for example to customize the test output report and to initialize the Steps – the classes that execute the test behaviour (given when then).

The mapping between step class and story file works in the following way:

  • arquillian looks for the story file in the same package as the class which extends JUnitStory, which in this case is UserBdd because it extends BaseBdd, which in turn extends JUnitStory.
  • by convention the story file must have the same name as the class which extends JUnitStory, but with ‘_’ (underscore) instead of camel case; so for example the UserBdd story file is user_bdd.story. This convention can be configured in BaseBdd or you can override the configuration() method.
  • to execute the behaviour jbehave uses one or more step classes provided in the @Steps(MyStepClass.class) annotation.

Here is user_bdd.story:
Story: manage users

Scenario: listing users by role

When i search users with role [name]

Then users found is equal to [total]

Examples:
|name|total|
|developer|2|
|administrator|1|
|secret|0|

//other scenarios related to user

and the step class to execute the behaviour UserSteps.java:

public class UserSteps extends BaseStep implements Serializable {

    private Integer totalUsersFound;

    private String message;

    @BeforeStory
    public void setUpStory(){
	//initialize story
        testService.initDatabaseWithUserAndGroups();
    }

    @BeforeScenario
    public void setUpScenario(){
        //initialize scenario
    }

    @When("i search users with role $name")
    public void searchUserByRole(@Named("name") String roleName){
           totalUsersFound = userService.findUserByRole(new Role(roleName)).size();
    }

    @Then("users found is equal to $total")
    public void usersFoundEqualTo(@Named("total") Integer total){
        assertEquals(totalUsersFound,total);
    }
}

The matching between the story and the test (step) is done by exact string comparison, but you can use the @Alias annotation to match multiple strings for a step.
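For example, a hypothetical alternative phrasing for the search step could be matched like this:

    @When("i search users with role $name")
    @Alias("i look for users with role $name")//same step, alternative phrasing
    public void searchUserByRole(@Named("name") String roleName) {
        totalUsersFound = userService.findUserByRole(new Role(roleName)).size();
    }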

For more information about JBehave see its (great) documentation at jbehave.org.

Jbehave enables acceptance tests, which can be white box (sometimes called system acceptance tests, usually defined by the development team because they cover internal system logic) or black box (named user acceptance tests, usually written by or with the client/final user); we will cover black box bdd tests later.

To execute bdd tests in our app use:

mvn clean test -Pwildfly-managed -Pbdd-tests

Functional Tests

The tests we saw until now were white box, or in other words executed in the same process (JVM) where the container runs, so we have direct access (through injection) to the objects deployed by the micro-deployment. Functional tests are black box and run “from outside” the container, in a different JVM, simulating a client accessing the system through the user interface.

Arquillian enables functional tests through the Drone and Graphene extensions. Both work on top of selenium: the first manages webdriver lifecycle and injection, and the second extends selenium functionality with PageObjects[6], jquery selectors, page fragments, simplified waits and so on, to ease the creation of functional tests.

Dependencies

		<dependency>
			<groupId>org.jboss.arquillian.graphene</groupId>
			<artifactId>graphene-webdriver</artifactId>
			<type>pom</type>
			<scope>test</scope>
			<version>${version.graphene}</version>
		</dependency>

<dependencyManagement>
		<dependencies>
			<dependency>
				<groupId>org.jboss.arquillian.selenium</groupId>
				<artifactId>selenium-bom</artifactId>
				<version>${version.selenium}</version>
				<type>pom</type>
				<scope>import</scope>
			</dependency>
			<dependency>
				<groupId>org.jboss.arquillian.extension</groupId>
				<artifactId>arquillian-drone-bom</artifactId>
				<version>${version.drone}</version>
				<type>pom</type>
				<scope>import</scope>
			</dependency>
		</dependencies>
</dependencyManagement>

Note that the extension uses two boms: the selenium bom updates the webdriver version regardless of the webdriver version that comes with the drone bom, but it needs to be declared first because, in case of identical dependencies (one from each bom), the first declaration takes precedence in the dependencyManagement section.

Configuration file

<extension qualifier="graphene">
	   <property name="waitGuiInterval">3</property>
	   <property name="waitAjaxInterval">4</property>
	   <property name="waitModelInterval">5</property>

	</extension>
	 <extension qualifier="webdriver">
<!--         <property name="browser">firefox</property> -->
        <property name="browser">${arquillian.browser}</property>
        <property name="dimensions">1280x1024</property>
        <property name="remoteReusable">${arquillian.remoteReusable}</property>
 		<property name="remote">${arquillian.remote}</property>
        <property name="remoteAddress">${arquillian.seleniumGrid}</property>
        <property name="chromeDriverBinary">${arquillian.chromeDriver}</property>
        <!-- property to run chrome locally; download the driver at: http://code.google.com/p/chromedriver/downloads/list -->
    </extension>

For more configuration details see the drone and graphene documentation.

Example

Here follows a functional testing example; all functional test related sources can be found here.

first step is micro-deployment:

    @Deployment(testable = false)
    public static Archive<?> createDeployment() {
        WebArchive war = Deployments.getBaseDeployment()
        .addPackages(true, UserMBean.class.getPackage()) //managed beans
        .addPackages(true,"org.conventions.archetype.converter");//faces converters

        //web resources (pages, js, css etc...)
        war.merge(ShrinkWrap.create(GenericArchive.class).as(ExplodedImporter.class).importDirectory(WEBAPP_SRC).as(GenericArchive.class), "/", Filters.include(".*\\.(xhtml|html|css|js|png)$"));
        war.addAsWebResource(new File(TEST_RESOURCES, "/pages/test-logon.xhtml"), "/templates/logon.xhtml");//the test logon initializes the database for black box tests on each logon, we will detail it later in the TIPS section
        System.out.println(war.toString(true));
        return war;
    }

The main difference is the property testable = false, which indicates that the test will run as client (in a separate JVM); this is necessary for drone to get into action, because it executes as a separate process.

We also need to add managed beans (the bean package), converters and web related resources
such as pages and js files to the test deployment. That is done via ShrinkWrap Filters, which accept regular expressions, making this task simple and generic; but be careful, because you are adding all files of a given type (e.g. .js), and if the application is big this can slow down the deployment process.

Here is logon example:

    @Drone
    WebDriver browser;

    @Test
    @InSequence(1)
    public void shouldLogonWithSuccess(@InitialPage HomePage home){
        assertTrue(home.getLogonDialog().isPresent());
        home.getLogonDialog().doLogon("admin", "admin");
        home.verifyMessage(resourceBundle.getString("logon.info.successful"));//asserts the primefaces growl message
    }

Observations

– The WebDriver is injected by drone, and it knows which browser to use through the arquillian.xml property ${arquillian.browser}, which is defined in pom.xml.
– @InitialPage denotes the page the webdriver must navigate to before executing the test.
– HomePage is a PageObject[6]; here is its code:

@Location("home.faces")
public class HomePage extends BasePage{

    @FindByJQuery("div[id$=logonDialog]")
    private LogonDialog logonDialog;

    public LogonDialog getLogonDialog() {
        return logonDialog;
    }
}

Also note:
– the @Location annotation tells Drone where to navigate via @InitialPage
– BasePage has some utilities for page objects
– @FindByJQuery is an extension of the FindBy selenium selector based on JQuery selectors
– LogonDialog is a page fragment; here is its code:

public class LogonDialog {

    @Root
    private GrapheneElement dialog;

    @FindByJQuery("input[id$=inptUser]")
    private GrapheneElement username;

    @FindByJQuery("input[id$=inptPass]")
    private GrapheneElement password;

    @FindByJQuery("button[id$=btLogon]")
    private GrapheneElement btLogon;

    public void doLogon(String username, String password){
        this.username.clear();
        this.username.sendKeys(username);
        this.password.clear();
        this.password.sendKeys(password);
        guardHttp(btLogon).click();
    }

    public boolean isPresent(){
        return this.username.isPresent();
    }
}

About LogonDialog fragment:

– GrapheneElement is an extension of the selenium WebElement
– guardHttp is a Graphene implicit wait that blocks the test until the http request is done; there is also an ajax guard which is very useful
– there are also explicit waits like Graphene.waitModel(), whose timeout is configured in arquillian.xml
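A quick sketch of an explicit wait with Graphene's fluent API (the element is illustrative):

    //blocks until the logon button is visible or the waitModel timeout from arquillian.xml expires
    Graphene.waitModel().until().element(btLogon).is().visible();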

To execute functional tests in our app use:

mvn clean test -Pwildfly-managed -Pft-tests

User Acceptance tests

In this article I will call user acceptance tests[7] the combination of functional tests with Jbehave, or in other words, black box behaviour driven tests. It’s important to separate the different types of tests because you will probably want to execute them at different moments, e.g. faster (white box) tests on each commit and slower, resource consuming ones at the end of the day.

As this kind of test is a combination of other kinds of tests, we won’t need any specific configuration in arquillian.xml nor dependencies in the pom.

Example

@Steps({ RoleStep.class, LogonStep.class })
@RunWith(Arquillian.class)
public class RoleAt extends BaseAt {

    @Deployment(testable = false)
    public static WebArchive createDeployment()
    {
        WebArchive archive = createBaseDeployment();

        System.out.println(archive.toString(true));
        return archive;
    }

}

BaseAt combines elements of the functional tests (testable=false) and the bdd tests we saw earlier.

role_at.story (remember the name convention):


Story: manage user roles

GivenStories: org/conventions/archetype/test/at/logon/logon_at.story

Scenario: insert new role

Given user go to role home

When user clicks in new button

Then should insert role with name new role

Scenario: search roles

Given user go to role home

When user filter role by name [name]

Then should list only roles with name [name]

And return [total] rows

Examples:
|name|total|
|developer|1|
|admin|1|
|a|2|

public class RoleStep implements Serializable {

    @Drone
    private WebDriver browser;

    @ArquillianResource
    private URL baseUrl;

    @Page
    private RoleHome roleHome;

    @FindByJQuery("div[id$=menuBar]")
    private Menu menu;

    @Given("user go to role home")
    public void goToRoleHome() {
        menu.gotoRoleHome();
    }

    @When("user clicks in new button")
    public void userClickInNewButton() {
        WebElement footer = roleHome.getFooter();
        assertTrue(footer.isDisplayed());
        WebElement newButton = footer.findElement(By.xpath("//button"));
        assertTrue(newButton.isDisplayed());
        guardHttp(newButton).click();
    }

    @Then("should insert role with name $name")
    public void shouldInsertRole(String name) {
        Assert.assertTrue(roleHome.isFormPage());
        roleHome.insertRole(name);

    }

    @When("user filter role by name $name")
    public void userFilterRolesBy(@Named("name") String name) {
        roleHome.filterByName(name);
    }

    @Then("should list only roles with name $name")
    public void shouldListRolesWith(@Named("name") String name) {
        for (WebElement row : roleHome.getTableRows("table")) {
            assertTrue(row.findElement(By.xpath("//td[@role='gridcell']//span[contains(@id,'name')]")).getText().contains(name));
        }
    }

    @Then("return $total rows")
    public void shouldReturn(@Named("total") Integer total){
         assertEquals(roleHome.getTableRows("table").size(), total.intValue());
    }

}

RoleHome is a pageObject that controls the role form (roleHome.xhtml); its source can be found here.

logon_at.story

 
Story: logon as user

Scenario: user should logon successfully

Given i am at logon screen

When i logon providing credentials admin, admin

Then i should be logged in

public class LogonStep extends BaseAtStep implements Serializable {

  @Drone
  private WebDriver       browser;

  @ArquillianResource
  private URL             baseUrl;

  @FindByJQuery("div[id$=menuBar]")
  private Menu menu;

  @Page
  private HomePage home;

  @Given("i am at logon screen")
  public void imAtLogon() {
    if (!home.getLogonDialog().isPresent()) {
      //if is already logged in, do logout
        super.goToPage(home);
      }
  }

  @When("i logon providing credentials $username, $password")
  public void loginWithCredentials(String username, String password) {
    home.getLogonDialog().doLogon(username,password);
  }

  @Then("i should be logged in")
  public void shouldBeAt() {
    home.verifyMessage(resourceBundle.getString("logon.info.successful"));
  }

}

To execute user acceptance tests in our example project use this maven command:

mvn clean test -Pwildfly-managed -Pat-tests

Gray box tests with Warp

With gray box testing[3] we can fire a request as a client (e.g. via WebDriver) and inspect internal objects (like in white box tests) such as the HTTP session, the FacesContext etc.

Warp tests of our example app can be found here.

Dependencies

	<dependency>
		<groupId>org.jboss.arquillian.extension</groupId>
		<artifactId>arquillian-warp</artifactId>
		<type>pom</type>
		<scope>test</scope>
		<version>1.0.0.Alpha7</version>
	</dependency>
	<dependency>
		<groupId>org.jboss.arquillian.extension</groupId>
		<artifactId>arquillian-warp-jsf</artifactId>
		<version>1.0.0.Alpha7</version>
	</dependency>

Example


@RunWith(Arquillian.class)
@WarpTest
@RunAsClient
public class LogonWarp {

    protected static final String WEBAPP_SRC = "src/main/webapp";

    protected static final String TEST_RESOURCES = "src/test/resources";

    @Drone
    protected WebDriver browser;

    @ArquillianResource
    protected URL baseUrl;

    @Deployment(testable = true)
    public static Archive<?> createDeployment() {
        WebArchive war = Deployments.getBaseDeployment()
                .addPackages(true, UserMBean.class.getPackage()) //managed beans
                .addPackages(true,"org.conventions.archetype.converter");

        //web resources
        war.merge(ShrinkWrap.create(GenericArchive.class).as(ExplodedImporter.class).importDirectory(WEBAPP_SRC).as(GenericArchive.class), "/", Filters.include(".*\\.(xhtml|html|css|js|png)$"));
        war.addAsWebResource(new File(TEST_RESOURCES, "/pages/test-logon.xhtml"), "/templates/logon.xhtml");//test logon clears the database on each logon
        System.out.println(war.toString(true));
        return war;
    }

        @Test
        @InSequence(1)
        public void shouldLogonWithSuccess(@InitialPage final HomePage home){
            assertTrue(home.getLogonDialog().isPresent());
            Warp.initiate(new Activity() {

                @Override
                public void perform() {
                    home.getLogonDialog().doLogon("admin", "admin");

                }
            })
                    .observe(request().method().equal(HttpMethod.POST).uri().contains("home.faces"))
                    .inspect(new Inspection() {
                        private static final long serialVersionUID = 1L;

                        @Inject
                        SecurityContext securityContext;

                        @Inject
                        ResourceBundle resourceBundle;

                        @BeforePhase(Phase.INVOKE_APPLICATION)
                        public void shouldNotBeLoggedIn() {
                            System.out.println("shouldNotBeLoggedIn:"+securityContext.loggedIn());
                            assertFalse(securityContext.loggedIn());
                        }

                        @AfterPhase(Phase.INVOKE_APPLICATION)
                        public void shouldBeLoggedIn(@ArquillianResource FacesContext context) {
                            System.out.println("shouldBeLoggedIn:"+securityContext.loggedIn());
                            assertTrue(securityContext.loggedIn());
                            boolean loggedInMessage = false;
                            for (FacesMessage facesMessage : context.getMessageList()) {
                                  if(facesMessage.getSummary().equals(resourceBundle.getString("logon.info.successful"))){
                                      loggedInMessage = true;
                                  }
                            }
                            assertTrue(loggedInMessage);
                        }

                    });
        }
}

Gray box tests mix elements from black and white box tests; this is achieved with Arquillian's mixed mode, where we have a Deployment(testable = true), which denotes white box testing, and at the same time we @RunAsClient for the black box part.

The black box part is represented by the Activity interface, whose perform method is responsible for firing client requests.

The white box (server side execution) part is represented by the Inspection interface; within an inspection you can do anything you would do in white box testing, such as accessing CDI beans through injection.

Also note that you can observe specific requests (an Activity may start multiple requests) with the observe method.

In our example we first log on to the application via the user interface with Drone/Graphene and later we access an internal system object, in this case SecurityContext, to check that the user is in fact logged in.

Tips and recommendations

Notation

I’m using the following suffix convention to differentiate tests:

it: Integration (white box) tests (e.g. UserIt.java)
ft: Functional (black box) tests (e.g. UserFt.java)
bdd: white box (system) acceptance tests (e.g. UserBdd.java)
at: black box (user) acceptance tests (e.g. UserAt.java)
warp: gray box tests (e.g. LogonWarp.java)

This way we ease test profile management.

Test profiles

Test profiles are used to separate the different types of tests so you can run them at different moments (e.g. run lightweight tests more frequently or earlier).

With the suffix notation we can separate the tests in the surefire plugin as follows:

		<profile>
			<!-- all tests -->
			<id>all-tests</id>
			<build>
				<plugins>
					<plugin>
						<groupId>org.apache.maven.plugins</groupId>
						<artifactId>maven-surefire-plugin</artifactId>
						<version>2.16</version>
						<configuration>
							<skipTests>false</skipTests>
							<runOrder>reversealphabetical</runOrder>
							<includes>
								<include>**/*Ft.java</include><!--functional -->
								<include>**/*It.java</include><!--integration -->
								<include>**/*UnitTests.java</include><!--unit -->
								<include>**/*Bdd.java</include><!--white acceptance -->
								<include>**/*At.java</include><!--black acceptance -->
								<include>**/*Warp.java</include><!--gray box -->
							</includes>
							<excludes>
<!-- avoid execution of test superclasses -->
								<exclude>**/*BaseFt.java</exclude>
								<exclude>**/*BaseBdd.java</exclude>
								<exclude>**/*BaseIt.java</exclude>
								<exclude>**/*BaseAt.java</exclude>
								<exclude>**/*BaseWarp.java</exclude>
							</excludes>

						</configuration>
					</plugin>
				</plugins>
			</build>
		</profile>

		<profile>
			<!-- only integration tests and bdd(white box) -->
			<id>it-tests</id>
			<build>
				<plugins>
					<plugin>
						<groupId>org.apache.maven.plugins</groupId>
						<artifactId>maven-surefire-plugin</artifactId>
						<version>2.16</version>
						<configuration>
							<skipTests>false</skipTests>
							<runOrder>reversealphabetical</runOrder>
							<includes>
								<include>**/*It.java</include>
								<include>**/*Bdd.java</include>
							</includes>
							<excludes>
								<exclude>**/*BaseBdd.java</exclude>
								<exclude>**/*BaseIt.java</exclude>
							</excludes>

						</configuration>
					</plugin>
				</plugins>
			</build>
		</profile>

		<profile>
			<!-- only functional and user acceptance tests (black box) -->
			<id>ft-tests</id>
			<build>
				<plugins>
					<plugin>
						<groupId>org.apache.maven.plugins</groupId>
						<artifactId>maven-surefire-plugin</artifactId>
						<version>2.16</version>
						<configuration>
							<skipTests>false</skipTests>
							<runOrder>reversealphabetical</runOrder>
							<includes>
								<include>**/*Ft.java</include>
								<include>**/*At.java</include>
							</includes>
							<excludes>
								<exclude>**/*BaseFt.java</exclude>
								<exclude>**/*BaseAt.java</exclude>
							</excludes>
						</configuration>
					</plugin>
				</plugins>
			</build>
		</profile>

Container profiles(adapters)

We talked about three execution modes to run arquillian tests inside a container. To switch between adapters easily we can define maven profiles:

	<profile>
			<id>jboss-remote</id>
			<dependencies>
				<dependency>
					<groupId>org.jboss.as</groupId>
					<artifactId>jboss-as-arquillian-container-remote</artifactId>
					<scope>test</scope>
					<version>7.2.0.Final</version>
				</dependency>
			</dependencies>
                        <build>
				<plugins>
					<plugin>
						<groupId>org.apache.maven.plugins</groupId>
						<artifactId>maven-surefire-plugin</artifactId>
						<version>2.16</version>
						<configuration>
							<systemPropertyVariables>
								<arquillian.launch>jboss-remote</arquillian.launch>
							</systemPropertyVariables>

						</configuration>
					</plugin>
				</plugins>
			</build>
		</profile>

		<profile>
			<id>jboss-managed</id>
			<activation>
				<activeByDefault>true</activeByDefault>
			</activation>
			<properties>
				<arquillian.serverHome>/home/jboss-eap-6.2</arquillian.serverHome>
                        </properties>
			<dependencies>
				<dependency>
					<groupId>org.jboss.as</groupId>
					<artifactId>jboss-as-arquillian-container-managed</artifactId>
					<scope>test</scope>
					<version>7.2.0.Final</version>
				</dependency>
			</dependencies>

			<build>
				<plugins>
					<plugin>
						<groupId>org.apache.maven.plugins</groupId>
						<artifactId>maven-surefire-plugin</artifactId>
						<version>2.16</version>
						<configuration>
							<systemPropertyVariables>
								<arquillian.launch>jboss-managed</arquillian.launch>
							</systemPropertyVariables>
							<environmentVariables>
								<JBOSS_HOME>${arquillian.serverHome}</JBOSS_HOME>
							</environmentVariables>
						</configuration>
					</plugin>
				</plugins>
			</build>
		</profile>

Note that in this case we are using surefire only to define properties:

– JBOSS_HOME provides the container path (in the managed case). We keep the container path in the
arquillian.serverHome maven property so it can be overridden by CI (Jenkins) later.
– arquillian.launch informs which adapter must be activated in arquillian.xml.

You can also use the container maven dependency to download the container automatically during the tests, as follows:

  	<profile>
            <id>wildfly-managed</id>
            <properties>
                <arquillian.serverHome>${project.build.directory}/wildfly-${wildfly.version}</arquillian.serverHome>
            </properties>
            <dependencies>
                <dependency>
                    <groupId>org.wildfly</groupId>
                    <artifactId>wildfly-arquillian-container-managed</artifactId>
                    <version>${wildfly.version}</version>
                    <scope>test</scope>
                </dependency>
            </dependencies>
            <build>
                <plugins>
                    <plugin>
                        <groupId>org.apache.maven.plugins</groupId>
                        <artifactId>maven-dependency-plugin</artifactId>
                        <executions>
                            <execution>
                                <id>unpack</id>
                                <phase>process-test-classes</phase>
                                <goals>
                                    <goal>unpack</goal>
                                </goals>
                                <configuration>
                                    <artifactItems>
                                        <artifactItem>
                                            <groupId>org.wildfly</groupId>
                                            <artifactId>wildfly-dist</artifactId>
                                            <version>${wildfly.version}</version>
                                            <type>zip</type>
                                            <overWrite>false</overWrite>
                                            <outputDirectory>${project.build.directory}</outputDirectory>
                                        </artifactItem>
                                    </artifactItems>
                                </configuration>
                            </execution>
                        </executions>
                    </plugin>
                    <plugin>
                        <groupId>org.apache.maven.plugins</groupId>
                        <artifactId>maven-surefire-plugin</artifactId>
                        <version>2.16</version>
                        <configuration>
                            <systemPropertyVariables>
                                <arquillian.launch>jboss-managed</arquillian.launch>
                            </systemPropertyVariables>
                            <environmentVariables>
                                <JBOSS_HOME>${arquillian.serverHome}</JBOSS_HOME>
                            </environmentVariables>
                        </configuration>
                    </plugin>
                </plugins>
            </build>
	</profile>

The maven dependency plugin downloads the container (if it is not in the local maven repo) and unpacks it inside the target directory. The surefire plugin uses the
arquillian.serverHome property, which points to the server inside the target dir.

Faster test execution

When we are developing tests we tend to execute them a lot, which can be very time consuming, even more so inside a container that needs to be started and stopped. One way to minimize that is to use the remote container adapter: the developer starts the container once and then executes the tests on top of this running container, so the only time consuming task (aside from the test itself) is the micro-deployment deploy/undeploy. Here is a remote adapter maven profile:

	<profile>
			<id>jboss-remote</id>

			<dependencies>
				<dependency>
					<groupId>org.jboss.as</groupId>
					<artifactId>jboss-as-arquillian-container-remote</artifactId>
					<scope>test</scope>
					<version>7.2.0.Final</version>
				</dependency>
			</dependencies>
            <build>
                <plugins>
                    <plugin>
                        <groupId>org.apache.maven.plugins</groupId>
                        <artifactId>maven-surefire-plugin</artifactId>
                        <version>2.16</version>
                        <configuration>
                            <systemPropertyVariables>
                                <arquillian.launch>jboss-remote</arquillian.launch>
                            </systemPropertyVariables>
                        </configuration>
                    </plugin>
                </plugins>
            </build>
		</profile>

The system property arquillian.launch selects the container to be activated in arquillian.xml:

<container qualifier="jboss-remote">
	<configuration>
		<property name="managementAddress">127.0.0.1</property>
		<property name="managementPort">9999</property>
	</configuration>
</container>

OBS: as you will usually deploy a datasource within your test deployment, be careful with the infamous DuplicateServiceException (more on it below).

Avoiding multiple deployments

In white box tests it is possible to avoid multiple deployments (@Deployment) through test dependency injection; to achieve that you just need to separate your test classes and inject them into a common class that holds the arquillian deployment. See ArchetypeIt.java, where we inject UserIt and RoleIt to leverage the ArchetypeIt deployment, as sketched below.
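
A minimal sketch of that idea; the test method names are assumptions (the real classes are in the example project):

@RunWith(Arquillian.class)
public class ArchetypeIt {

    @Inject
    UserIt userIt; //test classes injected as plain CDI beans

    @Inject
    RoleIt roleIt;

    @Deployment
    public static WebArchive createDeployment() {
        //single deployment shared by the injected test classes
        return Deployments.getBaseDeployment()
                .addClasses(UserIt.class, RoleIt.class);
    }

    @Test
    public void shouldInsertUser() {
        userIt.shouldInsertUser(); //delegates to the injected test class
    }

    @Test
    public void shouldInsertRole() {
        roleIt.shouldInsertRole();
    }
}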

Manipulating test database in functional tests

As we don’t have access to deployed classes during black box tests, we can’t use our TestService (nor arquillian persistence) to initialize/prepare the database for the tests. What we do instead is deploy a customized logon page, so each time a login is performed we initialize the database; this is done in BaseFt as follows:

war.addAsWebResource(new File(TEST_RESOURCES, "/pages/test-logon.xhtml"), "/templates/logon.xhtml");

//test logon clears and initializes test database on each logon

The customized logon page fires the initialization method via a primefaces remote command each time the logon dialog is opened; it is the same application logon dialog with the additional code shown below:

 <h:form id="loginForm">
            <p:remoteCommand name="init" rendered="#{not loggedIn}" autoRun="true" immediate="true" process="@this" update="@none" partialSubmit="true" actionListener="#{testService.initDatabaseWithUserAndGroups}"/>
</h:form>

This way we guarantee a specific dataset for the tests; a sketch of the server side initializer follows.
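
On the server side, the initializer can be a simple bean exposed to EL; a minimal sketch, assuming JPA entities from the example app (the real TestService is in the repository):

@Stateless
@Named("testService") //exposed to EL so the remoteCommand above can reach it
public class TestService {

    @PersistenceContext
    EntityManager em;

    public void initDatabaseWithUserAndGroups() {
        //wipe and repopulate the tables used by the tests; entities and values are assumptions
        em.createQuery("delete from User").executeUpdate();
        em.persist(new User("admin", "admin"));
    }
}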

Rest testing

For testing RESTful web services we run arquillian as a client (black box) and deploy our REST endpoint in the micro-deployment.

When running as a client we can inject the application URL (dynamically generated by arquillian) into the tests via @ArquillianResource; that's all we need to fire REST requests as a client, as sketched below.
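
A minimal sketch of such a test using the plain JAX-RS 2.0 client; the endpoint path and deployment contents are assumptions based on the example app:

import static org.junit.Assert.assertEquals;

import java.net.URL;
import javax.ws.rs.client.Client;
import javax.ws.rs.client.ClientBuilder;
import javax.ws.rs.core.MediaType;
import javax.ws.rs.core.Response;

@RunWith(Arquillian.class)
@RunAsClient
public class CarRestIt {

    @ArquillianResource
    URL baseUrl; //deployed application url, injected by arquillian

    @Deployment(testable = false)
    public static WebArchive createDeployment() {
        return Deployments.getBaseDeployment(); //assumed to contain the rest endpoint
    }

    @Test
    public void shouldListCars() {
        Client client = ClientBuilder.newClient();
        Response response = client.target(baseUrl + "rest/cars")
                .request(MediaType.APPLICATION_JSON)
                .get();
        assertEquals(200, response.getStatus());
    }
}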

Rest test in our example app can be found here.

Also note that we are deploying a RestDataset class which is responsible for initializing our database before the tests.

For more advanced RESTful tests, like testing endpoints from the server side, refer to the rest extension.

Running tests in CI server(Jenkins)

Running tests in Jenkins is the same as running them locally; the only problem you may face is with functional tests, because CI servers usually don't have a graphical interface, so a web browser can't be opened to run the tests. A way to solve that is to run the functional tests remotely on a machine with a graphics card and a selenium grid up and running; if you have that, you just need to pass some maven parameters to your test command to enable remote functional tests:

mvn clean test -Pjboss-managed -Darquillian.remote=true -Darquillian.seleniumGrid=http://remote-machine-ip:1044/wd/hub -Darquillian.localAddress=jenkins-ip

Note that:

– arquillian.remote, arquillian.seleniumGrid and arquillian.localAddress are properties referenced in arquillian.xml and used by the webdriver extension to enable remote functional testing;

– http://remote-machine-ip:1044/wd/hub is a selenium grid waiting for connections at port 1044.
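
In arquillian.xml, the corresponding webdriver extension fragment could look like the snippet below; the property names follow the Drone webdriver extension and the placeholders are the maven properties above (exact wiring is an assumption):

<extension qualifier="webdriver">
	<property name="browser">firefox</property>
	<!-- when arquillian.remote=true the tests run against the selenium grid below -->
	<property name="remote">${arquillian.remote}</property>
	<property name="remoteAddress">${arquillian.seleniumGrid}</property>
</extension>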

OBS: you can encapsulate the above command in a maven profile:

<profile>
	<id>jenkins</id>
	<properties><!-- profile default properties, can be overridden with -Dproperty-name=value -->
		<arquillian.remote>true</arquillian.remote>
		<arquillian.serverHome>/opt/jboss-eap</arquillian.serverHome>
		<arquillian.seleniumGrid>http://remote-machine-ip:1044/wd/hub</arquillian.seleniumGrid>
		<arquillian.localAddress>jenkins-ip</arquillian.localAddress>
	</properties>
</profile>

Another important aspect when running tests in continuous integration is where the container lives: you can either have a container installation on the CI machine and use jboss managed pointing to that installation, or use the container as a maven dependency, as we explained here.

Yet another concern arises when you have multiple projects using arquillian tests and these tests run concurrently on CI; in this case you may have conflicts because arquillian will try to start multiple containers on the same machine. One way to solve that on JBoss AS is using a port offset in arquillian.xml, as shown below:

	<container qualifier="jboss-managed" default="true">
		<configuration>
			<property name="jbossHome">${arquillian.jbossHome}</property>
			<property name="outputToConsole">true</property>
			<property name="javaVmArguments">-Xmx512m -XX:MaxPermSize=512m -Djboss.bind.address=localhost
				-Djboss.socket.binding.port-offset=100
				-Djboss.management.native.port=9054
			</property>
			<property name="allowConnectingToRunningServer">false</property>
			<property name="managementPort">9154</property>
		</configuration>
	</container>

The parameters -Djboss.socket.binding.port-offset=100 and -Djboss.management.native.port=9054 work in conjunction with the managementPort property: the native port (9054) plus the offset (100) gives 9154, the port arquillian uses to connect to the container.

You just need to configure different offsets in each application so they can run concurrently on the CI server. For more details see [11].

DuplicateServiceException

A problem you may face from time to time is a DuplicateServiceException raised during test deployment; it usually happens due to the dynamic datasource inclusion inside the test deployment. Arquillian generates an entry in standalone.xml (in the case of JBoss AS/Wildfly) for each test deployment and undeploys it after test execution, but sometimes it can't undeploy it, so the next time you execute the test the datasource (added by the micro-deployment) will be registered again, resulting in the exception below:

Caused by: org.jboss.msc.service.DuplicateServiceException: Service jboss.data-source.java:jboss/datasources/ArchetypeTestDS is already registered

When that happens you must remove the leftover deployment entry from standalone.xml in order to run the tests again.
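
The leftover entry sits near the end of standalone.xml and looks roughly like this (the name and hash are illustrative):

<deployments>
    <deployment name="test.war" runtime-name="test.war">
        <content sha1="da39a3ee5e6b4b0d3255bfef95601890afd80709"/>
    </deployment>
</deployments>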

Note that this exception won't happen when using the adapter as a maven dependency, as we saw in the container profiles section.

Publishing test metrics

The generation of test metrics, like the percentage of test successes/failures, the number of tests executed and coverage, is done by the arquillian-jacoco extension.

JaCoCo works through bytecode instrumentation and is activated by a maven plugin:

		<profile>
			<id>jacoco</id>
			<properties>
				<jacoco.version>0.7.0.201403182114</jacoco.version>
			</properties>
			<dependencies>
				<dependency>
					<groupId>org.jboss.arquillian.extension</groupId>
					<artifactId>arquillian-jacoco</artifactId>
					<scope>test</scope>
					<version>1.0.0.Alpha6</version>
				</dependency>
				<dependency>
					<groupId>org.jacoco</groupId>
					<artifactId>org.jacoco.core</artifactId>
					<scope>test</scope>
					<version>${jacoco.version}</version>
				</dependency>
			</dependencies>
			<build>
				<plugins>
					<plugin>
						<groupId>org.jacoco</groupId>
						<artifactId>jacoco-maven-plugin</artifactId>
						<version>${jacoco.version}</version>
						<executions>
							<execution>
								<goals>
									<goal>prepare-agent</goal>
								</goals>
							</execution>
							<execution>
								<id>report</id>
								<phase>prepare-package</phase>
								<goals>
									<goal>report</goal>
								</goals>
							</execution>
						</executions>
					</plugin>
				</plugins>
			</build>
		</profile>

It is also necessary to pass the following properties in pom.xml (or use the -D option in the maven command) so sonar can read the JaCoCo reports:

   <properties>
	<sonar.core.codeCoveragePlugin>jacoco</sonar.core.codeCoveragePlugin>
	<sonar.dynamicAnalysis>reuseReports</sonar.dynamicAnalysis>
   </properties>

then you can use the command:

mvn clean install -Pjboss-managed -Pall-tests -Pjacoco

Note that you need to use 'install' so the report is generated in the target folder, and also note that the coverage report will only take white box tests into account.

Arquillian all dependencies

Arquillian All is an all-in-one maven dependency for arquillian; its main objective is to make it easier for beginners to set up the arquillian dependencies. If you have good knowledge of the arquillian platform, then preferably declare each dependency you need so you can leverage arquillian's modularity. Here is a before/after arquillian-all pom.xml:

Before:

<dependencies>
        <!-- Test dependencies -->
       <dependency>
            <groupId>junit</groupId>
            <artifactId>junit</artifactId>
            <version>4.11</version>
            <scope>test</scope>
        </dependency>
 
        <!-- arquillian -->
        <dependency>
            <groupId>org.jboss.arquillian.junit</groupId>
            <artifactId>arquillian-junit-container</artifactId>
            <scope>test</scope>
        </dependency>
 
        <!--arquillian persistence(dbunit) -->
        <dependency>
            <groupId>org.jboss.arquillian.extension</groupId>
            <artifactId>arquillian-persistence-api</artifactId>
            <version>1.0.0.Alpha7</version>
            <scope>test</scope>
        </dependency>
        <dependency>
            <groupId>org.jboss.arquillian.extension</groupId>
            <artifactId>arquillian-persistence-dbunit</artifactId>
            <version>1.0.0.Alpha7</version>
            <scope>test</scope>
        </dependency>
 
        <!-- warp -->
 
        <dependency>
            <groupId>org.jboss.arquillian.extension</groupId>
            <artifactId>arquillian-warp</artifactId>
            <type>pom</type>
            <scope>test</scope>
            <version>${warp.version}</version>
        </dependency>
        <dependency>
            <groupId>org.jboss.arquillian.extension</groupId>
            <artifactId>arquillian-warp-jsf</artifactId>
            <version>${warp.version}</version>
 
        </dependency>
 
        <!-- shrinkWrap resolvers -->
        <dependency>
            <groupId>org.jboss.shrinkwrap.resolver</groupId>
            <artifactId>shrinkwrap-resolver-depchain</artifactId>
            <scope>test</scope>
            <type>pom</type>
        </dependency>
 
        <dependency>
            <groupId>org.jboss.arquillian.graphene</groupId>
            <artifactId>graphene-webdriver</artifactId>
            <type>pom</type>
            <scope>test</scope>
            <version>${version.graphene}</version>
        </dependency>
 
          <dependency>
             <groupId>org.jboss.arquillian.graphene</groupId>
             <artifactId>arquillian-browser-screenshooter</artifactId>
             <version>2.1.0.Alpha1</version>
             <scope>test</scope>
          </dependency>
 
         <!-- REST -->
 
         <dependency>
             <groupId>org.jboss.arquillian.extension</groupId>
             <artifactId>arquillian-rest-client-api</artifactId>
             <version>1.0.0.Alpha3</version>
         </dependency>
         <dependency>
             <groupId>org.jboss.arquillian.extension</groupId>
             <artifactId>arquillian-rest-client-impl-2x</artifactId>
             <version>1.0.0.Alpha3</version>
         </dependency>
 
         <dependency>
             <groupId>org.jboss.arquillian.extension</groupId>
             <artifactId>arquillian-rest-warp-impl-resteasy</artifactId>
             <version>1.0.0.Alpha3</version>
         </dependency>
 
        <!-- arquillian bdd -->
 
         <!-- jbehave -->
        <dependency>
            <groupId>org.jboss.arquillian.jbehave</groupId>
            <artifactId>arquillian-jbehave-core</artifactId>
            <version>1.0.2</version>
            <scope>test</scope>
        </dependency>
 
        <dependency>
            <groupId>org.jboss.spec.javax.annotation</groupId>
            <artifactId>jboss-annotations-api_1.1_spec</artifactId>
            <version>1.0.1.Final</version>
            <scope>provided</scope>
        </dependency>
        <dependency>
            <groupId>org.jboss.spec.javax.ejb</groupId>
            <artifactId>jboss-ejb-api_3.1_spec</artifactId>
            <version>1.0.2.Final</version>
            <scope>provided</scope>
        </dependency>
        <dependency>
            <groupId>org.jboss.arquillian.protocol</groupId>
            <artifactId>arquillian-protocol-servlet</artifactId>
            <scope>test</scope>
        </dependency>
 
        <dependency>
            <groupId>org.apache.httpcomponents</groupId>
            <artifactId>httpcore</artifactId>
            <version>4.2.5</version>
            <scope>test</scope>
        </dependency>
        <dependency>
            <groupId>commons-collections</groupId>
            <artifactId>commons-collections</artifactId>
            <version>3.2.1</version>
        </dependency>
        <dependency>
            <groupId>xml-apis</groupId>
            <artifactId>xml-apis</artifactId>
            <version>1.4.01</version>
            <scope>test</scope>
        </dependency>
 
        <dependency>
            <groupId>org.slf4j</groupId>
            <artifactId>slf4j-log4j12</artifactId>
            <version>1.7.5</version>
            <scope>test</scope>
        </dependency>
 
        <dependency>
            <groupId>org.codehaus.jackson</groupId>
            <artifactId>jackson-core-lgpl</artifactId>
            <version>1.9.13</version>
            <scope>test</scope>
        </dependency>
 
    </dependencies>
 
    <dependencyManagement>
        <dependencies>
            <dependency>
                <groupId>org.jboss.arquillian</groupId>
                <artifactId>arquillian-bom</artifactId>
                <version>${version.arquillian}</version>
                <type>pom</type>
                <scope>import</scope>
            </dependency>
            <dependency>
                <groupId>org.jboss.arquillian.selenium</groupId>
                <artifactId>selenium-bom</artifactId>
                <version>${version.selenium}</version>
                <type>pom</type>
                <scope>import</scope>
            </dependency>
            <dependency>
                <groupId>org.jboss.arquillian.extension</groupId>
                <artifactId>arquillian-drone-bom</artifactId>
                <version>${version.drone}</version>
                <type>pom</type>
                <scope>import</scope>
            </dependency>
        </dependencies>
    </dependencyManagement>

After:

<dependencies>
        <!-- Test dependencies -->
       <dependency>
            <groupId>junit</groupId>
            <artifactId>junit</artifactId>
            <version>4.11</version>
            <scope>test</scope>
        </dependency>
 
        <!-- arquillian -->
         <dependency>
            <groupId>org.jboss.arquillian</groupId>
            <artifactId>arquillian-all</artifactId>
            <version>1.0.1</version>
            <scope>test</scope>
        </dependency>
        <dependency>
            <groupId>xml-apis</groupId>
            <artifactId>xml-apis</artifactId>
            <version>1.4.01</version>
            <scope>test</scope>
        </dependency>
        <dependency>
            <groupId>org.jboss.shrinkwrap</groupId>
            <artifactId>shrinkwrap-api</artifactId>
            <version>1.2.2</version>
          <scope>test</scope>
    </dependency>
 
</dependencies>

Functional tests on each commit?

As functional tests tend to be slower than white box tests, it is unbearable to execute them on each commit in continuous integration; a way to work around that is to create a dedicated jenkins job to run the black box tests (the ft-tests profile we showed before) and schedule it to run at the end of the day, for example.

Another possibility is to execute the functional tests in a headless browser[8] so test execution is faster and viable on each commit (or at least every 10 minutes).

PhantomJS is the most recommended for this kind of test; it simulates a web browser through the V8 javascript engine (the same used by Chrome).

To execute tests using phantom in our sample application, just activate the 'webdriver-phantomjs' profile or pass -Darquillian.browser=phantomjs in the maven command.

Another headless webdriver option is HtmlUnit, also supported by Drone.

Prime arquillian

Prime-Arquillian is an open source project that aims at testing the Primefaces showcase; the expected output of the project is a set of Arquillian graphene page fragments representing primefaces components, to ease functional testing of primefaces based applications.

The project is in an initial state and has a draft of the primefaces datatable fragment, which can be found in the components module.

The alien build pipeline

A good test suite is the basis of a deployment pipeline, where we can promote a build across multiple stages; this process is well explained by Martin Fowler here.

For our application I've created a simple pipeline with only automated steps: it uses the arquillian tests of this post in a build pipeline orchestrated by Jenkins, with metrics published in Sonar, ending with a deployed version in the Wildfly application server.

I have not detailed the pipeline here so as not to extend this post even more; instead I've published a video on youtube. If you have any question you can post a comment here.

Conclusion

In this article we saw a powerful testing platform that facilitates the development of tests in the JavaEE ecosystem, and we also saw it working in practice on a JavaEE6 application.

We noticed that arquillian provides support for the most used kinds of tests and can easily be used in continuous integration. Its initial setup/configuration, mainly in existing projects, is not trivial, but once done it can bring numerous advantages.

References

 
 
 
 
 
 
 
 
 
 
 

My Forge Experience pt1

In this post I'll share my experience with JBoss Forge 1.x[1]; I will cover Forge 2.x[2] in pt2.

So the main objective of this entry is to create a forge plugin that gets useful information from OSGi[3][4][5] projects.

But what is Forge? And what is OSGi?

To be straightforward: forge is a plugin based java command line tool (in forge 2.x this definition may change a bit) built on CDI[6], providing a rich API to create and/or manipulate java projects, which is the main purpose of forge but it is not limited to that.

OSGi is the de facto standard for building modular, dynamic, service based java applications.

One of the benefits of Forge is to create, configure and add features to projects, such as JPA, REST and JSF functionality and so on. In this post we are NOT going to add any feature nor configure projects; instead we are going to inspect and extract data from existing projects. To be more exact, we are going to build a Forge plugin to get information from OSGi based projects, such as the one from this great paper introducing OSGi: http://homepages.dcc.ufmg.br/~mtov/pub/2008_sen.pdf.

Before we get our hands dirty, here is a video showing the resulting project of this post: http://youtu.be/rS-6LuMWPHI, and
the source code is available at github: https://github.com/rmpestano/intrabundle

Forge Configuration

The first thing to do is to start forge: you'll need to download a zipped file[12], uncompress it and execute the forge file (windows, linux and osx compatible).
An optional step is to add forge to your system path; on linux you can add the following line to the ~/.profile file:
export PATH=$PATH:/home/rmpestano/projetos/forge/dist/forge-distribution-1.4.3.Final/bin (on windows just add …..forge-distribution-1.4.3.Final/bin to the path environment variable). For detailed information see [7].

OBS: for debugging purposes you can also add to ~/.profile: export FORGE_OPTS="-Xdebug -Xnoagent -Djava.compiler=NONE -Xrunjdwp:transport=dt_socket,server=y,suspend=n,address=8000" (on windows create the FORGE_OPTS environment variable) so you can attach a remote debugger to your plugin.

Creating the plugin project

After that we are going to create our plugin project, and forge will help us with that.
Open a terminal and start forge by typing 'forge' (if you didn't add forge to your environment, cd into forge-distribution/bin and execute the command), as in image 1:

pic01

image 1

Now change dir to a folder of your choice and type new-project --named intrabundle --topLevelPackage br.ufrgs.rmpestano.intrabundle
to create the project.

Next let's create our forge plugin. First we need to set up the plugin dependencies: type plugins setup and accept the defaults by pressing ENTER.

OBS: you can cd into pom.xml and type 'ls' to confirm that the dependencies were added, see image 2:

pic03

image 2

After setting up the plugin, the plugins new-plugin command becomes available, and here goes a (big) parenthesis: how did that command show up just now?
One great thing about Forge is that it is itself built out of forge plugins, so we get a lot of (nice) plugin examples for free. Looking at
org.jboss.forge.dev.PluginsPlugin.java we can see how that "magic" works: basically it is annotated with '@RequiresFacet(ForgeAPIFacet.class)', which tells forge that this plugin's commands (except the @SetupCommand) can only be executed in a certain context, and who dictates this context is the required facet, in this case ForgeAPIFacet.java. The context is 'alive' when the facet's isInstalled() method returns true. Take a look at [8] for more information about facets.

Closing our big parenthesis, let's execute plugins new-plugin --named OSGiPlugin --alias osgi

Now you can see that the OSGiPlugin.java file was created by forge with some basic commands. OSGiPluginTest.java was also created so you can test your plugin without starting forge and installing the plugin; to do that, forge leverages the Arquillian framework[9], the de facto framework for testing JavaEE applications. To run the generated tests, (via IDE) just right click OSGiPluginTest and Run as JUnit test, or (via forge) run the 'build' command on the project; see [11] for more details on testing plugins.
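
The generated test looks roughly like this (a simplified sketch of the Forge 1 scaffold):

import org.jboss.arquillian.container.test.api.Deployment;
import org.jboss.forge.test.AbstractShellTest;
import org.jboss.shrinkwrap.api.spec.JavaArchive;
import org.junit.Test;

public class OSGiPluginTest extends AbstractShellTest {

    @Deployment
    public static JavaArchive getDeployment() {
        //deploys the plugin classes into the embedded forge test harness
        return AbstractShellTest.getDeployment()
                .addPackages(true, OSGiPlugin.class.getPackage());
    }

    @Test
    public void testDefaultCommand() throws Exception {
        getShell().execute("osgi");
    }
}
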
Let's take a look at OSGiPlugin.java:

package br.ufrgs.rmpestano.intrabundle;
//imports ommited
@Alias("osgi")
public class OSGiPlugin implements Plugin
{
   @Inject
   private ShellPrompt prompt;

   @DefaultCommand
   public void defaultCommand(@PipeIn String in, PipeOut out)
   {
      out.println("Executed default command.");
   }

   @Command
   public void command(@PipeIn String in, PipeOut out, @Option String... args)
   {
      if (args == null)
         out.println("Executed named command without args.");
      else
         out.println("Executed named command with args: " + Arrays.asList(args));
   }

   @Command
   public void prompt(@PipeIn String in, PipeOut out)
   {
      if (prompt.promptBoolean("Do you like writing Forge plugins?"))
         out.println("I am happy.");
      else
         out.println("I am sad.");
   }
}

It has three commands, denoted by the @Command annotation; the name of a command is the name of the method (you can provide a different name via the 'value' attribute of the command annotation). Every command is prefixed by the plugin alias, the @Alias annotation at class level, plus the command name (in our case @Alias("osgi")). The default command has the same name as the plugin alias.

Installing the Plugin

To install our plugin and start executing its commands, type forge source-plugin "project location", as in images 3.1 and 3.2:

img04.0

image 3.1

img04

image 3.2

Now you can execute the commands by typing 'osgi command' (use tab for autocompletion).

If your project is available in a git repository such as github, you can install the plugin directly from it using the forge git-plugin command; for our plugin you should use: forge git-plugin git://github.com/rmpestano/intrabundle.git. See image 4:

pic02

image 4 

That one was easy, but as you can see, the commands can be executed regardless of the location or project you are in, and our idea is to execute them only on top of OSGi based projects. To do that we are going to create our plugin facet (the same idea behind the 'plugins new-plugin' command we talked about earlier).

The OSGi Facet

To restrict our plugin to OSGi projects we need to specify the prerequisite a project must satisfy to be considered OSGi based.

One thing that differentiates OSGi projects from others is the presence of OSGi metadata in the MANIFEST.MF file, so this is what the OSGi facet will look for. We will also consider that OSGi projects have a parent folder with their modules (a.k.a. bundles) inside, so the OSGiFacet algorithm will go down the directory levels below the project root; when it finds META-INF folders it will try to find a MANIFEST file inside, and if it succeeds it will read the file looking for OSGi metadata. If it finds any, the OSGiFacet is satisfied and we will be able to execute our plugin commands.
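
For reference, a typical bundle manifest carrying this kind of metadata looks like the snippet below (the values are illustrative):

Manifest-Version: 1.0
Bundle-ManifestVersion: 2
Bundle-SymbolicName: br.ufrgs.example.hello
Bundle-Version: 1.0.0
Bundle-Activator: br.ufrgs.example.hello.Activator
Import-Package: org.osgi.framework

And here is the facet code: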

package br.ufrgs.rmpestano.intrabundle;

import org.jboss.forge.project.facets.BaseFacet;
import org.jboss.forge.resources.DirectoryResource;
import org.jboss.forge.resources.Resource;
import org.jboss.forge.resources.ResourceFilter;

import javax.inject.Singleton;
import java.io.File;
import java.io.IOException;
import java.io.RandomAccessFile;
import java.util.ArrayList;
import java.util.List;

@Singleton
public class OSGiFacet extends BaseFacet {

    private int metaInfSearchLevel = 1;//defines how many directory levels to go down looking for OSGi metadata(manifest file)

    @Override
    public boolean install() {

        /**
         * we are not going to install OSGi projects,
         * just analyse existing ones
         *
         */
        return isInstalled();
    }

    @Override
    public boolean isInstalled() {
        return isOSGiProject(project.getProjectRoot());

    }

    /**
     * search OSGi metadata looking for a META-INF directory with a MANIFEST.MF file
     * containing the 'bundle' word
     *
     * @param directoryResource
     * @return
     */
    public boolean isOSGiProject(DirectoryResource directoryResource) {
        List<Resource<?>> metaInfList = new ArrayList<Resource<?>>();

        this.getMetaInfDirectories(directoryResource, metaInfList, 0);

        if (metaInfList.isEmpty()) {
            return false;
        }
        for (Resource<?> metaInf : metaInfList) {
            if (isOSGiModule(metaInf)) {
                return true;
            }
        }
        return false;
    }

    /**
     * gather META-INF directories by looking
     * for each @parent directory get its meta-inf directory
     * until metaInfSearchLevel is reached
     *
     * @param parent
     * @param resourcesFound
     * @param currentLevel
     */
    public void getMetaInfDirectories(DirectoryResource parent, List<Resource<?>> resourcesFound, int currentLevel) {
        if (currentLevel >= metaInfSearchLevel) {
            return;
        }
        for (Resource<?> r : parent.listResources()) {
            if (r instanceof DirectoryResource) {
                resourcesFound.addAll(r.listResources(new ResourceFilter() {
                    @Override
                    public boolean accept(Resource<?> resource) {
                        return resource.getName().equalsIgnoreCase("meta-inf");
                    }
                }));
                getMetaInfDirectories((DirectoryResource) r, resourcesFound, ++currentLevel);
            }
        }

    }

    private boolean isOSGiModule(Resource<?> metaInf) {
        Resource<?> manifest = metaInf.getChild("MANIFEST.MF");
        if (!manifest.exists()) {
            return false;
        }
        RandomAccessFile randomAccessFile;
        try {
            File f = new File(manifest.getFullyQualifiedName());
            randomAccessFile = new RandomAccessFile(f, "r");
            return hasOsgiConfig(randomAccessFile);
        } catch (Exception e) {
            e.printStackTrace();
            return false;
        }
    }

    private boolean hasOsgiConfig(RandomAccessFile aFile) throws IOException {
        String line;
        while ((line = aFile.readLine()) != null) {
            if (line.contains("Bundle")) {
                return true;
            }
        }
        return false;

    }

    public int getMetaInfSearchLevel() {
        return metaInfSearchLevel;
    }

    public void setMetaInfSearchLevel(int metaInfSearchLevel) {
        this.metaInfSearchLevel = metaInfSearchLevel;
    }
}

Now add @RequiresFacet to our OSGiPlugin and install it again:

@Alias("osgi")
@RequiresFacet(OSGiFacet.class)
public class OSGiPlugin implements Plugin

Now you can only execute the commands in OSGi projects, the ones that have a subfolder with a META-INF folder containing a MANIFEST.MF file with OSGi metadata. You can find an example OSGi project in [10]; it's from the paper 'A gentle Introduction to OSGi'[13].

Before testing the plugin there is one problem: a limitation of Forge 1.x is that it needs a pom.xml file in the project root to work; in other words, it was made for maven projects. Most OSGi projects use the eclipse bnd tools plugin, some use maven plus the maven bundle plugin, but we are going to focus on the non maven ones. To work around this limitation we are going to add a minimal pom.xml to our OSGi projects, and to do that we are going to create a forge Project Locator.
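
The pom.xml we add can be as minimal as the snippet below (groupId/artifactId are placeholders):

<project xmlns="http://maven.apache.org/POM/4.0.0"
         xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
         xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
    <modelVersion>4.0.0</modelVersion>
    <groupId>intrabundle</groupId>
    <artifactId>osgi-project</artifactId>
    <version>1.0</version>
    <packaging>pom</packaging>
</project>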

OSGi Project Locator

A project locator is responsible for finding and creating forge projects; locators are called by org.jboss.forge.project.services.ProjectFactory#findProject() every time we change folder.

A forge project is an object that holds information about a project, such as its directory location, the facets it satisfies and so on.

So when we cd into an OSGi project our locator will create a java object representing it; it will also create a minimal pom.xml in the project root to overcome the forge1 limitation we talked about. Here is the interface our project will implement:

public interface OSGiProject extends Serializable{

    List<OSGiModule> getModules();
}

Basically our OSGi project will hold a list of OSGi modules:

import org.jboss.forge.project.Project;
import org.jboss.forge.resources.FileResource;

import java.io.Serializable;

public interface OSGiModule extends Serializable,Project {

    Long getLinesOfCode();

    Boolean getUsesDeclarativeServices();

    FileResource<?> getManifest();

    FileResource<?> getActivator();
}

Here is OSGiProjectImpl.java:

import br.ufrgs.rmpestano.intrabundle.facet.OSGiFacet;
import org.jboss.forge.project.BaseProject;
import org.jboss.forge.project.Project;
import org.jboss.forge.project.Facet;
import org.jboss.forge.project.facets.FacetNotFoundException;
import org.jboss.forge.project.services.ProjectFactory;
import org.jboss.forge.resources.DirectoryResource;
import org.jboss.forge.resources.Resource;

import javax.enterprise.inject.Typed;
import java.util.ArrayList;
import java.util.List;

@Typed()
public class OSGiProjectImpl extends BaseProject implements OSGiProject,Project {
    private DirectoryResource projectRoot = null;
    private final ProjectFactory factory;
    private List<OSGiModule> modules;

    public OSGiProjectImpl() {
        factory = null;
    }

    public OSGiProjectImpl(final ProjectFactory factory, final DirectoryResource dir) {
        this.factory = factory;
        this.projectRoot = dir;
    }

    @Override
    public <F extends Facet> F getFacet(final Class type) {
        try {
            return super.getFacet(type);
        } catch (FacetNotFoundException e) {
            factory.registerSingleFacet(this, type);
            return super.getFacet(type);
        }
    }

    public List<OSGiModule> getModules() {
        if (modules == null) {
            modules = initalizeModules();
        }
        return modules;
    }

    private List<OSGiModule> initalizeModules() {
        List<OSGiModule> modulesFound = new ArrayList<>();
        OSGiFacet osgi = getFacet(OSGiFacet.class);
        List<Resource<?>> metaInfList = new ArrayList<Resource<?>>();
        osgi.getMetaInfDirectories(this.getProjectRoot(), metaInfList, 0);
        for (Resource<?> resource : metaInfList) {
            OSGiModule osGiModule = new OSGiModuleImpl(factory, (DirectoryResource) resource.getParent());
            modulesFound.add(osGiModule);
        }
        return modulesFound;
    }

    @Override
    public DirectoryResource getProjectRoot() {
        return projectRoot;
    }

    @Override
    public boolean exists() {
        return (projectRoot != null) && projectRoot.exists();
    }

    @Override
    public String toString() {
        return "OSGiProjectImpl [" + getProjectRoot() + "]";
    }

}

and OSGiModuleImpl.java:

import org.jboss.forge.project.BaseProject;
import org.jboss.forge.project.Facet;
import org.jboss.forge.project.facets.FacetNotFoundException;
import org.jboss.forge.project.services.ProjectFactory;
import org.jboss.forge.resources.DirectoryResource;
import org.jboss.forge.resources.FileResource;
import org.jboss.forge.resources.Resource;

import javax.enterprise.inject.Typed;
import java.io.File;
import java.io.IOException;
import java.io.RandomAccessFile;

@Typed()
public class OSGiModuleImpl extends BaseProject implements OSGiModule {
    private DirectoryResource projectRoot = null;
    private final ProjectFactory factory;
    private Long totalLoc;
    private Boolean usesDeclarativeServices;
    private FileResource<?> activator;
    private FileResource<?> manifest;

    public OSGiModuleImpl() {
        factory = null;
    }

    public OSGiModuleImpl(final ProjectFactory factory, final DirectoryResource dir) {
        this.factory = factory;
        this.projectRoot = dir;
    }

    @Override
    public <F extends Facet> F getFacet(final Class type) {
        try {
            return super.getFacet(type);
        } catch (FacetNotFoundException e) {
            factory.registerSingleFacet(this, type);
            return super.getFacet(type);
        }
    }

    @Override
    public DirectoryResource getProjectRoot() {
        return projectRoot;
    }

    @Override
    public boolean exists() {
        return (projectRoot != null) && projectRoot.exists();
    }

    @Override
    public String toString() {
        return getProjectRoot().toString();
    }

    private FileResource<?> findActivator() throws IOException {
        RandomAccessFile randomAccessFile;
        File f = new File(getManifest().getFullyQualifiedName());
        randomAccessFile = new RandomAccessFile(f, "r");

        String line;
        while((line = randomAccessFile.readLine()) != null){
            if (line.contains("Bundle-Activator:")) {
               break;
            }
        }
        if(line == null){
            return null;//no activator
        }
        String activatorPath = line.trim().substring(line.indexOf("Bundle-Activator:") + 17);
        activatorPath = activatorPath.trim().replaceAll("\\.","/");
        if(!activatorPath.startsWith("/")){
            activatorPath = "/" +activatorPath;
        }
        activatorPath = "/src"+activatorPath;
        Resource<?> activator = getProjectRoot().getChild(activatorPath.concat(".java"));
        if(activator == null || !activator.exists()){
            throw new RuntimeException("Could not find activator class at "+getProjectRoot() + activatorPath);
        }

        return (FileResource<?>) activator;

    }

    private Long countModuleLines(DirectoryResource projectRoot) {
        for (Resource<?> resource : projectRoot.listResources()) {
            if (resource instanceof FileResource<?> && resource.getName().endsWith(".java")) {
                try {
                    this.totalLoc += countFileLines((FileResource<?>) resource);
                } catch (Exception e) {
                    e.printStackTrace();
                }
            } else if (resource instanceof DirectoryResource) {
                this.totalLoc = countModuleLines((DirectoryResource) resource);
            }
        }
        return totalLoc;
    }

    private Long countFileLines(FileResource<?> resource) throws IOException {
        RandomAccessFile file = new RandomAccessFile(new File(resource.getFullyQualifiedName()), "r");
        Long total = new Long(0);
        String line;
        while ((line = file.readLine()) != null) {
            total++;
        }
        return total;
    }

    private boolean usesDeclarativeServices() {
        Resource<?> OSGiInf = getProjectRoot().getChild("OSGI-INF");
        return OSGiInf.exists() && OSGiInf.getChild("service.xml").exists();
    }

    //getters

    public Boolean getUsesDeclarativeServices() {
        if (usesDeclarativeServices == null) {
            usesDeclarativeServices = usesDeclarativeServices();
        }
        return usesDeclarativeServices;
    }

    @Override
    public FileResource<?> getActivator() {
        if (activator == null) {
            try {
                activator = findActivator();
            } catch (IOException e) {
                throw new RuntimeException("Could not find activator class");
            }
        }
        return activator;
    }

    @Override
    public FileResource<?> getManifest() {
        if (manifest == null) {
            manifest = findManifest();
        }
        return manifest;
    }

    private FileResource<?> findManifest() {
        Resource<?> metaInf = getProjectRoot().getChild("META-INF");
        if (metaInf == null || !metaInf.exists()) {
            throw new RuntimeException("OSGi project(" + getProjectRoot().getFullyQualifiedName() + ") without META-INF directory cannot be analysed by intrabundle");
        }
        Resource<?> manifest = metaInf.getChild("MANIFEST.MF");
        if (manifest == null || !manifest.exists()) {
            throw new RuntimeException("OSGi project(" + getProjectRoot().getFullyQualifiedName() + ") without MANIFEST.MF file cannot be analysed by intrabundle");
        }
        return (FileResource<?>) manifest;
    }

    public Long getLinesOfCode() {
        if (totalLoc == null) {
            totalLoc = new Long(0L);
            totalLoc = countModuleLines(getProjectRoot());
        }
        return totalLoc;
    }
}

I'm not getting into implementation details here, but you can see that I'm just using the forge api and standard java file manipulation to get module information, such as its location, lines of code and so on.

Back to the OSGi project locator, the guy that will in fact create the OSGiProject and add the minimal pom.xml (the pom.xml file to be added must be in the intrabundle/src/main/resources folder):

@Singleton
public class OSGiProjectLocator implements ProjectLocator {

    private final ProjectFactory factory;

    private final Instance<OSGiFacet> osgiFacetInstance;

    @Inject
    Shell shell;

    @Inject
    public OSGiProjectLocator(final ProjectFactory factory, @Any final Instance<OSGiFacet> osgiFacet) {
        this.factory = factory;
        this.osgiFacetInstance = osgiFacet;
    }

    @Override
    public Project createProject(DirectoryResource directoryResource) {
        OSGiFacet osgi = osgiFacetInstance.get();
        OSGiProjectImpl result = new OSGiProjectImpl(factory, directoryResource);
        osgi.setProject(result);
        /* we are not going to install OSGi projects, only inspect existing ones
        if (!osgi.isInstalled()) {
            result.installFacet(osgi);
        } else    */
        result.registerFacet(osgi);

        if (!result.hasFacet(OSGiFacet.class)) {
            return null;
        }
	//FORGE limitation of having a pom.xml in project root
        if(!directoryResource.getChild("pom.xml").exists()){
            FileResource<?> pom = (FileResource<?>) directoryResource.getChild("pom.xml");
            pom.setContents(getClass().getResourceAsStream("/pom.xml"));
        }
        shell.println(ShellColor.YELLOW,"OSGi project detected, type osgi + tab to list available commands");
        return result;
    }

    @Override
    public boolean containsProject(DirectoryResource directoryResource) {
        return osgiFacetInstance.get().isOSGiProject(directoryResource);
    }
}

So the locator will act only on projects that satisfy the OSGiFacet. Forge will use it automatically because ProjectFactory iterates over all classes implementing ProjectLocator, which is the case of OSGiProjectLocator.

Now install the plugin again and you are ready to use it on non maven based projects.

Implementing OSGiPlugin commands

As you saw in OSGiModuleImpl, we already have some methods that get information from OSGi projects, but how can our plugin access the OSGiProject and get its modules?

As forge leverages the CDI programming model, we just inject the current project in the plugin with @Inject OSGiProject project. Because we created OSGiProjectImpl via the new operator in OSGiProjectLocator, CDI is not aware of it, so we need to produce the OSGiProject so CDI can handle its injection.

We will produce it in OSGiFacet, which holds the current OSGi project (set by OSGiProjectLocator#createProject); here is the producer method:

@Produces
public OSGiProject getCurrentOSGiProject() {
    return (OSGiProject) getProject();
}

Now we have access to the current OSGiProject and its modules via CDI injection; here is the OSGiPlugin commands implementation:

package br.ufrgs.rmpestano.intrabundle.plugin;

import br.ufrgs.rmpestano.intrabundle.facet.OSGiFacet;
import br.ufrgs.rmpestano.intrabundle.i18n.ResourceBundle;
import br.ufrgs.rmpestano.intrabundle.model.OSGiModule;
import br.ufrgs.rmpestano.intrabundle.model.OSGiProject;
import org.jboss.forge.shell.ShellColor;
import org.jboss.forge.shell.ShellPrompt;
import org.jboss.forge.shell.plugins.*;

import javax.enterprise.inject.Instance;
import javax.inject.Inject;
import java.util.List;

@Alias("osgi")
@RequiresFacet(OSGiFacet.class)
public class OsgiPlugin implements Plugin {

    @Inject
    private ShellPrompt prompt;

    @Inject
    OSGiProject project;

    @Inject
    @Current
    Instance<ResourceBundle> resourceBundle;

    @DefaultCommand
    public void defaultCommand(@PipeIn String in, PipeOut out) {
        out.println(ShellColor.YELLOW, resourceBundle.get().getString("osgi.defaultCommand"));
    }

    @Command(value = "countBundles")
    public void countBundles(@PipeIn String in, PipeOut out) {
        out.println("Total number of bundles:" + getModules().size());
    }

    @Command(value = "listBundles")
    public void listBundles(@PipeIn String in, PipeOut out) {
        for (int i = 0; i < getModules().size(); i++) {
            out.println("bundle(" + i + "):" + getModules().get(i).getProjectRoot());
        }
    }

    @Command(value = "loc", help = "count lines of code of all bundles")
    public void loc(@PipeIn String in, PipeOut out) {
        long total = 0;
        for (int i = 0; i < getModules().size(); i++) {
            long loci = getModules().get(i).getLinesOfCode();
            out.println(getModules().get(i).getProjectRoot() + ":" + loci);
            total += loci;
        }
        out.println("Total lines of code:" + total);
    }

    @Command(value = "usesDeclarativeServices", help = "list modules that use declarative services")
    public void usesDeclarativeServices(@PipeIn String in, PipeOut out) {
        out.println(resourceBundle.get().getString("osgi.declarativeServices"));
        for (OSGiModule module: getModules()) {
            if(module.getUsesDeclarativeServices()){
                out.println(module.toString());
            }
        }
    }

    @Command(value = "listActivators", help = "list modules activator classes")
    public void listActivators(@PipeIn String in, PipeOut out) {
        out.println(resourceBundle.get().getString("osgi.listActivators"));
        for (OSGiModule module: getModules()) {
             out.println(module.toString()+":"+(module.getActivator() != null ? module.getActivator().getFullyQualifiedName() : resourceBundle.get().getString("osgi.no-activator")));
        }
    }

    public List<OSGiModule> getModules() {
        return project.getModules();
    }

}
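With the plugin installed and the Forge shell pointed at an OSGi project, the commands can be invoked like this (illustrative paths; the output format comes from the println calls above):

osgi countBundles
Total number of bundles:3

osgi listBundles
bundle(0):/projects/osgi-example/bundle-a
bundle(1):/projects/osgi-example/bundle-b
bundle(2):/projects/osgi-example/bundle-c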

Conclusion

JBoss Forge is a great tool: it has a great API for manipulating projects and a very nice, easy to understand architecture. We saw here a simple plugin that inspects OSGi project files and structure; your imagination is the limit for creating plugins.

I hope you enjoyed it.


CDI Generic Dao

UPDATE(November/2014):

Just added multiple datasources support to our example application; see the details here: https://rpestano.wordpress.com/2014/11/04/cdi-crud-multi-tenancy/

<END UPDATE>

UPDATE(September/2014):

I've revisited this post some days ago and refactored the code. Of course, I kept everything here and the GitHub repo untouched so you can still follow this post. Instead, I've created a new GitHub repository to share the new code there: I have updated the APIs, added some (Arquillian) tests, added new functionality like true pagination and new APIs (Hibernate and DeltaSpike), and now it works on WildFly, GlassFish and JBoss 7. I've also changed the project structure a bit, but the idea is the same; the code can be found here: cdi-crud.

<END UPDATE>

In this post I will show how to implement basic CRUD operations using a generic DAO based on CDI beans. It's not the purpose here to discuss the DAO pattern itself; there are already long discussions about it, see [1], [2], [3] and [4]. For a good and detailed introduction to the pattern see [5].

The source code can be found here: https://github.com/rmpestano/cdi-generic-dao and, as usual, there is a video produced for this post here: http://www.youtube.com/watch?v=9mGQx0tjxgo&feature=youtu.be

Show me the code

Let's get our hands dirty with some code. Here is the classic BaseEntity, which in this case holds only the primary key of our JPA entities:

@MappedSuperclass
public abstract class BaseEntity<ID> {

    @Id
    @GeneratedValue(strategy = GenerationType.AUTO)
    private ID id;

    public ID getId() {
        return id;
    }

    public void setId(ID id) {
        this.id = id;
    }

}

Next is the simple Car entity:

@Entity
public class Car extends BaseEntity<Integer>{

    private String model;
    private Double price;

    //constructors, getters & setters
}

Now the so-called generic DAO:

@Stateless
@Named("baseDao")
public class BaseDao<T extends BaseEntity<ID>, ID> implements Serializable {

    @PersistenceContext
    private EntityManager entityManager;
    private Class<T> entityClass;

    public EntityManager getEntityManager() {
        return entityManager;
    }

    public void setEntityManager(EntityManager entityManager) {
        this.entityManager = entityManager;
    }

    public Class<T> getEntityClass() {
        if (entityClass == null) {
            //only works if one extends BaseDao, we will take care of it with CDI
            entityClass = (Class<T>) ((ParameterizedType) getClass().getGenericSuperclass()).getActualTypeArguments()[0];
        }
        return entityClass;
    }

    public void setEntityClass(Class<T> entityClass) {
        this.entityClass = entityClass;
    }

    //utility database methods
    @TransactionAttribute(TransactionAttributeType.SUPPORTS)
    public T find(ID id) {
        return this.entityManager.find(getEntityClass(), id);
    }

    public void delete(ID id) {
        Object ref = this.entityManager.getReference(getEntityClass(), id);
        this.entityManager.remove(ref);
    }

    public T update(T t) {
        return this.entityManager.merge(t);
    }

    public void insert(T t) {
        this.entityManager.persist(t);
    }

    @TransactionAttribute(TransactionAttributeType.SUPPORTS)
    public List<T> findAll() {
        return entityManager.createQuery("select entity from " + getEntityClass().getSimpleName() + " entity").getResultList();
    }
    //more utility methods
}

It is a usual DAO: it has a PersistenceContext injected by the EJB container, it's a stateless session bean, and it has some utility database methods. But it cannot work alone; you need to feed it with an entity so it can perform database operations.

There are several ways to provide an entity to a generic DAO: via constructor, annotation, extending it, etc. In this example we provide the entity in two ways: by extending the DAO and via a CDI producer.
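For comparison, the constructor-based alternative would look something like the sketch below; this is hypothetical code shown only for contrast, not part of the example project, and its drawback is that the caller has to wire the entity type by hand instead of letting the container do it:

//hypothetical constructor-based variant, shown only for comparison
public class ConstructorDao<T extends BaseEntity<ID>, ID> implements Serializable {

    private final Class<T> entityClass;

    public ConstructorDao(Class<T> entityClass) {
        this.entityClass = entityClass;//entity type supplied explicitly by the caller
    }
    //...same utility methods as BaseDao
}

//usage: the entity type is passed by hand
ConstructorDao<Car, Integer> carDao = new ConstructorDao<Car, Integer>(Car.class);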

Extending the Dao

Here is a generic dao specialization, the CarDao:

@Stateless
public class CarDao extends BaseDao<Car, Integer>{
    //put here specific car business
}

The entity type is resolved by BaseDao#getEntityClass() via ParameterizedType, and now we can inject our DAO in any bean:


public class SomeBean{
    @Inject
    CarDao carDao;

    public void createCar(Car car){
      carDao.insert(car);
    }
}

This is fine because you will usually extend the base DAO to add operations more complex than CRUD; but sometimes you need only the CRUD, and then you have to create empty DAOs extending BaseDao just to benefit from its utility methods. What I really want is to inject the BaseDao directly:

   @Inject
   BaseDao<Car,Integer> baseCarDao;

The only thing that prevents me from doing that is that the ParameterizedType can only be obtained from a superclass with getGenericSuperclass() (the only way I know 😉), at least before CDI came along.
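To make that limitation concrete: getGenericSuperclass() only resolves the type arguments when there is a concrete subclass fixing them, as this small illustrative snippet shows:

//works: CarDao extends BaseDao<Car, Integer>, so the type arguments are recorded in the class file
Class<?> resolved = (Class<?>) ((ParameterizedType) CarDao.class
        .getGenericSuperclass()).getActualTypeArguments()[0];
System.out.println(resolved);//prints class Car (with its package)

//does not work: a field declared as BaseDao<Car, Integer> has no generic superclass to inspect,
//so BaseDao alone cannot discover its own entity type at runtime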

Producing the Dao

We are going to set the entity in a CDI producer. To do that we provide a qualifier that will tell CDI we want a produced DAO, not the real one. Here is the Dao qualifier:

@Qualifier
@Retention(RUNTIME)
@Target({FIELD,PARAMETER,METHOD,TYPE})
public @interface Dao {
}

And here is our producer, responsible for creating a baseDao with its entity class set based on the injection point's ParameterizedType:

public class DaoProducer implements Serializable {

    private static final long serialVersionUID = 1L;

    @Produces
    @Dependent//must be the dependent pseudo scope because baseDao is a stateless session bean
    @Dao
    public <ID, T extends BaseEntity<ID>> BaseDao<T, ID> produce(InjectionPoint ip, BeanManager bm) {
        if (ip.getAnnotated().isAnnotationPresent(Dao.class)) {
            BaseDao<T, ID> genericDao = (BaseDao<T, ID>) this.getBeanByName("baseDao", bm);//ask the bean manager for an instance of the generic dao
            ParameterizedType type = (ParameterizedType) ip.getType();
            Type[] typeArgs = type.getActualTypeArguments();
            Class<T> entityClass = (Class<T>) typeArgs[0];//first type argument of the injection point, e.g. Car in BaseDao<Car, Integer>
            genericDao.setEntityClass(entityClass);
            return genericDao;
        }
        throw new IllegalArgumentException("Annotation @Dao is required when injecting BaseDao");
    }

    public Object getBeanByName(String name, BeanManager bm) { //e.g. name = "baseDao"
        Bean<?> bean = bm.getBeans(name).iterator().next();
        CreationalContext<?> ctx = bm.createCreationalContext(bean);
        return bm.getReference(bean, bean.getBeanClass(), ctx);
    }

}

The 'secret' here is that we infer the DAO's entity from the InjectionPoint, so when we inject BaseDao with:

  @Inject @Dao BaseDao<User,Long> baseUserDao;

the (Class<T>) typeArgs[0] cast in the producer will return User.class and we set it in the DAO. Below is a concrete bean using our DAO to perform CRUD operations:

@Named
@ViewAccessScoped
public class CarBean implements Serializable{

    private List<Car> carList;
    private List<Car> filteredValue;//datatable filteredValue attribute
    private Integer id;
    private Car car;

    @Inject CarDao carDao;

    @Inject @Dao
    BaseDao<Car,Integer> genericDao;//reuse generic dao for basic crud operation in various entities
//    @Inject @Dao
//    BaseDao<Person,Long> genericDao;
//    @Inject @Dao
//    BaseDao<Client,IDClass> genericDao;

    @PostConstruct
    public void init(){
        if(genericDao.findAll().isEmpty()){
            for (int i = 1; i < 10; i++) {
                Car c = new Car("Car"+i, i);
                genericDao.insert(c);
            }
        }
        //same as above
//         if(carDao.findAll().isEmpty()){
//            for (int i = 0; i < 10; i++) {
//                Car c = new Car("Car"+i, i);
//                carDao.insert(c);
//            }
//        }

    }

    public List<Car> getCarList(){
        if(carList == null){
            carList = carDao.findAll();
        }
        return carList;
    }

    public Integer getId() {
        return id;
    }

    public void setId(Integer id) {
        this.id = id;
    }

    public Car getCar() {
        if(car == null){
            car = new Car();
        }
        return car;
    }

    public void setCar(Car car) {
        this.car = car;
    }

    public void findCarById(Integer id){
         car = genericDao.find(id);
    }

    public List<Car> getFilteredValue() {
        return filteredValue;
    }

    public void setFilteredValue(List<Car> filteredValue) {
        this.filteredValue = filteredValue;
    }

    public void remove(){
        if(car != null && car.getId() != null){
            genericDao.delete(car.getId());
            FacesContext.getCurrentInstance().addMessage(null, new FacesMessage("Car "+car.getModel() +" removed successfully"));
            clear();
        }
    }

    public void update(){
        String msg;
        if(car.getId() == null){
             genericDao.insert(car);
             msg = "Car "+car.getModel() +" created successfully";
        }
        else{
           genericDao.update(car);
           msg = "Car "+car.getModel() +" updated successfully";
        }
        FacesContext.getCurrentInstance().addMessage(null, new FacesMessage(msg));
        clear();//reload car list
    }

    public void clear(){
        car = new Car();
        carList = null;
        id = null;
    }

    public void onRowSelect(SelectEvent event) {
        setId(((Car) event.getObject()).getId());
        findCarById(getId());
    }

    public void onRowUnselect(UnselectEvent event) {
        car = new Car();
    }
}

Addendum

Just a small addendum (which is not that small): there was an issue with the above implementation which showed up when using BaseDao to CRUD more than one entity in the same bean. The bug was caused because I was storing the entity class in a stateless session bean (BaseDao); although CDI produces different instances for each injection point, the container was returning the same session bean (from the pool of BaseDaos) for each produced bean. The solution was to remove the entity class from BaseDao and put it in a CDI bean called CrudDao, which has a composition association with BaseDao, making BaseDao really stateless (as it should be). A sketch of the refactored BaseDao methods follows the listing below. Here is the CrudDao:

@Named("crudDao")
public class CrudDao<T extends BaseEntity<ID>, ID> implements Serializable{

    @Inject
    protected BaseDao<T,ID> dao;

    protected Class<T> entityClass;

   public Class<T> getEntityClass() {
        if (entityClass == null) {
            //only works if one extends CrudDao, we will take care of it with CDI
            entityClass = (Class<T>) ((ParameterizedType) getClass().getGenericSuperclass()).getActualTypeArguments()[0];
        }
        return entityClass;
    }

    public void setEntityClass(Class<T> entityClass) {
        this.entityClass = entityClass;
    }

    public EntityManager getEntityManager(){
        return dao.getEntityManager();
    }

    public T find(ID id){
        return dao.find(id, getEntityClass());
    }

    public void delete(ID id){
         dao.delete(id, getEntityClass());
    }

    public T update(T t){
        return dao.update(t);
    }

    public void insert(T t){
        dao.insert(t);
    }

    public List<T> findAll(){
        return dao.findAll(getEntityClass());
    }

    public List<T> findWithNamedQuery(String namedQueryName){
        return dao.findWithNamedQuery(namedQueryName);
    }
}
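Since BaseDao no longer stores the entity class, its CRUD methods now receive it as a parameter, which is why CrudDao calls dao.find(id, getEntityClass()). Here is a minimal sketch of how the refactored methods could look (an assumption inferred from the calls above; the exact code is in the repository):

//sketch of the refactored, truly stateless BaseDao methods
@TransactionAttribute(TransactionAttributeType.SUPPORTS)
public T find(ID id, Class<T> entityClass) {
    return entityManager.find(entityClass, id);
}

public void delete(ID id, Class<T> entityClass) {
    Object ref = entityManager.getReference(entityClass, id);
    entityManager.remove(ref);
}

@TransactionAttribute(TransactionAttributeType.SUPPORTS)
public List<T> findAll(Class<T> entityClass) {
    return entityManager.createQuery("select entity from " + entityClass.getSimpleName() + " entity").getResultList();
}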

References:

[1]http://www.adam-bien.com/roller/abien/entry/generic_crud_service_aka_dao

[2]http://www.infoq.com/news/2007/09/jpa-dao

[3]http://www.adam-bien.com/roller/abien/entry/jpa_ejb3_killed_the_dao

[4]http://www.rponte.com.br/2009/06/08/no-more-daos/ [PT_BR]

[5]http://tutorials.jenkov.com/java-persistence/dao-design-pattern.html

CDI Custom Scope

In this post I will show how to create a custom CDI scope by performing the following steps:

  1. register the scope in a CDI Extension
  2. manage the scope through a CDI context
  3. show how to store and keep custom scoped beans ‘alive’
  4. ‘kill’ custom beans through CDI events

Check the source code of this entry at: https://github.com/rmpestano/cdi-custom-scope. You can also see a video produced from this blog entry's code here.

First of all, let's talk about CDI scopes, which are responsible for the lifecycle of CDI beans. Every scope is bound to a CDI context [1], which in turn manages the scope and states when the scope is active or passive. The context also holds all instances of the beans associated with the scope it manages.

For example, the built-in RequestScope is bound to an HTTP request; the RequestContext must ensure that all beans in this scope stay alive during an HTTP request and are passivated at the end of it. You can see Weld's HTTP-based contexts in [4].

Creating the Custom Scope

We will create a CDI custom scope which will be activated when a bean in this scope is referenced through EL and/or injected into another bean, and which will 'die' when a specific CDI event is fired.

First thing to do is create an annotation which will represent our scope:

@Scope
@Retention(RetentionPolicy.RUNTIME)
@Target({ElementType.TYPE,ElementType.METHOD,ElementType.FIELD})
public @interface MyScope {}

Now we add our scope and register the context that will manage it in a CDI extension:

public class CustomScopeExtension implements Extension, Serializable {

    public void addScope(@Observes final BeforeBeanDiscovery event) {
        event.addScope(MyScope.class, true, false);//register as a normal scope (true) that is not passivating (false)
    }

    public void registerContext(@Observes final AfterBeanDiscovery event) {
        event.addContext(new CustomScopeContext());
    }
}
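As with any CDI portable extension, this class must be registered so the container picks it up: create a file named META-INF/services/javax.enterprise.inject.spi.Extension containing the fully qualified class name, e.g.:

custom.scope.extension.CustomScopeExtension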

All the logic of our scope is in the CustomScopeContext class:

public class CustomScopeContext implements Context, Serializable {

    private Logger log = Logger.getLogger(getClass().getSimpleName());

    private CustomScopeContextHolder customScopeContextHolder;

    public CustomScopeContext() {
        log.info("Init");
        this.customScopeContextHolder = CustomScopeContextHolder.getInstance();
    }

    @Override
    public <T> T get(final Contextual<T> contextual) {
        Bean<T> bean = (Bean<T>) contextual;
        if (customScopeContextHolder.getBeans().containsKey(bean.getBeanClass())) {
            return (T) customScopeContextHolder.getBean(bean.getBeanClass()).instance;
        } else {
            return null;
        }
    }

    @Override
    public <T> T get(final Contextual<T> contextual, final CreationalContext<T> creationalContext) {
        Bean<T> bean = (Bean<T>) contextual;
        if (customScopeContextHolder.getBeans().containsKey(bean.getBeanClass())) {
            return (T) customScopeContextHolder.getBean(bean.getBeanClass()).instance;
        } else {
            T t = (T) bean.create(creationalContext);
            CustomScopeInstance<T> customInstance = new CustomScopeInstance<T>();
            customInstance.bean = bean;
            customInstance.ctx = creationalContext;
            customInstance.instance = t;
            customScopeContextHolder.putBean(customInstance);
            return t;
        }
    }

    @Override
    public Class<? extends Annotation> getScope() {
        return MyScope.class;
    }

    @Override
    public boolean isActive() {
        return true;
    }

    public void destroy(@Observes KillEvent killEvent) {
        if (customScopeContextHolder.getBeans().containsKey(killEvent.getBeanType())) {
            customScopeContextHolder.destroyBean(customScopeContextHolder.getBean(killEvent.getBeanType()));
        }
    }
}

The mandatory methods you must implement are getScope(), which must return the annotation bound to this scope, the two get() implementations, which I will talk about later, and isActive(), which states when the scope should be taken into account; for example, the Seam 3 viewScope [5] is active only when the viewRoot is rendered.

The idea behind the context is that its get() methods will be called every time a bean associated with the context is injected or referenced through expression language in a page. The get() with the CreationalContext parameter will be called only the first time the bean is referenced, so it can be created.

Every time a bean is referenced, the context will look in customScopeContextHolder to verify if the bean has already been created; if so, it will return the instance, otherwise it will create one and add it to the list of beans managed by CustomScopeContextHolder, a singleton class that holds the list of CDI beans associated with our custom scope. Note that you might create your own logic to hold your context's beans. Also note that our context observes a CDI event to remove beans from the context, which is another important thing you must take care of when creating your own scope; see the destroy method.

Below is the CustomScopeContextHolder code:


public class CustomScopeContextHolder implements Serializable {

    private static CustomScopeContextHolder INSTANCE;
    private Map<Class, CustomScopeInstance> beans;//we will have only one instance of a type so the key is a class

    private CustomScopeContextHolder() {
        beans = Collections.synchronizedMap(new HashMap<Class, CustomScopeInstance>());
    }

    public synchronized static CustomScopeContextHolder getInstance() {
        if (INSTANCE == null) {
            INSTANCE = new CustomScopeContextHolder();
        }
        return INSTANCE;
    }

    public Map<Class, CustomScopeInstance> getBeans() {
        return beans;
    }

    public CustomScopeInstance getBean(Class type) {
        return getBeans().get(type);
    }

    public void putBean(CustomScopeInstance customInstance) {
        getBeans().put(customInstance.bean.getBeanClass(), customInstance);
    }

    void destroyBean(CustomScopeInstance customScopeInstance) {
        getBeans().remove(customScopeInstance.bean.getBeanClass());
        customScopeInstance.bean.destroy(customScopeInstance.instance, customScopeInstance.ctx);
    }

    /**
     * wrap necessary properties so we can destroy the bean later:
     *
     * @see
     * CustomScopeContextHolder#destroyBean(custom.scope.extension.CustomScopeContextHolder.CustomScopeInstance)
     */
    public static class CustomScopeInstance<T> {

        Bean<T> bean;
        CreationalContext<T> ctx;
        T instance;
    }
}

Holding bean references

Note that CustomScopeContext is a normal Java class, instantiated via the new operator, so it is not a good idea to store our bean references in it, because CDI will instantiate it as it needs, losing the bean instances stored in it (see Martin's comment). For example, when a bean in our context participates in a CDI event, the CDI container will create a thread of the context to attend the event; because of this we introduced the CustomScopeContextHolder to manage our bean instances. There are other approaches, such as ThreadLocals [6], the HTTP session, the JSF viewRoot [5], static variables, etc. I decided to use a singleton, something like DeltaSpike's TransactionContext [7], but far simpler.

Conclusion

We saw here a powerful mechanism provided by CDI to manage our beans' lifecycle. In 90% of the cases the built-in scopes will be sufficient, but having the ability to extend the platform to allow things such as CODI scopes [8] or JAXB objects handled by a CDI scope [9][10] is priceless.

Now, to use our scope, we just declare a bean with @MyScope and inject it into another bean. To see the custom scope in action visit http://www.youtube.com/watch?v=2JkZFIQqrVo or clone the code at https://github.com/rmpestano/cdi-custom-scope.
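As a quick illustrative sketch (CounterBean and the KillEvent constructor are assumptions based on the destroy method above):

@MyScope
public class CounterBean implements Serializable {

    private int count;

    public int increment() {
        return ++count;//state survives across invocations until the scope is 'killed'
    }
}

public class Client {

    @Inject
    CounterBean counter;//same instance on every injection while the context holds it

    @Inject
    Event<KillEvent> killEvent;

    public void reset() {
        //assuming KillEvent carries the bean type consumed by CustomScopeContext#destroy
        killEvent.fire(new KillEvent(CounterBean.class));
    }
}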


References:

  1. http://docs.jboss.org/weld/reference/latest/en-US/html/scopescontexts.html
  2. http://adventuresintechology.blogspot.com.br/2012/04/custom-cdi-scopes.html
  3. http://www.verborgh.be/articles/2010/01/06/porting-the-viewscoped-jsf-annotation-to-cdi/
  4. https://github.com/weld/core/tree/2.0/impl/src/main/java/org/jboss/weld/context/http
  5. https://github.com/seam/faces/blob/develop/impl/src/main/java/org/jboss/seam/faces/context/ViewScopedContext.java
  6. https://github.com/aaronanderson/cdi-scope-test/blob/master/src/main/java/com/github/FooCDIContextImpl.java
  7. https://github.com/apache/deltaspike/blob/master/deltaspike/modules/jpa/impl/src/main/java/org/apache/deltaspike/jpa/impl/transaction/context/TransactionContext.java
  8. https://cwiki.apache.org/confluence/display/EXTCDI/JSF+Usage#JSFUsage-Scopes
  9. https://github.com/aaronanderson/jaxb-cdi
  10. http://adventuresintechology.blogspot.com.br/2013/06/more-on-cdi-scopes.html