Smooth and powerful Swagger integration via JaxRS Analyzer

TL;DR – watch this video: https://youtu.be/aKShM1AUbIU

This post is an update to “Some words on JavaEE, REST and Swagger” and will show the following:

  • Generating Swagger spec file via JaxRS Analyzer
  • Executable API documentation with Swagger UI
  • Automate everything with Swagger Addon

Generating Swagger 2.0 spec based on your REST endpoints

We will start with the most important (and easiest) step in this post.

Let’s suppose you have the following JAX-RS REST endpoint, which updates a given car:

@Path("/cars")
@Produces("application/json;charset=utf-8")
public class CarEndpoint {

    @Inject
    CarService carService;

    /**
     * @param id The identifier of the car to be updated
     * @param entity the changes to be applied
     */
    @PUT
    @Path("/{id:[0-9][0-9]*}")
    @Consumes("application/json")
    public Response update(@PathParam("id") Integer id, Car entity) {
        if (entity == null) {
            return Response.status(Status.BAD_REQUEST).build();
        }
        if (!id.equals(entity.getId())) {
            return Response.status(Status.CONFLICT).entity(entity).build();
        }
        if (carService.crud().eq("id", id).count() == 0) {
            return Response.status(Status.NOT_FOUND).build();
        }
        try {
            carService.update(entity);
        } catch (OptimisticLockException e) {
            return Response.status(Status.CONFLICT).entity(e.getEntity()).build();
        }

        return Response.noContent().build();
    }
}

 

Now, to generate the Swagger specification from this endpoint, you can use JaxRS Analyzer. It uses bytecode analysis to infer the endpoint payload, response statuses, response types and so on. It also reads Javadoc tags to enrich endpoint and parameter descriptions.

We will use the analyzer Maven plugin to generate the Swagger spec file at build time:

 

<plugin>
    <groupId>com.sebastian-daschner</groupId>
    <artifactId>jaxrs-analyzer-maven-plugin</artifactId>
    <version>0.14</version>
    <executions>
        <execution>
            <goals>
                <goal>analyze-jaxrs</goal>
            </goals>
        </execution>
    </executions>
    <configuration>
        <backend>swagger</backend>
        <resourcesDir>cdi-crud/apidocs</resourcesDir>
    </configuration>
</plugin>

The minimal configuration for the plugin requires a backend, which can be swagger, plaintext (the default) or asciidoc.

The resourcesDir, which defaults to jaxrs-analyzer, is the directory (relative to the build directory) where the Swagger specification file will be generated. Other configuration attributes can be found here.

For our example, after the Maven build, the swagger.json file will be generated at target/cdi-crud/apidocs. Here is its content:

 

{
    "swagger":"2.0",
    "info":{
        "version":"4.0.0",
        "title":"cdi-crud"
    },
    "host":"",
    "basePath":"/cdi-crud/rest",
    "schemes":[
        "http"
    ],
    "tags":[
        {
            "name":"cars"
        }
    ],
    "paths":{
        "/cars/{id}":{
            "put":{
                "consumes":[
                    "application/json"
                ],
                "produces":[
                    "application/json;charset=utf-8"
                ],
                "parameters":[
                    {
                        "type":"integer",
                        "name":"id",
                        "in":"path",
                        "required":true,
                        "description":"The identifier of the car to be updated"
                    },
                    {
                        "name":"body",
                        "in":"body",
                        "required":true,
                        "schema":{
                            "$ref":"#/definitions/Car"
                        },
                        "description":"the changes to be applied"
                    }
                ],
                "responses":{
                    "204":{
                        "description":"No Content",
                        "headers":{
                        }
                    },
                    "400":{
                        "description":"Bad Request",
                        "headers":{
                        }
                    },
                    "404":{
                        "description":"Not Found",
                        "headers":{
                        }
                    },
                    "409":{
                        "description":"Conflict",
                        "headers":{
                        },
                        "schema":{
                            "$ref":"#/definitions/Car"
                        }
                    }
                },
                "tags":[
                    "cars"
                ]
            }
        }
    },
    "definitions":{
        "Car":{
            "properties":{
                "id":{
                    "type":"integer"
                },
                "model":{
                    "type":"string"
                },
                "name":{
                    "type":"string"
                },
                "price":{
                    "type":"number"
                },
                "version":{
                    "type":"integer"
                }
            }
        }
    }
}

As you can see, the analyzer was able to infer all response codes from our endpoint as well as the payload. It also read the Javadoc and added the parameter descriptions.

You can paste the above JSON content into the Swagger editor to see how it looks.

Executable API documentation with Swagger UI

A nice way to expose your API documentation is through Swagger UI. It is mainly a JavaScript library which reads your swagger.json spec file and turns it into nice, executable API documentation. Since you generate the Swagger spec at build time, this documentation is always up to date.

Adding Swagger UI (as documented on the official site) mainly consists of copying the library into your project. I’ve done that and made a small change so Swagger UI reads the swagger.json from within our application instead of from a remote URL.

Here is how our sample endpoint is rendered in swagger ui:

(screenshot: the cdi-crud REST API rendered in Swagger UI)

 

You can also browse the API docs live on OpenShift here. It may be unavailable on your first access because my free OpenShift account shuts the app down after a day without traffic; just wait a few minutes and try again.

Automating everything with JBoss Forge Swagger Addon

All the steps described above can be automated with the Swagger Addon, as shown in the video at the beginning of this post.

Conclusion

We saw that creating and maintaining executable API documentation for JAX-RS endpoints is quite easy, thanks to JaxRS Analyzer. You can even automate it with the Swagger Addon.

All code is available on github here.

 

A Simple Java EE Docker Example

Read the asciidoc-based version of this post here.

 

In this post we will play a bit with docker in the context of Java EE. Here is what we will do:

  • Create, build and run a docker image;
  • the image will start a WildFly server with a Java EE sample application deployed;
  • show some docker commands;
  • start multiple containers to see the same app running on different ports.

Introduction

I will not introduce Docker, as there are already many good references on the subject. To create this post I’ve read the following tutorials:

  1. Docker user guide;
  2. Working with Docker images;
  3. this great post: Docker beginners tutorial;
  4. Arun Gupta’s tech tips: #39, #57, #61 and #65.

Pre requisites

To run this tutorial you will need:

  • A Docker daemon running on your host machine
    • after installing Docker, add this line to the /etc/default/docker file: DOCKER_OPTS="-H tcp://127.0.0.1:2375 -H unix:///var/run/docker.sock"
    • after that, restart your machine and try to run the command: docker -H tcp://127.0.0.1:2375 --version
      the output should be something like: Docker version 1.4.1, build 5bc2ff8
  • A WildFly 8.2.0 installation (unzipped);
  • the jdk-8u25-linux-x64.tar.gz file;
  • car-service.war, available here;
  • the Dockerfile, available here.

Creating the Docker image

Docker images represent/describe the container itself. As I had limited internet access (3G from my cell phone), I created an image using resources from my local machine, so the image will only build in a directory containing the following files:

  • wildfly-8.2.0.Final: the application server
  • car-service.war: the app we will deploy
  • Dockerfile: the file describing this container
  • jdk-8u25-linux-x64.tar.gz: the java version we will install in the container

NOTE: It is not good practice to use fixed local resources in a Docker image, as it will only build if the files are present at build time. The better approach is to install everything from scratch and download the necessary files. Here is an example of a Dockerfile that downloads, installs and deploys an app into WildFly 10 without using local files.

Here is the Dockerfile content:

FROM ubuntu
MAINTAINER Rafael Pestano <rmpestano@gmail.com>

# setup WildFly
COPY wildfly-8.2.0.Final /opt/wildfly

# install example app on wildfy
COPY car-service.war /opt/wildfly/standalone/deployments/

# setup Java

RUN mkdir /opt/java

COPY jdk-8u25-linux-x64.tar.gz /opt/java/

# change dir to Java installation dir

WORKDIR /opt/java/

RUN tar -zxf jdk-8u25-linux-x64.tar.gz

# setup environment variables

RUN update-alternatives --install /usr/bin/javac javac /opt/java/jdk1.8.0_25/bin/javac 100

RUN update-alternatives --install /usr/bin/java java /opt/java/jdk1.8.0_25/bin/java 100

RUN update-alternatives --display java

RUN java -version

# Expose the ports we're interested in
EXPOSE 8080 9990

# Set the default command to run on boot
# This will boot WildFly in the standalone mode and bind to all interface
CMD ["/opt/wildfly/bin/standalone.sh", "-c", "standalone-full.xml", "-b", "0.0.0.0"]

The image inherits from ubuntu, an image which installs the Ubuntu OS. The Ubuntu image is installed when you follow the Docker installation tutorial.

Next we copy the server to the folder /opt/wildfly inside the container we are creating. COPY is a command available in Dockerfile DSL. All commands can be found here.

Next we copy our app war inside the server with: COPY car-service.war /opt/wildfly/standalone/deployments/.

Next, we set up Java by unzipping it to /opt/java inside the container and setting environment variables. A better approach would be apt-get, but it requires (good) internet access, which I didn't have at the time of writing. I used the RUN command to execute java -version, which will print the version during the image build (if Java is correctly installed).

Later I use EXPOSE 8080 9990 to tell docker the ports that can be exposed by the container. A container is the instantiation of a Docker image. When we run an image (docker run) we can specify which ports are accessible to the host machine.

Finally, we specify the default command with CMD ["/opt/wildfly/bin/standalone.sh", "-c", "standalone-full.xml", "-b", "0.0.0.0"]. This command will run every time the container is started.

Building the image

After describing our image we have to build it. Run the following command from the parent folder containing the Dockerfile:

docker -H tcp://127.0.0.1:2375 build -t javaee_sample java_ee/

  • the -H flag specifies the Docker daemon address (we are using TCP to communicate with the daemon);
  • build is the command itself;
  • -t specifies the tag name identifying the image (javaee_sample in this case);
  • java_ee/ is the folder containing the Dockerfile describing our image.

More docker commands can be found here. Here is the output of the command:

After that we can see the created image by listing the installed images with docker -H tcp://127.0.0.1:2375 images:

 

Starting the container

The container can be started with the command: docker -H tcp://127.0.0.1:2375 run -p 8180:8080 javaee_sample

  • -p maps the container port (8080, declared with the EXPOSE Dockerfile instruction) to a port on the host machine, 8180 in this case;
  • run is the command itself;
  • javaee_sample is the name of the image.

The output is WildFly starting, because we set it as the initial command (the CMD Dockerfile instruction):

(screenshot: WildFly starting inside the container)

Running multiple containers

We can instantiate as many containers as we want, as long as their ports don't conflict on the host machine. I will start two more containers, mapping port 8080 to 8280 and 8380 respectively:

docker -H tcp://127.0.0.1:2375 run -p 8280:8080 javaee_sample
docker -H tcp://127.0.0.1:2375 run -p 8380:8080 javaee_sample

To list the started containers we can use the command docker -H tcp://127.0.0.1:2375 ps; here is the output:

rmpestano@rmpestano-ubuntu:~/docker /images$ docker -H tcp://127.0.0.1:2375 ps
CONTAINER ID        IMAGE                  COMMAND                CREATED             STATUS              PORTS                              NAMES
7b9079806e69        javaee_sample:latest   "/opt/wildfly/bin/st   27 seconds ago      Up 27 seconds       9990/tcp, 0.0.0.0:8280->8080/tcp   suspicious_lovelace
d4975e825751        javaee_sample:latest   "/opt/wildfly/bin/st   28 seconds ago      Up 28 seconds       9990/tcp, 0.0.0.0:8380->8080/tcp   loving_hopper
96e58eb65126        javaee_sample:latest   "/opt/wildfly/bin/st   42 seconds ago      Up 42 seconds       9990/tcp, 0.0.0.0:8180->8080/tcp   clever_cori

And now we can access the three apps in the browser at the same time:

(screenshot: the three apps running side by side on ports 8180, 8280 and 8380)

You can stop a container by its ID or by name with docker -H tcp://127.0.0.1:2375 stop suspicious_lovelace.

Remember that all data will be lost when the container is stopped. Use Docker volumes for persistent data.

RESTful Java with JAX-RS 2.0 Book Review

During my vacation I read RESTful Java with JAX-RS 2.0, second edition, by Bill Burke.

This review is just some notes I made while going through each chapter. It mainly lists the topics I found important and interesting in the book, along with some comments of my own.

Positive aspects:

  • Very practical, with plenty of nice examples using the latest JAX-RS version;
  • a really easy book to read and understand;
  • interesting topics are covered.

Negative Aspects:

  • Missing a REST API documentation chapter;
  • not enough attention to testing; maybe a dedicated chapter with best practices;
  • it could use more JSON in favor of XML;
  • Jersey is not mentioned.

Overall it is a great book and fully recommended, even for those already working with REST.

Here are my notes on each chapter: a walkthrough of the most important topics, so you can get an idea of the content of the book.

Chapter 1 – Introduction to REST

Although very objective and succinct, this chapter goes directly to the heart of REST. It compares REST with CORBA, SOAP and the WS-* standards, shows how REST and HTTP are related, and touches on SOA. It refers to Roy Fielding’s PhD thesis, a must-read. Finally it describes the five RESTful architectural principles: Addressability, Constrained Interface, Representation-Oriented, Communicate Statelessly and HATEOAS.
An excellent overview.

Chapter 2 – Designing RESTful Services

This chapter presents a RESTful order entry system interface (a.k.a. endpoint) for a hypothetical e-commerce application. It shows the concepts described in the first chapter and explains them in an HTTP-oriented way, with no Java code yet. There is an interesting discussion about “state vs operation” and best practices for modeling REST resources. The data format chosen for the model (XML) was not the best option, I think; in my opinion JSON would be a better approach, both for the examples and as a best practice (I don’t buy that XML is for Java and JSON is for web-related technologies such as Ajax). The chapter accomplishes its objective well, which is to illustrate RESTful concepts in practice.

Chapter 3 – Your First JAX-RS Service

It starts with a bit about servlets, then jumps to JAX-RS, summarizing well the framework for writing RESTful services in Java.
Next, the order system designed in Chapter 2 is implemented in Java using JAX-RS. For those already working with REST in Java it does not add much to the table, but it is a necessary step so the application can evolve during the book. Maybe it's personal taste, but I have to say again that “application/xml” was not a good choice. In my opinion the examples would be simpler with JSON. For example:


return new StreamingOutput() {
    public void write(OutputStream outputStream)
            throws IOException, WebApplicationException {
        outputCustomer(outputStream, customer);
    }
};

Maybe introducing JAXB in this chapter would have been an option to avoid streams and inner classes, and could simplify the client example.

Chapter 4 – HTTP Method and URI Matching

Details the @Path annotation and its matching rules, a bit of subresources and dispatching, matrix vs query params, and finally some gotchas in request matching.

Chapter 5 – JAX-RS Injection

There is an interesting hint on field injection versus singleton resources. @PathParam injection is revisited and more examples are presented. PathSegment and UriInfo are introduced. MatrixParam, QueryParam, FormParam, HeaderParam, CookieParam and BeanParam are detailed with nice examples. BeanParam is new in JAX-RS 2.0 and a very useful feature. Next it talks about automatic type conversion: how JAX-RS can map request strings to primitives, enums, lists and objects. Later it goes into detail about ParamConverter, which lets JAX-RS convert HTTP request strings into Java objects. Finally the chapter ends explaining the @Encoded annotation.
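As a concrete flavor of the conversion rules the chapter covers, JAX-RS can bind a @PathParam or @QueryParam string to any custom type exposing a static valueOf(String) or fromString(String) factory. A minimal sketch (the Isbn type is my own hypothetical example, not from the book):

```java
// Hypothetical value type: JAX-RS would call the static fromString(String)
// factory to convert a raw @QueryParam/@PathParam string into an Isbn instance.
public class Isbn {
    private final String digits;

    private Isbn(String digits) {
        this.digits = digits;
    }

    // JAX-RS looks for exactly this signature (or valueOf) on the parameter type
    public static Isbn fromString(String raw) {
        return new Isbn(raw.replace("-", ""));
    }

    public String getDigits() {
        return digits;
    }
}
```

A resource method could then declare Isbn directly as a parameter type instead of parsing the raw string itself.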

Chapter 6 – JAX-RS Content Handlers

The chapter starts with content marshalling and built-in providers (maybe this is the motivation for the streams in previous chapters). Some byte- and File-related examples are presented. Next there is an example of posting a form with MultivaluedMap<String, String>. Then the chapter focuses on JAXB: a small intro, some examples, and an interesting section about JAXB and JSON and how they integrate. Later the chapter details JSON objects. Finally it talks about custom marshalling and exemplifies MessageBodyReader and MessageBodyWriter.

Chapter 7 – Server Responses and Exception Handling

It starts by talking about successful and error responses. The next topic is how to create responses with ResponseBuilder, then a bit on cookies and status codes. Next, GenericEntity is presented to deal with generics. Finally, exception handling is detailed through WebApplicationException and exception mappers. It ends explaining error codes and the built-in JAX-RS exceptions.

Chapter 8 – JAX-RS Client API

A very good introduction to the Client API that comes with JAX-RS 2.0. It really does its job in a very practical way, with nice examples.

Chapter 9 – HTTP Content Negotiation

A nice overview of how JAX-RS supports the conneg protocol to ease integration with heterogeneous clients and the evolution of the system. It starts explaining the negotiation protocol with media type examples, language negotiation and encoding. Next, examples with JAX-RS are presented. The chapter ends with Variants (multiple types of response for the same URI), URI negotiation, new media types (for versioning) and flexible schemas using content negotiation.
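To make the q-factor ranking concrete, here is a toy sketch (my own simplification, not code from the book) that picks the supported media type with the highest q value in an Accept header; real JAX-RS conneg also weighs wildcards and type specificity:

```java
import java.util.List;

public class Conneg {
    // Pick the supported media type with the highest q factor in the Accept header.
    // Simplified: ignores wildcards (*/*) and specificity tie-breaking.
    public static String bestMatch(String acceptHeader, List<String> supported) {
        String best = null;
        double bestQ = -1.0;
        for (String part : acceptHeader.split(",")) {
            String[] tokens = part.trim().split(";");
            String type = tokens[0].trim();
            double q = 1.0; // an absent q parameter means q=1.0
            for (int i = 1; i < tokens.length; i++) {
                String param = tokens[i].trim();
                if (param.startsWith("q=")) {
                    q = Double.parseDouble(param.substring(2));
                }
            }
            if (supported.contains(type) && q > bestQ) {
                bestQ = q;
                best = type;
            }
        }
        return best;
    }
}
```

For example, with Accept: application/xml;q=0.5,application/json, a server supporting both types would serve JSON.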

Chapter 10 – HATEOAS

A little introduction to the concept and how it can be applied to web services. Atom links are presented. Next, the advantages of HATEOAS are explained. Later, JAX-RS and HATEOAS plus UriBuilder and UriInfo are presented with examples. Finally, building links and Link headers is covered.

Chapter 11 – Scaling JAX-RS Applications

The chapter begins by talking about the web and the mechanisms that help it scale. It talks about caching (browser, proxy and CDN). After introducing caching, it explores HTTP caching with JAX-RS examples. Cache revalidation is visited, again with nice examples. The next topic is concurrency with conditional PUT and POST, followed by JAX-RS examples.
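The revalidation examples revolve around ETags. As a hedged illustration (my own helper, not the book's code), an ETag for a versioned entity can be as simple as a quoted string derived from its id and optimistic-lock version, i.e. the value you would hand to JAX-RS's ResponseBuilder.tag(...) and check with Request.evaluatePreconditions(...):

```java
public class Etags {
    // Derive a stable ETag value from an entity's id and optimistic-lock version:
    // the same id+version always yields the same tag, and any update changes it,
    // so a client's If-None-Match revalidation fails exactly when the data changed.
    public static String etagFor(long id, long version) {
        return "\"" + Long.toHexString(id) + "-" + Long.toHexString(version) + "\"";
    }
}
```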

Chapter 12 – Filters and Interceptors

Server-side filters are presented first, with request and response filter examples like cache control and authorization. Next come reader and writer interceptors, with a GZIP example. Client filters are presented using the JAX-RS client API. A cache-control filter example is explained; it basically caches some requests and manipulates the “If-None-Match” and “If-Modified-Since” headers. Deploying (@Provider) and ordering (@Priority) of filters and interceptors are visited. Per-method filters and interceptors are exemplified with DynamicFeature and NameBinding. Finally there is a note on exceptions in filters and interceptors.

Chapter 13 – Asynchronous JAX-RS

It starts with the Client API and AsyncInvoker using futures. Next, callbacks are presented with nice examples. Server-side asynchronous response is introduced. The Internet HTTP request/response thread model and its challenges are explained. Next, the AsyncResponse API is presented with JAX-RS examples.
The chapter makes clear that asynchronous responses are for specific applications and that most of the time the “normal” request/response paradigm is sufficient. Later, exception handling and responding with resume and cancel are explained, as are timeouts and response callbacks. Use cases like server push and publish/subscribe (chat) are presented and exemplified. There is a note on WebSockets and Server-Sent Events versus pure HTTP server-push apps. Finally, scheduling using executors is presented.

Chapter 14 – Deployment and Integration

It starts with registering REST resources by extending the Application class, initializing singletons and classes. The difference between servlet-container and Java EE JAX-RS deployment is explained. web.xml configuration is presented. The next topics are EJB 3.1 and Spring integration. A pretty simple but useful chapter.
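For reference, the no-web.xml deployment the chapter describes boils down to a single class (package and class name assumed here); in a Java EE container this activates JAX-RS under the /rest prefix:

```java
import javax.ws.rs.ApplicationPath;
import javax.ws.rs.core.Application;

// Empty subclass: the container then scans the archive for @Path/@Provider classes.
// Overriding getClasses()/getSingletons() is the alternative when you want to list
// resources explicitly instead of relying on scanning.
@ApplicationPath("/rest")
public class RestApplication extends Application {
}
```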

Chapter 15 – Securing JAX-RS

A small introduction to security on the web and in Java EE: authentication, authorization and encryption. It dives into the servlet authentication and authorization mechanisms, followed by encryption. Next, authorization annotations like @RolesAllowed and @PermitAll are presented. The next topic is programmatic security with SecurityContext. A JAX-RS request filter for authorization is exemplified, then client-side security using the JAX-RS client API. OAuth is the next topic, with the CNN case presented as an example. Signing and encrypting message bodies is the next security topic; it is mainly concerned with security across intermediary REST services (a.k.a. integrations), with Twitter used as an example. Later, digital signatures are introduced; DKIM and JOSE JSON Signature (JWS) are exemplified. The last topic is encryption of representations (the message body), with JOSE JSON Encryption (JWE) as the example. The chapter is more conceptual; the majority of the security examples are in Chapter 29.
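Underneath the JOSE acronyms, signing a representation is ordinary MAC or signature math over the serialized body. A toy sketch using plain JDK crypto (my own illustration, not a JOSE implementation and not code from the book):

```java
import javax.crypto.Mac;
import javax.crypto.spec.SecretKeySpec;
import java.nio.charset.StandardCharsets;
import java.util.Base64;

public class BodySigner {
    // HMAC-SHA256 over the message body, base64url-encoded the way JWS encodes signatures.
    // A receiver holding the same key recomputes the MAC and compares to verify integrity.
    public static String sign(String body, byte[] key) {
        try {
            Mac mac = Mac.getInstance("HmacSHA256");
            mac.init(new SecretKeySpec(key, "HmacSHA256"));
            byte[] sig = mac.doFinal(body.getBytes(StandardCharsets.UTF_8));
            return Base64.getUrlEncoder().withoutPadding().encodeToString(sig);
        } catch (Exception e) {
            throw new IllegalStateException(e);
        }
    }
}
```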

In Chapter 29 there is an interesting OTPAuthenticated request filter (one-time password). @AllowedPerDay is introduced into the security chain, a nice example of limiting the number of accesses to a resource per user. It is also a ContainerRequestFilter, but with lower priority, meaning it runs after the OTPAuthenticated filter.

Chapter 16 – Alternative Java Clients

Besides the JAX-RS client API, other clients are presented. It starts with “pure Java” HttpURLConnection examples. There is a small note on caching and authentication using java.net classes. Standard Java certificate authentication using the keytool command is introduced. The next topic is HttpClient examples, including client authentication with HttpClient. The RESTEasy client proxy is introduced.

Chapter 17 – Workbook Introduction

This chapter is a step-by-step tutorial on how to set up your environment with RESTEasy 3.x (an implementation of the JAX-RS 2.0 spec). It uses JDK 6, Maven 3, Jetty 8.1 as the servlet container and WildFly 8.0 for examples that require Java EE 7. It creates the project and illustrates its directory structure.

Chapters 18 through 29 are reserved for more elaborate and complete examples for each chapter. This is the great thing about this book: it is example oriented and has dedicated chapters for examples. I will not comment on the examples, but I recommend trying them as you read; it is great for learning.

Some Words on JavaEE, REST and Swagger

Introduction

In this post I will cover the following topics:

  1. Create a simple JavaEE REST application using JBoss Forge
  2. Add some Rest authentication and authorization
  3. Create some Arquillian tests for the created Rest endpoint
  4. Use Swagger to generate Rest API documentation

I will enhance the CDI Crud application presented in previous posts: CDI Generic Dao, CDI Crud Multi “Tenancy” and Arquillian + DBUnit + Cucumber. All source code is available here: https://github.com/rmpestano/cdi-crud

Creating the REST Endpoint

I used JBoss Forge to execute this task. As Forge can be used to evolve an application, I just executed the rest setup command:

(screenshot: the Forge rest setup command)

I chose JAX-RS 1.1 because I want to run the app on both JBoss AS and WildFly:

(screenshot: choosing the JAX-RS version in Forge)

As we already have our JPA entities, creating the endpoint is done with generate endpoints from entities:

(screenshot: Forge generating endpoints from entities)

And the CarEndpoint is created and ready to CRUD cars via REST:

@Stateless
@Path("/cars")
public class CarEndpoint {
    @Inject
    CarService carService;

    @POST
    @Consumes("application/json")
    public Response create(Car entity) {
        carService.insert(entity);
        return Response.created(UriBuilder.fromResource(CarEndpoint.class).path(String.valueOf(entity.getId())).build()).build();
    }

    @DELETE
    @Path("/{id:[0-9][0-9]*}")
    public Response deleteById(@PathParam("id") Integer id) {
        Car entity = carService.findById(id);
        if (entity == null) {
            return Response.status(Status.NOT_FOUND).build();
        }
        carService.remove(entity);
        return Response.noContent().build();
    }

    @GET
    @Path("/{id:[0-9][0-9]*}")
    @Produces("application/json")
    public Response findById(@PathParam("id") Integer id) {
        Car entity;
        try {
            entity = carService.findById(id);
        } catch (NoResultException nre) {
            entity = null;
        }

        if (entity == null) {
            return Response.status(Status.NOT_FOUND).build();
        }
        return Response.ok(entity).build();
    }

    @GET
    @Produces("application/json")
    @Path("list")
    public List<Car> listAll(@QueryParam("start") @DefaultValue("0") Integer startPosition, @QueryParam("max") @DefaultValue("10") Integer maxResult) {
        Filter<Car> filter = new Filter<>();
        filter.setFirst(startPosition).setPageSize(maxResult);
        final List<Car> results = carService.paginate(filter);
        return results;
    }

    @PUT
    @Path("/{id:[0-9][0-9]*}")
    @Consumes("application/json")
    public Response update(@PathParam("id") Integer id,  Car entity) {
        if (entity == null) {
            return Response.status(Status.BAD_REQUEST).build();
        }
        if (!id.equals(entity.getId())) {
            return Response.status(Status.CONFLICT).entity(entity).build();
        }
        if (carService.crud().eq("id",id).count() == 0) {
            return Response.status(Status.NOT_FOUND).build();
        }
        try {
            carService.update(entity);
        } catch (OptimisticLockException e) {
            return Response.status(Response.Status.CONFLICT).entity(e.getEntity()).build();
        }

        return Response.noContent().build();
    }
}

I only replaced the EntityManager used by Forge with CarService, which was created in previous posts; the rest of the code was generated by Forge. This step was really straightforward, thanks to Forge.

REST Authentication

To authenticate the client before the REST endpoint is called, I've created a CDI interceptor:

@RestSecured
@Interceptor
public class RestSecuredImpl implements Serializable{

    @Inject
    CustomAuthorizer authorizer;

    @Inject
    Instance<HttpServletRequest> request;

    @AroundInvoke
    public Object invoke(InvocationContext context) throws Exception {
        String currentUser = request.get().getHeader("user");
         if( currentUser != null){
             authorizer.login(currentUser);
         } else{
             throw new CustomException("Access forbidden");
         }
        return context.proceed();
    }

}

 

For this app we get the current user from the HTTP header named user. If the interceptor doesn't find the header, it throws a CustomException, which is explained in the next section. Note that only endpoints annotated with @RestSecured will be intercepted. Authorization is done by the CustomAuthorizer.
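For completeness, the @RestSecured annotation referenced above is a CDI interceptor binding. A minimal sketch of its shape (a hypothetical reconstruction; the real one in the app must also carry CDI's @javax.interceptor.InterceptorBinding meta-annotation, omitted here to keep the snippet dependency-free):

```java
import java.lang.annotation.ElementType;
import java.lang.annotation.Retention;
import java.lang.annotation.RetentionPolicy;
import java.lang.annotation.Target;

// Sketch of the binding annotation: in the actual application it would additionally be
// meta-annotated with @javax.interceptor.InterceptorBinding so CDI ties it to RestSecuredImpl.
@Retention(RetentionPolicy.RUNTIME)
@Target({ElementType.TYPE, ElementType.METHOD})
public @interface RestSecured {
}
```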

Verifying Authorization

Authorization is performed by CustomAuthorizer, which is based on the DeltaSpike security module. A very simple authorizer was created; it is based on the username and stores the logged-in user in a HashMap:

@ApplicationScoped
public class CustomAuthorizer implements Serializable {

    Map<String, String> currentUser = new HashMap<>();

    @Secures
    @Admin
    public boolean doAdminCheck(InvocationContext invocationContext, BeanManager manager) throws Exception {
        boolean allowed = currentUser.containsKey("user") && currentUser.get("user").equals("admin");
        if(!allowed){
            throw new CustomException("Access denied");
        }
        return allowed;
    }

   
    public void login(String username) {
        currentUser.put("user", username);
    }
}

When authorization fails (the check method does not pass), we throw another CustomException.

I created a REST Provider to map CustomException into Response types:

@Provider
public class CustomExceptionMapper implements ExceptionMapper<CustomException> {

    @Override
    public Response toResponse(CustomException e) {
        Map map = new HashMap();
        map.put("message", e.getMessage());

        if (e.getMessage().equals("Access forbidden")) {//TODO create specific exception and its mapper
            return Response.status(Response.Status.FORBIDDEN).type(MediaType.APPLICATION_JSON).entity(map).build();
        }
        if (e.getMessage().equals("Access denied")) {//TODO create specific exception and its mapper
            return Response.status(Response.Status.UNAUTHORIZED).type(MediaType.APPLICATION_JSON).entity(map).build();
        }
        return Response.status(Response.Status.BAD_REQUEST).type(MediaType.APPLICATION_JSON).entity(map).build();
    }
}

I have only added authentication to the delete endpoint:

    @DELETE
    @Path("/{id:[0-9][0-9]*}")
    @RestSecured
    public Response deleteById(@PathParam("id") Integer id) {
        Car entity = carService.findById(id);
        if (entity == null) {
            return Response.status(Status.NOT_FOUND).build();
        }
        carService.remove(entity);
        return Response.noContent().build();
    }

I basically added the @RestSecured annotation. This means that if a client fires a request to this endpoint without providing a user in the HTTP header, the method will not be called and the response will be 403. If the client provides a user that is not allowed, the HTTP response will be 401.

For authorization we use @Admin in the service method:

@Stateless
public class CarService extends CrudService<Car> {

    @Override
    @Admin
    public void remove(Car car) {
        super.remove(car);
    }
}

@Admin activates our CustomAuthorizer, which verifies whether the current user is authorized to execute the annotated method.

Testing the REST Endpoint

To test CarEndpoint I used Arquillian, REST-assured and DBUnit. Before the tests, the database is populated with 4 cars described in the car.yml dataset:

car:
  - id: 1
    model: "Ferrari"
    price: 2450.8
  - id: 2
    model: "Mustang"
    price: 12999.0
  - id: 3
    model: "Porche"
    price: 1390.3
  - id: 4
    model: "Porche274"
    price: 18990.23

I’ve implemented tests for all CRUD and HTTP operations. I will show only the list and DELETE tests; the other tests can be found in CrudRest.java. Here is how the "list all cars" test looks:


    @Test
    public void shouldListCars() {
        given().
                queryParam("start",0).queryParam("max", 10).
        when().
                get(basePath + "rest/cars/list").
        then().
                statusCode(Response.Status.OK.getStatusCode()).
                body("", hasSize(4)).//dataset has 4 cars
                body("model", hasItem("Ferrari")).
                body("price", hasItem(2450.8f)).
                body(containsString("Porche274"));
    }

For the DELETE method I have one test that fails authentication, another that fails authorization, and one that successfully deletes a car:


    @Test
    public void shouldFailToDeleteCarWithoutAuthentication() {
        given().
                contentType(ContentType.JSON).
                when().
                delete(basePath + "rest/cars/1").  //dataset has car with id =1
                then().
                statusCode(Response.Status.FORBIDDEN.getStatusCode());
    }

    @Test
    public void shouldFailToDeleteCarWithoutAuthorization() {
        given().
                contentType(ContentType.JSON).
                header("user", "guest"). //only admin can delete
                when().
                delete(basePath + "rest/cars/1").  //dataset has car with id =1
                then().
                statusCode(Response.Status.UNAUTHORIZED.getStatusCode());
    }

    @Test
    public void shouldDeleteCar() {
        given().
                contentType(ContentType.JSON).
                header("user","admin").
        when().
                delete(basePath + "rest/cars/1").  //dataset has car with id =1
        then().
                statusCode(Response.Status.NO_CONTENT.getStatusCode());

        //ferrari should not be in db anymore
        given().
        when().
                get(basePath + "rest/cars/list").
         then().
                statusCode(Response.Status.OK.getStatusCode()).
                body("", hasSize(3)).
                body("model", not(hasItem("Ferrari")));
    }

Generating the REST API Documentation

To generate the API documentation I will use Swagger, which is a specification for REST APIs. Swagger is composed of various components, the main ones being:

  • swagger-spec: describes the format of REST APIs
  • swagger-codegen: generates REST clients based on a swagger spec
  • swagger-ui: generates web pages describing the API based on swagger spec
  • swagger-editor: lets you design swagger specifications from scratch using a simple YAML structure

Instead of using “pure” swagger, which requires its own annotations, I will use swagger-jaxrs-doclet, which is based on javadoc and leverages JAX-RS annotations.

The first thing to do is to copy the swagger-ui distribution (the swagger-ui I’ve used can be found here; I’ve made minor changes to index.html) to your application in webapp/apidocs, as in the image below:

[image: swagger-ui distribution copied to webapp/apidocs]

Now we just have to generate the swagger spec files based on our REST endpoints. This is done by the doclet maven plugin:


     <plugin>
                <groupId>org.apache.maven.plugins</groupId>
                <artifactId>maven-javadoc-plugin</artifactId>
                <version>2.9.1</version>
                <executions>
                    <execution>
                        <id>generate-service-docs</id>
                        <phase>generate-resources</phase>
                        <configuration>
                            <doclet>com.carma.swagger.doclet.ServiceDoclet</doclet>
                            <docletArtifact>
                                <groupId>com.carma</groupId>
                                <artifactId>swagger-doclet</artifactId>
                                <version>1.0.2</version>
                            </docletArtifact>
                            <reportOutputDirectory>src/main/webapp</reportOutputDirectory>
                            <useStandardDocletOptions>false</useStandardDocletOptions>
                            <additionalparam>-apiVersion 1 -docBasePath /cdi-crud/apidocs
                                -apiBasePath /cdi-crud/rest
                                -swaggerUiPath ${project.build.directory}/
                            </additionalparam>
                        </configuration>
                        <goals>
                            <goal>javadoc</goal>
                        </goals>
                    </execution>
                </executions>
            </plugin>

The plugin has 3 main configurations:

  • docBasePath: the path where the swagger spec files (.json) will be generated
  • apiBasePath: the path used in the calls made from the API documentation (swagger-ui generates executable documentation)
  • swaggerUiPath: the plugin can generate the swagger-ui distribution. As I am copying the dist manually I do not use this option and point it to the target folder (in fact I could not get it working well, so I need to play more with this option).

With this configuration the swagger spec files are generated on each build, which keeps the documentation and the API synchronized. See the .json spec files (in red):

[image: generated swagger .json spec files in the project tree]

Now you can access your REST API documentation at the /apidocs URL:

[image: REST API documentation rendered by swagger-ui]

You can also fire REST requests through the API docs:

[image: firing a REST request from the API docs]

We can also enhance the documentation via javadoc; for example, we can add response types to describe the response codes. See the modified delete method:


   /**
     * @description deletes a car based on its ID
     * @param user name of the user to log in
     * @param id car ID
     * @status 401 only authorized users can access this resource
     * @status 403 only authenticated users can access this resource
     * @status 404 car not found
     * @status 204 Car deleted successfully
     */
    @DELETE
    @Path("/{id:[0-9][0-9]*}")
    @RestSecured
    public Response deleteById(@HeaderParam("user") String user, @PathParam("id") Integer id) {
        Car entity = carService.findById(id);
        if (entity == null) {
            return Response.status(Status.NOT_FOUND).build();
        }
        carService.remove(entity);
        return Response.noContent().build();
    }

And here is the generated doc (note that I’ve added @HeaderParam so we can authenticate through the documentation page):

[image: generated documentation for the delete endpoint]

All supported doclet annotations can be found here.

This app and its REST documentation are available at openshift: http://cdicrud-rpestano.rhcloud.com/cdi-crud/apidocs. There is also a simple car crud app. To see a “bit more elaborated” documentation generated by swagger and doclet, see the Carma Rest API.

Arquillian + Cucumber + DBUnit

Arquillian and Cucumber integration is provided by the Cukespace project, a very active project with nice examples.

One issue I’ve found with cukespace (and also with arquillian-jbehave) is that I can’t use the arquillian-persistence extension to initialize my scenario datasets. The main cause is that the cucumber test life cycle differs from junit/testng, so arquillian isn’t aware when a test or cucumber event is occurring; e.g. the arquillian org.jboss.arquillian.test.spi.event.suite.* events aren’t triggered during cucumber test execution.

There is an issue at cukespace project.

As arquillian persistence uses DBUnit behind the scenes, I worked around that limitation (while the issue isn’t solved) using the pure DBUnit API. Here is some sample code; it can be found here.

CrudBdd.java

@RunWith(ArquillianCucumber.class)
@Features("features/search-cars.feature")
@Tags("@whitebox")
public class CrudBdd {

    @Inject
    CarService carService;

    Car carFound;

    int numCarsFound;

    @Deployment(name = "cdi-crud.war")
    public static Archive<?> createDeployment() {
        WebArchive war = Deployments.getBaseDeployment();
        war.addAsResource("datasets/car.yml", "car.yml").//needed by DBUnitUtils
                addClass(DBUnitUtils.class);
        System.out.println(war.toString(true));
        return war;
    }

    @Before
    public void initDataset() {
        DBUnitUtils.createDataset("car.yml");
    }

    @After
    public void clear() {
        DBUnitUtils.deleteDataset("car.yml");
    }

    @Given("^search car with model \"([^\"]*)\"$")
    public void searchCarWithModel(String model) {
        Car carExample = new Car();
        carExample.setModel(model);
        carFound = carService.findByExample(carExample);
        assertNotNull(carFound);
    }

    @When("^update model to \"([^\"]*)\"$")
    public void updateModel(String model) {
        carFound.setModel(model);
        carService.update(carFound);
    }

    @Then("^searching car by model \"([^\"]*)\" must return (\\d+) of records$")
    public void searchingCarByModel(final String model, final int result) {
        Car carExample = new Car();
        carExample.setModel(model);
        assertEquals(result, carService.crud().example(carExample).count());
    }

    @When("^search car with price less than (.+)$")
    public void searchCarWithPrice(final double price) {
        numCarsFound = carService.crud().initCriteria().le("price", price).count();
    }

    @Then("^must return (\\d+) cars")
    public void mustReturnCars(final int result) {
        assertEquals(result, numCarsFound);
    }

}
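Under the hood, Cucumber binds each feature line to a step method by matching the regex in the step annotation; capture groups become the method arguments. A minimal standalone illustration (not project code) of how the `@Then("^must return (\\d+) cars")` pattern above extracts its numeric argument:

```java
import java.util.regex.Matcher;
import java.util.regex.Pattern;

public class StepBinding {
    public static void main(String[] args) {
        // same pattern used by the @Then annotation in CrudBdd above
        Pattern step = Pattern.compile("^must return (\\d+) cars");
        Matcher m = step.matcher("must return 3 cars");
        if (m.find()) {
            // the captured group becomes the 'result' parameter of mustReturnCars
            int result = Integer.parseInt(m.group(1));
            System.out.println(result); // 3
        }
    }
}
```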

DBUnitUtils is a simple class that deals with the DBUnit API. The idea is to initialize the database in the Before and After events, which are triggered on each scenario execution; they are also triggered for each ‘example‘, which is very important so we have a clean database on each run.

Here is my feature file:

Feature: Search cars

@whitebox
Scenario Outline: simple search and update
Given search car with model "Ferrari"
When update model to "Audi"
Then searching car by model "<model>" must return <number> of records
Examples:
| model | number |
| Audi  | 1      |
| outro | 0      |

@whitebox
Scenario Outline: search car by price
When search car with price less than <price>
Then must return <number> cars
Examples:
| price | number |
| 1390.2 | 0 |
| 1390.3 | 1 |
| 10000.0 | 2 |
| 13000.0 | 3 |

@blackbox
Scenario Outline: search car by id
When search car by id <id>
Then must find car with model "<model>" and price <price>
Examples:
| id | model | price |
| 1 | Ferrari | 2450.8 |
| 2 | Mustang | 12999.0 |
| 3 | Porche | 1390.3 |
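As a sanity check, the expected counts in the "search car by price" outline follow directly from the prices in car.yml; a standalone sketch (not part of the project) verifying them:

```java
import java.util.List;

public class PriceScenarioCheck {
    public static void main(String[] args) {
        // prices from the car.yml dataset: Ferrari, Mustang, Porche, Porche274
        List<Double> prices = List.of(2450.8, 12999.0, 1390.3, 18990.23);
        double[] limits = {1390.2, 1390.3, 10000.0, 13000.0};
        for (double limit : limits) {
            // mirrors the le("price", limit) criteria used in the step definition
            long count = prices.stream().filter(p -> p <= limit).count();
            System.out.println(limit + " -> " + count);
        }
    }
}
```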

DBUnitUtils.java

public class DBUnitUtils {

    private static DataSource ds;
    private static DatabaseConnection databaseConnection;

    public static void createDataset(String dataset) {

        if (!dataset.startsWith("/")) {
            dataset = "/" + dataset;
        }
        try {
            initConn();
            DatabaseOperation.CLEAN_INSERT.execute(databaseConnection,
new YamlDataSet(DBUnitUtils.class.getResourceAsStream(dataset)));
        } catch (Exception e) {
            e.printStackTrace();
            throw new RuntimeException("could not initialize dataset:" + dataset +
" \nmessage: " + e.getMessage());
        } finally {
            closeConn();
        }
    }

    public static void deleteDataset(String dataset) {
        if (!dataset.startsWith("/")) {
            dataset = "/" + dataset;
        }
        try {
            initConn();
            DatabaseOperation.DELETE_ALL.execute(databaseConnection,
new YamlDataSet(DBUnitUtils.class.getResourceAsStream(dataset)));
        } catch (Exception e) {
            e.printStackTrace();
            throw new RuntimeException("could not delete dataset:" + dataset +
" \nmessage: " + e.getMessage());
        } finally {
            closeConn();
        }
    }

    private static void closeConn() {
        try {
            if (databaseConnection != null && !databaseConnection.getConnection().isClosed()) {
                databaseConnection.getConnection().close();
            }
        } catch (SQLException e) {
            e.printStackTrace();
            throw new RuntimeException("could not close conection \nmessage: " + e.getMessage());
        }

    }

    private static void initConn() throws SQLException, NamingException, DatabaseUnitException {
        if (ds == null) {
            ds = (DataSource) new InitialContext()
.lookup("java:jboss/datasources/ExampleDS");
        }
        databaseConnection = new DatabaseConnection(ds.getConnection());
    }
}

and car.yml

car:
  - id: 1
    model: "Ferrari"
    price: 2450.8
  - id: 2
    model: "Mustang"
    price: 12999.0
  - id: 3
    model: "Porche"
    price: 1390.3
  - id: 4
    model: "Porche274"
    price: 18990.23


DBUnit REST endpoint

Another limitation of the persistence extension is its integration with functional tests (blackbox/RunAsClient/testable=false), see this issue. Basically arquillian persistence needs server side resources, like the datasource, to work, but blackbox tests run outside the container, in a separate JVM.

To overcome that limitation I’ve created a DBUnit REST endpoint and deployed it within my test, so I can make REST calls to the server and create the dataset there, where I have all the needed resources. Here is DBUnitUtils with the REST calls:

public class DBUnitUtils {

    private static DataSource ds;
    private static DatabaseConnection databaseConnection;

  public static void createRemoteDataset(URL context, String dataset) {
        HttpURLConnection con = null;
        try {
            URL obj = new URL(context + "rest/dbunit/create/" + dataset);
            con = (HttpURLConnection) obj.openConnection();
            con.setRequestMethod("GET");
            con.setDoOutput(true);
            int responseCode = con.getResponseCode();
            if (responseCode != 200) {
                throw new RuntimeException("Could not create remote dataset\nstatus:" + responseCode + "\nerror:" + con.getResponseMessage());
            }
        } catch (Exception e) {
            //rethrow so a failed dataset creation fails the test instead of being swallowed
            throw new RuntimeException(e);
        } finally {
            if (con != null) {
                con.disconnect();
            }
        }
    }

    public static void deleteRemoteDataset(URL context, String dataset) {
        HttpURLConnection con = null;
        try {
            URL obj = new URL(context + "rest/dbunit/delete/" + dataset);
            con = (HttpURLConnection) obj.openConnection();
            con.setRequestMethod("GET");
            con.setDoOutput(true);

            int responseCode = con.getResponseCode();
            if (responseCode != 200) {
                throw new RuntimeException("Could not delete remote dataset\nstatus:" + responseCode + "\nerror:" + con.getResponseMessage());
            }
        } catch (Exception e) {
            //rethrow so a failed dataset deletion fails the test instead of being swallowed
            throw new RuntimeException(e);
        } finally {
            if (con != null) {
                con.disconnect();
            }
        }
    }

}

and DBUnitRest.java

@Path("/dbunit")
public class DBUnitRest {

    @GET
    @Path("create/{dataset}")
    public Response createDataset(@PathParam("dataset") String dataset) {
        try {
            DBUnitUtils.createDataset(dataset);//i feel like going in circles
        } catch (Exception e) {
            return Response.status(Status.BAD_REQUEST).entity("Could not create dataset.\nmessage:" + e.getMessage() + "\ncause:" + e.getCause()).build();
        }
        return Response.ok("dataset created successfully").build();
    }

    @GET
    @Path("delete/{dataset}")
    public Response deleteDataset(@PathParam("dataset") String dataset) {
        try {
            DBUnitUtils.deleteDataset(dataset);//i feel like going in circles
        } catch (Exception e) {
            return Response.status(Status.BAD_REQUEST).entity("Could not delete dataset.\nmessage:" + e.getMessage() + "\ncause:" + e.getCause()).build();
        }
        return Response.ok("dataset deleted successfully").build();
    }

}


And here is my functional test, which uses Drone and Graphene:


@RunWith(ArquillianCucumber.class)
@Features("features/search-cars.feature")
@Tags("@blackbox")
public class CrudAt {

  @Deployment(name = "cdi-crud.war", testable=false)
  public static Archive<?> createDeployment() {
    WebArchive war = Deployments.getBaseDeployment();
        war.addAsResource("datasets/car.yml","car.yml").//needed by DBUnitUtils
                addPackage(DBUnitUtils.class.getPackage()).addClass(CrudBean.class).addClass(YamlDataSet.class).
                addClass(YamlDataSetProducer.class).
                addClass(Row.class).addClass(Table.class).addClass(DBUnitRest.class);

        war.merge(ShrinkWrap.create(GenericArchive.class).as(ExplodedImporter.class).importDirectory("src/main/webapp").as(GenericArchive.class), "/", Filters.include(".*\\.(xhtml|html|css|js|png|gif)$"));
        MavenResolverSystem resolver = Maven.resolver();
        war.addAsLibraries(resolver.loadPomFromFile("pom.xml").resolve("org.dbunit:dbunit:2.5.0").withoutTransitivity().asSingleFile());
        war.addAsLibraries(resolver.loadPomFromFile("pom.xml").resolve("org.yaml:snakeyaml:1.10").withoutTransitivity().asSingleFile());
    System.out.println(war.toString(true));
    return war;
  }

  @ArquillianResource
  URL url;

  @Drone
  WebDriver webDriver;

  @Page
  IndexPage index;

  @Before
  public void initDataset() {
      DBUnitUtils.createRemoteDataset(url,"car.yml");
  }

  @After
  public void clear(){
      DBUnitUtils.deleteRemoteDataset(url,"car.yml");
   }

  @When("^search car by id (\\d+)$")
  public void searchCarById(int id){
      Graphene.goTo(IndexPage.class);
      index.findById(""+id);
  }

  @Then("^must find car with model \"([^\"]*)\" and price (.+)$")
  public void returnCarsWithModel(String model, final double price){
    assertEquals(model,index.getInputModel().getAttribute("value"));
    assertEquals(price,Double.parseDouble(index.getInputPrice().getAttribute("value")),0);
  }

}

Here are the cucumber reports:

[image: whitebox cucumber report]

[image: blackbox cucumber report]

That’s all folks.


CDI Crud Multi “Tenancy”

I’ve just added an example of multi tenancy to our cdi-crud example; now the Car and Movie tables are in different datasources. In fact our example is not truly multi tenant, where you would have the same boundaries but different schemas so that multiple clients can use the same application, each with its own database; see [1] and [2] for eclipselink and hibernate native support for multi tenancy, respectively. The example below is a simple approach to switching between datasources using CDI, although with some changes to our TenantController (e.g. using a system property to define the tenant, or using alternatives as in [3]) we could add some more ‘tenantness’.

As usual there are some arquillian integration tests here.

I will summarize the idea as I’m out of time (also as usual); here are the steps:

I changed the entityManager resolution in our Crud/GenericDao/NanoService/whatever class FROM:

@PersistenceContext
EntityManager em;

public EntityManager getEntityManager(){
    return em;
}

TO:

@Inject
TenantController tenantController;
TenantType tenantType;

public EntityManager getEntityManager(){
    return tenantController.getTenant(tenantType);
}

WHERE TenantController simply has all entityManagers injected and decides which one to return using TenantType:

public class TenantController {

	@PersistenceContext(unitName="CarPU")
	EntityManager carEm;

	@PersistenceContext(unitName="MoviePU")
	EntityManager movieEm;

	public EntityManager getTenant(TenantType type){
		switch (type) {
		case CAR:
			return carEm;
		case MOVIE:
			return movieEm;
			default:{
				Logger.getLogger(getClass().getCanonicalName()).info("no tenant provided, resolving to CarPU");
				return carEm;//falls back to CarPU; consider throwing here instead so the tenant is never resolved by "accident"
			}
		}
	}

}

TenantType is passed via Annotation to each service:

@Stateless
@Tenant(TenantType.MOVIE)
public class MovieService extends CrudService<Movie> {

}

via injection point into Generic Crud:

@Stateless
@Tenant(TenantType.MOVIE)//not a qualifier just an inherited annotation
public class MovieService extends CrudService<Movie> {

@Inject
@Tenant(TenantType.CAR)
Crud<Car> carCrud;//be careful to not pass the wrong entity in generics, see this test: https://github.com/rmpestano/cdi-crud/blob/master/src/test/java/com/cdi/crud/test/MultiTenantIt.java#L93

}

Or programmatically:

@Stateless
public class MovieService extends CrudService<Movie> {

public void someMethod(){
   super.crud(TenantType.CAR).listAll();
  }
}

I’m not producing qualified entityManagers just so I don’t have to qualify Crud.java or the services.

That’s it. And how about you, how do you deal with multiple databases in your applications?

[1]http://wiki.eclipse.org/EclipseLink/Development/Indigo/Multi-Tenancy
[2]http://docs.jboss.org/hibernate/orm/4.2/devguide/en-US/html/ch16.html
[3]http://antoniogoncalves.org/2014/05/25/switch-datasource-with-cdi-alternatives-and-stereotypes/
[4]http://lambda-et-al.eu/multi-tenancy-with-jee-and-jboss/

Arquillian – The aliens are invading

I dedicate this post to my father; most of it was written while I was taking care of him at the hospital. Rest in peace, my best friend!

The idea behind this post is to share my experience with Arquillian[1], which is becoming the de facto standard framework for writing (real) tests in the JavaEE environment (hence the blog title).

We will use arquillian to test a JavaEE6 (compatible with EE7) application; all sources are available at github here. It’s a simple User, Group and Role management application. Also, as usual, there is a video showing what we will see in this entry: http://youtu.be/iGkCcK1EwAQ

Introduction

Arquillian is a testing platform which brings the power of real tests to Java enterprise applications by enabling the easy creation of integration, functional and behaviour tests, among others.

One of the main characteristics of tests written with arquillian is that they run inside a container (a servlet container, CDI container, JavaEE 6+ server, mobile device and so on), the so-called in-container testing[2]. With tests inside a container, the developer or test engineer doesn’t need to be concerned with server infrastructure such as EJB, JPA, CDI and JMS and can focus on the test itself.

To be able to run tests inside a container it’s necessary to specify which classes and resources will be part of the tests; this is called a “micro-deployment” because it’s usually a subset of all application resources.

Micro deployment

Creating micro-deployments is made easy by the ShrinkWrap library, here is an example:

     @Deployment//tells arquillian which method generates the deployment
     public static Archive<?> createDeployment() {
	  WebArchive war = ShrinkWrap.create(WebArchive.class);

          //adding classes
	  war.addClass(MyClass.class).addClass(AnotherClass.class)
          //adding all classes in com.mypackage and in its subpackages
          .addPackages(true, "com.mypackage");
          //adding libs
          MavenResolverSystem resolver = Maven.resolver();
	  //adds primefaces4.0.jar in web-inf/libs
          war.addAsLibraries(resolver.loadPomFromFile("pom.xml").resolve("org.primefaces:primefaces:4.0").withoutTransitivity().asFile());

	  //web-inf resources

	  //copies src/main/webapp/WEB-INF/beans.xml to micro-deployment WEB-INF folder
          war.addAsWebInfResource("src/main/webapp/WEB-INF/beans.xml", "beans.xml");

	 //resources

	 //copies src/test/resources/test-persistence.xml to src/main/resources/META-INF/persistence.xml micro deployment folder
         war.addAsResource("test-persistence.xml", "META-INF/persistence.xml");

	return war;
     }

Execution modes

To perform in-container tests arquillian can be configured to operate in three ways:

  • Embedded: an embedded container version is downloaded via maven at execution time and tests run on top of it.
  • Remote: arquillian connects to a container running on a local or remote machine.
  • Managed: arquillian manages the container startup/shutdown using a container installation. Note that if the container is already running locally, arquillian will connect to it as in remote mode.

Test run modes

There are two ways tests can run with arquillian: in-container and as client.

Running tests in-container means you have full control over the artifacts deployed in the micro-deployment: you can inject beans, access the persistence context, EJBs and so on. This is how we do integration/white box tests.

Running as client is the opposite: we cannot access any deployed resource. This mode simulates a client accessing our application from the outside and denotes black box testing.

There is also a mixed mode where some tests run as client and others inside the container.

Here is a detailed explanation of the arquillian test run modes.

Test lifecycle

Every arquillian test execution follows these steps:

  1.  Creation of the micro-deployment; it can be a jar, war or ear. If a dependency cannot be collected (resource not found), the process is aborted.
  2.  Startup of the container (embedded and managed modes) or connection to a running container (remote mode) where tests will run. The process is aborted if the server doesn’t start within 60 seconds or isn’t found.
  3. Deploy of the micro-deployment generated in step 1. If a dependency is missing, e.g. beans a deployed CDI bean depends on are absent, arquillian will not run the tests related to this micro-deployment.
  4. Tests related to the step 3 micro-deployment are executed
  5. Undeploy of the step 3 micro-deployment
  6. Container shutdown (managed or embedded) or disconnect (remote)

OBS: steps 1, 3, 4 and 5 repeat for each @Deployment – @RunWith(Arquillian.class) pair present in the test class.

Extensions:

The framework is composed by various extensions, here are the ones we will use:

  • Core: the base of all other extensions; responsible for deploy/undeploy, container startup and shutdown, and managing the test life cycle.
  • Drone: integration with selenium, e.g. webdriver management/injection.
  • Graphene: provides a layer of abstraction on top of webdriver, extending its functionality to ease the creation of functional tests.
  • Persistence: brings database management to tests via DBUnit.
  • JBehave: enables BDD tests in arquillian through JBehave.
  • Warp: enables gray box test creation.
  • Recorder: allows recording tests as video and images.
  • Rest: enables testing of restful endpoints.
  • JaCoCo: test coverage metrics.

For more extensions refer to the arquillian github organization and the reference guide.

Configuring a project to use Arquillian

Following are the main steps to configure arquillian.

Dependencies

Below is a basic arquillian pom.xml

 <dependencies>
      <!-- Test dependencies -->
       <dependency>
            <groupId>junit</groupId> <!--testNG is also supported-->
            <artifactId>junit</artifactId>
            <version>4.8.2</version>
            <scope>test</scope>
        </dependency>
        <!-- arquillian test framework set to junit, could be testNG -->
        <dependency>
            <groupId>org.jboss.arquillian.junit</groupId>
            <artifactId>arquillian-junit-container</artifactId>
            <scope>test</scope>
        </dependency>
       <!-- shrinkWrap resolvers -->
        <dependency>
            <groupId>org.jboss.shrinkwrap.resolver</groupId>
            <artifactId>shrinkwrap-resolver-depchain</artifactId>
            <scope>test</scope>
            <type>pom</type>
        </dependency>
	 <dependency>
            <groupId>org.jboss.spec.javax.annotation</groupId>
            <artifactId>jboss-annotations-api_1.1_spec</artifactId>
            <version>1.0.1.Final</version>
            <scope>provided</scope>
        </dependency>
        <dependency>
            <groupId>org.jboss.spec.javax.ejb</groupId>
            <artifactId>jboss-ejb-api_3.1_spec</artifactId>
            <version>1.0.2.Final</version>
            <scope>provided</scope>
        </dependency>
        <dependency>
            <groupId>org.jboss.arquillian.protocol</groupId>
            <artifactId>arquillian-protocol-servlet</artifactId>
            <scope>test</scope>
        </dependency>
		<!-- container adapter -->
	 <dependency><!-- server and mode(managed)-->
            <groupId>org.jboss.as</groupId>
            <artifactId>jboss-as-arquillian-container-managed</artifactId>
            <scope>test</scope>
        </dependency>
	  <!-- end test dependencies -->
</dependencies>

Arquillian uses the concept of a maven BOM (bill of materials), where a dependency of type pom dictates the recommended versions (which can be overridden) of the declared dependencies; this is why we didn’t declare the version of most of the above dependencies. Here is the arquillian core BOM, which must be declared in the dependencyManagement section:

   <dependencyManagement>
        <dependencies>
            <dependency>
                <groupId>org.jboss.arquillian</groupId>
                <artifactId>arquillian-bom</artifactId>
                <version>1.1.4.Final</version>
                <type>pom</type>
                <scope>import</scope>
            </dependency>
        </dependencies>
    </dependencyManagement>

Configuration file

Arquillian centralizes its configuration in arquillian.xml, which contains container adapter and extension configuration. It must be located at src/test/resources; here is an example:

<arquillian xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
	xmlns="http://jboss.org/schema/arquillian"
	xsi:schemaLocation="
        http://jboss.org/schema/arquillian
        http://jboss.org/schema/arquillian/arquillian_1_0.xsd">

	<!-- Force the use of the Servlet 3.0 protocol with all containers, as it is the most mature -->
	<defaultProtocol type="Servlet 3.0" />

	<container qualifier="jboss-remote" >
		<configuration>
			<property name="managementAddress">127.0.0.1</property>
			<property name="managementPort">9999</property>

		</configuration>
	</container>
	<container qualifier="jboss-managed"  default="true" >
		<configuration>
		   <!-- jbossHome can be replaced by JBOSS_HOME maven environmentVariable-->
	    	   <property name="jbossHome">#{arquillian.serverHome}</property>
	           <property name="outputToConsole">true</property>
                   <property name="javaVmArguments">-Xmx512m -XX:MaxPermSize=256m -Djboss.bind.address=localhost</property>
		   <!-- makes it behave like the remote adapter if the container is already started -->
		   <property name="allowConnectingToRunningServer">true</property>
                </configuration>
	</container>
</arquillian>
   

Basically what we have are adapter configurations; in this example jboss managed is active by default. Note that the container activated in arquillian.xml must have its adapter dependency present in the test classpath when running the tests, in our case jboss-as-arquillian-container-managed.

To switch between adapters you can either hardcode the default property (as above) or use the arquillian.launch file, which may contain the name of the qualifier to be used by the tests.
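For example, an arquillian.launch file in src/test/resources containing nothing but the qualifier selects that container:

```
jboss-managed
```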

To make arquillian.launch dynamic you could use a maven property instead of a constant adapter qualifier, but a better approach is to set a system property in the maven surefire plugin; for example, the jboss-managed maven profile sets arquillian.launch to jboss-managed as below:

	<profile>
            <id>jboss-managed</id>
             <dependencies>
                <dependency>
                    <groupId>org.jboss.as</groupId>
                    <artifactId>jboss-as-arquillian-container-managed</artifactId>
                    <scope>test</scope>
                    <version>${jboss.version}</version>
                </dependency>
            </dependencies>
            <build>
                <plugins>
                    <plugin>
                        <groupId>org.apache.maven.plugins</groupId>
                        <artifactId>maven-surefire-plugin</artifactId>
                        <version>2.16</version>
                        <configuration>
                            <systemPropertyVariables>
                                <arquillian.launch>jboss-managed</arquillian.launch>
                            </systemPropertyVariables>
                            <environmentVariables>
                                <JBOSS_HOME>${arquillian.serverHome}</JBOSS_HOME>
                            </environmentVariables>
                        </configuration>
                    </plugin>
           </plugins>
         </build>
     </profile>

Using this approach we guarantee that the adapter dependency will be present in the classpath. Also note that the JBOSS_HOME environment variable is there to specify the container location (needed by the managed adapter); we use ${arquillian.serverHome} so it can be overridden on the maven command line, e.g. when executed by CI.
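
For instance, a CI job could override the server location on the command line (the path below is only illustrative):

```
mvn clean test -Pjboss-managed -Darquillian.serverHome=/opt/jboss-eap-6.2
```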

Hello Arquillian

With maven dependencies and arquillian.xml configured to use jboss as test container in managed mode we already can create an arquillian (integration)test.

@RunWith(Arquillian.class)
public class HelloArquillianIt {//the 'It' suffix is used to identify different types of test in pom.xml, detailed in the tips section

    @Deployment
    public static WebArchive createDeployment() {
        WebArchive war = ShrinkWrap.create(WebArchive.class);
        war.addPackages(true, "org.conventions.archetype.model");//entities
        war.addClasses(RoleService.class, RoleServiceImpl.class) //we will test RoleService
        .addClasses(UserService.class, UserServiceImpl.class)//used by SecurityInterceptorImpl.java
        .addPackages(true, "org.conventions.archetype.qualifier")//@ListToUpdate, @see roleServiceImpl
        .addClass(TestService.class);
        war.addPackages(true, "org.conventions.archetype.security");//security interceptor @see beans.xml
        war.addPackages(true, "org.conventions.archetype.event");//UpdateListEvent @see RoleServiceImpl#afterStore
        war.addPackages(true, "org.conventions.archetype.util");
        //LIBS
        MavenResolverSystem resolver = Maven.resolver();
        war.addAsLibraries(resolver.loadPomFromFile("pom.xml").resolve("org.conventionsframework:conventions-core:1.1.2").withTransitivity().asFile());//conventions is an experimental framework that enables some JavaEE utilities
        war.addAsLibraries(resolver.loadPomFromFile("pom.xml").resolve("org.primefaces:primefaces:4.0").withoutTransitivity().asSingleFile());

        //WEB-INF
        war.addAsWebInfResource(new File(WEB_INF,"beans.xml"), "beans.xml");//same app beans.xml
        war.addAsWebInfResource(new File(WEB_INF,"web.xml"), "web.xml");//same app web.xml
        war.addAsWebInfResource(new File(WEB_INF,"faces-config.xml"), "faces-config.xml");

        war.addAsWebInfResource("jbossas-ds.xml", "jbossas-ds.xml");//datasource

        //resources
        war.addAsResource("test-persistence.xml", "META-INF/persistence.xml");//customized persistence.xml to use a different database 

        return war;
    }

    @Inject
    RoleService roleService;// service real instance

    @Test
    public void shouldListRolesWithSuccess(){
        assertEquals(roleService.crud().countAll(),???????????);// how many roles?
    }

}

As you can see, the most difficult part is creating the test deployment: you need to discover each component's dependencies, and you will usually face some
NoClassDefFoundError or UnsatisfiedDependencyException. After that, though, you can move most deployment entries to a utility class and reuse them between your tests, see Deployments.java.

Also note that we could not make the assertion in the test method because we don't know how many roles there are in the database. By the way, which database is the test using? It uses the one declared in test-persistence.xml (added in the micro-deployment), which uses a maven property ${datasource} that is
defined in pom.xml. For tests using JBoss or WildFly we are going to use 'exampleDS' as the datasource.

The test datasource uses exampleDS2 because it is good practice for tests to have their own database; as exampleDS2 is not configured in the server, we add it on demand in our micro-deployment via jbossas-ds.xml.
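
As a sketch, a deployable *-ds.xml descriptor looks roughly like the following; the JNDI name, H2 URL and credentials below are assumptions for illustration, not taken from the project:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<datasources xmlns="http://www.jboss.org/ironjacamar/schema">
    <datasource jndi-name="java:jboss/datasources/ExampleDS2"
                pool-name="ExampleDS2" enabled="true">
        <!-- in-memory database so each test run starts clean -->
        <connection-url>jdbc:h2:mem:archetype-test;DB_CLOSE_DELAY=-1</connection-url>
        <driver>h2</driver>
        <security>
            <user-name>sa</user-name>
            <password>sa</password>
        </security>
    </datasource>
</datasources>
```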

Ok, we answered which database, but we still don't know how to populate the test database so we can make assertions on top of it.

One way is to initialize the database before each test using a utility class; here is TestService:

@Named
@Stateless
public class TestService implements Serializable {

    @PersistenceContext(unitName = "archetypeTestPU")
    EntityManager em;	

    @Inject
    RoleService roleService;

    public void createRoleDataset(){
        clearDatabase();

        Role roleArch = new Role("architect");
        Role roleAdmin = new Role("administrator");

        em.persist(roleAdmin);
        em.persist(roleArch);
        em.flush();

    }

    public void clearDatabase() {

        em.createNativeQuery("delete from group__role_").executeUpdate();//intermediate tables
        em.flush();
        em.createNativeQuery("delete from user__group_").executeUpdate(); //intermediate tables
        em.flush();
        em.createNativeQuery("delete from role_").executeUpdate();//role
        em.flush();
        em.createNativeQuery("delete from group_").executeUpdate();//group
        em.flush();
        em.createNativeQuery("delete from user_").executeUpdate();//user
        em.flush();
    }

}

Here is our integration test using TestService to initialize the database:

    //testService needs to be added to deployment with .addClass(TestService.class); so it can be injected into test.

    @Inject
    TestService testService;

    @Test
    public void shouldListRolesWithSuccess(){
        testService.createRoleDataset();
        int numRoles = roleService.crud().countAll();
        assertEquals(2, numRoles);
    }

Running the test

To run the test from the IDE, just right-click HelloArquillianIt.java and choose run/debug as JUnit test (don't forget to activate the adapter maven profile in the IDE).

Also be careful with the JBOSS_HOME environment variable: in Eclipse it must also be set (IntelliJ sets it automatically for you).

To run via maven, surefire plugin must be configured:


	<plugin>
		<groupId>org.apache.maven.plugins</groupId>
		<artifactId>maven-surefire-plugin</artifactId>
		<version>2.16</version>
		<configuration>
			<skipTests>false</skipTests>
			<includes>
				<include>**/HelloArquillianIt.java</include>
			</includes>
		</configuration>
	</plugin>

 
If surefire is defined in the main build tag of pom.xml then the tests will be executed on each build (mvn package, install or test). We will see in the
TIPS section how to separate tests into specific maven profiles.

Another way to initialize the database is to use DBUnit through the Arquillian persistence extension.

Managing test database with arquillian persistence

The persistence extension helps with writing tests where the persistence layer is involved, e.g. preparing a test dataset, making dataset assertions and so on; it is based on the DBUnit framework.

Dependencies

<!--arquillian persistence(dbunit) -->
		<dependency>
			<groupId>org.jboss.arquillian.extension</groupId>
			<artifactId>arquillian-persistence-api</artifactId>
			<version>1.0.0.Alpha7</version>
			<scope>test</scope>
		</dependency>
		<dependency>
			<groupId>org.jboss.arquillian.extension</groupId>
			<artifactId>arquillian-persistence-dbunit</artifactId>
			<version>1.0.0.Alpha7</version>
			<scope>test</scope>
		</dependency>

Configuration file

    <extension qualifier="persistence">
        <property name="defaultDataSource">${datasource}</property>
        <!--<property name="defaultDataSeedStrategy">CLEAN_INSERT</property>-->
    </extension>

${datasource} is set via a maven property in pom.xml so we can have a dynamic datasource and run the tests against different servers; for example, on JBoss AS the tests must use the datasource java:jboss/datasources/ExampleDS.

Example

    @Test
    @UsingDataSet("role.yml")
    @Cleanup(phase = TestExecutionPhase.BEFORE)
    public void shouldListRolesUsingDataset(){
        int numRoles = roleService.crud().countAll();
        log.info("COUNT:"+numRoles);
        assertEquals(3, numRoles);
    }

role.yml is a file located at src/test/resources/datasets folder with the following content describing 3 roles:


role_:
  - id: 1
    name: "role"
    version: 1
    createDate: 2014-01-01
    updateDate: 2014-01-01
  - id: 2
    name: "role2"
    version: 1
    createDate: 2014-01-01
    updateDate: 2014-01-01
  - id: 3
    name: "role3"
    version: 1
    createDate: 2014-01-01
    updateDate: 2014-01-01

Other formats like JSON and XML are also supported.
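
For instance, the same dataset could hypothetically be written as a JSON file (column names taken from the YAML above; the file name passed to @UsingDataSet is up to you):

```json
{
    "role_": [
        { "id": 1, "name": "role",  "version": 1, "createDate": "2014-01-01", "updateDate": "2014-01-01" },
        { "id": 2, "name": "role2", "version": 1, "createDate": "2014-01-01", "updateDate": "2014-01-01" },
        { "id": 3, "name": "role3", "version": 1, "createDate": "2014-01-01", "updateDate": "2014-01-01" }
    ]
}
```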

It is also possible to execute scripts before/after each test with @ApplyScriptBefore/@ApplyScriptAfter and to make dataset assertions with @ShouldMatchDataSet.

Some limitations:

  • large datasets are hard to maintain
  • dataset values are static, although the next version should provide a way to feed datasets with expressions
  • it can degrade test performance (this seems to be fixed in the next version, alpha7)
  • it doesn't work with client (black box) tests

Behaviour driven tests(BDD)

BDD[4] with Arquillian is implemented by the arquillian-jbehave extension through the JBehave[5] framework, where a story file (.story) dictates the behaviour of a functionality in natural language using the 'given/when/then' tuple, linking the story to the tests that execute the described behaviour.

Dependencies


	<dependency>
		<groupId>org.jboss.arquillian.jbehave</groupId>
		<artifactId>arquillian-jbehave-core</artifactId>
		<version>1.0.2</version>
		<scope>test</scope>
	</dependency>

	<!-- although jbehave extension works great and integrates well with most other extensions it doesn't <a href="https://community.jboss.org/message/865003#865003" target="_blank">have an official release</a> -->
        <repository>
            <id>arquillian jbehave UNofficial maven repo</id>
            <url>http://arquillian-jbehave-repo.googlecode.com/git/</url>
            <layout>default</layout>
        </repository>

Example

Here follows a BDD example; all BDD-related tests can be found here.

@RunWith(Arquillian.class)
@Steps(UserSteps.class)
public class UserBdd extends BaseBdd {

    @Deployment
    public static WebArchive createDeployment()
    {
        WebArchive archive = Deployments.getBaseDeployment() //same deployment we saw before
                .addPackage(BaseBdd.class.getPackage())
                .addClass(UserSteps.class);
        return archive;
    }

}

The superclass BaseBdd is used to configure JBehave, for example to customize the test output report and to initialize the Steps: the classes that execute the test behaviour (given/when/then).

The mapping between step class and story file works in the following way:

  • Arquillian looks for the story file in the same package as the class that extends JUnitStory, in this case UserBdd, since it extends BaseBdd which extends JUnitStory.
  • by convention the story file must have the same name as the class that extends JUnitStory, but with '_' (underscore) instead of camel case; so, for example, the UserBdd story file is user_bdd.story. This convention can be configured in BaseBdd, or you can override the configuration() method.
  • to execute the behaviour, JBehave uses one or more step classes provided in the @Steps(MyStepClass.class) annotation.
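To make the naming convention concrete, here is a small self-contained sketch (not the extension's actual implementation) of the camel case to underscore mapping it implies:

```java
public class StoryNameConvention {

    // Illustrates the convention only: "UserBdd" -> "user_bdd.story".
    public static String storyFileFor(String className) {
        // put an underscore before each upper-case letter that follows a
        // lower-case letter or digit, then lower-case everything
        return className.replaceAll("([a-z0-9])([A-Z])", "$1_$2").toLowerCase() + ".story";
    }

    public static void main(String[] args) {
        System.out.println(storyFileFor("UserBdd")); // user_bdd.story
        System.out.println(storyFileFor("RoleAt"));  // role_at.story
    }
}
```

The user_bdd.story file below follows exactly this convention.
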
Story: manage users

Scenario: listing users by role

When i search users with role [name]

Then users found is equal to [total]

Examples:
|name|total|
|developer|2|
|administrator|1|
|secret|0|

//other scenarios related to user

and the step class that executes the behaviour, UserSteps.java:

public class UserSteps extends BaseStep implements Serializable {

    private Integer totalUsersFound;

    private String message;

    @BeforeStory
    public void setUpStory(){
	//initialize story
        testService.initDatabaseWithUserAndGroups();
    }

    @BeforeScenario
    public void setUpScenario(){
        //initialize scenario
    }

    @When("i search users with role $name")
    public void searchUserByRole(@Named("name") String roleName){
           totalUsersFound = userService.findUserByRole(new Role(roleName)).size();
    }

    @Then("users found is equal to $total")
    public void usersFoundEqualTo(@Named("total") Integer total){
        assertEquals(total, totalUsersFound);
    }
}

The matching between a story and its test (step) is done by exact string comparison, but you can use the @Alias annotation to match multiple strings for one step.

For more information about JBehave see its (great) documentation at jbehave.org.

JBehave enables acceptance tests which can be white box (sometimes called system acceptance tests, usually defined by the development team because they cover internal system logic) or black box (user acceptance tests, usually written by or with the client, i.e. the final user). We will cover black box BDD tests later.

To execute bdd tests in our app use:

mvn clean test -Pwildfly-managed -Pbdd-tests

Functional Tests

The tests we have seen so far were white box, in other words executed in the same process (JVM) as the container, so we have direct access (through injection) to the objects deployed by the micro-deployment. Functional tests are black box and run “from outside” the container, in a different JVM, simulating a client accessing the system through the user interface.

Arquillian enables functional tests through the Drone and Graphene extensions. Both work on top of Selenium: the first manages the WebDriver lifecycle and injection, and the second extends Selenium functionality with PageObjects[6], jQuery selectors, page fragments, simplified waits and so on.

Dependencies

		<dependency>
			<groupId>org.jboss.arquillian.graphene</groupId>
			<artifactId>graphene-webdriver</artifactId>
			<type>pom</type>
			<scope>test</scope>
			<version>${version.graphene}</version>
		</dependency>

<dependencyManagement>
		<dependencies>
			<dependency>
				<groupId>org.jboss.arquillian.selenium</groupId>
				<artifactId>selenium-bom</artifactId>
				<version>${version.selenium}</version>
				<type>pom</type>
				<scope>import</scope>
			</dependency>
			<dependency>
				<groupId>org.jboss.arquillian.extension</groupId>
				<artifactId>arquillian-drone-bom</artifactId>
				<version>${version.drone}</version>
				<type>pom</type>
				<scope>import</scope>
			</dependency>
		</dependencies>
</dependencyManagement>

Note that the extension uses two BOMs: the selenium-bom updates the WebDriver version regardless of the version that comes with the Drone BOM, but it needs to be declared first because, for identical dependencies (one from each BOM), the first has precedence in the dependencyManagement section.

Configuration file

<extension qualifier="graphene">
	   <property name="waitGuiInterval">3</property>
	   <property name="waitAjaxInterval">4</property>
	   <property name="waitModelInterval">5</property>

	</extension>
	 <extension qualifier="webdriver">
<!--         <property name="browser">firefox</property> -->
        <property name="browser">${arquillian.browser}</property>
        <property name="dimensions">1280x1024</property>
        <property name="remoteReusable">${arquillian.remoteReusable}</property>
 		<property name="remote">${arquillian.remote}</property>
        <property name="remoteAddress">${arquillian.seleniumGrid}</property>
        <property name="chromeDriverBinary">${arquillian.chromeDriver}</property>
        <!-- property to run Chrome locally; download the driver at: http://code.google.com/p/chromedriver/downloads/list -->
    </extension>

For more configuration details see the Drone and Graphene documentation.

Example

Here follows a functional testing example; all functional test source can be found here.

The first step is the micro-deployment:

    @Deployment(testable = false)
    public static Archive<?> createDeployment() {
        WebArchive war = Deployments.getBaseDeployment()
        .addPackages(true, UserMBean.class.getPackage()) //managed beans
        .addPackages(true,"org.conventions.archetype.converter");//faces converters

        //web resources (pages, js, css etc...)
        war.merge(ShrinkWrap.create(GenericArchive.class).as(ExplodedImporter.class).importDirectory(WEBAPP_SRC).as(GenericArchive.class), "/", Filters.include(".*\\.(xhtml|html|css|js|png)$"));
        war.addAsWebResource(new File(TEST_RESOURCES, "/pages/test-logon.xhtml"), "/templates/logon.xhtml");//test logon initialize database for black box tests on each logon, we will detail it later in TIPS section
        System.out.println(war.toString(true));
        return war;
    }

The main difference is the property testable = false, which indicates that the test will run as a client (in a separate JVM). This is necessary for Drone to get into action, since it executes as a separate process.

We also need to add the managed beans (bean package), converters and web-related resources
such as pages and js files to the test deployment. That is done via ShrinkWrap Filters, which accept regular expressions, keeping this task simple and generic; but be careful, because you are adding all files of a given type (e.g. .js), and if the application is big this can slow down the deployment process.
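
As a quick illustration of what that filter regex accepts, independent of ShrinkWrap (plain java.util.regex):

```java
import java.util.regex.Pattern;

public class WebResourceFilterDemo {

    // the same expression passed to Filters.include in the deployment above
    static final Pattern WEB_RESOURCES =
            Pattern.compile(".*\\.(xhtml|html|css|js|png)$");

    public static boolean included(String path) {
        return WEB_RESOURCES.matcher(path).matches();
    }

    public static void main(String[] args) {
        System.out.println(included("/templates/logon.xhtml")); // true
        System.out.println(included("/resources/js/app.js"));   // true
        System.out.println(included("/WEB-INF/web.xml"));       // false: xml is not in the list
    }
}
```
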

Here is logon example:

    @Drone
    WebDriver browser;

    @Test
    @InSequence(1)
    public void shouldLogonWithSuccess(@InitialPage HomePage home){
        assertTrue(home.getLogonDialog().isPresent());
        home.getLogonDialog().doLogon("admin", "admin");
        home.verifyMessage(resourceBundle.getString("logon.info.successful"));//asserts primefaces growl message
    }

Observations

– WebDriver is injected by Drone, and it knows which browser to use through the arquillian.xml property ${arquillian.browser}, which is defined in pom.xml.
– @InitialPage denotes the page that WebDriver must navigate to before executing the test.
– HomePage is a PageObject[6], here is its code:

@Location("home.faces")
public class HomePage extends BasePage{

    @FindByJQuery("div[id$=logonDialog]")
    private LogonDialog logonDialog;

    public LogonDialog getLogonDialog() {
        return logonDialog;
    }
}

Also note:
– @Location annotation helps Drone to navigate via @InitialPage
– BasePage has some utilities for page objects
– @FindByJQuery is an extension of Selenium's FindBy selector and is based on jQuery selectors
– LogonDialog is a pageFragment and here is its code:

public class LogonDialog {

    @Root
    private GrapheneElement dialog;

    @FindByJQuery("input[id$=inptUser]")
    private GrapheneElement username;

    @FindByJQuery("input[id$=inptPass]")
    private GrapheneElement password;

    @FindByJQuery("button[id$=btLogon]")
    private GrapheneElement btLogon;

    public void doLogon(String username, String password){
        this.username.clear();
        this.username.sendKeys(username);
        this.password.clear();
        this.password.sendKeys(password);
        guardHttp(btLogon).click();
    }

    public boolean isPresent(){
        return this.username.isPresent();
    }
}

About LogonDialog fragment:

– GrapheneElement is an extension of the Selenium WebElement
– guardHttp is a Graphene implicit wait that blocks the test until the HTTP request is done; there is also an ajax guard, which is very useful
– there are also explicit waits like Graphene.waitModel(), whose timeout is configured in arquillian.xml

To execute functional tests in our app use:

mvn clean test -Pwildfly-managed -Pft-tests

User Acceptance tests

In this article I will call user acceptance tests[7] the combination of functional tests with JBehave, in other words, black box behaviour driven tests. It is important to separate the different types of tests because you will probably want to execute them at different moments, e.g. faster (white box) tests on each commit, and slower, resource-consuming ones at the end of the day.

As this kind of test is a combination of other kinds of tests, we won't need any specific configuration in arquillian.xml nor any dependency in the pom.

Example

@Steps({ RoleStep.class, LogonStep.class })
@RunWith(Arquillian.class)
public class RoleAt extends BaseAt {

    @Deployment(testable = false)
    public static WebArchive createDeployment()
    {
        WebArchive archive = createBaseDeployment();

        System.out.println(archive.toString(true));
        return archive;
    }

}

BaseAt combines elements of functional tests(testable=false) and bdd tests that we saw earlier.

role_at.story (remember the naming convention):


Story: manage user roles

GivenStories: org/conventions/archetype/test/at/logon/logon_at.story

Scenario: insert new role

Given user go to role home

When user clicks in new button

Then should insert role with name new role

Scenario: search roles

Given user go to role home

When user filter role by name [name]

Then should list only roles with name [name]

And return [total] rows

Examples:
|name|total|
|developer|1|
|admin|1|
|a|2|

public class RoleStep implements Serializable {

    @Drone
    private WebDriver browser;

    @ArquillianResource
    private URL baseUrl;

    @Page
    private RoleHome roleHome;

    @FindByJQuery("div[id$=menuBar]")
    private Menu menu;

    @Given("user go to role home")
    public void goToRoleHome() {
        menu.gotoRoleHome();
    }

    @When("user clicks in new button")
    public void userClickInNewButton() {
        WebElement footer = roleHome.getFooter();
        assertTrue(footer.isDisplayed());
        WebElement newButton = footer.findElement(By.xpath("//button"));
        assertTrue(newButton.isDisplayed());
        guardHttp(newButton).click();
    }

    @Then("should insert role with name $name")
    public void shouldInsertRole(String name) {
        Assert.assertTrue(roleHome.isFormPage());
        roleHome.insertRole(name);

    }

    @When("user filter role by name $name")
    public void userFilterRolesBy(@Named("name") String name) {
        roleHome.filterByName(name);
    }

    @Then("should list only roles with name $name")
    public void shouldListRolesWith(@Named("name") String name) {
        for (WebElement row : roleHome.getTableRows("table")) {
            assertTrue(row.findElement(By.xpath("//td[@role='gridcell']//span[contains(@id,'name')]")).getText().contains(name));
        }
    }

    @Then("return $total rows")
    public void shouldReturn(@Named("total") Integer total){
         assertEquals(roleHome.getTableRows("table").size(), total.intValue());
    }

}

RoleHome is a PageObject that controls the role form (roleHome.xhtml); its source can be found here.

logon_at.story

 
Story: logon as user

Scenario: user should logon successfully

Given i am at logon screen

When i logon providing credentials admin, admin

Then i should be logged in

public class LogonStep extends BaseAtStep implements Serializable {

  @Drone
  private WebDriver       browser;

  @ArquillianResource
  private URL             baseUrl;

  @FindByJQuery("div[id$=menuBar]")
  private Menu menu;

  @Page
  private HomePage home;

  @Given("i am at logon screen")
  public void imAtLogon() {
    if (!home.getLogonDialog().isPresent()) {
      //if is already logged in, do logout
        super.goToPage(home);
      }
  }

  @When("i logon providing credentials $username, $password")
  public void loginWithCredentials(String username, String password) {
    home.getLogonDialog().doLogon(username,password);
  }

  @Then("i should be logged in")
  public void shouldBeAt() {
    home.verifyMessage(resourceBundle.getString("logon.info.successful"));
  }

}

To execute user acceptance tests in our example project use this maven command:

mvn clean test -Pwildfly-managed -Pat-tests

Gray box tests with Warp

With gray box testing[3] we can fire a request as a client (e.g. via WebDriver) and still inspect internal objects (as in white box tests) such as the HTTP session, the FacesContext, etc.

Warp tests of our example app can be found here.

Dependencies

	<dependency>
		<groupId>org.jboss.arquillian.extension</groupId>
		<artifactId>arquillian-warp</artifactId>
		<type>pom</type>
		<scope>test</scope>
		<version>1.0.0.Alpha7</version>
	</dependency>
	<dependency>
		<groupId>org.jboss.arquillian.extension</groupId>
		<artifactId>arquillian-warp-jsf</artifactId>
		<version>1.0.0.Alpha7</version>
	</dependency>

Example


@RunWith(Arquillian.class)
@WarpTest
@RunAsClient
public class LogonWarp {

    protected static final String WEBAPP_SRC = "src/main/webapp";

    protected static final String TEST_RESOURCES = "src/test/resources";

    @Drone
    protected WebDriver browser;

    @ArquillianResource
    protected URL baseUrl;

    @Deployment(testable = true)
    public static Archive<?> createDeployment() {
        WebArchive war = Deployments.getBaseDeployment()
                .addPackages(true, UserMBean.class.getPackage()) //managed beans
                .addPackages(true,"org.conventions.archetype.converter");

        //web resources
        war.merge(ShrinkWrap.create(GenericArchive.class).as(ExplodedImporter.class).importDirectory(WEBAPP_SRC).as(GenericArchive.class), "/", Filters.include(".*\\.(xhtml|html|css|js|png)$"));
        war.addAsWebResource(new File(TEST_RESOURCES, "/pages/test-logon.xhtml"), "/templates/logon.xhtml");//test logon clears the database on each logon
        System.out.println(war.toString(true));
        return war;
    }

        @Test
        @InSequence(1)
        public void shouldLogonWithSuccess(@InitialPage final HomePage home){
            assertTrue(home.getLogonDialog().isPresent());
            Warp.initiate(new Activity() {

                @Override
                public void perform() {
                    home.getLogonDialog().doLogon("admin", "admin");

                }
            })
                    .observe(request().method().equal(HttpMethod.POST).uri().contains("home.faces"))
                    .inspect(new Inspection() {
                        private static final long serialVersionUID = 1L;

                        @Inject
                        SecurityContext securityContext;

                        @Inject
                        ResourceBundle resourceBundle;

                        @BeforePhase(Phase.INVOKE_APPLICATION)
                        public void shouldNotBeLoggedIn() {
                            System.out.println("shouldNotBeLoggedIn:"+securityContext.loggedIn());
                            assertFalse(securityContext.loggedIn());
                        }

                        @AfterPhase(Phase.INVOKE_APPLICATION)
                        public void shouldBeLoggedIn(@ArquillianResource FacesContext context) {
                            System.out.println("shouldBeLoggedIn:"+securityContext.loggedIn());
                            assertTrue(securityContext.loggedIn());
                            boolean loggedInMessage = false;
                            for (FacesMessage facesMessage : context.getMessageList()) {
                                  if(facesMessage.getSummary().equals(resourceBundle.getString("logon.info.successful"))){
                                      loggedInMessage = true;
                                  }
                            }
                            assertTrue(loggedInMessage);
                        }

                    });
        }
}

Gray box tests mix elements of black and white box tests. This is achieved with Arquillian's mixed mode, where we have a Deployment(testable = true), which denotes white box testing, and at the same time @RunAsClient for the black box part.

The black box part is represented by the Activity interface which has a perform method that is responsible for firing client requests.

The white box (server-side execution) part is represented by the Inspection interface; within an inspection you can do anything you would do in white box testing, such as accessing CDI beans through injection.

Also note that you can observe specific requests (an Activity may start multiple requests) with the observe method.

In our example we first log on to the application via the user interface with Drone/Graphene, and later we access an internal system object, in this case SecurityContext, to check that the user is in fact logged in.

Tips and recommendations

Notation

I’m using the following suffix convention to differentiate tests:

it: integration (white box) tests (e.g. UserIt.java)
ft: functional (black box) tests (e.g. UserFt.java)
bdd: white box (system) acceptance tests (e.g. UserBdd.java)
at: black box (user) acceptance tests (e.g. UserAt.java)
warp: gray box tests (e.g. LogonWarp.java)

This way we ease the management of test profiles.

Test profiles

Profiles are used to separate the different types of test so you can run them at different moments (e.g. run lightweight tests more frequently or earlier).

With the suffix notation we can separate the tests in the surefire plugin as follows:

		<profile>
			<!-- all tests -->
			<id>all-tests</id>
			<build>
				<plugins>
					<plugin>
						<groupId>org.apache.maven.plugins</groupId>
						<artifactId>maven-surefire-plugin</artifactId>
						<version>2.16</version>
						<configuration>
							<skipTests>false</skipTests>
							<runOrder>reversealphabetical</runOrder>
							<includes>
								<include>**/*Ft.java</include><!--functional -->
								<include>**/*It.java</include><!--integration -->
								<include>**/*UnitTests.java</include><!--unit -->
								<include>**/*Bdd.java</include><!--white acceptance -->
								<include>**/*At.java</include><!--black acceptance -->
								<include>**/*Warp.java</include><!--gray box -->
							</includes>
							<excludes>
<!-- avoid execution of test superclasses -->
								<exclude>**/*BaseFt.java</exclude>
								<exclude>**/*BaseBdd.java</exclude>
								<exclude>**/*BaseIt.java</exclude>
								<exclude>**/*BaseAt.java</exclude>
								<exclude>**/*BaseWarp.java</exclude>
							</excludes>

						</configuration>
					</plugin>
				</plugins>
			</build>
		</profile>

		<profile>
			<!-- only integration tests and bdd(white box) -->
			<id>it-tests</id>
			<build>
				<plugins>
					<plugin>
						<groupId>org.apache.maven.plugins</groupId>
						<artifactId>maven-surefire-plugin</artifactId>
						<version>2.16</version>
						<configuration>
							<skipTests>false</skipTests>
							<runOrder>reversealphabetical</runOrder>
							<includes>
								<include>**/*It.java</include>
								<include>**/*Bdd.java</include>
							</includes>
							<excludes>
								<exclude>**/*BaseBdd.java</exclude>
								<exclude>**/*BaseIt.java</exclude>
							</excludes>

						</configuration>
					</plugin>
				</plugins>
			</build>
		</profile>

		<profile>
			<!-- only functional and user acceptance tests (black box) -->
			<id>ft-tests</id>
			<build>
				<plugins>
					<plugin>
						<groupId>org.apache.maven.plugins</groupId>
						<artifactId>maven-surefire-plugin</artifactId>
						<version>2.16</version>
						<configuration>
							<skipTests>false</skipTests>
							<runOrder>reversealphabetical</runOrder>
							<includes>
								<include>**/*Ft.java</include>
								<include>**/*At.java</include>
							</includes>
							<excludes>
								<exclude>**/*BaseFt.java</exclude>
								<exclude>**/*BaseAt.java</exclude>
							</excludes>
						</configuration>
					</plugin>
				</plugins>
			</build>
		</profile>

Container profiles(adapters)

We talked about three execution modes to run arquillian tests inside a container. To switch between adapters easily we can define maven profiles:

	<profile>
			<id>jboss-remote</id>
			<dependencies>
				<dependency>
					<groupId>org.jboss.as</groupId>
					<artifactId>jboss-as-arquillian-container-remote</artifactId>
					<scope>test</scope>
					<version>7.2.0.Final</version>
				</dependency>
			</dependencies>
                        <build>
				<plugins>
					<plugin>
						<groupId>org.apache.maven.plugins</groupId>
						<artifactId>maven-surefire-plugin</artifactId>
						<version>2.16</version>
						<configuration>
							<systemPropertyVariables>
								<arquillian.launch>jboss-remote</arquillian.launch>
							</systemPropertyVariables>

						</configuration>
					</plugin>
				</plugins>
			</build>
		</profile>

		<profile>
			<id>jboss-managed</id>
			<activation>
				<activeByDefault>true</activeByDefault>
			</activation>
			<properties>
				<arquillian.serverHome>/home/jboss-eap-6.2</arquillian.serverHome>
                        </properties>
			<dependencies>
				<dependency>
					<groupId>org.jboss.as</groupId>
					<artifactId>jboss-as-arquillian-container-managed</artifactId>
					<scope>test</scope>
					<version>7.2.0.Final</version>
				</dependency>
			</dependencies>

			<build>
				<plugins>
					<plugin>
						<groupId>org.apache.maven.plugins</groupId>
						<artifactId>maven-surefire-plugin</artifactId>
						<version>2.16</version>
						<configuration>
							<systemPropertyVariables>
								<arquillian.launch>jboss-managed</arquillian.launch>
							</systemPropertyVariables>
							<environmentVariables>
								<JBOSS_HOME>${arquillian.serverHome}</JBOSS_HOME>
							</environmentVariables>
						</configuration>
					</plugin>
				</plugins>
			</build>
		</profile>

Note that in this case we use surefire only to define properties:

– JBOSS_HOME provides the container path (in the managed case). We keep the path in the arquillian.serverHome Maven property so it can be overridden later by CI (Jenkins).
– arquillian.launch tells Arquillian which adapter must be activated in arquillian.xml.

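For reference, the value passed via arquillian.launch must match a container qualifier in arquillian.xml. A minimal sketch of the managed entry could look like the following (the path is illustrative; when jbossHome is omitted the adapter falls back to the JBOSS_HOME environment variable set by surefire above):

```xml
<!-- arquillian.xml: entry selected by -Darquillian.launch=jboss-managed -->
<container qualifier="jboss-managed">
	<configuration>
		<!-- where the server is installed -->
		<property name="jbossHome">/home/jboss-eap-6.2</property>
		<property name="outputToConsole">true</property>
	</configuration>
</container>
```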
You can also declare the container as a Maven dependency so it is downloaded automatically during the tests, as follows:

  	<profile>
            <id>wildfly-managed</id>
            <properties>
                <arquillian.serverHome>${project.build.directory}/wildfly-${wildfly.version}</arquillian.serverHome>
            </properties>
            <dependencies>
                <dependency>
                    <groupId>org.wildfly</groupId>
                    <artifactId>wildfly-arquillian-container-managed</artifactId>
                    <version>${wildfly.version}</version>
                    <scope>test</scope>
                </dependency>
            </dependencies>
            <build>
                <plugins>
                    <plugin>
                        <groupId>org.apache.maven.plugins</groupId>
                        <artifactId>maven-dependency-plugin</artifactId>
                        <executions>
                            <execution>
                                <id>unpack</id>
                                <phase>process-test-classes</phase>
                                <goals>
                                    <goal>unpack</goal>
                                </goals>
                                <configuration>
                                    <artifactItems>
                                        <artifactItem>
                                            <groupId>org.wildfly</groupId>
                                            <artifactId>wildfly-dist</artifactId>
                                            <version>${wildfly.version}</version>
                                            <type>zip</type>
                                            <overWrite>false</overWrite>
                                            <outputDirectory>${project.build.directory}</outputDirectory>
                                        </artifactItem>
                                    </artifactItems>
                                </configuration>
                            </execution>
                        </executions>
                    </plugin>
                    <plugin>
                        <groupId>org.apache.maven.plugins</groupId>
                        <artifactId>maven-surefire-plugin</artifactId>
                        <version>2.16</version>
                        <configuration>
                            <systemPropertyVariables>
                                <arquillian.launch>jboss-managed</arquillian.launch>
                            </systemPropertyVariables>
                            <environmentVariables>
                                <JBOSS_HOME>${arquillian.serverHome}</JBOSS_HOME>
                            </environmentVariables>
                        </configuration>
                    </plugin>
                </plugins>
            </build>
	</profile>

The maven-dependency-plugin downloads the container (if it's not already in the local Maven repository) and unpacks it inside the target directory. The surefire plugin then uses the
arquillian.serverHome property, which points to the server inside the target directory.

Faster test execution

While developing tests we tend to execute them a lot, which can be very time consuming; imagine executing them inside a container that needs to be started and stopped every time. One way to minimize this is to use the remote container adapter: the developer starts the container once and then runs the tests on top of that running container, so the only time-consuming task (aside from the tests themselves) is deploying and undeploying the micro-deployment. Here is a remote adapter Maven profile:

	<profile>
			<id>jboss-remote</id>

			<dependencies>
				<dependency>
					<groupId>org.jboss.as</groupId>
					<artifactId>jboss-as-arquillian-container-remote</artifactId>
					<scope>test</scope>
					<version>7.2.0.Final</version>
				</dependency>
			</dependencies>
            <build>
                <plugins>
                    <plugin>
                        <groupId>org.apache.maven.plugins</groupId>
                        <artifactId>maven-surefire-plugin</artifactId>
                        <version>2.16</version>
                        <configuration>
                            <systemPropertyVariables>
                                <arquillian.launch>jboss-remote</arquillian.launch>
                            </systemPropertyVariables>
                        </configuration>
                    </plugin>
                </plugins>
            </build>
		</profile>

The system property arquillian.launch selects the container to be activated in arquillian.xml:

	<container qualifier="jboss-remote">
		<configuration>
			<property name="managementAddress">127.0.0.1</property>
			<property name="managementPort">9999</property>
		</configuration>
	</container>

OBS: as you will usually deploy a datasource within your test deployment, be careful with the infamous DuplicateServiceException.
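For context, the datasource usually enters the test deployment as a *-ds.xml descriptor bundled by ShrinkWrap. A minimal, hypothetical H2 descriptor (the JNDI name matches the exception shown later in this post):

```xml
<!-- hypothetical test-ds.xml, added to the micro-deployment with
     war.addAsWebInfResource("test-ds.xml") -->
<datasources xmlns="http://www.jboss.org/ironjacamar/schema">
	<datasource jndi-name="java:jboss/datasources/ArchetypeTestDS"
	            pool-name="ArchetypeTestDS" enabled="true">
		<connection-url>jdbc:h2:mem:test;DB_CLOSE_DELAY=-1</connection-url>
		<driver>h2</driver>
	</datasource>
</datasources>
```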

Avoiding multiple deployments

In white box tests it is possible to avoid multiple deployments (@Deployment) through test dependency injection. To achieve that you just need to separate your test classes and inject them into a common class which holds the Arquillian deployment; see ArchetypeIt.java, where we inject UserIt and RoleIt to leverage the ArchetypeIt deployment.
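The pattern can be sketched as below. Everything except the ArchetypeIt, UserIt and RoleIt names is an assumption, and the snippet naturally requires the Arquillian and ShrinkWrap test dependencies:

```java
import javax.inject.Inject;

import org.jboss.arquillian.container.test.api.Deployment;
import org.jboss.arquillian.junit.Arquillian;
import org.jboss.shrinkwrap.api.ShrinkWrap;
import org.jboss.shrinkwrap.api.asset.EmptyAsset;
import org.jboss.shrinkwrap.api.spec.WebArchive;
import org.junit.Test;
import org.junit.runner.RunWith;

// Sketch: a single @Deployment shared by injected test collaborators
@RunWith(Arquillian.class)
public class ArchetypeIt {

    @Inject
    UserIt userIt; // plain injectable classes holding the test logic

    @Inject
    RoleIt roleIt;

    @Deployment
    public static WebArchive createDeployment() {
        // one micro-deployment contains all collaborating test classes
        return ShrinkWrap.create(WebArchive.class, "archetype-test.war")
                .addClasses(UserIt.class, RoleIt.class)
                .addAsWebInfResource(EmptyAsset.INSTANCE, "beans.xml");
    }

    @Test
    public void shouldInsertUser() {
        userIt.shouldInsertUser(); // delegates, no extra deployment
    }

    @Test
    public void shouldListRoles() {
        roleIt.shouldListRoles();
    }
}
```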

Manipulating test database in functional tests

As we don't have access to deployed classes during black box tests, we can't use our TestService (nor Arquillian persistence) to initialize/prepare the database for the tests. What we do instead is deploy a customized logon page so that each time a login is performed we initialize the database. This is done in BaseFt as follows:

war.addAsWebResource(new File(TEST_RESOURCES, "/pages/test-logon.xhtml"), "/templates/logon.xhtml");

//test logon clears and initializes test database on each logon

The customized logon fires the initialization method via a PrimeFaces remote command each time the logon dialog is opened. It's the same application logon dialog with the additional code shown below:

 <h:form id="loginForm">
            <p:remoteCommand name="init" rendered="#{not loggedIn}" autoRun="true" immediate="true" process="@this" update="@none" partialSubmit="true" actionListener="#{testService.initDatabaseWithUserAndGroups}"/>
</h:form>

This way we guarantee a specific dataset for the tests.

REST testing

For testing RESTful web services we run Arquillian as client (black box) and add our REST endpoint to the micro-deployment.

When running as client we can inject the application URL (which is dynamically generated by Arquillian) into the tests via @ArquillianResource; that's all we need to fire REST requests as a client.
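A client-side REST test along those lines might look like the sketch below; the endpoint path, class names and dataset class are illustrative, and the Arquillian dependencies are assumed:

```java
import java.net.HttpURLConnection;
import java.net.URL;

import org.jboss.arquillian.container.test.api.Deployment;
import org.jboss.arquillian.container.test.api.RunAsClient;
import org.jboss.arquillian.junit.Arquillian;
import org.jboss.arquillian.test.api.ArquillianResource;
import org.jboss.shrinkwrap.api.ShrinkWrap;
import org.jboss.shrinkwrap.api.spec.WebArchive;
import org.junit.Test;
import org.junit.runner.RunWith;

import static org.junit.Assert.assertEquals;

// Sketch: black box REST test running as client
@RunWith(Arquillian.class)
@RunAsClient
public class CarRestIt {

    @ArquillianResource
    URL basePath; // deployment URL generated and injected by Arquillian

    @Deployment(testable = false) // black box: nothing is enriched in-container
    public static WebArchive createDeployment() {
        return ShrinkWrap.create(WebArchive.class, "rest-test.war")
                .addClasses(CarEndpoint.class, RestDataset.class);
    }

    @Test
    public void shouldGetCars() throws Exception {
        HttpURLConnection conn = (HttpURLConnection)
                new URL(basePath, "rest/cars").openConnection();
        assertEquals(200, conn.getResponseCode());
    }
}
```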

The REST test in our example app can be found here.

Also note that we deploy a RestDataset class which is responsible for initializing our database before the tests.

For more advanced RESTful tests, like testing endpoints from the server side, refer to the rest extension.

Running tests in a CI server (Jenkins)

Running tests in Jenkins is the same as running them locally. The only problem you may face is with functional tests, because CI servers usually don't have a graphical interface and therefore can't open a web browser to run the tests. One way to solve that is to run the functional tests remotely on a machine with a
graphical environment and a Selenium grid up and running. If you have that, you just need to pass some Maven parameters to your test command to enable remote functional tests:

mvn clean test -Pjboss-managed -Darquillian.remote=true -Darquillian.seleniumGrid=http://remote-machine-ip:1044/wd/hub -Darquillian.localAddress=jenkins-ip

Note that:

arquillian.remote, arquillian.seleniumGrid and arquillian.localAddress are properties defined in arquillian.xml, used by the webdriver extension, that enable remote functional testing.

http://remote-machine-ip:1044/wd/hub is a Selenium grid waiting for connections on port 1044.

OBS: you can encapsulate the above command in a Maven profile:

<profile>
	<id>jenkins</id>
	<!-- profile default properties; can be overridden with -Dproperty-name=value -->
	<properties>
		<arquillian.remote>true</arquillian.remote>
		<arquillian.serverHome>/opt/jboss-eap</arquillian.serverHome>
		<arquillian.seleniumGrid>http://remote-machine-ip:1044/wd/hub</arquillian.seleniumGrid>
		<arquillian.localAddress>jenkins-ip</arquillian.localAddress>
	</properties>
</profile>

Another important aspect when running tests in continuous integration is where the container lives. You can either have a container installation on the CI machine and point the jboss-managed profile to that installation, or use the container as a Maven dependency as we explained here.

Yet another addendum: when you have multiple projects using Arquillian tests and these tests run concurrently on CI, you may get conflicts because Arquillian will try to start multiple containers on the same machine. One way to solve that on JBoss AS is using a port offset in arquillian.xml, as shown below:

	<container qualifier="jboss-managed" default="true">
		<configuration>
			<property name="jbossHome">${arquillian.serverHome}</property>
			<property name="outputToConsole">true</property>
			<property name="javaVmArguments">-Xmx512m -XX:MaxPermSize=512m -Djboss.bind.address=localhost
				-Djboss.socket.binding.port-offset=100
				-Djboss.management.native.port=9054
			</property>
			<property name="allowConnectingToRunningServer">false</property>
			<property name="managementPort">9154</property>
		</configuration>
	</container>

The parameters -Djboss.socket.binding.port-offset=100 and -Djboss.management.native.port=9054 work in conjunction with the managementPort property: the offset shifts the native management port, so Arquillian connects on 9054 + 100 = 9154.

You just need to configure different offsets in your applications so they can run concurrently on the CI server. For more details see [11].
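For example, a second project running on the same CI machine could use values like these (illustrative); the point is that the native port and the offset differ so every effective port stays unique, with managementPort equal to native port + offset:

```xml
<!-- second project: 9055 + 200 = 9255 -->
<property name="javaVmArguments">-Djboss.bind.address=localhost
	-Djboss.socket.binding.port-offset=200
	-Djboss.management.native.port=9055
</property>
<property name="managementPort">9255</property>
```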

DuplicateServiceException

A problem you may face from time to time is a DuplicateServiceException during the test deploy. It usually happens due to the dynamic datasource inclusion inside the test deployment. Arquillian generates an entry in standalone.xml (in the case of JBoss AS/WildFly) for each test deployment and undeploys it after test execution, but sometimes it can't undeploy it, so the next time you execute the test the datasource (added by the micro-deployment) will be registered again, resulting in the exception below:

Caused by: org.jboss.msc.service.DuplicateServiceException: Service jboss.data-source.java:jboss/datasources/ArchetypeTestDS is already registered

When that happens you must remove the stale deployment entry from standalone.xml in order to run the tests again.

Note that this exception won't happen when using the adapter as a Maven dependency, as we saw in the container profiles section.

Publishing test metrics

The generation of test metrics, like the percentage of passing/failing tests, the number of tests executed and code coverage, is done by the arquillian-jacoco extension.

JaCoCo works through bytecode instrumentation and is activated by a Maven plugin:

		<profile>
			<id>jacoco</id>
			<properties>
				<jacoco.version>0.7.0.201403182114</jacoco.version>
			</properties>
			<dependencies>
				<dependency>
					<groupId>org.jboss.arquillian.extension</groupId>
					<artifactId>arquillian-jacoco</artifactId>
					<scope>test</scope>
					<version>1.0.0.Alpha6</version>
				</dependency>
				<dependency>
					<groupId>org.jacoco</groupId>
					<artifactId>org.jacoco.core</artifactId>
					<scope>test</scope>
					<version>${jacoco.version}</version>
				</dependency>
			</dependencies>
			<build>
				<plugins>
					<plugin>
						<groupId>org.jacoco</groupId>
						<artifactId>jacoco-maven-plugin</artifactId>
						<version>${jacoco.version}</version>
						<executions>
							<execution>
								<goals>
									<goal>prepare-agent</goal>
								</goals>
							</execution>
							<execution>
								<id>report</id>
								<phase>prepare-package</phase>
								<goals>
									<goal>report</goal>
								</goals>
							</execution>
						</executions>
					</plugin>
				</plugins>
			</build>
		</profile>

It's also necessary to define the following properties in pom.xml (or pass them with the -D option on the Maven command line) so Sonar can read the JaCoCo reports:

   <properties>
        <sonar.core.codeCoveragePlugin>jacoco</sonar.core.codeCoveragePlugin>
        <sonar.dynamicAnalysis>reuseReports</sonar.dynamicAnalysis>
   </properties>

then you can use the command:

mvn clean install -Pjboss-managed -Pall-tests -Pjacoco

Note that you need to use 'install' so the report is generated in the target folder, and also note that the coverage report will only take white box tests into account.

Arquillian all dependencies

Arquillian All is an all-in-one Maven dependency for Arquillian. Its main objective is to make it easier for beginners to set up the Arquillian dependencies; if you have good knowledge of the Arquillian platform, preferably declare each dependency you need so you can leverage Arquillian's modularity. Here is a before/after arquillian-all pom.xml:

Before:

<dependencies>
        <!-- Test dependencies -->
       <dependency>
            <groupId>junit</groupId>
            <artifactId>junit</artifactId>
            <version>4.11</version>
            <scope>test</scope>
        </dependency>
 
        <!-- arquillian -->
        <dependency>
            <groupId>org.jboss.arquillian.junit</groupId>
            <artifactId>arquillian-junit-container</artifactId>
            <scope>test</scope>
        </dependency>
 
        <!--arquillian persistence(dbunit) -->
        <dependency>
            <groupId>org.jboss.arquillian.extension</groupId>
            <artifactId>arquillian-persistence-api</artifactId>
            <version>1.0.0.Alpha7</version>
            <scope>test</scope>
        </dependency>
        <dependency>
            <groupId>org.jboss.arquillian.extension</groupId>
            <artifactId>arquillian-persistence-dbunit</artifactId>
            <version>1.0.0.Alpha7</version>
            <scope>test</scope>
        </dependency>
 
        <!-- warp -->
 
        <dependency>
            <groupId>org.jboss.arquillian.extension</groupId>
            <artifactId>arquillian-warp</artifactId>
            <type>pom</type>
            <scope>test</scope>
            <version>${warp.version}</version>
        </dependency>
        <dependency>
            <groupId>org.jboss.arquillian.extension</groupId>
            <artifactId>arquillian-warp-jsf</artifactId>
            <version>${warp.version}</version>
 
        </dependency>
 
        <!-- shrinkWrap resolvers -->
        <dependency>
            <groupId>org.jboss.shrinkwrap.resolver</groupId>
            <artifactId>shrinkwrap-resolver-depchain</artifactId>
            <scope>test</scope>
            <type>pom</type>
        </dependency>
 
        <dependency>
            <groupId>org.jboss.arquillian.graphene</groupId>
            <artifactId>graphene-webdriver</artifactId>
            <type>pom</type>
            <scope>test</scope>
            <version>${version.graphene}</version>
        </dependency>
 
          <dependency>
             <groupId>org.jboss.arquillian.graphene</groupId>
             <artifactId>arquillian-browser-screenshooter</artifactId>
             <version>2.1.0.Alpha1</version>
             <scope>test</scope>
          </dependency>
 
         <!-- REST -->
 
         <dependency>
             <groupId>org.jboss.arquillian.extension</groupId>
             <artifactId>arquillian-rest-client-api</artifactId>
             <version>1.0.0.Alpha3</version>
         </dependency>
         <dependency>
             <groupId>org.jboss.arquillian.extension</groupId>
             <artifactId>arquillian-rest-client-impl-2x</artifactId>
             <version>1.0.0.Alpha3</version>
         </dependency>
 
         <dependency>
             <groupId>org.jboss.arquillian.extension</groupId>
             <artifactId>arquillian-rest-warp-impl-resteasy</artifactId>
             <version>1.0.0.Alpha3</version>
         </dependency>
 
        <!-- arquillian bdd -->
 
         <!-- jbehave -->
        <dependency>
            <groupId>org.jboss.arquillian.jbehave</groupId>
            <artifactId>arquillian-jbehave-core</artifactId>
            <version>1.0.2</version>
            <scope>test</scope>
        </dependency>
 
        <dependency>
            <groupId>org.jboss.spec.javax.annotation</groupId>
            <artifactId>jboss-annotations-api_1.1_spec</artifactId>
            <version>1.0.1.Final</version>
            <scope>provided</scope>
        </dependency>
        <dependency>
            <groupId>org.jboss.spec.javax.ejb</groupId>
            <artifactId>jboss-ejb-api_3.1_spec</artifactId>
            <version>1.0.2.Final</version>
            <scope>provided</scope>
        </dependency>
        <dependency>
            <groupId>org.jboss.arquillian.protocol</groupId>
            <artifactId>arquillian-protocol-servlet</artifactId>
            <scope>test</scope>
        </dependency>
 
        <dependency>
            <groupId>org.apache.httpcomponents</groupId>
            <artifactId>httpcore</artifactId>
            <version>4.2.5</version>
            <scope>test</scope>
        </dependency>
        <dependency>
            <groupId>commons-collections</groupId>
            <artifactId>commons-collections</artifactId>
            <version>3.2.1</version>
        </dependency>
        <dependency>
            <groupId>xml-apis</groupId>
            <artifactId>xml-apis</artifactId>
            <version>1.4.01</version>
            <scope>test</scope>
        </dependency>
 
        <dependency>
            <groupId>org.slf4j</groupId>
            <artifactId>slf4j-log4j12</artifactId>
            <version>1.7.5</version>
            <scope>test</scope>
        </dependency>
 
        <dependency>
            <groupId>org.codehaus.jackson</groupId>
            <artifactId>jackson-core-lgpl</artifactId>
            <version>1.9.13</version>
            <scope>test</scope>
        </dependency>
 
    </dependencies>
 
    <dependencyManagement>
        <dependencies>
            <dependency>
                <groupId>org.jboss.arquillian</groupId>
                <artifactId>arquillian-bom</artifactId>
                <version>${version.arquillian}</version>
                <type>pom</type>
                <scope>import</scope>
            </dependency>
            <dependency>
                <groupId>org.jboss.arquillian.selenium</groupId>
                <artifactId>selenium-bom</artifactId>
                <version>${version.selenium}</version>
                <type>pom</type>
                <scope>import</scope>
            </dependency>
            <dependency>
                <groupId>org.jboss.arquillian.extension</groupId>
                <artifactId>arquillian-drone-bom</artifactId>
                <version>${version.drone}</version>
                <type>pom</type>
                <scope>import</scope>
            </dependency>
        </dependencies>
    </dependencyManagement>

After:

<dependencies>
        <!-- Test dependencies -->
       <dependency>
            <groupId>junit</groupId>
            <artifactId>junit</artifactId>
            <version>4.11</version>
            <scope>test</scope>
        </dependency>
 
        <!-- arquillian -->
         <dependency>
            <groupId>org.jboss.arquillian</groupId>
            <artifactId>arquillian-all</artifactId>
            <version>1.0.1</version>
            <scope>test</scope>
        </dependency>
        <dependency>
            <groupId>xml-apis</groupId>
            <artifactId>xml-apis</artifactId>
            <version>1.4.01</version>
            <scope>test</scope>
        </dependency>
        <dependency>
            <groupId>org.jboss.shrinkwrap</groupId>
            <artifactId>shrinkwrap-api</artifactId>
            <version>1.2.2</version>
          <scope>test</scope>
    </dependency>
 
</dependencies>

Functional tests on each commit?

As functional tests tend to be slower than white box tests, it's impractical to execute them on each commit in continuous integration. One way around this is to create a dedicated Jenkins job to run the black box tests (the ft-tests profile we showed before) and schedule it to run at the end of the day, for example.
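In Jenkins this amounts to a job that runs only the ft-tests profile on a schedule; a possible setup (values illustrative):

```
# "Build periodically" schedule: once a night around 22:00
H 22 * * *

# build step executed by the job
mvn clean test -Pjboss-managed -Pft-tests
```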

Another possibility is to execute the functional tests in a headless browser [8] so the execution time is faster and viable on each commit (or at least every 10 minutes).

PhantomJS is the most recommended for this kind of test; it simulates a web browser through a headless WebKit engine, scriptable with JavaScript.

To execute tests using PhantomJS in our sample application, just activate the 'webdriver-phantomjs' profile or pass -Darquillian.browser=phantomjs on the Maven command line.

Another headless WebDriver option is HtmlUnit, also supported by Drone.

Prime arquillian

Prime-Arquillian is an open source project which aims at testing the PrimeFaces showcase. The output of this project should be Arquillian Graphene page fragments representing PrimeFaces components, to ease functional testing of PrimeFaces based applications.

The project is at an early stage and has a draft of the PrimeFaces datatable fragment, which can be found in the components module.

The alien build pipeline

A good test suite is the basis of a deployment pipeline where we can promote a build across multiple stages; this process is well explained by Martin Fowler here.

For our application I've created a simple pipeline with only automated steps. It uses the Arquillian tests of this post in a build pipeline orchestrated by Jenkins, with metrics published to Sonar, ending with a version deployed to the WildFly application server.

I have not detailed the pipeline here so as not to extend this post even more; instead I've published a video on YouTube. If you have any question, you can post a comment here.

Conclusion

In this article we saw a powerful testing platform that facilitates the development of tests in the JavaEE ecosystem, and we saw it working in practice on a JavaEE6 application.

We could see that Arquillian supports the most common kinds of tests and can easily be used in continuous integration. Its initial setup/configuration, mainly in existing projects, is not trivial, but once done it can bring numerous advantages.

References