Testing in a Vert.x environment

Vert.x is a lightweight, high performance application platform for the JVM that’s designed for modern mobile, web, and enterprise applications.

At least, according to the Vert.x homepage. Vert.x provides a communication infrastructure through which multiple, typically smaller and simpler, applications can communicate with one another. As a result it offers scalability and concurrency in a relatively simple framework. Each application can be comprehended more easily on its own, because each application handles events sequentially. With this approach, the developer no longer has to worry about concurrency issues. At least, this is the case within a single such application, also called a Verticle.

Vert.x is a framework that facilitates reactive programming. However, while Vert.x and reactive programming sound nice in theory, we still need to test our applications. With Vert.x we create a whole new infrastructure layer beneath our applications. Now we will have to work with Vert.x in order to do reliable and representative testing.

Unit testing

As we all know, in unit tests we take the smallest unit and test it in isolation. Even though Vert.x is now part of the application’s infrastructure, unit testing still means that you test an individual unit. This means that Vert.x can be either ignored or mocked out, whichever applies.

In order to test the pieces of code that directly make use of Vert.x facilities, you can mock the required Vert.x classes, such as Vertx or EventBus. For example, you can use the mocking framework Mockito to mock the EventBus and script a few prepared responses. Afterwards, you can verify that the correct methods have been called with the required arguments. These mocking facilities should be enough to confirm that the unit behaves as you would expect.
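As a sketch of this approach: the Pinger class below is a hypothetical unit under test, invented for illustration; only EventBus and the Mockito calls are real APIs.

```java
import static org.mockito.Mockito.*;

import org.junit.Test;
import org.vertx.java.core.eventbus.EventBus;

public class PingerTest {

    // Hypothetical unit under test: a helper that pings over the event bus.
    static class Pinger {
        private final EventBus eventBus;

        Pinger(EventBus eventBus) {
            this.eventBus = eventBus;
        }

        void ping() {
            eventBus.send("ping-address", "ping!");
        }
    }

    @Test
    public void sendsPingOnTheExpectedAddress() {
        // Mock out the Vert.x facility; no Vert.x infrastructure is started.
        EventBus eventBus = mock(EventBus.class);

        new Pinger(eventBus).ping();

        // Verify the correct method was called with the required arguments.
        verify(eventBus).send("ping-address", "ping!");
    }
}
```

No event loop, no deployment: the unit is exercised entirely in isolation, which is exactly what a unit test should do.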

Since Vert.x is part of your application’s infrastructure, though, what really matters is integration testing.

Integration testing facilities

While looking into ways of doing integration testing, I encountered at least three different approaches.

The first is vertx-junit-annotations. This project was last updated around the start of 2013; its approach is outdated and has since been abandoned.

The second is vertx-test-framework. This project has officially been deprecated.

Finally, there is vertx-testtools. Vert.x Test Tools is currently the recommended tool set, and it is the method we will look at now.

Integration testing with Vert.x Test Tools

Vert.x provides infrastructure. If you make liberal use of Vert.x facilities, you can make your applications more modular. However, you then need to test the interoperability between these modules and verticles. For that, Vert.x provides vertx-testtools: a very minimal set of tools that does a wonderful job of providing (almost) everything you need for integration testing with Vert.x.

Vert.x’s Ping-Pong example

Let’s have a look at Vert.x’s own ping-pong example. Below is the PingVerticle implementation. For the original source, please refer to the Maven archetype which contains this example, and also contains the license and other information.
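A sketch of that verticle, based on the Vert.x 2 API; the shape follows the archetype’s example, though the exact log messages here are assumptions.

```java
import org.vertx.java.core.Handler;
import org.vertx.java.core.eventbus.Message;
import org.vertx.java.platform.Verticle;

public class PingVerticle extends Verticle {

    @Override
    public void start() {
        // Reply "pong!" to every message received on "ping-address".
        vertx.eventBus().registerHandler("ping-address", new Handler<Message<String>>() {
            @Override
            public void handle(Message<String> message) {
                message.reply("pong!");
                container.logger().info("Sent back pong");
            }
        });
        container.logger().info("PingVerticle started");
    }
}
```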

This is our example verticle. This particular verticle responds to every message on the “ping-address” topic with a “pong!” and logs its activity. Now to do actual integration tests, we need to contact the PingVerticle via the Vert.x infrastructure as you would with other verticles.

Let’s have a look at how you would test the interaction with this verticle. I have removed some parts from the original example file for brevity as they are not relevant for this explanation.
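A sketch of such a test, modeled on the archetype’s integration test; the structure (extend TestVerticle, override start(), call initialize() and startTests()) comes from vertx-testtools, while the test class name is my own.

```java
import org.junit.Test;
import org.vertx.java.core.AsyncResult;
import org.vertx.java.core.AsyncResultHandler;
import org.vertx.java.core.Handler;
import org.vertx.java.core.eventbus.Message;
import org.vertx.testtools.TestVerticle;

import static org.vertx.testtools.VertxAssert.*;

public class PingVerticleTest extends TestVerticle {

    @Test
    public void testPingPong() {
        // The test method only initiates the interaction; the assertion and
        // testComplete() live in the asynchronous reply handler.
        vertx.eventBus().send("ping-address", "ping!", new Handler<Message<String>>() {
            @Override
            public void handle(Message<String> reply) {
                assertEquals("pong!", reply.body());
                testComplete();
            }
        });
    }

    @Override
    public void start() {
        // Initialize the Vert.x test infrastructure first.
        initialize();
        // Deploy the verticle under test; only start the tests once it is up.
        container.deployVerticle(PingVerticle.class.getName(), new AsyncResultHandler<String>() {
            @Override
            public void handle(AsyncResult<String> asyncResult) {
                assertTrue(asyncResult.succeeded());
                startTests();
            }
        });
    }
}
```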

Let’s have a look at the various pieces of code. I am a strong believer of actually looking at an implementation as part of trying to understand it, so I will include links to sources.

Integration tests inherit from TestVerticle

TestVerticle is a specially prepared class that takes some measures to make it possible to start testing modules and verticles:

  1. It is annotated with a specialized JUnit test runner designed for launching the Vert.x infrastructure and running the tests there. As a result we can simply run integration tests as we would do other JUnit unit tests and have everything set up for us. We can even debug our tests from within the IDE.
  2. It provides the method “initialize()”, which is the way to initialize the Vertx instance that we need for our tests to access the Vert.x (test) infrastructure.
  3. It provides a method “startTests()” that we call after having done our test initialization to actually start running tests.

TestVerticle is itself just another verticle and is run as one. As such, start() is the method that is called when the verticle is started. This is where you do your module or verticle deployments and any other initialization. As explained, we should always begin with initialize() in order to initialize the Vert.x infrastructure and, when all initialization is done, call startTests() to start testing.

A consequence of running tests as a verticle is that certain JUnit annotations can no longer be used, most notably the lifecycle annotations @Before, @After, @BeforeClass and @AfterClass; their role is taken over by the verticle’s start() method.

See also Vert.x documentation on Java Integration Tests.

Writing (asynchronous) tests

Tests are identified by the well-known org.junit.Test annotation, a.k.a. @Test. When startTests() is called, it will run each of the annotated tests.

Our goal is integration testing. For integration testing we are interacting with independently functioning verticles that we need to test. Now, remember that our test case is just another verticle? This means that it takes on the same characteristics: the verticle runs in its own thread, events are handled in sequence, and while one event is being handled, all other events are queued up, waiting to be processed. Consequently, we cannot write the complete test in the test method. The test method is only suitable for setting up handlers and timers, and possibly for initiating the test by sending the first message(s).

Test assertions are executed asynchronously, for example in a registered message handler, reply handler or timed/periodic task. Keep in mind that verticles handle events sequentially. That means that you cannot wait for some result at the end of your test body. Doing so would mean that you block all other events for your TestVerticle implementation, which basically prevents any interaction with the test verticle and defeats the purpose of integration testing.

Because of the asynchronous nature of these tests, there are a few things to keep in mind while writing tests. The class VertxAssert contains all of these necessary utilities.

  1. *Every test should be completed with VertxAssert.testComplete().*

    VertxAssert.testComplete() informs the test runner that the test has completely finished, i.e. across all of its asynchronous components. This is the actual signal for the end of a test. The end of the @Test-annotated test method is not the end of the test, as it would be for typical unit tests; the test’s body finishes earlier.

    Note that it does not make sense to call VertxAssert.testComplete() from within the test method itself, since there could not have been any (bi-directional) interaction yet.

  2. Assertions do not have the same effect as they would in normal tests. In a normal test, an assertion would throw an AssertionError in case of a failed assertion. Doing this in a verticle will just cause the verticle to stop handling the event prematurely and continue with the next event.

    VertxAssert.handleThrowable(Throwable t) solves this by propagating the error to the test runner, which is the component actually interested in the AssertionError. This can be used for complex assertions, but it does require some effort to set up, since we need to catch the AssertionError and pass it on to handleThrowable.

    VertxAssert contains almost all of the common assertions, as well as fail() itself. By (statically) importing VertxAssert instead of org.junit.Assert, you can again write assertions as you are used to even though tests are asynchronous by nature.
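The handleThrowable pattern from point 2 can be sketched as follows, inside a TestVerticle; checkReplyInDetail is a hypothetical helper standing in for complex assertion logic.

```java
import org.vertx.java.core.Handler;
import org.vertx.java.core.eventbus.Message;
import org.vertx.testtools.VertxAssert;

// Fragment from a @Test method body:
vertx.eventBus().send("ping-address", "ping!", new Handler<Message<String>>() {
    @Override
    public void handle(Message<String> reply) {
        try {
            // Complex, possibly custom assertion logic goes here
            // (checkReplyInDetail is hypothetical).
            checkReplyInDetail(reply.body());
            VertxAssert.testComplete();
        } catch (Throwable t) {
            // Without this, the AssertionError would merely abort this one
            // event handler; the test runner would never hear about it.
            VertxAssert.handleThrowable(t);
        }
    }
});
```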

As you will have noticed in my description of the VertxAssert methods, they communicate with the test runner. Since verticles are deployed in their own thread, the test runner is running in a different thread than your tests. So, instead of expressing failure with an assertion error or success by running the test to completion, we need to signal the test runner for success or failure, once this can be decided.

Now, if you take everything into account:

  1. tests are only set up and possibly initiated in the test method,
  2. assertions are called from within handlers and/or timers,
  3. and finally you need to end the test by calling testComplete() from somewhere other than the test’s body.

You can imagine that for non-trivial tests it may take a bit of bookkeeping to determine, in all cases, when the test has completely finished. This is basically the price you pay for setting up integration tests in Vert.x, at least with Vert.x Test Tools. It does force you to think about the interaction, though, which I think is a good thing.
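As a sketch of such bookkeeping, assuming a verticle that replies "pong!" on "ping-address" as in the ping-pong example (its deployment in start() is omitted for brevity), a test can count down the expected replies and only then complete:

```java
import java.util.concurrent.atomic.AtomicInteger;

import org.junit.Test;
import org.vertx.java.core.Handler;
import org.vertx.java.core.eventbus.Message;
import org.vertx.testtools.TestVerticle;

import static org.vertx.testtools.VertxAssert.*;

public class MultiPingTest extends TestVerticle {

    @Test
    public void testThreePings() {
        final int expected = 3;
        // Verticles handle events sequentially, so a plain counter would do;
        // AtomicInteger is used here simply as a mutable, final-capturable int.
        final AtomicInteger remaining = new AtomicInteger(expected);

        Handler<Message<String>> replyHandler = new Handler<Message<String>>() {
            @Override
            public void handle(Message<String> reply) {
                assertEquals("pong!", reply.body());
                // Only the handler processing the last expected reply may
                // signal the end of the test.
                if (remaining.decrementAndGet() == 0) {
                    testComplete();
                }
            }
        };

        for (int i = 0; i < expected; i++) {
            vertx.eventBus().send("ping-address", "ping!", replyHandler);
        }
    }
}
```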

Test runner internals

Now that we know how to use VertxAssert, let’s make a small side-step into test runner internals.

As I said before, Vert.x Test Tools provides a special JUnit test runner: JavaClassRunner. The test runner sets up an embedded Vert.x environment, starting with the PlatformManager, which is already created in the constructor. Incidentally, this is a nice example of how to embed Vert.x inside your own Java application. (Even though this is not the recommended way of using Vert.x; refer to the Vert.x documentation for more information.) Note that this PlatformManager is instantiated for local use only: it does not provide the necessary parameters for setting up clustering.

Then all the tests (annotated with @Test) are run, one by one. First, an event bus is set up and a special handler is registered that will handle the success or failure message for the Vert.x test runner. Then the test verticle is deployed; this is the verticle inside which the tests will be executed. If the deployment fails, an error is registered; otherwise the runner waits until the test finishes.

This is where the handler comes into play. The test runner registers a handler on the topic “vertx.testframework.handler”. A failure message means that the transferred, serialized exception is registered as the cause of the failure; a success message simply means that the test has finished. The runChild method of the test runner waits until a result is handled, then continues. This also means that a badly constructed test can run for a very long time … Fortunately, there is a timeout, which defaults to 5 minutes. A custom timeout can be set via the system property vertx.test.timeout (e.g. -Dvertx.test.timeout=30); the value must be a number and is interpreted as the number of seconds to wait before the test times out.

Once the event is received and the result is known, the test runner continues its execution. The test is done, so the next steps are to unregister the handler and undeploy the test verticle. Then, finally, the test result is registered in the (JUnit) notifier and a single test is done. In case something goes wrong during deployment or undeployment, this is also registered as a test failure.

TestUtils for random stuff

VertxAssert contains all of the important assertion methods. TestUtils, on the other hand, contains some random utilities, such as generators for random buffers and strings (for example generateRandomBuffer, randomUnicodeString and randomAlphaString).

Given that Vert.x is centered around messaging between verticles, these can come in useful, for example to test how your verticles respond to bad messages.
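A sketch of that idea, assuming TestUtils.generateRandomBuffer as found in vertx-testtools; the address and the expected reaction to garbage input depend entirely on the verticle under test.

```java
import org.vertx.java.core.Handler;
import org.vertx.java.core.buffer.Buffer;
import org.vertx.java.core.eventbus.Message;
import org.vertx.testtools.TestUtils;

import static org.vertx.testtools.VertxAssert.*;

// Fragment from a @Test method body: fire 100 random bytes at the verticle
// under test and assert whatever it promises for malformed input.
Buffer garbage = TestUtils.generateRandomBuffer(100);
vertx.eventBus().send("some-address", garbage, new Handler<Message<String>>() {
    @Override
    public void handle(Message<String> reply) {
        assertNotNull(reply.body());
        testComplete();
    }
});
```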


We have now discussed most of Vert.x Test Tools. It enables you to do unit test-like integration testing within your IDE (even with debugging support), with Maven, with Gradle, or however you prefer to run JUnit. It provides exactly what you need to add the Vert.x piece to the puzzle and asks little in return; most notably, you need to construct your tests differently. There are some improvements possible, though none of them serious enough to make integration testing unacceptably difficult.




  1. https://groups.google.com/forum/#!topic/vertx/GgS5EagrohQ
  2. https://github.com/vert-x/testtools/tree/master/src/main/java/org/vertx/testtools
  3. http://vertx.io/dev_guide.html#java-integration-tests
  4. http://vertx.io/maven_dev.html