Tests are an integral part of the development process. No feature or functionality can be considered done without a set of tests implementing scenarios that verify the implementation works according to its specification. When it comes to functional or integration tests, more often than not they depend on some infrastructure: a database, a messaging queue, a distributed cache, and so on.
Usually, when we want to create integration tests (e.g. to test the persistence layer), it is convenient to load up an in-memory database for that purpose. Although this is easy to set up and get started with, it carries some drawbacks and challenges.
If tests pass against the in-memory database, that does not necessarily mean the tested code will work correctly against the production database, because vendor-specific features of the production database may simply not be supported by the in-memory one. For example, Postgres has the JSONB type, which is not supported by H2 or HSQLDB. One route around this is to implement workarounds, or simply not cover that part with tests, but neither is ideal, to say the least: we end up rewriting code to accommodate tests, or risk leaving code untested. Another example that comes to mind is the virtual column in Oracle and MySQL databases. Those are just a few examples related to relational databases; in a hybrid setup with a NoSQL database, a messaging queue, a cache and so on, things become challenging really fast.
One option to overcome these issues is to run our tests against a specific production-like environment. Since such an environment is probably already present as part of the deployment and delivery process, it seems like a solid option for testing needs. But there are drawbacks. The main one is that we cannot easily run tests from a local development environment (additional configuration is needed, for example). The environment might not always be available (it might be in use for performance testing, or it might simply be down). And when we deploy our application and tests onto the environment and run the tests there, the feedback loop slows down, so errors are detected and fixed much later.
Testcontainers use case
Luckily, there is an easier way to manage all these challenges: Testcontainers.
Examples in this blog post are implemented using Java 8, Spring Boot, Spring Boot Test, JUnit, Testcontainers (with the latest Postgres image), Gradle, and Liquibase. Basic knowledge of these technologies is assumed, but the core principles and ideas presented here should be applicable when using Testcontainers with any other technology stack.
There are two ways to use Testcontainers: managed by Java test code, or managed by build tools like Gradle. The approach described here focuses on Java code usage. The second approach is outside the scope of this post, so for more info check the official documentation and the examples provided by the community.
The first step is to add the dependency to the project. For this example, we'll use the org.testcontainers:postgresql dependency, which is specialized for the Postgres docker container. There is also a generic testcontainers module that supports generic containers, docker-compose and more; for details, consult the usage documentation. The dependency is added to the build.gradle file:
testImplementation('org.testcontainers:postgresql:1.7.1') // Note: this is the version at the time of writing; the latest artifact version should be used.
Now that we have Testcontainers on our classpath, the easiest way to initialize our Postgres container is to create an instance like so:
final PostgreSQLContainer postgreSQLContainer = new PostgreSQLContainer();
This will initialize the Postgres container with sensible defaults (check the PostgreSQLContainer class for details). After that, we can start the container by invoking postgreSQLContainer.start();.
We could also make the instance public and static and mark it with @ClassRule, which will automatically start the container before the tests and stop it after all tests are executed.
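For example, a minimal JUnit 4 test class that lets the rule manage the container lifecycle could look like this (the class and test names are illustrative):
import org.junit.ClassRule;
import org.junit.Test;
import org.testcontainers.containers.PostgreSQLContainer;
import static org.junit.Assert.assertNotNull;

public class PostgresContainerLifecycleTest {
    // Started once before the first test in this class and stopped after the last one
    @ClassRule
    public static final PostgreSQLContainer postgreSQLContainer = new PostgreSQLContainer();

    @Test
    public void containerProvidesJdbcUrl() {
        // The container is already running here, so its JDBC URL is available
        assertNotNull(postgreSQLContainer.getJdbcUrl());
    }
}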
A container is started on a random port by default, to avoid potential conflicts. We can customize the username, password and database name during initialization:
private static final PostgreSQLContainer postgreSQLContainer = new PostgreSQLContainer()
        .withUsername("user01")
        .withPassword("pass01")
        .withDatabaseName("testDatabase");
but we cannot reconfigure the port. Also, the port is assigned when the container is started, so there is no way to know which port is going to be used before we actually start the container.
This presents a challenge, assuming the application has a datasource instance configured to communicate with the database. In that case, we also need to configure the datasource instance for our integration tests.
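To illustrate the constraint, the mapped port and the full JDBC URL can only be read from the container instance once it has been started:
final PostgreSQLContainer postgreSQLContainer = new PostgreSQLContainer();
postgreSQLContainer.start();
// Only known after start(): the host port mapped to Postgres' default port 5432
final Integer mappedPort = postgreSQLContainer.getMappedPort(PostgreSQLContainer.POSTGRESQL_PORT);
// JDBC URL already containing the dynamically assigned host and port
final String jdbcUrl = postgreSQLContainer.getJdbcUrl();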
One option here is to use a specialized container instance called FixedHostPortGenericContainer:
@ClassRule
public static FixedHostPortGenericContainer postgreSQLContainer =
        new FixedHostPortGenericContainer<>("postgres:latest")
                .withEnv("POSTGRES_USER", "testUser")
                .withEnv("POSTGRES_PASSWORD", "testPassword")
                .withEnv("POSTGRES_DB", "testDb")
                .withFixedExposedPort(60015, 5432); // host port 60015 mapped to Postgres' default port 5432
In the example above, we fixed the port to 60015, so now, before we start the container, we can configure our datasource instance using a JDBC connection string:
"jdbc:postgresql://"
+ DockerClientFactory.instance().dockerHostIpAddress()
+ ":60015/testDb";
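With the port fixed up front, this connection string can be used to prepare the datasource before the container is even started; a minimal sketch, assuming HikariCP as the connection pool:
final String jdbcUrl = "jdbc:postgresql://"
        + DockerClientFactory.instance().dockerHostIpAddress()
        + ":60015/testDb";
final HikariConfig config = new HikariConfig();
config.setJdbcUrl(jdbcUrl);
config.setUsername("testUser");
config.setPassword("testPassword");
config.setDriverClassName("org.postgresql.Driver");
// The datasource is ready before postgreSQLContainer.start() is ever called
final DataSource dataSource = new HikariDataSource(config);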
This approach is less than ideal because there is no guarantee that port 60015 will always be free and that we won't run into conflicts down the line. Taking that into account, we should leave the port assignment dynamic, but still somehow initialize the datasource instance and, with it, ideally the Liquibase instance that can be used to (re)create the database schema. This requires setting up the application context after the container has been started. For example, we can have the following test configuration class:
@TestConfiguration
public class TestRdbsConfiguration {
@Bean
public PostgreSQLContainer postgreSQLContainer() {
final PostgreSQLContainer postgreSQLContainer = new PostgreSQLContainer();
postgreSQLContainer.start();
return postgreSQLContainer;
}
@Bean
public DataSource dataSource(final PostgreSQLContainer postgreSQLContainer) {
// Datasource initialization (HikariCP is assumed here as an example; any pooled DataSource will do)
final HikariDataSource ds = new HikariDataSource();
ds.setJdbcUrl(postgreSQLContainer.getJdbcUrl());
ds.setUsername(postgreSQLContainer.getUsername());
ds.setPassword(postgreSQLContainer.getPassword());
ds.setDriverClassName(postgreSQLContainer.getDriverClassName());
// Additional parameters configuration omitted
return ds;
}
@Bean
public Liquibase liquibase(final DataSource dataSource) throws LiquibaseException, SQLException {
final Database database = DatabaseFactory.getInstance().findCorrectDatabaseImplementation(new JdbcConnection(dataSource.getConnection()));
return new Liquibase(Paths.get(".", PATH_TO_CHANGELOG_FILE)
        .normalize()
        .toAbsolutePath()
        .toString(), new FileSystemResourceAccessor(), database);
}
}
And then in our integration test class, we can do for example:
@RunWith(SpringRunner.class)
@SpringBootTest(classes = TestRdbsConfiguration.class,
webEnvironment = SpringBootTest.WebEnvironment.RANDOM_PORT)
public class SomeIntRdbsTest {
@Autowired
public PostgreSQLContainer postgreSQLContainer;
@Autowired
private Liquibase liquibase;
// Recreate the database schema before each test so no data interdependencies are introduced
@Before
public void before() throws LiquibaseException {
liquibase.dropAll();
liquibase.update(new Contexts());
}
// Test methods …
}
Using this approach is much better than hardcoding the port. The only thing to remember is to add the test configuration class to the @SpringBootTest annotation so it gets picked up. Another, similar approach came up after scouring the community for Testcontainers and Spring Boot testing topics. The idea is to create an application context initializer class which, after the container has been started, creates and configures the Liquibase and datasource beans. So, for example, we define a class:
public class LbAndDsInitializer implements
ApplicationContextInitializer<ConfigurableWebApplicationContext> {
public static final ThreadLocal<PostgreSQLContainer> PG_CONTAINER = ThreadLocal.withInitial(() -> null);
// We override initialize method:
@Override
public void initialize(ConfigurableWebApplicationContext applicationContext) {
final PostgreSQLContainer postgreSQLContainer = PG_CONTAINER.get();
try {
if (postgreSQLContainer != null) {
// We initialize the data source the same way as before
final DataSource dataSource = initializeDataSource(postgreSQLContainer);
applicationContext.getBeanFactory().registerSingleton("dataSource", dataSource);
// We initialize Liquibase the same way as before
final Liquibase liquibase = initializeLiquibase(dataSource);
applicationContext.getBeanFactory().registerSingleton("liquibase", liquibase);
}
} catch (LiquibaseException | SQLException e) {
// Do something with the exception
}
}
// initializeDataSource and initializeLiquibase helper methods are shown in the sketch below
}
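The two private helpers referenced above are omitted from the listing; they could look roughly like this, mirroring the bean definitions from the earlier test configuration class (HikariCP is again assumed for the pooled datasource, and PATH_TO_CHANGELOG_FILE is the same changelog path constant as before):
private DataSource initializeDataSource(final PostgreSQLContainer postgreSQLContainer) {
    final HikariDataSource ds = new HikariDataSource();
    ds.setJdbcUrl(postgreSQLContainer.getJdbcUrl());
    ds.setUsername(postgreSQLContainer.getUsername());
    ds.setPassword(postgreSQLContainer.getPassword());
    ds.setDriverClassName(postgreSQLContainer.getDriverClassName());
    return ds;
}

private Liquibase initializeLiquibase(final DataSource dataSource) throws LiquibaseException, SQLException {
    final Database database = DatabaseFactory.getInstance()
            .findCorrectDatabaseImplementation(new JdbcConnection(dataSource.getConnection()));
    return new Liquibase(Paths.get(".", PATH_TO_CHANGELOG_FILE)
            .normalize()
            .toAbsolutePath()
            .toString(), new FileSystemResourceAccessor(), database);
}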
As shown in this example, we initialize the datasource and Liquibase beans the same way as in the previous example; the difference is that we explicitly put the beans into the application context. In our test class, we now need to start the Postgres testcontainer before context initialization and pass it to our initializer class, so the configuration can be completed before the tests are executed. Before all that, we need to tell our test class which context initializer to use:
@ContextConfiguration(initializers = LbAndDsInitializer.class)
public class SomeIntRdbsTest
Then we create the Postgres testcontainer and define a test class rule that sets the container on our initializer:
@ContextConfiguration(initializers = LbAndDsInitializer.class)
public class SomeIntRdbsTest {
private static final PostgreSQLContainer postgreSQLContainer = new PostgreSQLContainer();
@ClassRule
public static TestRule exposePortMappings = RuleChain.outerRule(postgreSQLContainer)
        .around(SomeIntRdbsTest::apply);
private static Statement apply(Statement base, Description description) {
return new Statement() {
@Override
public void evaluate() throws Throwable {
LbAndDsInitializer.PG_CONTAINER.set(postgreSQLContainer);
base.evaluate();
}
};
}
}
Now, when the application context starts, our datasource and Liquibase beans are set up correctly, so we can access the database in the testcontainer.
Summary
Testcontainers present a very good option to quickly bring up the infrastructure needed for integration testing, giving more control to the developer. There are specialized container options for databases (Postgres, MySQL, Oracle, and Virtuoso), the Selenium driver, and others. If this is not enough, generic containers can be used, which can take an image from both public and private (some extra configuration is needed) docker repositories and can be customized for specific test needs. When using JUnit and Spring Test, make sure to leverage rules to automatically handle startup, stopping, and cleanup of containers. Use custom test configuration classes or initializers to configure beans or populate property values with testcontainer parameters. The ideas shown here for obtaining container parameters and initializing the necessary beans and application context are just a few patterns; I'm sure there are other ways to achieve a similar result.
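As a small illustration of the generic option mentioned above, a generic container can be started from an arbitrary image and configured for the test at hand (the Redis image and port below are just an example):
@ClassRule
public static GenericContainer redis = new GenericContainer("redis:latest")
        .withExposedPorts(6379);
// The test can then connect via redis.getContainerIpAddress() and redis.getMappedPort(6379)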
Links and references
• Official testcontainers documentation – https://www.testcontainers.org/