In part 2 of this post, we will refactor the application from part 1 in order to use a database. We will take a short look at the database choices we have in combination with Spring WebFlux, use an embedded version of the chosen database, refactor the sources, and solve the problems we encounter along the way. The code can be found on GitHub in the branch mongodb.
Select a database
When we take a look at Spring Initializr, we notice that only NoSQL databases have reactive support. At the time of writing, these are:
- Reactive Redis
- Reactive MongoDB
- Reactive Cassandra
- Reactive Couchbase
Why do only NoSQL databases have reactive support and not the traditional relational databases? The answer is quite simple: relational databases are accessed through JDBC, and JDBC is a blocking API. On top of that, database transactions hold on to resources while they run, which is not very reactive either. There are initiatives to develop asynchronous drivers for relational databases, but for now only NoSQL databases are officially supported because they fit best in the reactive world. As a consequence, there is no reactive support for Hibernate or JPA.
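To make the difference concrete, compare the signature of a blocking JDBC-style lookup with a reactive one. This is purely an illustrative sketch (the method name is made up; only the Show type comes from our application):

```java
// Illustrative sketch only, not real API of any driver.
// A JDBC-backed repository blocks the calling thread until the
// entire result set has been fetched from the database:
List<Show> findAllShows();   // blocking: the thread waits on the driver

// A reactive repository returns immediately; each Show is pushed to the
// subscriber as the driver delivers it, with backpressure support:
Flux<Show> findAllShows();   // non-blocking: nothing happens until subscribe
```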
In our example, we will choose Reactive MongoDB, more specifically the embedded version, which allows us to run and test the application without having to install MongoDB.
In the next sections, we will refactor in small steps and check after each step whether the application still compiles and starts without errors.
Adapt pom.xml
First of all, we remove the dependency on spring-data-commons, which we had added in order to make use of the ReactiveCrudRepository interface:
```xml
<dependency>
    <groupId>org.springframework.data</groupId>
    <artifactId>spring-data-commons</artifactId>
    <version>2.0.5.RELEASE</version>
</dependency>
```
Next, we add a dependency for Reactive MongoDB which includes spring-data-mongodb and the reactive driver:
```xml
<dependency>
    <groupId>org.springframework.boot</groupId>
    <artifactId>spring-boot-starter-data-mongodb-reactive</artifactId>
</dependency>
```
In order to make use of embedded MongoDB, we add the dependency on Embedded MongoDB:
```xml
<dependency>
    <groupId>de.flapdoodle.embed</groupId>
    <artifactId>de.flapdoodle.embed.mongo</artifactId>
    <scope>runtime</scope>
</dependency>
```
Run the application with spring-boot:run; it still starts successfully.
Adapt the repository
For the repository, we can now make use of the ReactiveMongoRepository interface. Our code was the following:
```java
public class ReactiveShowRepository implements ReactiveCrudRepository<Show, String> {
    // a lot of code
}
```
And now becomes:
```java
@Repository
public interface ReactiveShowRepository extends ReactiveMongoRepository<Show, String> {
}
```
So, we just extend the interface and automagically our repository works? Yes, it does. At runtime, Spring Boot plugs in an implementation of ReactiveShowRepository based on the class SimpleReactiveMongoRepository, which provides the most common CRUD methods.
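With the repository in place, the handler from part 1 can simply delegate to it. A minimal sketch, assuming a functional-style handler as in part 1 (the class name ShowHandler and the method name getAllShows are assumptions):

```java
@Component
public class ShowHandler {

    private final ReactiveShowRepository repository;

    public ShowHandler(ReactiveShowRepository repository) {
        this.repository = repository;
    }

    // findAll() is one of the CRUD methods provided automatically
    // by SimpleReactiveMongoRepository at runtime.
    public Mono<ServerResponse> getAllShows(ServerRequest request) {
        return ServerResponse.ok()
                .contentType(MediaType.APPLICATION_JSON)
                .body(repository.findAll(), Show.class);
    }
}
```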
We also added the following to the module-info.java:
requires spring.data.mongodb;
Run the application: the URL http://localhost:8080/shows returns nothing, as expected, because the database is still empty.
Adapt the Show domain object
We annotate our Show domain object as a Document and remove the constructor we added earlier.
```java
@Document
public class Show {

    @Id
    private String id;
    private String title;
    ...
}
```
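For reference, a complete version of the annotated domain object could look like the sketch below. This assumes the usual no-args-constructor-plus-setters style so that both Spring Data MongoDB and the JSON serialization can populate the object; the id is left null and generated by MongoDB on save:

```java
@Document
public class Show {

    @Id
    private String id;      // generated by MongoDB when the document is saved
    private String title;

    public String getId() {
        return id;
    }

    public void setId(String id) {
        this.id = id;
    }

    public String getTitle() {
        return title;
    }

    public void setTitle(String title) {
        this.title = title;
    }
}
```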
Fill the database
Everything is up and running now, but we need some data in our database. To that end, I will create a class DataImportConfiguration, as demonstrated in the Spring Boot webinar by Phil Webb. This class reads a YAML file containing the data, converts it into properties, and binds those properties to a list of Show objects. When the application starts, the data is inserted into the database.
Because we are using the YamlPropertiesFactoryBean, we also add a dependency on spring.beans to our module-info.java.
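Putting the module changes together, our module-info.java now looks roughly like this. The module name and the exact set of requires carried over from part 1 are assumptions; only spring.data.mongodb and spring.beans are additions made in this post:

```java
module my.spring.webflux.crud.planet {
    requires spring.core;
    requires spring.beans;             // added for YamlPropertiesFactoryBean
    requires spring.context;
    requires spring.web;
    requires spring.webflux;
    requires spring.boot;
    requires spring.boot.autoconfigure;
    requires spring.data.commons;
    requires spring.data.mongodb;      // added for ReactiveMongoRepository
}
```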
The DataImportConfiguration class is the following:
```java
@Configuration
public class DataImportConfiguration {

    @Bean
    public CommandLineRunner initData(MongoOperations mongo) {
        return (String... args) -> {
            mongo.dropCollection(Show.class);
            mongo.createCollection(Show.class);
            getShows().forEach(mongo::save);
        };
    }

    private List<Show> getShows() {
        Properties yaml = loadShowsYaml();
        MapConfigurationPropertySource source = new MapConfigurationPropertySource(yaml);
        return new Binder(source).bind("shows", Bindable.listOf(Show.class)).get();
    }

    private Properties loadShowsYaml() {
        YamlPropertiesFactoryBean properties = new YamlPropertiesFactoryBean();
        properties.setResources(new ClassPathResource("shows.yml"));
        return properties.getObject();
    }
}
```
We create the following shows.yml file in our resources directory:
```yaml
shows:
  - title: "Title 1"
  - title: "Title 2"
  - title: "Title 3"
  - title: "Title 4"
  - title: "Title 5"
```
Run the application and invoke http://localhost:8080/shows. The following output is shown in our browser:
```json
[
  {"id": "5aad24d7c568b82764d592e8", "title": "Title 1"},
  {"id": "5aad24d7c568b82764d592e9", "title": "Title 2"},
  {"id": "5aad24d7c568b82764d592ea", "title": "Title 3"},
  {"id": "5aad24d7c568b82764d592eb", "title": "Title 4"},
  {"id": "5aad24d7c568b82764d592ec", "title": "Title 5"}
]
```
Invoke the URL http://localhost:8080/shows/5aad24d7c568b82764d592e8 in order to retrieve the data for a single show; the output is:
```json
{
  "id": "5aad24d7c568b82764d592e8",
  "title": "Title 1"
}
```
And we can invoke the URL for retrieving the events for a show: http://localhost:8080/shows/5aad24d7c568b82764d592e8/events. Every second, an event is added to the output:
```
data:{"id":"5aad24d7c568b82764d592e8","date":1521296867227}
data:{"id":"5aad24d7c568b82764d592e8","date":1521296868263}
data:{"id":"5aad24d7c568b82764d592e8","date":1521296869297}
data:{"id":"5aad24d7c568b82764d592e8","date":1521296870307}
data:{"id":"5aad24d7c568b82764d592e8","date":1521296871335}
```
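The stream itself is unchanged from part 1; the one-event-per-second cadence presumably comes from something along these lines (a sketch with assumed names; ShowEvent and the method signature are illustrative):

```java
// Sketch: emit one ShowEvent per second for the requested show id.
// Flux.interval produces an ever-increasing tick every second, which we
// map to a fresh event carrying the show id and the current timestamp.
public Flux<ShowEvent> events(String showId) {
    return Flux.interval(Duration.ofSeconds(1))
            .map(tick -> new ShowEvent(showId, new Date()));
}
```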
At this point, we have refactored our application to use embedded MongoDB, and our endpoints still work as expected.
Adapt the unit test
The last thing to check is whether our unit test still works. Run the unit test MySpringWebfluxCrudPlanetApplicationTests from within your IDE. I expected the test to still pass, since we did not change any application logic. However, 100 errors were displayed :-o. Below is a snippet; all errors were similar to the ones shown:
```
Error:java: the unnamed module reads package com.mongodb from both mongodb.driver.core and mongodb.driver
Error:java: the unnamed module reads package com.mongodb.client from both mongodb.driver.core and mongodb.driver
...
```
After some investigation, it turns out that these errors occur because we are using Java 9 modules. Disable the use of Java 9 modules by renaming module-info.java to module-info.java_ and run the test again: it now passes.
What happened? The cause of the error is a so-called 'split package': a package with the same name exists in two different modules. This is not allowed between named modules; only the unnamed module (the classpath) tolerates split packages.
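As a hypothetical illustration (all names made up), the following two module declarations contain and export the same package, so any module that reads both would hit the same kind of split-package error:

```java
// module-a/src/module-info.java
module module.a {
    exports com.example.shared;   // the package lives in this module...
}

// module-b/src/module-info.java
module module.b {
    exports com.example.shared;   // ...and also in this one: a split package
}

// A consumer requiring both modules fails to compile, because the
// package com.example.shared would be read from two different modules.
```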
Let’s take a closer look at the problem. The mongodb-driver-3.6.3.jar contains the following packages. Notice that this jar does not have a module descriptor, so an automatic module is derived:
```
No module descriptor found. Derived automatic module.

mongodb.driver@3.6.3 automatic
requires java.base mandated
contains com.mongodb
contains com.mongodb.client
contains com.mongodb.client.gridfs
contains com.mongodb.client.jndi
contains com.mongodb.client.model
contains com.mongodb.gridfs
contains com.mongodb.util
contains org.bson
contains org.bson.io
contains org.bson.types
contains org.bson.util
```
The output for the mongodb-driver-core-3.6.3.jar is similar:
```
No module descriptor found. Derived automatic module.

mongodb.driver.core@3.6.3 automatic
requires java.base mandated
contains com.mongodb
contains com.mongodb.annotations
contains com.mongodb.assertions
contains com.mongodb.async
contains com.mongodb.binding
contains com.mongodb.bulk
contains com.mongodb.client
contains com.mongodb.client.gridfs.codecs
contains com.mongodb.client.gridfs.model
contains com.mongodb.client.model
contains com.mongodb.client.model.changestream
contains com.mongodb.client.model.geojson
contains com.mongodb.client.model.geojson.codecs
contains com.mongodb.client.result
contains com.mongodb.connection
contains com.mongodb.connection.netty
contains com.mongodb.diagnostics.logging
contains com.mongodb.event
contains com.mongodb.internal
contains com.mongodb.internal.async
contains com.mongodb.internal.authentication
contains com.mongodb.internal.connection
contains com.mongodb.internal.dns
contains com.mongodb.internal.event
contains com.mongodb.internal.management.jmx
contains com.mongodb.internal.session
contains com.mongodb.internal.thread
contains com.mongodb.internal.validator
contains com.mongodb.management
contains com.mongodb.operation
contains com.mongodb.selector
contains com.mongodb.session
```
We can see that the packages com.mongodb and com.mongodb.client are present in both automatic modules, mongodb.driver and mongodb.driver.core. However, when we compile without a module-info.java, these jars end up on the classpath, and thus in the unnamed module, where split packages are allowed. That makes it all the more strange that the errors occur when running the test from the IDE.
Running the test with the module-info.java in place by means of the Maven test target is successful. So I suspect something is wrong with my IntelliJ run configuration, but I have not yet found out what.
Summary
In this post, we refactored the basic CRUD application from part 1 to use an embedded MongoDB database. We used the YamlPropertiesFactoryBean to initially fill the database, and we ran into an issue when running the unit test.
Thanks for these two articles about Spring WebFlux. Could you write another post on your blog about developing a front-end application with Angular that consumes these endpoints? I haven’t seen many posts about this subject (Spring WebFlux + Angular) with a front-end application.
Thank you for the suggestion. I will put it on my list 😉 Currently I am working on other topics, but I will come back to Spring WebFlux later on.