mscharhag, Programming and Stuff;

A blog about programming and software development topics, mostly focused on Java technologies including Java EE, Spring and Grails.

  • Tuesday, 11 August, 2020

    Extending JUnit 5

    A look into the past

    With JUnit 4 we have the option to run tests with a custom JUnit runner (indicated by the @RunWith annotation). This allows us to modify the way tests are executed with JUnit. However, JUnit runners are not that easy to implement. They also suffer from the major limitation that only one runner can be used per test.

    With JUnit 4.7, Rules were introduced. Rules use a different concept to customize tests, and multiple rules can be used within a single test. So from this point on, JUnit 4 had two different ways (each with its own upsides and downsides) to customize test behavior.

    JUnit 5 introduces extensions

    This whole customization mechanism has changed with JUnit 5 which introduced extensions. Extensions can be added to tests in various ways. The most common way is the @ExtendWith annotation that can be used on test classes or on single test methods. For example:

    @ExtendWith(MyFirstExtension.class)
    public class DemoTest {

        @Test
        public void test() {
            // uses MyFirstExtension
        }

        @Test
        @ExtendWith(MySecondExtension.class)
        public void test2() {
            // uses MyFirstExtension and MySecondExtension
        }
    }

    Extensions added to the test class will be used for all test methods within the class.

    Multiple extensions can be registered by passing an array of extensions:

    @ExtendWith({ MyFirstExtension.class, MySecondExtension.class })
    public class DemoTest { ... }

    @ExtendWith is also a repeatable annotation, so it can be added multiple times:

    @ExtendWith(MyFirstExtension.class)
    @ExtendWith(MySecondExtension.class)
    public class DemoTest { ... }

    Note that @ExtendWith can be composed into other annotations. For example, we can come up with our own annotation that is meta-annotated with @ExtendWith:

    @Retention(RetentionPolicy.RUNTIME)
    @ExtendWith({ MyFirstExtension.class, MySecondExtension.class })
    public @interface IntegrationTest {}

    We can now annotate our test with @IntegrationTest and JUnit 5 will run the tests using the two extensions defined in @IntegrationTest:

    @IntegrationTest
    public class DemoTest { ... }

    While @ExtendWith is easy to use and works fine in most situations, it has a drawback: sometimes test code needs to interact with an extension, or the extension needs some sort of configuration or setup code. This is not possible if the extension is defined with @ExtendWith.

    In these situations we can create the extension manually, assign it to a field, and add the @RegisterExtension annotation. For example, let's look at a fictional extension that manages temporary files in a test:

    public class DemoTest {

        @RegisterExtension
        static TempFileExtension tempFiles = TempFileExtension.builder()
                ...
                .build();

        @Test
        public void test() {
            File f = tempFiles.newTempFile("foobar.tmp");
            ...
        }
    }

    Using @RegisterExtension on a field gives us the option to configure the extension and to interact with it in test methods.

    Creating custom extensions

    Creating a custom extension for JUnit 5 is quite easy: we just have to create a class that implements one or more of JUnit's extension interfaces.

    Assume we want to create a simple extension that measures how long a test runs. For this we create a new class that implements the interface InvocationInterceptor.

    public class TestDurationReportExtension implements InvocationInterceptor {

        @Override
        public void interceptTestMethod(Invocation<Void> invocation,
                ReflectiveInvocationContext<Method> invocationContext,
                ExtensionContext extensionContext) throws Throwable {
            long beforeTest = System.currentTimeMillis();
            try {
                invocation.proceed();
            } finally {
                long afterTest = System.currentTimeMillis();
                long duration = afterTest - beforeTest;
                String testClassName = invocationContext.getTargetClass().getSimpleName();
                String testMethodName = invocationContext.getExecutable().getName();
                System.out.println(String.format("%s.%s: %dms", testClassName, testMethodName, duration));
            }
        }
    }

    InvocationInterceptor has various methods with default implementations. We override the implementation of interceptTestMethod(..). This method lets us run code before and after the execution of a test method. With the proceed() method of the Invocation parameter we can continue with the actual test execution.

    We simply subtract the system time before the test from the system time after the test execution to get the duration. After that, we use the InvocationContext parameter to obtain the names of the test class and test method. With this information we create a formatted output message.
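    Stripped of the JUnit-specific types, the measurement logic is plain Java and can be sketched like this (class and method names are illustrative):

```java
class DurationReport {

    // Runs the given task, measures the elapsed wall-clock time and
    // returns a report line like "DemoTest.fastTest: 6ms"
    static String timedReport(String testClassName, String testMethodName, Runnable task) {
        long beforeTest = System.currentTimeMillis();
        task.run();
        long duration = System.currentTimeMillis() - beforeTest;
        return String.format("%s.%s: %dms", testClassName, testMethodName, duration);
    }
}
```

    The real extension additionally wraps the invocation in try/finally, so the duration is reported even when the test throws.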

    Now we can extend tests with our TestDurationReportExtension by using the @ExtendWith annotation:

    @ExtendWith(TestDurationReportExtension.class)
    public class DemoTest { .. }

    When running tests, we will now see our extension output for every test method.

    The output for a test with two methods might look like this:

    DemoTest.slowTest: 64ms
    DemoTest.fastTest: 6ms

    Extension interfaces

    InvocationInterceptor is just one of various extension interfaces. In this section we will briefly look at these different interfaces and what they can be used for.

    Conditional test execution

    By implementing the interface ExecutionCondition an extension can decide if a test should be executed. This lets the extension decide if certain tests should be skipped. A simple example is the standard extension DisabledCondition that skips tests annotated with @Disabled.

    Test instance factories

    By default, JUnit 5 instantiates test classes by invoking the available constructor (if multiple test constructors are available, an exception is thrown). Possible constructor arguments are resolved using ParameterResolver extensions (see below).

    This default behavior can be customized using the TestInstanceFactory interface. An extension that implements TestInstanceFactory is used as a factory for creating test class instances. This can be used to create tests via static factory methods or to inject additional parameters into the test constructor.

    Processing test instances

    After a test instance has been created, the TestInstancePostProcessor interface can be used to post-process it. A common use case for this is the injection of dependencies into fields of the test instance. Similarly, the TestInstancePreDestroyCallback interface can be used to run custom cleanup logic when a test has finished and the instance is no longer needed.

    Test parameter resolution

    Test class constructors or methods annotated with @Test, @BeforeEach, @BeforeAll etc. can contain parameters. These parameters are resolved at runtime by JUnit using ParameterResolvers. Extensions can implement ParameterResolver if they want to support additional parameters.

    Test Lifecycle callbacks and interceptions

    JUnit 5 provides a couple of test lifecycle callback interfaces that can be implemented by extensions:

    • BeforeAllCallback, runs before @BeforeAll methods in the test class
    • BeforeEachCallback, runs before @BeforeEach methods in the test class
    • BeforeTestExecutionCallback, runs before the test method
    • AfterTestExecutionCallback, runs after the test method
    • AfterEachCallback, runs after @AfterEach methods in the test class
    • AfterAllCallback, runs after @AfterAll methods in the test class

    Those interfaces provide a simple callback to do something at a certain time in the test lifecycle.

    Additionally there is the InvocationInterceptor interface we already used in the extension example above. InvocationInterceptor has similar methods as the callback interfaces. However, InvocationInterceptor gives us an Invocation parameter that allows us to manually continue the lifecycle by calling the proceed() method. This is useful if we want to wrap code around the invocation, like a try/catch block.


    Writing extensions for JUnit 5 is quite easy: we just have to create a class that implements one or more of JUnit's extension interfaces. Extensions can be added to test classes (or methods) using the @ExtendWith and @RegisterExtension annotations. You can find the source code for the example extension on GitHub. Also make sure to check out the excellent JUnit 5 user guide.

  • Monday, 3 August, 2020

    Integrating JSON Schema validation in Spring using a custom HandlerMethodArgumentResolver

    In previous posts we learned about JSON Schema and how we can validate a JSON document against a JSON Schema in Java. In this post we will integrate JSON Schema validation into Spring controllers using a custom HandlerMethodArgumentResolver. We will use the same JSON document and JSON Schema as in previous posts.

    So, what is a HandlerMethodArgumentResolver?

    Handler methods in Spring controllers (= methods annotated with @RequestMapping, @GetMapping, etc.) have flexible method signatures. Depending on what is needed inside a controller method, various method arguments can be added. Examples are request and response objects, headers, path variables or session values. Those arguments are resolved using HandlerMethodArgumentResolvers. Based on the argument definition (type, annotations, etc.) a HandlerMethodArgumentResolver is responsible for obtaining the actual value that should be passed to the controller.

    A few standard HandlerMethodArgumentResolvers provided by Spring are:

    • PathVariableMethodArgumentResolver resolves arguments annotated with @PathVariable.
    • Request related method arguments like WebRequest, ServletRequest or MultipartRequest are resolved by ServletRequestMethodArgumentResolver.
    • Arguments annotated with @RequestHeader are resolved by RequestHeaderMapMethodArgumentResolver.

    In the following we will create our own HandlerMethodArgumentResolver implementation that validates a JSON request body against a JSON Schema before the JSON data is passed to a controller method.

    Getting started

    We start with creating our own @ValidJson annotation. This annotation will be used to mark controller method arguments that should be resolved by our own HandlerMethodArgumentResolver.

    @Target(ElementType.PARAMETER)
    @Retention(RetentionPolicy.RUNTIME)
    public @interface ValidJson {
        String value();
    }

    As we see in the next snippet, the value parameter of our @ValidJson annotation is used to define the path to the JSON Schema. We can now come up with the following controller implementation:

    public class Painting {
        private String name;
        private String artist;
        // more fields, getters + setters
    }

    public interface SchemaLocations {
        String PAINTING = "classpath:painting-schema.json";
    }

    @RestController
    public class PaintingController {

        @PostMapping("/paintings")
        public ResponseEntity<Void> createPainting(@ValidJson(PAINTING) Painting painting) {
            // ...
        }
    }

    Painting is a simple POJO we use for JSON mapping. SchemaLocations contains the location of our JSON Schema documents. In the handler method createPainting we added a Painting argument annotated with @ValidJson. We pass the PAINTING constant to the @ValidJson annotation to define which JSON Schema should be used for validation.

    Implementing HandlerMethodArgumentResolver

    HandlerMethodArgumentResolver is an interface with two methods:

    public interface HandlerMethodArgumentResolver {

        boolean supportsParameter(MethodParameter parameter);

        Object resolveArgument(MethodParameter parameter, ModelAndViewContainer mavContainer,
                NativeWebRequest webRequest, WebDataBinderFactory binderFactory) throws Exception;
    }

    supportsParameter(..) is used to check if this HandlerMethodArgumentResolver can resolve a given MethodParameter, while resolveArgument(..) resolves and returns the actual argument.

    We implement this interface in our own class: JsonSchemaValidatingArgumentResolver:

    public class JsonSchemaValidatingArgumentResolver implements HandlerMethodArgumentResolver {

        private final ObjectMapper objectMapper;
        private final ResourcePatternResolver resourcePatternResolver;
        private final Map<String, JsonSchema> schemaCache;

        public JsonSchemaValidatingArgumentResolver(ObjectMapper objectMapper, ResourcePatternResolver resourcePatternResolver) {
            this.objectMapper = objectMapper;
            this.resourcePatternResolver = resourcePatternResolver;
            this.schemaCache = new ConcurrentHashMap<>();
        }

        @Override
        public boolean supportsParameter(MethodParameter methodParameter) {
            return methodParameter.getParameterAnnotation(ValidJson.class) != null;
        }

        // getJsonPayload(..), getJsonSchema(..) and resolveArgument(..) follow below
    }

    supportsParameter(..) is quite easy to implement. We simply check if the passed MethodParameter is annotated with @ValidJson.

    In the constructor we take a Jackson ObjectMapper and a ResourcePatternResolver. We also create a ConcurrentHashMap that will be used to cache JsonSchema instances.

    Next we implement two helper methods: getJsonPayload(..) returns the JSON request body as String while getJsonSchema(..) returns a JsonSchema instance for a passed schema path.

    private String getJsonPayload(NativeWebRequest nativeWebRequest) throws IOException {
        HttpServletRequest httpServletRequest = nativeWebRequest.getNativeRequest(HttpServletRequest.class);
        return StreamUtils.copyToString(httpServletRequest.getInputStream(), StandardCharsets.UTF_8);
    }

    private JsonSchema getJsonSchema(String schemaPath) {
        return schemaCache.computeIfAbsent(schemaPath, path -> {
            Resource resource = resourcePatternResolver.getResource(path);
            if (!resource.exists()) {
                throw new JsonSchemaLoadingFailedException("Schema file does not exist, path: " + path);
            }
            JsonSchemaFactory schemaFactory = JsonSchemaFactory.getInstance(SpecVersion.VersionFlag.V201909);
            try (InputStream schemaStream = resource.getInputStream()) {
                return schemaFactory.getSchema(schemaStream);
            } catch (Exception e) {
                throw new JsonSchemaLoadingFailedException("An error occurred while loading JSON Schema, path: " + path, e);
            }
        });
    }

    A JsonSchema is retrieved from a Spring Resource that is obtained via the ResourcePatternResolver. JsonSchema instances are cached in the previously created map, so each schema is only loaded once. If an error occurs while loading the JSON Schema, a JsonSchemaLoadingFailedException is thrown.
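    The cache-once behaviour comes from ConcurrentHashMap.computeIfAbsent, which invokes the mapping function at most once per key. A minimal stand-alone sketch (a string stands in for the real JsonSchema, and the counter just makes the loading visible):

```java
import java.util.Map;
import java.util.concurrent.ConcurrentHashMap;
import java.util.concurrent.atomic.AtomicInteger;

class SchemaCacheSketch {

    private final Map<String, String> schemaCache = new ConcurrentHashMap<>();
    final AtomicInteger loads = new AtomicInteger();

    // Stands in for the expensive JsonSchema loading; runs only on a cache miss
    String getSchema(String path) {
        return schemaCache.computeIfAbsent(path, p -> {
            loads.incrementAndGet();
            return "schema-for-" + p;
        });
    }
}
```

    Repeated calls with the same path return the cached value without loading again.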

    The last step is the implementation of the resolveArgument(..) method:

    @Override
    public Object resolveArgument(MethodParameter methodParameter, ModelAndViewContainer modelAndViewContainer,
            NativeWebRequest nativeWebRequest, WebDataBinderFactory webDataBinderFactory) throws Exception {
        // get schema path from ValidJson annotation
        String schemaPath = methodParameter.getParameterAnnotation(ValidJson.class).value();

        // get JsonSchema from schemaPath
        JsonSchema schema = getJsonSchema(schemaPath);

        // parse json payload
        JsonNode json = objectMapper.readTree(getJsonPayload(nativeWebRequest));

        // do actual validation
        Set<ValidationMessage> validationResult = schema.validate(json);

        if (validationResult.isEmpty()) {
            // no validation errors, convert JsonNode to method parameter type and return it
            return objectMapper.treeToValue(json, methodParameter.getParameterType());
        }

        // throw exception if validation failed
        throw new JsonValidationFailedException(validationResult);
    }

    Here we first get the location of the JSON Schema from the annotation and resolve it to an actual JsonSchema instance. Next we parse the request body to a JsonNode and validate it using the JsonSchema. If validation errors are present we throw a JsonValidationFailedException.

    You can find the source code for the complete class on GitHub.

    Registering our HandlerMethodArgumentResolver

    Next we need to tell Spring about our JsonSchemaValidatingArgumentResolver. We do this by using the addArgumentResolvers(..) method from the WebMvcConfigurer interface.

    @Configuration
    public class JsonValidationConfiguration implements WebMvcConfigurer {

        @Autowired
        private ObjectMapper objectMapper;

        @Autowired
        private ResourcePatternResolver resourcePatternResolver;

        @Override
        public void addArgumentResolvers(List<HandlerMethodArgumentResolver> resolvers) {
            resolvers.add(new JsonSchemaValidatingArgumentResolver(objectMapper, resourcePatternResolver));
        }
    }

    Error handling

    In our JsonSchemaValidatingArgumentResolver we throw two different exceptions:

    • JsonSchemaLoadingFailedException if loading the JSON Schema fails
    • JsonValidationFailedException if the JSON validation fails

    A JsonSchemaLoadingFailedException very likely indicates a programming error, while a JsonValidationFailedException is caused by a client sending an invalid JSON document. So we should clearly send a useful error message to the client if the latter occurs.

    We do this by using an @ExceptionHandler method in a class annotated with @ControllerAdvice:

    @ControllerAdvice
    public class JsonValidationExceptionHandler {

        @ExceptionHandler(JsonValidationFailedException.class)
        public ResponseEntity<Map<String, Object>> onJsonValidationFailedException(JsonValidationFailedException ex) {
            List<String> messages = ex.getValidationMessages().stream()
                    .map(ValidationMessage::getMessage)
                    .collect(Collectors.toList());
            return ResponseEntity.badRequest().body(Map.of(
                    "message", "Json validation failed",
                    "details", messages
            ));
        }
    }

    Within the method we retrieve the validation messages from the passed exception and send them to the client.

    Example request

    An example request we can send to our PaintingController looks like this:

    POST http://localhost:8080/paintings
    Content-Type: application/json

    {
      "name": "Mona Lisa",
      "artist": null,
      "description": null,
      "dimension": {
        "height": -77,
        "width": 53
      },
      "tags": [ ... ]
    }

    Note that the JSON body contains two errors: artist is not allowed to be null and the painting height is negative. (You can look at the JSON Schema on GitHub)

    Our server responds to this request with the following response:

    HTTP/1.1 400 (Bad Request)

    {
      "message": "Json validation failed",
      "details": [
        "$.artist: null found, string expected",
        "$.dimension.height: must have a minimum value of 1"
      ]
    }


    We learned how to integrate JSON Schema validation into Spring controllers by implementing a custom HandlerMethodArgumentResolver. Within our implementation we validate the JSON request body against a JSON Schema before it is passed as an argument to the controller. To return proper error messages we can add an @ExceptionHandler method.

    You can find the complete example project on GitHub.

  • Monday, 27 July, 2020

    REST: Managing One-To-Many relations

    In a previous post we looked at many-to-many relations. This time we will see how to model one-to-many relations in a RESTful API.

    An important question here is whether both sides of the relation can exist on their own (as in typical many-to-many relations) or whether the many-side is tightly coupled to the one-side. In the following we will examine both cases with different examples.

    Tightly coupled relations

    It is quite common for one-to-many relations that the many-side is tightly coupled to the one-side.

    For example, consider a relation between articles and comments. An article can have many comments while a comment always belongs to exactly one article. Comments cannot move from one article to another article and the deletion of an article also deletes attached comments.

    In such a scenario it is often a good idea to express this type of relation via the resource URI. In this example we can model comments as a sub-resource of articles. For example: /articles/<article-id>/comments. We can then use standard CRUD operations on this sub-resource to create, read, update and delete comments:

    Getting all comments of an article 123:

    GET /articles/123/comments

    Creating a new comment for article 123:

    POST /articles/123/comments
    Content-Type: application/json

    {
        "message": "Foo",
        ...
    }

    Updating comment 456:

    PUT /articles/123/comments/456
    Content-Type: application/json

    {
        "message": "Bar",
        ...
    }

    Deleting comment 456:

    DELETE /articles/123/comments/456

    Here the relation is only expressed by the resource URI. We do not need specific operations to attach or detach a comment to / from an article.

    Both sides of the relation can exist on their own

    Now let's look at a different example: A relationship between a player and a sports team. A team consists of many players and a player can only play for one team at a time. However, the player can change teams or be without a team for some time.

    In this situation we use an approach similar to many-to-many relations. We use two separate resources for players and teams: For example /players and /teams. Both resources can be managed on their own (for example via common CRUD operations).

    Next we create a sub-resource for the relation, for example /teams/<team-id>/players. This sub-resource is only used to manage the relation between both resources. We can now use GET, PUT and DELETE operations to retrieve, create and delete relations.

    Getting players assigned to team 123:

    GET /teams/123/players

    Assigning player 42 to team 123:

    PUT /teams/123/players/42

    Unassigning player 42 from team 123:

    DELETE /teams/123/players/42

    It is part of the server's logic to make sure a player is only assigned to a single team. Assume player 42 is currently assigned to team 122. Now, when a PUT /teams/123/players/42 request is issued, the server first has to unassign player 42 from team 122 before assigning him to team 123. So this request also modifies the /teams/122/players resource, which should be kept in mind if a cache is present.
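    The invariant (a player belongs to at most one team) can be captured server-side with a simple map from player to team; assigning a player then implicitly removes any previous assignment. A minimal sketch (identifiers illustrative):

```java
import java.util.HashMap;
import java.util.Map;
import java.util.Optional;

class TeamAssignments {

    // player id -> team id; a missing entry means the player has no team
    private final Map<Integer, Integer> teamByPlayer = new HashMap<>();

    // PUT /teams/{teamId}/players/{playerId}
    void assign(int playerId, int teamId) {
        teamByPlayer.put(playerId, teamId); // overwrites any previous team
    }

    // DELETE /players/{playerId}/team
    void unassign(int playerId) {
        teamByPlayer.remove(playerId);
    }

    // GET /players/{playerId}/team
    Optional<Integer> teamOf(int playerId) {
        return Optional.ofNullable(teamByPlayer.get(playerId));
    }
}
```

    Reassigning a player to a new team leaves no stale membership in the old team behind.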

    Note that we do not need a request body for any of these requests, because the sub-resource is only used to manage the relation, which can be fully determined by the request URI.

    We can also model this from the player-side of the relation. Again, we use a new sub-resource: /players/<player-id>/team.

    Getting the current team of player 42:

    GET /players/42/team

    Assigning player 42 to team 123:

    PUT /players/42/team/123

    Unassigning player 42 from the current team:

    DELETE /players/42/team

    Note: For the DELETE request no team id is required (a player can only be in one team).


    We looked at two different approaches to modelling one-to-many relations in a REST API.

    If both parts of the relation are tightly coupled we can often express the many-part as a sub-resource of the one-part and use simple CRUD operations. The relation is only expressed via the URI and no special assignment operation is needed.

    However, if both sides of the relation can exist on their own, we use two separate resources and add sub-resources to manage the relation.

  • Thursday, 23 July, 2020

    JSON Schema validation in Java

    In this post we will see how to validate a JSON document against a JSON Schema in Java. We will use the same JSON document and Schema as in the previous post about JSON Schema.

    You can find both as text files on GitHub: JSON document and JSON Schema.

    We use the networknt JSON Schema validator library in this example. This library is a good fit because it supports the latest JSON Schema version (2019-09) and uses Jackson as its JSON library. That also makes it easy to integrate JSON Schema validation into Spring (hint: upcoming blog post).

    We need to add the following dependency to our project:
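    Assuming Maven is used, the dependency looks like this (the version number is only illustrative; check for the latest release):

```xml
<dependency>
    <groupId>com.networknt</groupId>
    <artifactId>json-schema-validator</artifactId>
    <!-- illustrative version, use the latest release -->
    <version>1.0.43</version>
</dependency>
```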


    Now we can validate our JSON document in Java:

    private static InputStream inputStreamFromClasspath(String path) {
        return Thread.currentThread().getContextClassLoader().getResourceAsStream(path);
    }

    public static void main(String[] args) throws Exception {
        ObjectMapper objectMapper = new ObjectMapper();
        JsonSchemaFactory schemaFactory = JsonSchemaFactory.getInstance(SpecVersion.VersionFlag.V201909);

        try (
                InputStream jsonStream = inputStreamFromClasspath("example.json");
                InputStream schemaStream = inputStreamFromClasspath("example-schema.json")
        ) {
            JsonNode json = objectMapper.readTree(jsonStream);
            JsonSchema schema = schemaFactory.getSchema(schemaStream);

            Set<ValidationMessage> validationResult = schema.validate(json);

            // print validation errors
            if (validationResult.isEmpty()) {
                System.out.println("no validation errors :-)");
            } else {
                validationResult.forEach(vm -> System.out.println(vm.getMessage()));
            }
        }
    }

    When obtaining a JsonSchemaFactory we need to pass a VersionFlag. This defines the JSON Schema version we want to use (here: 2019-09).

    We then use a small helper method to load both files from the classpath. A Jackson ObjectMapper instance is used to read the JSON data from the InputStream and parse it into a JsonNode object. From the JsonSchemaFactory we obtain a JsonSchema object, which can then be used to validate the JsonNode. In case of validation errors, the returned Set will contain one or more ValidationMessage objects. If the returned Set is empty, no validation errors were found.

    If we accidentally set the painting height to a negative number in our JSON document, we will get the following validation message:

    $.dimension.height: must have a minimum value of 1

    You can find the example source code on GitHub.

  • Tuesday, 21 July, 2020

    Validating and documenting JSON with JSON Schema

    JSON Schema is a way to describe a JSON document. You can think of it as XML Schema for JSON. It allows you to define required elements, provide validation constraints, and add documentation.

    I think the easiest way to explain JSON Schema is to look at an example snippet of JSON and the corresponding JSON Schema. So, I created an image that shows both and is hopefully self explanatory:


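    In text form, a minimal schema in the spirit of the painting example could look like this (a sketch; the property names follow the painting example from the related posts, and the complete schema is available on GitHub):

```json
{
  "$schema": "https://json-schema.org/draft/2019-09/schema",
  "type": "object",
  "required": ["name", "artist"],
  "properties": {
    "name": { "type": "string" },
    "artist": {
      "type": "string",
      "description": "name of the painter"
    },
    "dimension": {
      "type": "object",
      "properties": {
        "height": { "type": "number", "minimum": 1 },
        "width": { "type": "number", "minimum": 1 }
      }
    }
  }
}
```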

    A JSON Schema validator can be used to validate a JSON document against a JSON Schema. This can be done online or by using a library for your favourite programming language. In the implementations section of json-schema.org you can find a couple of libraries to work with JSON Schema.

    In case you want to copy/paste some of the sections of the image: You can find the example as text on GitHub.