Adding a transaction/trace ID (correlation ID) to each request in Micronaut to trace the logs easily.


Micronaut is a web framework that has recently gained popularity, picking up market share and a growing community. We are adopting it to build cloud-native microservices, moving towards reactive programming and non-blocking servers such as Netty.

As we move towards distributed microservices, the first step is to correlate each request and response so we can trace the logs and make debugging easier.

The example code below achieves that first step.

The Micronaut project uses a Gradle build and works with native images, which can be run or deployed via GraalVM.

Most Micronaut users take a reactive approach to building microservices, and I too recommend adopting reactive programming. The problem is that we lose the MDC context when execution shifts to a new thread pool, or the context gets overwritten when a new concurrent request hits our server.
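To make the problem concrete, here is a minimal, self-contained sketch (plain Reactor and SLF4J, outside Micronaut; the class name is just for illustration). MDC is backed by a ThreadLocal, so a value put on the calling thread is typically gone once Reactor hops to another scheduler:

import org.slf4j.MDC;
import reactor.core.publisher.Mono;
import reactor.core.scheduler.Schedulers;

public class MdcLossDemo {
    public static void main(String[] args) {
        // Set on the main thread...
        MDC.put("trace", "abc-123");

        // ...but read on a boundedElastic worker thread, where it is usually null.
        String onWorker = Mono.fromCallable(() -> String.valueOf(MDC.get("trace")))
                .subscribeOn(Schedulers.boundedElastic())
                .block();

        System.out.println("trace on worker thread: " + onWorker);
    }
}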

Note: This example does not cover a Maven build, and Log4j2 throws errors while building native images.

Coming from the Spring Boot community, I used Project Reactor instead of RxJava for this example. And since Log4j2 has issues when building native images, I used Logback for logging.

1) My `build.gradle` file for reference

plugins {
    id("com.github.johnrengelman.shadow") version "7.1.1"
    id("io.micronaut.application") version "3.2.2"
    id("io.micronaut.aot") version "3.2.2"
}

version = "0.1"
group = "reactivelogging.com"

repositories {
    mavenCentral()
}

dependencies {
    annotationProcessor("io.micronaut:micronaut-http-validation")
    annotationProcessor("org.projectlombok:lombok")
    implementation("io.micronaut:micronaut-http-client")
    implementation("io.micronaut:micronaut-jackson-databind")
    implementation("io.micronaut:micronaut-runtime")
    implementation("io.micronaut.reactor:micronaut-reactor")
    implementation("io.micronaut.reactor:micronaut-reactor-http-client")
    implementation("jakarta.annotation:jakarta.annotation-api")
    runtimeOnly("ch.qos.logback:logback-classic")
    implementation("io.micronaut:micronaut-validation")
    implementation("io.micronaut.tracing:micronaut-tracing-jaeger")
    implementation("org.projectlombok:lombok:1.18.22")
    compileOnly("com.jtransc:jtransc-rt:0.6.8")

}


application {
    mainClass.set("reactivelogging.com.Application")
}
java {
    sourceCompatibility = JavaVersion.toVersion("17")
    targetCompatibility = JavaVersion.toVersion("17")
}

graalvmNative.toolchainDetection = true

graalvmNative {
    binaries {
        main {
            imageName.set('custom-logging')
            buildArgs.add('--verbose --language:js')
        }
    }
}
micronaut {
    runtime("netty")
    testRuntime("junit5")
    processing {
        incremental(true)
        annotations("reactivelogging.com.*")
    }
    aot {
        version = '1.0.0-M7'
        optimizeServiceLoading = true
        convertYamlToJava = true
        precomputeOperations = true
        cacheEnvironment = true
        optimizeClassLoading = true
        deduceEnvironment = true
    }
}


2) Writing an HTTP filter that intercepts each request and reads the correlation ID passed via the HTTP headers if available, or else generates a new random ID, and also writes it into the response header so the correlation ID can be daisy-chained to the next request if necessary.

I am making use of MDC so that our Logback logger can print the trace ID in the logs.

MDC stands for Mapped Diagnostic Context; as the name suggests, we map additional diagnostic values into the logging context so every log line can carry them.

package reactivelogging.com;

import io.micronaut.http.HttpRequest;
import io.micronaut.http.MutableHttpResponse;
import io.micronaut.http.annotation.Filter;
import io.micronaut.http.filter.HttpServerFilter;
import io.micronaut.http.filter.ServerFilterChain;
import jakarta.inject.Singleton;
import lombok.extern.slf4j.Slf4j;
import org.reactivestreams.Publisher;
import org.slf4j.MDC;
import reactor.core.publisher.Flux;

import java.util.UUID;

@Filter("/**")
@Singleton
@Slf4j
public class HttpFilter implements HttpServerFilter {

    @Override
    public Publisher<MutableHttpResponse<?>> doFilter(HttpRequest<?> request, ServerFilterChain chain) {
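        // Reuse the incoming X-TRACE-ID header if present; otherwise generate a fresh UUID.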
        String uuid = request.getHeaders().contains("X-TRACE-ID") ? request.getHeaders().get("X-TRACE-ID") : UUID.randomUUID().toString();
        request.setAttribute("traceId", uuid);

        log.debug("filter is working");
        return Flux.from(chain.proceed(request)).map(mutableHttpResponse -> {
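            // Put the trace id into MDC for the logger and echo it back in the response header.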
            MDC.put("trace", uuid);
            mutableHttpResponse.getHeaders().add("X-TRACE-ID",uuid);
            log.debug(mutableHttpResponse.getBody(String.class).orElse("empty"));
            return mutableHttpResponse;
        });
    }
}


3) Writing the controller and service.

In this example, I am generating random numbers and switching execution threads using Reactor Schedulers, to demonstrate that we don't lose the correlation trace in our logs.

package reactivelogging.com;

import io.micronaut.http.MediaType;
import io.micronaut.http.annotation.Controller;
import io.micronaut.http.annotation.Get;
import io.micronaut.http.annotation.QueryValue;
import jakarta.inject.Inject;
import lombok.extern.slf4j.Slf4j;
import reactor.core.publisher.Flux;
import reactor.core.scheduler.Schedulers;

import java.util.stream.IntStream;

@Controller
@Slf4j
public class TestController {
    @Inject
    TestService testService;

    @Get(value = "/",produces = MediaType.APPLICATION_JSON)
    public Flux<Integer> testLogging() {
        Flux<Integer> intermediate = Flux.fromStream(IntStream.range(1, 10000).boxed())
                .subscribeOn(Schedulers.parallel())
                .map(integer -> {
                    log.debug("input" + integer);
                    return integer * integer * integer;
                })
                .subscribeOn(Schedulers.boundedElastic()).map(i -> {
                    log.debug("log on new scheduler");
                    return i * 1;
                })
                .doOnEach(intNumber -> log.debug(intNumber.hasValue() ? intNumber.get().toString() : "end of value" ));

        return testService.generateRandomIntegers(intermediate);
    }

    @Get(value = "/test",produces = MediaType.APPLICATION_JSON)
    public Flux<Integer> testLogging2(@QueryValue String logName) {
        Flux<Integer> intermediate = Flux.fromStream(IntStream.range(1, 3).boxed())
                .subscribeOn(Schedulers.parallel())
                .map(integer -> {
                    log.debug( logName +" " + integer);
                    return integer * integer * integer;
                })
                .subscribeOn(Schedulers.boundedElastic()).map(i -> {
                    log.debug(logName + " log on new scheduler");
                    return i * 1;
                })
                .doOnEach(intNumber -> log.debug(intNumber.hasValue() ? logName + " " +intNumber.get().toString() : logName + " end of value" ));

        return testService.generateRandomIntegers(intermediate);
    }


}

And the `TestService`:

package reactivelogging.com;

import jakarta.inject.Singleton;
import lombok.extern.slf4j.Slf4j;
import reactor.core.publisher.Flux;
import reactor.core.scheduler.Schedulers;

import java.util.Random;

@Singleton
@Slf4j
public class TestService {
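    // Multiplies each incoming number by a random integer on the boundedElastic scheduler,
    // logging every signal so the trace id can be checked across another thread switch.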
    Flux<Integer> generateRandomIntegers(Flux<Integer> integerFlux) {
        Random rand = new Random();
        return integerFlux
                .subscribeOn(Schedulers.boundedElastic())
                .log()
                .map(integer -> integer * rand.nextInt())
                .doOnEach(integerSignal -> log.debug(integerSignal.get() + " multiplied number"));

    }
}

4) Modifying the `logback.xml` to print the trace ID in our logs -

The conversion word used is `%X{trace}` inside the pattern, which prints the `trace` value from the MDC context.

<configuration>

    <appender name="STDOUT" class="ch.qos.logback.core.ConsoleAppender">
        <withJansi>true</withJansi>
        <!-- encoders are assigned the type
             ch.qos.logback.classic.encoder.PatternLayoutEncoder by default -->
        <encoder>
            <pattern>%X{trace} %cyan(%d{HH:mm:ss.SSS}) %gray([%thread]) %highlight(%-5level) %magenta(%logger{36}) - %msg%n</pattern>
        </encoder>
    </appender>

    <root level="debug">
        <appender-ref ref="STDOUT" />
    </root>
</configuration>
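To sanity-check the daisy-chaining end to end, a small test can call the `/test` endpoint with and without the header and inspect the response. This is only a sketch: it assumes the JUnit 5 test setup that the Micronaut Gradle plugin's `testRuntime("junit5")` wires in (micronaut-test-junit5), and the class and method names are just illustrative.

package reactivelogging.com;

import io.micronaut.http.HttpRequest;
import io.micronaut.http.HttpResponse;
import io.micronaut.http.client.HttpClient;
import io.micronaut.http.client.annotation.Client;
import io.micronaut.test.extensions.junit5.annotation.MicronautTest;
import jakarta.inject.Inject;
import org.junit.jupiter.api.Test;

import static org.junit.jupiter.api.Assertions.assertEquals;
import static org.junit.jupiter.api.Assertions.assertNotNull;

@MicronautTest
class HttpFilterTest {

    @Inject
    @Client("/")
    HttpClient client;

    @Test
    void generatesTraceIdWhenHeaderIsMissing() {
        // No X-TRACE-ID sent: the filter should generate one and return it in the response.
        HttpResponse<String> response = client.toBlocking()
                .exchange(HttpRequest.GET("/test?logName=first"), String.class);

        assertNotNull(response.getHeaders().get("X-TRACE-ID"));
    }

    @Test
    void reusesTraceIdWhenHeaderIsPresent() {
        // X-TRACE-ID sent by the caller: the filter should reuse it unchanged.
        HttpResponse<String> response = client.toBlocking()
                .exchange(HttpRequest.GET("/test?logName=second")
                        .header("X-TRACE-ID", "my-trace-id"), String.class);

        assertEquals("my-trace-id", response.getHeaders().get("X-TRACE-ID"));
    }
}

If everything is wired up, the log lines produced for each of these requests should also start with the same trace ID, thanks to the `%X{trace}` entry at the front of the Logback pattern.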