Mar 27, 2025 · 16 min read

Hands-on guide to microservices unit testing with CI/CD

Terrence Aluda

Software Engineer


As microservices architectures dominate modern application development, the ability to test, secure, and automate their deployment has become a vital skill. In this guide, you’ll learn how to:

  • Write unit tests for microservices built with Java (Spring Boot), Node.js (Express), and Python (Django).
  • Configure a CircleCI pipeline that runs the tests for all three services in parallel.
  • Trigger the pipeline, monitor its results, and debug failing jobs.

Let’s first set the stage by briefly exploring the foundational concepts of CI/CD and DevOps, which underpin the automation and agility required in development workflows.

What is CI/CD and DevOps?

CI/CD is the practice of automating the processes of integrating code changes into a shared repository and deploying those changes to production. This automation ensures rapid feedback on code quality, reduces the time between development and deployment, and allows teams to deliver updates more frequently and reliably.

DevOps, which stands for Development and Operations, complements CI/CD by fostering a culture of collaboration between software developers and IT operations teams. It emphasizes shared responsibility, continuous improvement, and leveraging automation to streamline workflows. Together, CI/CD and DevOps form the backbone of modern agile development environments, enabling teams to adapt quickly to changing requirements, reduce errors, and maintain high-quality software delivery at scale.

What is unit testing?

Unit testing is the practice of testing individual components or functions of an application in isolation to ensure they work as intended. By focusing on small, manageable parts of the codebase, developers can quickly identify and resolve issues before they affect the larger system. This practice is essential for detecting bugs early in the development process, reducing the cost and effort of fixing them later. Unit tests also allow developers to refactor or extend the codebase with confidence.

Individually, CI/CD, DevOps, and unit testing are powerful practices, but their true potential is unlocked when they are combined to improve the security and resilience of microservices. CI/CD ensures that updates are deployed swiftly and safely, with automated tests catching issues early in the process. DevOps encourages a culture of shared responsibility, integrating security practices into the development workflow. Unit testing adds another layer of protection by verifying that individual components are error-free.

To bring these concepts to life, CI/CD tools like CircleCI play a vital role. CircleCI is a powerful CI/CD platform that simplifies the automation of software delivery pipelines. With features like parallel builds, seamless integration with popular version control systems, and support for custom configurations, CircleCI helps development teams accelerate delivery and maintain high-quality standards, making it an invaluable part of any modern development workflow.

Prerequisites

To get started with CircleCI and implement the CI/CD pipeline for your microservices, you’ll need to have these few essentials in place:

  1. A GitHub account. Even though this article uses GitHub, the same setup works with other version control providers that CircleCI supports.
  2. Git installed on your working machine.
  3. A CircleCI account. You can sign up for one and get 30,000 free credits.

Note: This article won’t provide a step-by-step guide for setting up the microservices. Instead, we’ll focus on explaining what the unit tests do and how to configure the CircleCI pipeline. Don’t worry if you’re not familiar with the languages or technologies used to create the microservices, i.e., Java (Spring Boot), Python (Django), and Node.js (Express). You can still use the concepts shared here to design unit tests for your own microservices.

The microservices application overview

Before configuring the application to run in a CI/CD pipeline, here’s an overview of the microservices, their functionalities, and their associated unit tests.

This section only briefly discusses the functioning of the microservices. If you’d like to try out the services, refer to the README file in the GitHub repository for detailed instructions on running the application locally.

The application simulates a veterinary clinic, demonstrating the use of microservices to manage different aspects of its operations. Below is a summary of the functionalities offered by the microservices:

  1. Animal visits and veterinarian management service (springboot-service)
    This Spring Boot-based microservice allows users to add animal visits and veterinarians through forms. It also allows viewing the lists of visits and veterinarians as JSON. A MySQL database stores the details of visits and veterinarians.

  2. Logging service (logs-service)
    This is an Express microservice responsible for capturing and storing logs of visits and veterinarian additions in a MySQL database. It communicates with the Spring Boot microservice discussed previously to ensure that the addition of new visits or veterinarians is recorded systematically.

  3. Pie chart visualization service (stats-service)
    This Django-based microservice fetches data from the Spring Boot microservice and generates a pie chart comparing the number of veterinarians to the number of animal visits. It uses Matplotlib to create visual representations, offering insights into the clinic’s operations.

Unit tests for the microservices

This section explains the unit tests implemented for each microservice and how they ensure the expected behavior of individual components.

springboot-service tests

These tests are found in the src/test/java/com/example/demo/controllers directory, as shown in the tree below. They test the functionality of the microservice’s controllers.

test
└── java
    └── com
        └── example
            └── demo
                ├── controllers
                │   ├── AnimalVisitControllerTest.java
                │   └── VeterinarianControllerTest.java
                └── DemoApplicationTests.java

Animal visit controller tests (AnimalVisitControllerTest.java)

This unit test class is designed to validate the behavior of the AnimalVisitController, ensuring that its key functionalities work as expected. Using the Mockito framework, the tests isolate the controller logic from its dependencies, providing a controlled environment in which to exercise it.

Before each test, the @Mock annotation is used to create mock instances of the AnimalVisitService and LogServiceInterceptorHelper.

@Mock
private AnimalVisitService animalVisitService;

@Mock
private LogServiceInterceptorHelper logServiceInterceptorHelper;

These dependencies are then injected into the AnimalVisitController using the @InjectMocks annotation.

@InjectMocks
private AnimalVisitController animalVisitController;

The test lifecycle is managed with @BeforeEach and @AfterEach methods to initialize and clean up the mocks, ensuring no interference between tests.

private AutoCloseable mocks;

@BeforeEach
void setUp() {
    mocks = MockitoAnnotations.openMocks(this); // Initialize mocks
}

@AfterEach
void tearDown() throws Exception {
    if (mocks != null) {
        mocks.close(); // Clean up mocks after each test
    }
}

The first test, testGetAllVisits_ReturnsListOfVisits, validates the AnimalVisitController.getAllVisits method. The test begins by creating a sample AnimalVisit object.

AnimalVisit visit = new AnimalVisit();
visit.setId(1L);
visit.setAnimalName("Buddy");
visit.setReason("Check-up");
visit.setVisitDate(LocalDate.now());

The behavior of the service method getAllVisits() is mocked to return a singleton list containing this visit.

when(animalVisitService.getAllVisits()).thenReturn(Collections.singletonList(visit));

When the getAllVisits method of the controller is invoked, the response is validated to ensure it contains exactly one visit, and its properties, such as animalName, match the mock data.

ResponseEntity<List<AnimalVisit>> response = animalVisitController.getAllVisits();

// Assert
assertEquals(1, Objects.requireNonNull(response.getBody()).size());
assertEquals("Buddy", response.getBody().get(0).getAnimalName());

The second test, testAddVisit_RedirectsAndLogsVisit, focuses on the addVisit method of the controller. Here, another mock AnimalVisit object is created to simulate adding a new visit.

AnimalVisit visit = new AnimalVisit();
visit.setAnimalName("Buddy");
visit.setReason("Vaccination");
visit.setVisitDate(LocalDate.now());

The saveVisit method of the service and the addLog method of the log service are mocked so that their invocations can be verified. When the addVisit method is invoked, the test asserts that the returned view name is redirect:/visits, confirming the controller’s redirection behavior, and verifies both interactions.

verify(animalVisitService).saveVisit(visit);
verify(logServiceInterceptorHelper).addLog("INFO: New animal visit added: Buddy");
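
Putting those pieces together, a hedged sketch of the full test could read as follows; the exact addVisit signature (returning the view name as a String) is an assumption based on the redirect behavior described above.

@Test
void testAddVisit_RedirectsAndLogsVisit() {
    // Arrange: the visit to be added
    AnimalVisit visit = new AnimalVisit();
    visit.setAnimalName("Buddy");
    visit.setReason("Vaccination");
    visit.setVisitDate(LocalDate.now());

    // Act: the controller is assumed to return the view name as a String
    String viewName = animalVisitController.addVisit(visit);

    // Assert: redirection plus the expected side effects
    assertEquals("redirect:/visits", viewName);
    verify(animalVisitService).saveVisit(visit);
    verify(logServiceInterceptorHelper).addLog("INFO: New animal visit added: Buddy");
}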

Veterinarian controller tests (VeterinarianControllerTest.java)

The VeterinarianControllerTest validates the functionality of the VeterinarianController. Like the previous test class, this one also employs Mockito to mock dependencies and isolate the controller’s logic.

The testGetAllVeterinarians_ReturnsListOfVeterinarians test validates the behavior of the VeterinarianController.getAllVets method. It begins by arranging a mock Veterinarian object.

Veterinarian vet = new Veterinarian();
vet.setId(1L);
vet.setName("Dr. Smith");
vet.setSpecialization("Surgery");

The service method getAllVeterinarians() is mocked to return a singleton list containing this veterinarian. When the getAllVets method is called, the test verifies the response to ensure it contains the correct number of veterinarians and that the data matches the mock object.

assertEquals(1, Objects.requireNonNull(response.getBody()).size());
assertEquals("Dr. Smith", response.getBody().get(0).getName());

Additionally, the test verifies that the service method getAllVeterinarians() was invoked during the process.

verify(veterinarianService).getAllVeterinarians();

The testAddVeterinarian_ReturnsSavedVeterinarian test focuses on the addVeterinarian method of the controller. It also starts by creating a mock Veterinarian object.

Veterinarian vet = new Veterinarian();
vet.setName("Dr. Smith");
vet.setSpecialization("Surgery");

The service method saveVeterinarian and the log service’s addLog method are mocked. When the addVeterinarian method is called, the test asserts that the response redirects to the correct view.

assertEquals("redirect:/veterinarians", viewName);

The test also verifies that the saveVeterinarian method was invoked to persist the new veterinarian and that the addLog method was called with the appropriate log message.

verify(veterinarianService).saveVeterinarian(vet);
verify(logServiceInterceptorHelper).addLog("INFO: New veterinarian added: Dr. Smith");

logs-service tests

The tests are found in the tests/logs.tests.js file and use Jest and Supertest to validate the endpoint’s behavior. The suite ensures the application handles different scenarios, including successful log creation, missing required fields, and server errors.

Before running the tests, the test environment is set up to mock the Sequelize database and the Log model. This ensures the tests remain isolated from the actual database and external configurations. The database connection is mocked using sequelize-mock, a lightweight library for mocking Sequelize models.

jest.mock('../config/db', () => {
    const SequelizeMock = require('sequelize-mock');
    const sequelizeMock = new SequelizeMock();

    return {
        sequelize: sequelizeMock,
        connectDB: jest.fn(),
    };
});

Similarly, the Log model is mocked to simulate interactions with the logs table without requiring an actual database connection. The mock defines a basic schema with a single message field.

jest.mock('../models/log', () => {
    const { sequelize } = require('../config/db');
    return sequelize.define('Log', {
        message: 'test message',
    });
});

These mocked dependencies are loaded automatically when the application starts, ensuring that the tests work in a predictable and controlled environment.

The first test validates the behavior of the endpoint when a valid log is provided. Before executing the test, the Log.create method is mocked to resolve with a sample log object.

jest.spyOn(Log, 'create').mockResolvedValueOnce({
    id: 1,
    message: 'Test log message',
    timestamp: new Date(),
});

The test uses Supertest to simulate a POST request to the /api/logs endpoint with a valid message.

const response = await request(app)
 .post('/api/logs')
 .send({ message: 'Test log message' });

The response is verified to ensure the status is 201, and the body contains the correct log object.

expect(response.status).toBe(201);
expect(response.body).toEqual({
    success: true,
    log: {
        id: 1,
        message: 'Test log message',
        timestamp: expect.any(String),
 },
});

The test also verifies that Log.create was called with the expected input.

expect(Log.create).toHaveBeenCalledWith({ message: 'Test log message' });
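
Assembled from the fragments above, a minimal sketch of this first test looks like the following; the paths for the app and Log imports are assumptions based on a typical Express project layout.

const request = require('supertest');
const app = require('../app'); // assumed entry point exporting the Express app
const Log = require('../models/log'); // mocked above

it('creates a log and responds with 201', async () => {
    jest.spyOn(Log, 'create').mockResolvedValueOnce({
        id: 1,
        message: 'Test log message',
        timestamp: new Date(),
    });

    const response = await request(app)
        .post('/api/logs')
        .send({ message: 'Test log message' });

    expect(response.status).toBe(201);
    expect(Log.create).toHaveBeenCalledWith({ message: 'Test log message' });
});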

The second test ensures the endpoint properly handles requests where the required message field is missing. No data is sent in the POST request.

const response = await request(app).post('/api/logs').send({});

The test expects the server to respond with a 400 status and an error message indicating the missing field.

expect(response.status).toBe(400);
expect(response.body).toEqual({ error: 'Message is required' });

The third test simulates a server error scenario, such as a database failure, by mocking the Log.create method to reject with an error.

jest.spyOn(Log, 'create').mockRejectedValueOnce(new Error('Unknown database error'));

The test sends a valid POST request and verifies that the server responds with a 500 status and an appropriate error message.

const response = await request(app)
 .post('/api/logs')
 .send({ message: 'Test log message' });

expect(response.status).toBe(500);
expect(response.body).toEqual({ error: 'Failed to save log' });

After each test, jest.clearAllMocks() is called to reset any mocked behavior, ensuring that the tests remain independent. Additionally, the test suite includes a clean-up step to address potential issues with open handles in Jest.

afterAll(async () => {
    await new Promise((resolve) => setTimeout(() => resolve(), 500)); // Avoid Jest open handle error
    app.closeServer();
});

This clean-up step also ensures that your CircleCI pipeline stops when the tests have finished running.
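
The per-test reset mentioned earlier is typically registered with an afterEach hook; here is a minimal sketch (the repository may wire it up differently):

afterEach(() => {
    jest.clearAllMocks(); // reset recorded calls on all mocks between tests
});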

stats-service tests

This test suite ensures that the chart view correctly handles normal responses, empty responses, and invalid routes.

The patch decorator replaces the requests.get method in the chart.views module with a mock object. This allows the tests to simulate API responses without making actual HTTP requests.

Additionally, the MockResponse class is defined to mimic the structure and behavior of responses from the requests library.

class MockResponse:
    def __init__(self, json_data, status_code):
        self.json_data = json_data
        self.status_code = status_code

    def json(self):
        return self.json_data

This class provides a json() method that returns the specified json_data, making it a suitable substitute for real API responses.
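
For context, the test methods discussed below live in a Django test class that provides self.client and resolves routes with reverse(). A minimal sketch of the assumed scaffolding (the class name and file location are assumptions; Django convention would place it in chart/tests.py):

from unittest.mock import patch
from django.test import TestCase
from django.urls import reverse


class ChartViewTests(TestCase):
    # MockResponse and the test methods discussed below are defined here
    ...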

The test_generate_pie_chart method tests the view’s behavior when the APIs return data. The mock_get.side_effect attribute is used to specify the sequence of responses for the two API calls.

@patch('chart.views.requests.get')
def test_generate_pie_chart(self, mock_get):
    mock_get.side_effect = [
        MockResponse([{'id': 1, 'animalName': 'Buddy'}], 200),
        MockResponse([{'id': 1, 'name': 'Dr. Mike'}], 200)
    ]
    response = self.client.get(reverse('generate_pie_chart'))
    self.assertEqual(response.status_code, 200)
  • The mock_get object is configured to return mock responses with data for animals and veterinarians.
  • The test uses Django’s client to send a GET request to the generate_pie_chart route.
  • The test verifies that the response status code is 200, indicating the view successfully processed the data and generated the chart.

The test_empty_api_response method checks how the view handles scenarios where the APIs return empty lists. The mock_get.side_effect attribute is again used to define the responses.

@patch('chart.views.requests.get')
def test_empty_api_response(self, mock_get):
    mock_get.side_effect = [MockResponse([], 200), MockResponse([], 200)]
    response = self.client.get(reverse('generate_pie_chart'))
    self.assertEqual(response.status_code, 500)
  • The mock responses return empty lists to simulate a scenario where no data is available.
  • A GET request is sent to the generate_pie_chart route.
  • The test expects a 500 status code, indicating that the view encountered an error due to the lack of data. This error is raised by the Matplotlib library when it is given empty data; in production-scale applications, you will need to handle such scenarios gracefully.

The test_invalid_route method tests how the application responds to requests for nonexistent routes.

def test_invalid_route(self):
    response = self.client.get('/invalid-route/')
    self.assertEqual(response.status_code, 404)

The test verifies that the response status code is 404, indicating that the route was not found.

CircleCI configuration for the tests

To maintain the integrity of the codebase across three distinct microservices, we will use CircleCI to automate the testing pipeline. CircleCI provides an excellent platform to streamline this process by managing dependencies and executing tests in parallel. This section explains how the CircleCI configuration file has been structured to accommodate the three microservices.

The CircleCI configuration file provided below is written in YAML and defines a CI/CD pipeline to automate testing for a project with three distinct services: an Express.js service (logs-service), a Spring Boot service (springboot-service), and a Django service (stats-service).

version: 2.1
orbs:
  node: circleci/node@7.1.0
  python: circleci/python@3.0.0
jobs:
  test-node:
    # Install node dependencies and run tests
    executor: node/default
    working_directory: ~/project/logs-service
    steps:
      - checkout:
          path: ~/project
      - node/install-packages:
          pkg-manager: npm
      - run:
          name: Run tests
          command: npm test -- --passWithNoTests
  test-java:
    docker:
      - image: cimg/openjdk:23.0.2
    working_directory: ~/project/springboot-service
    steps:
      - checkout:
          path: ~/project
      - run:
          name: Calculate cache key
          command: |-
            find . -name 'pom.xml' -o -name 'gradlew*' -o -name '*.gradle*' | \
                    sort | xargs cat > /tmp/CIRCLECI_CACHE_KEY
      - restore_cache:
          key: cache-{{ checksum "/tmp/CIRCLECI_CACHE_KEY" }}
      - run:
          command: mvn verify
      - store_test_results:
          path: target/surefire-reports
      - save_cache:
          key: cache-{{ checksum "/tmp/CIRCLECI_CACHE_KEY" }}
          paths:
            - ~/.m2/repository
  test-python:
    # Install dependencies and run tests
    docker:
      - image: cimg/python:3.13.2
    working_directory: ~/project/stats-service
    steps:
      - checkout:
          path: ~/project
      - python/install-packages
      - run:
          name: Run tests
          command: python manage.py test
workflows:
  build-and-test:
    jobs:
      - test-node
      - test-java
      - test-python

Here is an explanation of its contents.

The configuration starts with the version: 2.1 declaration, which specifies the CircleCI pipeline’s version. The orbs section simplifies the setup process by using pre-configured reusable elements for node and python environments, ensuring consistent and efficient configurations for the respective services.

Job definitions

The file defines three jobs: test-node, test-java, and test-python. Each job specifies the environment, dependencies, and commands required to run tests for one of the microservices.

Node.js service testing (test-node)

This job is responsible for testing the logs-service, which uses Node.js. It employs the node/default executor provided by the Node.js orb.

  • Working directory: The job operates in ~/project/logs-service, the root of the logs-service.
  • Steps:
    • Checkout code: Pulls the repository’s source code to ~/project.
    • Install dependencies: Uses the Node.js orb’s install-packages command with npm as the package manager to install required dependencies.
    • Run tests: Executes npm test -- --passWithNoTests to run the test suite. The -- separator forwards the --passWithNoTests flag to the underlying test runner, which keeps the pipeline from failing when no tests are found (see the sketch after this list).
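
Note that npm test simply runs the test script defined in the service’s package.json; a typical mapping (the exact script in the repository is an assumption) looks like this:

{
  "scripts": {
    "test": "jest"
  }
}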

Java service testing (test-java)

This job focuses on the springboot-service, written in Java using Spring Booot. It uses the Docker image cimg/openjdk:23.0.2 as the environment.

  • Working directory: The job runs in ~/project/springboot-service.
  • Steps:
    • Checkout code: Pulls the source code into the workspace.
    • Cache key calculation: Generates a checksum-based cache key from the build configuration files (pom.xml and Gradle scripts) to improve build performance.
    • Restore cache: Reuses dependencies from the cache if available.
    • Run tests: Executes mvn verify to build the project and run tests.
    • Store test results: Collects test reports from target/surefire-reports for visibility in CircleCI.
    • Save cache: Updates the cache with Maven dependencies for future builds.

Python service testing (test-python)

This job tests the stats-service, built using Python and Django. It uses the cimg/python:3.13.2 Docker image as the environment.

  • Working directory: Operates within ~/project/stats-service.
  • Steps:
    • Checkout code: Pulls the codebase to ~/project.
    • Install dependencies: Uses the Python orb’s install-packages step to install dependencies from the requirements.txt file.
    • Run Tests: Executes Django’s python manage.py test command to run all test cases.

Workflows

The workflows section orchestrates the execution of the defined jobs:

  • Workflow name: build-and-test.
  • Jobs:
    • The test-node job runs tests for the Node.js service.
    • The test-java job verifies and tests the Spring Boot service.
    • The test-python job executes Django service tests.

By running these jobs in parallel (default CircleCI behavior), the pipeline optimizes testing time, ensuring a swift feedback loop for developers.
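
Because none of the jobs declares a dependency, all three start at once. If you ever need an ordering instead, CircleCI’s requires key chains jobs together; here is a hedged sketch of a sequential variant:

workflows:
  build-and-test:
    jobs:
      - test-node
      - test-java:
          requires:
            - test-node   # start only after test-node succeeds
      - test-python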

Key features

  • Reusable orbs: Simplifies configurations for Node.js and Python environments, reducing boilerplate code.
  • Caching: Ensures faster builds for the Java service by leveraging Maven’s dependency cache.
  • Parallel testing: Executes tests for the three microservices concurrently, reducing overall pipeline runtime.
  • Standardized structure: Each service is tested in isolation with its dependencies, ensuring no interference across services.

This configuration file exemplifies a multi-language pipeline setup, seamlessly integrating Node.js, Java, and Python services in a CircleCI pipeline. It highlights the importance of maintaining isolated, efficient, and repeatable test environments for microservices. For the orb and image versions, check the CircleCI orb registry and convenience image pages for the latest releases.

The CircleCI pipeline in action

To implement the CircleCI pipeline for your project, follow these step-by-step instructions. This will ensure the pipeline is properly configured to run tests for all three microservices in your repository.

Create the .circleci directory

  1. Clone the repository:
    git clone https://github.com/CIRCLECI-GWP/circleci-microservices-unit-testing-demo.git
  2. Move into the cloned directory and check out the starter branch:
    cd circleci-microservices-unit-testing-demo && git checkout starter

If you want to view all the code together with the config file, you can stay on the master branch.

  3. At the root of the project directory, where the three microservices (logs-service, springboot-service, stats-service) are located, create a directory named .circleci. This is where the CircleCI configuration file will reside:
    mkdir .circleci

Add the YAML configuration file

  1. Inside the .circleci directory, create a file named config.yml.

  2. Open the config.yml file in your preferred text editor and paste the CircleCI configuration discussed previously.

Commit the configuration file

  1. Stage the newly added .circleci/config.yml file for commit:
    git add .circleci/config.yml
  2. Commit the changes with a descriptive message:
    git commit -m "Add CircleCI configuration for multi-service testing"

Push the changes to the repository

  1. Create the GitHub repository.
  2. Push the committed changes to the newly created remote repository:
   git push origin master

You can use any branch, as the CircleCI configuration isn’t restricted to a particular branch.

Create and trigger the pipeline

  1. Follow the documentation details on creating a project and linking it with GitHub. CircleCI will automatically detect the config.yml file and trigger the pipeline.
    • Monitor the pipeline execution in your CircleCI Dashboard.
    • You should see jobs for test-node, test-java, and test-python running in parallel as shown in the screenshot below.

Running pipeline jobs

If everything is OK, you should see a green badge labeled Success and the three jobs with checkmarks.

Successful jobs

To see a failed pipeline, you can try modifying the assertions in your tests. For example, modify the last line of the testAddVisit_RedirectsAndLogsVisit method of the AnimalVisitControllerTest class to this:

verify(logServiceInterceptorHelper).addLog("INFO: New animal visit added: Teddy");

Notice that we have changed the name to “Teddy” instead of “Buddy.”

Push the changes and go back to your CircleCI pipeline. The pipeline will fail.

Failed pipeline jobs

Click the failed job for detailed output about what caused the failure.

Failed job detailed pipeline

Undo the changes and push again to clear the error.

verify(logServiceInterceptorHelper).addLog("INFO: New animal visit added: Buddy");

With these steps completed, your CircleCI pipeline is now active and will automate testing for your project whenever changes are pushed. This streamlined setup ensures quick feedback on the stability of your microservices.

Conclusion

In this guide, we explored setting up a robust CircleCI pipeline to automate testing across multiple microservices. Key takeaways are:

  • Creating a well-structured .circleci/config.yml file.
  • Configuring jobs for Node.js, Java, and Python services.
  • Understanding how to manage dependencies and caching for efficient builds.

You can check out the complete source code used in this tutorial on GitHub.

With CircleCI, you can streamline your development workflow, ensuring code quality and reducing manual effort. Ready to boost your CI/CD process? Get started with CircleCI today and experience the power of automated testing!


Terrence Aluda is a software engineer and technical writer with a specialization in Android development. He uses his extensive experience to create engaging and informative tutorials highlighting his interests in technology.
