Modern Web Development Setup – The Testing Environment Using Kontena (2/2)

Read the fourth article in the series of guest blogs about Modern Web Development Setup by Kontena Community blogger Juha Kärnä. This post continues discussing the testing environment setup.

Test Strategy for the Authentication Service

I come from the embedded software world, where out-of-the-box solutions rarely exist and setting up proper acceptance or integration test environments usually requires physical devices and quite a bit of in-house software to get everything working. Compared to that, the web development world is quite different: various testing services and tools make high-level system testing possible with far fewer resources.

Based on various web discussion forums and blogs, it seems that testing is seen as a crucial part of web development, and test automation is highly recommended. One of the debated topics is the cost of change in different stages of the development process and whether or not the old research results are still valid for modern agile web development. I couldn't find any relevant and recent studies on the subject, but based on my personal experience, issues which are found late in the deployment chain or end up in production are expensive to fix. While major issues may have dire financial and legal consequences, minor issues also cause extra work for the development teams, release organisation, customer service and so on.

So testing is an investment: it finds costly issues sooner rather than later and helps deliver software that meets the set requirements and regulations. Personally, I also like to think of testing as part of the software project's risk management strategy. One of the usual challenges with software projects is estimating the risks involved in releasing new content, and while various factors affect the decision, the test results and other insights gained during testing play a crucial part. While on paper it sounds like a nice idea to test the software completely and fix all the found issues, it's rarely feasible financially.

Testing should be designed and done with great care. While solo developers and small teams may be tempted to cut corners in their testing efforts, they can also work efficiently and reap great benefits with thoughtful test design and test automation.

Unit Testing

With unit testing the target is to test and verify the functionality of small pieces of code - usually units, modules or functions - in an isolated environment. In web development, it's usually easy to automate unit tests and run them very quickly. While unit tests are fast to write, they also need active maintenance, since code changes usually break them as well, for better or worse.
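As a minimal sketch of what a unit looks like in this sense, consider a small pure function (the validator below is a hypothetical example, not part of the actual services); a Jest test would simply import it and assert its outputs for known inputs, with no database or network involved:

```javascript
// Hypothetical unit under test: a small, pure, easily isolated function.
// A Jest test file would import this and assert e.g.
//   expect(isValidUsername('juha')).toBe(true);
function isValidUsername(name) {
  // Accept 3-20 characters: letters, digits or underscores.
  return typeof name === 'string' && /^\w{3,20}$/.test(name);
}

module.exports = isValidUsername;
```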

API Testing

The API testing article in Wikipedia explains quite well what API testing is, why it's done and what its role in the test stack is. API testing can be performed from various perspectives like usability, performance, reliability, security and documentation. Here we concentrate on the functionality and reliability aspects and create tests to verify the returned values are as expected based on the given input data.

One of the debates I've encountered in web forums is whether or not to run the API tests with a database and possibly other 3rd party connections. I considered mocking the database connection so that the basic API tests would have been a bit more straightforward to set up in the local and testing environments, but I felt I'd get more out of the testing effort by adding the database into the mix. This way, possible issues with database connections and queries should be caught before acceptance testing, and the acceptance tests can focus on the big picture. By keeping the microservice architecture in mind while adding new features to the system, the API test set for each service should stay within reasonable boundaries.

I realise that API testing with a database is quite easy to set up in our environment, and since the CI also supports the selected PostgreSQL out-of-the-box, complicated configuration isn't needed. The setup could be quite a bit more difficult with services requiring different databases and other external services, so in addition to ideological reasons, there may also be technical reasons for running the API tests without a database connection.

Acceptance Testing

At this level the functionality of the system as a whole is verified. We're going to write mostly positive system tests which also function as system acceptance tests. We'll simulate real user actions through the front end component and see if the system behaves as expected.

It's good to run the acceptance tests in environments similar to the testing and production environments. To enable this, we'll set up a new Kontena Node for acceptance testing purposes. We can use the already existing Kontena Master server for orchestrating the acceptance testing operations, but we need to create a new Kontena Grid to encapsulate the acceptance testing environment. The acceptance test setup and acceptance testing will continue in upcoming articles.

The Testing Pyramid and Its Shortcomings

The testing pyramid represents an idea about the distribution of automated tests in a test stack: there should be more low-level tests than high-level tests. Low-level tests are preferred over high-level tests since - in general - they can pinpoint issues more accurately, they are faster to write, faster to run and less prone to fail due to random external factors.

Automated Testing Pyramid by Alister Scott, watirmelon.com

Writing tests is like cementing the implementation, for better or worse. The initially built bamboo hut may be just fine while the sun is shining and the wind is calm. But when autumn comes, the rain starts pouring and the wind picks up, it's definitely desirable to have some stone and concrete in the walls and foundations. Then again, when the time comes to renovate the house or move it to a better location, the concrete becomes a burden: it takes more effort to make changes.

Even if the idea of the testing pyramid is sound in general, it lacks the concept of project risk, which matters a lot in the early stages of software product development. If the product is still finding its shape, and the feasibility of the product or service itself is still uncertain, implementing tests by blindly following the test pyramid approach can be inefficient. For example, if the product contains simple CRUD-style services, it might be just fine to favour automated integration or acceptance tests over unit tests and use the resources for mitigating other risks. Looking at the test distribution at this stage of product development, the testing pyramid might be completely inverted.

That being said, it's highly likely - and desirable - that the test distribution progresses towards the test pyramid shape as the product matures. If high-level testing is favoured for the wrong reasons, like getting-stuff-done-and-tested-fast!, it may backfire in the long run. Maybe it's my exposure to the dark side of software project management, or maybe it's just my lack of web development experience, but I see the testing pyramid as a common by-product of successful projects, rather than a target in itself. In the big picture, development investments should be based on the project targets and the identified risks, regardless of which level those risks are on.

Selected Testing Strategy

We'll use Behaviour-Driven Development (BDD) and Test-Driven Development (TDD) where possible to promote the testing aspect of the development. While I recommend trying out BDD and TDD, a friend of mine pointed out that they are challenging to use in practice, and superficial trials with BDD and TDD are doomed to fail. I agree. Taking a course - together with the people involved - and getting a guru developer or an external coach to help with the initial challenges would be a good way to start the trial.

Our target is to cover 100% of the code we write with automated API and unit tests and to enforce this requirement in the CI builds. In general, it's reasonable to run every line of written code via automated tests at least once. If some part isn't covered by the tests, it's a good opportunity to consider whether the code is needed at all, or whether there is a valid reason to keep the code but exclude it from the test coverage calculations. This active add-more-tests-or-ignore decision is important: keeping the test coverage at 100% makes it immediately visible when some part of the code isn't covered by the tests.
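Jest's coverage collection is based on Istanbul, so a deliberate ignore decision can be recorded right next to the code with an ignore comment. A sketch of the idea (the function and its defensive branch are hypothetical examples, not code from the services):

```javascript
// Hypothetical example of an explicit ignore decision: the defensive
// branch below is hard to trigger via the API, so it's excluded from
// coverage on purpose - and the decision is visible to every reviewer.
function loadConfig(env) {
  if (env === 'production' || env === 'development') {
    return { env };
  }
  /* istanbul ignore next */
  throw new Error(`Unknown environment: ${env}`);
}

module.exports = loadConfig;
```

With a global 100% coverageThreshold in the Jest config, an unannotated, untested branch like this would fail the build instead of silently lowering the coverage number.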

While our target is to cover 100% of the code with automated tests, we won't automate all the tests we'll make. We'll use additional tools which are better suited for fast exploratory testing to try out different kinds of parameters and corner cases. This should keep the number of automated test cases within reasonable limits and reduce the overall time needed for testing. In addition to automated testing, we'll also validate the code against the set ESLint coding conventions and enforce these conventions in the CI builds.

In upcoming articles we'll write the acceptance tests and work down from there. The implemented service components are quite simple, and I feel the TDD approach - but favouring API tests over unit tests - is the most efficient way forward. So we'll use the API tests to cover the basic positive and negative cases, and unit tests to cover hard-to-repeat error cases and otherwise complicated functionality, if needed. Regarding the acceptance tests, we'll most likely leave cross-browser test automation out of this setup.

Setting Up the Test Environment for the Front Service

Selected Test Tools

For testing purposes there are quite a few different solutions available. Many of those solutions would be suitable for our needs, but instead of using different solutions for each testing need, I wanted to find a uniform solution.

In the end I selected Jest to fill the testing framework role, since, coupled with other libraries, it can be used for acceptance testing, API testing and unit testing of both back and front end services.

For the front end unit testing we'll also use Enzyme, which helps with shallow rendering, for instance. We'll also use Grunt to help with configuring and running the tests.

Unit Test Setup for the Front Service

My experience with Next.js is limited, so it's good that Zeit provides an example for testing Next.js applications with Jest and Enzyme. We'll base the testing on the provided example. So let's start by installing a bunch of needed npm modules:

front-service$ npm install enzyme enzyme-to-json react-test-renderer jest-cli jest-junit babel-jest react-addons-test-utils grunt grunt-env grunt-run --save-dev

Once the modules are installed, we'll configure Jest for our needs:

{
  "collectCoverage": true,
  "collectCoverageFrom": [
    "pages/**/*.js",
    "src/**/*.js"
  ],
  "coverageDirectory": "shippable/codecoverage",
  "coverageReporters": [
    "text",
    "cobertura"
  ],
  "coverageThreshold": {
    "global": {
      "branches": 100,
      "functions": 100,
      "lines": 100,
      "statements": 100
    }
  },
  "rootDir": "../../",
  "testMatch": [
    "**/?(*.)(test).js?(x)"
  ],
  "testResultsProcessor": "./node_modules/jest-junit"
}

With the config above we enabled test coverage collection and set the coverage reporters to text and Cobertura, the latter being a format accepted by Shippable. We also set a global 100% coverage threshold, so the test run fails if any part of the code is left uncovered.

Grunt is configured using a gruntfile.js:

module.exports = function (grunt) {  
  grunt.initConfig({
    env: {
      ci: {
        NODE_ENV: 'continuous_integration',
        JEST_JUNIT_OUTPUT: './shippable/testresults/results.xml',
      },
    },
    run: {
      eslint: {
        exec: 'eslint .',
      },
      test: {
        exec: 'jest --config config/jest/jest.json',
      },
      test_and_update: {
        exec: 'jest --updateSnapshot --config config/jest/jest.json',
      },
    },
  });

  grunt.loadNpmTasks('grunt-env');
  grunt.loadNpmTasks('grunt-run');

  grunt.registerTask('default', [
    'env:ci',
    'run:test',
    'run:eslint',
  ]);

  grunt.registerTask('test_and_update', [
    'env:ci',
    'run:test_and_update',
    'run:eslint',
  ]);
};

With the file above we created a task for running the tests normally, and another task for running the tests and updating the Jest snapshots. With the help of the env and run modules we set the environment variables and run the needed commands.

To run the configured grunt tasks we’ll update the related package.json test scripts:

  "scripts": {
    "build": "next build",
    "start": "next start -p 3300",
    "start:dev": "next -p 3300",
    "test": "grunt",
    "test:update": "grunt test_and_update"
  },

In addition to the standard test script, we added test:update for updating the Jest snapshots.

We also need to add an extra babel preset for Next.js to make the Jest tests run. We’ll do this by adding a .babelrc file into the project root folder:

{
  "presets": ["next/babel"],
  "env": {
    "continuous_integration": {
      "presets": [
        ["env", { "modules": "commonjs" }],
        "next/babel"
      ]
    }
  }
}

Lastly, we need to tune our Dockerfile.prod a bit, since at the time of writing Next.js doesn't support test files placed in the /pages folder (#988, #1914). Due to this we'll remove the test files from the container before the Next.js production build operation.

COPY . /src

# Remove test files from /pages folder before build operation: https://github.com/zeit/next.js/issues/988
RUN find /src/pages -type f -name "*.test.js" -exec rm -f {} \;

RUN cd /src && npm run build  

Simple Unit Test Setup for the Front Service

We aren’t going to dive into the component implementation and testing just yet, but let’s write simple unit tests to make sure the test framework is working:

import React from 'react';  
import { shallow } from 'enzyme';  
import toJson from 'enzyme-to-json';  
import MyComponent from './index';

describe('Unit test example', () => {  
  test('Should show the welcome text', () => {
    const wrapper = shallow(
      <MyComponent />);
    expect(wrapper.text()).toEqual('Hi there!');
  });

  test('Should match with the snapshot', () => {
    const wrapper = shallow(
      <MyComponent />);
    expect(toJson(wrapper)).toMatchSnapshot();
  });
});

The first test checks that the Hi there! text is found in the rendered component. The second test compares the component's rendered output with an existing snapshot (creating one on the first run) and fails if the output no longer matches the snapshot.

Setting Up the Test Environments for the App and Auth Services

Selected Test Tools

The unit and API testing setups – at this stage – are identical for both back end services, so we’ll do the setup only for the App Service here; the Auth Service sources are available in GitLab, as always.

In the beginning I used Mocha & Istanbul for unit testing the back end components, and they worked fine. However, I wanted to see if Jest could also be used as a unit and API testing framework for the back end components, to keep the testing tech stack simple and consistent. Based on my experience Jest works just fine, and the setup was slightly more straightforward compared to Mocha & Istanbul, since Jest provides many features out-of-the-box.

Unit and API Test Setup for the App and Auth Services

Let’s first install the needed modules:

app-service$ npm install grunt grunt-env grunt-run jest-cli jest-junit supertest --save-dev  

We’ll add two test configurations. We’ll set the regular test script to run unit tests, API tests check the test coverage and coding conventions. Due to this – later on – we need access to a database for the API tests, but that shouldn’t be an issue, since we can run the system via docker-compose. This test configuration is also run, as it is, by the CI service.

The second test script test:unit is a bit simpler since it runs only the unit tests. These test configurations are quite straightforward and easy to customise for various needs.

Jest configuration files - the first (jest.json) for the full test run and the second (jest_unit_tests.json) for running only the unit tests:

{
  "collectCoverage": true,
  "collectCoverageFrom": [
    "app/**/*.js",
    "index.js"
  ],
  "coverageDirectory": "shippable/codecoverage",
  "coverageReporters": [
    "text",
    "cobertura"
  ],
  "coverageThreshold": {
    "global": {
      "branches": 100,
      "functions": 100,
      "lines": 100,
      "statements": 100
    }
  },
  "rootDir": "../../",
  "testEnvironment": "node",
  "testMatch": [
    "**/?(*.)(apitest|test).js?(x)"
  ],
  "testResultsProcessor": "./node_modules/jest-junit"
}

{
  "collectCoverage": false,
  "rootDir": "../../",
  "testEnvironment": "node",
  "testMatch": [
    "**/?(*.)(test).js?(x)"
  ]
}

We’ll use .apitest.js naming for the API test files so we also need to modify *.eslintrc a bit:

      "functions": "never"
    }],
    "import/no-extraneous-dependencies": ["error", {"devDependencies": ["**/*.test.js","**/*.apitest.js"]}]
  },
  "env": {

The needed gruntfile.js is quite simple at this stage:

module.exports = function (grunt) {  
  grunt.initConfig({
    env: {
      ci: {
        NODE_ENV: 'continuous_integration',
        JEST_JUNIT_OUTPUT: './shippable/testresults/results.xml',
      },
    },
    run: {
      eslint: {
        exec: 'eslint .',
      },
      test: {
        exec: 'jest --config config/jest/jest.json',
      },
      test_unit_only: {
        exec: 'jest --config config/jest/jest_unit_tests.json',
      },
    },
  });

  grunt.loadNpmTasks('grunt-env');
  grunt.loadNpmTasks('grunt-run');

  grunt.registerTask('default', [
    'env:ci',
    'run:test',
    'run:eslint',
  ]);

  grunt.registerTask('unit_tests_only', [
    'env:ci',
    'run:test_unit_only',
  ]);
};

Simple Unit and API Tests for the App and Auth Services

To make sure the test framework and the related modules work, let's write simple unit and API tests. To perform the tests, we need to export the server in index.js:

server.listen(SERVER_PORT, () => {  
  console.log(`Server running on: ${JSON.stringify(server.address(), undefined, 2)}`);
});

module.exports = server;  

With the following unit test we’ll try out Jest’s mocking capabilities, even if the usefulness of the test is a bit questionable. We’ll put the tests into the project root for now:

/* eslint-disable global-require */

describe('Unit test example', () => {  
  const mockCreate = jest.fn();
  const mockListen = jest.fn();

  beforeEach(() => {
    jest.mock('restify', () => {
      const restify = jest.genMockFromModule('restify');
      const dummyServer = { listen: mockListen };
      mockCreate.mockReturnValueOnce(dummyServer);
      restify.createServer = mockCreate;
      return restify;
    });
  });

  test('Should start Restify with right function calls', () => {
    require('./index.js');
    expect(mockCreate).toHaveBeenCalledTimes(1);
    expect(mockListen).toHaveBeenCalledTimes(1);
  });
});

Simple API test for demonstration purposes:

/* eslint-disable global-require */
global.console = { log: jest.fn() };  
const request = require('supertest');

describe('API test example', () => {  
  let server;

  beforeAll(() => {
    server = require('./index');
  });

  afterAll(() => {
    server.close();
  });

  test('Should return a client error for #get /', done => (
    request(server)
    .get('/')
    .expect(404, {
      code: 'ResourceNotFound',
      message: '/ does not exist',
    })
    .end((err) => {
      if (err) return done.fail(err);
      return done();
    })
  ));
});

Test script update for the package.json:

  "scripts": {
    "start": "node index.js",
    "start:dev": "node-dev index.js",
    "test": "grunt",
    "test:unit": "grunt unit_tests_only"
  },

After these configurations, npm test and npm run test:unit should run and pass.

Summary

We now have the unit and API test tools in place for upcoming component development efforts. I was planning to start the component level implementations in the next articles, but I feel it would be better to set up the acceptance test facilities first. So in the next article we’ll prepare the acceptance test environment, integrate it with the CI, and plan the acceptance tests for the Authentication Service.

About Kontena

Want to learn about real life use cases of Kontena, case studies, best practices, tips & tricks? Need some help with your project? Want to contribute to a project or help other people? Join Kontena Forum to discuss more about Kontena Platform, chat with other happy developers on our Slack discussion channel or meet people in person at one of our Meetup groups located all around the world. Check Kontena Community for more details.

Image Credits: Juha Kärnä.

Juha Kärnä
