How to set up visual regression tests

updated on 12 April 2022

As covered in Visual regressions and how to prevent them, visual regression tests are a faster way to preserve your application’s health.

While SaaS solutions provide advanced visual regression and project management features, this article will focus on how to implement simple yet effective visual regression tests using Jest and Cypress.

We will set up visual regression tests on an example repository, with the following component:

const Avatar = ({ src, className }) => {
  const [fallback, setFallback] = React.useState(false);

  const onFallback = React.useCallback((e) => setFallback(true), []);

  return (
    <div className={className}>
      {fallback ? <AvatarFallback /> : <img src={src} onError={onFallback} />}
    </div>
  );
};

export default styled(Avatar)`
  margin: 10px;

  & img {
    border-radius: 50%;
  }
`;

Our `<Avatar />` component displays the given image with some styling, and falls back to a custom component if the image fails to load.

We will set up both snapshot and screen testing for this repository with integration at the Pull Request level.

Set up snapshot tests

Most snapshot test setups rely on Jest (a JavaScript testing framework), which provides a set of utilities for snapshot comparison.

Given a test scenario mounting a component, Jest takes an initial snapshot of the rendered DOM tree. Future runs of the same tests trigger snapshot comparisons that fail if a difference is found.


Installation and configuration

We need first to install the required dependencies:

yarn add --dev jest @testing-library/jest-dom @testing-library/react @testing-library/user-event react-test-renderer

Then, we need to add some entries in our `package.json`:

/* … */
"scripts": {
    "test-snapshot": "react-scripts test",
    /* … */
  },
  "jest": {
    "testMatch": [
      "**/?(*.)+(snapshot).(spec|test).[jt]s?(x)"
    ]
  },
/* … */

The `jest` entry contains the Jest configuration.

Since the example project is built with Create React App, we extend the existing Jest configuration by specifying the test file matching pattern with `testMatch`.

In our setup, we can run tests by calling `react-scripts test`. Without CRA, simply running the `jest` command as follows will run the tests:

/* … */
"scripts": {
    "test-snapshot": "jest",
    /* … */
  },
/* … */

Writing and running our tests

We can now write our `<Avatar />` snapshot test in a new `src/__tests__/snapshot/Avatar.snapshot.spec.jsx` file (the name must match the `testMatch` pattern defined earlier):

import React from "react";
import renderer from "react-test-renderer";
import Avatar from "../../Avatar";

describe("<Avatar />", () => {
  describe("with a valid image", () => {
    it("renders correctly", () => {
      const tree = renderer
        .create(
          <Avatar
            src={
              "https://i.picsum.photos/id/623/200/200.jpg?hmac=xquTjHIYmAPV3XEGlIUaV_KWyEofkbortxrK79jJhWA"
            }
          />
        )
        .toJSON();
      expect(tree).toMatchSnapshot();
    });
  });
  describe("with an invalid image", () => {
    it("renders correctly", async () => {
      const tree = renderer.create(<Avatar src={"<not_an_url>"} />).toJSON();
      expect(tree).toMatchSnapshot();
    });
  });
});

The `create()` function from `react-test-renderer` renders the given component, and `toJSON()` serializes the generated DOM tree as JSON.

Then, the DOM result is passed to Jest’s `toMatchSnapshot()` expectation matcher, which creates a snapshot file on the first run and raises a failure if a diff is detected against an existing snapshot.

We can now run our test with the `yarn test-snapshot` command:


A `src/__tests__/snapshot/__snapshots__/Avatar.snapshot.spec.jsx.snap` file is created (Jest places the `__snapshots__` folder next to the test file).

This snapshot file should be committed since it will be used as a reference snapshot for further tests (locally and on CI).

Set up the CI workflow

Our test setup is working locally. However, to ensure that no visual regression is introduced in our component, we must enforce those tests when new Pull Requests are created on GitHub.

Let’s create a new GitHub Actions workflow to enforce such a policy. Workflow files are a broad subject to cover; however, let’s have an overview of the main parts of the `.github/workflows/snapshot-test.yml` file:

  • We define a `strategy.matrix` to run our tests on multiple versions of Node.js (12.x and 17.x)
  • We configure caches for yarn dependencies and Jest

Finally, we run the tests.
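Put together, a minimal version of such a workflow might look like the sketch below. The action versions, cache path, and job name are assumptions for illustration, not taken from the original file:

```yaml
name: Snapshot tests
on: pull_request

jobs:
  snapshot-tests:
    runs-on: ubuntu-latest
    strategy:
      matrix:
        node-version: [12.x, 17.x]
    steps:
      - uses: actions/checkout@v2
      - uses: actions/setup-node@v2
        with:
          node-version: ${{ matrix.node-version }}
          cache: "yarn" # caches yarn dependencies between runs
      - uses: actions/cache@v2
        with:
          # Jest cache location (assumption; see `jest --showConfig` for the real path)
          path: /tmp/jest_cache
          key: ${{ runner.os }}-jest-${{ matrix.node-version }}
      - run: yarn install --frozen-lockfile
      # GitHub Actions sets CI=true, so react-scripts runs tests once instead of in watch mode
      - run: yarn test-snapshot
```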

Testing the workflow

The following Pull Request showcases how snapshot tests are integrated into the CI workflow.

This Pull Request introduces a breaking change by inverting the condition on the `<Avatar>` fallback mechanism.

Immediately, the snapshot tests catch the visual regression by reporting the DOM differences introduced by this change:


Note: If the snapshot changes were intentional, running the tests locally with the `--updateSnapshot` flag will update the snapshots with the new changes: `yarn test-snapshot --updateSnapshot`.

Once the updated snapshots are committed, the CI tests will pass.

Snapshot test limitations

Looking closely at our snapshot tests reveals two main limitations.

First, we can see in our generated snapshot file (on master) that the “<Avatar /> with an invalid image renders correctly” test case does not generate the expected snapshot:

exports[`<Avatar /> with an invalid image renders correctly 1`] = `
<div
  className="sc-bdvvtL oSqWo"
>
  <img
    onError={[Function]}
    src="<not_an_url>"
  />
</div>
`;

This first limitation stems from the difficulty of testing image-loading scenarios in Jest: the test environment does not actually fetch images, so the `onError` handler never fires.

Properly testing a “failing to load image” scenario with Jest would require mocking many parts of the code. Screen tests, on the other hand, run headless browser instances and can exercise such behavior naturally.

Second, any breaking change at the styling level raises cryptic snapshot changes, as shown in this Pull Request:


The resulting snapshot test error is not very helpful for understanding which change introduced the issue, or what the final visual regression looks like for end-users.

While some Jest plugins such as `jest-styled-components` help deal with this issue, screen testing is more appropriate for assessing visual regression on UI-Kit components.

Set up screen tests

As seen earlier, we need to confirm that our `<Avatar>` UI-Kit component has no visual regression of any kind, from styling issues to advanced DOM interactions (an image failing to load).

Because they compare actual screenshots of rendered components, screen tests are the best way to avoid visual regressions in our component.

Screen tests share the same core logic as snapshot tests, but compare screenshots instead of serialized DOM trees.


Screen tests rely on Cypress (a JavaScript end-to-end testing framework) to start a headless browser instance and take screenshots of the rendered components.

Installation and configuration

The following libraries are required to perform screenshot tests on components:

  • Cypress Component Testing: a feature of Cypress that allows running tests of components without having to set up a whole application (usually required for E2E testing)
  • `cypress-image-snapshot`: A Cypress plugin that brings the screenshot capabilities

Let’s install all the required dependencies:

yarn add --dev @cypress/react @cypress/webpack-dev-server cypress cypress-image-snapshot

Then, we create a `cypress.json` file to specify the test file pattern and location:

{
  "component": {
    "componentFolder": "src",
    "testFiles": "**/*.visual.spec.{js,jsx,ts,tsx}"
  },
  "video": false // Cypress will record videos by default
}

Again, we need to update our `package.json` to add a new script entry:

/* … */
  "scripts": {
    "test-visual": "cypress run-ct --reporter cypress-image-snapshot/reporter",
    /* … */
  },
/* … */

Finally, we need to properly configure the `cypress-image-snapshot` plugin by creating two files: the Cypress plugins file and the support file.
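Following the `cypress-image-snapshot` README, the plugins file registers the Node-side image comparison while the support file adds the `cy.matchImageSnapshot()` command. A minimal sketch of these two files, assuming the default Cypress file layout:

```js
// cypress/plugins/index.js: register the Node-side plugin
const {
  addMatchImageSnapshotPlugin,
} = require("cypress-image-snapshot/plugin");

module.exports = (on, config) => {
  addMatchImageSnapshotPlugin(on, config);
};
```

```js
// cypress/support/index.js: add the cy.matchImageSnapshot() command
import { addMatchImageSnapshotCommand } from "cypress-image-snapshot/command";

addMatchImageSnapshotCommand();
```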

Writing and running our tests

We can now define our `<Avatar />` screen tests as follows:

/* eslint-disable no-undef */
import React from "react";
import { mount } from "@cypress/react";
import Avatar from "./Avatar";

describe("<Avatar />", () => {
  describe("with a valid image", () => {
    it("renders the proper UI", () => {
      mount(
        <Avatar
          src={
            "https://i.picsum.photos/id/623/200/200.jpg?hmac=xquTjHIYmAPV3XEGlIUaV_KWyEofkbortxrK79jJhWA"
          }
        />
      );
      cy.matchImageSnapshot();
    });
  });
  describe("with an invalid image", () => {
    it("renders the proper UI", () => {
      mount(<Avatar src={"<not_an_url>"} />);
      cy.matchImageSnapshot();
    });
  });
});

The above tests are similar to the snapshot tests we saw earlier.

The only difference is that we mount the component with Cypress and call an action on `cy` (the Cypress runner): `cy.matchImageSnapshot()`.

Similar to Jest’s `toMatchSnapshot()` method, `cy.matchImageSnapshot()` will take a screenshot of the rendered component and raise an error if any difference is found with the existing one.
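If minor rendering differences across machines cause flaky failures, `cy.matchImageSnapshot()` accepts the comparison options of the underlying `jest-image-snapshot` library; the values below are illustrative, not recommendations:

```js
// Tolerate up to 0.1% of differing pixels before failing the test
cy.matchImageSnapshot({
  failureThreshold: 0.001,
  failureThresholdType: "percent", // or "pixel" for an absolute count
});
```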

We can now run our screen test with the `yarn test-visual` command:

image2-8tq3j

Our tests generate one screenshot file per test case (each `it()` block), and these screenshots must be committed.

Set up the CI workflow

Again, we need to integrate those tests into the development workflow by creating the corresponding GitHub Actions workflow.

The new `.github/workflows/screen-tests.yml` file contains the following parts:

  • The `container` parameter initializes a Docker image provided by Cypress that includes all required operating-system dependencies and pre-installed browsers to run the tests
  • The “Store screenshots” action ensures that screenshots will be downloadable
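Combining those parts, a minimal version of this workflow could look like the sketch below. The container image tag, action versions, and artifact path are assumptions for illustration:

```yaml
name: Screen tests
on: pull_request

jobs:
  screen-tests:
    runs-on: ubuntu-latest
    # Cypress-provided image with browsers pre-installed (tag is an example)
    container: cypress/browsers:node16.5.0-chrome94-ff93
    steps:
      - uses: actions/checkout@v2
      - run: yarn install --frozen-lockfile
      - run: yarn test-visual
      - name: Store screenshots
        uses: actions/upload-artifact@v2
        if: failure() # only upload when a diff was detected
        with:
          name: screenshots
          # cypress-image-snapshot output folder (assumption)
          path: cypress/snapshots
```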

Testing the workflow

By creating a PR that, again, introduces a visual regression by inverting the fallback logic, we can see the following failing screen test results:



Then, by following the test suite link, we can download the generated screenshots in the Artifacts section:


The downloaded `.zip` file contains all original screenshots along with the diff outputs:


We can see that our “<Avatar /> with a valid image renders the proper UI” scenario is no longer behaving as expected.

Note: As with snapshot tests, running `yarn test-visual --env updateSnapshots=true` locally will update the screenshots with the new changes. Once the updated screenshots are committed, the CI tests will pass.

Conclusion

Snapshot tests are the fastest way to catch any regression related to:

  • Presence of a given element on a page (e.g., the sorting button of an actionable list)
  • Displayed text in a given context (i18n regression testing)
  • Rendering of components after simple interactions (e.g., filtering a list)

However, snapshot tests are not the best match for catching visual regressions caused by pure CSS issues such as overflows, or by advanced interactions like image loading.

Although slower to run, screen tests, by diffing screenshots, help catch advanced visual regressions that cannot be spotted at the DOM level.

Finally, keep in mind that it might be worth looking at SaaS solutions if your project relies on a growing number of visual regression tests (beyond the critical path) or if non-technical people get involved in the project.

Meticulous

Meticulous is a tool to catch JavaScript regressions in web applications, with zero maintenance burden. Sort of like a self-writing Selenium test.

Insert a single line of JavaScript onto your local dev, QA and staging sites. This snippet records user sessions. Meticulous then tests your new code by replaying those user sessions to automatically find frontend regressions (JS exceptions, visual diffs) for you before they hit production. It does this without ever hitting your backend or causing any side effects. You can also view replays on new code to manually ascertain whether flows have broken. 

Find out more and watch a 60-second demo at www.meticulous.ai

Authored by Charly Poly
