matplotlib.testing

Helper functions for testing.

matplotlib.testing.is_called_from_pytest()[source]

[Deprecated] Whether we are in a pytest run.

Notes

Deprecated since version 3.2.

matplotlib.testing.set_font_settings_for_testing()[source]
matplotlib.testing.set_reproducibility_for_testing()[source]
matplotlib.testing.setup()[source]
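
These three helpers take no arguments. A minimal sketch of calling setup() once per test from a pytest conftest.py (the fixture name and the autouse choice are illustrative, not part of Matplotlib's API):

import pytest

import matplotlib.testing


@pytest.fixture(autouse=True)
def _mpl_test_settings():
    # Assumed to configure Matplotlib for deterministic test output,
    # applying the font and reproducibility settings provided above.
    matplotlib.testing.setup()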

matplotlib.testing.compare

Provides a collection of utilities for comparing (image) results.

matplotlib.testing.compare.compare_images(expected, actual, tol, in_decorator=False)[source]

Compare two "image" files checking differences within a tolerance.

The two given filenames may point to files which are convertible to PNG via the converter dictionary. The underlying RMS is calculated with the calculate_rms function.

Parameters:
expected : str

The filename of the expected image.

actual : str

The filename of the actual image.

tol : float

The tolerance (a color value difference, where 255 is the maximal difference). The test fails if the average pixel difference is greater than this value.

in_decorator : bool

Determines the output format. If called from the image_comparison decorator, this should be True. (default=False)

Returns:
comparison_result : None or dict or str

Return None if the images are equal within the given tolerance.

If the images differ, the return value depends on in_decorator. If in_decorator is true, a dict with the following entries is returned:

  • rms: The RMS of the image difference.
  • expected: The filename of the expected image.
  • actual: The filename of the actual image.
  • diff_image: The filename of the difference image.
  • tol: The comparison tolerance.

Otherwise, a human-readable multi-line string representation of this information is returned.

Examples

from matplotlib.testing.compare import compare_images

img1 = "./baseline/plot.png"
img2 = "./output/plot.png"
# None if equal within tolerance, otherwise a description of the difference.
result = compare_images(img1, img2, 0.001)
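
When called with in_decorator=True, a failing comparison instead returns the dict described above; a minimal sketch (the paths are the illustrative ones from the example):

result = compare_images(img1, img2, 0.001, in_decorator=True)
if result is not None:
    # The dict carries the RMS, the tolerance, and the involved filenames.
    print(f"RMS {result['rms']} exceeds tolerance {result['tol']}")
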
matplotlib.testing.compare.comparable_formats()[source]

Return the list of file formats that compare_images can compare on this system.

Returns:
supported_formats : list of str

E.g. ['png', 'pdf', 'svg', 'eps'].
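
Examples

A sketch of using the returned list to skip tests for formats that cannot be compared on the current system (the helper name and skip message are illustrative):

import pytest

from matplotlib.testing.compare import comparable_formats


def require_comparable(fmt):
    # Skip the calling test when no converter for this format is available.
    if fmt not in comparable_formats():
        pytest.skip(f"Cannot compare {fmt} files on this system")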

matplotlib.testing.decorators

class matplotlib.testing.decorators.CleanupTestCase(methodName='runTest')[source]

Bases: unittest.case.TestCase

A wrapper for unittest.TestCase that includes cleanup operations.

Create an instance of the class that will use the named test method when executed. Raises a ValueError if the instance does not have a method with the specified name.

classmethod setUpClass()[source]

Hook method for setting up class fixture before running tests in the class.

classmethod tearDownClass()[source]

Hook method for deconstructing the class fixture after running all tests in the class.
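
Examples

A minimal sketch of a unittest-style test class built on this wrapper (the test body is illustrative; the class fixtures are assumed to reset global Matplotlib state around the tests):

import matplotlib.pyplot as plt

from matplotlib.testing.decorators import CleanupTestCase


class TestSimplePlot(CleanupTestCase):
    def test_line_count(self):
        fig, ax = plt.subplots()
        ax.plot([0, 1, 2], [1, 3, 5])
        self.assertEqual(len(ax.lines), 1)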

matplotlib.testing.decorators.check_figures_equal(*, extensions=('png', 'pdf', 'svg'), tol=0)[source]

Decorator for test cases that generate and compare two figures.

The decorated function must take two arguments, fig_test and fig_ref, and draw the test and reference images on them. After the function returns, the figures are saved and compared.

This decorator should be preferred over image_comparison when possible in order to keep the size of the test suite from ballooning.

Parameters:
extensions : list, default: ["png", "pdf", "svg"]

The extensions to test.

tol : float

The RMS threshold above which the test is considered failed.

Examples

Check that calling Axes.plot with a single argument plots it against [0, 1, 2, ...]:

@check_figures_equal()
def test_plot(fig_test, fig_ref):
    fig_test.subplots().plot([1, 3, 5])
    fig_ref.subplots().plot([0, 1, 2], [1, 3, 5])
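
Tests that exercise text rendering can restrict the formats and allow a small RMS tolerance; a sketch with illustrative values:

@check_figures_equal(extensions=["png"], tol=0.03)
def test_title(fig_test, fig_ref):
    fig_test.subplots().set_title("title")
    fig_ref.subplots().set_title("title")
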
matplotlib.testing.decorators.check_freetype_version(ver)[source]

matplotlib.testing.decorators.cleanup(style=None)[source]

A decorator to ensure that any global state is reset before running a test.

Parameters:
style : str, dict, or list, optional

The style(s) to apply. Defaults to ["classic", "_classic_test_patch"].
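
Examples

A minimal sketch of decorating a test function, assuming the decorator confines rcParams changes made by the test (the test body is illustrative):

import matplotlib.pyplot as plt

from matplotlib.testing.decorators import cleanup


@cleanup(style="classic")
def test_plot_in_classic_style():
    # Global state changed here is reset by the decorator's cleanup.
    plt.rcParams["lines.linewidth"] = 3
    fig, ax = plt.subplots()
    ax.plot([0, 1, 2], [0, 1, 4])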

matplotlib.testing.decorators.image_comparison(baseline_images, extensions=None, tol=0, freetype_version=None, remove_text=False, savefig_kwarg=None, style=('classic', '_classic_test_patch'))[source]

Compare images generated by the test with those specified in baseline_images, which must correspond; otherwise, an ImageComparisonFailure exception will be raised.

Parameters:
baseline_images : list or None

A list of strings specifying the names of the images generated by calls to matplotlib.figure.Figure.savefig().

If None, the test function must use the baseline_images fixture, either as a parameter or with pytest.mark.usefixtures. This value is only allowed when using pytest.

extensions : None or list of str

The list of extensions to test, e.g. ['png', 'pdf'].

If None, defaults to all supported extensions: png, pdf, and svg.

When testing a single extension, it can be directly included in the names passed to baseline_images. In that case, extensions must not be set.

In order to keep the size of the test suite from ballooning, we only include the svg or pdf outputs if the test is explicitly exercising a feature dependent on that backend (see also the check_figures_equal decorator for that purpose).

tol : float, optional, default: 0

The RMS threshold above which the test is considered failed.

freetype_version : str or tuple

The expected freetype version or range of versions for this test to pass.

remove_text : bool

Remove the title and tick text from the figure before comparison. This is useful to make the baseline images independent of variations in text rendering between different versions of FreeType.

This does not remove other, more deliberate, text, such as legends and annotations.

savefig_kwarg : dict

Optional arguments that are passed to the savefig method.

style : str, dict, or list

The optional style(s) to apply to the image test. The test itself can also apply additional styles if desired. Defaults to ["classic", "_classic_test_patch"].
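
Examples

A minimal sketch of an image-comparison test; the baseline image name is illustrative, and the corresponding baseline file (e.g. sine.png) is assumed to already exist in the test suite's baseline-image directory:

import matplotlib.pyplot as plt

from matplotlib.testing.decorators import image_comparison


@image_comparison(baseline_images=["sine"], extensions=["png"], remove_text=True)
def test_sine():
    # Figures created during the test are saved and compared against the
    # named baseline images.
    fig, ax = plt.subplots()
    ax.plot([0, 1, 2, 3], [0, 1, 0, -1])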

matplotlib.testing.decorators.remove_ticks_and_titles(figure)[source]

matplotlib.testing.decorators.switch_backend(backend)[source]

[Deprecated]

Notes

Deprecated since version 3.1.

matplotlib.testing.exceptions

exception matplotlib.testing.exceptions.ImageComparisonFailure[source]

Bases: AssertionError

Raise this exception to mark a test as a failed comparison between two images.
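
For example, a custom helper might raise it when compare_images reports a mismatch; a minimal sketch (the helper name is illustrative):

from matplotlib.testing.compare import compare_images
from matplotlib.testing.exceptions import ImageComparisonFailure


def assert_images_equal(expected, actual, tol=0):
    # compare_images returns None on success, otherwise a description of
    # the difference, which is reused here as the failure message.
    err = compare_images(expected, actual, tol)
    if err:
        raise ImageComparisonFailure(err)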