All Pairs Testing

All pairs (or pairwise) testing is a combinatorial technique for achieving adequate test coverage where testing all possible combinations would be resource intensive and the risk of less-than-complete testing is acceptable. The fundamental concept is that you test every pair of values across variables, rather than every possible combination; since most defects are triggered by the interaction of just one or two variables, this gives good coverage for far less effort.

Consider the following hypothetical example of testing a suite of office applications on multiple platforms, to verify a small code change to the way documents are exported.

The applications run on desktop and mobile devices, in either a native app, the default browser, or Chrome, and export to one of two document formats. The variables are:

| Application    | Device  | Client          | Format       |
|----------------|---------|-----------------|--------------|
| Word processor | Windows | Native app      | Microsoft    |
| Spreadsheet    | MacOS   | Default browser | OpenDocument |
| Presentation   | iOS     | Chrome          |              |
|                | Android |                 |              |

At first glance there are 72 possible combinations (3 × 4 × 3 × 2). On Android the default browser is Chrome, so the six Android + default-browser combinations duplicate the Android + Chrome ones, leaving 66 distinct combinations. If the risk is high, testing all of them may be justified, but for a small code change with a complex test, 66 tests could take considerable time to complete, which warrants a more pragmatic test approach.
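
As a quick sanity check, the combination count can be enumerated in a few lines of Python; the list and variable names below are purely illustrative:

```python
from itertools import product

applications = ["Word processor", "Spreadsheet", "Presentation"]
devices = ["Windows", "MacOS", "iOS", "Android"]
clients = ["Native app", "Default browser", "Chrome"]
formats = ["Microsoft", "OpenDocument"]

# Every raw combination of the four variables: 3 x 4 x 3 x 2 = 72.
combinations = list(product(applications, devices, clients, formats))
print(len(combinations))  # 72

# On Android the default browser *is* Chrome, so those rows duplicate the
# Android + Chrome rows and can be discarded.
distinct = [
    (app, device, client, fmt)
    for app, device, client, fmt in combinations
    if not (device == "Android" and client == "Default browser")
]
print(len(distinct))  # 66
```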

Creating the first pairs

Select the variable with the most values first, which is ‘Device’, followed by the next largest (‘Client’ and ‘Application’ have an identical number of values in this example, so either will do; here we use ‘Application’). Pair the first value of the first variable with every value of the second variable, then repeat for each remaining value of the first variable, iterating over the second variable each time.

| Device  | Application    |
|---------|----------------|
| Windows | Word processor |
| Windows | Spreadsheet    |
| Windows | Presentation   |
| MacOS   | Word processor |
| MacOS   | Spreadsheet    |
| MacOS   | Presentation   |
| iOS     | Word processor |
| iOS     | Spreadsheet    |
| iOS     | Presentation   |
| Android | Word processor |
| Android | Spreadsheet    |
| Android | Presentation   |
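
This first step is just a cross product of the two chosen variables. A minimal Python sketch, reusing the illustrative lists from above:

```python
from itertools import product

devices = ["Windows", "MacOS", "iOS", "Android"]
applications = ["Word processor", "Spreadsheet", "Presentation"]

# Pair every device with every application: 4 x 3 = 12 base test cases.
base_cases = [
    {"Device": device, "Application": app}
    for device, app in product(devices, applications)
]

for case in base_cases:
    print(case["Device"], "/", case["Application"])
```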

Next, add columns pairing up the third and fourth variables, cycling through their values row by row. (Edge and Safari are the default browsers on their respective platforms, and MS/Open abbreviate the two document formats.) One way to generate this mechanically is sketched after the table:

| Device  | Application    | Client | Format |
|---------|----------------|--------|--------|
| Windows | Word processor | Native | MS     |
| Windows | Spreadsheet    | Edge   | Open   |
| Windows | Presentation   | Chrome | MS     |
| MacOS   | Word processor | Native | Open   |
| MacOS   | Spreadsheet    | Safari | MS     |
| MacOS   | Presentation   | Chrome | Open   |
| iOS     | Word processor | Native | MS     |
| iOS     | Spreadsheet    | Safari | Open   |
| iOS     | Presentation   | Chrome | MS     |
| Android | Word processor | Native | Open   |
| Android | Spreadsheet    | Chrome | MS     |
| Android | Presentation   | Chrome | Open   |
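
A rough Python sketch of that cycling, again with illustrative names, which reproduces the table above; any assignment that cycles through the values would do equally well:

```python
from itertools import cycle

devices = ["Windows", "MacOS", "iOS", "Android"]
applications = ["Word processor", "Spreadsheet", "Presentation"]

# Concrete default browser per device (on Android the default is Chrome itself).
default_browser = {"Windows": "Edge", "MacOS": "Safari", "iOS": "Safari", "Android": "Chrome"}

formats = cycle(["MS", "Open"])  # alternate the two formats down the table

test_cases = []
for device in devices:
    clients = ["Native", default_browser[device], "Chrome"]
    for app, client in zip(applications, clients):
        test_cases.append((device, app, client, next(formats)))

for row in test_cases:
    print(row)
```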

Rearranging the pairs

There is some duplication – Word processor is always paired with Native, Presentation is always paired with Chrome, and there are two Android + Chrome pairs (unavoidable with three rows per device, since Chrome is Android’s default browser and so Android has only two distinct clients). Let’s rearrange the clients into a non-repeating pattern (not just [1,2,3]):

| Device  | Application    | Client | Format |
|---------|----------------|--------|--------|
| Windows | Word processor | Native | MS     |
| Windows | Spreadsheet    | Edge   | Open   |
| Windows | Presentation   | Chrome | MS     |
| MacOS   | Word processor | Chrome | Open   |
| MacOS   | Spreadsheet    | Safari | MS     |
| MacOS   | Presentation   | Native | Open   |
| iOS     | Word processor | Safari | MS     |
| iOS     | Spreadsheet    | Native | Open   |
| iOS     | Presentation   | Chrome | MS     |
| Android | Word processor | Chrome | Open   |
| Android | Spreadsheet    | Chrome | MS     |
| Android | Presentation   | Native | Open   |

Review to ensure there are pairs covering the values of each variable:

All devices are tested with the native app, the default browser, and Chrome. All applications are tested on every device. All applications are tested natively (albeit on different platforms) and in a browser. All applications are tested with both document formats.
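
This review can also be automated by collecting every value pair that the final table covers and asserting the pairs you care about. A minimal sketch, with the twelve cases hardcoded:

```python
from itertools import combinations

# The final test cases as (Device, Application, Client, Format).
final_cases = [
    ("Windows", "Word processor", "Native", "MS"),
    ("Windows", "Spreadsheet", "Edge", "Open"),
    ("Windows", "Presentation", "Chrome", "MS"),
    ("MacOS", "Word processor", "Chrome", "Open"),
    ("MacOS", "Spreadsheet", "Safari", "MS"),
    ("MacOS", "Presentation", "Native", "Open"),
    ("iOS", "Word processor", "Safari", "MS"),
    ("iOS", "Spreadsheet", "Native", "Open"),
    ("iOS", "Presentation", "Chrome", "MS"),
    ("Android", "Word processor", "Chrome", "Open"),
    ("Android", "Spreadsheet", "Chrome", "MS"),
    ("Android", "Presentation", "Native", "Open"),
]

columns = ["Device", "Application", "Client", "Format"]

# Gather every (column, value) pair that appears in at least one test case.
covered = set()
for case in final_cases:
    for pair in combinations(zip(columns, case), 2):
        covered.add(pair)

# Spot checks matching the review above.
assert (("Device", "Android"), ("Client", "Native")) in covered
assert (("Device", "MacOS"), ("Application", "Presentation")) in covered
assert (("Application", "Spreadsheet"), ("Format", "MS")) in covered
print(f"{len(covered)} distinct pairs covered by {len(final_cases)} test cases")
```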

If there are any specific risk areas, you might swap some pairings around or add extra cases to cover them – for example, to increase coverage of the more popular devices or application types.

In this example, Edge appears only once, and Safari is never paired with the OpenDocument format. If that is not acceptable, add further cases pairing them with different formats and applications.

We now have 12 test cases (or perhaps a few more if necessary) – far fewer than the potential 66 – a significant saving in time, while accepting some risk that one of the untested combinations may be defective.
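
For larger sets of variables, it is usually easier to generate the pairings with a tool than by hand. A sketch using the third-party allpairspy package, assuming its documented AllPairs interface; it knows nothing about Chrome being Android’s default browser, so any duplicate rows would still need weeding out manually:

```python
# Requires the third-party package: pip install allpairspy
from allpairspy import AllPairs

parameters = [
    ["Word processor", "Spreadsheet", "Presentation"],  # Application
    ["Windows", "MacOS", "iOS", "Android"],              # Device
    ["Native app", "Default browser", "Chrome"],         # Client
    ["Microsoft", "OpenDocument"],                       # Format
]

# AllPairs yields a small set of cases in which every pair of values from
# any two columns appears at least once.
for i, case in enumerate(AllPairs(parameters), start=1):
    print(i, case)
```

A hand-crafted table like the one above still gives you more control over which specific combinations are run, which is often worth the effort when there are only a handful of variables.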