Pressure Decay Leak Test

Introduction

Leak checking is an important part of operating a calibration lab that works with gases: it ensures that the measurement system maintains its integrity by preventing unintended gas losses, which could compromise the accuracy and reliability of calibration results. No fluid system can ever be completely leak-free, and every modification of the system may introduce leaks in the connecting volume between the reference standard and the devices under test. It is therefore important to run a leak test every time the system is altered.
Important:
Our lab does not provide leak checking as a service; leak checking is an internal step performed before other services, such as mass flow calibration.
Note:
In mass flow calibration, the reference is placed either before or after the device under test, and the system is under positive pressure. A leak will therefore always cause the device under test to read either a lower or a higher flow than the standard, respectively.

The goal of the leak test is not to determine exactly what the leak flow is, but rather to ensure that the leak is reduced below a known specification that is accounted for in the uncertainty calculation. This means that any physical plumbing error, such as a mistightened fitting or a faulty quick connect, is identified immediately and does not result in an invalid calibration.

Principle

A pressure decay leak test is a method used to detect and quantify leaks in a pressurized system by monitoring the drop in pressure over time within a sealed volume. The general steps are:
  1. The system or component is filled with a gas (usually air or nitrogen) to a specified test pressure.

  2. The system is allowed to stabilize for a short period to account for initial pressure fluctuations.

  3. The test section is isolated by closing valves, sealing it from the pressure source.

  4. The pressure is monitored over a set duration.

  5. If the pressure drops beyond an acceptable threshold, it indicates a leak. A stable pressure suggests the system is leak-tight.
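As a minimal illustration of step 5, the pass/fail decision can be sketched in Python. The function name and the example pressures and threshold below are hypothetical, not part of the lab's procedure:

```python
def pressure_decay_passes(p_initial_pa, p_final_pa, max_drop_pa):
    """Return True if the observed pressure drop stays within the allowed threshold.

    All pressures in pascals; the threshold itself must come from the
    uncertainty analysis described later in this document.
    """
    drop = p_initial_pa - p_final_pa
    return drop <= max_drop_pa

# Hypothetical example: a 1.2 kPa drop against a 2 kPa allowance passes.
ok = pressure_decay_passes(500_000.0, 498_800.0, 2_000.0)  # True
```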

See Leak Test Procedure for the detailed procedure.

Quantifying the Leak in Terms of Mass Flow / Standard Volumetric Flow

The procedure above mentions that the drop in system pressure must not exceed an acceptable threshold. What that acceptable threshold should be depends on many parameters, notably:
  • The system internal volume. The same drop in pressure indicates a smaller leak in a small system than in a large system.

  • The initial & final absolute pressures & temperatures.

  • The fluid used.

  • The decay duration.

Calculating the leak in terms of mass flow involves calculating the initial and final mass of the gas trapped in the system:

$$m_\text{Leak} = m_\text{Initial} - m_\text{Final}$$

Knowing that the mass $m_\text{Leak}$ is lost over a decay duration $\Delta t$, the average leak rate can then be calculated:

$$\dot{m}_\text{Leak} = \frac{m_\text{Leak}}{\Delta t} = \frac{m_\text{Initial} - m_\text{Final}}{\Delta t}$$

The initial and final masses can be expressed in terms of the initial and final densities of the gas and the system volume (which is constant throughout the test):

$$\dot{m}_\text{Leak} = \frac{V}{\Delta t}\left(\rho_\text{Initial} - \rho_\text{Final}\right)$$
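This average-rate formula translates directly into code. A minimal Python sketch, where the function name and the example volume, densities, and duration are purely illustrative:

```python
def mass_leak_rate(volume_m3, rho_initial, rho_final, dt_s):
    """Average mass leak rate in kg/s: m_dot = V * (rho_initial - rho_final) / dt.

    volume_m3   -- internal volume of the isolated test section (m^3)
    rho_initial -- gas density at the start of the decay (kg/m^3)
    rho_final   -- gas density at the end of the decay (kg/m^3)
    dt_s        -- decay duration (s)
    """
    return volume_m3 * (rho_initial - rho_final) / dt_s

# Hypothetical example: a 2 L volume losing 0.05 kg/m^3 of density over 10 min.
m_dot = mass_leak_rate(0.002, 5.90, 5.85, 600.0)  # kg/s
```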

There is no universal equation of state that accurately describes the behavior of all gases under all conditions of temperature and pressure. Calculating the density of a gas based purely on its fundamental properties, temperature, and pressure can be challenging and inherently involves a degree of uncertainty, especially at elevated pressures or near phase transitions.

The ideal gas law, while simple and useful for rough estimates, only provides reasonable accuracy at low pressures (typically below a few bar) and moderate temperatures. Its assumptions, such as negligible molecular volume and no intermolecular interactions, break down in more demanding conditions, making it unsuitable for high-precision work like ours.

In scenarios requiring high accuracy, such as leak testing, it's often more reliable to use experimentally validated thermodynamic reference data rather than rely on simplified models. This is where NIST’s REFPROP database becomes invaluable.

REFPROP (short for Reference Fluid Thermodynamic and Transport Properties Database) is a comprehensive software developed by the U.S. National Institute of Standards and Technology (NIST). It provides highly accurate thermophysical property data for a wide range of pure fluids and mixtures. These properties notably include:
  • Density

  • Pressure

  • Temperature

REFPROP uses sophisticated equations of state and multi-parameter fitting models based on experimental data to achieve high accuracy, often with uncertainties lower than 0.1% in many conditions.

The initial and final densities can therefore be calculated using REFPROP by supplying it with the fluid type along with the corresponding initial and final temperatures and pressures of the system.
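For a self-contained sketch of this density step without a REFPROP licence, the ideal gas law can stand in, with the caveat from above that it is only a rough low-pressure estimate; a real implementation would instead query REFPROP (or a compatible property library) with the fluid, temperature, and pressure. All names and values here are illustrative:

```python
R_UNIVERSAL = 8.314462618  # molar gas constant, J/(mol*K)

def density_ideal_gas(p_pa, t_k, molar_mass_kg_per_mol):
    """Rough gas density estimate via the ideal gas law: rho = p * M / (R * T).

    Only valid as an approximation at low pressure; for real leak-test
    uncertainty work, use REFPROP as described in the text.
    """
    return p_pa * molar_mass_kg_per_mol / (R_UNIVERSAL * t_k)

# Nitrogen (M = 0.0280134 kg/mol) at 20 degC and 1 atm:
rho_n2 = density_ideal_gas(101_325.0, 293.15, 0.0280134)  # ~1.16 kg/m^3
```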

Once the mass leak rate $\dot{m}_\text{Leak}$ is calculated, it can be converted to a standard volumetric flow rate by dividing it by the gas's standard density, that is, the density of the gas at standard temperature and pressure (STP). This standard density can also be obtained from REFPROP. The conversion is expressed as:

$$Q_\text{Leak} = \frac{\dot{m}_\text{Leak}}{\rho_s}$$

The resulting expression for the test is thus:

$$Q_\text{Leak} = \frac{V}{\rho_s}\,\frac{\Delta\rho}{\Delta t}, \qquad \Delta\rho = \rho_\text{Initial} - \rho_\text{Final}$$
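Putting the pieces together, the final expression can be sketched end to end in Python. Every numeric value below is hypothetical (a 2 L volume, density drop of 0.05 kg/m³ over 10 minutes, and the standard density of nitrogen), and the function name is not from any real API:

```python
def leak_flow_standard(volume_m3, rho_initial, rho_final, dt_s, rho_standard):
    """Standard volumetric leak flow: Q_Leak = V * (rho_initial - rho_final) / (rho_standard * dt).

    rho_standard is the gas density at STP (kg/m^3), in practice taken
    from REFPROP; the result is in standard m^3/s.
    """
    return volume_m3 * (rho_initial - rho_final) / (rho_standard * dt_s)

# Hypothetical worked example:
q = leak_flow_standard(0.002, 5.90, 5.85, 600.0, 1.1645)  # standard m^3/s
sccm = q * 1e6 * 60.0  # convert to standard cm^3/min, ~8.6 sccm
```

Note that the sign convention matches the derivation: a falling density (initial greater than final) yields a positive leak flow.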