Traditionally, mass spectrometry sensitivity has been defined using the signal-to-noise ratio (S/N) as a metric. However, this approach can be misleading because S/N values can be inflated depending on how they are calculated. The Instrument Detection Limit (IDL) is a more robust and reliable method to assess mass spectrometry detection limits and precision, giving you more confidence that your signal is not noise. Learn more about IDL, how it’s calculated, and how you can use it to assess the performance of your Agilent LC/MS or GC/MS system.
Most analytical instruments produce a signal when a blank (matrix or solvent alone) is analyzed. The signal generated with the blank is referred to as the instrument background level; baseline noise is defined as the fluctuation of that background level.
Instrument Detection Limit (IDL) is a statistical measurement of analytical sensitivity that considers the precision of a measurement. It characterizes the amount of analyte (sample concentration or amount on-column) required to be 99% confident that a measured signal is real and not baseline noise.
As an analyte’s level approaches the method detection limit, the uncertainty of the measurement increases dramatically (precision starts to decrease). IDL mathematically determines the lowest level at which that uncertainty is still tolerable, ensuring that you can distinguish a very low but reproducibly observed signal from baseline noise.
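As a rough illustration of this effect (synthetic numbers only, assuming a constant absolute baseline noise), the short Python sketch below simulates replicate measurements at decreasing analyte levels and shows the %RSD climbing as the level approaches the noise floor:

```python
import numpy as np

# Illustrative only: replicate measurements with a constant absolute baseline noise.
rng = np.random.default_rng(seed=7)
noise_sd = 1.0  # assumed baseline noise (arbitrary units)

for level in [100.0, 10.0, 5.0, 2.0, 1.0]:
    replicates = level + rng.normal(0.0, noise_sd, size=8)   # 8 replicate injections
    rsd = 100 * replicates.std(ddof=1) / replicates.mean()   # %RSD of the replicates
    print(f"analyte level = {level:6.1f}  ->  %RSD = {rsd:5.1f}%")
```

At high levels the %RSD stays small, but as the analyte level drops toward the noise floor the relative uncertainty grows rapidly.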
Signal-to-noise ratio (S/N) is a commonly used mass spectrometry sensitivity metric and was useful when noise was easily measured. However, as technology has evolved to improve measurement specificity, using S/N to compare highly sensitive detection systems has become increasingly difficult due to ultralow noise levels.
S/N can be calculated using algorithms that can inflate the metric. Each algorithm is slightly different and will generate different results, even with the same sample. Many algorithms are also set up to search for the region with the lowest possible noise within a narrow time window and to apply artificial smoothing, biasing the metric. In addition, the signal-to-noise ratio is determined from a single injection, representing a sample size of n = 1.
In this example, different noise definitions were applied to the same data file, demonstrating significant changes in the S/N ratio with no real change in the mass spectrometry limit of detection.
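To illustrate how much the noise definition alone can move the metric, here is a minimal Python sketch (hypothetical synthetic chromatogram, not any vendor’s algorithm) that computes S/N for the same peak using peak-to-peak versus RMS noise taken from a wide and a narrow baseline window:

```python
import numpy as np

# Synthetic chromatogram: a Gaussian peak on a noisy baseline (illustrative only).
rng = np.random.default_rng(seed=1)
time = np.linspace(0, 10, 2000)                      # minutes
baseline_noise = rng.normal(0.0, 0.5, time.size)     # detector noise
peak = 50.0 * np.exp(-((time - 5.0) ** 2) / (2 * 0.05 ** 2))
signal = peak + baseline_noise

peak_height = signal.max()

def noise_estimates(y):
    """Return peak-to-peak and RMS noise for a baseline region."""
    return y.max() - y.min(), y.std()

# Two plausible "noise" regions: a wide window vs. a narrow, quiet window.
wide = signal[(time > 0.5) & (time < 4.0)]
narrow = signal[(time > 7.00) & (time < 7.05)]

for label, region in [("wide window", wide), ("narrow window", narrow)]:
    ptp, rms = noise_estimates(region)
    print(f"{label}: S/N (peak-to-peak) = {peak_height / ptp:5.1f}, "
          f"S/N (RMS) = {peak_height / rms:5.1f}")
```

With the same data, a narrow-window RMS definition typically reports a much higher S/N than a wide-window peak-to-peak definition, even though nothing about the measurement has changed.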
One picogram of reserpine or chloramphenicol is a commonly used performance standard for determining system sensitivity. However, there is no standard method for calculating the signal-to-noise ratio in mass spectrometry. This lack of rigor or a defined standard results in misleading S/N values reported on the basis of unrealistic noise definitions with no tangible relevance to real-world analysis.
An example of misleading S/N ratio determination is shown here, where the noise region is extremely narrow and probably not representative of the actual noise.
IDL is determined using replicate measurements of the sample, providing confidence that the analyte can be reliably detected over a series of injections. The test sample’s response is measured near the instrument’s baseline, providing a reliable, statistically sound characterization of the instrument’s performance at low levels.
Learn more about Instrument Detection Limits - watch this webinar.
IDL is a concept modified from the US EPA’s Method Detection Limit (MDL). The primary difference between IDL and MDL is the sample preparation: IDL is determined with an analyte dissolved “neat” in solvent while MDL is determined with an analyte dissolved in a sample matrix. Learn more about the IDL and MDL.
The equations used to calculate IDL and MDL are the same; the only difference is the sample preparation. IDL is calculated with analyte dissolved “neat” and MDL is calculated with analyte dissolved in a sample matrix.
MDL = t × %RSD × Conc(matrix)
IDL = t × %RSD × Conc(solvent)
Where t is the single-tailed Student’s t critical value at 99% confidence for n replicate injections (n − 1 degrees of freedom), %RSD is the relative standard deviation of the replicate measurements, and Conc is the known analyte concentration (or amount on column) in the matrix-spiked sample or solvent standard, respectively.
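As a minimal sketch of how these quantities combine (hypothetical peak areas and on-column amount; not Agilent’s checkout procedure), the Python snippet below computes an IDL from replicate low-level injections, using the single-tailed Student’s t critical value at 99% confidence and treating %RSD as a fraction (%RSD / 100):

```python
import numpy as np
from scipy import stats

# Hypothetical peak areas from n replicate injections of a low-level standard.
areas = np.array([1052.0, 998.0, 1075.0, 1021.0, 987.0, 1043.0, 1010.0, 1066.0])
injected_amount_pg = 1.0  # assumed amount on column for each replicate

n = areas.size
rsd = areas.std(ddof=1) / areas.mean()      # relative standard deviation (fraction)
t_crit = stats.t.ppf(0.99, df=n - 1)        # single-tailed, 99% confidence, n-1 d.o.f.

idl_pg = t_crit * rsd * injected_amount_pg
print(f"n = {n}, %RSD = {100 * rsd:.1f}%, t(99%, {n - 1} d.f.) = {t_crit:.2f}")
print(f"IDL = {idl_pg:.3f} pg on column")
```

Substituting the concentration of a matrix-spiked sample for the solvent standard in the same calculation gives the corresponding MDL.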
The mathematical foundations of IDL and MDL originate from a statistical hypothesis-testing framework using a t-distribution, and either equation can be derived from the t-test equation: in essence, a measured response must exceed t times the standard deviation of the replicate measurements to be distinguished from baseline noise at 99% confidence, and that standard deviation can be expressed as %RSD of the known concentration.
IDL can be used at installation or after preventative maintenance to confirm instrument performance, while MDL should be used for routine laboratory investigations or for characterizing a new method.
Agilent utilizes IDL as a primary analytical sensitivity specification for characterizing LC/MS and GC/MS systems.
As a “statistically sound” system performance metric, IDL is evaluated first after an instrument is manufactured and then again as part of the installation checkout procedure at the user’s site.
Performing an IDL evaluation to confirm that your Agilent triple quadrupole LC/MS, GC/MS, or GC/MS triple quadrupole system is performing equivalently to when it was installed is simple. Just order the Agilent Performance Standard for LC/MS (Part Number: G1946-85004) and/or the Agilent Performance Standard for GC/MS (Part Number: 5188-5347), then follow the IDL evaluation protocol found in the “Installation and Setup Guide” included with the instrument.
The IDL value should match the data sheet specification.
A lower IDL value indicates that the instrument can detect a smaller amount of analyte with 99% confidence and suitable precision, i.e., with the analyte signal reliably distinguished from background noise. A more sensitive instrument detects a dilute or difficult analyte with greater precision and certainty, regardless of peak area counts.