Gas Chromatography Overview
Gas Chromatography (GC) separates and measures volatile analytes based on their boiling points and affinity for a stationary phase. The technique is commonly employed to measure solvents and other volatile analytes in a variety of sample matrices.
Galbraith’s GC methodology is divided according to injection technique: direct and headspace injection. Either injection technique may be coupled with any of the detectors below. Galbraith also offers USP <467> Residual Solvents testing services.
Direct injection is usually reserved for analytes with high boiling points, to ensure that the analyte reaches the column. Direct injection is also used in area percent (%) assays to ensure that unknown contaminants are injected onto the column.
Headspace injection is used for analytes with relatively low boiling points. This technique is desirable because the sample matrix is not injected onto the column; as a result, headspace injection often affords lower detection limits than direct injection.
Flame Ionization Detectors (FID) detect the by-products created during the combustion of organic compounds in a hydrogen flame.
Thermal Conductivity Detectors (TCD) measure the change in heat flow from a GC column relative to a reference flow of carrier gas.
Mass Spectrometer (MS) detectors ionize analytes and separate the resulting ions based on their mass-to-charge ratio (m/z).
Separate methods are not provided for each detector; the method is chosen based on the injection technique, and the detector is then specified. All detectors are available with either injection technique.
Each successful analysis begins with adequately dissolving the sample. Many solvents may be used, provided they do not interfere with the analysis and sufficiently dissolve the sample.
GC instruments are externally calibrated using approximately five calibration standards. Sample solutions are injected onto the column and separated based on their boiling points and affinity for the stationary (column) phase. Components of the analysis solution are detected with one of the available detectors to produce a sample chromatogram. The area under each desired (target) peak is integrated and compared to the calibration curve to determine the concentration.
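The external-calibration step can be sketched as follows. This is a hypothetical illustration, not Galbraith’s actual software: five standard levels and their peak areas are fabricated for the example, a straight line is fit by least squares, and an unknown’s integrated peak area is converted to a concentration by inverting the curve.

```python
# Hypothetical external calibration: fit area = slope * conc + intercept
# from ~5 standards, then invert the line for an unknown peak area.
# All standard levels and areas below are illustrative, not real data.

def fit_calibration(concentrations, areas):
    """Least-squares fit of peak area versus standard concentration."""
    n = len(concentrations)
    mean_c = sum(concentrations) / n
    mean_a = sum(areas) / n
    slope = (sum((c - mean_c) * (a - mean_a)
                 for c, a in zip(concentrations, areas))
             / sum((c - mean_c) ** 2 for c in concentrations))
    intercept = mean_a - slope * mean_c
    return slope, intercept

def concentration_from_area(area, slope, intercept):
    """Invert the calibration curve for an unknown's integrated peak area."""
    return (area - intercept) / slope

# Five calibration standards (ppm) and their (made-up) peak areas.
levels = [50, 100, 200, 400, 800]
areas = [1020, 2050, 4100, 8150, 16300]

slope, intercept = fit_calibration(levels, areas)
print(round(concentration_from_area(6100, slope, intercept), 1))  # → 299.0
```

In practice the calibration fit, peak integration, and curve inversion are handled by the instrument’s data system; the sketch only shows the arithmetic behind comparing a peak area to the calibration curve.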
Unless otherwise evaluated, the quantitation limit (QL) is set by the concentration of the lowest calibration standard used to calibrate the instrument. Results below the QL are reported as less-than values, e.g., <50 ppm.
Results are reported relative to the mass of sample originally taken for the analysis. Typical reporting units are ppm, mg/L, or %; other units may be reported.
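The two reporting rules above (converting the solution result back to the original sample mass, and censoring results below the QL) can be combined in a short sketch. The function name, the dilution scheme, and all values are hypothetical, chosen only to make the arithmetic concrete.

```python
# Hypothetical reporting step: convert a measured solution concentration
# (mg/L) back to the original sample basis (ppm = µg analyte per g sample)
# and report less-than values below the quantitation limit (QL).
# All numbers are illustrative, not real data.

def report_result(solution_mg_per_l, dilution_volume_l, sample_mass_g, ql_ppm):
    """Return the result in ppm, or a less-than string if below the QL."""
    micrograms = solution_mg_per_l * dilution_volume_l * 1000  # mg → µg
    ppm = micrograms / sample_mass_g
    if ppm < ql_ppm:
        return f"<{ql_ppm} ppm"
    return f"{ppm:.0f} ppm"

# 0.25 g of sample dissolved in 10 mL (0.010 L); the solution reads 2.0 mg/L:
print(report_result(2.0, 0.010, 0.25, ql_ppm=50))  # → 80 ppm

# A weaker solution reading of 0.5 mg/L falls below the QL:
print(report_result(0.5, 0.010, 0.25, ql_ppm=50))  # → <50 ppm
```

Note how the reported value depends on the sample mass originally taken: the same 2.0 mg/L solution reading would correspond to a different ppm result if more or less sample had been dissolved.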
The general methodology is suitable for the analysis of regulated samples. The method is considered validated with respect to a reference substance, but not to the sample matrix unless a formal validation is conducted.