When it comes to nucleic acid quantification in molecular biology and genomics research, scientists often find themselves at a crossroads: which technique will achieve their research goals most efficiently – the more precise and robust digital PCR (dPCR), or the more standardized and familiar quantitative real-time PCR (qPCR)? The choice is application dependent. For curious minds, this article offers a brief look at the technical evolution and practical applications of these two generations of PCR techniques.
Since its introduction in 1993, researchers have valued qPCR for its speed, sensitivity, specificity, and ease of use. The technique is most useful for gene expression experiments, SNP detection, genotyping or allelic discrimination, and validating microarray data. Carefully designed qPCR assays can detect a target sequence down to just a few copies per reaction while covering a broad dynamic quantification range. One can choose the detection chemistry, ranging from intercalating dyes (SYBR Green) to target-specific probes (TaqMan, molecular beacons, etc.).1 The cost per sample is highly flexible, adapting to reaction volume, throughput, and detection method to meet specific research needs. Furthermore, the publication of the MIQE guidelines for qPCR promoted consistency between laboratories and increased experimental reproducibility.2
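For gene expression experiments, qPCR data are commonly analyzed by relative quantification, for example the comparative Ct (2^−ΔΔCt) method. The sketch below is a minimal illustration, not taken from this article: it assumes roughly 100% amplification efficiency (a doubling per cycle), and the function name and example Ct values are hypothetical.

```python
def fold_change(ct_target_treated, ct_ref_treated,
                ct_target_control, ct_ref_control):
    """Comparative Ct (2^-ddCt) relative quantification.

    Assumes ~100% amplification efficiency for both assays,
    i.e., one doubling of product per cycle.
    """
    # Normalize the target to a reference (housekeeping) gene in each sample
    dct_treated = ct_target_treated - ct_ref_treated
    dct_control = ct_target_control - ct_ref_control
    # Compare the normalized values between conditions
    ddct = dct_treated - dct_control
    # Convert cycle difference to fold change (2 per cycle)
    return 2.0 ** (-ddct)

# Example with made-up Ct values: target amplifies 2 cycles earlier
# (relative to the reference gene) in the treated sample -> 4-fold up.
print(fold_change(24.0, 18.0, 26.0, 18.0))  # → 4.0
```

In practice, efficiency-corrected models (e.g., the Pfaffl method) replace the fixed base of 2 with the measured per-assay efficiency.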
However, qPCR falters in applications requiring superior accuracy and sensitivity, such as copy number variation analysis and rare event detection. In such applications, dPCR outperforms qPCR: it measures absolute copy number directly and pushes the limits of detection, resolving small-fold differences and low-abundance targets, or in other words, finding a needle in a haystack. Digital PCR eliminates the need for a standard curve and is less susceptible to inhibitors, owing to the end-point fluorescence reading of partitions. By removing PCR efficiency bias, it also reduces error rates, something the scientific community has widely appreciated.3
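The standard-curve-free quantification mentioned above rests on Poisson statistics: because target molecules distribute randomly across partitions, the mean copies per partition is λ = −ln(1 − p), where p is the fraction of positive partitions at end point. A minimal sketch follows; the 0.85 nL partition volume is an assumption (typical of droplet platforms, but it varies by instrument), and the numbers are illustrative.

```python
import math

def copies_per_microliter(positive_partitions, total_partitions,
                          partition_volume_ul=0.00085):
    """Absolute quantification from end-point partition counts.

    partition_volume_ul: assumed partition volume in µL
    (0.00085 µL = 0.85 nL, a typical droplet size; platform-dependent).
    """
    p = positive_partitions / total_partitions
    # Poisson correction: a positive partition may hold >1 copy,
    # so the mean copies per partition is -ln(1 - p), not p itself.
    lam = -math.log(1.0 - p)
    return lam / partition_volume_ul

# Example: 5,000 of 20,000 partitions positive (p = 0.25)
conc = copies_per_microliter(5000, 20000)
print(round(conc, 1))  # ≈ 338.4 copies/µL of reaction
```

Note that no calibration standard enters the calculation; counting positive and negative partitions is sufficient, which is why end-point dPCR tolerates variable amplification efficiency far better than qPCR.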
Ironically, although the early 1990s saw the emergence of "limiting dilution PCR," or "digital PCR," its progress was soon stalled by the arrival of real-time PCR.4,5 Why? A lack of suitable instruments and chemistry. Fortunately, a more recent review paints a promising picture for the technology's future.6 Digital PCR offers a straightforward all-or-nothing readout, which, combined with continuous technological improvements and the publication of its own MIQE guidelines, has led to a resurgence of interest in its potential.7 Today, more and more scientists are turning to dPCR for studying cancer gene mutations, monitoring immunotherapy effectiveness, detecting pathogens, analyzing GMOs, assessing gene editing frequencies, and prenatal testing for genetic diseases, steadily expanding its range of applications.8