Background
Quantitative real-time polymerase chain reaction (qPCR) is a reliable and efficient method for quantitation of gene expression. Due to the increased use of qPCR in examining nutrient-gene interactions, it is important to examine, develop, and utilize standardized approaches for data analysis and interpretation. A common method used to normalize expression data involves the use of reference genes (RG) to determine relative mRNA abundance. When calculating relative abundance, the selection of RG can influence experimental results and has the potential to skew data interpretation. Although common RG may be used for normalization, often little consideration is given to the suitability of RG selection for an experimental condition or between various tissue or cell types. In the current study, we examined the stability of gene expression using BestKeeper, the comparative delta quantitation cycle (ΔCq) method, NormFinder, and RefFinder in a variety of tissues obtained from iron-deficient and pair-fed iron-replete rats to determine the optimal selection among ten candidate RG.

Results
Our results suggest that several commonly used RG (e.g., Actb and Gapdh) exhibit less stability compared to other candidate RG (e.g., Rpl19 and Rps29) in both iron-deficient and iron-replete pair-fed conditions. For all evaluated RG, Tfrc expression significantly increased in iron-deficient animal livers compared to the iron-replete pair-fed controls; however, the relative induction varied nearly 4-fold between the most suitable (Rpl19) and least suitable (Gapdh) RG.

Conclusion
These results indicate that the selection and use of RG should be empirically determined, and that RG selection may vary across experimental conditions and biological tissues.
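To see how RG choice can skew a relative-abundance calculation, consider a minimal sketch of the standard 2^−ΔΔCq normalization. The Cq values below are hypothetical (not from the study) and are chosen only to mimic the reported ~4-fold spread in computed Tfrc induction between a stable RG (Rpl19-like) and an unstable one (Gapdh-like):

```python
# Sketch of the 2^-delta-delta-Cq method, illustrating how reference-gene
# (RG) choice changes the computed fold change. All Cq values are
# hypothetical, chosen only to produce a ~4-fold spread in apparent
# Tfrc induction between a stable and an unstable RG.

def fold_change(cq_target_treated, cq_rg_treated,
                cq_target_control, cq_rg_control):
    """Relative expression of a target gene via 2^-(delta-delta-Cq)."""
    d_cq_treated = cq_target_treated - cq_rg_treated   # normalize treated sample
    d_cq_control = cq_target_control - cq_rg_control   # normalize control sample
    dd_cq = d_cq_treated - d_cq_control
    return 2 ** (-dd_cq)

# Hypothetical Tfrc Cq values: iron-deficient liver vs. pair-fed control
tfrc_deficient, tfrc_control = 22.0, 25.0

# Stable RG: its Cq is unchanged between conditions
fc_stable = fold_change(tfrc_deficient, 20.0, tfrc_control, 20.0)

# Unstable RG: the RG itself shifts by 2 cycles under treatment,
# which is silently folded into the target's apparent fold change
fc_unstable = fold_change(tfrc_deficient, 18.0, tfrc_control, 20.0)

print(f"fold change with stable RG:   {fc_stable:.1f}")   # 8.0
print(f"fold change with unstable RG: {fc_unstable:.1f}")  # 2.0
```

With these illustrative numbers the two normalizations disagree by a factor of four (8.0 vs. 2.0), even though the target-gene Cq values are identical; only the stability of the RG differs.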