Extension of Critical Difference for Product Comparison. Application to Tobacco Products
Summary: The difference or equivalence of two products depends on various sources of variability, such as analytical methods, manufacturing processes, agricultural practices, and environmental conditions. In addition, the capacity to compare and accurately discriminate between two products is affected by the number of characteristics considered in the comparison. It has previously been shown that two products can be compared using the critical difference (CD), because it takes into account both measurement and inter-laboratory variability. However, additional sources of variability must be included in the comparison when the products were not manufactured during the same period of time or in the same factory. Here, an extended critical difference is proposed that includes manufacturing-process variability as a function of the number of samples and batches collected for each product. The general formula and the specific cases corresponding to different situations (one vs. two labs, short vs. long periods of time, same vs. different periods of time, one vs. several batches) are given.
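As an illustration of how such an extended CD might be computed, the sketch below assumes a plausible variance decomposition: repeatability, between-laboratory, and between-batch components, with the replicate and batch counts shrinking the corresponding terms. The function name, the form of the decomposition, and the 2.8 factor (≈ √2 × 1.96 for a difference at the 95% level, as in common precision standards) are illustrative assumptions, not the paper's exact formula.

```python
import math

def extended_critical_difference(s_r, s_L, s_B, n=1, b=1, z=2.8):
    """Hedged sketch of an extended critical difference (CD).

    s_r : repeatability standard deviation (within-lab measurement)
    s_L : between-laboratory standard deviation
    s_B : between-batch (manufacturing) standard deviation -- the added term
    n   : replicate measurements per product
    b   : batches sampled per product
    z   : 2.8 ~ sqrt(2) * 1.96, the usual 95% factor for a difference

    Assumption: the variance of the difference of two product means is twice
    the per-product variance (absorbed into z); averaging over n replicates
    and b batches shrinks the corresponding components.
    """
    var_per_product = s_L**2 + s_B**2 / b + s_r**2 / n
    return z * math.sqrt(var_per_product)
```

With `s_B = 0` and `b = 1` this collapses to a classical measurement-plus-laboratory CD; sampling more batches (`b > 1`) tightens the criterion, which matches the abstract's point that the comparison depends on the number of samples and batches collected per product.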
- Research Article
- 10.3760/cma.j.issn.1009-2587.2018.03.005
- Mar 20, 2018
- Zhonghua Shaoshang Zazhi (Chinese Journal of Burns)
Objective: To explore the influence of the three-level collaboration network of pediatric burns in Anhui province on treatment effects in burn children. Methods: The medical records of pediatric burn children transferred from Lu'an People's Hospital and Fuyang People's Hospital to the First Affiliated Hospital of Anhui Medical University from January 2014 to December 2015 and from January 2016 to September 2017 (before and after establishing the three-level collaboration network of pediatric burns treatment) were analyzed: percentage of transferred burn children among hospitalized burn children in the corresponding period, gender, age, burn degree, treatment method, treatment result, occurrence and treatment result of shock, and operative and non-operative treatment time and cost. The rehabilitation results of burn children transferred back to local hospitals in 2016 and 2017 were also analyzed. Data were processed with t test, chi-square test, Mann-Whitney U test, and Fisher's exact test. Results: (1) The percentage of burn children transferred from January 2014 to December 2015 was 34.3% (291/848) of the total number of hospitalized burn children in the same period of time, which was close to the 30.4% (210/691) of burn children transferred from January 2016 to September 2017 (χ(2)=2.672, P>0.05). (2) Gender, age, burn degree, and treatment method of burn children transferred in the two periods of time were close (χ(2)=3.382, Z=-1.917, -1.911, χ(2)=3.133, P>0.05). (3) Cure rates of children with mild, moderate, and severe burns transferred from January 2016 to September 2017 were significantly higher than those of burn children transferred from January 2014 to December 2015 (χ(2)=11.777, 6.948, 4.310, P<0.05). (4) Children with mild and moderate burns transferred in the two periods of time had no shock. The incidence of shock among children with severe burns transferred from January 2014 to December 2015 was 6.0% (4/67), and 3 children among them were cured.
The incidence of shock among children with severe burns transferred from January 2016 to September 2017 was 3.9% (2/51), and both children were cured. The incidences and cures of shock of children with severe burns transferred in the two periods of time were close (χ(2)=0.006, P>0.05). The incidence of shock among children with extremely severe burns transferred from January 2014 to December 2015 was 57.1% (32/56), significantly higher than that of burn children transferred from January 2016 to September 2017 [34.5% (10/29), χ(2)=3.925, P<0.05]. (5) Time of operative treatment of children with moderate, severe, and extremely severe burns transferred from January 2014 to December 2015 was obviously longer than that of burn children transferred from January 2016 to September 2017 (t=2.335, 2.065, 2.310, P<0.05). Costs of operative treatment of children with moderate and severe burns transferred from January 2014 to December 2015 were significantly higher than those of burn children transferred from January 2016 to September 2017 (Z=-3.324, t=2.167, P<0.05). (6) Time of non-operative treatment of children with mild, moderate, and severe burns transferred from January 2014 to December 2015 was obviously longer than that of burn children transferred from January 2016 to September 2017 (t=2.335, Z=-2.095, t=2.152, P<0.05). Costs of non-operative treatment of children with moderate and severe burns transferred from January 2014 to December 2015 were obviously higher than those of burn children transferred from January 2016 to September 2017 (Z=-2.164, t=2.040, P<0.05). (7) Sixty-seven burn children transferred from January 2016 to September 2017 were transferred back to local hospitals for rehabilitation under the guidance of experts of the First Affiliated Hospital of Anhui Medical University, with 25 patients in 2016 and 42 patients in 2017. Effective rehabilitation rates of burn children transferred back to local hospitals for rehabilitation in 2016 and 2017 were both 100%.
Conclusions: The three-level collaboration network of pediatric burns treatment in Anhui province can effectively increase the cure rate of children with mild, moderate, and severe burns, reduce the incidence of shock in children with extremely severe burns, shorten the time of operative treatment of burn children with moderate, severe, and extremely severe burns and the time of non-operative treatment of children with mild, moderate, and severe burns, reduce treatment costs of children with moderate and severe burns, and improve the rehabilitation effectiveness of children transferred from Lu'an People's Hospital and Fuyang People's Hospital to the First Affiliated Hospital of Anhui Medical University.
- Conference Article
1
- 10.1117/12.911241
- Feb 23, 2012
Rigorous optimization and evaluation of an x-ray imaging system requires exploring a large space of many different system parameters. However, due to the high dimensionality of the problem, it is often infeasible to evaluate many system parameters in a laboratory setting. Therefore, it is useful to utilize computer simulation tools and analytical methods to narrow down to a much smaller space of system parameters and then validate the chosen optimal parameters by laboratory measurements. One great advantage of using the simulation and analytical methods is that the impact of various sources of variability on the system's diagnostic performance can be studied separately and collectively. Previously, we demonstrated how to separate and analyze noise sources using covariance decomposition in a task-based approach to the assessment of digital breast tomosynthesis (DBT) systems in the absence of x-ray scatter and detector blur [1, 2]. In this work, we analytically extend the previous work to include x-ray scatter and detector blur. With the use of computer simulation, we also investigate the use of the convolution method for approximating the scatter images of structured phantoms in comparison to those computed via Monte Carlo. The extended method is comprehensive and can be used both for exploring a large parameter space in simulation and for validating optimal parameters, chosen from a simulation study, with laboratory measurements.
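The convolution method for scatter mentioned above can be sketched in a few lines: the scatter image is modeled as the primary image convolved with a broad, normalized scatter point-spread kernel. The toy phantom, the Gaussian kernel shape, and its width below are illustrative assumptions, not the paper's validated scatter model.

```python
import numpy as np

# Toy "structured phantom": a uniform square object in a 64x64 field
primary = np.zeros((64, 64))
primary[24:40, 24:40] = 1.0

# Illustrative scatter kernel: broad 2-D Gaussian, normalized to unit sum
# so that the convolution conserves the total signal.
x = np.arange(64) - 32
xx, yy = np.meshgrid(x, x)
kernel = np.exp(-(xx**2 + yy**2) / (2 * 10.0**2))
kernel /= kernel.sum()

# FFT-based circular convolution (adequate for this centered toy kernel);
# ifftshift moves the kernel's center to the (0, 0) sample before the FFT.
scatter = np.real(
    np.fft.ifft2(np.fft.fft2(primary) * np.fft.fft2(np.fft.ifftshift(kernel)))
)
```

In a study like the one described, such a convolution estimate would be compared against Monte Carlo scatter images of the same phantom to judge where the shift-invariance assumption breaks down.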
- Single Report
- 10.3386/w2112
- Dec 1, 1986
There has been much recent discussion about the ultimate sources of macroeconomic variability. Shiller (1987) surveys this work, where he points out that a number of authors attribute most of output or unemployment variability to only a few sources, sometimes only one. The sources vary from technology shocks for Kydland and Prescott (1982), to unanticipated changes in the money stock for Barro (1977), to unusual structural shifts, such as changes in the demand for produced goods relative to services, for Lilien (1982), to oil price shocks for Hamilton (1983), to changes in desired consumption for Hall (1986). (See Shiller (1987) for more references.) Although it may be that there are only a few important sources of macroeconomic variability, this is far from obvious. Economies seem complicated, and it may be that there are many important sources. The purpose of this paper is to estimate the quantitative importance of various sources of variability using a macroeconometric model. Macroeconometric models provide an obvious vehicle for estimating the sources of variability of endogenous variables. There are two types of shocks that one needs to consider: shocks to the stochastic equations and shocks to the exogenous variables. Shocks to the stochastic equations are easy to handle. They are simply draws from the postulated distribution (usually normal) of the structural error terms, the distribution upon which the estimation of the model is based. Shocks to the exogenous variables are less straightforward to handle. Since by definition exogenous variables are not modeled, it is ambiguous what one means by an exogenous-variable shock. One possibility, followed in this paper, is to treat the errors in estimated autoregressive equations for the exogenous variables as the shocks; another possibility is to postulate that exogenous-variable shocks are the errors that forecasting services make in their forecasts of the exogenous variables. The sources of output and price variability are examined in this paper using my United States model (Fair (1984)).
The procedure that was followed, which is discussed in detail in the next section, is briefly as follows. Autoregressive equations were estimated for 23 exogenous variables in the model. These variables make up all the important exogenous variables in the model (in my view). These equations were then added to the model. There are 30 structural stochastic equations in the model, and so the expanded model includes 53 stochastic equations. The 53 x 53 covariance matrix of the error terms was then estimated. In estimating this matrix the error terms in the structural equations were assumed to be uncorrelated with the error terms in the exogenous-variable equations, which means that the matrix was taken to be block diagonal (with a 30 x 30 block and a 23 x 23 block). This procedure is consistent with the assumption upon which the estimation of the model is based, namely that the exogenous variables are not correlated with the error terms in the structural equations.
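The block-diagonal covariance structure described above can be sketched directly. The sizes (30 structural and 23 exogenous-variable equations, 53 in total) mirror the abstract; the within-block covariance matrices here are random symmetric positive-definite placeholders, not estimates from the model.

```python
import numpy as np

rng = np.random.default_rng(0)

# Sizes from the abstract: 30 structural + 23 exogenous-variable equations
n_struct, n_exog = 30, 23

def random_spd(k):
    """Placeholder for an estimated covariance block: a random SPD matrix."""
    a = rng.standard_normal((k, k))
    return a @ a.T / k + np.eye(k) * 0.1

sigma_struct = random_spd(n_struct)
sigma_exog = random_spd(n_exog)

# Block-diagonal 53x53 covariance: structural errors are assumed
# uncorrelated with exogenous-variable errors, so the off-diagonal
# blocks are zero, consistent with the estimation assumption.
sigma = np.zeros((n_struct + n_exog, n_struct + n_exog))
sigma[:n_struct, :n_struct] = sigma_struct
sigma[n_struct:, n_struct:] = sigma_exog

# One period's shock vector for a stochastic simulation of the model
shocks = rng.multivariate_normal(np.zeros(n_struct + n_exog), sigma)
```

In a stochastic simulation, repeated draws like `shocks` would be fed through the expanded model, with individual blocks zeroed out to measure how much each source contributes to output and price variability.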
- Research Article
90
- 10.1128/jcm.02803-15
- Jan 13, 2016
- Journal of Clinical Microbiology
Interferon gamma release assays (IGRAs) are blood-based tests intended for diagnosis of latent tuberculosis infection (LTBI). IGRAs offer logistical advantages and are supposed to offer improved specificity over the tuberculin skin test (TST). However, recent serial testing studies of low-risk individuals have revealed higher false conversion rates with IGRAs than with TST. Reproducibility studies have identified various sources of variability that contribute to nonreproducible results. Sources of variability can be broadly classified as preanalytical, analytical, postanalytical, manufacturing, and immunological. In this minireview, we summarize known sources of variability and their impact on IGRA results. We also provide recommendations on how to minimize sources of IGRA variability.
- Research Article
3
- 10.1186/s12859-018-2075-8
- Mar 1, 2018
- BMC Bioinformatics
Background: In the field of biomarker validation with mass spectrometry, controlling the technical variability is a critical issue. In selected reaction monitoring (SRM) measurements, this issue provides the opportunity of using variance component analysis to distinguish various sources of variability. However, in the case of unbalanced data (an unequal number of observations across factor combinations), the classical methods cannot correctly estimate the various sources of variability, particularly in the presence of interaction. The present paper proposes an extension of variance component analysis that estimates the various components of the variance, including an interaction component, in the case of unbalanced data. Results: We applied an experimental design that uses a serial dilution to generate known relative protein concentrations and estimated these concentrations with two processing algorithms, a classical one and a more recent one. The extended method allowed estimating the variance explained by the dilution and by the technical process for each algorithm in an experiment with 9 proteins: L-FABP, 14.3.3 sigma, Calgi, Def.A6, Villin, Calmo, I-FABP, Peroxi-5, and S100A14. Whereas the recent algorithm gave a higher dilution variance and a lower technical variance than the classical one for two proteins with three peptides (L-FABP and Villin), there was no significant difference between the two algorithms across all proteins. Conclusions: The extension of variance component analysis was able to correctly estimate the variance components of protein concentration measurements in the case of an unbalanced design.
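To illustrate the unbalanced-data problem the abstract addresses, the sketch below estimates variance components for an unbalanced one-way random-effects design using the classical ANOVA (method-of-moments) estimator with the standard unbalanced correction factor. This is a simplified one-factor stand-in for the paper's extended multi-factor method, and the simulated group sizes and variances are illustrative.

```python
import numpy as np

rng = np.random.default_rng(1)

# Simulate unbalanced data: true between-group variance 4.0 (e.g. dilution),
# within-group "technical" variance 1.0, unequal numbers of observations.
sigma_b2, sigma_w2 = 4.0, 1.0
sizes = [3, 5, 2, 8, 4, 6]
groups = [rng.normal(rng.normal(0, np.sqrt(sigma_b2)), np.sqrt(sigma_w2), n)
          for n in sizes]

def variance_components_unbalanced(groups):
    """ANOVA (method-of-moments) estimator for an unbalanced one-way design."""
    a = len(groups)
    ns = np.array([len(g) for g in groups])
    N = ns.sum()
    grand = np.concatenate(groups).mean()
    means = np.array([g.mean() for g in groups])
    ssb = (ns * (means - grand) ** 2).sum()
    ssw = sum(((g - g.mean()) ** 2).sum() for g in groups)
    msb, msw = ssb / (a - 1), ssw / (N - a)
    n0 = (N - (ns ** 2).sum() / N) / (a - 1)  # unbalanced correction factor
    return max((msb - msw) / n0, 0.0), msw     # (between, within) components

vb, vw = variance_components_unbalanced(groups)
```

When the design is balanced, `n0` reduces to the common group size and this coincides with the textbook estimator; handling interaction components on top of this, as the paper does, requires a further extension.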
- Book Chapter
- 10.1007/978-0-8176-8168-5_7
- Jan 1, 2004
In the preceding chapter, we considered a random effects model involving a two-way nested classification. Examples of three- and higher-order nested classifications occur frequently in many industrial experiments where raw material is first broken up into batches and then into subbatches, sub-subbatches, and so forth. For example, in an experiment designed to identify various sources of variability in tensile strength measurements, one may randomly select a lots of raw material, take b boxes from each lot, make c sample preparations from the material in each box, and finally perform n tensile strength tests on each preparation. These factors often present themselves in a hierarchical manner and are appropriately specified as random effects. In this chapter, we consider a random effects model involving a three-way nested classification and indicate its generalization to higher-order nested classifications.
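The three-way nested structure (lots, boxes, preparations, tests) can be simulated directly, which makes the variance decomposition concrete: the total variance of an observation is the sum of the nested components. The sizes and component values below are illustrative, not from the chapter.

```python
import numpy as np

rng = np.random.default_rng(5)

# a lots, b boxes per lot, c preparations per box, n tests per preparation,
# with illustrative standard deviations for each nested random effect.
a, b, c, n = 40, 4, 3, 5
s_lot, s_box, s_prep, s_err = 2.0, 1.4, 1.0, 0.7

# Broadcasting builds the hierarchy: each effect is constant within its level.
lot = rng.normal(0, s_lot, (a, 1, 1, 1))
box = rng.normal(0, s_box, (a, b, 1, 1))
prep = rng.normal(0, s_prep, (a, b, c, 1))
err = rng.normal(0, s_err, (a, b, c, n))
y = 50.0 + lot + box + prep + err  # simulated tensile strength observations

# Under the nested random effects model, Var(y) is the sum of the components.
expected = s_lot**2 + s_box**2 + s_prep**2 + s_err**2
observed = y.var()
```

The sample variance `observed` fluctuates around `expected` (here 7.45), with the lot component dominating the sampling error because there are only `a` independent lots; that is exactly why nested designs trade off the numbers at each level.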
- Research Article
15
- 10.1046/j.1460-9592.2003.00050.x
- Jan 1, 2003
- Pacing and clinical electrophysiology : PACE
The ECG may vary during the day (intraday) and between days (interday) for the same subject. Variability in ECG characteristic measurements between different investigators is well documented and is often large. During days 1-6 of each placebo period of a two-way crossover Phase I study, digital ECGs were recorded at about 8 AM and 12 noon in 16 healthy volunteers (8 men, 8 women). Two observers independently analyzed leads V2 and V6 using EClysis software. The durations and amplitudes of major ECG waves and the intervals between major electrocardiographic events were analyzed in a mixed-model ANOVA, in which subject, observer, time, and day were treated as random factors. The influence of various corrections for heart rate on the variability of QT intervals was investigated. Differences among subjects explained between 44% and 81% of the total variability in ECG intervals and amplitudes. Overall, inter- and intraday variability was not statistically significant for any variable. The individualized exponential correction of the QT interval for heart rate eliminated the dependence of the QT interval on the RR interval in all subjects. Changes in T-wave morphology and shortening of the QT interval from morning to noon were observed in ten subjects. The interobserver variability was close to zero (SD < 0.005 ms) for all variables except the PQ interval (SD 1.4 ms). The various sources of variability in determinations of ECG wave characteristics should be considered in the design of clinical studies. The use of EClysis software for ECG measurements in this study made the results highly observer independent.
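The individualized exponential correction mentioned above is commonly written QTc = QT / RR^α, with α fitted per subject; one standard way to fit it is a least-squares regression of log(QT) on log(RR). The sketch below assumes that form and simulates one subject's beats with an arbitrary true exponent of 0.35; the numbers are illustrative, not from the study.

```python
import numpy as np

rng = np.random.default_rng(2)

# Simulated beats for one subject: QT = 400 ms at RR = 1 s, true exponent
# alpha = 0.35 (hypothetical), with small multiplicative measurement noise.
rr = rng.uniform(0.7, 1.3, 50)  # RR intervals in seconds
qt = 400 * rr ** 0.35 * np.exp(rng.normal(0, 0.005, 50))

def individualized_qtc(qt, rr):
    """Individualized exponential correction QTc = QT / RR**alpha.

    alpha is fitted per subject by least squares on
    log(QT) = log(k) + alpha * log(RR).
    """
    alpha, logk = np.polyfit(np.log(rr), np.log(qt), 1)
    return qt / rr ** alpha, alpha

qtc, alpha = individualized_qtc(qt, rr)
```

A successful individualized correction leaves QTc essentially uncorrelated with RR, which is the "eliminated the dependence" criterion the abstract reports.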
- Research Article
9
- 10.1002/jcla.20089
- Jan 1, 2005
- Journal of Clinical Laboratory Analysis
A standardized urinalysis and manual microscopic cell counting system was evaluated for its potential to reduce intra- and interoperator variability in urine and cerebrospinal fluid (CSF) cell counts. Replicate aliquots of pooled specimens were submitted blindly to technologists who were instructed to use either the Kova system with the disposable Glasstic slide (Hycor Biomedical, Inc., Garden Grove, CA) or the standard operating procedure of the University of California-Irvine (UCI), which uses plain glass slides for urine sediments and hemacytometers for CSF. The Hycor system provides a mechanical means of obtaining a fixed volume of fluid in which to resuspend the sediment, and fixes the volume of specimen to be microscopically examined by using capillary filling of a chamber containing in-plane counting grids. Ninety aliquots of pooled specimens of each type of body fluid were used to assess the inter- and intraoperator reproducibility of the measurements. The variability of replicate Hycor measurements made on a single specimen by the same or different observers was compared with that predicted by a Poisson distribution. The Hycor methods generally resulted in test statistics that were slightly lower than those obtained with the laboratory standard methods, indicating a trend toward decreasing the effects of various sources of variability. For 15 paired aliquots of each body fluid, tests for systematically higher or lower measurements with the Hycor methods were performed using the Wilcoxon signed-rank test. Also examined was the average difference between the Hycor and current laboratory standard measurements, along with a 95% confidence interval (CI) for the true average difference. Without increasing labor or the requirement for attention to detail, the Hycor method provides slightly better interrater comparisons than the current method used at UCI.
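The Poisson comparison described above can be made concrete with the classical index-of-dispersion statistic: under pure Poisson counting error (no extra operator or method variability), the statistic follows approximately a chi-square distribution with n-1 degrees of freedom, so values well above n-1 signal additional variability. The replicate counts below are invented for illustration, not the study's data.

```python
def dispersion_statistic(counts):
    """Index-of-dispersion statistic for replicate cell counts.

    Under the Poisson hypothesis, T = sum((x - xbar)^2) / xbar is
    approximately chi-square distributed with n - 1 degrees of freedom.
    """
    n = len(counts)
    xbar = sum(counts) / n
    t = sum((x - xbar) ** 2 for x in counts) / xbar
    return t, n - 1

# Replicates consistent with Poisson counting error vs. an inflated spread
# (as extra inter-operator variability would produce); both sets have mean 50.
t_poisson, df = dispersion_statistic([48, 52, 50, 47, 53, 50])
t_inflated, _ = dispersion_statistic([30, 70, 45, 60, 38, 57])
```

Comparing such statistics across methods is one way to judge, as the study did, whether a standardized counting system brings replicate variability closer to the Poisson floor.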
- Research Article
10
- 10.1080/00779959809544287
- Dec 1, 1998
- New Zealand Economic Papers
Structural vector autoregressive (SVAR) methodology is used to assess possible sources of macroeconomic variability in the New Zealand economy. As a test of robustness, two alternative business cycle filters are used to remove stochastic trends from integrated time series data. Regardless of the way in which cyclical fluctuations are empirically measured, the investigation attributes a considerable share of variability in the New Zealand macroeconomy to foreign sector shocks, particularly over the longer term. Furthermore, the relative importance of the various sources of variability is found to change following the removal of nominal interest rate and other controls and the floating of the New Zealand dollar in the mid‐1980s.
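The attribution of variability to shocks in an SVAR is done with a forecast-error variance decomposition (FEVD). The minimal bivariate VAR(1) sketch below shows the mechanics; the coefficient matrices and the "domestic vs. foreign shock" reading are illustrative placeholders, not estimates from the paper.

```python
import numpy as np

# Bivariate structural VAR(1): y_t = A y_{t-1} + B e_t, with orthogonal
# structural shocks e (say, a "domestic" and a "foreign" shock).
# A and B are illustrative numbers only.
A = np.array([[0.5, 0.2],
              [0.0, 0.8]])
B = np.array([[1.0, 0.5],
              [0.0, 1.0]])

def fevd(A, B, variable, horizon):
    """Share of a variable's forecast-error variance due to each shock.

    For a VAR(1) the impulse response at lag j is A**j @ B; the FEVD sums
    squared responses over lags 0..horizon-1 and normalizes across shocks.
    """
    k = A.shape[0]
    psi = np.eye(k)
    contrib = np.zeros(k)
    for _ in range(horizon):
        impact = psi @ B          # impulse responses at this lag
        contrib += impact[variable] ** 2
        psi = A @ psi             # advance to the next lag: A**(j+1)
    return contrib / contrib.sum()

shares = fevd(A, B, variable=0, horizon=20)
```

With these placeholder dynamics, the persistent second shock accounts for a growing share of the first variable's variance as the horizon lengthens, the same pattern as the paper's finding that foreign shocks matter most over the longer term.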
- Research Article
2
- 10.1029/2022ja031037
- Jun 1, 2023
- Journal of Geophysical Research: Space Physics
The variability of the thermosphere on a daily basis is influenced by a variety of factors, including solar, geomagnetic, and meteorological drivers. The column density ratio of atomic oxygen to molecular nitrogen (ΣO/N2) is a useful parameter for quantifying this variability, and has been shown to closely correspond to F‐region electron density, total electron content, and upper atmospheric transport. Despite the significance of the ΣO/N2, the relative contributions of these drivers to thermospheric variability are not well understood. In order to shed light on this issue, principal component analysis was performed in this study to distinguish and rank the various sources of variability in the ΣO/N2. The analysis was based on the ΣO/N2 data from the Global‐scale Observations of the Limb and Disk mission from days 81–135 of 2020. The resulting two‐dimensional eigen spatial patterns reveal the dominant variabilities during the specified period. The first six principal components are reported and associated with the major drivers through their spatial and temporal features. Geomagnetic storms, interhemispheric transport, atmospheric tides, and planetary waves were identified as the drivers of the first, second, third, and fifth components, respectively. The order of these components highlights that geomagnetic activity is the dominant source of daily variability in the ΣO/N2, followed by interhemispheric transport and meteorological drivers from the lower atmosphere.
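Principal component analysis of a day-by-gridpoint data matrix, as used above, is a mean-removal followed by an SVD: the right singular vectors are the spatial patterns (EOFs) and the squared singular values rank each component's share of the variability. The toy data below stand in for the ΣO/N2 maps, with two assumed spatial patterns ("storm-like" and "tidal-like") whose shapes and amplitudes are purely illustrative.

```python
import numpy as np

rng = np.random.default_rng(3)

# Toy stand-in for a day-by-gridpoint data matrix: 55 days x 200 points
# (matching the ~55-day window of the study, days 81-135 of 2020).
days, points = 55, 200
pat1 = np.sin(np.linspace(0, np.pi, points))        # broad "storm-like" pattern
pat2 = np.cos(np.linspace(0, 4 * np.pi, points))    # wavy "tidal-like" pattern
amp1 = rng.normal(0, 3.0, days)                     # strong daily amplitude
amp2 = rng.normal(0, 1.0, days)                     # weaker daily amplitude
data = (np.outer(amp1, pat1) + np.outer(amp2, pat2)
        + rng.normal(0, 0.1, (days, points)))       # plus measurement noise

# PCA: remove the time mean, then SVD. Rows of vt are the spatial EOFs;
# squared singular values give each component's share of the variability.
anom = data - data.mean(axis=0)
u, s, vt = np.linalg.svd(anom, full_matrices=False)
explained = s ** 2 / (s ** 2).sum()
```

The ordering of `explained` is what lets a study like this rank drivers: here the strong pattern dominates the leading component, just as geomagnetic activity leads the ranking in the abstract.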
- Research Article
14
- 10.2307/1885115
- May 1, 1988
- The Quarterly Journal of Economics
There has been much recent discussion about the ultimate sources of macroeconomic variability. A number of authors attribute most of this variability to only a few sources, sometimes only one. Although there may be only a few important sources, this is far from obvious, since economies seem complicated. The purpose of this paper is to provide quantitative estimates of various sources of variability using a U. S. econometric model. Stochastic simulation is used to estimate how much the overall variances of real GNP and the GNP deflator are reduced when various shocks are suppressed in the model.
- Research Article
81
- 10.1016/0002-9149(80)90168-x
- May 1, 1980
- The American Journal of Cardiology
Sources of variability in echocardiographic measurements
- Research Article
10
- 10.1109/ted.2022.3231569
- Feb 1, 2023
- IEEE Transactions on Electron Devices
This article presents the design of a novel and compact spin-orbit torque (SOT)-based ternary content addressable memory (TCAM). Experimentally validated/calibrated micromagnetic and macrospin simulations have been used to quantify various tradeoffs regarding the write operation, such as write energy, error rate, and retention time. SPICE simulations incorporating various sources of variability are used to evaluate search operations, optimize the proposed novel TCAM cell based on SOT magnetic random access memory (SOT-MRAM), and benchmark it against static random access memory (SRAM)- and FeFET-based TCAMs. We show low search error rates (SERs) (< 10⁻⁴) while considering various sources of variability for a TCAM array based on four transistors and two magnetic tunnel junctions (MTJs).
- Research Article
11
- 10.1063/1.4971797
- Dec 23, 2016
- The Journal of Chemical Physics
Stochastic models of chemical systems are often subjected to uncertainties in kinetic parameters in addition to the inherent random nature of their dynamics. Uncertainty quantification in such systems is generally achieved by means of sensitivity analyses in which one characterizes the variability with the uncertain kinetic parameters of the first statistical moments of model predictions. In this work, we propose an original global sensitivity analysis method where the parametric and inherent variability sources are both treated through Sobol's decomposition of the variance into contributions from arbitrary subsets of uncertain parameters and stochastic reaction channels. The conceptual development only assumes that the inherent and parametric sources are independent, and considers the Poisson processes in the random-time-change representation of the state dynamics as the fundamental objects governing the inherent stochasticity. A sampling algorithm is proposed to perform the global sensitivity analysis, and to estimate the partial variances and sensitivity indices characterizing the importance of the various sources of variability and their interactions. The birth-death and Schlögl models are used to illustrate both the implementation of the algorithm and the richness of the proposed analysis method. The output of the proposed sensitivity analysis is also contrasted with a local derivative-based sensitivity analysis method classically used for this type of system.
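The core idea of splitting parametric from inherent variability can be shown on the simplest birth-death case via the law of total variance, Var(y) = Var(E[y|λ]) + E[Var(y|λ)]. The sketch below uses an immigration-death model whose stationary count is Poisson(λ/μ), so both pieces are known analytically and the Monte Carlo "parametric share" can be checked against theory; the rates and the uniform prior on λ are illustrative assumptions, and this is a much simpler decomposition than the paper's full Sobol treatment of reaction channels.

```python
import numpy as np

rng = np.random.default_rng(4)

# Immigration-death model: stationary count y ~ Poisson(lam / mu).
mu = 1.0
M = 20000  # Monte Carlo sample size

lam = rng.uniform(5.0, 15.0, M)   # uncertain kinetic parameter (illustrative prior)
y = rng.poisson(lam / mu)         # inherent stochastic outcome given lam

# Law of total variance: Var(y) = Var(E[y|lam]) + E[Var(y|lam)].
# Here E[y|lam] = Var(y|lam) = lam/mu, so the parametric share should be
# close to Var(lam) / (Var(lam) + E[lam]) = (100/12) / (100/12 + 10) ~ 0.45.
total_var = y.var()
parametric_share = (lam / mu).var() / total_var
```

The paper's method generalizes this conditioning to arbitrary subsets of parameters and of the Poisson driving processes themselves, yielding interaction indices that this two-term split cannot express.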
- Research Article
- 10.3760/cma.j.issn.1009-2587.2017.07.004
- Jul 20, 2017
- Chinese journal of burns
Objective: To investigate the epidemiological characteristics of hospitalized children with burn injury in the author's affiliation, so as to provide a theoretical basis for developing prevention strategies for children with burn injury. Methods: Medical records of 384 and 596 hospitalized children with burn injury, aged 0 to 12 years, were collected respectively from January 2001 to December 2005 and from January 2011 to December 2015. The percentage of children with burn injury among total hospitalized patients with burn injury in the same period of time, age, causes of injury, gender, injury month, residence, condition of first aid measures conforming to medical standard, time of admission post injury, burn degree, and operation condition of children with burn injury were analyzed. Data were processed with Mann-Whitney U test and chi-square test. Results: From January 2001 to December 2005 and from January 2011 to December 2015, the percentages of children with burn injury among total hospitalized patients with burn injury in the same period of time were respectively 23.6% (384/1626) and 25.4% (596/2346), with no statistically significant difference (χ(2)=1.653, P>0.05). The age of all children with burn injury was 1.0 (1.0, 2.0) years from January 2011 to December 2015, obviously lower than that from January 2001 to December 2005 [1.0 (1.0, 3.0) years, Z=-3.257, P<0.01]. Ages of children with burns caused by hot liquid and electrical burns from January 2011 to December 2015 were obviously lower than those from January 2001 to December 2005 (with Z values respectively -4.248 and -2.040, P<0.05 or P<0.01). Compared with that from January 2001 to December 2005, the age of children with burns caused by flame from January 2011 to December 2015 increased, with no statistically significant difference (Z=1.852, P>0.05). There was no statistically significant difference in gender of children with burn injury between the two periods of time (χ(2)=1.374, P>0.05).
Burn injury of children in the two periods of time mainly occurred in spring, and the seasonal distribution of burn injury between the two periods of time was similar (χ(2)=1.177, P>0.05). There was a statistically significant difference in the residence of children with burn injury between the two periods of time (χ(2)=15.513, P<0.01). The number of children with burn injury whose first aid measures conformed to medical standards and who were admitted within 6 h post injury from January 2011 to December 2015 was obviously larger than that from January 2001 to December 2005 (with χ(2) values respectively 7.434 and 43.961, P values below 0.01). Burn degrees of children with burn injury were mainly moderate in the two periods of time, and there was no statistically significant difference in burn degree or condition of operation between the two periods of time (with χ(2) values respectively 5.731 and 1.583, P values above 0.05). Conclusions: Burns in children are a social problem. We should make great efforts to popularize knowledge of the prevention and treatment of pediatric burns, especially for younger children in rural areas. We should publicize standard first aid measures for pediatric burns and advocate admission within 6 h post injury for treatment.