To compare the marginal fit of Ni-Cr copings cast with cellulose ring liners in two different states (dry and wet) or with no ring liner. In this in vitro study, 40 patterns were invested and subjected to a burnout and casting procedure with a Ni-Cr alloy to obtain the cast copings. The cast copings were divested, cleaned, and finished. Each coping was seated on the stainless steel die and evaluated for marginal discrepancies at two predetermined reference points under an optical microscope (Reichert, Austria). Statistical analysis was performed using Pearson's correlation coefficient and the Chi-square test, with 95% confidence intervals and a p value of less than 0.05 considered statistically significant. Marginal discrepancies were found in all the Ni-Cr cast copings and differed significantly between groups. Copings cast with a cellulose ring liner in the wet state showed significantly higher values, followed by those cast with a cellulose ring liner in the dry state. The mean marginal discrepancy values were within the clinically acceptable range for the Ni-Cr cast copings in both groups. We conclude that copings cast with a cellulose ring liner in the dry state had the least vertical marginal discrepancy compared with those cast with a wet ring liner or without a ring liner, suggesting that the use of a cellulose ring liner in the dry state is favourable for casting procedures.

Measuring gaps or discrepancies at the margins is the most commonly used method of determining the fit of Ni-Cr copings. To minimize marginal inaccuracies, various authors have suggested different methods of improving the marginal adaptation of cast restorations, including compensation through the setting, hygroscopic, and thermal expansion of the investment, which offset the shrinkage of the metal during cooling.
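As a rough illustration of the statistical comparison described above (not the authors' actual analysis, and using entirely hypothetical marginal-gap values in micrometres), a minimal Python sketch with scipy might look like this:

```python
# Hypothetical illustration only: the measurement values below are invented,
# not the study's data. Assumes numpy and scipy are installed.
import numpy as np
from scipy import stats

# Simulated vertical marginal discrepancies (um) for copings cast with a
# dry cellulose ring liner vs. a wet one (10 copings per group, made up).
dry_liner = np.array([28.0, 31.5, 27.2, 30.1, 29.4, 26.8, 32.0, 28.9, 30.7, 27.5])
wet_liner = np.array([41.3, 45.0, 39.8, 44.2, 42.6, 40.1, 46.5, 43.0, 41.9, 44.7])

# Pearson's correlation between readings taken at the two reference points
# on the same copings (second point simulated with small random noise).
point_a = dry_liner
point_b = dry_liner + np.random.default_rng(0).normal(0.0, 1.5, size=dry_liner.size)
r, p_corr = stats.pearsonr(point_a, point_b)
print(f"Pearson r = {r:.2f}, p = {p_corr:.3f}")

# Chi-square test on counts of copings falling inside/outside a clinically
# acceptable threshold (a value around 120 um is often cited; the counts
# here are fabricated purely to show the call).
contingency = np.array([[9, 1],   # dry liner: acceptable / not acceptable
                        [6, 4]])  # wet liner: acceptable / not acceptable
chi2, p_chi, dof, _ = stats.chi2_contingency(contingency)
print(f"Chi-square = {chi2:.2f}, dof = {dof}, p = {p_chi:.3f}")

# As in the abstract, p < 0.05 would be treated as statistically significant.
```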