The bleeding tendencies of patients with chronic liver disease have challenged clinicians for centuries. In recent decades, modern science has elucidated some of the mechanisms underlying this coagulopathy, but much remains to be discovered. It is well known that progressive liver disease is associated with increasing disruption of protein synthesis. Indeed, one of the most ubiquitous serum proteins, albumin, has been intricately linked to prognosis in cirrhosis through the Child–Turcotte–Pugh score [1], and clinicians have long known that dwindling albumin levels predict a poor outcome in these patients. It is less widely appreciated, however, that patients with chronic liver disease may also show persistence of certain serum proteins because of ineffective protein degradation. This combination of protein deficiency and protein persistence contributes to the conflicting knowledge base on coagulopathy in liver disease.

In the healthy state, bleeding and clotting tendencies are balanced, and in cirrhosis this balance is clearly achieved most of the time. The key concept is that the balance is tenuous because of decreased reserve: when it is disrupted, the system can falter toward bleeding, especially when mechanical factors are involved, as in procedural or variceal hemorrhage. However, other circumstances can push these patients toward abnormal clotting because of altered blood flow and local persistence of prothrombotic proteins. Understanding and quantifying this tenuous balance between bleeding and clotting is shaping our current view of the coagulopathy of liver disease, and quantifying the imbalance is the key to turning this conceptual framework into a clinically useful paradigm for treating patients with chronic liver disease.

The widely available clinical laboratory tests used to measure coagulopathy were generally developed to detect quantitative deficiencies in platelet counts or procoagulant factors; they were designed to assess congenital factor deficiencies, such as hemophilia, and acquired platelet disorders, such as idiopathic thrombocytopenic purpura. Indeed, the international normalized ratio (INR) was first conceived as a means of standardizing treatment with warfarin (its defining formula is given at the end of this section). A recent commentary by Tripodi et al. [2] in this journal outlined the difficulty of using traditional laboratory testing to measure the three fundamental coagulation phases in patients with cirrhosis.

In primary hemostasis, which fundamentally involves platelets and tissue factor, there is a near-universal quantitative decrease in platelet numbers in cirrhosis patients, but the data supporting a primary defect in platelet function are less strong. Based on the degree of absolute thrombocytopenia commonly seen in advanced liver disease and on early studies suggestive of an additional platelet function disorder [3], it was long thought that primary hemostasis was markedly reduced in these patients. However, because much of platelet function depends on the flow conditions within the vessels, in vitro tests do not represent true in vivo platelet function. Lisman et al. [4] have studied platelets extensively under flow conditions and found adequate platelet function when tested with serum from cirrhosis patients, owing to elevated antigen levels of von Willebrand factor and reduced activity of its cleaving protease, ADAMTS13 [5]. This research has brought into question the magnitude of the defect in primary hemostasis in these patients.
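For context on the INR's warfarin-centric design, the standard defining formula (well established in the laboratory medicine literature, not specific to this editorial) is

\[ \mathrm{INR} = \left( \frac{\mathrm{PT}_{\text{patient}}}{\mathrm{PT}_{\text{mean normal}}} \right)^{\mathrm{ISI}} \]

where PT is the prothrombin time and ISI is the International Sensitivity Index of the thromboplastin reagent, calibrated against plasma from patients stably anticoagulated with vitamin K antagonists. Because the calibration population is warfarin-treated rather than cirrhotic, the INR is well validated for monitoring vitamin K antagonist therapy but not for quantifying the coagulopathy of liver disease.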