Evaluation of New Potential Inflammatory Markers in Patients with Nonvalvular Atrial Fibrillation.

Atrial fibrillation (AF), the most common arrhythmia in clinical practice, is associated with increased mortality and morbidity due to its high potential to cause stroke and systemic thromboembolism. Inflammatory mechanisms may play a role in the pathogenesis and maintenance of AF. We aimed to evaluate a range of inflammatory markers potentially involved in the pathophysiology of nonvalvular AF (NVAF). A total of 105 subjects were enrolled and divided into two groups: patients with NVAF (n = 55, mean age 72 ± 8 years) and a control group of individuals in sinus rhythm (n = 50, mean age 71 ± 8 years). Inflammation-related mediators were quantified in plasma samples by Cytometric Bead Array and Multiplex immunoassay. Subjects with NVAF presented significantly elevated values of interleukin (IL)-2, IL-4, IL-6, IL-10, tumor necrosis factor (TNF), interferon-gamma, growth differentiation factor-15, and myeloperoxidase, as well as interferon-gamma-induced protein (IP-10), monokine induced by interferon-gamma, neutrophil gelatinase-associated lipocalin, and serum amyloid A, in comparison with controls. However, after multivariate regression analysis adjusting for confounding factors, only IL-6, IL-10, TNF, and IP-10 remained significantly associated with AF. We provide a basis for the study of inflammatory markers whose association with AF had not been addressed before, such as IP-10, in addition to supporting evidence for molecules previously associated with the disease. We expect to contribute to the discovery of markers that can be implemented in clinical practice.

Early identification of ICU patients at risk of complications: Regularization based on robustness and stability of explanations

The aim of this study is to build machine learning models that predict severe complications using administrative and clinical elements collected immediately after patient admission to the intensive care unit (ICU). Risk models are of increasing importance in the ICU setting. However, they generally suffer from the black-box issue: they do not provide meaningful information about the logic behind patient-specific predictions. Fortunately, effective algorithms exist for explaining black-box models, and in practice they offer valuable explanations of model predictions. These explanations are becoming essential to engender trust in the model and support its accreditation. However, once the model is deployed, a major issue is whether it will continue to employ the same prediction logic as originally intended. To build our models, features are obtained from patient administrative data, laboratory results, and vital signs available within the first hour after ICU admission. This gives our models substantial lead time, since complications can occur at any moment during the ICU stay. To build models that continue to work as originally designed, we first propose to measure (i) how the provided explanations vary for different inputs (that is, robustness), and (ii) how the provided explanations change with models built from different patient sub-populations (that is, stability). Second, we employ these measures as regularization terms coupled with a feature selection procedure, so that the final model provides predictions with more robust and stable explanations. Experiments were conducted on a dataset containing 6000 ICU admissions of 5474 patients. Results obtained on an external validation cohort of 1069 patients with 1086 ICU admissions showed that selecting features based on robustness led to gains in predictive power ranging from 6.8% to 9.4%, whereas selecting features based on stability led to gains ranging from 7.2% to 11.5%, depending on the target complication. Our results are of practical importance, as our models predict complications well in advance, facilitating timely protective interventions.
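The two measures above can be sketched concretely. In this minimal illustration (our own simplification, not the authors' implementation), a linear model whose per-feature attribution is w_i * x_i stands in for the black-box model and its explainer; all function names and constants are hypothetical:

```python
import numpy as np

def linear_attributions(w, X):
    # Per-feature contributions of a linear model: a simple stand-in
    # for the explanation algorithm (e.g. SHAP-style) used in practice
    return X * w

def explanation_robustness(w, X, eps=0.05, n_pert=20, seed=0):
    # Robustness: average drift of explanations under small input noise
    rng = np.random.default_rng(seed)
    base = linear_attributions(w, X)
    drift = 0.0
    for _ in range(n_pert):
        Xp = X + rng.normal(0.0, eps, size=X.shape)
        drift += np.linalg.norm(linear_attributions(w, Xp) - base, axis=1).mean()
    return float(drift / n_pert)

def explanation_stability(X, y, n_boot=20, seed=0):
    # Stability: variability of mean attributions across models fit
    # on bootstrap sub-populations of the patients
    rng = np.random.default_rng(seed)
    means = []
    for _ in range(n_boot):
        idx = rng.integers(0, len(X), size=len(X))
        w, *_ = np.linalg.lstsq(X[idx], y[idx], rcond=None)
        means.append(linear_attributions(w, X).mean(axis=0))
    return float(np.std(np.stack(means), axis=0).mean())
```

In the paper's setting, scores like these would enter the feature selection objective as penalty terms, so that features whose explanations drift under perturbation or across sub-populations are disfavoured.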

Splenic, hepatic, renal and pulmonary clearance dysfunction associated with high-energy X-rays

Purpose To verify the effects of high-energy X-rays on the blood clearance of colloidal particles by the spleen, liver, kidneys, and lungs. Materials and methods Seventeen male Wistar rats were distributed into three groups. Group 1 (n = 5) – control – non-irradiated animals; group 2 (n = 6) – irradiated animals studied 24 h after irradiation; and group 3 (n = 6) – irradiated animals studied 48 h after irradiation. The animals were anesthetized and irradiated with a non-fractionated 8 Gy dose in the abdominal region divided into two parallel and opposite fields: 4 Gy was given to the anteroposterior field and 4 Gy to the posteroanterior field. This high dose of high-energy X-rays causes extensive cell killing, tissue disorganization, and breakdown of cell-to-cell communication. At the time defined for each group, 50 µCi of technetium-phytate was injected into the right internal jugular vein. Thirty minutes later, the liver, spleen, kidneys, and lungs were removed. The clot was harvested from the abdominal cavity two minutes after the sectioning of the abdominal aorta and vena cava. The organs and clot were placed into plastic flasks, weighed, and studied for the emission of radioactivity in a gamma radiation detector. The uptake function of each organ was calculated from the count of gamma rays emitted per minute, normalized by the organ mass, with the radioactivity count of a standard sample as a reference. The arithmetic mean of each organ's uptake was calculated and compared among the groups. Results After irradiation, the splenic uptake of the colloidal radiopharmaceutical was greater, while the hepatic, renal, and pulmonary uptakes were lower. The renal uptake decreased more slowly than the hepatic and pulmonary uptakes. Conclusion A single high dose of high-energy X-rays enhances the splenic clearance function, while reducing the hepatic, renal, and pulmonary clearance up to 48 h after irradiation, with a rapid deterioration of the hepatic and pulmonary uptake function.
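The abstract does not give the exact normalization formula. One plausible reading (an assumption on our part, not the authors' stated method) expresses uptake as counts per minute per gram of tissue relative to the standard sample:

```python
def organ_uptake(organ_cpm, organ_mass_g, standard_cpm):
    # Hypothetical normalization: the organ's activity as a fraction of
    # the standard sample's activity, per gram of tissue. The paper does
    # not spell out the exact formula; this is an illustrative sketch.
    return (organ_cpm / standard_cpm) / organ_mass_g

# e.g. an organ emitting half the standard's counts, weighing 2 g:
print(organ_uptake(5000, 2.0, 10000))  # → 0.25
```

Dividing by the standard's count cancels detector efficiency and decay effects between measurement sessions, which is why such a reference sample is typically counted alongside the organs.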

819. Antibiotic Resistance in Pathogens Causing Hospital Acquired Infections in Brazil: A Multicenter Study

Abstract Background In the present study, we determined the prevalence of antibiotic resistance in the most common organisms causing healthcare-associated infections in tertiary-care hospitals in Belo Horizonte, a city of 3,000,000 inhabitants in Brazil. Methods Microbiology data on hospital-acquired infections (HAI), defined by the National Healthcare Safety Network (NHSN)/CDC protocols, from seven general hospitals were analyzed: three public institutions, two philanthropic, and two private hospitals. Samples from different topographies were plated on an appropriate culture medium and, after growth, the microorganisms were identified by standard biochemical and microbiological methods using the VITEK 2 compact system (Biomerieux), which allows the simultaneous identification of Gram-positive and Gram-negative bacteria and combines the identification and antimicrobial susceptibility results in a single report. Six hospitals used automated methods and one institution used a manual method for antimicrobial susceptibility testing. Results Samples of seven Gram-negative and two Gram-positive bacteria collected between Dec/2019 and Nov/2020 from HAI isolates were analyzed: 565 Klebsiella, 293 Escherichia coli, 153 Proteus, 403 Pseudomonas, 275 Acinetobacter, 174 Serratia, 153 Enterobacter, 361 Staphylococcus aureus, and 176 Enterococcus. The antibiotic resistance profile of each strain is summarized in Figures 1 (Klebsiella, E. coli, Proteus), 2 (Pseudomonas, Acinetobacter, Serratia), and 3 (Enterobacter, S. aureus, Enterococcus). Conclusion Benchmarks for antibiotic resistance in the most common organisms causing healthcare-associated infections were defined and can be used as indicators for healthcare assessment, especially in institutions in developing countries. Disclosures All Authors: No reported disclosures

Percutaneous hemiepiphysiodesis using transphyseal screws for adolescent tibia vara.

Hemiepiphysiodesis around the knee is becoming the mainstay procedure in adolescents for a wide range of aetiological deformities; however, for adolescent tibia vara (ATV), the published series have variable results. The purpose of this study was to review our experience with the percutaneous transphyseal screw (PETS) in these patients, followed until bone maturity. We analysed the charts of 13 patients (20 knees) who underwent lateral tibial hemiepiphysiodesis using PETS. The radiographs were assessed before surgery, at implant removal (when performed), and at the final follow-up. The clinical evaluation noted any complaints regarding pain or range of motion, and the radiographic assessment included the femorotibial angle, the mechanical axis zone, the anatomic lateral distal femoral angle, and the medial mechanical proximal tibial angle. There was one overcorrection, and after screw removal (14 knees), rebound was observed in two knees, modifying the result from excellent to good in all three knees. No bone bars and no implant breakage were observed. At the last appointment, all patients had normal knee range of motion, and two patients had unilateral alignment complaints, one of whom reported occasional pain. Overall, the result was excellent in 12 knees (60%), good in six knees (30%), and poor in two knees (10%). Our findings indicate that this technique is well tolerated and effective for treating ATV. When a complete correction cannot be obtained, in our opinion, it is advantageous to at least stabilise the deformity and postpone osteotomies until after skeletal maturity. Level of Evidence: Level IV - Case Series, Therapeutic Study.

Protocol of BRICS: Brazilian multicentric pragmatic randomised trial of surgical interventions for displaced diaphyseal clavicle fracture study: MIPO versus ORIF for the treatment of displaced midshaft clavicle fractures

Introduction Fractures of the diaphysis of the clavicle are common; however, treatment guidelines for this condition are lacking. Surgery is associated with a lower risk of non-union and better functional outcomes, but a higher risk of complications. Open reduction and internal fixation with plates and screws is the most commonly performed technique, but it is associated with paraesthesia in the areas of the incisions, extensive surgical exposure, and high rates of implant removal. Minimally invasive techniques for treating these fractures have a lower rate of complications. The aim of this study is to evaluate which surgical treatment option (minimally invasive osteosynthesis or open reduction and internal fixation) has a better prognosis in terms of complications and reoperations. Methods and analysis The proposed study is a multicentric, pragmatic, randomised, open-label, superiority clinical trial comparing minimally invasive osteosynthesis and open reduction and internal fixation for the surgical treatment of patients with displaced fractures of the clavicle shaft. In the proposed study, 190 individuals with displaced midshaft clavicle fractures who require surgery will be randomised. Assessments will occur at 2, 6, 12, 24 and 48 weeks. The primary outcome of the study will be the number of complications and reoperations. For the sample size calculation, a moderate effect size between the techniques was considered in a two-tailed test, with 95% confidence and 90% power. Complications include cases of infection, hypertrophic scarring, non-union, refracture, implant failure, hypoesthesia, skin irritation, and shoulder pain. Reoperations are defined as the number of surgeries for pseudoarthrosis, implant failure, infection, and elective removal of the implant. Ethics and dissemination The study was approved by the institutional ethics committee (number 34249120.9.0000.5505 - V.3). The results will be disseminated through publications in peer-reviewed journals and presentations at medical meetings. Trial registration number RBR-3czz68/UTN U1111-1257-8953.
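The protocol gives only the inputs to the sample size calculation. As a sketch, assuming the usual normal-approximation formula for a two-group comparison and reading "moderate effect size" as Cohen's d = 0.5 (our assumptions; the protocol does not name the formula):

```python
from math import ceil
from statistics import NormalDist

def n_per_group(d, alpha=0.05, power=0.90):
    # Normal approximation for a two-sample, two-tailed comparison:
    # n per group ≈ 2 * (z_{alpha/2} + z_beta)^2 / d^2
    z_a = NormalDist().inv_cdf(1 - alpha / 2)
    z_b = NormalDist().inv_cdf(power)
    return ceil(2 * (z_a + z_b) ** 2 / d ** 2)

print(n_per_group(0.5))  # → 85 per group, i.e. 170 in total
```

This yields 170 individuals before losses, which is consistent with the 190 to be randomised once roughly 10% attrition is allowed for.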

A prospective validation and comparison of three multivariate models for prediction of difficult intubation in adult patients

Purpose Several bedside clinical tests have been proposed to predict difficult tracheal intubation. Unfortunately, when used alone, these tests show less than ideal prediction performance. Multivariate tests have been proposed on the premise that combining criteria could yield better prediction performance. The goal of our research was to compare three previously described multivariate models in a group of adult patients undergoing general anesthesia. Methods This study included 220 patients scheduled for elective surgery under general anesthesia. A standardized airway evaluation, which included modified Mallampati class (MM), thyromental distance (TMD), mouth opening distance (MOD), head and neck movement (HNM), and jaw protrusion capacity, was performed before anesthesia. The multivariate models described by El-Ganzouri et al., Naguib et al., and Langeron et al. were calculated using the airway data. After anesthesia induction, an anesthesiologist performed the laryngoscopic classification and tracheal intubation. The sensitivity, specificity, and receiver operating characteristic (ROC) curves of the models were calculated. Results The overall incidence of difficult laryngoscopic view (DLV) was 12.7%. The areas under the curve (AUC) for the Langeron, Naguib, and El-Ganzouri models were 0.834, 0.805, and 0.752, respectively (Langeron > El-Ganzouri, p = 0.004; Langeron = Naguib, p = 0.278; Naguib = El-Ganzouri, p = 0.101). The sensitivities were 85.7%, 67.9%, and 35.7% for the Langeron, Naguib, and El-Ganzouri models, respectively. Conclusion The Langeron model had higher overall prediction performance than the El-Ganzouri model. Additionally, the Langeron score had higher sensitivity than the Naguib and El-Ganzouri scores, and therefore yielded a lower incidence of false negatives.
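As a consistency check on the reported sensitivities: with 220 patients and a 12.7% DLV incidence there were about 28 difficult laryngoscopies, and true-positive counts of 24, 19, and 10 reproduce the three percentages (the function name is ours):

```python
def sensitivity(tp, fn):
    # Proportion of truly difficult airways the score actually flagged
    return tp / (tp + fn)

n_difficult = round(220 * 0.127)  # about 28 difficult laryngoscopic views
# True-positive counts of 24, 19 and 10 reproduce the reported figures
for tp in (24, 19, 10):
    print(round(100 * sensitivity(tp, n_difficult - tp), 1))
# → 85.7, 67.9, 35.7 (Langeron, Naguib, El-Ganzouri)
```

A lower false-negative count is what the conclusion refers to: at 85.7% sensitivity the Langeron score misses about 4 of the 28 difficult cases, versus 9 and 18 for the other two scores.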

Food preferences and aversions of patients undergoing chemotherapy, radiotherapy and/or hematopoietic stem cell transplantation.

This longitudinal, qualitative, descriptive, and exploratory study aimed to identify and understand the food preferences and aversions arising from hematopoietic stem cell transplantation (HSCT), chemotherapy, and/or radiotherapy. Open, individual interviews were carried out with patients diagnosed with hematological diseases or cancer who underwent HSCT, chemotherapy, and/or radiotherapy. The participants answered the following questions: "Have you experienced any changes in taste since the beginning of radiotherapy/chemotherapy?" and "Have you experienced any strange taste in your mouth, or an aversion to or preference for a certain food that did not exist before the beginning of radiotherapy/chemotherapy?" The software IRAMUTEQ (R Interface for Multidimensional Analysis of Texts and Questionnaires) version 0.7 alpha 2 was used for textual analysis, with similarity analysis and word clouds. One hundred and forty-six patients were included in the study, 50% (n=73) female and 73% (n=50) elderly. The main words reported by the participants with regard to food aversions were "meat", "beef", and "chicken", which are related to dysphagia. Regarding food preferences, the most mentioned words were "fruits", "juices", and "soups", whose consumption was associated with an improvement in gastrointestinal symptoms, especially nausea. Adjustments to the diet plan based on this information can contribute to better acceptance of the diet and to the clinical and nutritional prognosis.
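IRAMUTEQ performs the similarity analysis and word clouds; as a rough stand-in for its word-frequency step only (illustrative code, with hypothetical example answers and a toy stopword list):

```python
from collections import Counter
import re

def top_words(answers, k=3, stopwords=frozenset({"the", "a", "of", "and"})):
    # Tokenize free-text interview answers and count the most frequent
    # content words, the raw material of a word cloud
    tokens = []
    for text in answers:
        tokens += [w for w in re.findall(r"[a-z]+", text.lower())
                   if w not in stopwords]
    return [w for w, _ in Counter(tokens).most_common(k)]

# Hypothetical example answers, not quotes from the study
answers = ["I cannot eat beef or chicken",
           "the smell of beef bothers me",
           "beef and chicken taste metallic"]
print(top_words(answers, k=2))  # → ['beef', 'chicken']
```

The actual analysis is richer (similarity graphs link words that co-occur within answers), but frequency counting of this kind is what surfaces terms such as "meat", "beef", and "chicken" from the interviews.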

Association of dietary total antioxidant capacity with anthropometric indicators, C-reactive protein, and clinical outcomes in hospitalized oncologic patients

Objective Many studies have shown an inverse association between higher dietary total antioxidant capacity (DTAC) and chronic non-communicable diseases, including cancer. The aim of this study was to evaluate the association of DTAC with anthropometric and biochemical indicators and clinical outcomes in hospitalized patients with cancer. Methods A cross-sectional study was carried out with 196 hospitalized patients diagnosed with cancer. The DTAC, determined by the ferric-reducing antioxidant power method, was calculated using a validated standard spreadsheet. Multivariate linear regression was used to assess the association between DTAC, the anthropometric indicators, and the variables of interest. P < 0.05 was considered statistically significant. Results The individuals in the last tertile of DTAC presented lower occurrences of death (P = 0.032), constipation (P = 0.010), dysphagia (P = 0.010), painful swallowing and chewing (P = 0.019), and dehydration (P = 0.032) than individuals in the first tertile. The C-reactive protein values were significantly lower (P = 0.010) and handgrip strength values higher (P = 0.037) in individuals in the third tertile than in the other participants. Conclusions DTAC was associated with a better prognosis in hospitalized cancer patients, considering signs and symptoms with nutritional impact, as well as the patients' inflammatory state. These factors may influence the length of hospital stay and mortality. These findings provide important information from a preventive and nutritional management perspective in this population.
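The tertile comparison above is straightforward to reproduce. A minimal sketch, assuming the standard split (T1 = lowest third of DTAC values, T3 = highest):

```python
from statistics import quantiles

def dtac_tertiles(values):
    # Split DTAC values into tertiles: returns 1, 2 or 3 for each subject
    c1, c2 = quantiles(values, n=3)  # the two tertile cut points
    def tertile(v):
        return 1 if v <= c1 else 2 if v <= c2 else 3
    return [tertile(v) for v in values]

print(dtac_tertiles(list(range(1, 10))))  # → [1, 1, 1, 2, 2, 2, 3, 3, 3]
```

Outcomes (death, constipation, dysphagia, and so on) are then cross-tabulated between the first and third groups, as in the Results above.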
