Articles published on Sustainable Use
21293 Search results
- New
- Research Article
- 10.1002/wll2.70031
- Jan 16, 2026
- Wildlife Letters
- Feng Jiang + 8 more
ABSTRACT Amidst the global biodiversity crisis, conserving endangered species like musk deer (Moschus spp.) is crucial. China holds the highest musk deer diversity worldwide, with abundant populations, wide distribution, and significant musk production. We reviewed the morphology, population trends, distribution, conservation status, captive breeding, and related challenges of musk deer to inform targeted conservation strategies. Currently, wild musk deer populations are primarily distributed in western, central, and northeastern regions of China. Over the past seven decades, illegal hunting and habitat fragmentation have caused severe population declines, reducing most populations by over 97% since the 1950s. To alleviate pressure on wild populations and promote sustainable resource use, China initiated captive breeding of musk deer in 1958, resulting in a 5.6‐fold increase in captive musk deer from the 1990s to the 2020s. However, challenges like serious diseases hinder further expansion of artificial musk production. Given the limitations of both in situ and ex situ conservation, it is advised to establish or optimize natural reserves in musk deer habitat. Strengthening law enforcement, population monitoring, ecological research, captive breeding, reintroduction, and public awareness is essential for global musk deer conservation.
- New
- Research Article
- 10.3389/fmicb.2025.1729855
- Jan 15, 2026
- Frontiers in Microbiology
- Anicia Gomes + 5 more
This study evaluated whether irrigation with treated wastewater of different microbiological quality (secondary- and tertiary-treated wastewater) contributes to the transmission of antibiotic-resistant bacteria (ARB) and antimicrobial resistance genes (ARGs) from irrigation water to lettuce plants, using potable water as control. Bacterial indicators (Escherichia coli and extended-spectrum β-lactamase-producing E. coli, ESBL-E. coli) and ARGs (blaCTX-M-1, blaTEM, sul1, tetA) were quantified in irrigation water and lettuce using culture-based methods and quantitative PCR (qPCR). In addition, the efficiency of tertiary treatment in reducing E. coli, ESBL-E. coli, and resistance genes in reclaimed water was assessed. The relative abundance of ARGs was normalized to the 16S rRNA gene to evaluate potential amplification or persistence of resistance during water reuse and irrigation. Results showed that E. coli and ESBL-E. coli were consistently detected in crops irrigated with secondary-treated water but remained below detection limits after irrigation with tertiary-treated and potable water. Resistance gene profiles followed a similar trend: secondary-treated water contained the highest absolute and relative abundances of blaCTX-M-1, blaTEM, sul1, and tetA, while tertiary treatment substantially reduced but did not completely eliminate them. ARG levels on lettuce were substantially lower than in the corresponding irrigation waters, representing only 4% and 6% of the concentrations detected in tertiary- and secondary-treated wastewater, respectively. This reduction indicates limited transfer and/or persistence of ARGs on the plant surface despite detectable levels in the irrigation water.
Our study provides valuable insights into the role of poor-quality irrigation water in driving ARGs dissemination to fresh produce and shows that advanced tertiary treatments significantly reduce AMR-related risks, thereby supporting the safe and sustainable use of reclaimed water in agriculture.
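The 16S normalization described in this abstract is simple arithmetic: relative abundance is the ratio of ARG copies to 16S rRNA gene copies in the same sample. A minimal sketch, with purely illustrative copy numbers that are not the study's measurements:

```python
# Sketch of ARG relative-abundance normalization to the 16S rRNA gene,
# as described in the abstract. Gene names and qPCR copy numbers below
# are illustrative placeholders, not the study's data.

def relative_abundance(arg_copies, rrna_copies):
    """ARG copies per 16S rRNA gene copy in the same sample."""
    if rrna_copies <= 0:
        raise ValueError("16S rRNA copy number must be positive")
    return arg_copies / rrna_copies

# Hypothetical qPCR results (gene copies per mL of irrigation water)
samples = {
    "secondary_treated": {"blaTEM": 1.2e4, "16S": 3.0e7},
    "tertiary_treated":  {"blaTEM": 8.0e2, "16S": 2.5e7},
}

for name, q in samples.items():
    ra = relative_abundance(q["blaTEM"], q["16S"])
    print(f"{name}: blaTEM/16S = {ra:.2e}")
```

Normalizing to 16S in this way separates a genuine enrichment of resistance genes from a simple change in total bacterial load between the two treatment levels.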
- New
- Research Article
- 10.33182/bc.v16i1.2937
- Jan 15, 2026
- Border Crossing
- Maryam Liman
This study examines the dynamics of migration from southern Niger to northern Nigeria, focusing on the processes that lead to de-agrarianization. Data were collected through household interviews and focus group discussions with migrants in Daura, Katsina, ‘Yar Shanya, Magama Jibiya, Kano, and surrounding areas. Findings reveal that migration is driven by multiple factors including economic (bida), seasonal (ci rani), educational, health, and business-related motives. Most migrants initially engage in circular or seasonal migration, returning home during the rainy season for farming, but after six to ten years many transition into permanent settlement. This shift is often accompanied by household restructuring, where spouses either relocate from Niger or Nigerian partners join the household. Migrants sustain links to their origins through remittances, facilitated informally via personal networks. Results further show that economic migrants, particularly ‘yan ci rani, are the group most associated with de-agrarianization, gradually abandoning farming for non-agricultural livelihoods. While this transition enhances income opportunities in host communities, it simultaneously contributes to farmland abandonment and declining agricultural productivity in areas of origin. The study concludes that de-agrarianization poses a growing threat to food security in the Sahel, underscoring the need for policies that encourage sustainable land use, strengthen agricultural support systems, and provide livelihood diversification strategies that complement rather than replace farming.
- New
- Research Article
- 10.62383/edukasi.v3i1.2783
- Jan 14, 2026
- Edukasi Elita : Jurnal Inovasi Pendidikan
- Alya Puspita + 4 more
The use of technology in elementary school learning plays a crucial role in improving the quality of the educational process. One frequently used tool is the infocus (LCD projector), which serves to present learning materials in visual and audiovisual forms. The purpose of this study was to assess the effectiveness of using infocus-based learning media in elementary schools and its impact on student learning. The method used was a qualitative approach, collecting data through observation, interviews, and documentation. The results showed that the use of infocus can improve students' concentration, motivation, and understanding of the subject matter, as the presentation becomes more engaging and easier to understand. However, the use of infocus in elementary schools still faces various challenges, such as limited facilities and infrastructure, teacher readiness to operate the equipment, and technical issues such as power supply problems and the need for additional devices. Therefore, supporting facilities and improved teacher skills are needed to ensure the effective and sustainable use of infocus as a learning medium.
- New
- Research Article
- 10.1038/s41598-025-33170-9
- Jan 14, 2026
- Scientific reports
- Dawa Chyophel Lepcha + 7 more
Underwater images typically suffer from poor visibility, low contrast, and severe color distortion caused by wavelength-dependent absorption and scattering of light. These degradations not only reduce visual quality but also affect subsequent analysis and interpretation in marine and robotic imaging applications. To address these challenges, this study presents an efficient underwater image enhancement (UIE) framework that integrates color balancing, morphological residual processing, and gamma correction to achieve natural color restoration and structural enhancement. Initially, an adaptive color compensation strategy corrects the imbalance in red and blue channels, followed by morphological residual processing that refines fine textures while suppressing unwanted noise. The enhanced outputs are then fused through an adaptive multiscale fusion process guided by optimized weight maps to preserve both global illumination and local detail. A final gamma correction step ensures perceptually balanced contrast and brightness. The proposed method requires no training data or prior depth estimation, making it computationally efficient and robust for real-time applications. Extensive experiments conducted on multiple benchmark underwater datasets demonstrate that the proposed approach consistently outperforms 22 state-of-the-art UIE techniques in both qualitative and quantitative assessments. The method achieves superior results in terms of peak signal-to-noise ratio (PSNR), structural similarity index measure (SSIM), underwater image quality measure (UIQM), and underwater color image quality evaluation (UCIQE) metrics, confirming its capability to restore realistic colors, enhance visibility, and preserve fine details. The proposed framework provides an effective and lightweight solution for practical underwater imaging enhancement.
This work supports SDG 14 (Life Below Water) by enhancing underwater imagery for marine monitoring, SDG 9 (Industry, Innovation and Infrastructure) through an efficient real-time imaging framework, and SDG 12 (Responsible Consumption and Production) by enabling accurate underwater inspection that promotes sustainable resource use.
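Two of the steps named in this abstract, red-channel compensation and gamma correction, admit compact sketches. The compensation heuristic and gamma value below are common choices in the UIE literature, assumed for illustration rather than taken from the paper:

```python
import numpy as np

# Minimal sketch of two UIE steps named in the abstract: red-channel
# compensation against the green channel and gamma correction. The
# exact formulas are assumptions (Ancuti-style heuristic), not the
# authors' method.

def compensate_red(img):
    """img: float RGB array in [0, 1], shape (H, W, 3)."""
    r, g = img[..., 0], img[..., 1]
    # Boost red proportionally to the green surplus, more where red is weak
    r_comp = r + (g.mean() - r.mean()) * (1.0 - r) * g
    out = img.copy()
    out[..., 0] = np.clip(r_comp, 0.0, 1.0)
    return out

def gamma_correct(img, gamma=0.8):
    """gamma < 1 brightens; gamma > 1 darkens."""
    return np.clip(img, 0.0, 1.0) ** gamma

# Toy image with an attenuated red channel, as happens underwater
img = np.random.default_rng(0).uniform(0.0, 1.0, size=(4, 4, 3))
img[..., 0] *= 0.3
enhanced = gamma_correct(compensate_red(img))
print(enhanced.shape)
```

In the paper's full pipeline these corrected outputs would then feed the morphological residual processing and multiscale fusion stages; the sketch only shows the color-side operations.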
- New
- Research Article
- 10.1371/journal.pstr.0000214
- Jan 13, 2026
- PLOS Sustainability and Transformation
- Morena Merkelbach + 3 more
Degraded peatlands are significant contributors to greenhouse gas emissions, requiring innovative strategies for their restoration and sustainable use. Paludiculture, or wet agriculture, is an approach to align climate protection with productive land use. However, its adoption requires a deeper understanding of farmers’ attitudes and targeted support to facilitate their transition from conventional to more sustainable alternatives. This study investigates pioneer farmers’ motivations and barriers to adopting paludiculture in Germany. Using a qualitative approach guided by the Health Belief Model (HBM), we conducted semi-structured interviews with 18 German farmers engaged in or transitioning to paludiculture. Farmers perceived multiple threats associated with conventional drainage-based agriculture, including climate change impacts, soil degradation, and economic risks, while acknowledging paludiculture’s benefits for reducing emissions, biodiversity conservation, and water management. Farmers saw themselves as confident, innovative, and cooperative—key qualities for adopting paludiculture. Nonetheless, adoption was hindered by economic challenges, bureaucratic hurdles, and limited capacity building. Support programs, networks, and historical legacies were seen as important enablers. Yet, wider adoption will require expanded support programs, strengthened markets for paludiculture products, and adaptive regulatory frameworks. Pioneer farmers’ experiences can inspire others, and the leadership of experienced practitioners will be vital for driving broader adoption. Empowering farmers as agents of change and fostering collaboration among stakeholders are essential to unlocking the full potential of paludiculture as a sustainable wetland-use strategy.
- New
- Research Article
- 10.1080/15567036.2025.2603509
- Jan 13, 2026
- Energy Sources, Part A: Recovery, Utilization, and Environmental Effects
- Khokan Sahoo + 3 more
ABSTRACT Waste-to-energy and the maximum utilization of refinery by-products have become strong scientific interests, with gasification emerging as a sustainable pathway for effective conversion. In this context, detailed insight into petroleum pitch gasification is crucial for enabling its efficient and sustainable use as a viable feedstock in industrial gasification. The present study conducted TGA experiments at heating rates of 1.5, 2.25, 2.6, and 3.5°C/min. Multi-model isoconversional kinetic methods, including the Friedman, Flynn–Wall–Ozawa (FWO), Kissinger–Akahira–Sunose (KAS), and Starink methods, were employed to evaluate the activation energy over a wide range of conversions. In addition, the model-fitting Coats–Redfern and Criado methods were applied to further elucidate the reaction model and capture the complexity of gasification. The mean activation energies were 158.5, 160.3, 164.0, and 160.6 kJ/mol for the Friedman, KAS, FWO, and Starink methods, respectively. The gasification process was best described by the contracting-sphere model. Thermodynamic parameters (ΔHα, ΔGα, ΔSα), estimated using the Eyring equation, indicated the endothermic and non-spontaneous nature of gasification. SEM analysis of the intermediate residual chars revealed intermediate coke structures, pore development, and crack formation, which acted as active reaction sites during gasification. This multi-dimensional approach provides a comprehensive framework for understanding, modeling, and optimizing petroleum pitch gasification systems by integrating kinetics and thermodynamics.
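The KAS regression named in this abstract reduces to a straight-line fit: at a fixed conversion, ln(β/T²) plotted against 1/T has slope −Ea/R. The sketch below demonstrates this on fabricated data; the temperatures and intercept are arbitrary, chosen only so the synthetic points satisfy the KAS relation with a known activation energy close to the abstract's mean values:

```python
import numpy as np

R = 8.314  # gas constant, J/(mol K)

def kas_activation_energy(betas, temps):
    """Kissinger-Akahira-Sunose method: at fixed conversion, regress
    ln(beta / T^2) on 1/T across heating rates; slope = -Ea / R."""
    betas = np.asarray(betas, dtype=float)
    temps = np.asarray(temps, dtype=float)
    slope, _ = np.polyfit(1.0 / temps, np.log(betas / temps**2), 1)
    return -slope * R  # Ea in J/mol

# Synthetic self-check: fabricate (beta, T) pairs satisfying the KAS
# relation ln(beta/T^2) = C - Ea/(R T) with Ea = 160 kJ/mol
Ea_true = 160e3
temps = np.array([850.0, 870.0, 890.0, 910.0])  # K, illustrative only
C = -8.0                                        # arbitrary intercept
betas = temps**2 * np.exp(C - Ea_true / (R * temps))
print(kas_activation_energy(betas, temps) / 1e3)  # recovers ~160 kJ/mol
```

The FWO and Starink methods differ only in the temperature exponent and constants of the left-hand side, so the same regression skeleton applies with ln(β) or ln(β/T^1.92) in place of ln(β/T²).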
- New
- Research Article
- 10.1038/s41598-026-36106-z
- Jan 13, 2026
- Scientific reports
- Marran Al Qwaid + 3 more
Sustainable agriculture in arid regions faces critical challenges due to water scarcity, high temperatures, and inefficient traditional farming practices. This study presents an AI-enabled smart farming framework for optimizing date palm (Phoenix dactylifera) cultivation through the integration of Machine Learning (ML) and Internet of Things (IoT) technologies. A structured multimodal dataset comprising biometric features (palm height, trunk diameter, and leaf number), environmental parameters (soil moisture, temperature, and humidity), and categorical attributes (variety and health status) was analyzed to classify palm health and support data-driven irrigation management. Four ML algorithms, Random Forest (RF), Gradient Boosting Machine (GBM), Artificial Neural Network (ANN), and Support Vector Machine (SVM), were developed and optimized using grid search with five-fold cross-validation. Among them, the Random Forest model achieved the highest classification accuracy of 95.3%, demonstrating strong robustness for heterogeneous agricultural data. Feature importance analysis highlighted soil moisture, humidity, trunk diameter, and leaf number as key contributors to palm health prediction. The proposed AI-IoT framework enables real-time monitoring, predictive diagnostics, and automated decision support for sustainable water use and crop management, aligning with Saudi Vision 2030 objectives for technology-driven and resource-efficient agriculture.
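The tuning procedure this abstract describes, grid search with five-fold cross-validation over a Random Forest, can be sketched as follows. The features, labeling rule, and hyperparameter grid are synthetic stand-ins, not the study's dataset or settings:

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import GridSearchCV

rng = np.random.default_rng(42)
n = 200
# Stand-in features: soil moisture, humidity, trunk diameter, leaf number
X = rng.uniform(0.0, 1.0, size=(n, 4))
# Hypothetical rule: a palm is "healthy" when moisture + humidity is high
y = ((X[:, 0] + X[:, 1]) > 1.0).astype(int)

grid = {"n_estimators": [50, 100], "max_depth": [3, None]}
search = GridSearchCV(
    RandomForestClassifier(random_state=0),
    param_grid=grid,
    cv=5,                 # five-fold cross-validation, as in the abstract
    scoring="accuracy",
)
search.fit(X, y)
print(search.best_params_, round(search.best_score_, 3))
```

The same `GridSearchCV` wrapper would be reused for the GBM, ANN, and SVM candidates, with each model's own grid, before comparing their cross-validated accuracies.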
- New
- Research Article
- 10.3390/su18020812
- Jan 13, 2026
- Sustainability
- Bobby Thapa + 8 more
Maple syrup production has the potential to promote sustainable rural economic development in regions with suitable forest and climate conditions. Kentucky emerges as a promising candidate due to its extensive maple tree inventory and favorable seasonal patterns. However, the broader economy-wide implications of developing a maple syrup industry in the state remain underexplored. To fill this knowledge gap, this study employs a customized static single-region computable general equilibrium (CGE) modeling approach for Kentucky under nine scenarios based on production capacities and potential levels. The results consistently show positive impacts on net household income, social welfare (measured by equivalent variation), government revenues, and state GDP across all scenarios. Medium production capacities generate the most balanced and efficient outcomes, while high-potential scenarios, especially at small and large scales, produce the largest absolute gains. These results underscore the viability of maple syrup production as an economic development strategy and highlight the role of production scale in maximizing benefits. Furthermore, expanding maple syrup production can enhance rural livelihoods by diversifying forest-based income and promoting long-term stewardship. As a non-timber forest product, maple syrup tapping provides economic incentives to maintain healthy forests, strengthening rural sustainability and resilience. Our findings indicate that developing this industry beyond traditional regions can generate meaningful economic benefits while encouraging sustainable resource use when appropriately scaled and managed.
- New
- Research Article
- 10.2196/79349
- Jan 12, 2026
- JMIR Perioperative Medicine
- Charlé Steyl + 5 more
Background: Perioperative patient-reported outcomes (PROs) allow patients to share their experiences of surgical procedures with their health care teams using standardized measures. Despite increasing recognition of their value, PROs are not routinely used in clinical practice, partly due to limited evidence of their impact on traditional clinical outcomes and uncertainty among clinicians about their use. Digital health tools offer a promising way to integrate PROs into clinical workflows and enhance patient-clinician interaction, but their success depends on person-centered design to ensure usability and relevance. Safe Surgery South Africa, a nonprofit organization, developed the Perioperative Shared Health Record (PSHR), a secure web-based tool that enables patients to share personal health information and PROs with their anesthetist and surgeon before and after surgery. Initial implementation revealed significant user experience challenges, which contributed to poor uptake. Objective: This study aimed to explore factors influencing the PSHR user experience in a low- and middle-income country (LMIC) using human-centered design principles. Methods: This observational qualitative user experience study followed the 5 design thinking stages: empathize, define, ideate, prototype, and test. Semistructured interviews were conducted with postoperative patients from both the public and private health care sectors, including those with and without prior experience using the PSHR. Thematic analysis followed the 6-phase framework described by Braun and Clarke and was structured using Karagianni’s Optimized Honeycomb user experience model. A problem statement was developed, followed by ideation to explore solutions.
Paper prototypes were created, refined, and tested through observation, interviews, and validated usability questionnaires. Results: In the empathize stage, 22 interviews were conducted in the private and public health care sectors in South Africa; 7 participants had previous experience using the PSHR. In the define stage, participants emphasized the need for connection, feedback, information, and support through their surgical journey. Contrary to expectations, patients were not discouraged by the length of questionnaires if they perceived them as purposeful. In the ideate stage, the team considered user expectations and PSHR integration into care processes. In the prototype stage, low-fidelity mock-ups were created and refined into paper prototypes. In the test stage, testing with 5 participants highlighted the importance of trust, communication, and user-friendly interfaces. Feedback loops and clinician engagement were identified as key motivators for sustained use. The mean usability questionnaire scores indicated excellent usability and high levels of user satisfaction across most domains. Conclusions: This study is one of the first to apply human-centered design principles to a perioperative digital health tool in an LMIC setting, addressing usability challenges and patient engagement. Key user experience factors influencing patient engagement included communication, feedback, and access to information throughout the surgical journey. Digital health tools such as the PSHR can strengthen communication and support person-centered perioperative care by integrating PROs into clinical workflows and care processes.
- New
- Research Article
- 10.1080/1051712x.2025.2612375
- Jan 12, 2026
- Journal of Business-to-Business Marketing
- Doreen E Shanahan + 1 more
ABSTRACT Purpose Enterprise collaboration technologies have been recognized for enabling effective digital collaboration among employees and overcoming temporal and spatial barriers. However, organization-wide adoption of these types of new technologies remains an operational challenge to generating business value. Thus, organizations use interventions, such as change management programs, to support systems implementations. Prior literature on social networks has shown that interactions among interdependent employees influence technology adoption and use. The purpose of this paper is to examine the impact of a service provider’s change management program on end users’ subsequent advocacy to others in their network, as well as to understand the complex downstream effects of advocacy on current usage, continued usage, and intention to recommend. This research recognizes the critical role of peer recommendation in network good adoption and positions net promoter score as a key predictor of sustained system use. Methodology/approach This study adopts a quantitative approach using regression analyses based on survey data. Data for the study were collected from 587 employees, as part of a change management project supporting a large-scale customer digital transformation, regarding their perceptions of the change and experience with the new enterprise collaboration technology. Findings The results of the research highlight the role of specific dimensions related to change management in explaining subsequent advocacy outcomes. According to the data, the change management process positively impacts end users’ subsequent likelihood to recommend a new digital collaboration technology to peers in their organization.
The empirical research demonstrated that higher advocacy of a new enterprise collaboration technology within one’s network significantly predicts one’s current use, which in turn impacts one’s intention to continue using the technology, and this sequential pathway ultimately influences one’s likelihood to recommend the technology to others in the network. Further, the effect of current use on one’s likelihood to recommend is mediated via future intentions to use. Contribution This research bridges knowledge on technology adoption, change management, and individual-level advocacy by examining the role of peer recommendation in adoption of enterprise collaboration technologies. While the net promoter score has been widely used in consumer behavior research focused on consumer goods industries, studies have yet to examine how the measure might be implemented as a key indicator of network good adoption at the individual enterprise level, by measuring employee-level net promoter scores within a client organization.
- New
- Research Article
- 10.1002/jcal.70186
- Jan 11, 2026
- Journal of Computer Assisted Learning
- Yu Xiao + 3 more
ABSTRACT Background Virtual reality (VR) offers immersive and authentic learning opportunities that can transform lecture delivery and student engagement. Despite growing adoption in education, little is known about how learners' cultural identity, self‐efficacy, and perceived authenticity shape their sustained use of VR learning environments. Objectives This study examined how cultural identity, self‐efficacy, and perceived authenticity predict Chinese university students' continued use of VR, with prior VR experience tested as a potential moderator. Methods A correlational survey was conducted with 833 students using validated instruments on cultural identity, VR self‐efficacy, authenticity gap, and VR usage. Data were analysed through structural equation modelling. Results and Conclusions The findings reveal that cultural identity (β = 0.32), self‐efficacy (β = 0.52), and perceived authenticity (β = 0.41) positively influence continued VR use. Prior VR experience strengthened these relationships. Beyond theoretical contributions extending expectation‐confirmation theory (ECT) with cultural and psychological dimensions, these results offer clear design and pedagogical implications. Developers should embed culturally responsive content and realistic environments, while educators can strengthen self‐efficacy through scaffolded VR learning and feedback.
- New
- Research Article
- 10.1093/inteam/vjag003
- Jan 10, 2026
- Integrated environmental assessment and management
- Humberto Castillo-González + 5 more
Graphene-related materials (GRMs) are revolutionizing sectors such as electronics, energy storage, agriculture, and biomedicine due to their exceptional properties. However, concerns are emerging about their environmental impact, particularly regarding their persistence, potential toxicity to aquatic ecosystems, and challenges in safe disposal. These issues highlight the need for more robust sustainable-by-design and risk-assessment strategies. In this context, this research investigated the influence of GRMs on lignin peroxidase (LiP) and laccase (Lac), key enzymes involved in lignin breakdown with significant potential in bioremediation. These enzymes are crucial for degrading complex molecules, and understanding their interaction with GRMs could provide valuable insights into the degradation of 2D nanomaterials, particularly graphene oxide (GO), few-layer graphene (FLG), and reduced graphene oxide (rGO). In vitro enzymatic assays conducted at varying GRM concentrations (12.5, 25.0, and 50.0 µg/mL) revealed that Lac remained unaffected, while LiP exhibited a noteworthy reduction in catalytic activity, particularly in the presence of GO at the highest concentration. A sequestration study to quantify the bioavailable fraction confirmed these effects, indicating significant enzyme loss, notably with GO at 50 µg/mL. These findings prompted a mechanistic exploration of enzyme inhibition dynamics, revealing the complex nature of GRM-enzyme interactions. By considering factors such as zeta potential (electrostatic forces), hydrophobicity, dispersion stability, and oxidation state, this study addresses a key knowledge gap and provides a foundation for understanding these interactions, offering crucial insights into the environmental fate of GRMs and guiding their sustainable use and management.
- New
- Research Article
- 10.3390/heritage9010018
- Jan 8, 2026
- Heritage
- Francesca Romana D’Ambrosio Alfano + 8 more
Radon exposure poses a significant health risk in underground cultural heritage sites, where limited ventilation and prolonged visitor presence can lead to elevated radon concentrations. While previous studies have concentrated on monitoring and mitigation strategies, few have developed a comprehensive approach that ensures both safe and sustainable site use. This research introduces an innovative methodology that integrates periodic/seasonal radon risk assessment with risk-informed access management based on periodic monitoring and time tracking. This approach is based on: (i) periodic monitoring to obtain representative concentrations; (ii) the calculation of permissible stay durations using a dose-based framework; (iii) implementation via access registration (badges) and procedural measures; and (iv) the application of mitigation measures when concentrations exceed limits (otherwise, the dose is evaluated in accordance with the applicable reference levels). This strategy was implemented and validated at the Roman Theatre in Herculaneum, a unique case study characterised by complex architectural constraints (as the theatre is completely underground) and high cultural significance. Results from years of monitoring, along with ongoing campaigns, demonstrate that this methodology not only reduces radon-related health risks but also enhances visitor experience. This integrated framework provides a replicable model for balancing conservation, safety, and accessibility in underground heritage sites.
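The dose-based stay calculation in step (ii) amounts to one formula: annual dose ≈ radon concentration × occupancy time × a dose conversion factor, inverted to bound occupancy. The coefficient and dose limit below are illustrative assumptions, not the values used in the Herculaneum study:

```python
# Sketch of a dose-based permissible-stay calculation of the kind the
# paper describes. The dose conversion factor (DCF) and annual limit
# are assumed ICRP-style illustrative values, not the study's numbers.

DCF = 6.7e-6           # mSv per (Bq h / m^3), assumed
ANNUAL_LIMIT_MSV = 6.0  # assumed annual dose constraint for staff/guides

def permissible_hours(radon_bq_m3, annual_limit_msv=ANNUAL_LIMIT_MSV):
    """Maximum annual occupancy (hours) before the dose limit is reached."""
    if radon_bq_m3 <= 0:
        raise ValueError("radon concentration must be positive")
    return annual_limit_msv / (radon_bq_m3 * DCF)

# e.g. at a measured 1500 Bq/m^3, occupancy must stay under ~600 h/year
print(round(permissible_hours(1500.0)))
```

Because the formula is linear in concentration, halving the measured radon level doubles the permissible stay, which is why the paper couples it to periodic monitoring and badge-based time tracking.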
- New
- Research Article
- 10.2196/69874
- Jan 7, 2026
- JMIR formative research
- Anton Elepaño + 7 more
Complimentary subscriptions to UpToDate, a decision support tool, were provided to community health workers (CHWs) in rural and remote primary care sites as part of a government-funded health system research program. A feasibility evaluation conducted after the first year of implementation showed that UpToDate was acceptable among CHWs despite infrastructural barriers. This follow-up study evaluated the longitudinal adoption of UpToDate among CHWs and examined how sociocultural, political, and environmental factors influenced its use. Drawing on the nonadoption, abandonment, scale-up, spread, and sustainability framework, this study aimed to understand not only use patterns but also broader challenges to scale-up, spread, and sustainability in a complex health system. An explanatory mixed methods design was used, combining analysis of use and program activity logs; program reports; and focus groups with CHWs, health care professionals, and program implementers. Quantitative analysis of use logs (March 2021 to September 2023) compared adoption over time by using descriptive statistics, CIs, and chi-square tests. Qualitative data came from the reanalysis of previous focus group transcripts and program reports and from a new focus group with program implementers. Reflexive thematic analysis was used to interpret how CHWs and implementers perceived and used the tool, and findings were integrated to explain quantitative trends. Use of UpToDate was modest and declined over time. Monthly active use among CHWs and midwives fell substantially from 3.57% (97/2720 person-months) in 2021 to 1.07% (37/3456) in 2022 and remained low at 1.50% (39/2592) up to 2023, with markedly higher engagement in the rural site than in the remote site. Peaks in use coincided with program activities, whereas prolonged troughs followed typhoons, power outages, and other disruptions. Log data showed that users primarily consulted patient education articles rather than clinician-oriented decision tools.
Qualitative analyses revealed that CHWs appropriated UpToDate as a learning aid and source of professional credibility. Structural shocks, heavy workloads, language barriers, and limited device access constrained individual use, and communal practices (shared devices and learning activities) meant that meaningful engagement often went unrecorded in vendor metrics. Our findings show that acceptability does not guarantee sustained use and that adoption cannot be captured fully by log-in counts. UpToDate's value for CHWs lay in how it was domesticated as a tool for building capacity and professional credibility, not in its intended function as a decision aid used at the point of care. Therefore, evaluations of digital health tools should incorporate indicators of learning and social capital alongside use metrics. Policymakers should recognize that infrastructural fragility and communal adaptation shape digital health uptake. Embedding tools into ongoing training and peer learning structures, providing offline and multilingual support, and investing in resilience planning will be crucial for meaningful scale-up and sustainability.
- New
- Research Article
- 10.1097/jxx.0000000000001237
- Jan 7, 2026
- Journal of the American Association of Nurse Practitioners
- Nahid Karimi + 3 more
Continuous glucose monitoring (CGM) has been associated with improved glycemic control in individuals with non-insulin-treated type 2 diabetes (T2D), but adoption in endocrinology clinics remains limited. This quality improvement (QI) project, a retrospective chart review, aimed to explore the use of CGM for non-insulin-treated patients with T2D within an endocrinology clinic in Los Angeles County and to evaluate the feasibility of its adoption in this practice setting. Eligible patients were adults (≥18 years) with T2D and hemoglobin A1c (HbA1c) >7% who had not been treated with insulin. Ten patients met inclusion criteria. Descriptive statistics summarized HbA1c and time-in-range (TIR) over 6 months. An interrupted time series analysis was also conducted on two patients with sufficient longitudinal data to assess HbA1c changes before and after CGM use. After 6 months of initial CGM use, 8 of 9 patients achieved HbA1c reductions of ≥0.3%. One patient demonstrated a ≥8% increase in TIR with adequate device use. Interrupted time series analyses illustrated individual HbA1c trajectories, showing immediate reductions after CGM initiation and nonsignificant downward trends over time. This QI project showed that patients with T2D treated with noninsulin medications successfully initiated and used CGM. Early improvements in HbA1c were observed; however, sustained glycemic outcomes varied depending on consistent CGM wear and adherence to follow-up, underscoring the need for individualized support. Nurse practitioners can play a key role in promoting sustained CGM use through structured diabetes education, integration of CGM data into lifestyle counseling, and regular follow-up.
- New
- Research Article
- 10.3390/su18020582
- Jan 6, 2026
- Sustainability
- Tomáš Peráček + 1 more
The development of sustainable smart cities is closely linked to the implementation of artificial intelligence in urban services, which opens up new possibilities for efficient resource management, improved quality of life, and stronger citizen participation. At the same time, the question arises as to how legal and strategic frameworks can support the use of artificial intelligence in a way that contributes to environmental, social, and economic sustainability in line with the objectives of the European Union. The aim of this study is to examine the interplay of artificial intelligence, data management, and sustainability at the European Union level, including support instruments such as regulatory initiatives and funding programs, and to assess their implementation in relation to smart cities. Methodologically, the research is based on a legal analysis of key European and national documents, supplemented by descriptive statistics and visualizations of indicators of digitalization and urban sustainability, together with the methods of synthesis, comparison, and abstraction. The results suggest that the legislative and support framework of the European Union can provide significant impetus for the transformation of individual smart cities but requires effective coordination and strategic management at the level of local governments. The research highlights the need for an integrated legal-managerial approach that enables the potential of artificial intelligence to be fully used in support of sustainable urban development.
- New
- Research Article
- 10.37772/2309-9275-2025-2(25)-7
- Jan 5, 2026
- Law and innovative society
- Anatoliy Hetman + 1 more
The article examines the place of sustainable land use and restoration in the legal framework for environmental and resource security in Ukraine in the context of wartime and post-war challenges. It is argued that land, as a basic element of the country's natural resource potential, performs economic, social, and environmental functions simultaneously and therefore requires a special legal regime for its use, protection, and restoration. Land degradation, pollution, and damage, exacerbated by the armed aggression of the Russian Federation, are shown to pose a significant threat not only to the ecological, but also to the food, economic, and overall national security of Ukraine. These circumstances require the search for effective organizational and legal methods and tools to preserve the national resource base and ensure guarantees for the realization of both land and environmental rights of citizens. The authors analyze doctrinal approaches to understanding ecological and resource security, its relationship with environmental, economic, and food security, and define the substantive characteristics of the categories of sustainable, ecologically balanced land use and land restoration. The content of the principle of priority of environmental safety requirements in land legislation is revealed, in particular through the provisions of the Land Code of Ukraine, the Law of Ukraine "On Land Protection" and other normative acts that establish the obligations of landowners and land users to preserve soil fertility and prevent negative anthropogenic impact on the state of land resources. Particular attention is paid to legal responses to violations of environmental standards in land use, in particular the termination of land rights.
It is concluded that the formation of an effective organizational and legal model for the sustainable use and restoration of land is a necessary prerequisite for ensuring environmental and resource security, fulfilling Ukraine’s European integration commitments, and successfully implementing the post-war restoration of the state.
- New
- Research Article
- 10.1101/2025.04.22.25326214
- Jan 5, 2026
- medRxiv
- Xiaoyue Zhu + 6 more
During the COVID-19 pandemic, many school systems began using educational software to identify students actively planning suicide and other acts of violence. This study examines associations between county-level youth suicide rates and the implementation of GoGuardian Beacon, a school-based software that uses machine learning methods to identify students at risk for suicide. Using difference-in-differences and event study methods, we analyzed 2018-2022 suicide data, comparing 70 counties with sustained Beacon implementation to 1,215 matched comparison counties that never implemented Beacon. In our primary analysis, counties that maintained consistent Beacon use had 24.4% lower youth suicide rates during 2021–2022 (p < 0.05). In sensitivity analyses defining implementation based on initial adoption regardless of subsequent use, the association was attenuated and not statistically significant. Taken together, these findings indicate that counties with sustained use of Beacon had lower youth suicide rates in our primary analyses, while also highlighting the possibility that broader contextual factors (e.g., local mental health infrastructure and school system characteristics) contribute to the observed differences. Randomized trials with prospective follow-up, more information on school and community resources, and data on the quality of Beacon response pathways after identification are needed to understand the effect of Beacon and clarify the independent contribution of digital monitoring tools within comprehensive youth suicide prevention strategies.
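The difference-in-differences design this abstract describes compares the pre-to-post change in implementing counties against the change in comparison counties, so that shared time trends cancel out. A minimal sketch of the estimator with made-up rates (none of these numbers come from the study):

```python
# Difference-in-differences sketch with illustrative, made-up rates.
# "treated" = counties with sustained implementation; "control" = never-implementers.

def did_estimate(treated_pre, treated_post, control_pre, control_post):
    """DiD = (change in treated group) - (change in control group)."""
    return (treated_post - treated_pre) - (control_post - control_pre)

# Hypothetical mean youth suicide rates per 100,000 (not the study's data):
effect = did_estimate(treated_pre=10.0, treated_post=8.5,
                      control_pre=10.0, control_post=10.5)
print(effect)  # -2.0: treated counties fell 2 per 100k relative to controls
```

The key identifying assumption is parallel trends: absent the intervention, both groups would have changed by the same amount, which is why the abstract flags contextual factors (e.g., local mental health infrastructure) as possible confounders.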
- New
- Research Article
- 10.3390/healthcare14010136
- Jan 5, 2026
- Healthcare
- Kevin-Justin Schwedler + 3 more
Background/Objectives: Home care plays a crucial role in contemporary healthcare systems, particularly in the long-term care of people with chronic and progressive illnesses. Family caregivers often experience substantial physical, emotional, and organizational burden. Telemedicine and digital health applications have the potential to support home care by improving health monitoring, communication, and care coordination. However, their use among family caregivers remains inconsistent, and little is known about how organizational support structures such as telemedicine centers influence acceptance and everyday use. This study aims to examine the benefits of telemedicine in home care and to evaluate the role of telemedicine centers as supportive infrastructures for family caregivers. Methods: A mixed-methods design was applied. Quantitative data were collected through an online survey of 58 family caregivers to assess the use of telemedicine and digital health applications, perceived benefits, barriers, and support needs. This was complemented by an in-depth qualitative case study exploring everyday caregiving experiences with telemedicine technologies and telemedicine center support. A systematic literature review informed the theoretical framework and the development of the empirical instruments. Results: Most respondents reported not using telemedicine or digital health applications in home care. Among users, telemedicine was associated with perceived improvements in quality of care, particularly through enhanced health monitoring, improved communication with healthcare professionals, and increased feelings of safety and control. Key barriers to adoption included technical complexity, data protection concerns, and limited digital literacy. Both quantitative findings and the qualitative case study highlighted the importance of structured support. 
Telemedicine centers were perceived as highly beneficial, providing technical assistance, training, coordination, and ongoing guidance that facilitated technology acceptance and sustained use. Conclusions: Telemedicine and digital health applications can meaningfully support home care and reduce caregiver burden when they are embedded in supportive socio-technical structures. Telemedicine centers can function as central points of contact that enhance usability, trust, and continuity of care. The findings suggest that successful implementation of telemedicine in home care requires not only technological solutions but also accessible organizational support and targeted training for family caregivers.