Articles published on Minimum Number
32,660 search results
- New
- Research Article
- 10.1145/3788673
- Jan 19, 2026
- ACM Transactions on Computing for Healthcare
- Michele Bernardini + 4 more
Hepatic steatosis, or nonalcoholic fatty liver disease (NAFLD), affects a significant portion of the global population and can lead to more severe liver conditions, including hepatic fibrosis. Early and accurate risk prediction of fibrosis is crucial for timely intervention. Traditional diagnostic methods are invasive and carry risks, while imaging techniques and blood-based biomarkers have limitations in routine general practice. This study presents a machine learning-based clinical decision support system designed to assess the risk of hepatic fibrosis in patients with NAFLD using routine laboratory tests. The framework is developed using electronic health record data collected over 15 years, initially encompassing 1,272,572 patients from general practice. After applying clinical selection criteria, two cohorts of 12,960 and 25,478 patients were used for model development and evaluation. The proposed approach provides a robust foundation for monitoring fibrosis risk by implementing a novel screening method, which preprocesses predictors by leveraging well-established clinical indicators (e.g., hepatic steatosis index, fibrosis-4 index) alongside a selected minimal number of predictors, making it practical and cost-effective for widespread clinical use. The study's findings indicate promising results for screening and monitoring fibrosis risk in NAFLD patients, achieving a best AUC of 92.97%, PRAUC of 75.44%, and sensitivity of 79.63%.
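Both clinical indicators named in this abstract have standard published closed forms. A minimal sketch of each (function names are mine, and these are the commonly cited formulas, not necessarily the exact preprocessing used in the study):

```python
import math

def fib4(age_years, ast_u_l, alt_u_l, platelets_1e9_l):
    """Fibrosis-4 index: (age x AST) / (platelets x sqrt(ALT))."""
    return (age_years * ast_u_l) / (platelets_1e9_l * math.sqrt(alt_u_l))

def hepatic_steatosis_index(alt_u_l, ast_u_l, bmi, female=False, diabetic=False):
    """HSI = 8 x (ALT/AST) + BMI, plus 2 if female and 2 if diabetic."""
    return 8 * (alt_u_l / ast_u_l) + bmi + (2 if female else 0) + (2 if diabetic else 0)

# Example: a 60-year-old with AST 40 U/L, ALT 30 U/L, platelets 200x10^9/L
print(round(fib4(60, 40, 30, 200), 2))  # 2.19, above the common 1.3 low-risk cutoff
```

Both indices are built entirely from routine laboratory values, which is what makes a minimal-predictor screening pipeline of the kind described plausible in general practice.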
- New
- Research Article
- 10.1063/5.0307152
- Jan 13, 2026
- The Journal of chemical physics
- Xinxian Chen + 1 more
We investigate strategies for simulating open quantum systems coupled to dissipative baths by comparing explicit wave function-based discretization [via multi-layer multi-configuration time-dependent Hartree (ML-MCTDH)] and the implicit density matrix-based master equation method [via tree tensor network hierarchical equations of motion (TTN-HEOM)]. For dissipative baths characterized by exponentially decaying bath correlation functions, the implicit discretization approach of HEOM, rooted in bath correlation function decompositions, proves significantly more efficient than explicit discretization of the bath into discrete harmonic modes. Explicit methods, like ML-MCTDH, require extensive mode discretization to approximate continuum baths, leading to computational bottlenecks. Case studies for two-level systems and a Fenna-Matthews-Olson complex model highlight TTN-HEOM's superiority in capturing dissipative relaxation dynamics with a minimal number of auxiliary modes, while the explicit methods are as exact as the HEOM in pure dephasing regimes. This comparison is enabled by the TENSO package, which implements both ML-MCTDH and TTN-HEOM using the same computational structure and propagation strategy.
- New
- Research Article
- 10.1080/00927872.2025.2601749
- Jan 5, 2026
- Communications in Algebra
- P H Lima
In this paper, we study the box decomposition principle with more precision and also provide a characterization of very good ideals. Moreover, we give a generalization for a formula about the minimal number of generators of an equigenerated ideal provided by Herzog, Saem, and Zamani in [3, Theorem 1.9].
- New
- Research Article
- 10.1016/j.visres.2025.108718
- Jan 1, 2026
- Vision research
- Elsa Scialom + 3 more
Object recognition from sparse simulated phosphenes and curved segments.
- Research Article
- 10.1038/s41598-025-28106-2
- Dec 24, 2025
- Scientific Reports
- Adel Bakhshipour + 1 more
Effective weed detection for precise management remains a pertinent issue in modern agriculture. In this study, hyperspectral imaging (HSI) was combined with machine learning (ML) to differentiate between peanut plants and four common weeds found in peanut fields. Several spectral preprocessing methods—Moving Window Averaging (MWA), Median Filtering (MF), Gaussian Filtering (GF), and Savitzky–Golay smoothing (SGS)—were applied. Feature selection algorithms, including Correlation-based Feature Selection (CFS), Principal Components Analysis (PCA), and Wrapper Feature Selection (WFS), were then used to extract the most informative wavelengths. Among the various classifiers evaluated, the combination of MF preprocessing, the WFS algorithm, and an LDA classifier (MF-WFS-LDA) achieved the best performance, with WFS selecting 12 optimal wavelengths out of the 465 available. The accuracy, precision, recall, and RMSE values of this model in the training stage were 99.71%, 0.997, 0.997, and 0.054, respectively; in the test stage, they were 96.67%, 0.967, 0.968, and 0.088. Furthermore, the model successfully differentiated peanuts from each weed species using a minimal number of optimal wavelengths. These findings highlight the potential of integrating HSI with ML for precise weed detection in peanut cultivation. However, further validation under diverse environmental and field conditions is recommended.
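Of the preprocessing methods listed, moving window averaging is the simplest to sketch. A pure-Python illustration (the window size and the "valid"-positions-only edge handling are illustrative choices, not the study's settings):

```python
def moving_window_average(spectrum, window=3):
    """Smooth a 1-D reflectance spectrum by averaging over a sliding window.

    Returns only the 'valid' positions, so the output is shorter
    than the input by window - 1 samples.
    """
    if window < 1 or window > len(spectrum):
        raise ValueError("window must be between 1 and len(spectrum)")
    return [sum(spectrum[i:i + window]) / window
            for i in range(len(spectrum) - window + 1)]

# A noisy 5-band toy spectrum smooths toward its local means
print(moving_window_average([0.10, 0.30, 0.20, 0.40, 0.30], window=3))
```

In a real HSI pipeline the same smoothing would be applied independently to each pixel's spectrum before feature selection.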
- Research Article
- 10.4038/sljfmsl.v16i2.8059
- Dec 22, 2025
- Sri Lanka Journal of Forensic Medicine, Science & Law
- R Franklin + 5 more
Introduction: Forensic dentistry plays a major role in the identification of individuals who cannot be identified visually or by other means. Dental records are crucial in the identification of victims in mass disasters. Many previous studies indicate that people are often unaware of how to maintain dental records. Hence, the present study was intended to explore the forensic relevance of dental records in Tamil Nadu, India. Methods: A cross-sectional study was conducted among 250 randomly selected general dental practitioners from Tamil Nadu. Data were collected through a structured questionnaire administered via a Google form, addressing dental documentation, record keeping, and forensic awareness. Percentages were computed from the responses. Results: Of the 225 respondents, the majority were male, aged 28-45 years, with 3-20 years of work experience. While most maintain dental records and use digital radiographs, only a few maintain records digitally and obtain written consent from patients for their treatment. Prosthetic documentation is limited, with fewer than half recording implant serial numbers or using denture markings. Forensic awareness was low, with only a minimal number having formal forensic training, and many reporting inadequate undergraduate education in the field. Conclusions: Most practising dentists remain under-digitalised in dental record maintenance, highlighting the need for standardised protocols to enhance patient care, data management, and forensic readiness. These improvements can be achieved through focused education and training, empowering dental professionals to contribute more effectively as forensic experts.
- Research Article
- 10.1002/cta.70288
- Dec 22, 2025
- International Journal of Circuit Theory and Applications
- Hussain Alzaher + 2 more
ABSTRACT This paper presents two third‐order active‐RC quadrature sinusoidal oscillator configurations incorporating the operational amplifier (op‐amp) with a minimal number of passive components. The proposed designs employ an inverting integrator followed by a two‐stage RC phase‐shift network, forming a feedback loop that satisfies the Barkhausen criterion for sustained oscillation. The circuits generate two sinusoidal outputs with a 90° phase difference. One of the designs incorporates just a single op‐amp. Theoretical analyses are provided to derive the frequency and condition of oscillations. Experimental results confirm the validity of the designs and demonstrate good agreement with the analytical predictions. Because of their simplicity and low component count, the proposed oscillators are particularly suitable for integrated circuit implementation in communication and signal processing systems.
- Research Article
- 10.24144/2788-6018.2025.06.3.28
- Dec 22, 2025
- Analytical and Comparative Jurisprudence
- V R Golub + 1 more
The article examines modern forensic methods of identifying individuals based on fingerprint and biometric data, which constitute one of the key areas in the development of forensic expertise and law enforcement activities in the context of today’s information society. The author focuses on the historical background of fingerprinting, which has a centuries-old history, its scientific foundation, as well as its practical significance in detecting and solving crimes of varying complexity. Significant attention is given to the analysis of modern automated information systems that enable rapid search and comparison of fingerprints in large-scale national and international databases, allowing for the prompt identification of individuals even with a minimal number of biometric features. The prospects for integrating such systems with international forensic registries are examined separately, contributing to enhanced effectiveness in combating transnational crime, particularly in cases related to terrorism, human trafficking, and organized crime. The article also provides a detailed review of contemporary biometric technologies such as facial recognition, iris scanning, voice recognition, and other unique human parameters, which are increasingly used as auxiliary or alternative identification tools by law enforcement agencies and security systems. The importance of a comprehensive approach is emphasized, which involves combining classical fingerprinting methods with the latest biometric tools, significantly improving the accuracy and reliability of forensic investigations, minimizing errors, and eliminating the possibility of misidentification. The article also addresses current issues concerning the legal regulation of biometric data use, ethical aspects of collecting and processing such information, risks of privacy violations, as well as challenges related to cybersecurity and potential misuse of biometric databases. 
The conclusion is drawn that improving forensic methods for identifying individuals through fingerprints and biometric characteristics is a necessary condition for enhancing the effectiveness of crime prevention and investigation. However, it requires careful balancing between security needs and the respect for fundamental human rights and freedoms, which must be enshrined in legislation and supported in judicial practice.
- Research Article
- 10.46991/pysua.2025.59.3.069
- Dec 19, 2025
- Proceedings of the YSU A: Physical and Mathematical Sciences
- Hamlet V Mikaelyan
A proper edge-coloring of a graph is called a sum edge-coloring if it minimizes the total sum of colors on all the edges of the graph. The aforementioned minimal sum is called the edge-chromatic sum of the graph, and the minimal number of colors needed for a sum edge-coloring is called the edge-strength of the graph. In this paper, upper bounds on the values of the edge-chromatic sums of some complete tripartite graphs are given, while for some other complete tripartite graphs, the exact values of both parameters are obtained.
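Both parameters defined here can be checked by brute force on very small graphs. A sketch (function name mine; exhaustive search is feasible only for a handful of edges), using the triangle K3 as an illustrative input:

```python
from itertools import product

def sum_edge_coloring(edges):
    """Brute-force the edge-chromatic sum and edge-strength of a tiny graph.

    Colors are positive integers 1..len(edges), which always suffice for a
    proper edge-coloring. Among proper colorings achieving the minimum total
    color sum, the edge-strength is the smallest maximum color used.
    """
    m = len(edges)
    # pairs of edge indices that share a vertex and so must differ in color
    adjacent = [(i, j) for i in range(m) for j in range(i + 1, m)
                if set(edges[i]) & set(edges[j])]
    best_sum, strength = None, None
    for coloring in product(range(1, m + 1), repeat=m):
        if any(coloring[i] == coloring[j] for i, j in adjacent):
            continue  # not a proper edge-coloring
        s, k = sum(coloring), max(coloring)
        if best_sum is None or (s, k) < (best_sum, strength):
            best_sum, strength = s, k  # lexicographic: sum first, then colors
    return best_sum, strength

# Triangle K3: all three edges are pairwise adjacent, so colors {1,2,3} are forced
print(sum_edge_coloring([(0, 1), (1, 2), (0, 2)]))  # (6, 3)
```

For K3 the edge-chromatic sum is 1 + 2 + 3 = 6 and the edge-strength is 3; a path on two edges gives (3, 2).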
- Research Article
- 10.61173/cdyesr17
- Dec 19, 2025
- Science and Technology of Engineering, Chemistry and Environmental Protection
- Guanxu Zhu
This paper analyses three key areas that have a significant impact on the design, manufacturing and operation of aircraft: additive manufacturing (AM), artificial intelligence (AI) and biomimicry. The use of AM, also known as 3D printing, allows a shift from subtractive to generative manufacturing, as components are produced directly from a digital model by adding material layer by layer. This method allows the creation of complex geometries, as well as lightweight components that would not be possible or economically viable with traditional techniques. Its use in the aerospace industry has yielded clear benefits in terms of weight reduction, drag reduction and supply chain efficiency. The use of AI, on the other hand, changes the way engineers and operators interact with the complexity of the aviation industry. AI algorithms can speed up the early stages of a design by quickly optimizing parameters such as wing area, thrust and loads, allowing more rapid screening of potential configurations. Additionally, through the use of machine learning models and neural networks, it is possible to iterate thousands of designs, simulate their aerodynamic behaviour in different scenarios and obtain reliable predictions of performance with minimal trial and error. In an operational context, AI can enhance safety by predicting equipment failures and detecting anomalies in real time. Biomimicry takes this one step further by applying nature's design and process principles in new technologies. By understanding the principles behind honeycomb and trabecular bone structures, aircraft have been built with stronger yet lighter components. The advances in AM, AI, and biomimicry collectively show how the next generation of flight systems is possible: systems that help meet the rapidly growing global demand for air transportation while overcoming environmental and economic constraints.
- Research Article
- 10.34216/1998-0817-2025-31-4-165-172
- Dec 19, 2025
- Vestnik of Kostroma State University
- Elena B Volkova + 1 more
The article examines multi-component complex sentences of contaminated structure based on the texts of mathematical works. This is the least common structural type of multi-component complex sentences. Within it, as with sentences involving sequential subordination and co-subordination, the most productive group appears to be structures with a minimal number of components. For constructions of contaminated structure, this number is four. Each such sentence has a main part and subordinate clauses of two degrees. The article examines variants of both homogeneous and heterogeneous co-subordination of subordinate clauses of the first and second degrees. Subordinate clauses of the constructions under study can be of the undifferentiated type (object, attributive, pronominal-correlative, pronominal-conjunction correlative) and/or of the divisible (functional) type (conditional, causal, target, concessive, conjoint, etc.). Numerous examples taken from the texts of works by famous mathematicians clearly illustrate various cases of constructing four-component complex sentences of contaminated structure. Although such constructions are complex in structure, the composition of the sentence components (the predominance of indivisible complex clauses over functional clauses and their relationships) contributes to the integrity and visibility of the sentence, preserving its syntactic perspective. This is extremely important for works of scientific style, since it facilitates both the author's task – to convey the idea to the addressee as accurately as possible – and the reader's task – to perceive what has been read as adequately as possible.
- Research Article
- 10.61173/0m8gkv38
- Dec 19, 2025
- Interdisciplinary Humanities and Communication Studies
- Xinyu Shi
This study primarily examines the effect of differentiated instruction on high school students' math learning self-efficacy. Differentiated instruction refers to designing teaching content based on students' varying knowledge foundations and learning abilities to enhance their learning confidence. Grounded in Bandura's self-efficacy theory, this research analyzes how differentiated instruction strengthens students' math learning self-efficacy by providing successful experiences, vicarious experiences, and verbal persuasion, and by optimizing emotional states. Research findings indicate that students in classes implementing differentiated instruction achieve higher mathematics scores and exhibit greater learning efficacy compared to those in classes without it. The model significantly improves classroom engagement, academic performance, and test scores, enhancing the learning efficacy of the vast majority of students to some degree. However, there is a low probability of adverse effects on a small number of students, whose learning efficacy declines rather than improves. The study concludes with recommendations for dynamic class adjustments and teacher care, offering research directions for optimizing differentiated instruction practices.
- Research Article
- 10.1142/s204768412550037x
- Dec 15, 2025
- International Journal of Computational Materials Science and Engineering
- K Surya + 2 more
The traditional approach to designing function-specific high-entropy alloys (HEAs) involves exploring vast compositional spaces through extensive experimental or computational characterization. This "Edisonian" trial-and-error method is often time-consuming and resource-intensive. To address these challenges, this study presents a novel integration of molecular dynamics (MD) simulations with a machine learning (ML)-driven active learning framework for the nanoscale design of AlCoCrFeNi HEAs with an improved stiffness-to-density ratio (E/ρ). The proposed computational framework operates using a minimal number of samples generated by performing a series of MD simulations. In these simulations, the concentration of individual constituent elements is systematically varied, and the corresponding composition-dependent Young's modulus and density are recorded. These data then drive an ML-based Bayesian optimization framework, which identifies the next most promising composition in the HEA design space: one likely to yield a maximized specific Young's modulus. To guide this search efficiently, the framework employs the Expected Improvement (EI) score as the acquisition function, allowing for the ranking and selection of candidate compositions. The optimal composition predicted by the Bayesian approach is then further explored through MD simulations to understand the underlying deformation mechanisms responsible for the enhanced specific stiffness.
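The Expected Improvement acquisition function named in the abstract has a standard closed form when the surrogate's prediction at a candidate composition is Gaussian. A stdlib-only sketch under the maximization convention (the exploration parameter xi is an illustrative default, not the study's setting):

```python
import math

def expected_improvement(mu, sigma, f_best, xi=0.01):
    """EI for maximization: E[max(f(x) - f_best - xi, 0)] under N(mu, sigma^2).

    Closed form: (mu - f_best - xi) * Phi(z) + sigma * phi(z),
    with z = (mu - f_best - xi) / sigma.
    """
    if sigma <= 0.0:
        return max(mu - f_best - xi, 0.0)  # deterministic prediction
    z = (mu - f_best - xi) / sigma
    cdf = 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))          # Phi(z)
    pdf = math.exp(-0.5 * z * z) / math.sqrt(2.0 * math.pi)   # phi(z)
    return (mu - f_best - xi) * cdf + sigma * pdf

# Rank two candidate compositions: higher predicted mean and higher
# posterior uncertainty both raise the acquisition score
print(expected_improvement(1.2, 0.3, f_best=1.0) >
      expected_improvement(1.0, 0.1, f_best=1.0))  # True
```

In the active-learning loop described, the candidate with the highest EI is the next composition sent back to MD simulation, and the surrogate is refit with the new result.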
- Research Article
- 10.1007/jhep12(2025)091
- Dec 11, 2025
- Journal of High Energy Physics
- Anamaria Hell + 1 more
Abstract. We consider a class of theories containing power-law terms in both the Ricci scalar and a scalar field, including their non-minimal couplings. As a first step, we systematically classify all non-trivial cases with a propagating scalar field that arise from the simplest general power-law formulation, which contains the minimal number of terms. We then analyze each case in detail, focusing on the structure of the degrees of freedom, by both formulating the theories in the Einstein frames and examining the singular points in the Jordan frame. We demonstrate that such theories can give rise to a different, and sometimes unexpected, structure of the modes, which can change at the leading order depending on the background.
- Research Article
- 10.1016/j.neunet.2025.108423
- Dec 5, 2025
- Neural networks : the official journal of the International Neural Network Society
- Yunsong Deng + 2 more
RaLo: Rank-aware low-rank adaptation for pre-trained foundation models.
- Research Article
- 10.1038/s41467-025-67062-3
- Dec 5, 2025
- Nature Communications
- Xiaobin Zhao + 4 more
Quantum state tomography typically requires exponentially many copies of a quantum state, due to the complex correlations present in large systems. We show that, for bosonic systems, the scaling is completely determined by the nature of these correlations. Motivated by the Hong-Ou-Mandel effect and boson sampling, we define Gaussian-entanglable (GE) states, produced by generalized interference between separable bosonic modes. GE states greatly extend the Gaussian family, encompassing separable states, multi-mode Gottesman-Kitaev-Preskill codes, entangled cat states, and boson-sampling outputs—resources for error correction and quantum advantage. We prove that any pure GE state of m modes can be learned efficiently, requiring only poly(m) copies, via a protocol based on Gaussian unitaries, local tomography, and classical post-processing; for boson-sampling states, no Gaussian unitaries are needed. For states outside GE, we define an operational monotone—the minimal number of ancillary modes needed to make them GE—which exactly characterizes the exponential tomography overhead. We also show that deterministic generation of NOON states with N ≥ 3 via two-mode interference is impossible.
- Research Article
- 10.1038/s41598-025-27248-7
- Dec 5, 2025
- Scientific Reports
- Tomasz Rybotycki + 4 more
We demonstrate a determinant dimension witness of a qubit space. Our test has a minimal number of independent parameters, achieved by mapping the Bloch-sphere π/2-rotation axis angle onto the non-planar, so-called Viviani curve. We ran our test on different platforms: IBM Quantum, IQM Resonance, and IonQ. Our investigations show that numerous qubits, especially from the newest IBM Heron family devices, fail the test by more than ten standard deviations. The nature of those deviations has no simple explanation, as the test is robust against common imperfections.
- Research Article
- 10.54254/2753-8818/2026.hz30283
- Dec 4, 2025
- Theoretical and Natural Science
- Jingyuan Liu
With the widespread use of deep learning in fields such as computer vision, its security issues have also garnered attention. Backdoor Poisoning, a representative data poisoning attack method, is characterized by strong concealment and significant harm. This paper selects the CIFAR-10 dataset as the experimental subject and employs two typical convolutional neural networks, ResNet-18 and VGG-16, as benchmark models to systematically test the effectiveness of low-cost data-level poisoning. The experimental results indicate that, even with a poisoning ratio of only 1%, there is almost no impact on the validation set accuracy of the two models (with a decrease of less than 1%), but the attack success rate (ASR) remains high at 95.1% and 90.0%, respectively. This suggests that a minimal number of poisoning samples can achieve efficient and persistent backdoor implantation, indicating that Convolutional Neural Networks (CNN) models are vulnerable to attacks in an open environment. The research findings confirm the threat of low-cost backdoor attacks and provide a reference for designing targeted defense mechanisms in the future.
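The data-level attack described here reduces to two operations: stamp a fixed trigger pattern onto a small fraction of training images and relabel them to the attacker's target class. A minimal sketch (the 3x3 corner trigger, target class, and seed are illustrative, not the paper's exact configuration):

```python
import random

def poison_dataset(images, labels, ratio=0.01, target_class=0, seed=42):
    """Backdoor-poison a fraction of (image, label) pairs in place.

    images: list of HxW grayscale images as nested lists of 0-255 ints.
    A 3x3 white square in the bottom-right corner serves as the trigger.
    Returns the indices of the poisoned samples.
    """
    rng = random.Random(seed)
    n_poison = max(1, int(len(images) * ratio))
    idxs = rng.sample(range(len(images)), n_poison)
    for i in idxs:
        img = images[i]
        for r in range(-3, 0):          # stamp the trigger patch
            for c in range(-3, 0):
                img[r][c] = 255
        labels[i] = target_class        # relabel to the target class
    return idxs

# 200 all-zero 8x8 images: a 1% ratio poisons exactly 2 of them
imgs = [[[0] * 8 for _ in range(8)] for _ in range(200)]
lbls = [1] * 200
poisoned = poison_dataset(imgs, lbls, ratio=0.01, target_class=0)
print(len(poisoned), all(lbls[i] == 0 for i in poisoned))  # 2 True
```

At test time the attacker stamps the same trigger onto clean inputs; the attack success rate reported in the paper is the fraction of such triggered inputs that the trained model classifies as the target class.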
- Research Article
- 10.4103/jpbs.jpbs_1184_25
- Dec 1, 2025
- Journal of Pharmacy & Bioallied Sciences
- Shaili Rashid + 4 more
Background: Abnormal uterine bleeding (AUB) is among the most common gynecological disorders and largely affects the health of women, especially those living in low-resource countries. Efficient and effective assessment of AUB is essential to ensure that appropriate and timely treatment is provided. Materials and Methods: This was a prospective study undertaken in a low-resource healthcare facility over a span of six months. Two hundred women presenting with AUB were selected. The evaluation procedure comprised a full medical history, physical examination, complete blood count (CBC), pelvic ultrasound, endometrial biopsy, and serum thyroid-stimulating hormone (TSH). The effectiveness and cost efficiency of these investigations were reviewed in order to establish the minimum combination that can be used. Results: The basic set of investigations (medical history, physical examination, CBC, and pelvic ultrasound) was sufficient to assess AUB in 150 (75%) of the 200 participants. Endometrial biopsy was required to diagnose endometrial hyperplasia or malignancy in 40 (20%) cases. Serum TSH levels were critical in identifying thyroid dysfunction in 10 (5%) cases. Cost analysis demonstrated that omitting endometrial biopsy and TSH testing from a first-line assessment protocol lowered expenses by 30% without impairing diagnostic accuracy in most patients. Conclusion: The triage of AUB in low-resource environments can be successfully addressed with a minimal number of investigations: medical history, physical examination, CBC, and pelvic ultrasound. Endometrial biopsy and TSH testing should be reserved for specific clinical indications.
- Research Article
- 10.31612/2616-4868.7.2025.15
- Nov 30, 2025
- Clinical and Preventive Medicine
- Ruslan H Tserkovniuk + 6 more
Aim. To evaluate the immediate (in-hospital) outcomes of laparoscopic robot-assisted prostatectomy in the surgical treatment of large benign prostatic hyperplasia (greater than 80 cm³). Materials and methods. The study presents the immediate (in-hospital) results of simultaneous laparoscopic robot-assisted transperitoneal transvesical prostatectomy in 55 patients with benign prostatic hyperplasia who underwent surgery at the “Innomed – Center for Endosurgery” Medical Center between 2019 and 2024 using the da Vinci S and da Vinci Si surgical systems. Results. To exclude prostate cancer, serum prostate-specific antigen (PSA) testing, MRI, and/or prostate biopsy were performed. Intraoperative blood loss was assessed using the gravimetric method. Post-prostatectomy surgical complications were classified according to the Clavien–Dindo system, and urination was assessed pre- and postoperatively using uroflowmetry. The mean age of the patients was 66.7±4.3 years, mean body mass index – 25.6±3.5 kg/m², mean American Society of Anesthesiologists (ASA) score – 1.8±0.2, mean prostate volume – 124.8±25.8 cm³, mean operative time – 219.3±28.7 minutes, mean intraoperative blood loss – 125.7±33.4 ml, and mean postoperative hospital stay – 5.5±1.1 days. Postoperative bleeding occurred in one patient (1.8%) and was managed with electrocautery of the prostate bed vessels. No patients required blood transfusion due to bleeding or blood loss. Complications after laparoscopic robot-assisted prostatectomy occurred in one patient (1.8%) – hyperthermia after urethral catheter removal, which was resolved with antibacterial and anti-inflammatory therapy. Complications according to the Clavien–Dindo classification occurred in 2 patients (3.6%) and were consistent with published data. The mean maximum urinary flow rate (Q max, ml/s) before surgery was 7.9±2.4 ml/s, and after surgery – 25.4±2.9 ml/s (p<0.05). Conclusions.
Simultaneous laparoscopic robot-assisted transperitoneal transvesical prostatectomy is characterized by a minimal number of postoperative complications and enables effective restoration of urination in patients with large benign prostatic hyperplasia (greater than 80 cm³).