Articles published on Partition tree
1122 Search results
- Research Article
- 10.1371/journal.pone.0341803
- Feb 6, 2026
- PloS one
- Jiamin Sun + 3 more
In Versatile Video Coding (VVC), the partition patterns for coding units (CUs) have a significant impact on encoding efficiency. Determining the optimal CU partition is particularly time-consuming due to the calculation and comparison of rate-distortion costs for all possible partition patterns, especially during ternary tree (TT) partitioning in intra coding. In this paper, a fast decision mechanism is proposed for TT partitioning based on image feature analysis to skip the complex rate-distortion calculation. Firstly, the correlation between image structural features and TT partition patterns is investigated through experimental analysis, and the most relevant features are selected for the subsequent prediction of optimal TT partition patterns. Secondly, we devise an efficient scheme for representing and extracting the selected features, further optimizing the extraction process to minimize computational complexity. Comprehensive datasets for partition pattern prediction are constructed from these refined features. Finally, these datasets serve as the foundation for training and optimizing a predictive model, which is designed to achieve an optimal trade-off between prediction accuracy and model complexity. The predictive model is seamlessly incorporated into the VVC Test Model (VTM), facilitating efficient feature extraction prior to the Rate-Distortion Optimization (RDO) process for intra prediction and optimal partition pattern selection. By leveraging the prediction results, the model effectively determines whether TT partitioning can be bypassed, thereby streamlining the decision-making process and enhancing overall coding efficiency. Experimental results demonstrate that in comprehensive performance evaluations of time-saving metrics and Bjøntegaard Delta Bit Rate (BDBR), the proposed mechanism significantly outperforms existing lightweight neural network algorithms.
Our decision mechanism effectively preserves coding quality while substantially accelerating the video coding process.
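The skip test at the heart of such mechanisms can be sketched with a single hand-picked texture feature; the variance threshold below is an illustrative placeholder, not the paper's trained predictive model:

```python
import numpy as np

def tt_skip_decision(block: np.ndarray, var_threshold: float = 25.0) -> bool:
    """Return True if ternary-tree (TT) partitioning can likely be skipped.

    Illustrative rule: a block with low luma variance is usually coded well
    without further splitting, so the costly TT rate-distortion search is
    bypassed.  The threshold is a placeholder, not a trained value.
    """
    return float(np.var(block)) < var_threshold

flat = np.full((32, 32), 128.0)             # homogeneous block: skip TT search
textured = np.tile([0.0, 255.0], (32, 16))  # high-contrast block: run full RDO
```

A real mechanism would replace the variance test with the learned model described in the abstract, applied before the RDO loop.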
- Research Article
- 10.3389/fendo.2025.1737419
- Jan 13, 2026
- Frontiers in Endocrinology
- Yao Jiang + 8 more
Objective: Ischemic stroke (IS) with hyperuricemia (HUA) correlates with poor outcomes, yet the shared pathophysiological traits remain unclear. This study examined metabolic parameters in HUA-IS comorbidity and developed an optimal interpretable Clinlabomics model for risk assessment. Methods: A total of 2,164 IS patients and 2,459 healthy controls (HCs) were retrospectively enrolled. Participants were divided into four groups: HUA-IS (comorbidity, n=1,082), non-HUA IS (n=1,082), HUA HCs (n=1,314), and non-HUA HCs (n=1,145); the latter three were defined as the non-comorbidity group. After 1:1 propensity score matching (PSM), 1,031 cases were matched in each group. Ten metabolic parameters were analyzed: serum uric acid at admission (SUA_admission), SUA on the third day of hospitalization (SUA_3d), triglyceride-glucose index (TyG), triglyceride (TG), high-density lipoprotein cholesterol (HDL−C), atherogenic index of plasma (AIP), atherogenic coefficient (AC), lipoprotein combine index (LCI), Castelli’s risk index I (CRI-I), and Castelli’s risk index II (CRI-II). Univariate/multivariate logistic regression, quartile-based logistic regression, and restricted cubic spline (RCS) analysis were used to explore parameter–comorbidity associations. Post-PSM data were split 7:3 into training/testing sets, least absolute shrinkage and selection operator (LASSO) regression selected features, and 11 machine learning algorithms were used to develop Clinlabomics models. Additionally, the optimal model was validated in the testing set and an independent validation set. Results: After PSM, multivariate logistic regression identified AIP as the strongest risk factor (OR = 2.74, 95%CI: 1.80-4.19). The Q4 of TyG, TG, AIP, and LCI elevated comorbidity risk (P < 0.05). Besides, RCS showed a nonlinear association of LCI with comorbidity (P < 0.05).
The Recursive Partitioning and Regression Trees (rpart)-based Clinlabomics model exhibited favorable performance, with an F1-score, accuracy (ACC), and area under the curve (AUC) of 0.960, 0.960, and 0.986, respectively. At the optimal hyperparameter (cp=0.0017), the model achieved AUCs of 0.987 (95%CI: 0.982-0.993), 0.955 (95%CI: 0.939-0.972), and 0.957 (95%CI: 0.915-0.999) in the training, testing, and validation datasets, respectively, correctly identifying 87.7% of non-comorbidity and 98.0% of comorbidity patients in validation. SHapley Additive exPlanations (SHAP) analysis identified SUA_admission, SUA_3d, TyG, TG, AIP, and LCI as key metabolic indicators. Conclusion: TyG, TG, AIP, and LCI were critical metabolic parameters for HUA-IS comorbidity, which warrant heightened attention in future comorbidity research.
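The rpart model above grows by recursive partitioning: at each node, it chooses the split that most reduces class impurity. One such split step can be sketched as a toy Gini-based stump (illustrative only, not the clinical model or its cp pruning):

```python
import numpy as np

def gini(y):
    """Gini impurity of a label vector: 1 - sum of squared class frequencies."""
    _, counts = np.unique(y, return_counts=True)
    p = counts / counts.sum()
    return 1.0 - np.sum(p ** 2)

def best_split(x, y):
    """One recursive-partitioning step: find the threshold on a single
    feature minimising the weighted Gini impurity of the two children."""
    order = np.argsort(x)
    x, y = x[order], y[order]
    best_t, best_score = None, gini(y)
    for i in range(1, len(x)):
        if x[i] == x[i - 1]:
            continue
        t = (x[i] + x[i - 1]) / 2
        left, right = y[:i], y[i:]
        score = (len(left) * gini(left) + len(right) * gini(right)) / len(y)
        if score < best_score:
            best_t, best_score = t, score
    return best_t, best_score

# Toy, perfectly separable data: an AIP-like value vs. a comorbidity label
x = np.array([0.1, 0.2, 0.3, 0.8, 0.9, 1.0])
y = np.array([0, 0, 0, 1, 1, 1])
```

rpart applies this step recursively and then prunes subtrees whose improvement falls below the complexity parameter cp.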
- Research Article
- 10.1145/3787461
- Jan 5, 2026
- ACM Transactions on Quantum Computing
- Vladimirs Andrejevs + 2 more
In this work we study quantum algorithms for Hopcroft’s problem, a fundamental problem in computational geometry. Given n points and n lines in the plane, the task is to determine whether there is a point-line incidence. The classical complexity of this problem is well studied, with the best known algorithm running in \(O(n^{4/3})\) time, with matching lower bounds in some restricted settings. Our results are two different quantum algorithms with time complexity \(\widetilde{O}(n^{5/6})\). The first algorithm is based on partition trees and the quantum backtracking algorithm. The second algorithm uses a quantum walk together with a history-independent dynamic data structure for storing a line arrangement which supports efficient point location queries. In the setting where the numbers of points and lines differ, the quantum walk-based algorithm is asymptotically faster. The quantum speedups for the aforementioned data structures may be useful for other geometric problems. Finally, we examine the connections between Hopcroft’s problem and other computational problems via fine-grained complexity. For example, we show a conditional \(\Omega(n^{3/4})\) time lower bound on Hopcroft’s problem in 5 dimensions based on the quantum analogue of a classical hardness conjecture, which is stronger than the (optimal) \(\Theta(n^{2/3})\) query complexity bounds.
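The problem statement itself fits in a few lines; the naive check below is the quadratic baseline that the partition-tree and quantum-walk algorithms improve on (an exact-arithmetic sketch, not the paper's algorithm):

```python
from fractions import Fraction

def has_incidence(points, lines):
    """Hopcroft's problem, stated directly: given points (x, y) and
    non-vertical lines (a, b) representing y = a*x + b, decide whether any
    point lies on any line.  This check is O(n*m); the paper's quantum
    algorithms run in roughly O(n^{5/6}) time."""
    return any(Fraction(y) == Fraction(a) * Fraction(x) + Fraction(b)
               for (x, y) in points for (a, b) in lines)

pts = [(0, 1), (2, 5)]
lns = [(2, 1), (1, 0)]   # y = 2x + 1 passes through both points
```

Exact rationals avoid the floating-point equality pitfalls that an incidence test would otherwise have.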
- Research Article
- 10.1109/tpami.2026.3661424
- Jan 1, 2026
- IEEE transactions on pattern analysis and machine intelligence
- Li Sun + 6 more
Graph clustering is a longstanding topic in machine learning. In recent years, deep learning methods have achieved encouraging results, but they still require a predefined cluster number $K$, and typically struggle with imbalanced graphs, especially in identifying minority clusters. These limitations motivate us to study a challenging yet practical problem: deep graph clustering without $K$, considering the imbalance found in reality. We approach this problem from a fresh perspective of information theory (i.e., structural information). In the literature, structural information has rarely been touched in deep clustering, and the classic definition falls short in its discrete formulation, neglecting node attributes and exhibiting prohibitive complexity. In this paper, we first establish a differentiable structural information, generalizing the discrete formalism to the continuous realm, so that we can design a hyperbolic deep model (LSEnet) to learn the neural partitioning tree in the Lorentz model of hyperbolic space. Theoretically, we demonstrate its capability in clustering without requiring $K$ and in identifying minority clusters in imbalanced graphs. Second, we refine hyperbolic representations of the partitioning tree, enhancing graph semantics, for better clustering. Contrastive learning for tree structures is non-trivial and incurs quadratic complexity. Instead, we further advance our theory by discovering an interesting fact: structural entropy indeed bounds the tree contrastive loss. Finally, with an efficient reformulation, we approach graph clustering through a novel augmented structural information learning (ASIL), which offers a simple yet effective objective of augmented structural entropy to seamlessly integrate hyperbolic partitioning tree construction and contrastive learning. With a provable improvement in graph conductance, ASIL achieves effective debiased graph clustering in linear complexity with respect to the graph size. 
Extensive experiments show that ASIL outperforms 20 strong baselines by an average of $+12.42\%$ in NMI on the Citeseer dataset.
- Research Article
- 10.1016/j.cgh.2025.02.033
- Jan 1, 2026
- Clinical gastroenterology and hepatology : the official clinical practice journal of the American Gastroenterological Association
- Giulia Risca + 7 more
Transferrin Saturation and Serum Ferritin Are Main Predictors of Liver Iron Content in Subjects With Hyperferritinemia.
- Research Article
- 10.3390/e27121218
- Nov 29, 2025
- Entropy (Basel, Switzerland)
- Rafał Brociek + 4 more
This paper introduces a novel approach to handwritten digit recognition based on directional flood simulation and topological feature extraction. While traditional pixel-based methods often struggle with noise, partial occlusion, and limited data, our method leverages the structural integrity of digits by simulating water flow from image boundaries using a modified breadth-first search (BFS) algorithm. The resulting flooded regions capture stroke directionality, spatial segmentation, and closed-area characteristics, forming a compact and interpretable feature vector. Additional parameters such as inner cavities, perimeter estimation, and normalized stroke density enhance classification robustness. For efficient prediction, we employ the Annoy approximate nearest neighbors algorithm, which uses ensemble-based tree partitioning. The proposed method achieves high accuracy on the MNIST (95.9%) and USPS (93.0%) datasets, demonstrating resilience to rotation, noise, and limited training data. This topology-driven strategy enables accurate digit classification with reduced dimensionality and improved generalization.
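The core topological feature, flooding the background from the border and counting the unflooded inner cavities, can be sketched with a plain BFS (an illustrative reimplementation, not the paper's directional variant):

```python
from collections import deque

def count_cavities(grid):
    """Flood the background (0 cells) from the image border with BFS; any
    background cell left unflooded lies inside a closed stroke and belongs
    to an inner cavity — one topological feature that separates e.g. a
    '0' (one cavity) from a '1' (no cavity)."""
    h, w = len(grid), len(grid[0])
    seen = [[False] * w for _ in range(h)]
    q = deque((r, c) for r in range(h) for c in range(w)
              if (r in (0, h - 1) or c in (0, w - 1)) and grid[r][c] == 0)
    for r, c in q:
        seen[r][c] = True
    while q:                                   # flood from the border
        r, c = q.popleft()
        for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            nr, nc = r + dr, c + dc
            if 0 <= nr < h and 0 <= nc < w and not seen[nr][nc] and grid[nr][nc] == 0:
                seen[nr][nc] = True
                q.append((nr, nc))
    cavities = 0                               # count unflooded background regions
    for r in range(h):
        for c in range(w):
            if grid[r][c] == 0 and not seen[r][c]:
                cavities += 1
                stack = [(r, c)]
                seen[r][c] = True
                while stack:
                    rr, cc = stack.pop()
                    for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                        nr, nc = rr + dr, cc + dc
                        if 0 <= nr < h and 0 <= nc < w and not seen[nr][nc] and grid[nr][nc] == 0:
                            seen[nr][nc] = True
                            stack.append((nr, nc))
    return cavities

ZERO = [[0, 1, 1, 1, 0],
        [0, 1, 0, 1, 0],
        [0, 1, 0, 1, 0],
        [0, 1, 1, 1, 0]]
ONE = [[0, 0, 1, 0, 0],
       [0, 0, 1, 0, 0],
       [0, 0, 1, 0, 0],
       [0, 0, 1, 0, 0]]
```

The paper additionally floods from each side separately to capture stroke directionality; the cavity count above is just the closed-area component of the feature vector.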
- Research Article
- 10.1007/s10851-025-01264-8
- Oct 28, 2025
- Journal of Mathematical Imaging and Vision
- Nicolas Passat + 4 more
Multivalued Component Tree: New Results and a Bridge Between Partial and Total Partition Trees
- Research Article
- 10.48084/etasr.11891
- Oct 6, 2025
- Engineering, Technology & Applied Science Research
- S Harshitha + 2 more
Wireless Capsule Endoscopy (WCE) has revolutionized Gastrointestinal (GI) diagnostics by allowing non-invasive internal visualization. However, it generates a massive amount of image data, leading to considerable memory and power requirements for transmission and storage in battery-constrained applications. In WCE systems, conventional image compression methods like Discrete Cosine Transform (DCT), Discrete Wavelet Transform (DWT), and Set Partitioning in Hierarchical Trees (SPIHT) are widely applied. Such algorithms, although efficient under typical conditions, have limitations such as increased computational complexity, poor power efficiency, and reduced image quality at elevated compression levels. To overcome these limitations, the present study proposes a new technique known as the CNN-based Feature Learning Compression Algorithm (CFLCA). This technique deploys Convolutional Neural Networks (CNNs) to obtain optimal spatial features for more efficient image compression in terms of energy consumption and memory usage. The model is trained to maintain a trade-off between image quality and compression ratio using Peak Signal-to-Noise Ratio (PSNR) as a metric. The experimental results demonstrate that the suggested CFLCA achieves a 0.28% improvement in compression ratio, a 0.15% increase in PSNR, and a 0.22% reduction in power consumption compared to traditional methods. These improvements show the promise of CFLCA in facilitating real-time and efficient image compression in energy-limited wireless medical imaging applications.
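The PSNR metric that the model is trained against is simple to state: PSNR = 10·log10(peak²/MSE). A minimal sketch of the generic formula (not the CFLCA pipeline):

```python
import numpy as np

def psnr(original: np.ndarray, reconstructed: np.ndarray, peak: float = 255.0) -> float:
    """Peak Signal-to-Noise Ratio in dB for 8-bit images (peak = 255).
    Higher is better; identical images give infinity."""
    mse = np.mean((original.astype(np.float64) - reconstructed.astype(np.float64)) ** 2)
    if mse == 0:
        return float("inf")
    return 10.0 * np.log10(peak ** 2 / mse)

img = np.zeros((8, 8))
noisy = img + 25.5   # constant error, so MSE = 25.5^2 and PSNR = 20 dB
```

Compression studies report PSNR against the compression ratio to show the quality/size trade-off.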
- Research Article
- 10.3390/microorganisms13092105
- Sep 9, 2025
- Microorganisms
- Giovanna Cocomazzi + 15 more
Recent studies suggest a role for the gut microbiota in the onset, progression, and prognosis of prostate cancer (PCa), one of the most common neoplasms in males. PCa screening relies on PSA testing, whose usefulness remains controversial due to its low specificity. This study was aimed at investigating the differences in the gut microbiota of PCa patients and healthy controls (HCs) and finding correlations between gut microbes and the clinical laboratory parameter assessed in the evaluation of PCa, to identify bacteria which could be used as diagnostic and prognostic biomarkers. Fecal samples collected from 18 PCa patients and 18 HCs were used to isolate bacterial DNA. 16S rRNA gene sequencing provided the gut microbial profiles of the enrolled subjects, whose functional impact was also predicted. A recursive partitioning tree method allowed us to identify a bacterial signature discriminating PCa from HC. A correlation analysis was performed between gut bacteria and the clinical laboratory parameters assessed in the evaluation of PCa. Differential bacterial patterns emerged between PCa patients and HCs, together with significant differences in beta-diversity, alpha-diversity, and richness. The functional prediction of the microbial profiles revealed several metabolic processes differentially regulated, including an enrichment in the Krebs cycle and in steroid hormone synthesis in PCa patients. A bacterial signature based on the abundance of Lactobacillus and Collinsella was found to discriminate between the two groups. Significant correlations were found between gut bacteria and the clinical laboratory parameters generally assessed in the evaluation of PCa. These results indicate that gut microbiota profiles may, in the future, represent potential biomarkers associated with prostate cancer risk or progression; however, further prospective studies and clinical validation are needed before considering their use as diagnostic or prognostic tools.
- Research Article
- 10.13052/jmm1550-4646.213425
- Aug 13, 2025
- Journal of Mobile Multimedia
- Usha Tiwari + 2 more
In recent years, Wireless Media Sensor Network (WMSN) deployments have rapidly increased for real-time systems in various areas. Power consumption is always a critical issue which affects the overall lifetime of a wireless sensor network. A WSN mainly consists of various types of sensor nodes, which are capable of sensing, computing, and communicating to sink nodes wirelessly. The communication process is the main source of power consumption in the node, so a data compression technique is required that reduces the data transmitted over the wireless channels. For reducing the size of multimedia data received from media sensors, set partitioning in hierarchical trees (SPIHT) is always a favourable choice. But due to its huge memory requirement and complex coding, this method poses problems for resource-constrained systems like a wireless media sensor network. In this paper, a novel method is introduced which is a hybrid of embedded zerotree wavelet (EZW) and set partitioning in hierarchical trees (SPIHT) coding. The advantages of this method are reduced memory consumption and processing time of the compression algorithm. The method is compared with DCT- and DWT-based image compression techniques for WMSN. The superiority of the algorithm over competing algorithms has been demonstrated with the help of parameters like Peak Signal-to-Noise Ratio (PSNR), Mean Square Error (MSE), packet delivery rate, throughput, compression ratio, and energy consumption.
- Research Article
- 10.1016/j.engappai.2025.110993
- Aug 1, 2025
- Engineering Applications of Artificial Intelligence
- Uzma Nawaz + 2 more
A novel framework for efficient dominance-based rough set approximations using K-dimensional (K-D) tree partitioning and adaptive recalculations techniques
- Research Article
- 10.1051/0004-6361/202553927
- Aug 1, 2025
- Astronomy & Astrophysics
- P.B Lilje + 99 more
The two-point correlation function of the galaxy spatial distribution is a major cosmological observable that enables constraints on the dynamics and geometry of the Universe. The Euclid mission is aimed at performing an extensive spectroscopic survey of approximately 20–30 million Hα-emitting galaxies up to a redshift of about 2. This ambitious project seeks to elucidate the nature of dark energy by mapping the three-dimensional clustering of galaxies over a significant portion of the sky. This paper presents the methodology and software developed for estimating the three-dimensional two-point correlation function within the Euclid Science Ground Segment. The software is designed to overcome the significant challenges posed by the large and complex Euclid dataset, which involves millions of galaxies. The key challenges include efficient pair counting, managing computational resources, and ensuring the accuracy of the correlation function estimation. The software leverages advanced algorithms, including k-d tree, octree, and linked-list data partitioning strategies, to optimise the pair-counting process. These methods are crucial for handling the massive volume of data efficiently. The implementation also includes parallel processing capabilities using shared-memory open multi-processing to further enhance performance and reduce computation times. Extensive validation and performance testing of the software are presented. These were performed using various mock galaxy catalogues to ensure that the software meets the stringent accuracy requirement of the Euclid mission. The results indicate that the software is robust and can reliably estimate the two-point correlation function, which is essential for deriving cosmological parameters with high precision. 
Furthermore, the paper discusses the expected performance of the software during different stages of Euclid Wide Survey observations and forecasts how the precision of the correlation function measurements will improve over the mission’s timeline, highlighting the software’s capability to handle large datasets efficiently.
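The kernel that the k-d tree, octree, and linked-list strategies all accelerate is pair counting: histogramming the separations of galaxy pairs, e.g. the DD term of estimators such as the Peebles–Hauser DD/RR − 1. A brute-force sketch of that kernel (generic, not the Euclid Science Ground Segment code):

```python
import numpy as np

def pair_counts(points: np.ndarray, bin_edges: np.ndarray) -> np.ndarray:
    """Histogram of unique pair separations — the DD term of a two-point
    correlation estimator.  Brute force O(n^2); spatial partitioning
    (k-d tree, octree, linked list) only accelerates this same count."""
    n = len(points)
    i, j = np.triu_indices(n, k=1)           # all unique pairs
    d = np.linalg.norm(points[i] - points[j], axis=1)
    counts, _ = np.histogram(d, bins=bin_edges)
    return counts

pts = np.array([[0.0, 0.0, 0.0],
                [1.0, 0.0, 0.0],
                [0.0, 1.0, 0.0],
                [0.0, 0.0, 3.0]])
edges = np.array([0.0, 2.0, 4.0])            # two separation bins
```

For survey-scale catalogues the O(n²) index arrays are infeasible, which is exactly why the paper's partitioned counters and shared-memory parallelism matter.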
- Research Article
- 10.1145/3748509
- Jul 14, 2025
- ACM Transactions on Multimedia Computing, Communications, and Applications
- Xueyan Cao + 6 more
Screen Content Coding (SCC) is an indispensable tool for enabling distributed collaboration, such as video conferencing. Encoders in the latest video coding standards, particularly for SCC scenarios, employ a wider variety of partitioning tree splitting types, recursively traversing all branches, as well as a larger number of coding modes and submodes, to achieve higher coding efficiency compared to encoders in previous standards. This process leads to very high coding complexity, as each tree leaf node, called a coding unit (CU), for every partitioning size and location in the picture is repeatedly visited and evaluated multiple times during the optimal partitioning search. Additionally, each CU visit involves evaluating a vast number of coding options and their combinations to identify the best one. The complexity is further exacerbated in SCC due to the addition of many new CU coding modes and options. To significantly reduce SCC complexity without coding efficiency loss, this paper proposes a new technique, Accelerated Revisit CU-coding (ARC), along with an SCC search space analysis for in-depth operation-level and run/platform-independent assessment of SCC complexity. ARC exploits the correlation between the first visit and subsequent revisits of a CU with the same location and size. By fully leveraging the correlation and information from the first visit, ARC significantly accelerates revisit CU-coding while maintaining the same high coding efficiency. ARC is implemented in HPM, the AVS3 reference software. Experiments demonstrate that ARC reduces encoding runtime by 29.74%, 47.78%, and 54.25% for 1920x1080 FHD, 4K UHD, and 8K UHD test sequences, respectively, in All Intra configuration, without coding efficiency loss. These runtime reductions align with corresponding search space reductions of 30.91%, 49.67%, and 54.41%, as obtained from the search space analysis.
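The core observation, that a CU with the same location and size is revisited along different branches of the partitioning tree, can be caricatured with memoization. This is a deliberate simplification: ARC reuses first-visit information to accelerate (not fully skip) revisits, and `evaluate_cu` below is a hypothetical stand-in for an expensive rate-distortion search:

```python
from functools import lru_cache

evaluations = []   # records which (x, y, w, h) CUs were actually evaluated

@lru_cache(maxsize=None)
def evaluate_cu(x: int, y: int, w: int, h: int) -> float:
    """Stand-in for an expensive mode search on the CU at (x, y) with
    size w x h.  The cache models reuse across partitioning-tree branches
    that revisit the same CU geometry."""
    evaluations.append((x, y, w, h))
    return float(w * h)          # placeholder "cost"

evaluate_cu(0, 0, 32, 32)        # first visit: full evaluation
evaluate_cu(0, 0, 32, 32)        # revisit: served from the cache
```

In a real encoder the cached state would be the first visit's mode decisions and partial costs rather than a single number.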
- Research Article
- 10.1177/18724981251340355
- Jun 22, 2025
- Intelligent Decision Technologies
- Francesca Meimeti + 9 more
The effective management of Emergency Department (ED) overcrowding is essential for improving patient outcomes and optimizing healthcare resource allocation. This study validates hospital admission prediction models initially developed using a small local dataset from a Greek hospital by leveraging the comprehensive MIMIC-IV dataset. After preprocessing the MIMIC-IV data, five algorithms—Linear Discriminant Analysis (LDA), K-Nearest Neighbors (KNN), Random Forest (RF), Recursive Partitioning and Regression Trees (RPART), and Support Vector Machines (svmRadial)—were evaluated. Among these, RF demonstrated superior performance, achieving an Area Under the Receiver Operating Characteristic Curve (AUC-ROC) of 0.9999, sensitivity of 0.9997, and specificity of 0.9999 when applied to the MIMIC-IV data. These findings underscore the robustness of RF in handling complex datasets for admission prediction, establishing MIMIC-IV as a valuable benchmark for validating models based on smaller local datasets and providing actionable insights for steering ED management strategies in the right direction.
- Research Article
- 10.3390/rs17132127
- Jun 21, 2025
- Remote Sensing
- Ileana De Los Ángeles Fallas Calderón + 3 more
Northwestern Ontario has a shorter growing season but fertile soil, affordable land, opportunities for agricultural diversification, and a demand for canola production. Canola yield mainly varies with spatial heterogeneity of soil properties, crop parameters, and meteorological conditions; thus, existing yield estimation models must be revised before being adopted in Northwestern Ontario to ensure accuracy. Region-specific canola cultivation guidelines are essential. This study utilized high spatial-resolution images to estimate flower coverage and yield in experimental plots at the Lakehead University Agricultural Research Station, Thunder Bay, Canada. Spectral profiles were created for canola flowers and pods. During the peak flowering period, the reflectance of green and red bands was almost identical, allowing for the successful classification of yellow flower coverage using a recursive partitioning and regression tree algorithm. A notable decrease in reflectance in the RedEdge and NIR bands was observed during the transition from pod maturation to senescence, reflecting physiological changes. Canola yield was estimated using selected vegetation indices derived from images, the percent cover of flowers, and the M5P Model Tree algorithm. Field samples were used to calibrate and validate prediction models. The model’s prediction accuracy was high, with a correlation coefficient (r) of 0.78 and a mean squared error of 7.2 kg/ha compared to field samples. In conclusion, this study provided an important insight into canola growth using remote sensing. In the future, when modelling, it is recommended to consider other variables (soil nutrients and climate) that might affect crop development.
- Research Article
- 10.54097/wh6hg844
- Jun 11, 2025
- Frontiers in Business, Economics and Management
- Ruijie Huang
With the significant increase in financial fraud incidents, financial fraud detection has become a critical research area. Complex financial relationship networks involving thousands or even millions of nodes present enormous challenges for fraud detection tasks. Although researchers have developed various graph-based methods to detect fraudulent behavior within these complex networks, existing approaches overlook two key issues in fraud graphs: the diversity of non-additive attributes and the distinguishability of grouped message passing from neighboring nodes. This paper proposes FinGuard-GNN (Financial Guardian Graph Neural Network), a novel dynamic graph neural network for financial fraud detection that addresses the aforementioned issues through innovative feature transformation strategies and a Cascaded Risk Diffusion (CRD) mechanism. For feature transformation, we implement Adaptive Tree Partitioning (ATP) encoding and Statistical Evidence Weighting (SEW) encoding to convert various types of non-additive node attributes into vector representations suitable for GNN aggregation operations, avoiding the generation of meaningless features while maintaining strong interpretability. For risk propagation, we design a feedback-based Cascaded Risk Diffusion strategy that enables dynamic accumulation and decay of risk information across the network. Additionally, we develop a Responsive Group Allocation (RGA) strategy that divides graph nodes into distinct groups followed by hierarchical aggregation, enhancing the distinguishability of fraudulent nodes. Experiments on two classic financial fraud datasets demonstrate that our proposed method achieves superior discriminative capability for fraudulent nodes compared to traditional graph algorithms and machine learning methods. 
The experimental results confirm the advantages of FinGuard-GNN in handling non-additive features in complex financial networks, improving node distinguishability, and capturing hierarchical risk propagation.
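The motivation for encoding non-additive attributes is that summing or averaging them during GNN aggregation is meaningless. A generic way to make them aggregation-safe is quantile binning with one-hot indicators; this is a sketch of that general idea, not the paper's ATP or SEW encodings:

```python
import numpy as np

def quantile_one_hot(values: np.ndarray, n_bins: int = 4) -> np.ndarray:
    """Encode a non-additive numeric attribute (e.g. an account age whose
    sums are meaningless) as a one-hot vector over quantile bins, so that
    GNN mean/sum aggregation operates on bin indicators rather than raw
    magnitudes."""
    # interior quantile cut points, robust to skewed distributions
    edges = np.quantile(values, np.linspace(0, 1, n_bins + 1)[1:-1])
    idx = np.searchsorted(edges, values, side="right")
    out = np.zeros((len(values), n_bins))
    out[np.arange(len(values)), idx] = 1.0
    return out

vals = np.array([1.0, 2.0, 3.0, 4.0, 100.0, 200.0, 300.0, 400.0])
enc = quantile_one_hot(vals)     # 8 nodes, 4 equally populated bins
```

Each row is a valid indicator vector, so neighborhood sums become interpretable bin counts.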
- Research Article
- 10.1200/jco.2025.43.16_suppl.e14626
- Jun 1, 2025
- Journal of Clinical Oncology
- Ryan Michael Carr + 16 more
e14626 Background: Despite advancements in neoadjuvant therapy and surgical techniques, patients with pancreatic ductal adenocarcinoma (PDAC) often experience high recurrence rates after total neoadjuvant therapy (TNT) and resection. Existing tumor response scoring systems, such as the College of American Pathologists (CAP) guidelines, fail to stratify recurrence risk effectively, emphasizing the need for novel, objective approaches. Methods: This study evaluated spatial characteristics of residual cancer and stroma in a multi-site cohort of 203 patients with resected PDAC following TNT, not having achieved a major pathologic response (CAP 2 or 3). Whole slide images (WSI) of hematoxylin and eosin-stained sections were digitized and analyzed using artificial intelligence to perform tissue segmentation. Metrics quantifying spatial morphology and configuration, inspired by principles of landscape ecology, were computed to explore spatial features of the tumor microenvironment (TME). Statistical models (recursive partitioning and regression tree [RPRT] model and Cox model with backwards elimination) incorporating these features were developed to stratify recurrence risk. Results: Five metrics were used to build the risk classification models. Three models successfully classified patients into high- and low-risk groups based on spatial metrics. The models demonstrated moderate discriminatory power (C-statistics: 0.560–0.566) and showed that metrics of increased tumor fragmentation, such as stromal patch density and cancer patch shape index, were strongly associated with improved outcomes. Conclusions: Integrating spatial metrics of cancer-stroma configuration enhances risk stratification beyond conventional scoring systems. Given that increased tumor fragmentation is associated with improved outcomes, these data indirectly suggest ecological mechanisms of treatment resistance. This approach offers a pathway toward personalized post-operative management strategies to improve outcomes in PDAC. 
Risk classification models (Event/n, median DFS in months with 95% CI, HR with 95% CI, p-value, C-statistic):
Model 1 (RPRT), C-statistic 0.562:
- High risk: log standard deviation of the stroma shape index > 0.651: 87/101, median DFS 8.22 (5.72–10.1), HR 1.64 (1.20–2.25), p = 0.002
- Low risk: log standard deviation of the stroma shape index ≤ 0.651: 71/102, median DFS 11.90 (9.24–25.1), HR Ref
Model 2 (RPRT), C-statistic 0.560:
- High risk: log standard deviation of the stroma shape index > 0.651 and mean cancer shape index ≤ 1.775: 58/63, median DFS 7.23 (5.19–9.63), HR 1.82 (1.31–2.52), p = 0.0003
- Low risk: (log standard deviation of the stroma shape index > 0.651 and mean cancer shape index > 1.775) or log standard deviation of the stroma shape index ≤ 0.651: 100/140, median DFS 11.57 (9.24–17.92), HR Ref
Model 3 (backwards elimination), C-statistic 0.566:
- High risk: PI* > median: 86/101, median DFS 8.02 (6.08–9.86), HR 1.73 (1.26–2.37), p = 0.0007
- Low risk: PI* ≤ median: 72/102, median DFS 15.72 (9.97–25.05), HR Ref
*PI = (0.2701 × log mean stroma area) + (0.3905 × log edge density).
- Research Article
- 10.1200/jco.2025.43.16_suppl.3520
- Jun 1, 2025
- Journal of Clinical Oncology
- Qian Shi + 19 more
3520 Background: Recent analyses highlight nonhierarchical outcomes using the 8th Edition AJCC staging system for CC. For instance, the 5-year survival rates for stage I and stage IIIa patients (pts) closely align. Additionally, tumor deposits (TDs) have been established as significant prognostic indicators. The AJCCCCEP commissioned this study to develop an updated pathological staging system for CC focused specifically on pts without distant metastasis (M0), while retaining the existing stage IV classification. Methods: Individual patient data (IPD) from pts diagnosed with CC (2010–2017) in the NCDB were divided into training (70%) and internal validation (30%) datasets. External validation used IPD from clinical trials. The primary endpoint was overall survival (OS). Risk classification development for M0 pts incorporated ungrouped data on pathologic T categories, the number of involved regional lymph nodes (LN+), and TD counts. Recursive partitioning and regression tree analyses were applied to construct hierarchical staging levels. Pre-specified criteria required survival probabilities to be consecutive and show clear separations using Kaplan-Meier (KM) estimates, with pairwise log-rank test P of < 0.005 for the training and < 0.05 for validation analyses. Results: Data from 281,997 pts (median age 67 years, 50% male, 81% white, 55% T3, 19% T4, 44% N+, 26% M+, and 11% with ≥1 TD) were analyzed, with a median follow-up of 7.3 years. The updated staging system (Table) met pre-specified criteria, with all observed pairwise P < 0.0001 in the development and internal validation sets. KM OS curves displayed a hierarchical separation across all sub-levels after the 1st year of diagnosis. Consistent results were seen in pts treated with adjuvant chemotherapy in 4 trials (all pairwise P < 0.0001). 
Conclusions: The proposed pathological staging system for M0 pts fulfills pre-specified criteria for hierarchical risk stratification, validated both internally and externally, and provides an evidence-based update. Pending the review process, the AJCCCCEP will recommend that these changes be made to the Version 9 staging protocol for colon cancer to improve prognostication for CC pts.

| Stage | T, # of LN+, # of TD | M | % of pts | 1y OS (CI), % | 3y OS (CI), % | 5y OS (CI), % |
|---|---|---|---|---|---|---|
| I | T1, 0, 0 | 0 | 5 | 96 (95–97) | 91 (90–92) | 84 (83–86) |
| IIa | T2, 0, 0 | 0 | 10 | 95 (94–95) | 88 (88–89) | 80 (79–81) |
| IIb | T1, 0, 1+; T1, 1+, 0; T2, 0, 1+; T2, 1-4, 0; T3, 0, 0 | 0 | 27 | 93 (92–93) | 84 (84–85) | 75 (75–76) |
| IIIa | T1, 1+, 1+; T2, 1-4, 1+; T2, 5+, 0; T3, 0, 1+; T3, 1-4, 0 | 0 | 14 | 92 (91–92) | 80 (80–81) | 71 (70–72) |
| IIIb | T2, 5+, 1+; T3, 1-4, 1+; T3, 5+, 0; T4a, 0-4, 0; T4b, 0-2, 0 | 0 | 13 | 86 (86–87) | 69 (68–70) | 58 (57–59) |
| IIIc | T3, 5+, 1+; T4a, 0-4, 1+; T4a, 5+, any; T4b, 0-2, 1+; T4b, 3+, any | 0 | 5 | 78 (77–80) | 53 (51–54) | 40 (38–41) |
| IVa | Any | 1a | 19 | 59 (58–60) | 28 (28–29) | 17 (16–18) |
| IVb | Any | 1b | 7 | 43 (42–44) | 14 (13–14) | 6 (6–7) |

CI: 95% confidence interval. Peritoneum involvement data were not available before 2018 in NCDB; thus, IVa/b were based on the 7th Edition.
- Research Article
- 10.1109/tcns.2025.3538472
- Jun 1, 2025
- IEEE Transactions on Control of Network Systems
- Leon Lan + 1 more
In transmission networks, power flows and network topology are deeply intertwined due to power flow physics. Recent literature shows that a more hierarchical network structure can effectively inhibit the propagation of line failures across the entire system. In particular, a novel approach named tree partitioning has been proposed, which seeks to bolster the robustness of power networks through strategic alterations in network topology, accomplished via targeted line switching actions. Several tree partitioning problem formulations have been proposed by considering different objectives, including power flow disruption and network congestion. Furthermore, various heuristic methods based on two-stage and recursive approaches have been proposed. The present work provides a general framework for tree partitioning problems based on mixed-integer linear programming (MILP). In particular, we present a novel MILP formulation to optimally solve tree partitioning problems and also propose a two-stage heuristic based on MILP. We perform extensive numerical experiments to solve two tree partitioning problem variants, demonstrating the excellent performance of our solution methods. Lastly, through exhaustive cascading failure simulations, we compare the effectiveness of various tree partitioning strategies and show that, on average, they can achieve a substantial reduction in lost load compared to the original topologies.
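The feasibility condition behind all of these formulations can be checked directly: a partition of the buses is a tree partition when contracting each cluster to a single node leaves a tree, i.e. exactly k − 1 inter-cluster lines connecting all k clusters. A minimal verification sketch (toy network, not the paper's MILP):

```python
def is_tree_partition(edges, clusters):
    """Check the defining property of a tree partition: after contracting
    each cluster of buses to one node, the inter-cluster lines must form a
    tree — exactly k - 1 of them, connecting all k clusters."""
    label = {v: i for i, part in enumerate(clusters) for v in part}
    inter = [(label[u], label[v]) for u, v in edges if label[u] != label[v]]
    k = len(clusters)
    if len(inter) != k - 1:
        return False
    parent = list(range(k))          # union-find for connectivity
    def find(a):
        while parent[a] != a:
            parent[a] = parent[parent[a]]
            a = parent[a]
        return a
    for a, b in inter:
        parent[find(a)] = find(b)
    return len({find(i) for i in range(k)}) == 1

# 6-bus toy network: two triangles joined by the line (3, 4)
grid = [(1, 2), (2, 3), (3, 1), (3, 4), (4, 5), (5, 6), (6, 4)]
good = [{1, 2, 3}, {4, 5, 6}]    # one tie-line (3, 4): a tree partition
bad = [{1, 2, 3, 4}, {5, 6}]     # tie-lines (4, 5) and (6, 4): a cycle
```

The MILP formulations in the paper search over line switchings so that the surviving topology satisfies exactly this check while optimizing disruption or congestion objectives.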
- Research Article
- 10.5269/bspm.62847
- May 29, 2025
- Boletim da Sociedade Paranaense de Matemática
- Rakhal Das + 2 more
In this paper, we introduce the concept of a spatial binary multiset topological relation and a binary space. We use the properties of binary space partition trees and spatial binary relations. We define a spatial multiset topology and study its various properties.
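As an illustration of the underlying data structure, a minimal axis-aligned 2-D binary space partition (BSP) tree might look like the sketch below (the `BSPNode` class and `leaf_size` parameter are hypothetical conveniences, not from the paper):

```python
class BSPNode:
    """Minimal 2-D BSP tree: each internal node splits space with an
    axis-aligned line at the median coordinate; points fall to the back
    (below the split) or front (at or above it) child."""

    def __init__(self, points, axis=0, leaf_size=2):
        self.points, self.split, self.children = points, None, None
        if len(points) > leaf_size:
            coords = sorted(p[axis] for p in points)
            self.axis, self.split = axis, coords[len(coords) // 2]
            back = [p for p in points if p[axis] < self.split]
            front = [p for p in points if p[axis] >= self.split]
            if back and front:                      # alternate split axes
                self.children = (BSPNode(back, 1 - axis, leaf_size),
                                 BSPNode(front, 1 - axis, leaf_size))

    def locate(self, p):
        """Return the leaf cell whose region contains point p."""
        if self.children is None:
            return self
        return self.children[0 if p[self.axis] < self.split else 1].locate(p)

pts = [(0.0, 0.0), (1.0, 0.0), (0.0, 1.0), (1.0, 1.0), (0.5, 0.5), (2.0, 2.0)]
tree = BSPNode(pts)
```

Point location then walks one root-to-leaf path, which is the operation that topological constructions over BSP-partitioned space rely on.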