Articles published on the Hungarian algorithm
1183 Search results
- Research Article
- 10.1177/09544062251405169
- Dec 28, 2025
- Proceedings of the Institution of Mechanical Engineers, Part C: Journal of Mechanical Engineering Science
- Xianghong Wang + 4 more
Because the similar feature points of a chessboard cause low matching accuracy and efficiency in binocular vision measurement, this study proposes a matching method based on the homography matrix. Inner corner points of the chessboard are used to generate a homography matrix; similarity is evaluated through a cost matrix and optimized with the Hungarian algorithm to obtain optimally matched feature points, effectively resolving feature point confusion. Compared with other chessboard feature matching methods, the proposed method demonstrates higher accuracy and efficiency. The homography-matrix-based method maintains stable feature matching across varied chessboard and camera poses, giving it strong pose robustness under complex positional changes. The method is applied to visual vibration measurement using dual measurement points, identifying the vibration characteristics of a beam with high precision and reliability. It provides a theoretical basis and technical support for non-contact structural health inspection, advancing vibration monitoring in large-scale structural applications.
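The cost-matrix step in the abstract above (pairwise distances between corner points, solved with the Hungarian algorithm) can be sketched with SciPy's linear-sum-assignment solver. The point coordinates below are invented for illustration; the paper's actual cost matrix is built from homography-projected chessboard corners.

```python
import numpy as np
from scipy.optimize import linear_sum_assignment

# Hypothetical corner sets: "left" camera corners, and "right" camera
# corners that are a shifted copy listed in a scrambled order.
left_pts = np.array([[0.0, 0.0], [1.0, 0.0], [0.0, 1.0], [1.0, 1.0]])
right_pts = np.array([[1.1, 1.0], [0.1, 0.0], [1.1, 0.0], [0.1, 1.0]])

# Cost matrix: Euclidean distance between every left/right pair.
cost = np.linalg.norm(left_pts[:, None, :] - right_pts[None, :, :], axis=2)

# Hungarian algorithm: minimum-cost one-to-one assignment.
rows, cols = linear_sum_assignment(cost)
matches = list(zip(rows.tolist(), cols.tolist()))  # -> [(0, 1), (1, 2), (2, 3), (3, 0)]
```

Each left corner is paired with the right corner 0.1 units away, despite the scrambled ordering, which is exactly the confusion-resolution role the abstract assigns to the Hungarian step.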
- Research Article
- 10.1088/2057-1976/ae2a37
- Dec 18, 2025
- Biomedical Physics & Engineering Express
- Wenjia Song + 4 more
Early and accurate detection of pulmonary nodules in computed tomography (CT) scans is critical for reducing lung cancer mortality. While convolutional neural networks (CNNs) and Transformer-based architectures have been widely used for this task, they often suffer from insufficient global context awareness, quadratic complexity, and dependence on post-processing steps such as non-maximum suppression (NMS). This study aims to develop a novel 3D lung nodule detection framework that balances local and global contextual awareness with low computational complexity, while minimizing reliance on manual threshold tuning and redundant post-processing. We propose FCMamba, a flexible connected visual state-space model adapted from the recently introduced Mamba architecture. To enhance spatial modelling, we introduce a flexible path encoding strategy that reorders 3D feature sequences adaptively based on input relevance. In addition, a Top Query Matcher, guided by the Hungarian matching algorithm, is integrated into the training process to replace traditional NMS and enable end-to-end one-to-one nodule matching. The model is trained and evaluated using 10-fold cross-validation on the LIDC-IDRI dataset, which contains 888 CT scans. FCMamba outperforms several state-of-the-art methods, including CNN, Transformer, and hybrid models, across seven predefined false positives per scan (FPs/scan) levels. It achieves a sensitivity improvement of 2.6% to 20.3% at low FPs/scan (0.125) and delivers the highest CPM and FROC-AUC scores. The proposed method demonstrates balanced performance across nodule sizes, reduced false positives, and improved robustness, particularly in high-confidence predictions. FCMamba provides an efficient, scalable and accurate solution for 3D lung nodule detection. Its flexible spatial modeling and elimination of post-processing make it well-suited for clinical usage and adaptable to other medical imaging tasks.
- Research Article
- 10.3390/ijfs13040243
- Dec 17, 2025
- International Journal of Financial Studies
- Iveta Grigorova + 2 more
Customer segmentation is essential in financial services for designing targeted interventions, managing dormant portfolios, and supporting marketing re-engagement strategies. Traditional approaches such as Recency–Frequency–Monetary (RFM) analysis offer interpretability but often lack the flexibility needed to capture heterogeneous behavioral patterns. This study presents an automated segmentation framework that integrates machine learning-based clustering with RFM-based interpretability benchmarks. KMeans and Hierarchical clustering are evaluated across multiple values of k using internal validity metrics (Silhouette Coefficient, Davies–Bouldin Index) and interpretability alignment measures (Adjusted Rand Index, Normalized Mutual Information, Homogeneity, Completeness, and V-Measure). The Hungarian algorithm is used to align machine-learned clusters with RFM segments for comparability. The framework reveals behavioral subgroups not captured by RFM alone, demonstrating that machine learning can expose hidden heterogeneity within dormant customer populations. While outcome-based financial validation is not yet feasible due to the cold-start nature of the deployment environment, the study provides a reproducible, scalable pipeline for segmentation that balances analytical rigor with business interpretability. The findings highlight how data-driven clustering can refine traditional segmentation logic, supporting more nuanced portfolio monitoring and re-engagement strategies in financial services.
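The Hungarian-based alignment of machine-learned clusters to RFM segments described above is typically done by building a contingency table of label overlaps and solving a maximum-weight matching on it. A minimal sketch with invented labels (not the study's data):

```python
import numpy as np
from scipy.optimize import linear_sum_assignment

# Hypothetical labels: machine-learned clusters vs. reference RFM segments.
clusters = np.array([0, 0, 1, 1, 2, 2, 2, 1])
rfm      = np.array([2, 2, 0, 0, 1, 1, 1, 0])

# Contingency table: overlap[i, j] = samples in cluster i and segment j.
k = 3
overlap = np.zeros((k, k), dtype=int)
for c, r in zip(clusters, rfm):
    overlap[c, r] += 1

# Hungarian on the negated table gives the max-overlap relabelling.
rows, cols = linear_sum_assignment(-overlap)
mapping = dict(zip(rows.tolist(), cols.tolist()))  # cluster -> segment
relabelled = np.array([mapping[c] for c in clusters])
agreement = (relabelled == rfm).mean()
```

With clusters renamed via `mapping`, comparability measures such as Homogeneity or V-Measure can be read against the RFM benchmark on a like-for-like basis.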
- Research Article
- 10.7717/peerj-cs.3385
- Dec 16, 2025
- PeerJ Computer Science
- Nesreen Alsharman + 4 more
Cloud computing offers numerous benefits to its users, but it also presents significant performance challenges. The nondeterministic polynomial time (NP)-complete nature of cloud workflow scheduling makes it a highly challenging task, and scheduling cloud tasks becomes considerably more complex when operations involve varying quality-of-service (QoS) requirements. Constrained workflow scheduling, however, has the potential to boost cloud system performance and consequently improve quality of service. Although numerous approaches have been developed for workflow scheduling, most focus exclusively on single QoS constraints. This study presents a method for utilizing the Hungarian Algorithm (HA) to address multiple workflow scheduling constraints and promote on-demand cloud services. The Fittest Task Population Algorithm (FTPA) was developed to generate the fittest task population set that matches the customers' task constraints; the HA then assigns each task in this set to the fittest cloud virtual machine (VM). The proposed approach is validated and compared with state-of-the-art workflow scheduling methods across several multiple-constraint scheduling scenarios. The comparative analysis confirms the effectiveness of the integrated FTPA-HA algorithm, demonstrating its superiority over existing scheduling approaches.
- Research Article
- 10.1177/00202940251398968
- Dec 15, 2025
- Measurement and Control
- Yiming Zhao + 3 more
With the widespread deployment of smart meters in smart grids and the shift from wired to wireless communication, communication security issues are becoming increasingly prominent. Beyond dynamic attack threats such as malicious signal interference and meter data corruption, existing defenses also suffer from low detection efficiency and wasted resources. To address these problems, this paper proposes a dynamic attack-defense detection model based on a Bayesian game under resource-constrained conditions. To characterize the game behavior of attackers and defenders under incomplete information, a two-stage game process is designed to simulate attack scenarios such as eavesdropping and interference, and the Beta distribution is introduced for dynamic belief updates, enhancing the defender's adaptability to the attack situation. Combining the performance and communication characteristics of heterogeneous nodes, a bipartite graph matching mechanism is introduced, and defense resources are allocated optimally via the Hungarian algorithm, reducing the overall defense cost while ensuring system security. Simulation results show that the proposed model effectively improves the defense success rate in a dynamic environment, outperforming random defense, full-coverage defense, and traditional machine learning methods. The strategy optimized by the Hungarian algorithm reduces costs and increases benefits, and its feasibility and efficiency in actual smart grid applications are verified.
- Research Article
- 10.3390/app152413147
- Dec 14, 2025
- Applied Sciences
- Roberto Lázaro + 2 more
The representativeness of long-term wind data at a site remains a challenge, as it is essential for resource analysis, production adjustment in operating plants, and the simulation of hybridised plants. A representative one-year hourly time series, known as a Wind Reference Year (WRY), is required, yet the availability of long-term real data is rare, making the estimation of WRY from reanalysis data and shorter measurement campaigns a common approach. In this study, Gaussian Mixture Copula Models (GMCM) and five regression models were applied and compared. The GMCM was trained using 15 years of reanalysis data to generate simulations, and subsequently, regression-based Measure–Correlate–Predict (MCP) methods were applied to adapt the simulated reference year to site-specific conditions. Finally, the Hungarian algorithm was used to reorder the simulated data series, aligning it with a typical wind pattern and producing the WRY dataset. The results were validated against 15 years of real measurements and benchmarked against a heuristic method based on long-term similarity of main wind parameters and the commercial tool Windographer. The findings demonstrate the potential of the proposed method, showing improvements over existing techniques and providing a robust approach to constructing representative WRY datasets.
- Research Article
- 10.34229/2707-451x.25.4.1
- Dec 8, 2025
- Cybernetics and Computer Technologies
- Dmitri Terzi
Introduction. The traveling salesman problem is an important object of research in various fields of science, economics, and technology, and constructing efficient algorithms with an optimality criterion for the obtained solution remains a relevant task. The traveling salesman problem is a transport-type problem, so a natural approach to its solution is to use methods for solving transport problems, in particular variants of the Hungarian method. Purpose. To develop a modification of the Hungarian method for solving the traveling salesman problem and to expand the capabilities of exact methods for transport-type problems. Results. The possibility of solving the traveling salesman problem by modifying the Hungarian method for the assignment problem is shown. The concepts of a set of cyclically independent elements and a cyclically independent zero of a given matrix are introduced, and the idea of the modification and one of its implementations are presented in these new terms. Computational experiments were conducted; the results are reported by two indicators: the average number of iterations and the processor time for a set of problems of a given dimension. Conclusions. The need for effective exact methods for the traveling salesman problem leads to new ideas and approaches. In this sense, it is useful to apply methods for solving transport-type problems, since the traveling salesman problem belongs to this class. It is shown that, based on the modification of the Hungarian method, an effective method can be built for obtaining an exact or approximate solution, depending on the specific situation. Keywords: traveling salesman problem, cyclically independent elements, modified Hungarian method.
- Research Article
- 10.1038/s41598-025-27309-x
- Dec 5, 2025
- Scientific Reports
- Lan Cao + 6 more
Traditional radar-camera calibration requires manual intervention and excessive computational resources, resulting in high maintenance labor costs in roadside perception scenarios. We therefore propose a continuous online calibration method for roadside integrated radar-camera devices, based on azimuth angle and multi-frame tracking. First, radar-camera corresponding points are matched by the target azimuth angle and its rate of change, achieving coarse calibration; this requires no manual roadside parameter measurement, only the camera intrinsic parameters obtained in the laboratory. Second, the Hungarian tracking algorithm matches camera-radar point pairs over a larger range, yielding the fine calibration matrix. Additionally, a validation criterion is established, ensuring that fine calibration operates continuously and adjusts promptly when the device pose changes. To verify the efficiency of the proposed method, real roadside experiments were conducted in a traffic-dense scenario. The results show that the proposed method reduces the reprojection error by 25% compared with manual calibration and by 55% compared with another automatic calibration method. The approach significantly enhances calibration accuracy and robustness in complex environments and can provide reliable technical support for intelligent transportation systems.
- Research Article
- 10.69533/1yvhdt46
- Dec 1, 2025
- Jurnal Ilmiah Informatika dan Komputer
- Tirsa Ninia Lina + 4 more
This study aims to optimize employee task assignments at Serupa Café, which faces issues of workload imbalance and inefficient task completion times. The research gap lies in the limited application of algorithm-based optimization methods within human resource management in the food and beverage service sector. To address this, the study applies the Hungarian Method, an algorithmic approach with a time complexity of O(n³), to determine the most efficient pairing between employees and tasks. The computation and validation processes were carried out using the POM-QM for Windows software as a quantitative analysis tool. The results indicate that the optimal assignment configuration achieved a total completion time of 85 minutes, with complete consistency between manual and software-generated results. System testing demonstrated high computational efficiency and no logical errors during data processing. Therefore, this study concludes that integrating the Hungarian Method with POM-QM significantly enhances operational performance and contributes to the application of informatics techniques for optimizing human resource management within the service industry.
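The employee-task assignment the abstract describes is the classic linear assignment problem that the Hungarian Method solves in O(n³). A minimal sketch with a hypothetical 3x3 time matrix (not the café's actual data):

```python
import numpy as np
from scipy.optimize import linear_sum_assignment

# Hypothetical completion times in minutes: rows = employees, cols = tasks.
times = np.array([
    [30, 25, 40],
    [20, 35, 30],
    [35, 30, 25],
])

# Hungarian solve: one task per employee, minimizing total time.
rows, cols = linear_sum_assignment(times)
total = int(times[rows, cols].sum())  # 25 + 20 + 25 = 70 minutes
```

Tools such as POM-QM apply the same optimality criterion; checking a hand solution against a solver, as the study does, should always yield the same minimum total.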
- Research Article
- 10.3390/jmse13122249
- Nov 26, 2025
- Journal of Marine Science and Engineering
- Zhuo Wang + 4 more
Autonomous Underwater Vehicle (AUV) swarms possess advantages such as efficiency, reliability, flexibility, and extensive coverage in underwater operations. However, their coordinated control is challenged by communication interruptions and actuator failures in complex marine environments. This paper proposes a fixed-time event-triggered fault-tolerant formation control method to address these challenges. First, the Prim algorithm and the Hungarian algorithm are employed to reconstruct the communication topology, mitigating AUV disconnections due to communication failures and ensuring formation stability. Second, a fixed-time extended state observer (ESO) is designed to estimate the lumped disturbance arising from model uncertainties, unknown ocean disturbances, and actuator failures. Finally, a performance function is introduced to reformulate error variables, and a fixed-time event-triggered formation control law is developed based on an auxiliary saturation system and an event-triggering mechanism. In addition, this paper demonstrates the stability of the entire closed-loop system, and no Zeno phenomenon will occur. Simulation experiments demonstrate the effectiveness and superiority of the proposed method in maintaining robust formation control of AUV systems under adverse conditions.
- Research Article
- 10.1007/s44257-025-00048-z
- Nov 26, 2025
- Discover Analytics
- Emmanuel Kofi Gbey + 1 more
With rising urban populations, optimizing passenger group-to-vehicle allocation (PGVA) is critical for enhancing ride-sharing efficiency, particularly when integrated with transit networks. Existing PGVA methods often underperform by overlooking arithmetic compatibility between passenger group sizes and vehicle capacities. Traditional approaches prioritize spatial or temporal factors but neglect structural relationships inherent in passenger group-vehicle matching. This study introduces the Greatest Common Divisor (GCD) method, a novel framework leveraging number-theoretic principles to optimize resource allocation. The GCD-based method addresses the PGVA problem by decomposing passenger group sizes and vehicle capacities into prime factors, ensuring mathematically rigorous compatibility while minimizing wasted capacity and computational complexity. Under the tested simulation conditions, the GCD-based method demonstrated superior performance in reducing empty vehicle miles traveled (eVMT) and vehicle miles traveled (VMT) compared to the benchmark algorithms: it reduced eVMT and VMT by over 70% and 85%, respectively, compared to the Hungarian algorithm, while avoiding the inefficiencies of a first-come-first-served strategy. The GCD-based compatibility score successfully encodes the qualitative notion of a “good fit”, leading to more efficient resource utilization and directly contributing to the model's performance. While more computationally intensive, the proposed GCD-based model solves problems of realistic scale within a timeframe practical for operational deployment in modern ride-sharing platforms. The method bridges a critical gap in ride-sharing optimization and aligns with sustainability goals through inherent resource efficiency. This study supports data-driven strategies for passenger-centric mobility systems that balance demand, capacity, and environmental impact by prioritizing arithmetic alignment.
- Research Article
- 10.1371/journal.pone.0326662.r004
- Nov 21, 2025
- PLOS One
- Qingnan Ji + 3 more
In modern multimodal interaction design, integrating information from diverse modalities—such as speech, vision, and text—presents a significant challenge. These modalities differ in structure, timing, and data volume, often leading to mismatches, low computational efficiency, and suboptimal user experiences during the integration process. This study aims to enhance both the efficiency and accuracy of multimodal information fusion. To achieve this, publicly available datasets—Carnegie Mellon University Multimodal Opinion Sentiment Intensity (CMU-MOSI) and Interactive Emotional Dyadic Motion Capture (IEMOCAP)—are employed to collect speech, visual, and textual data relevant to multimodal interaction scenarios. The data undergo preprocessing steps including noise reduction, feature extraction (e.g., Mel Frequency Cepstral Coefficients and keypoint detection), and temporal alignment. An improved Kuhn-Munkres algorithm is then proposed, extending the traditional bipartite graph matching model to support weighted multimodal matching. The algorithm dynamically adjusts weight coefficients based on the importance scores of each modality, while also incorporating a cross-modal correlation matrix as a constraint to improve the robustness of the matching process. The enhanced algorithm’s performance is validated through information matching efficiency tests and user interaction satisfaction surveys. Experimental results show that it improves multimodal information matching accuracy by 28.2% over the baseline method. Integration efficiency increases by 18.7%, and computational complexity is significantly reduced, with average computation time decreased by 15.4%. User satisfaction also improves, with a 19.5% increase in experience ratings. Ablation studies further confirm the critical contribution of both the dynamic weighting mechanism and the correlation matrix constraint to the overall performance. 
This study introduces a novel optimization strategy for multimodal information integration, offering substantial theoretical value and broad applicability in intelligent interaction design and human-computer collaboration. These advancements contribute meaningfully to the development of next-generation multimodal interaction systems.
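The weighted multimodal matching described above can be approximated by combining per-modality cost matrices with importance weights before a single Kuhn-Munkres (Hungarian) solve. The matrices and weights below are invented for illustration; the paper's dynamic weight adjustment and cross-modal correlation constraint are not reproduced here.

```python
import numpy as np
from scipy.optimize import linear_sum_assignment

# Hypothetical per-modality cost matrices for matching 3 source items
# to 3 target slots, plus fixed importance weights per modality.
costs = {
    "speech": np.array([[0.2, 0.8, 0.9], [0.7, 0.3, 0.8], [0.9, 0.7, 0.1]]),
    "vision": np.array([[0.3, 0.9, 0.8], [0.8, 0.2, 0.9], [0.8, 0.9, 0.2]]),
    "text":   np.array([[0.1, 0.7, 0.9], [0.9, 0.4, 0.7], [0.7, 0.8, 0.3]]),
}
weights = {"speech": 0.5, "vision": 0.3, "text": 0.2}

# Weighted combination, then one Kuhn-Munkres solve over the fused costs.
combined = sum(w * costs[m] for m, w in weights.items())
rows, cols = linear_sum_assignment(combined)
```

In the paper the weights are adjusted dynamically per modality importance; here they are constants purely to show where the weighting enters the matching model.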
- Research Article
- 10.3390/iic1030010
- Nov 18, 2025
- Intelligent Infrastructure and Construction
- Honglin Mu + 3 more
Tunnels serve as a critical hub in urban transportation networks; their monotonous and enclosed environment is prone to inducing speeding behavior, necessitating an efficient vehicle speed monitoring system. Traditional methods suffer from high costs and slow response times, making them inadequate for the complex scenarios encountered in tunnel environments. This study proposes a real-time tunnel vehicle speed monitoring system based on YOLOv8s and DeepSORT. YOLOv8s is used to detect and classify cars, trucks, and buses, while DeepSORT applies Kalman filtering and the Hungarian algorithm to construct motion trajectories. Vehicle speed is estimated through perspective geometric transformation combined with a sliding-window approach, with a speeding threshold of 100 km/h and corresponding visual alerts. Using surveillance video from an expressway tunnel as the dataset, the system achieved detection accuracies of 98% for cars, 96% for trucks, and 91% for buses. Speed detection performance metrics included an average speed deviation (ASD) of 2.54 km/h, a deviation degree of vehicle speed (DDVS) of 3.12, vehicle speed stability (VST) of 1.22, and speed difference ratio (SDR) of 2.9%. Analysis revealed a longitudinal “deceleration–acceleration–deceleration” inverted U-shaped speed profile along the tunnel. Statistical tests confirmed these findings: the Mann–Whitney U test showed highly significant differences in vehicle speeds between cars and trucks across different tunnel sections, and the Kruskal–Wallis test further indicated significant speed variations across the entrance, middle, and exit segments for both vehicle types.
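The DeepSORT association step mentioned above (matching Kalman-predicted track boxes to new detections) is commonly implemented as a Hungarian solve over an IoU-based cost matrix. A simplified sketch with made-up boxes, omitting DeepSORT's appearance features and gating:

```python
import numpy as np
from scipy.optimize import linear_sum_assignment

def iou(a, b):
    """Intersection-over-union of two boxes given as [x1, y1, x2, y2]."""
    x1, y1 = max(a[0], b[0]), max(a[1], b[1])
    x2, y2 = min(a[2], b[2]), min(a[3], b[3])
    inter = max(0.0, x2 - x1) * max(0.0, y2 - y1)
    area_a = (a[2] - a[0]) * (a[3] - a[1])
    area_b = (b[2] - b[0]) * (b[3] - b[1])
    return inter / (area_a + area_b - inter)

# Hypothetical boxes: Kalman-predicted tracks vs. new YOLO detections.
tracks = [[0, 0, 10, 10], [20, 20, 30, 30]]
dets   = [[21, 19, 31, 29], [1, 1, 11, 11]]

# Cost = 1 - IoU, so high-overlap pairs are cheap to match.
cost = np.array([[1.0 - iou(t, d) for d in dets] for t in tracks])
rows, cols = linear_sum_assignment(cost)
pairs = list(zip(rows.tolist(), cols.tolist()))
```

Matched pairs update the Kalman state of each trajectory; unmatched detections would spawn new tracks, which is the mechanism behind the motion trajectories used for speed estimation.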
- Research Article
- 10.1177/03913988251360543
- Nov 3, 2025
- The International journal of artificial organs
- Hemalata Nawale + 1 more
Globally, heart disease (HD) persists as a major contributor to mortality rates, requiring accurate and efficient diagnostic models. While machine learning has shown promise in early detection, challenges such as missing data, class imbalance, suboptimal feature selection, and inefficient hyperparameter tuning hinder predictive accuracy and reliability. Many existing models fail to effectively preprocess medical datasets, leading to biased and computationally expensive predictions. To address these issues, this study proposes a robust hybrid framework for HD prediction. The Balanced Imputation-Normalization Framework incorporates K-Nearest Neighbors (KNN) imputation, StandardScaler normalization, and the Synthetic Minority Oversampling Technique (SMOTE). KNN imputation effectively handles missing data, ensuring reliable representation, while StandardScaler normalization standardizes feature values to enhance model stability. SMOTE addresses class imbalance by generating synthetic samples to augment the minority class. Feature selection is optimized using the Hungarian algorithm, which systematically selects the most relevant attributes while reducing redundancy. Additionally, Bayesian optimization fine-tunes hyperparameters to improve classification performance. For prediction, an ensemble learning approach combines Random Forest (RF), Decision Tree (DT), K-Nearest Neighbors (KNN), Naïve Bayes (NB), and Extreme Gradient Boosting (XGBoost). The Voting Ensemble aggregates predictions using hard and soft voting mechanisms, improving robustness and generalization. Experimental results on benchmark heart disease datasets show that XGBoost attained a peak accuracy of 96.43%, followed by the Voting Ensemble at 95.66%, significantly outperforming traditional models and demonstrating that ensemble learning effectively improves accuracy and reduces computational complexity.
- Research Article
- 10.1016/j.knosys.2025.114543
- Nov 1, 2025
- Knowledge-Based Systems
- Yuhan Guo + 4 more
Dynamic pick-up point recommendation with multi-modal deep forest and incentive-based adaptive Kuhn-Munkres Algorithm
- Research Article
- 10.1021/acsomega.5c09442
- Oct 31, 2025
- ACS Omega
- Yijian Xu + 3 more
The accurate tracking of particle trajectories in dense granular flows is vital for optimizing industrial processes involving rotary drums. However, traditional tracking methods suffer from poor stability and low accuracy under complex conditions. To address these issues, this paper proposes HU-FlowNet, a trajectory detection method integrating U-Net for particle localization, UnLiteFlowNet for velocity field estimation, and the Hungarian algorithm for global position matching. Specifically, UnLiteFlowNet is used to predict particle positions at frame i + 1 based on their locations at frame i, while U-Net detects the observed particle positions in frame i + 1. The Hungarian algorithm is then employed to establish optimal correspondences between the predicted and observed positions, thereby determining the updated locations of particles and achieving accurate tracking across frames. Experiments on dense particle flow in rotating drums demonstrate that HU-FlowNet outperforms conventional trackers (Euclidean, CSRT, Boosting, MedianFlow, and simple online and realtime tracking), achieving a root mean square error of 4.854, a mean absolute error of 1.913, and a trajectory coverage rate of 94%. The proposed method demonstrates strong robustness across varying operating conditions, validating its effectiveness and generalization capability and enabling stable, continuous tracking of hundreds of particles in complex, dense environments.
- Research Article
- 10.1088/2631-8695/ae15d3
- Oct 30, 2025
- Engineering Research Express
- Shihao Gu + 5 more
In dynamic environments, moving objects introduce unstable features that significantly degrade the accuracy of simultaneous localization and mapping (SLAM) systems. To address this issue, we propose Neural-KF, a robust visual SLAM framework that integrates three key modules: (1) a modified SuperPoint network with multi-level feature fusion for reliable static keypoint extraction, (2) a YOLOv8-based dynamic object detector, and (3) a Kalman-consistent state estimation mechanism that predicts object motion trajectories to enhance temporal consistency. By associating predicted and detected bounding boxes via the Hungarian algorithm, Neural-KF achieves accurate suppression of dynamic points while preserving sufficient static features for pose estimation. Experimental evaluations on public datasets, including KITTI and EuRoC, demonstrate that Neural-KF improves absolute trajectory error by up to 28% compared to VINS-Fusion and achieves competitive accuracy against advanced dynamic SLAM systems such as DynaSLAM. Furthermore, the system maintains real-time performance (>30 FPS) with a balanced trade-off between accuracy and computational cost. These results highlight the effectiveness of Neural-KF in achieving robust and efficient visual odometry under challenging dynamic conditions.
- Research Article
- 10.30560/ijas.v8n4p16
- Oct 28, 2025
- International Journal of Applied Science
- S B R D Dhananjalee + 1 more
This study proposes a genetic algorithm (GA)-based approach for solving assignment problems, aiming to minimize costs or maximize profits by determining optimal resource allocations for tasks in both balanced and unbalanced cases. Genetic algorithms, inspired by the concept of natural selection, are widely recognized for their flexibility in handling complex optimization problems. In this research, tournament selection is used to identify the best candidates based on a fitness function that minimizes total cost. A specialized crossover strategy focuses on selecting the lowest-cost assignments with the highest penalty costs. Additionally, swap mutation is applied to prevent redundant assignments. Various numerical examples are provided to illustrate the effectiveness of the proposed method in practical situations. Over time, several techniques like the Maximum Difference Cost Method, New Revised Zero's to One's Method, Bottleneck Cost Method, Modified Ant Colony Optimization Algorithm, and Matrix One's Assignment Method have been developed to handle assignment problems. Among these, the Hungarian Method, introduced in 1955, is well known for providing accurate solutions to small and medium-sized problems. However, this study introduces a novel and generalized approach using a genetic algorithm that can efficiently solve assignment problems involving any number of jobs and machines. The effectiveness of the proposed method has been validated through experiments on benchmark problems and randomly generated datasets for both balanced and unbalanced scenarios. The numerical results confirm the practical applicability of the proposed method and its potential as a powerful tool for solving real-world assignment problems.
- Research Article
- 10.1021/acs.jcim.5c02099
- Oct 8, 2025
- Journal of chemical information and modeling
- Xiaoqi Wei + 5 more
Root-mean-square deviation (RMSD) is widely used to assess structural similarity in systems ranging from flexible ligand conformers to complex molecular cluster configurations. Despite its wide utility, the RMSD calculation is often challenged by inconsistent atom ordering, indistinguishable configurations in molecular clusters, and potential chirality inversion during alignment. These issues highlight the necessity of accurately establishing atom-to-atom correspondence as a prerequisite for meaningful alignment. Traditional approaches often rely on heuristic cost matrices combined with the Hungarian algorithm, yet these methods underutilize the rich intramolecular structural information and may fail to generalize across chemically diverse systems. In this work, we introduce OTMol, a method that formulates the molecular alignment task as a fused supervised Gromov-Wasserstein (fsGW) optimal transport problem. By leveraging the intrinsic geometric and topological relationships within each molecule, OTMol eliminates the need for manually defined cost functions and enables a principled, data-driven matching strategy. Importantly, OTMol preserves key chemical features, such as molecular chirality and bond connectivity consistency. We evaluate OTMol across a wide range of molecular systems, including adenosine triphosphate, imatinib, lipids, small peptides, and water clusters, and demonstrate that it consistently achieves low RMSD values while preserving computational efficiency. Furthermore, OTMol maintains molecular integrity by enforcing one-to-one mappings between entire molecules, thereby avoiding the erroneous many-to-one alignments that often arise when comparing molecular clusters. Our results underscore the utility of optimal transport theory for molecular alignment and offer a generalizable framework applicable to structural comparison tasks in cheminformatics, molecular modeling, and related disciplines.
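The heuristic Hungarian baseline that the abstract contrasts OTMol against can be sketched as follows: a plain inter-atomic distance cost matrix, one assignment solve, then RMSD over the matched pairs. Coordinates are invented, and element types and chirality (which a real implementation must respect) are ignored here:

```python
import numpy as np
from scipy.optimize import linear_sum_assignment

def hungarian_rmsd(P, Q):
    """RMSD after Hungarian reordering of Q's atoms to match P.

    A simplified baseline: the cost matrix is plain inter-atomic
    distance, assuming all atoms are of the same element and both
    structures are already rigidly aligned.
    """
    cost = np.linalg.norm(P[:, None, :] - Q[None, :, :], axis=2)
    rows, cols = linear_sum_assignment(cost)
    diff = P[rows] - Q[cols]
    return float(np.sqrt((diff ** 2).sum() / len(P)))

# Same 3-atom structure with atoms listed in a different order:
# the assignment step recovers the permutation, so RMSD is zero.
P = np.array([[0.0, 0.0, 0.0], [1.5, 0.0, 0.0], [0.0, 1.5, 0.0]])
Q = P[[2, 0, 1]]

rmsd = hungarian_rmsd(P, Q)
```

The failure modes the paper targets appear when atoms of different elements, symmetric clusters, or mirror images make such a distance-only cost matrix ambiguous.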
- Research Article
- 10.1016/j.optlastec.2025.112970
- Oct 1, 2025
- Optics & Laser Technology
- Meiyun Chen + 4 more
Combining spot image denoising network and Hungarian matching algorithm: Achieving high-precision measurement of aspherical morphology