Articles published on Prior Information
- New
- Research Article
- 10.1016/j.measurement.2025.117986
- Dec 1, 2025
- Measurement
- Yang Wu + 5 more
Feasibility analysis of EIT-guided lung tumor tracking with prior information for robotic arm-assisted radiotherapy
- New
- Research Article
- 10.1016/j.engappai.2025.112262
- Dec 1, 2025
- Engineering Applications of Artificial Intelligence
- Gang Liu + 4 more
A multi-modal model for removal of crack image shadow based on language prior information
- New
- Research Article
- 10.64509/jicn.12.31
- Nov 30, 2025
- Journal of Intelligent Computing and Networking
- Yufei Zheng + 4 more
Occluded person re-identification aims to retrieve holistic images of a given identity based on occluded person images. Most existing approaches primarily focus on aligning visible body parts using prior information, applying occlusion augmentation to predefined regions, or complementing the missing semantics of occluded body parts with the assistance of holistic images. Nevertheless, they struggle to generalize across diverse occlusion scenarios that are absent from the training data and often overlook the pervasive issue of feature contamination caused by holistic images. In this work, we propose a novel Occlusion-Guided Feature Purification Learning via Reinforced Knowledge Distillation (OGFR) to address these two issues simultaneously. OGFR adopts a teacher-student distillation architecture that effectively incorporates diverse occlusion patterns into feature representation while transferring the purified discriminative holistic knowledge from the holistic to the occluded branch through reinforced knowledge distillation. Specifically, an Occlusion-Aware Vision Transformer is designed to leverage learnable occlusion pattern embeddings to explicitly model such diverse occlusion types, thereby guiding occlusion-aware robust feature representation. Moreover, we devise a Feature Erasing and Purification Module within the holistic branch, in which an agent is employed to identify low-quality patch tokens of holistic images that contain noisy negative information via deep reinforcement learning, and substitute these patch tokens with learnable embedding tokens to avoid feature contamination and further excavate identity-related discriminative clues. Afterward, with the assistance of knowledge distillation, the student branch effectively absorbs the purified holistic knowledge to precisely learn robust representation regardless of the interference of occlusions. 
Extensive experiments validate OGFR: on Occluded-Duke it achieves 76.6% Rank-1 and 64.7% mAP, outperforming the closest Transformer-based method by +3.3% Rank-1 and +2.4% mAP, with consistent gains on other benchmarks.
- New
- Research Article
- 10.55606/juisik.v5i3.1748
- Nov 25, 2025
- Jurnal ilmiah Sistem Informasi dan Ilmu Komputer
- Ismail Ismail + 2 more
Advancements in information technology have brought significant impacts in the healthcare sector, particularly in the medical diagnosis process. Expert systems, as a technological innovation, mimic the capabilities of human experts in making decisions based on knowledge bases and inference rules. The development of expert systems aims to improve the efficiency and accuracy of diagnosis, especially when facing uncertainty and variations in clinical data. This system integrates symptom data, diseases, and prior probabilities derived from epidemiological studies and expert medical experience. In this study, the author designed and implemented an expert system for diagnosing menstrual disorders based on the Bayes’ Theorem method. The system utilizes main components such as a knowledge base, inference engine, and an intuitive user interface. The system workflow begins with the collection of symptom data, calculating probabilities using Bayes’ Theorem, and ultimately delivering probabilistic diagnoses presented informatively to the user. Testing the system demonstrated satisfactory accuracy in identifying menstrual disorders such as menorrhagia, dysmenorrhea, and premenstrual syndrome (PMS). The results show that applying Bayes’ Theorem enhances system reliability in managing data uncertainty and provides diagnosis recommendations based on probability. This system is expected to serve as an effective tool for healthcare professionals and patients for early diagnosis of menstrual disorders while expanding access to accurate and trustworthy health information. Future development will focus on improving the knowledge base and integrating advanced technologies to maximize its benefits in reproductive health.
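The posterior calculation the abstract describes can be sketched in a few lines. The numbers below are purely illustrative (not from the study), and the naive-Bayes conditional-independence assumption across symptoms is an assumption of this sketch, not necessarily the system's design:

```python
def bayes_diagnosis(priors, likelihoods, observed):
    """Posterior probability of each disease given observed symptoms.

    priors: dict disease -> P(disease), e.g. from epidemiological data
    likelihoods: dict disease -> dict symptom -> P(symptom | disease)
    observed: list of symptoms reported by the user
    Assumes conditional independence of symptoms (naive Bayes).
    """
    unnorm = {}
    for disease, prior in priors.items():
        like = 1.0
        for symptom in observed:
            # unseen symptoms get a small floor probability
            like *= likelihoods[disease].get(symptom, 1e-6)
        unnorm[disease] = prior * like
    z = sum(unnorm.values())
    return {d: v / z for d, v in unnorm.items()}

# Hypothetical illustrative numbers, not taken from the paper:
priors = {"menorrhagia": 0.2, "dysmenorrhea": 0.5, "PMS": 0.3}
likelihoods = {
    "menorrhagia": {"heavy_bleeding": 0.9, "cramps": 0.4},
    "dysmenorrhea": {"heavy_bleeding": 0.2, "cramps": 0.95},
    "PMS": {"heavy_bleeding": 0.1, "cramps": 0.5},
}
post = bayes_diagnosis(priors, likelihoods, ["cramps"])
```

With these toy inputs, "cramps" shifts the posterior mass toward dysmenorrhea, exactly the probabilistic ranking behaviour the system presents to the user.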
- New
- Research Article
- 10.1149/ma2025-02542644mtgabs
- Nov 24, 2025
- Electrochemical Society Meeting Abstracts
- Kinshuk Panda + 5 more
Accurately predicting Li-ion battery capacity trajectories using early-life data can dramatically improve battery-life understanding and be used to rapidly evaluate design/cost/performance trade-offs when developing new battery materials. Accurate early-life predictions enable researchers to quickly iterate over cell designs and material precursor properties without consistently cycling cells to failure. To this end, we present a toolbox that uses a combined Gaussian process and Bayesian regression approach that capitalizes on signals other than just capacity (e.g., dQ/dV, voltage difference between charge and discharge) to predict capacity-fade trajectories. The prediction tool uses Bayesian regression to fit functional forms (e.g., power law, sigmoids) to predict capacity-fade dynamics. By fitting functional forms, the capacity fade can be interrogated at any point in the future, allowing for early cell-failure prediction. Additionally, Bayesian regression allows for accurate uncertainty estimates that account for cell-to-cell variability (aleatoric uncertainty) and the lack of observation data (epistemic uncertainty). By only using early cycle data to predict the capacity-fade trajectory, uncertainty bounds at end-of-life can be extremely large (see Fig. 1a). The large uncertainty bounds are further exacerbated because there is no systematic way to define the prior distribution of the functional forms' parameters. We improve the predicted trajectory confidence interval using two methods. First, we show that a small amount of held-out cycling data is sufficient to derive appropriate prior distributions for the functional forms' parameters, effectively leading to data-driven priors. We propose constructing the data-driven priors by first running Bayesian regression starting with uninformed priors to generate intermediate cell-specific posterior parameter distributions.
These posterior distributions are combined using a Gaussian mixture model for each parameter to create the data-driven priors. Second, we use a wide variety of electrochemical features extracted from the first few weeks of testing, such as changes in cell thermodynamics (the log of the mean absolute change in dQ/dV between weeks 0 and 3) or cell kinetics (the change in ΔV at 50% SOC between weeks 0 and 3), to train Gaussian process models that predict the capacity in the near-term future, and then use these predictions, with their associated uncertainty, in addition to the measured early-cycle data to estimate the degradation trajectory. Notably, these two methods (data-driven priors and prediction-informed trajectories) are complementary and can be combined. We evaluate the performance of our proposed method on an open-source dataset from Iowa State University and Iowa Lakes Community College (ISU-ILCC) [1]. Our initial findings suggest that, when only a few observations are available, a power-law functional form provides the most accurate predictions. However, a twin-sigmoidal function becomes more accurate as the number of observations increases. We also find that using as little as 10% of the dataset to generate data-driven priors can lead to significant improvement in prediction accuracy when using early-cycle data. Last, we found that augmenting early-cycle data with Gaussian process-predicted capacity data greatly improves the prediction accuracy. We will present a comprehensive comparison of our methods to other methods available in the literature [2] and apply this method to additional battery datasets. Figure 1. (a) Predicted capacity-fade trajectory for a particular cell without data-informed priors and without Gaussian process-predicted mid-life responses. (b) Predicted capacity-fade trajectory incorporating data-informed priors and Gaussian process-predicted mid-life responses.
The 3 red dots indicate measured responses, the black dots indicate future (unseen) responses, the black line indicates the averaged predicted trajectory using Bayesian regression, the blue shaded area indicates the prediction envelope, and the 14 green dots with error bars indicate the mid-life Gaussian process-predicted responses.
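The combination step the abstract describes, pooling cell-specific posterior parameter distributions into a mixture to form a data-driven prior, can be sketched as follows. The equal component weights and Gaussian per-cell posteriors are assumptions of this sketch, not details confirmed by the abstract:

```python
import numpy as np

def data_driven_prior_pdf(theta, cell_means, cell_stds):
    """Density of a data-driven prior for one functional-form parameter,
    built as an equal-weight Gaussian mixture over the per-cell posterior
    distributions obtained from an initial uninformed-prior regression."""
    theta = np.asarray(theta, dtype=float)
    dens = np.zeros_like(theta)
    for m, s in zip(cell_means, cell_stds):
        # one mixture component per held-out cell's posterior
        dens += np.exp(-0.5 * ((theta - m) / s) ** 2) / (s * np.sqrt(2 * np.pi))
    return dens / len(cell_means)
```

Evaluated on a grid, the mixture integrates to one and concentrates prior mass where the held-out cells' posteriors agree, which is what narrows the end-of-life uncertainty bounds.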
- New
- Research Article
- 10.1103/nkq9-zh2y
- Nov 24, 2025
- Physical Review D
- Sumit Kumar + 2 more
The parameter estimation (PE) for gravitational wave (GW) merger events relies on a waveform model calibrated using numerical simulations. Within the Bayesian framework, this waveform model represents the GW signal produced during the merger and is crucial for estimating the likelihood function. However, these waveform models may possess systematic errors that can differ across the parameter space. Addressing these errors in the current data analysis pipeline is an active area of research. We introduce parametrizations for the uncertainties in the amplitude and phase of the reference waveform model. When the error budget in the amplitude and phase of the waveform model, as a function of frequency, is known, it can be used as a prior distribution in the Bayesian framework. We also show that conservative priors can be used to quantify uncertainties in waveform modeling without any knowledge of waveform uncertainty error budgets. Through zero-noise injections and PE recoveries, we demonstrate that even 1%–2% of errors in relative phase to the actual waveform model, for a GW150914-like signal and advanced LIGO detector sensitivity, can introduce biases in the recovered parameters. These biases can be corrected when we account for waveform uncertainties within the PE framework. By analyzing a series of simulated signals from mergers with precessing orbits and recovering them using a nonspinning waveform model, we demonstrate that we can reduce the ratio of systematic errors to statistical errors. This approach allows us to address scenarios where specific physical effects are missing in waveform modeling. The code that implements our parametrization for performing PE is available as a Python package “pycbc_wferrors_plugin”, compatible with the PyCBC open source GW analysis library.
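The idea of perturbing a reference waveform by parametrized amplitude and phase errors can be illustrated with a minimal sketch. The linear interpolation between a few frequency nodes is an assumption chosen here for illustration; the parametrization actually implemented in pycbc_wferrors_plugin may differ:

```python
import numpy as np

def apply_waveform_errors(h_f, freqs, damp_nodes, dphi_nodes, node_freqs):
    """Perturb a frequency-domain waveform model h(f) by fractional
    amplitude errors dA(f) and phase errors dphi(f), interpolated from
    values at a few frequency nodes. In a PE run the node values would be
    sampled alongside the source parameters, with the waveform-error
    budget (or a conservative choice) acting as their prior."""
    damp = np.interp(freqs, node_freqs, damp_nodes)  # fractional amplitude error
    dphi = np.interp(freqs, node_freqs, dphi_nodes)  # phase error in radians
    return h_f * (1.0 + damp) * np.exp(1j * dphi)
```

A 1% amplitude node value scales |h(f)| by 1.01 across the band, the size of error the abstract reports is already enough to bias recovered parameters if unmodelled.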
- New
- Research Article
- 10.1017/jfm.2025.10863
- Nov 24, 2025
- Journal of Fluid Mechanics
- Lucas Villanueva + 3 more
A data assimilation (DA) strategy based on an ensemble Kalman filter (EnKF) is used to enhance the predictive capabilities of scale-resolving numerical tools for the analysis of flows exhibiting cyclic behaviour. More precisely, an ensemble of numerical runs using large-eddy simulations (LES) for a compressible intake flow rig is augmented via the integration of high-fidelity data. These observations take the form of instantaneous velocity measurements sampled at localised sensors in the physical domain. Two objectives are targeted. The first objective is the calibration of an unsteady inlet condition suitable to capture the cyclic flow investigated. The second objective is the analysis of the synchronisation of the LES velocity field with the available observations. In order to reduce the computational costs required for this analysis, a hyper-localisation procedure (HLEnKF) is proposed and integrated in the library CONES, tailored to perform fast online DA. The proposed strategy performs a satisfactory calibration of the inlet conditions, and its robustness is assessed using two different prior distributions for the free parameters optimised in this task. The DA state estimation is efficient in obtaining accurate local synchronisation of the inferred velocity fields with the observed data. The modal analysis of the kinetic energy field provides additional insight into the improved reconstruction quality of the velocity field. Thus, the HLEnKF shows promising features for the calibration and synchronisation of scale-resolved turbulent flows, opening perspectives of applications for complex phenomena using advanced tools such as digital twins.
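The analysis step underlying this strategy can be sketched as a textbook perturbed-observation EnKF update; this sketch omits the hyper-localisation that distinguishes the HLEnKF, and the linear observation operator is an assumption for illustration:

```python
import numpy as np

def enkf_update(ensemble, H, y_obs, obs_std, rng):
    """One stochastic (perturbed-observation) EnKF analysis step.

    ensemble: (n_members, n_state) prior ensemble of state vectors
    H:        (n_obs, n_state) linear observation operator
    y_obs:    (n_obs,) observed values (e.g. sensor velocities)
    """
    n = ensemble.shape[0]
    X = ensemble - ensemble.mean(axis=0)   # state anomalies
    Y = X @ H.T                            # observation-space anomalies
    R = (obs_std ** 2) * np.eye(len(y_obs))
    # Kalman gain: K = P H^T (H P H^T + R)^{-1}, with P = X^T X / (n-1)
    K = X.T @ Y @ np.linalg.inv(Y.T @ Y + (n - 1) * R)
    # each member assimilates a perturbed copy of the observation
    perturbed = y_obs + rng.normal(0.0, obs_std, size=(n, len(y_obs)))
    return ensemble + (perturbed - ensemble @ H.T) @ K.T
```

The update pulls the ensemble mean toward the observation and shrinks the ensemble spread, which is the mechanism behind both the inlet-condition calibration and the local synchronisation reported above.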
- New
- Research Article
- 10.3390/info16121020
- Nov 23, 2025
- Information
- Sumei Li + 2 more
As an economical and effective method to enhance the resolution of remote sensing images (RSIs), remote sensing image super-resolution (RSISR) has been widely studied. However, the existing methods lack the utilization of prior information in RSIs, which leads to unsatisfactory detail representation in the reconstructed images. To address this, in this paper, we propose a digital surface model (DSM) and fractal-guided multi-directional super-resolution network (DFMDN), which utilizes additional explicit priors from DSM to facilitate the reconstruction of realistic high-frequency details. Meanwhile, to more accurately identify relationships between objects in RSIs, we design a multi-directional feature extraction module: multi-directional residual-in-residual dense blocks (MDRRDB), which captures the variation from different viewing angles. Finally, to guide and constrain the network to generate reconstructed images with textures that align more closely with natural patterns, we develop a fractal mapping algorithm (FMA) and a related loss function. Our method demonstrates significant improvements in both quantitative metrics and visual quality compared to existing approaches on various datasets.
- New
- Research Article
- 10.1371/journal.pone.0335945
- Nov 21, 2025
- PLOS One
- Maral Babapour Chafi + 3 more
Activity-based Flexible Offices (AFOs) provide employees with a variety of workspaces to choose from based on their tasks, rather than having assigned desks. While the adoption of AFOs is increasing due to flexibility and cost-efficiency, there is limited research about the consequences of transitioning to AFOs from the perspective of staff managers. The purpose of this study is to explore how managers experience and cope with challenges that may arise in AFOs. Our qualitative descriptive study is based on two case studies that investigate the consequences of AFOs. Data collection involved semi-structured interviews with a total of 33 managers in two organisations, 12–18 months post-relocation. An inductive, bottom-up process was used for coding and thematization of the interview transcripts. Our results show that AFOs can enhance communication and collaboration depending on the units’ collaboration needs and prior geographical distribution. However, this effect was overshadowed by task-environment misalignments on within-team communication, distractions, and limitations on adjustments and recruitments. Additionally, managers faced conflicting loyalties between defending the organisation’s decision to implement AFOs while ensuring compliance with legal work environment requirements despite limited resources. There is a risk that the implementation of flexible offices will fragment and complicate managers’ tasks, such as ensuring that daily operations run smoothly, meeting legal responsibilities, and managing and recruiting staff. This poses a risk to managers’ productivity and health, and consequently, the achievement of organisational goals. The study uncovers managerial experiences, challenges, and coping strategies in AFOs, offering valuable insights for organisations considering this office type.
- New
- Research Article
- 10.1101/2025.11.13.688299
- Nov 14, 2025
- bioRxiv
- Qiran Jia + 2 more
Recent advances in high-throughput technologies have enabled observational studies to collect high-dimensional omic data. However, such data, often measured on small sample sizes, pose challenges to model-based clustering approaches such as Gaussian Mixture Models. Existing methods often fail to generalize due to model instability under complex mixture patterns. To overcome these limitations, we propose a natural-gradient variational inference framework for Gaussian mixture models named Praxis-BGM that incorporates informative priors—cluster-specific means, covariances, and structural connectivity—from large-scale reference data with known cluster or class labels to enable semi-supervised transfer learning. We derive natural-gradient updates that integrate prior knowledge, leveraging the Variational Online Newton algorithm. We also perform feature selection for clustering using Bayes Factors. Implemented using the JAX library for accelerator-oriented computation, Praxis-BGM is computationally efficient and scalable. We demonstrate the effectiveness of Praxis-BGM in extensive simulations and with two real-world applications: bulk transcriptomic datasets for breast cancer subtyping (the Cancer Genome Atlas Breast Invasive Carcinoma and the Molecular Taxonomy of Breast Cancer International Consortium), and transferring cell-type annotations between single-cell transcriptomic datasets produced by different single-cell RNA-seq technologies in a human pancreas study. Even when priors are partially mismatched with the target data, Praxis-BGM enhances semi-supervised clustering accuracy and biological interpretability.
- New
- Research Article
- 10.1080/01587919.2025.2578773
- Nov 12, 2025
- Distance Education
- Jirong Tian + 3 more
Online collaborative learning (OCL) is widely recognized for its potential to enhance students’ collaboration skills, problem-solving abilities, and critical thinking, yet learners often struggle to sustain deep cognitive engagement without adequate support. With the emergence of generative AI (GAI), new opportunities have arisen to scaffold collaboration, but little is known about how learner characteristics such as prior knowledge shape cognitive engagement in GAI-supported OCL contexts. This study investigated the frequency and co-occurrence patterns of cognitive engagement across groups with distinct prior knowledge distributions (i.e., high vs. low mean; high vs. low variation) in a GAI-enhanced OCL environment. Thirty-nine students from a key Chinese university participated in a GAI-supported collaborative writing task. Findings revealed that High–High groups demonstrated more knowledge exchange, High–Low groups engaged in deeper knowledge co-construction, Low–High groups prioritized planning and strategy formulation, while Low–Low groups focused more on process monitoring. These insights guide instructors in providing personalized cognitive support or designing tailored GAI agents to enhance cognitive engagement and learning outcomes.
- New
- Research Article
- 10.1371/journal.pone.0333784
- Nov 12, 2025
- PloS one
- Nguyen Ngoc Thach
The advanced ASEAN nations (Indonesia, Malaysia, the Philippines, Singapore, and Thailand) are navigating significant global uncertainties that challenge their industrialization ambitions. Human capital, recognized as a pivotal driver of technological progress, has not been adequately integrated into growth models for these countries. This study investigates the dual function of human capital within an extended Nelson-Phelps framework of technology diffusion, incorporating Romer's insights, across these five ASEAN countries from 1965 to 2019. Employing a Bayesian hierarchical analysis with specific informative priors effectively addresses statistical challenges. The findings reveal that human capital accelerates both domestic innovation and the adoption of foreign technologies in these nations. Notably, high-skilled labor significantly contributes to technological advancements, and domestic innovation plays a more substantial role in enhancing productivity growth than technology imitation. The extended Nelson-Phelps framework, which incorporates human capital's role in both innovation and technology diffusion, is well-suited for modeling the catch-up development of ASEAN economies. These insights offer valuable contributions to growth literature and practical applications in technology catch-up strategies.
- New
- Research Article
- 10.63363/aijfr.2025.v06i06.1950
- Nov 11, 2025
- Advanced International Journal for Research
- J Niveditha Dorcus
Introduction: A human milk bank, also known as a breast milk bank or lactarium, is a service that gathers, examines, prepares, and dispenses human milk donated by nursing mothers who are not biologically related to the recipient child. For the first year of life, breast milk is the best nutrition for newborns. The primary goal of the study is to evaluate postpartum mothers' attitudes and knowledge regarding human breast milk banking. Methodology: A descriptive, non-experimental study design was used. The study's objectives were to evaluate postpartum mothers' knowledge and attitudes regarding human milk banking and to determine the relationship between these factors and sociodemographic characteristics. Using a non-probability purposive sampling technique, 100 postpartum mothers were selected for the study. Result: The main findings revealed that 48 (48%) postpartum mothers had inadequate knowledge of human milk banking, 35 (35%) had average knowledge, and 17 (17%) had poor knowledge. A plurality (40%) of postpartum mothers exhibited a negative attitude. The level of knowledge about human milk banking is significantly correlated with demographic factors (postnatal mothers' age and religion). However, there was no discernible correlation between the level of knowledge of human milk banking and education, occupation, family type, or prior information. Education and postpartum mothers' attitudes toward human milk banking are significantly correlated. However, postpartum mothers' attitudes toward human milk banking did not significantly correlate with age, religion, occupation, or family type. Conclusion: This suggests that the majority of postpartum women had inadequate knowledge of, and unfavourable attitudes toward, human milk banking. Therefore, it is imperative that health professionals, particularly nurses, raise awareness of human milk banking and encourage mothers to continue breastfeeding.
- New
- Research Article
- 10.1080/15548732.2025.2585273
- Nov 8, 2025
- Journal of Public Child Welfare
- Nan Wang
ABSTRACT Nearly 400,000 children are in U.S. foster care, and about one-third have a disability, compared to 10% in the general population. Using data from two national databases (NYTD and AFCARS) (N = 5,914), this study applied Bayesian logistic regression with both non-informative and informative priors to examine whether disability moderates the relation between placement instability and high school graduation. Youth without disabilities were more likely to graduate when placement changes were infrequent, whereas those with emotional disabilities showed greater resilience to frequent moves. Findings underscore the moderating role of disability and the importance of promoting placement stability for educational success.
- New
- Research Article
- 10.1051/0004-6361/202555737
- Nov 7, 2025
- Astronomy & Astrophysics
- Xiaocheng Yang + 5 more
Reconstructing a high-resolution image of observed radio sources from the incomplete visibilities poses a challenging, ill-posed, inverse problem. Although compressive sensing has demonstrated remarkable performance in radio interferometric imaging, traditional compressed sensing methods approximately replace the L_0-norm minimisation problem with the L_1-norm minimisation problem, which brings about a bias issue. To ameliorate the bias problem and efficiently obtain an accurate solution in radio interferometry, we propose a novel, non-convex sparse regularisation method based on smoothly clipped absolute deviation (SCAD) in this paper. The proposed method utilises the continuous SCAD penalty function to approximate the L_0 norm and efficiently solves the non-convex optimisation problem by using an improved proximal gradient algorithm. The improved proximal gradient algorithm introduces a restart strategy and an adaptive non-monotonic step-size strategy to improve the convergence speed of the algorithm. Moreover, the regularisation parameter was adaptively updated using the prior information of the image. Numerical simulation experiments are carried out on the Very Large Array (VLA) and Square Kilometre Array (SKA). We compare the proposed method with state-of-the-art imaging methods. The results show that it performs better in terms of reconstruction quality and computational efficiency.
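The key ingredient of such a method, the proximal operator of the SCAD penalty used inside the proximal gradient iteration, can be sketched as follows. The closed form below is the standard SCAD thresholding rule (unit step size); the paper's restart and adaptive step-size strategies sit around this operator and are not shown:

```python
import numpy as np

def scad_prox(x, lam, a=3.7):
    """Proximal operator of the SCAD penalty (unit step size).

    Soft-thresholds small coefficients, shrinks mid-range ones less, and
    leaves large ones untouched, which is what removes the bias of plain
    L1 shrinkage. a=3.7 is the commonly used default value.
    """
    x = np.asarray(x, dtype=float)
    absx, sgn = np.abs(x), np.sign(x)
    small = sgn * np.maximum(absx - lam, 0.0)              # |x| <= 2*lam
    mid = ((a - 1.0) * x - sgn * a * lam) / (a - 2.0)      # 2*lam < |x| <= a*lam
    return np.where(absx <= 2.0 * lam, small,
                    np.where(absx <= a * lam, mid, x))
```

Note the bias-reduction property directly: a large coefficient (e.g. |x| > a*lam) passes through unchanged, whereas L1 soft-thresholding would still shrink it by lam.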
- New
- Research Article
- 10.1080/00401706.2025.2584500
- Nov 7, 2025
- Technometrics
- Trang Bui + 2 more
We consider the problem of designing an experiment in which experimental units are connected on a network. To find optimal designs for such experiments, the experimental outcomes are assumed to follow a network-outcome model in which units potentially influence one another. To model network interference and correlation, these outcome models are often complex. As a result, the design criteria based on such models depend on unknown parameters and cannot be directly evaluated without making assumptions about their values. We mitigate this problem by defining a Bayesian design criterion, which is the mean squared error of the average treatment effect estimator integrated over a prior distribution for the unknown parameters. In general, this criterion does not have a closed-form formula, and so traditional algorithms to solve for optimal designs cannot be applied. Instead, we propose and study the use of the genetic algorithm to find near-optimal designs. Through extensive numerical studies with various real-life networks and network-outcome models, we demonstrate the robust performance of our method compared to existing design construction strategies.
- New
- Research Article
- 10.1088/1361-6501/ae1857
- Nov 6, 2025
- Measurement Science and Technology
- Jiechen Sun + 4 more
Abstract The health indicator (HI) reflects the current operational status of the equipment and affects the accuracy and reliability of the remaining useful life (RUL) prediction model. However, the majority of existing methods for constructing HIs are developed based on complete full lifecycle data. In fact, it is difficult to collect complete full lifecycle data due to reasons such as differences in equipment service time and communication packet loss. Due to the lack of temporal continuity and incomplete coverage of degradation stages in fragmented data, traditional HI construction methods fail to capture consistent degradation trends and become ineffective under incomplete data conditions. We propose a novel assistant-prior variational autoencoder (AP-VAE) network to enhance latent degradation representation in the presence of temporally fragmented and incomplete lifecycle data. Unlike traditional VAE models, instead of pre-assuming a fixed prior distribution, the proposed method introduces an assistant neural network designed to learn the optimal prior distribution of the latent space from the data itself. The loss function of the proposed model is elaborately designed by integrating reconstruction error, Kullback–Leibler divergence between the posterior and adaptive prior distributions, and an improved maximum mean discrepancy regularization to ensure the latent variables capture consistent degradation tendencies under incomplete lifecycle data. The AP-VAE network is validated across multiple fragmented degradation scenarios, and the proposed HI construction method consistently outperforms existing approaches in trend preservation, correlation, and monotonicity. More importantly, by addressing the challenge of fragmented monitoring data in rolling bearings of marine electric propulsion systems, the proposed approach provides practical value for improving equipment health assessment, ensuring reliable RUL prediction in real industrial applications.
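The KL term in such a loss, between the encoder's posterior and a learned (rather than fixed standard-normal) prior, has a closed form when both are diagonal Gaussians. The diagonal-Gaussian assumption is this sketch's, and the full loss described above also includes reconstruction and MMD terms that are not shown:

```python
import numpy as np

def kl_diag_gaussians(mu_q, logvar_q, mu_p, logvar_p):
    """KL(q || p) between two diagonal Gaussians: q is the encoder
    posterior, p is the adaptive prior produced by the assistant network
    (in a standard VAE, mu_p = 0 and logvar_p = 0 would be fixed)."""
    var_q, var_p = np.exp(logvar_q), np.exp(logvar_p)
    return 0.5 * np.sum(
        logvar_p - logvar_q + (var_q + (mu_q - mu_p) ** 2) / var_p - 1.0
    )
```

The term vanishes exactly when posterior and prior coincide and grows as they diverge, so minimising it pulls the latent degradation representation toward the data-learned prior instead of an arbitrary fixed one.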
- New
- Research Article
- 10.5194/hess-29-6023-2025
- Nov 6, 2025
- Hydrology and Earth System Sciences
- Boting Hu + 6 more
Abstract. Accurate quantification of wetland depression water storage capacity (WDWSC) is imperative for comprehending the wetland hydrologic regulation functions to support integrated water resources management. Considering the challenges posed by the high acquisition cost of high-resolution lidar DEM or the absence of field measurements for most wetland areas, urgent attention is required to develop an accurate estimation framework for WDWSC using open-source, low-cost, multi-source remote sensing data. In response, we developed a novel framework, WetlandSCB, utilizing coarse-resolution terrain data for accurate estimation of WDWSC. This framework overcame several technical difficulties, including biases in above-water topography, incompleteness and inaccuracy of wetland depression identification, and the absence of bathymetry. Validation and application of the framework were conducted in two national nature reserves of northeast China. The study demonstrated that, by integrating the priority-flood algorithm, morphological operators, and prior information, one can accurately delineate the wetland depression distribution, with overall accuracy and kappa coefficient both exceeding 0.95. The use of a water occurrence map can effectively correct numerical biases in above-water topography, with Pearson coefficient and R2 increasing by 0.33 and 0.38, respectively. Coupling spatial prediction and modeling with remote sensing techniques yielded highly accurate bathymetry estimates, with < 3 % relative error compared with field measurements. Overall, the WetlandSCB framework achieved estimation of WDWSC with < 10 % relative error compared with field topographic and bathymetric measurements. The framework and its concept are transferable to other wetland areas globally where field measurements and/or high-resolution terrain data are unavailable, contributing to a major technical advancement in estimating WDWSC in river basins.
- New
- Research Article
- 10.1556/1886.2025.00054
- Nov 6, 2025
- European journal of microbiology & immunology
- Joy Backhaus + 5 more
The potential aetiological relevance of Blastocystis spp. and Dientamoeba fragilis in the human intestine, and their possible associations with Campylobacter spp. and Giardia duodenalis, remain unclear. By incorporating Bayesian priors to account for diagnostic test accuracy, we statistically analysed the interactions among these microorganisms. Diagnostic test accuracy data were derived from multiple PCR assays and incorporated as priors to adjust for non-differential misclassification. Bayesian odds ratios and relationships based on DNA quantity were assessed for a dataset of 1,065 stool samples containing at least one of the four target microorganisms. Accounting for diagnostic test accuracy resulted in wide credibility intervals. Blastocystis spp. were negatively associated with G. duodenalis. G. duodenalis was most often detected in the absence of Blastocystis spp. and D. fragilis, whereas detection of Blastocystis spp. was associated with lower Campylobacter spp. DNA abundance. A negative association between Blastocystis spp. and Campylobacter spp. was observed only in the absence of D. fragilis. The assumed variation in detection rates of Campylobacter spp. and G. duodenalis based on the presence of Blastocystis spp. and/or D. fragilis was confirmed. Future epidemiological studies should explore interactions among multiple microorganisms using robust statistical approaches.
- New
- Research Article
- 10.1088/1361-6501/ae185e
- Nov 6, 2025
- Measurement Science and Technology
- Keping Wang + 4 more
Abstract Non-homogeneous hazy images are challenging to dehaze because the spatially random distribution and variable concentration of haze are difficult to determine precisely. Therefore, dehazing methods should be adaptive to the spatial characteristics of the haze. This paper proposes a detail-enhanced dynamic masked autoencoder (MAE) for non-homogeneous hazy image dehazing to address this issue. This architecture integrates three branches: the dynamic MAE reconstruction branch (DARB), the detail-enhanced capture branch (DECB), and the haze-salient attention branch (HSAB). In DARB, a dynamic masking mechanism based on HSV prior knowledge is embedded into the autoencoder of MAE. The mechanism generates masks for the MAE adaptively based on the relationship between the difference of the V and S channels in HSV and the haze concentration. This adaptive masking enables the network to perform differentiated reconstruction for regions with varying haze concentrations. Leveraging frequency-domain decomposition, the DECB and HSAB are respectively designed to enhance high-frequency detail features and capture low-frequency global haze structures, thereby enabling the network to effectively restore both fine textures and overall scene structures. By integrating three complementary branches, the proposed framework generates masks based on HSV prior information and adaptively restores clear images. Experiments on non-homogeneous haze datasets show that the proposed method effectively restores images under challenging haze conditions. On the NH-Haze dataset, it achieves improvements of 0.22 dB in PSNR and 0.0044 in SSIM over existing methods, demonstrating superior performance in both quantitative metrics and subjective visual quality.
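The HSV prior driving the dynamic masking can be sketched as follows: haze raises brightness (V) while washing out saturation (S), so a large V − S difference flags denser haze. The min-max normalisation and the fixed threshold below are illustrative choices of this sketch, not the adaptive masking ratio learned by the network:

```python
import numpy as np

def haze_mask(hsv, threshold=0.5):
    """Binary mask of likely hazy regions from an HSV image.

    hsv: (H, W, 3) array with channels (H, S, V) scaled to [0, 1].
    Regions where brightness V exceeds saturation S by a large margin
    are flagged as haze-dense, guiding where reconstruction effort goes.
    """
    v = hsv[..., 2].astype(float)
    s = hsv[..., 1].astype(float)
    diff = v - s
    # normalise the V - S difference to [0, 1] before thresholding
    diff = (diff - diff.min()) / (diff.max() - diff.min() + 1e-8)
    return diff > threshold
```

A washed-out bright pixel (high V, low S) is masked as hazy, while a saturated darker pixel is not, which is the cue the dynamic masking mechanism feeds to the MAE.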