Published in the last 50 years
Articles published on Weak Consistency
- New
- Research Article
- 10.61942/jhk.v2i6.462
- Oct 17, 2025
- Jurnal Hukum dan Keadilan
- Mardiana Mardiana + 2 more
This study examines the existence of academic manuscripts in implementing the concept of meaningful public participation as mandated by Law No. 13 of 2022 concerning the Second Amendment to Law No. 12 of 2011 on the Establishment of Laws and Regulations. The main issue addressed is the weak consistency in attaching academic manuscripts to the legislative process, which often renders public participation merely formal and not substantive. The objective of this research is to analyze the urgency of academic manuscripts in ensuring justice, inclusivity, and transparency in law-making, while also evaluating their practical application in drafting Regional Regulations (Perda) in Bontang City during 2018–2024. The research applies a normative legal method (doctrinal research) using statutory, conceptual, and case approaches. The data analyzed consist of primary and secondary legal materials as well as empirical evidence from Bontang City’s e-archive. The findings reveal that academic manuscripts play a strategic role in strengthening public participation; however, in practice, they are often disregarded or prepared only as administrative formality without substantive study. The Bontang case study demonstrates the low consistency of academic manuscripts in the drafting of local regulations, leading to weak legitimacy and regulatory quality. This study concludes that the existence of academic manuscripts must be reinforced through stricter regulation and inclusive legislative practices. The recommendations emphasize the need for a stronger mechanism of public information disclosure, mandatory substantive participation, and institutional capacity building to ensure that every policy enacted is based on scientific analysis and genuinely reflects the aspirations of the people.
- New
- Research Article
- 10.1080/10485252.2025.2570162
- Oct 11, 2025
- Journal of Nonparametric Statistics
- Yi Wu + 2 more
This work investigates the nonparametric regression model for a class of random errors satisfying the Bernstein-type inequality. The weak consistency, strong consistency, complete consistency, as well as the rates of strong consistency and complete consistency for the integral weight kernel estimator are presented under fairly mild conditions. The results established in this paper markedly improve and extend the corresponding ones in the literature. Numerical studies are also provided to validate the theoretical outcomes.
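The weak consistency discussed above can be seen numerically: as the sample size grows and the bandwidth shrinks slowly, a kernel regression estimate converges in probability to the true regression function. A minimal sketch using a standard Nadaraya-Watson estimator, not the paper's integral weight kernel estimator; the sine regression function, noise level, and bandwidths are illustrative assumptions:

```python
import math
import random

def nw_estimate(x, xs, ys, h):
    """Nadaraya-Watson estimate at x with a Gaussian kernel of bandwidth h."""
    w = [math.exp(-0.5 * ((x - xi) / h) ** 2) for xi in xs]
    return sum(wi * yi for wi, yi in zip(w, ys)) / sum(w)

def mean_abs_error(n, h, seed=0):
    """Average |m_hat - m| on an interior grid for m(x) = sin(2*pi*x)."""
    rng = random.Random(seed)
    xs = [rng.random() for _ in range(n)]
    ys = [math.sin(2 * math.pi * x) + rng.gauss(0, 0.2) for x in xs]
    grid = [0.05 + 0.1 * i for i in range(9)]  # stay away from the boundary
    return sum(abs(nw_estimate(g, xs, ys, h) - math.sin(2 * math.pi * g))
               for g in grid) / len(grid)

# Consistency in action: the error shrinks as n grows and h -> 0 slowly.
small_n = mean_abs_error(n=50, h=0.15)
large_n = mean_abs_error(n=5000, h=0.05)
print(small_n, large_n)
```

The error at n = 5000 is markedly below the error at n = 50, which is the qualitative content of a weak consistency statement.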
- Research Article
- 10.1080/00949655.2025.2566415
- Sep 30, 2025
- Journal of Statistical Computation and Simulation
- Shuli Wu + 2 more
A new estimator for the tail index of a heavy-tailed distribution is proposed by combining the block maxima and peaks-over-threshold methods, tailored for block data where only a few large values are observed within each block. The weak consistency of the estimator is established, and its asymptotic expansion as well as asymptotic normality are derived under the second order regular variation condition. A small simulation study and a case study are conducted to evaluate the performance of the proposed estimator and compare the new estimator with closely related estimators, indicating that our new estimator may have better performance in some cases, in terms of the estimated mean bias and mean squared error.
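For context, the classical Hill estimator is the standard tail index estimator that block-maxima/peaks-over-threshold hybrids such as the one above refine. A minimal sketch on simulated Pareto data; the sample size, choice of k, and the exact-Pareto model are illustrative assumptions, not the paper's block-data setting:

```python
import math
import random

def hill_estimator(sample, k):
    """Classical Hill estimator of the tail index from the k upper order statistics."""
    xs = sorted(sample, reverse=True)
    logs = [math.log(x) for x in xs[:k + 1]]
    gamma = sum(logs[i] - logs[k] for i in range(k)) / k  # estimates 1/alpha
    return 1.0 / gamma

rng = random.Random(42)
alpha = 2.0
# Pareto(alpha) sample via inverse transform: X = U**(-1/alpha).
sample = [rng.random() ** (-1.0 / alpha) for _ in range(20000)]
est = hill_estimator(sample, k=500)
print(est)  # close to the true tail index 2 for exact Pareto tails
```

Weak consistency here means the estimate concentrates around the true index as both the sample size and k grow with k/n going to 0.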
- Research Article
- 10.62872/ndvtwj23
- Sep 17, 2025
- Ipso Jure
- Mardiana Mardiana + 2 more
This study examines the existence of academic manuscripts in implementing the concept of meaningful public participation as mandated by Law No. 13 of 2022 concerning the Second Amendment to Law No. 12 of 2011 on the Establishment of Laws and Regulations. The main issue addressed is the weak consistency in attaching academic manuscripts to the legislative process, which often renders public participation merely formal and not substantive. The objective of this research is to analyze the urgency of academic manuscripts in ensuring justice, inclusivity, and transparency in law-making, while also evaluating their practical application in drafting Regional Regulations (Perda) in Bontang City during 2018–2024. The research applies a normative legal method (doctrinal research) using statutory, conceptual, and case approaches. The data analyzed consist of primary and secondary legal materials as well as empirical evidence from Bontang City’s e-archive. The findings reveal that academic manuscripts play a strategic role in strengthening public participation; however, in practice, they are often disregarded or prepared only as administrative formality without substantive study. The Bontang case study demonstrates the low consistency of academic manuscripts in the drafting of local regulations, leading to weak legitimacy and regulatory quality. This study concludes that the existence of academic manuscripts must be reinforced through stricter regulation and inclusive legislative practices. The recommendations emphasize the need for a stronger mechanism of public information disclosure, mandatory substantive participation, and institutional capacity building to ensure that every policy enacted is based on scientific analysis and genuinely reflects the aspirations of the people.
- Research Article
- 10.1016/j.eswa.2025.129578
- Sep 1, 2025
- Expert Systems with Applications
- Hua Wang + 3 more
UniTask+: Exploring and Unifying Strong and Weak Task-aware Consistency for Semi-supervised Blastocyst Image Segmentation
- Research Article
- 10.1016/j.vaccine.2025.127450
- Aug 1, 2025
- Vaccine
- Zachary D V Abel + 3 more
Accuracy of online surveys in predicting COVID-19 uptake and demand: A cohort study investigating vaccine sentiments and switching in 13 countries from 2020 to 2022.
- Research Article
- 10.1038/s41598-025-11551-4
- Jul 15, 2025
- Scientific Reports
- Chenye Zhang
This study focuses on the intelligent rendering task of "mirror painting", aiming to develop a high-quality, real-time rendering method applicable to digital art, human-computer interaction, and game art scenarios. The core challenge lies in achieving high-fidelity local texture modeling and global style consistency under limited computational resources. Existing methods such as StyleSwin, Style-Aware Network (SANet), and Stable Diffusion exhibit weak structural consistency, severe style drift, or high computational costs, making it difficult to simultaneously balance speed and image quality. To address these challenges, this study proposes an intelligent rendering method called the Mamba-Swin-Low-Rank Adaptation (LoRA) rendering framework. The method integrates three key technologies. First, local features are extracted based on the Mamba state space model to effectively preserve brushstrokes and edge details. Second, Swin Transformer for global feature modeling is introduced to reduce complexity through sliding window attention mechanisms. Finally, style transfer optimization is carried out in combination with LoRA, maintaining style consistency while updating only a small number of parameters. Experimental results demonstrate that the method achieves 34.2 dB in Peak Signal-to-Noise Ratio, 0.91 in Structural Similarity Index Measure, and 0.18 in Learned Perceptual Image Patch Similarity for image quality assessment. The proposed model outperforms current mainstream methods in image quality, computational efficiency, and interactive experience, showing strong potential for practical applications.
- Research Article
- 10.1186/s12874-025-02606-1
- Jun 7, 2025
- BMC Medical Research Methodology
- Ariel Wang + 8 more
Background: Whilst interest in efficient trial design has grown with the use of electronic health records (EHRs) to collect trial outcomes, practical challenges remain. Commonly raised concerns often revolve around data availability, data quality and issues with data validation. This study aimed to assess the agreement between data collected on clinical trial participants from different sources to provide empirical evidence on the utility of EHRs for follow-up in randomised controlled trials (RCTs). Methods: This retrospective, participant-level data utility comparison study was undertaken using data collected as part of a UK primary care-based, randomised controlled trial (OPTiMISE). The primary outcome measure was the recording of all-cause hospitalisation or mortality within 3 years post-randomisation and was assessed across (1) coded primary care data; (2) coded-plus-free-text primary care data; and (3) coded secondary care and mortality data. Agreement levels across data sources were assessed using Fleiss’ Kappa (K). Kappa statistics were interpreted using an established framework, categorising agreement strength as follows: <0 (poor), 0.00–0.20 (slight), 0.21–0.40 (fair), 0.41–0.60 (moderate), 0.61–0.80 (substantial), and 0.81–1.00 (almost perfect) agreement. The impact of using different data sources to determine trial outcomes was assessed by replicating the trial’s original analyses. Results: Almost perfect agreement was observed for the mortality outcome across the three data sources (K = 0.94, 95%CI 0.91–0.98). Fair agreement (weak consistency) was observed for hospitalisation outcomes, including all-cause hospitalisation or mortality (K = 0.35, 95%CI 0.28–0.42), emergency hospitalisation (K = 0.39, 95%CI 0.33–0.46), and hospitalisation or mortality due to cardiovascular disease (K = 0.32, 95%CI 0.19–0.45).
The overall trial results remained consistent across data sources for the primary outcome, albeit with varying precision. Conclusion: Significant discrepancies according to data sources were observed in the recording of secondary care outcomes. Investigators should be cautious when choosing which data source(s) to use to measure outcomes in trials. Future work on linking participant-level data across healthcare settings should consider the variations in diagnostic coding practices. Standardised definitions for outcome measures when using healthcare systems data and using data from different data sources for cross-checking and verification should be encouraged.
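Fleiss' Kappa, used in the study above to grade agreement between data sources, can be computed directly from a subjects-by-categories count matrix. A minimal sketch with invented toy data, where three "raters" stand in for the three data sources:

```python
def fleiss_kappa(counts):
    """Fleiss' kappa for a subjects-by-categories matrix of rating counts.
    counts[i][j] = number of raters assigning subject i to category j."""
    n_subjects = len(counts)
    n_raters = sum(counts[0])
    # Observed agreement: per-subject pairwise agreement, averaged.
    p_bar = sum(
        (sum(c * c for c in row) - n_raters) / (n_raters * (n_raters - 1))
        for row in counts
    ) / n_subjects
    # Chance agreement from marginal category proportions.
    totals = [sum(row[j] for row in counts) for j in range(len(counts[0]))]
    p_e = sum((t / (n_subjects * n_raters)) ** 2 for t in totals)
    return (p_bar - p_e) / (1 - p_e)

# Hypothetical example: 3 data sources classifying 5 participants as event/no-event.
ratings = [
    [3, 0],  # all three sources agree: event
    [0, 3],
    [2, 1],  # partial agreement, as with the hospitalisation outcomes
    [3, 0],
    [1, 2],
]
kappa = fleiss_kappa(ratings)
print(round(kappa, 2))  # ~0.44, "moderate" on the scale quoted above
```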
- Research Article
- 10.3390/app15095107
- May 4, 2025
- Applied Sciences
- Xunci Li + 4 more
Cell nuclei instance segmentation plays a critical role in pathological image analysis. In recent years, fully supervised methods for cell nuclei instance segmentation have achieved significant results. However, in practical medical image processing, annotating dense cell nuclei at the instance level is often costly and time-consuming, making it challenging to acquire large-scale labeled datasets. This challenge has motivated researchers to explore ways to further enhance segmentation performance under limited labeling conditions. To address this issue, this paper proposes a network based on category-adaptive sampling and attention mechanisms for semi-supervised nuclei instance segmentation. Specifically, we design a category-adaptive sampling method that forces the model to focus on rare categories and dynamically adapt to different data distributions. By dynamically adjusting the sampling strategy, the balance of samples across different cell types is improved. Additionally, we propose a strong–weak contrast consistency method that significantly expands the perturbation space. Strong perturbations enhance the model’s ability to discriminate key nuclei features, while weak perturbations improve its robustness against noise and interference. Furthermore, we introduce a region-adaptive attention mechanism that dynamically assigns higher weights to key regions, guiding the model to prioritize learning discriminative features in challenging areas such as blurred or ambiguous cell boundaries. This improves the morphological accuracy of the segmentation masks. Our method effectively leverages the potential information in unlabeled data, thereby reducing reliance on large-scale, high-quality labeled datasets. Experimental results on public datasets demonstrate the effectiveness of our approach.
- Research Article
- 10.1609/aaai.v39i5.32573
- Apr 11, 2025
- Proceedings of the AAAI Conference on Artificial Intelligence
- Delong Liu + 5 more
Recently, diffusion-based video generation models have achieved significant success. However, existing models often suffer from issues like weak consistency and declining image quality over time. To overcome these challenges, inspired by aesthetic principles, we propose a non-invasive plug-in called Uniform Frame Organizer (UFO), which is compatible with any diffusion-based video generation model. The UFO comprises a series of adaptive adapters with adjustable intensities, which can significantly enhance the consistency between the foreground and background of videos and improve image quality without altering the original model parameters when integrated. The training for UFO is simple, efficient, requires minimal resources, and supports stylized training. Its modular design allows for the combination of multiple UFOs, enabling the customization of personalized video generation models. Furthermore, the UFO also supports direct transferability across different models of the same specification without the need for specific retraining. The experimental results indicate that UFO effectively enhances video generation quality and demonstrates its superiority in public video generation benchmarks.
- Research Article
- 10.1016/j.ijbiomac.2024.138762
- Feb 1, 2025
- International Journal of Biological Macromolecules
- Xiaojie Qian + 4 more
Understanding the dual impact of oat protein on the structure and digestion of oat starch at pre- and post-retrogradation stages.
- Research Article
- 10.3390/rs17020276
- Jan 14, 2025
- Remote Sensing
- Zhipeng Lv + 1 more
Global Navigation Satellite System/Acoustic (GNSS/A) underwater positioning technology is attracting more and more attention as an important technology for building the marine Positioning, Navigation, and Timing (PNT) system. The random error of the tracking point coordinate is also an important error source that affects the accuracy of GNSS/A underwater positioning. When considering its effect on the mathematical model of GNSS/A underwater positioning, the Total Least-Squares (TLS) estimator can be used to obtain the optimal position estimate of the seafloor transponder, with weak consistency and asymptotic unbiasedness. However, the tracking point coordinates and acoustic ranging observations are inevitably contaminated by outliers because of human mistakes, failure of malfunctioning instruments, and unfavorable environmental conditions. A robust alternative needs to be introduced to suppress the adverse effect of outliers. The conventional Robust TLS (RTLS) strategy is to adopt the selection weight iteration method based on each single prediction residual. Please note that the validity of robust estimation depends on a good agreement between residuals and true errors. Unlike the Least-Squares (LS) estimation, the TLS estimation is unsuitable for residual prediction. In this contribution, we propose an effective RTLS_Eqn estimator based on “total residuals” or “equation residuals” for GNSS/A underwater positioning. This proposed robust alternative holds its robustness in both observation and structure spaces. To evaluate the statistical performance of the proposed RTLS estimator for GNSS/A underwater positioning, Monte Carlo simulation experiments are performed with different depth and error configurations under the emulational marine environment. Several statistical indicators and the average iteration time are calculated for data analysis. 
The experimental results show that the Root Mean Square Error (RMSE) values of the RTLS_Eqn estimator are averagely improved by 12.22% and 10.27%, compared to the existing RTLS estimation method in a shallow sea of 150 m and a deep sea of 3000 m for abnormal error situations, respectively. The proposed RTLS estimator is superior to the existing RTLS estimation method for GNSS/A underwater positioning.
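The LS-versus-TLS distinction underlying the abstract above can be illustrated with a simple errors-in-variables line fit: ordinary least squares is attenuated when the regressor itself is noisy, while total least squares, here in its closed-form orthogonal-regression case, is consistent. A sketch under assumed equal noise levels in both coordinates, unrelated to the paper's GNSS/A model:

```python
import math
import random

def ls_slope(xs, ys):
    """Ordinary least-squares slope (errors assumed to be in y only)."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sxx = sum((x - mx) ** 2 for x in xs)
    return sxy / sxx

def tls_slope(xs, ys):
    """Total least-squares (orthogonal regression) slope: errors in both variables."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sxx = sum((x - mx) ** 2 for x in xs)
    syy = sum((y - my) ** 2 for y in ys)
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    return (syy - sxx + math.sqrt((syy - sxx) ** 2 + 4 * sxy ** 2)) / (2 * sxy)

rng = random.Random(7)
true_slope = 2.0
x_true = [rng.uniform(0, 10) for _ in range(2000)]
xs = [x + rng.gauss(0, 0.5) for x in x_true]               # noisy "coordinates"
ys = [true_slope * x + rng.gauss(0, 0.5) for x in x_true]  # noisy "observations"
ls, tls = ls_slope(xs, ys), tls_slope(xs, ys)
print(ls, tls)  # LS is attenuated below 2; TLS is close to 2
```

The same logic motivates using TLS when tracking-point coordinates, not just ranging observations, carry random error.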
- Research Article
- 10.1145/3704906
- Jan 7, 2025
- Proceedings of the ACM on Programming Languages
- Pavel Golovin + 2 more
Concurrent libraries implement standard data structures, such as stacks and queues, in a thread-safe manner, typically providing an atomic interface to the data structure. They serve as building blocks for concurrent programs, and incorporate advanced synchronization mechanisms to achieve good performance. In this paper, we are concerned with the problem of verifying correctness of such libraries under weak memory consistency in a fully automated fashion. To this end, we develop a model checker, RELINCHE, that verifies atomicity and functional correctness of a concurrent library implementation in any client program that invokes the library methods up to some bounded number of times. Our tool establishes refinement between the concurrent library implementation and its atomic specification in a fully parallel client, which it then strengthens to capture all possible other more constrained clients of the library. RELINCHE scales sufficiently to verify correctness of standard concurrent library benchmarks for all client programs with up to 7--9 library method invocations, and finds minimal counterexamples with 4--7 method calls of non-trivial linearizability bugs due to weak memory consistency.
- Research Article
- 10.1017/fms.2025.18
- Jan 1, 2025
- Forum of Mathematics, Sigma
- Jonah Berggren + 1 more
A dimer model is a quiver with faces embedded in a surface. We define and investigate notions of consistency for dimer models on general surfaces with boundary which restrict to well-studied consistency conditions in the disk and torus case. We define weak consistency in terms of the associated dimer algebra and show that it is equivalent to the absence of bad configurations on the strand diagram. In the disk and torus case, weakly consistent models are nondegenerate, meaning that every arrow is contained in a perfect matching; this is not true for general surfaces. Strong consistency is defined to require weak consistency as well as nondegeneracy. We prove that the completed as well as the noncompleted dimer algebra of a strongly consistent dimer model are bimodule internally 3-Calabi-Yau with respect to their boundary idempotents. As a consequence, the Gorenstein-projective module category of the completed boundary algebra of suitable dimer models categorifies the cluster algebra given by their underlying quiver. We provide additional consequences of weak and strong consistency, including that one may reduce a strongly consistent dimer model by removing digons and that consistency behaves well under taking dimer submodels.
- Research Article
- 10.1109/tpami.2024.3442811
- Dec 1, 2024
- IEEE Transactions on Pattern Analysis and Machine Intelligence
- Yongcheng Zong + 4 more
Brain network analysis plays an increasingly important role in studying brain function and the exploration of disease mechanisms. However, existing brain network construction tools have some limitations, including dependency on empirical users, weak consistency in repeated experiments and time-consuming processes. In this work, a diffusion-based brain network pipeline, DGCL, is designed for end-to-end construction of brain networks. Initially, the brain region-aware module (BRAM) precisely determines the spatial locations of brain regions by the diffusion process, avoiding subjective parameter selection. Subsequently, DGCL employs graph contrastive learning to optimize brain connections by eliminating individual differences in redundant connections unrelated to diseases, thereby enhancing the consistency of brain networks within the same group. Finally, the node-graph contrastive loss and classification loss jointly constrain the learning process of the model to obtain the reconstructed brain network, which is then used to analyze important brain connections. Validation on two datasets, ADNI and ABIDE, demonstrates that DGCL surpasses traditional methods and other deep learning models in predicting disease development stages. Significantly, the proposed model improves the efficiency and generalization of brain network construction. In summary, the proposed DGCL can serve as a universal brain network construction scheme, which can effectively identify important brain connections through generative paradigms and has the potential to provide disease interpretability support for neuroscience research.
- Research Article
- 10.1016/j.segan.2024.101573
- Nov 28, 2024
- Sustainable Energy, Grids and Networks
- Jing Ouyang + 5 more
A K-means cluster division of regional photovoltaic power stations considering the consistency of photovoltaic output
- Research Article
- 10.59581/konstanta.v2i4.4245
- Nov 23, 2024
- Konstanta : Jurnal Matematika dan Ilmu Pengetahuan Alam
- Yahyu Tanaiyo + 5 more
This quantitative study aims to determine the effect of the Process Oriented Guided Inquiry Learning (POGIL) learning strategy on students' science process skills concerning reaction rate material. The study employs a quasi-experimental design with a pre-posttest control group. The instrument used is a multiple-choice test to measure students' science process skills. The sample consists of all students in the XI IPA class at SMA Negeri 5 Gorontalo, totaling 45 students: 22 in the experimental group and 23 in the control group. Data analysis utilizes the Rasch Model and SPSS 16.0 for hypothesis testing using the Wilcoxon test. The analysis results show that the person reliability value is 0.00, indicating weak consistency in answering the questions, while the item reliability value is 0.71, suggesting that the quality of the test items is quite good. The mean difference in item scores from pre-test to post-test is greater in the experimental group (3.83 logit) than in the control group (1.59 logit). Wilcoxon test results show significance values below 0.05, with both the experimental and control groups at 0.00.
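The Wilcoxon test used above compares paired pre/post scores without assuming normality. A minimal sketch of the signed-rank statistic with its normal approximation; the pre/post scores below are hypothetical, not the study's data:

```python
import math

def wilcoxon_signed_rank(pre, post):
    """Wilcoxon signed-rank test, normal approximation (zero differences dropped,
    tied absolute differences given their average rank)."""
    diffs = [b - a for a, b in zip(pre, post) if b != a]
    n = len(diffs)
    by_abs = sorted(abs(d) for d in diffs)
    # Average rank for each distinct absolute difference.
    ranks, i = {}, 0
    while i < n:
        j = i
        while j < n and by_abs[j] == by_abs[i]:
            j += 1
        ranks[by_abs[i]] = (i + 1 + j) / 2  # mean of 1-based ranks i+1..j
        i = j
    w_plus = sum(ranks[abs(d)] for d in diffs if d > 0)
    mu = n * (n + 1) / 4
    sigma = math.sqrt(n * (n + 1) * (2 * n + 1) / 24)
    z = (w_plus - mu) / sigma
    p = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))  # two-sided
    return w_plus, p

# Hypothetical pre/post science-process-skill scores for ten students.
pre = [4, 5, 3, 6, 2, 5, 4, 3, 5, 4]
post = [7, 8, 5, 9, 6, 7, 6, 4, 9, 7]
w, p = wilcoxon_signed_rank(pre, post)
print(w, round(p, 4))  # every student improved, so p is well below 0.05
```

Statistical packages such as SPSS additionally apply exact or continuity-corrected variants for small samples; the approximation above is only a sketch of the idea.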
- Research Article
- 10.1080/07380577.2024.2429134
- Nov 12, 2024
- Occupational Therapy In Health Care
- Martin Karaba Bäckström + 3 more
The Sensory Processing Measure, second edition (SPM-2), is an American assessment guiding person-centered sensory processing interventions, but it lacks Swedish adaptation for occupational therapists. In eight phases, this study translated the SPM-2 into Swedish and assessed its face validity and preliminary psychometric properties, including internal consistency and inter-scale/item correlations. The findings suggest the Swedish SPM-2 is valid and reliable, although the Social Participation scale showed weak internal consistency. A larger normative study (N = 130) is needed before clinical use.
- Research Article
- 10.1017/s0963548324000166
- Oct 9, 2024
- Combinatorics, Probability and Computing
- Ioana Dumitriu + 2 more
We consider the community detection problem in sparse random hypergraphs under the non-uniform hypergraph stochastic block model (HSBM), a general model of random networks with community structure and higher-order interactions. When the random hypergraph has bounded expected degrees, we provide a spectral algorithm that outputs a partition with at least a $\gamma$ fraction of the vertices classified correctly, where $\gamma \in (0.5,1)$ depends on the signal-to-noise ratio (SNR) of the model. When the SNR grows slowly as the number of vertices goes to infinity, our algorithm achieves weak consistency, which improves the previous results in Ghoshdastidar and Dukkipati ((2017) Ann. Stat. 45(1), 289–315) for non-uniform HSBMs. Our spectral algorithm consists of three major steps: (1) Hyperedge selection: select hyperedges of certain sizes to provide the maximal signal-to-noise ratio for the induced sub-hypergraph; (2) Spectral partition: construct a regularised adjacency matrix and obtain an approximate partition based on singular vectors; (3) Correction and merging: incorporate the hyperedge information from adjacency tensors to upgrade the error rate guarantee. The theoretical analysis of our algorithm relies on the concentration and regularisation of the adjacency matrix for sparse non-uniform random hypergraphs, which can be of independent interest.
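Step (2) of the algorithm above, partitioning by the signs of a spectral vector, can be illustrated on an ordinary toy graph rather than a hypergraph. The two-block adjacency matrix below is an invented example, and numpy is assumed to be available:

```python
import numpy as np

# Toy two-block graph: complete within each block, one noisy cross-block edge.
A = np.zeros((8, 8))
for block in (range(0, 4), range(4, 8)):
    for i in block:
        for j in block:
            if i != j:
                A[i, j] = 1.0
A[0, 4] = A[4, 0] = 1.0

# Spectral partition: signs of the eigenvector of the second-largest eigenvalue.
eigvals, eigvecs = np.linalg.eigh(A)  # eigenvalues in ascending order
v = eigvecs[:, -2]
labels = [int(x > 0) for x in v]
print(labels)  # the two blocks receive different labels
```

Weak consistency in this setting means the fraction of mislabelled vertices vanishes as the graph grows; on this tiny example the recovery happens to be exact.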
- Research Article
- 10.1080/00949655.2024.2411004
- Oct 9, 2024
- Journal of Statistical Computation and Simulation
- Yan Wang + 2 more
This article focuses on the generalized edge frequency polygon. By taking weighted averages of the heights in the neighbouring 2k (k ≥ 1) bins, the asymptotic mean integrated squared error can be made smaller than that of the frequency polygon estimator. As a density estimator based on the histogram technique, the generalized edge frequency polygon has the advantage of computational simplicity and has been widely used in many fields. The purpose of this article is to investigate the weak consistency, the uniform weak consistency and the rate of uniform weak consistency of the generalized edge frequency polygon for a density function under α-mixing samples, which improve and extend those of the frequency polygon in the literature. In particular, a simulation study and real data analysis based on finite samples are performed to verify the validity of the theoretical results.
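The simplest member of this family can be sketched directly: an edge frequency polygon with k = 1 estimates the density at each interior bin edge by averaging the two adjacent histogram heights. The Uniform(0,1) sample and bin count below are illustrative assumptions:

```python
import random

def edge_frequency_polygon(sample, a, b, n_bins):
    """Edge frequency polygon with k = 1: the density estimate at each interior
    bin edge is the average of the two neighbouring histogram bin heights."""
    h = (b - a) / n_bins
    counts = [0] * n_bins
    for x in sample:
        if a <= x < b:
            counts[min(int((x - a) / h), n_bins - 1)] += 1
    heights = [c / (len(sample) * h) for c in counts]
    edges = [a + i * h for i in range(1, n_bins)]
    estimates = [(heights[i - 1] + heights[i]) / 2 for i in range(1, n_bins)]
    return edges, estimates

rng = random.Random(1)
sample = [rng.random() for _ in range(50000)]  # Uniform(0,1): true density is 1
edges, est = edge_frequency_polygon(sample, 0.0, 1.0, 20)
print(max(abs(e - 1.0) for e in est))  # small for large n: consistency at work
```

Taking weighted averages over 2k neighbouring bins, as in the article, generalizes the two-bin average used here.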