Published in the last 50 years
Articles published on Probability Density Function
- New
- Research Article
- 10.1038/s41598-025-26965-3
- Nov 7, 2025
- Scientific Reports
- Ahlem Ghouar + 8 more
This research introduces a new two-parameter distribution (NTPD). The probability density function of the NTPD exhibits distinctive characteristics, making it a valuable tool for modelling various real-world phenomena. Several key statistical properties of the distribution, including the mode, reliability function, hazard function, moments, moment generating function, Rényi entropy, fuzzy reliability, stochastic ordering, and quantile function, are derived explicitly. The model parameters are estimated using sixteen classical methods, and their performance is assessed through a comprehensive simulation study. Applications to datasets with complex behaviors, such as high skewness and peakedness, from environmental science, biomedical science, and reliability engineering demonstrate that the NTPD provides a superior fit compared to several established distributions, including the Lindley, XLindley, modified Lindley, sine Lindley, and exponentiated inverse Rayleigh distributions.
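The NTPD's density is not reproduced in the abstract, so only the comparison workflow can be sketched, using one of the named baselines. Below is a minimal, hypothetical example of the kind of fit being compared: maximum-likelihood estimation of the one-parameter Lindley distribution, whose density is f(x; θ) = θ²/(θ+1)·(1+x)·e^(−θx), with AIC as the comparison criterion.

```python
import numpy as np
from scipy.optimize import minimize_scalar

def lindley_logpdf(x, theta):
    # Lindley density: f(x) = theta^2 / (theta + 1) * (1 + x) * exp(-theta * x)
    return 2 * np.log(theta) - np.log(theta + 1) + np.log1p(x) - theta * x

def fit_lindley(data):
    # One-parameter MLE via bounded scalar optimization; returns (theta, AIC)
    nll = lambda t: -lindley_logpdf(data, t).sum()
    res = minimize_scalar(nll, bounds=(1e-6, 100.0), method="bounded")
    return res.x, 2.0 + 2.0 * res.fun

# Simulate Lindley data via its mixture form: Exp(theta) with probability
# theta/(theta+1), otherwise Gamma(2, 1/theta)
rng = np.random.default_rng(0)
theta_true, n = 1.5, 5000
is_exp = rng.random(n) < theta_true / (theta_true + 1)
data = np.where(is_exp, rng.exponential(1 / theta_true, n),
                rng.gamma(2.0, 1 / theta_true, n))
theta_hat, aic = fit_lindley(data)
print(round(theta_hat, 2), round(aic, 1))
```

The same loop over candidate densities (XLindley, modified Lindley, and so on) and a comparison of AIC values is the standard way such "superior fit" claims are checked.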
- New
- Research Article
- 10.3389/fams.2025.1660916
- Nov 6, 2025
- Frontiers in Applied Mathematics and Statistics
- Shamsul Rijal Muhammad Sabri + 1 more
Modeling income distributions is crucial for understanding inequality and providing evidence-based policy support. A key challenge, however, lies in evaluating the extent to which household income inflates over time. While income is inherently random, it exhibits a persistent upward trend, and fitting income distributions using conventional models often leads to inconsistent parameter estimates. This highlights the necessity of explicitly incorporating inflation-adjusted scaling to preserve proper statistical properties. To address this gap, we introduce the Scale-Inflated Gamma (SIG) distribution, which extends the standard Gamma distribution by including an inflation-adjusted scale parameter (δ), thereby providing greater flexibility in capturing heterogeneous income dynamics. Standard models such as the Lognormal, Pareto, or Generalized Beta of the Second Kind (GB2) systematically underestimate upper-tail incomes and fail to capture inflation-adjusted heterogeneity across subgroups (B40, M40, T20). The SIG model, in contrast, strikes a balance between parsimony and flexibility by directly adjusting for inflationary scale shifts. For instance, while the Gamma distribution underestimates the 95th percentile by 10%–12% in 2019, the SIG model reduces this bias to approximately 3%, accurately reflecting income dynamics across B40, M40, and T20 groups. We develop the theoretical foundations of the SIG distribution by deriving its probability density function (PDF), cumulative distribution function (CDF), and moments. Parameters are initially estimated using the method of moments and then refined through maximum likelihood estimation (MLE). To assess estimator precision, we derive the Fisher information matrix, using the inverse Hessian to approximate the variance–covariance matrix, thus ensuring reliable inference. A Monte Carlo simulation study is conducted to evaluate the consistency and efficiency of the estimators under various sample sizes. 
The SIG model is subsequently applied to Malaysian Household Income Survey (HIS) data spanning the period from 2007 to 2022. Results demonstrate that the SIG distribution offers a superior fit for modeling income inequality and upper-tail behavior compared to conventional models. Overall, the study establishes the SIG distribution as a theoretically robust and policy-relevant framework for analyzing income patterns in inflation-sensitive and structurally diverse economies.
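The abstract does not give the SIG density in closed form; the sketch below assumes a plausible parameterization in which the Gamma scale is inflated geometrically across survey years, scale_t = s·(1+δ)^t, and illustrates the described two-stage estimation (method-of-moments start, MLE refinement).

```python
import numpy as np
from scipy import stats
from scipy.optimize import minimize

# Hypothetical SIG parameterization (the exact form is not given in the
# abstract): X_t ~ Gamma(shape=k, scale=s * (1 + delta)**t), with t the
# survey-year index and delta the inflation-adjustment parameter.
rng = np.random.default_rng(1)
k_true, s_true, delta_true = 2.0, 1000.0, 0.04
years = np.repeat(np.arange(16), 2000)        # 16 survey years, 2000 obs each
income = rng.gamma(k_true, s_true * (1 + delta_true) ** years)

def neg_loglik(params):
    k, s, delta = params
    if k <= 0 or s <= 0 or delta <= -1:
        return np.inf
    return -stats.gamma.logpdf(income, a=k, scale=s * (1 + delta) ** years).sum()

# Method-of-moments start (pooled, ignoring inflation), then MLE refinement
m, v = income.mean(), income.var()
fit = minimize(neg_loglik, x0=[m**2 / v, v / m, 0.0], method="Nelder-Mead")
k_hat, s_hat, delta_hat = fit.x
print(round(k_hat, 2), round(delta_hat, 3))
```

In the paper's workflow the inverse Hessian of this negative log-likelihood would then approximate the variance-covariance matrix of the estimates.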
- New
- Research Article
- 10.1088/1741-4326/ae1c50
- Nov 6, 2025
- Nuclear Fusion
- Rongjie Hong + 15 more
The density limit is investigated in DIII-D negative triangularity (NT) plasmas, which lack a standard H-mode edge. We find that the limit may not be a singular disruptive boundary but a multifaceted density saturation phenomenon governed by distinct core and edge transport mechanisms. Sustained, non-disruptive operation is achieved at densities up to 1.8 times the Greenwald limit ($n_\mathrm{G}$) until the termination of auxiliary heating. Systematic power scans reveal distinct power scalings for the core ($n_e \propto P_\mathrm{SOL}^{0.27\pm0.03}$) and edge ($n_e \propto P_\mathrm{SOL}^{0.42\pm0.04}$) density limits. The edge density saturation is triggered abruptly by the onset of a non-disruptive, high-field-side radiative instability that clamps the edge density below $n_\mathrm{G}$. In contrast, the core density continues to rise until it saturates, a state characterized by substantially enhanced core turbulence. Core transport evolves from a diffusive to an intermittent, avalanche-like state, as indicated by heavy-tailed probability density functions (kurtosis $\approx 6$), elevated Hurst exponents, and a $1/f$-type power spectrum. These findings suggest that the density limit in the low-confinement regime is determined by a combination of edge radiative instabilities and core turbulent transport. This distinction provides separate targets for control strategies aimed at extending the operational space of future fusion devices.
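The statistical signatures named above, heavy-tailed PDFs with kurtosis near 6 and a $1/f$-type spectrum, can be illustrated on synthetic data; the snippet below is a stand-in for fluctuation diagnostics, not DIII-D data. Note that the Pearson kurtosis of a Laplace distribution is exactly 6.

```python
import numpy as np
from scipy import stats, signal

rng = np.random.default_rng(2)
n = 100_000

# Heavy-tailed vs. Gaussian fluctuation statistics (synthetic stand-ins for
# density-fluctuation time series, not DIII-D measurements)
gaussian = rng.normal(size=n)
intermittent = rng.laplace(size=n)          # heavier tails

# Pearson kurtosis (fisher=False): 3 for a Gaussian, exactly 6 for a Laplace
kurt_g = stats.kurtosis(gaussian, fisher=False)
kurt_i = stats.kurtosis(intermittent, fisher=False)

# Spectral-slope check on a Brownian surrogate whose PSD scales as 1/f^2;
# fit log P vs. log f over the low-frequency band of a Welch periodogram
walk = np.cumsum(rng.normal(size=n))
f, P = signal.welch(walk, nperseg=4096)
band = (f > 0.002) & (f < 0.05)
slope = -np.polyfit(np.log(f[band]), np.log(P[band]), 1)[0]
print(round(kurt_g, 2), round(kurt_i, 2), round(slope, 1))
```

The same kurtosis and spectral-slope estimators, together with Hurst-exponent fits, are the standard diagnostics for distinguishing diffusive from avalanche-like transport states.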
- New
- Research Article
- 10.28924/2291-8639-23-2025-274
- Nov 5, 2025
- International Journal of Analysis and Applications
- P Khamrot + 1 more
Modeling extreme events is crucial in various disciplines such as environmental sciences, hydrology, finance, and engineering. This paper introduces the KM-transformed Generalized Extreme Value (KMGEV) distribution, a novel and flexible model that generalizes the classical Generalized Extreme Value (GEV) distribution using the KM transformation framework recently proposed by Kavya and Manoharan. We derive the key statistical properties of the KMGEV distribution, including the probability density function (PDF), cumulative distribution function (CDF), survival function, hazard rate function, and quantile function. Additionally, we explore order statistics and their expected values. Parameter estimation is carried out via Maximum Likelihood Estimation (MLE) methods. Through Monte Carlo simulations, we investigate the impact of the shape parameter on moments such as skewness and kurtosis. Graphical analysis highlights the flexibility of the KMGEV model, suggesting its potential in modeling a variety of extreme value phenomena.
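The KM (Kavya-Manoharan) transform of a baseline CDF F is G(x) = e/(e−1)·(1 − e^(−F(x))), so applying it to the GEV baseline gives the KMGEV CDF and PDF directly. A sketch using SciPy's GEV implementation (whose shape convention c is the negative of the usual ξ):

```python
import numpy as np
from scipy import stats, integrate

KM = np.e / (np.e - 1.0)   # normalizing constant e/(e-1) of the KM transform

def kmgev_cdf(x, shape, loc=0.0, scale=1.0):
    # KM transform of the GEV baseline CDF F: G(x) = e/(e-1) * (1 - exp(-F(x)))
    F = stats.genextreme.cdf(x, shape, loc=loc, scale=scale)
    return KM * (1.0 - np.exp(-F))

def kmgev_pdf(x, shape, loc=0.0, scale=1.0):
    # Chain rule: g(x) = e/(e-1) * f(x) * exp(-F(x))
    F = stats.genextreme.cdf(x, shape, loc=loc, scale=scale)
    return KM * stats.genextreme.pdf(x, shape, loc=loc, scale=scale) * np.exp(-F)

# Sanity checks for a heavy-upper-tail case (SciPy shape c = -0.1, i.e. xi = 0.1)
area, _ = integrate.quad(kmgev_pdf, -10.0, 50.0, args=(-0.1,))
total = kmgev_cdf(np.inf, -0.1)
print(round(area, 4), round(total, 4))
```

Both checks should return values indistinguishable from 1, confirming that the transform preserves a proper distribution.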
- New
- Research Article
- 10.1038/s41598-025-22777-7
- Nov 5, 2025
- Scientific Reports
- Sulaiman Z Almutairi + 3 more
Optimal reactive power dispatch (ORPD) is a crucial task in modern power systems, aimed at improving system performance by optimizing the flow of reactive power. In this paper, a modified weighted average algorithm (MWAA) is proposed to solve both the traditional ORPD problem and the stochastic ORPD (SORPD), applied to the IEEE 30-bus system, considering the presence of photovoltaics (PVs) and wind turbine units (WTs). The proposed MWAA incorporates three enhancement strategies: the fitness-distance balance (FDB) method, the Weibull flight method, and quasi-oppositional-based learning (QOBL). For the SORPD solution, the MWAA focuses on minimizing total expected power loss (TEPL) and total expected voltage deviation (TEVD). The inherent uncertainties in load demand and renewable power generation from PV and WT systems are jointly considered and modeled using normal/lognormal and Weibull probability density functions (PDFs). The MWAA is tested on standard benchmark functions and the CEC-2019 test suite, and its results are compared with recent methods in terms of accuracy, convergence behavior, Friedman tests, and boxplots. The results confirm that MWAA is a robust and competitive optimization technique, effectively solving both ORPD and SORPD problems while demonstrating superior performance over other state-of-the-art methods.
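A minimal sketch of the uncertainty-modeling step described above: load demand drawn from a normal distribution, PV input from a lognormal, and wind speed from a Weibull PDF, with a standard piecewise power curve converting wind speed to WT output. All parameter values are illustrative, not from the paper.

```python
import numpy as np

rng = np.random.default_rng(3)
N = 10_000   # Monte Carlo scenarios

# Uncertainty models named in the abstract (all parameter values illustrative):
load = rng.normal(loc=1.0, scale=0.05, size=N)            # p.u. load demand
irradiance = rng.lognormal(mean=5.5, sigma=0.3, size=N)   # W/m^2 for the PVs
wind = 9.0 * rng.weibull(2.0, size=N)                     # Weibull k=2, c=9 m/s

def wt_power(v, v_ci=3.0, v_r=12.0, v_co=25.0, p_rated=2.0):
    # Piecewise wind-turbine power curve in MW: zero below cut-in and above
    # cut-out, linear between cut-in and rated, flat at rated in between
    p = np.where((v >= v_ci) & (v < v_r),
                 p_rated * (v - v_ci) / (v_r - v_ci), 0.0)
    return np.where((v >= v_r) & (v <= v_co), p_rated, p)

p_wt = wt_power(wind)
print(round(wind.mean(), 2), round(p_wt.mean(), 2), round(load.mean(), 3))
```

Scenario sets like this feed the expectation operators behind TEPL and TEVD in the stochastic formulation.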
- New
- Research Article
- 10.3390/sym17111874
- Nov 5, 2025
- Symmetry
- Yajie Li + 6 more
This study investigates stochastic bifurcation in a generalized tristable Rayleigh–Duffing oscillator with fractional inertial force under both additive and multiplicative recycling noises. The system's dynamic behavior is influenced by its inherent spatial symmetry, represented by the potential function, as well as by temporal symmetry breaking caused by fractional memory effects and recycling noise. First, an approximate integer-order equivalent system is derived from the original fractional-order model using a harmonic balance method with minimal mean square error (MSE). The steady-state probability density function (sPDF) of the amplitude is then obtained via stochastic averaging. Using singularity theory, the conditions for stochastic P-bifurcation (SPB) are identified. For different fractional-derivative orders, transition-set curves are constructed, and the sPDF is qualitatively analyzed within the regions bounded by these curves, especially under tristable conditions. Theoretical results are validated through Monte Carlo simulations and the radial basis function neural network (RBFNN) approach. The findings offer insights for designing fractional-order controllers to improve system response control.
- New
- Research Article
- 10.62754/ais.v6i3.379
- Nov 5, 2025
- Architecture Image Studies
- Mahmoud M El-Borai + 2 more
We analyze how climate change affects marine oxygen production by modeling plankton–oxygen dynamics with a fractional-order nonlinear system and establishing rigorous conditions for the model's well-posedness. We formulate a three-dimensional system $d^\alpha x(t)/dt^\alpha = Ax(t) + f(x(t))$, where $A$ is a diagonal matrix of order 3 and $f$ is nonlinear. We (i) rigorously state the model, (ii) derive a Lipschitz constant for $f$ under suitable assumptions, and (iii) prove existence, uniqueness, and continuous dependence on initial data using a fractional solution formula with a probability density kernel and a generalized Grönwall inequality. Under the stated conditions, $f$ satisfies a computable Lipschitz bound that yields existence and uniqueness of solutions for the fractional system. The solutions depend continuously on initial conditions, establishing well-posedness of the plankton–oxygen model. The paper introduces a fractional, PDF-kernel-based framework for plankton–oxygen dynamics and provides clean, general proofs of well-posedness via a generalized Grönwall approach, capturing memory effects that classical integer-order models miss. The results justify numerical simulation and sensitivity analyses of fractional marine-ecosystem models, providing a sound basis for testing mitigation or management strategies affecting oxygen dynamics. Stronger theory for oxygen-cycle modeling can support evidence-based policies aimed at protecting marine ecosystems under global warming.
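The "fractional formula with a probability density kernel" referred to above is, in standard form (assuming the usual one-sided density $\xi_\alpha$ used in El-Borai-type mild-solution representations):

```latex
% Mild-solution representation for  d^\alpha x(t)/dt^\alpha = A x(t) + f(x(t)),
% 0 < \alpha < 1, via the one-sided probability density \xi_\alpha(\theta)
% (\xi_\alpha \ge 0 and \int_0^\infty \xi_\alpha(\theta)\,d\theta = 1):
x(t) = \int_0^\infty \xi_\alpha(\theta)\, e^{A t^\alpha \theta}\, x(0)\, d\theta
     + \alpha \int_0^t \int_0^\infty \theta\,(t-s)^{\alpha-1}\,
       \xi_\alpha(\theta)\, e^{A (t-s)^\alpha \theta}\, f(x(s))\, d\theta\, ds .
```

Applying the Lipschitz bound on $f$ to this representation and invoking the generalized Grönwall inequality is what yields uniqueness and continuous dependence on $x(0)$.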
- New
- Research Article
- 10.30538/psrp-easl2025.0124
- Nov 5, 2025
- Engineering and Applied Science Letters
- Badmus N I + 1 more
In this article, we present a new asymmetric distribution, the Topp-Leone modified Weighted Rayleigh (TLMWR) distribution, which extends the well-known Topp-Leone distribution. We derive several of its properties, including the probability density function, cumulative distribution function, survival function, failure (hazard) rate, moments, generating functions, quantile function, and order statistics. The model parameters are estimated by the method of maximum likelihood, and a simulation study is conducted to examine the finite-sample behavior of the estimators. We summarize key characteristics of the data using graphical displays and diagnostic procedures, including normality assessments and model-selection criteria. These analyses are performed on real-world data to assess the level and direction of skewness and kurtosis. The proposed distribution is then evaluated with a real-life dataset, and its performance is compared with existing and newly proposed distributions. The results support the validity of the proposed model and highlight its effectiveness relative to existing alternatives.
- New
- Research Article
- 10.1088/1361-6587/ae1b6b
- Nov 4, 2025
- Plasma Physics and Controlled Fusion
- Dario Cipciar + 8 more
In the Wendelstein 7-X (W7-X) stellarator, a ball-pen probe (BPP) has been routinely employed to measure Ti in the island SOL across large parts of the recent 2024/2025 campaigns. The high temporal resolution of 25 µs allows us to resolve Ti fluctuations, their probability density function, and their modulation by low-frequency MHD modes. We present a unique comparison of ion temperature measured using the BPP with a more conventional retarding field analyzer (RFA). Good agreement between the two diagnostics is found when the fast Ti measurements are reduced to the same temporal resolution (5 ms) using "RFA-like" averaging. With the RFA-like interpretation of fast BPP data, we find a linear decrease of SOL Ti and Te with line-integrated density. The SOL ion-to-electron temperature ratio τi,e ranges between 1 and 3, with smaller values observed at higher densities (collisionalities) and at positions close to the LCFS. The upstream BPP Ti measurements are compared to averaged C2+ Ti obtained by coherence imaging spectroscopy (CIS) in the divertor region, magnetically mapped to the BPP. The downstream Ti measured by CIS is about half the upstream Ti and exhibits a similar inverse scaling with line-integrated density. Lastly, we benchmark the experimentally obtained SOL Ti and Te against EMC3-Eirene modeling and find good agreement for high-density plasmas, but significantly diverging results for low-density plasmas.
- New
- Research Article
- 10.3390/su17219846
- Nov 4, 2025
- Sustainability
- Young-Jin Kim + 3 more
Offshore wind turbines (OWTs) with higher capacity are typically associated with larger structural dimensions, such as increased hub height, tower diameter, and rotor diameter. Consequently, they require support structures with large-diameter piles, particularly when employing suction buckets, a type of large-diameter foundation. These large-diameter structures exhibit a distinct scour mechanism compared to the conventional mechanisms observed in smaller-diameter piles. This study investigates the scour risk of OWTs while explicitly accounting for the large pile effect. First, a scour fragility analysis is developed to evaluate the vulnerability of suction bucket foundations under scour, represented in terms of fragility curves. Then, the probability density function (PDF) of scour depth is derived from the PDF of the Keulegan–Carpenter number, a key parameter for estimating scour depth that incorporates the large pile effect. Ultimately, scour risk is quantified by integrating the PDF of scour depth with the corresponding scour fragility curve. Comparative results show that, with a safety factor of 1.0, the reliability index obtained when the large-diameter pile effect is considered is 2.509, significantly lower than the value of 5.115 obtained when the effect is neglected, a decrease of about 51%. For a safety factor of 1.75, the difference is 43%. These results indicate that ignoring the large-diameter pile effect underestimates the scour risk of OWTs, and that the safety factor has a nonlinear effect on OWT risk. Ignoring this effect could also compromise the sustainability of offshore wind turbine systems. This highlights the importance of considering the unique scour mechanisms associated with large-diameter OWT foundations to avoid underestimating structural risk.
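The risk-integration step described above can be sketched directly: scour risk is the fragility curve integrated against the scour-depth PDF. The lognormal forms and parameter values below are illustrative assumptions, not the paper's derived quantities.

```python
import numpy as np
from scipy import stats, integrate

# Illustrative lognormal choices (the paper derives the scour-depth PDF from
# the Keulegan-Carpenter number and the fragility curve from analysis):
scour_pdf = stats.lognorm(s=0.4, scale=2.0).pdf     # f_S(s), scour depth in m
fragility = stats.lognorm(s=0.3, scale=4.0).cdf     # P(failure | S = s)

# Scour risk = integral of P(failure | s) * f_S(s) over scour depth s
risk, _ = integrate.quad(lambda s: fragility(s) * scour_pdf(s), 0.0, 50.0)
beta = -stats.norm.ppf(risk)   # corresponding reliability index, here ~1.39
print(round(risk, 4), round(beta, 2))
```

With both curves lognormal, the integral has the closed form Φ((ln 2 − ln 4)/√(0.4² + 0.3²)), which the quadrature reproduces; that is the sense in which the reliability index summarizes the integrated risk.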
- New
- Research Article
- 10.17654/0972361725072
- Nov 4, 2025
- Advances and Applications in Statistics
- Zakiah Ibrahim Kalantan
In recent years, the inverted Kumaraswamy distribution has proven to be highly effective for modeling various types of lifetime data. This paper aims to develop a bivariate inverted Kumaraswamy distribution whose marginal distributions follow the inverted Kumaraswamy form. To achieve this, an approach analogous to the Marshall–Olkin method used in constructing the bivariate exponential distribution is employed, as it offers a natural and suitable framework. Several key properties are investigated, including the bivariate probability density function and its marginals, as well as the joint reliability and joint hazard functions. Both the joint probability density function and the joint cumulative distribution function are derived in closed form. Parameter estimation for the bivariate and trivariate exponentiated inverted Kumaraswamy models is performed using the maximum likelihood method, and the corresponding approximate variance-covariance matrix is obtained. Additionally, maximum likelihood prediction is addressed. The study concludes with a comprehensive simulation analysis and an application to real-world datasets, demonstrating the practical utility of the proposed models.
- New
- Research Article
- 10.1161/circ.152.suppl_3.4367065
- Nov 4, 2025
- Circulation
- Vinicius Chaves + 7 more
Background: As systemic congestion takes center stage in the prognosis of acute heart failure (AHF), the Venous Excess Ultrasound (VExUS) protocol has emerged as a compelling tool for its bedside assessment. However, its prognostic value remains unclear. To address this gap, we performed a systematic review and meta-analysis evaluating whether VExUS can reliably predict in-hospital mortality in patients admitted with AHF. Research Question: Can the VExUS protocol reliably predict in-hospital mortality in patients admitted with AHF? Methods: We systematically searched PubMed, Embase, and the Cochrane Library for studies evaluating the prognostic value of the VExUS protocol in patients with AHF. A Bayesian random-effects meta-analysis yielded marginal posterior distributions for the overall effect and between-study heterogeneity. We used the mean and 95% credible intervals (CrI) to describe these distributions, with the CrI defined as the narrowest interval containing 95% of the probability density function. Our primary estimand is expressed as an odds ratio (OR), and we also focused on the calculation of posterior probabilities. Statistical analyses were performed with R version 4.5.0. Results: Five studies, comprising 565 patients, were included in the analysis. Mean ejection fraction ranged from 32% to 54%. Figure 1A contains the forest plot of the in-hospital mortality outcome. The average odds ratio was 0.12 (95% CrI: 0.04, 0.32). The posterior probability of any effect (OR < 1) was 99.99%, while the probability of a clinically meaningful effect (OR < 0.8) was 99.97%. In terms of the predictive distribution, we determined a 95% probability that the true odds ratio in a future study would fall within the range of 0.03 to 0.41 (as shown in Figure 1), with a 99.76% likelihood that it would be less than 1.0. Sensitivity analyses showed that the overall effect estimates were not heavily influenced by different priors (Figure 1B).
Figure 2 shows the posterior distribution of the overall effect and posterior probabilities related to any odds ratio cutoff. Conclusions: This systematic review and meta-analysis suggests that a VExUS grade greater than 2 is associated with increased in-hospital mortality in patients with AHF. These findings support the potential role of VExUS as a prognostic tool for risk stratification in this population, warranting further investigation in larger, prospective studies.
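The posterior summaries described above, the narrowest 95% CrI, P(OR < 1), and P(OR < 0.8), are straightforward to compute from posterior draws. The sketch below uses illustrative draws, not the study's posterior.

```python
import numpy as np

def hpd(draws, mass=0.95):
    # Narrowest (highest-posterior-density) interval containing `mass` of the
    # draws, matching the abstract's definition of the credible interval
    x = np.sort(draws)
    k = int(np.ceil(mass * len(x)))
    widths = x[k - 1:] - x[:len(x) - k + 1]
    i = int(np.argmin(widths))
    return x[i], x[i + k - 1]

# Illustrative posterior draws of the log odds ratio (not the study's actual
# posterior; the location and scale merely echo the reported effect size)
rng = np.random.default_rng(4)
or_draws = np.exp(rng.normal(np.log(0.12), 0.5, size=200_000))

lo, hi = hpd(or_draws)
p_any = (or_draws < 1.0).mean()         # posterior P(OR < 1)
p_meaningful = (or_draws < 0.8).mean()  # posterior P(OR < 0.8)
print(round(lo, 3), round(hi, 3), round(p_any, 4), round(p_meaningful, 4))
```

For a right-skewed posterior like this one, the HPD interval sits to the left of the equal-tailed interval, which is why the narrowest-interval definition matters.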
- New
- Research Article
- 10.17654/0972361725070
- Nov 3, 2025
- Advances and Applications in Statistics
- Yasser M Amer + 2 more
This paper proposes a new distribution that extends the Weibull distribution. The paper begins by studying the behavior of the new distribution function and comparing it with that of the classical Weibull distribution function. The shapes of the probability density function are studied, and the properties of the distribution are examined, including the quantile function and median, moments, moment generating function, characteristic function, Rényi and Shannon entropies, and Bonferroni and Lorenz curves. The maximum likelihood method is used to estimate the proposed distribution's parameters, and simulation results are presented. Two applied examples illustrate the estimation efficiency and highlight the advantages of the proposed distribution over a group of classical distributions.
- New
- Research Article
- 10.1017/jfm.2025.10660
- Nov 3, 2025
- Journal of Fluid Mechanics
- Guowei Dai + 8 more
Understanding the flow behaviour of wet granular materials is essential for comprehending the dynamics of numerous geological and physical phenomena, but remains a significant challenge, especially regarding the transition between flow regimes. In this study, we perform a series of rotating drum experiments to systematically investigate the dynamic observables and flow regimes of wet mono-dispersed particles. Two typical continuous flows, the rolling and cascading regimes, are identified and analysed, concentrating on the impact of fluid density and rotation speed. The probability density functions of the surface angles, $\theta_{\textit{top}}$ and $\theta_{\textit{lower}}$, reveal distinct patterns for these two flow regimes. A morphological parameter, termed angle divergence, is thus proposed to characterise the rolling–cascading regime transition quantitatively. By integrating quantitative observables, we construct the flow phase diagram and flow curve to delineate the transition rules governing these regimes. Notably, the resulting nonlinear phase boundary demonstrates that higher fluid densities significantly enhance the likelihood of the system transitioning into the cascading regime. This finding is further supported by corresponding variations in flow fluctuations. Our results provide new insights into the fundamental dynamics of wet granular matter, offering valuable implications for understanding the complex rheology of underwater landslides and related phenomena.
- New
- Research Article
- 10.31349/revmexfis.71.061701
- Nov 1, 2025
- Revista Mexicana de Física
- Jesus Domingo Alfin Islas-García + 2 more
The evolution of global income distribution from 1988 to 2018 is analyzed using purchasing power parity exchange rates and well-established statistical distributions. This research proposes the use of two separate distributions to more accurately represent the overall data, rather than relying on a single distribution. The global income distribution was fitted to log-normal and gamma functions, which are standard tools in econophysics. Despite limitations in data completeness during the early years, the available information covered the vast majority of the world’s population. Probability density function (PDF) curves enabled the identification of key peaks in the distribution, while complementary cumulative distribution function (CCDF) curves highlighted general trends in inequality. Initially, the global income distribution exhibited a bimodal pattern; however, the growth of middle classes in highly populated countries such as China and India has driven the transition to a unimodal distribution in recent years. While single-function fits with gamma or log-normal distributions provided reasonable accuracy, the bimodal approach constructed as a sum of log-normal distributions yielded near-perfect fits.
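The bimodal construction described above, a population-weighted sum of two log-normal densities, can be sketched and checked for its number of modes numerically. The parameters below are illustrative, not the fitted 1988 or 2018 values.

```python
import numpy as np
from scipy import stats

# Population-weighted sum of two log-normal densities (illustrative
# parameters, not the paper's fitted values); w is the lower-mode share
w = 0.6
low = stats.lognorm(s=0.5, scale=2_000)     # lower-income mode
high = stats.lognorm(s=0.6, scale=20_000)   # higher-income mode
pdf = lambda x: w * low.pdf(x) + (1 - w) * high.pdf(x)
ccdf = lambda x: w * low.sf(x) + (1 - w) * high.sf(x)

# Count local maxima of the PDF on a log-spaced grid: two peaks -> bimodal
x = np.logspace(2, 6, 4000)
y = pdf(x)
peaks = x[1:-1][(y[1:-1] > y[:-2]) & (y[1:-1] > y[2:])]
tail = ccdf(100_000)    # CCDF value: population share above this income
print(len(peaks), round(tail, 4))
```

As the two component medians converge, the same peak count drops from 2 to 1, which is the bimodal-to-unimodal transition the paper tracks across 1988-2018.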
- New
- Research Article
- 10.1002/cam4.71341
- Nov 1, 2025
- Cancer Medicine
- Weimin Guan + 10 more
Background: The low participation rate in colorectal cancer (CRC) screening may be partly attributed to the lack of consideration for the preferences of both Recipients and Providers. This study aims to explore these preferences to inform the optimization of screening design and the improvement of implementation strategies. Methods: A discrete choice experiment (DCE) was conducted in Shandong Province to examine the CRC screening preferences of Recipients and Providers. The attributes and levels of the DCE were determined through a systematic literature review and qualitative exploration. Questionnaires were generated through a partial factorial design, and a mixed logit model was used to analyze the data. Relative importance scores (RIS) and marginal willingness to pay were used to quantify preferences, and probability density functions were employed to predict changes in participation rates under varying attribute levels. Results: Preference data from 570 Recipients and 532 Providers were analyzed. The DCE included five attributes: screening cost (four levels), screening interval (four levels), bowel preparation (two levels), screening accuracy (three levels), and reduction in CRC-related mortality risk (three levels). All attributes significantly influenced preferences. The RIS indicated that Recipients prioritized screening cost (42.8%), followed by interval (24.3%), mortality risk reduction (16.2%), accuracy (10.7%), and bowel preparation (6.0%), whereas Providers emphasized bowel preparation (35.4%), interval (31.7%), cost (25.1%), mortality risk reduction (6.4%), and accuracy (1.3%). Both groups showed strong support for biennial screening.
Shortening the interval from 10 to 2 years increased Recipients' willingness to pay by CNY 1052.95 and Providers' expected charge by CNY 1370.84, and was also associated with higher predicted participation rates. Conclusion: Recipients and Providers differed in the strength of their preferences for the five CRC screening attributes, but the directions of their preferences were consistent. Therefore, screening strategies should aim to balance the perspectives of both groups. Where feasible, a biennial screening program that includes bowel preparation, minimizes costs and mortality risk, and maximizes accuracy is recommended.
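The preference summaries used in the study can be sketched from mixed-logit coefficients: relative importance is each attribute's utility range as a share of the total, and marginal willingness to pay converts a utility difference to currency via the cost coefficient. All coefficient and level values below are hypothetical.

```python
# Hypothetical mixed-logit mean coefficients and attribute levels (not the
# study's estimates); utility is assumed linear in the attributes
beta = {"cost": -0.004, "interval": -0.15, "prep": 0.6,
        "accuracy": 0.02, "mortality": 0.03}
levels = {"cost": [0, 100, 200, 300], "interval": [2, 5, 8, 10],
          "prep": [0, 1], "accuracy": [80, 90, 95], "mortality": [10, 20, 30]}

# Relative importance score: each attribute's utility range over the total
util_range = {a: abs(b) * (max(levels[a]) - min(levels[a]))
              for a, b in beta.items()}
total = sum(util_range.values())
ris = {a: round(100 * r / total, 1) for a, r in util_range.items()}

# Marginal WTP: a utility change priced via the cost coefficient
delta_u = beta["interval"] * (2 - 10)       # gain from shortening 10 -> 2 years
wtp_shorten = delta_u / (-beta["cost"])     # = 300.0 with these coefficients
print(ris, round(wtp_shorten, 1))
```

The study's reported CNY 1052.95 figure is exactly this kind of ratio computed from its fitted coefficients.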
- New
- Research Article
- 10.1016/j.aap.2025.108238
- Nov 1, 2025
- Accident Analysis and Prevention
- Liang Mu + 4 more
Quantifying uncertainties in data and model: a prediction model for extreme rainfall events with application to Beijing subway.
- New
- Research Article
- 10.1109/tpami.2025.3589728
- Nov 1, 2025
- IEEE Transactions on Pattern Analysis and Machine Intelligence
- Ying Li + 5 more
Random feature latent variable models (RFLVMs) are state-of-the-art tools for uncovering structure in high-dimensional, non-Gaussian data. However, their reliance on Monte Carlo sampling significantly limits scalability, posing challenges for large-scale applications. To overcome these limitations, we develop a scalable RFLVM framework based on variational Bayesian inference (VBI), a deterministic and optimization-based alternative to sampling methods. Applying VBI to RFLVMs is nontrivial due to two key challenges: (i) the lack of an explicit probability density function (PDF) for Dirichlet process (DP) mixing weights, and (ii) the inefficiency of existing VBI approaches when handling the high-dimensional variational parameters of RFLVMs. To address these issues, we adopt the stick-breaking construction for the DP, which provides an explicit and tractable PDF over mixing weights, and propose a novel inference algorithm, block coordinate descent variational inference (BCD-VI), which partitions variational parameters into blocks and applies tailored solvers to optimize them efficiently. The resulting scalable model, referred to as SRFLVM, supports various likelihoods; we demonstrate its effectiveness under Gaussian and logistic settings. Extensive experiments on diverse benchmark datasets show that SRFLVM achieves superior scalability, computational efficiency, and performance in latent representation learning and missing data imputation, consistently outperforming state-of-the-art latent variable models, including deep generative approaches.
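The stick-breaking construction adopted above gives DP mixing weights with a tractable density, since each stick fraction v_k has an explicit Beta(1, α) PDF. A truncated sketch:

```python
import numpy as np

def stick_breaking(alpha, K, rng):
    # Truncated stick-breaking construction of DP mixing weights: each stick
    # fraction v_k ~ Beta(1, alpha) has an explicit PDF, and
    # pi_k = v_k * prod_{j<k} (1 - v_j)
    v = rng.beta(1.0, alpha, size=K)
    remaining = np.concatenate(([1.0], np.cumprod(1.0 - v[:-1])))
    return v * remaining

rng = np.random.default_rng(5)
pi = stick_breaking(alpha=2.0, K=50, rng=rng)
print(round(pi.sum(), 6))   # just below 1; the deficit is prod_k (1 - v_k)
```

It is this explicit Beta form over the v_k that makes the variational updates in BCD-VI tractable where the raw DP weights have no closed-form density.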
- New
- Research Article
- 10.1016/j.est.2025.118496
- Nov 1, 2025
- Journal of Energy Storage
- Long Ling + 2 more
Lithium-ion battery state of health estimation based on statistical features derived from voltage and temperature probability density functions under realistic randomized cycling conditions
- New
- Research Article
- 10.1080/03772063.2025.2581651
- Oct 31, 2025
- IETE Journal of Research
- Srinivas Ramavath + 2 more
Filter Bank Multicarrier (FBMC) systems are widely recognized for their superior spectral efficiency and flexibility compared to conventional orthogonal frequency division multiplexing (OFDM). However, FBMC systems also suffer from a high peak-to-average power ratio (PAPR), leading to reduced efficiency in power amplifiers and signal distortion. In this paper, we propose a novel companding technique designed to reduce the PAPR of FBMC systems. This technique is based on the Laplace distribution, aiming to compress the high peaks of the signal and expand the lower amplitudes, thereby achieving better PAPR reduction without significantly increasing the bit error rate (BER). The proposed companding scheme leverages the statistical properties of the Laplace distribution to introduce a more effective non-linear transformation of the signal, compared to traditional companding methods such as exponential, μ-law, or hyperbolic companding. Simulation results demonstrate that our method offers a substantial improvement in PAPR reduction while maintaining acceptable BER performance. Additionally, this approach provides better compatibility with the FBMC framework, preserving the system's advantages of reduced out-of-band emissions and higher spectral efficiency. The findings suggest that the Laplace-based companding technique is a promising solution for enhancing the power efficiency and overall performance of FBMC systems in next-generation communication networks. In a low-symbol-rate FBMC system, the probability density function of the signal amplitude is better approximated by a Laplace distribution than by the Gaussian distribution that applies at high symbol rates. To address this, a nonlinear companding transform based on the Double Exponential Laplace (DEL) distribution is employed.
The proposed approach capitalizes on the statistical properties of the DEL distribution to compress the signal's peak values effectively while maintaining essential characteristics such as BER and power spectral density (PSD). The companding function is derived, and its impact on PAPR reduction, attenuation factor, and transfer-function gain is evaluated theoretically. The simulation results demonstrate that the proposed DEL-based companding transform significantly outperforms existing conventional techniques, reducing PAPR and out-of-band (OoB) leakage with minimal distortion. This method offers a promising solution for enhancing the efficiency and reliability of FBMC systems in practical applications.
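The paper's DEL-based compander is not specified in the abstract; as a sketch of the companding idea it builds on, the snippet below applies a classical μ-law-style magnitude compander (one of the conventional baselines named above) to a generic multicarrier-like signal and reports the PAPR before and after.

```python
import numpy as np

def papr_db(x):
    # Peak-to-average power ratio in dB
    p = np.abs(x) ** 2
    return 10 * np.log10(p.max() / p.mean())

def compand(x, mu=8.0):
    # Classical mu-law-style compander applied to the signal magnitude
    # (phase preserved): compresses peaks, expands small amplitudes
    a = np.abs(x)
    m = a.max()
    g = m * np.log1p(mu * a / m) / np.log1p(mu)
    return np.where(a > 0, x * g / np.where(a > 0, a, 1.0), 0.0)

# Hypothetical multicarrier-like frame: many QPSK subcarriers summed via an
# IFFT give a near-Gaussian envelope (a stand-in, not a full FBMC chain)
rng = np.random.default_rng(6)
sym = rng.choice([-1, 1], (64, 256)) + 1j * rng.choice([-1, 1], (64, 256))
sig = np.fft.ifft(sym, axis=0).ravel()

p_orig, p_comp = papr_db(sig), papr_db(compand(sig))
print(round(p_orig, 2), round(p_comp, 2))
```

A distribution-matched compander like the paper's DEL transform replaces the logarithmic curve with one derived from the Laplace amplitude statistics, trading PAPR reduction against BER and PSD degradation.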