Articles published on Constant Problem
- Research Article
- 10.1111/bjh.70240
- Nov 3, 2025
- British journal of haematology
- D Swan + 3 more
Outcomes for patients with multiple myeloma have improved markedly in recent years due to the introduction of highly effective immune-mediated anti-myeloma therapies in both newly diagnosed and relapsed patients. However, while patients are living longer, myeloma bone disease continues to contribute significantly to morbidity and mortality. Routine incorporation of anti-resorptive therapies into patient management is recommended by consensus guidelines; however, patients continue to sustain skeletal-related events, including pathological fractures. In this review, we discuss the diagnosis and pathogenesis of myeloma bone disease and the evidence underpinning guideline recommendations for the use of bisphosphonates in patients with myeloma. We consider novel approaches to reducing bone disease by targeting osteoblastic activity, the impact of anti-myeloma therapies themselves on bone disease, and the role of biomarkers in monitoring disease activity and guiding the intensity and duration of bone-targeted therapy.
- Research Article
- 10.47456/cad.astro.v6n2.49699
- Oct 29, 2025
- Cadernos de Astronomia
- Sandro Dias Pinto Vitenti
Quantum field theory is currently the most accurate description of matter, but this framework is the result of a long evolution of the concept of particle in physics. In this article, we trace this development: starting from classical mechanics, where particles are treated as point-like objects; moving to quantum mechanics, where they are described as waves associated with probabilities; and finally arriving at quantum field theory, where particles appear as excitations of fundamental fields. We show that in curved or expanding spacetimes, the definition of a particle is no longer unique: different observers can adopt different vacuum states and consequently identify particles in different ways. We also discuss the role of the quantum vacuum and its connection to the cosmological constant problem, one of the major open questions in contemporary physics. By connecting quantum theory, general relativity, and cosmology, the article addresses both the progress achieved and the conceptual challenges that remain in describing the universe at its most extreme scales.
- Research Article
- 10.3390/galaxies13050114
- Oct 9, 2025
- Galaxies
- Luis Rojas + 4 more
This paper presents a systematic literature review focusing on the application of machine learning techniques for deriving observational constraints in cosmology. The goal is to evaluate and synthesize existing research to identify effective methodologies, highlight gaps, and propose future research directions. Our review identifies several key findings: (1) Various machine learning techniques, including Bayesian neural networks, Gaussian processes, and deep learning models, have been applied to cosmological data analysis, improving parameter estimation and handling large datasets. However, models achieving significant computational speedups often exhibit worse confidence regions compared to traditional methods, emphasizing the need for future research to enhance both efficiency and measurement precision. (2) Traditional cosmological methods, such as those using Type Ia Supernovae, baryon acoustic oscillations, and cosmic microwave background data, remain fundamental, but most studies focus narrowly on specific datasets. We recommend broader dataset usage to fully validate alternative cosmological models. (3) The reviewed studies mainly address the H0 tension, leaving other cosmological challenges—such as the cosmological constant problem, warm dark matter, phantom dark energy, and others—unexplored. (4) Hybrid methodologies combining machine learning with Markov chain Monte Carlo offer promising results, particularly when machine learning techniques are used to solve differential equations, such as Einstein Boltzmann solvers, prior to Markov chain Monte Carlo models, accelerating computations while maintaining precision. (5) There is a significant need for standardized evaluation criteria and methodologies, as variability in training processes and experimental setups complicates result comparability and reproducibility. (6) Our findings confirm that deep learning models outperform traditional machine learning methods for complex, high-dimensional datasets, underscoring the importance of clear guidelines to determine when the added complexity of learning models is warranted.
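Finding (4) above describes a hybrid pipeline in which a fast learned surrogate replaces an expensive Einstein–Boltzmann-type computation inside a Markov chain Monte Carlo loop. The sketch below illustrates that idea only in miniature: the "slow theory" function, the toy data point, the polynomial surrogate, and the prior box are all hypothetical stand-ins, not anything from the reviewed papers.

```python
# Minimal sketch of the hybrid emulator + MCMC idea from finding (4):
# a fast surrogate replaces an expensive theory code inside the sampler.
# The "theory" function, data, and priors below are hypothetical.
import numpy as np

def slow_theory(theta):
    """Stand-in for an expensive Einstein-Boltzmann-type computation."""
    h, om = theta
    return 100.0 * h * np.sqrt(om * (1 + 0.5) ** 3 + (1 - om))  # toy H(z=0.5)

# 1) Build a cheap emulator on a training grid (here: a simple polynomial fit).
grid = np.random.default_rng(0).uniform([0.6, 0.2], [0.8, 0.4], size=(200, 2))
targets = np.array([slow_theory(t) for t in grid])
design = np.column_stack([np.ones(len(grid)), grid, grid**2, grid[:, :1] * grid[:, 1:]])
coeffs, *_ = np.linalg.lstsq(design, targets, rcond=None)

def emulator(theta):
    h, om = theta
    x = np.array([1.0, h, om, h**2, om**2, h * om])
    return x @ coeffs

# 2) Use the emulator inside a Metropolis-Hastings loop (toy data point).
data, sigma = 105.0, 2.0
def log_post(theta):
    if not (0.6 < theta[0] < 0.8 and 0.2 < theta[1] < 0.4):
        return -np.inf
    return -0.5 * ((emulator(theta) - data) / sigma) ** 2

rng = np.random.default_rng(1)
chain, theta = [], np.array([0.7, 0.3])
for _ in range(5000):
    prop = theta + rng.normal(scale=0.01, size=2)
    if np.log(rng.uniform()) < log_post(prop) - log_post(theta):
        theta = prop
    chain.append(theta.copy())
print("posterior mean (toy):", np.mean(chain[1000:], axis=0))
```

The speed gain comes from evaluating the cheap surrogate thousands of times inside the chain while calling the expensive code only for the training grid, which is the trade-off the review highlights between computational cost and the fidelity of the resulting confidence regions.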
- Research Article
- 10.36948/ijfmr.2025.v07i05.55757
- Oct 4, 2025
- International Journal For Multidisciplinary Research
- Oswaldo Arrioja Alvarez
The most important by-product of oil and gas operations is produced water (PW), generated in quantities that frequently surpass crude oil production itself. This wastewater is a constant management problem because of its complexity: a mixture of dispersed oil, salts, metals, dissolved organic compounds, and treatment chemicals. Conventional processes such as gravity separation, hydrocyclones, and membrane systems are effective to varying degrees but tend to fail under high salinity, emulsified hydrocarbons, or variable influent quality. The microbubble reactor (MBR) has emerged in recent years as a promising alternative. Enabled by the distinctive chemical and physical characteristics of microbubbles (high surface area-to-volume ratio, increased gas dissolution, and reactive oxygen species production), MBRs have the potential to improve oil/water separation efficiency, pollutant degradation, and fouling management. This review examines, in a case-study format, the opportunity MBRs present for treating produced water.
- Research Article
- 10.59075/jssa.v3i4.349
- Oct 3, 2025
- Journal for Social Science Archives
- Muhammad Azeem Tufail + 3 more
Child labor is a persistent socio-economic problem in Pakistan, and the province of Punjab, as the most populous and economically diverse part of the country, has some of the highest rates of child labor. The premise of this study is that child labor in Punjab is a multidimensional problem shaped by household characteristics, economic constraints, and work-related factors. To examine these factors, the study used a quantitative approach based on data from the Pakistan Social and Living Standards Measurement survey for Punjab. The Kernel Regularized Least Squares machine learning method was used to study the relationship between child labor and factors such as poverty, unemployment, parental literacy, hours worked, school enrollment, and availability of basic facilities. The results show that parental literacy, school enrollment, and children's working hours are the most important determinants, whereas structural poverty becomes insignificant once education and household conditions are controlled for. These findings suggest that improving parental education, providing greater access to quality schooling, curbing children's working hours, and enhancing adult employment opportunities are critical to reducing child labor in Punjab. The study concludes with policy recommendations emphasizing poverty alleviation programs, conditional cash transfers, parental literacy campaigns, and strict enforcement of labor laws to protect children's rights and ensure their educational and developmental opportunities.
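The estimator named in this abstract, Kernel Regularized Least Squares (KRLS), fits a flexible regression surface by solving a kernel ridge system in the dual. Below is a minimal sketch of that estimator on synthetic data; the covariates, outcome, kernel bandwidth, and regularization scaling are illustrative assumptions, not the PSLM survey or the authors' tuning.

```python
# Minimal sketch of Kernel Regularized Least Squares (KRLS); synthetic data,
# not the PSLM survey used in the study.
import numpy as np

def gaussian_kernel(A, B, sigma2):
    """Pairwise Gaussian (RBF) kernel matrix between rows of A and B."""
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / sigma2)

def krls_fit(X, y, lam=1e-2, sigma2=None):
    """Return dual coefficients c solving (K + lam * n * I) c = y."""
    n, p = X.shape
    sigma2 = p if sigma2 is None else sigma2   # a common default: sigma^2 = #features
    K = gaussian_kernel(X, X, sigma2)
    c = np.linalg.solve(K + lam * n * np.eye(n), y)
    return c, sigma2

def krls_predict(Xnew, X, c, sigma2):
    return gaussian_kernel(Xnew, X, sigma2) @ c

# Synthetic illustration: outcome depends nonlinearly on two covariates.
rng = np.random.default_rng(0)
X = rng.normal(size=(300, 2))
y = np.tanh(X[:, 0]) - 0.5 * X[:, 1] ** 2 + 0.1 * rng.normal(size=300)
c, s2 = krls_fit(X, y)
print("in-sample R^2:",
      1 - np.var(y - krls_predict(X, X, c, s2)) / np.var(y))
```

The appeal of KRLS in this kind of survey analysis is that it captures nonlinearities and interactions without pre-specifying them, while the ridge penalty keeps the fit stable on correlated household covariates.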
- Research Article
- 10.71086/iajse/v12i3/iajse1223
- Sep 30, 2025
- International Academic Journal of Science and Engineering
- Rajat Sen + 1 more
An important topic of research pertaining to global environmental pollution and warming, energy demand, availability, and reliability is the creation of efficient alternative energy sources. In this context, thermoelectric materials are a promising alternative for power generation. Nonetheless, conversion efficiency is a constant problem, and efforts to increase it are ongoing. To increase conversion efficiency in thermoelectric power generation, either novel materials are sought or the properties of existing materials are tuned. In condensed matter physics and materials research, density functional theory (DFT) is the most widely used technique for examining a compound's ground-state electronic structure. The phonon spectrum can be determined with density functional perturbation theory (DFPT) or the supercell and finite-displacement approach (also known as the direct method). Phonons are necessary to explain dynamical stability, the lattice component of thermal conductivity, thermal expansion, and other characteristics. The phonon dispersion indicates whether a compound is dynamically stable in a given crystal structure, which is useful when searching for novel materials and provides an initial screening criterion for candidate compounds.
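As the abstract notes, the phonon dispersion decides dynamical stability: a structure is stable only if every branch satisfies ω² ≥ 0 across the Brillouin zone, with negative ω² signalling imaginary (unstable) modes. The toy check below applies that criterion to a one-dimensional chain with an analytic dispersion; a real workflow would instead diagonalize the dynamical matrix produced by DFPT or the finite-displacement method, and the spring constants here are arbitrary.

```python
# Toy illustration of the stability criterion described above: a structure is
# dynamically stable when every phonon branch has omega^2 >= 0 across the
# Brillouin zone (no imaginary frequencies). The 1D chain is a stand-in for a
# real dynamical matrix from DFPT or the finite-displacement method.
import numpy as np

def phonon_omega2_1d_chain(q, k1=10.0, k2=-1.0, m=1.0, a=1.0):
    """omega^2(q) for a monatomic chain with 1st- and 2nd-neighbor springs."""
    return (2 * k1 * (1 - np.cos(q * a)) + 2 * k2 * (1 - np.cos(2 * q * a))) / m

q = np.linspace(-np.pi, np.pi, 2001)          # Brillouin zone, lattice constant a = 1
w2 = phonon_omega2_1d_chain(q)
if np.all(w2 >= -1e-12):
    print("dynamically stable: all omega^2 >= 0")
else:
    qs = q[w2 < 0]
    print(f"imaginary modes near q = {qs.min():.2f}..{qs.max():.2f} -> unstable")
```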
- Research Article
- 10.1007/jhep08(2025)070
- Aug 8, 2025
- Journal of High Energy Physics
- Yang Liu
We demonstrate a unified resolution to the strong CP, hierarchy, and cosmological constant problems in type IIA flux compactifications, via 4-form fluxes and KL stabilization. We show that the strong CP problem can be effectively "solved" in type IIA orientifold constructions, particularly in the type IIA T⁶/(ℤ₂ × ℤ₂) model. Building on this, we explore whether the remaining two fine-tuning problems can also be resolved within the same setup. To obtain a small cosmological constant, we adopt the KL scenario and find that, in order to avoid conflicts with the swampland distance conjecture and to eliminate the need for fine-tuning, the perturbative superpotential ΔW must take the form f₀U³. Additionally, we compute the gravitino mass. This allows for a resolution of the hierarchy problem without introducing fine-tuning if the gravitino mass lies below 100 TeV. Taken together, these results suggest that the type IIA T⁶/(ℤ₂ × ℤ₂) orientifold model provides a promising framework in which all three fine-tuning problems may be addressed simultaneously.
- Research Article
- 10.1007/s13171-025-00408-7
- Aug 5, 2025
- Sankhya A
- Wasamon Jantai + 1 more
On Combinatorial Central Limit Theorems with Different Underlying Permutations Via Approximate Zero Biasing
- Research Article
- 10.29020/nybg.ejpam.v18i3.6668
- Aug 1, 2025
- European Journal of Pure and Applied Mathematics
- Haitham Ali Qawaqneh + 4 more
In this paper, utilizing zeroth-order q-Bessel Tricomi functions, we introduce the generalized bivariate q-Laguerre polynomials. We then establish the generalized bivariate q-Laguerre polynomials within the context of quasi-monomiality. We examine some of their properties, such as the q-multiplicative operator property, the q-derivative operator property, and two q-integro-differential equations. Additionally, we derive operational representations and three q-partial differential equations for the generalized bivariate q-Laguerre polynomials. Moreover, we plot the zeros of the new polynomials in 2D and 3D, and provide a table of approximate zeros of the generalized bivariate q-Laguerre polynomials.
- Research Article
- 10.3389/fmars.2025.1589920
- Jul 31, 2025
- Frontiers in Marine Science
- Tobias Hahn + 1 more
Optics-based oxygen sensors, called optodes, are used with great success for routine operations on autonomous instrumentation and profiling platforms. Observations of oxygen gradients with high spatial and temporal resolution are becoming increasingly important, yet shortcomings remain, namely time-constant problems, stability issues, and accuracy limitations, that prevent their full scientific and operational potential from being leveraged. Here, we demonstrate the utility of a novel, although currently not commercially available, optode, the HydroFlash O2. It was manufactured by Kongsberg Maritime Contros GmbH between 2014 and 2019, and peer-reviewed studies illustrate its use until today. Our work comprises its first integrated characterization with data from 13 HydroFlash O2 optodes, assessing oxygen, temperature, salinity and hydrostatic pressure dependence, long-term stability and drift, response time, and air-calibration compatibility. We multi-point calibrated this optode up to a root mean square error (RMSE) of <1 µmol L⁻¹ (mean RMSE: 1.79 ± 0.50 µmol L⁻¹), depending on the fit model type. Our laboratory setup yielded a temperature-dependent response time of τ₆₃% = 3.31 ± 0.58 s, showing no significant difference between weakly turbulent and turbulent flow, and at least 50% faster than the two most common optodes in oceanography, i.e., the 4330 (Aanderaa) and the SBE 63 (Sea-Bird Scientific). We assessed its pressure dependence between 0–5797 dbar, yielding an overall factor of 2.372 ± 0.409% per 1,000 dbar based on three multi-point calibrated, drift-corrected optodes and five CTD (conductivity–temperature–depth) profiles. Ship-underway, mooring, and CTD-cast applications promise high-quality observations, including of fast oxygen level changes. The optode revealed a strong sensitivity of the sensor spot, causing erroneous oxygen measurements when exposed to direct solar irradiation during an Argo float test profile. The drift assessment, covering a maximum time span of approximately 3 years, is based on two optodes and yielded linear (R² = 0.98) and exponential (τ = 2.35 ± 0.30 yr, 95% CI) drift behaviors. The HydroFlash O2 is applicable in low to high oxygen, pressure, and temperature conditions, yet we do not call for additional performance studies unless the manufacturer reactivates its production and reduces sensor spot issues. In an ocean affected by climate change, reliable oxygen optodes will contribute crucial information about the global oxygen and carbon budget, e.g., through observations in the mixed layer, thermocline, or deep sea, and require assessments of existing and promising instrumentation.
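The response time quoted above, τ₆₃%, is the time a sensor needs to cover 63.2% of a step change in oxygen, which for a first-order sensor equals the exponential time constant. The sketch below fits such a single-exponential step response to a synthetic trace; the data, noise level, and single-exponential model are illustrative assumptions, not the authors' laboratory protocol.

```python
# Sketch of extracting a tau_63% response time from a step change in oxygen.
# The "measured" trace is synthetic and the single-exponential model is an
# assumption, not the paper's exact characterization procedure.
import numpy as np
from scipy.optimize import curve_fit

def step_response(t, tau, o2_start, o2_end):
    """First-order sensor response to an instantaneous oxygen step."""
    return o2_end + (o2_start - o2_end) * np.exp(-t / tau)

t = np.linspace(0, 30, 301)                                      # seconds
true = step_response(t, tau=3.3, o2_start=250.0, o2_end=100.0)   # umol/L
meas = true + np.random.default_rng(0).normal(scale=1.0, size=t.size)

(tau_fit, o2_0, o2_inf), _ = curve_fit(step_response, t, meas, p0=(5.0, 240.0, 110.0))
print(f"tau_63% ~ {tau_fit:.2f} s")   # time to cover 63.2% of the step
```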
- Research Article
- 10.1103/h24l-2h8j
- Jul 18, 2025
- Physical Review D
- Sabarnya Mitra
Without invoking any cumulant determination at the input level, we present here the first calculations of direct estimates of the Lee-Yang zeros of the QCD partition function in (2+1)-flavor QCD. These zeros are obtained in the complex isospin chemical potential μ_I plane using the unbiased exponential resummation formalism on N_τ = 8 lattices and with physical quark masses. For different temperatures, we illustrate the stability of the zeros closest to the origin, from which we subsequently procure the radius of convergence estimates. From the temperature-dependence study of the real and imaginary parts of these zeros, we try estimating one of the possible critical points forming the second-order pion condensate critical line in the isospin phase diagram. Further, we compare these resummed estimates with the corresponding Mercer-Roberts estimates of the subsequent Taylor series expansions of the first three partition function cumulants. We also outline comparisons between resummed and Taylor series results of these cumulants for real and imaginary values of μ_I and highlight the behavior of different expansion orders within and beyond the obtained resummed estimates of the radius of convergence. We also reestablish that this resummed radius of convergence can efficiently capture the onset of the overlap problem for finite real μ_I simulations.
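The radius-of-convergence comparison described here builds estimators directly from Taylor coefficients of the expanded quantities. The sketch below contrasts a plain ratio estimate with the combination commonly quoted as the Mercer-Roberts estimator, applied to a toy series with a known singularity; both the toy coefficients and the exact form of the estimator are illustrative assumptions and may differ from the paper's definitions.

```python
# Sketch of radius-of-convergence estimators built from Taylor coefficients.
# Toy series: f(x) = 1/(4 + x^2) has singularities at x = +/- 2i, so the true
# radius of convergence is 2. The Mercer-Roberts combination is quoted in its
# commonly used form, which may differ in detail from the paper.
import numpy as np

n = np.arange(0, 20)
c = np.zeros(20)
c[::2] = (-1.0) ** (n[::2] // 2) / 4.0 ** (n[::2] // 2 + 1)   # c_{2k} = (-1)^k / 4^{k+1}

def ratio_estimate(c, k):
    """Simple ratio test on non-zero coefficients (spacing 2 for this series)."""
    return abs(c[k] / c[k + 2]) ** 0.5

def mercer_roberts(c, k):
    """r_k^2 = |(c_{k+1} c_{k-1} - c_k^2) / (c_{k+2} c_k - c_{k+1}^2)| (common form)."""
    num = c[k + 1] * c[k - 1] - c[k] ** 2
    den = c[k + 2] * c[k] - c[k + 1] ** 2
    return abs(num / den) ** 0.5

for k in (8, 10, 12, 14):
    print(k, ratio_estimate(c, k), mercer_roberts(c, k))   # both -> 2.0
```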
- Research Article
- 10.1098/rsta.2023.0292
- Jul 17, 2025
- Philosophical transactions. Series A, Mathematical, physical, and engineering sciences
- Federico Scali
The cosmological constant problem is one of the greatest challenges in contemporary physics, since it is deeply rooted in the problematic interplay between quantum fields and gravity. The aim of this work is to review the key conceptual elements needed to formulate the problem and some ideas for a possible solution. I do so by weaving a fil rouge from Newtonian cosmology, through general relativity and the standard model of relativistic cosmology (ΛCDM), up to the theory of quantum fields. In the first part, the issues with the application of Newtonian gravity to an infinite and static Universe are addressed, observing how a cosmological term in the Poisson equation would stabilize a homogeneous matter distribution. A toy derivation of the Friedmann equations using only Newtonian arguments is also shown. In the second part, the conceptual path leading to general relativity and the ΛCDM model is laid down, with particular emphasis on the historical introduction of the cosmological constant and its new role after the discovery of the accelerated expansion of the Universe. Finally, the problem is formulated within the framework of quantum field theory. Its many facets are discussed together with the criticalities in the formulation, and some of the leading ideas for its solution are outlined. This article is part of the theme issue 'Newton, Principia, Newton Geneva Edition (17th-19th) and modern Newtonian mechanics: heritage, past & present'.
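For readers who want the two Newtonian ingredients mentioned above in symbols, a minimal sketch follows, written in a common convention that may differ in signs and factors from the article itself: the Poisson equation augmented by a cosmological term, which can hold a uniform density static when 4πGρ = Λc², and the energy balance of a comoving test shell, which yields a Friedmann-like equation.

```latex
% Sketch (not the article's own derivation) of the two Newtonian ingredients
% mentioned above, in a common convention.
\begin{align}
  % Poisson equation with a cosmological term: a uniform density \rho can sit
  % in static balance when 4\pi G \rho = \Lambda c^2.
  \nabla^{2}\Phi &= 4\pi G\rho - \Lambda c^{2}, \\
  % Energy balance of a comoving test shell R(t) = a(t) r inside a uniform
  % dust ball; the integration constant plays the role of spatial curvature k.
  \tfrac{1}{2}\dot{R}^{2} - \frac{G M(R)}{R} = E
  \quad\Longrightarrow\quad
  \left(\frac{\dot{a}}{a}\right)^{2} = \frac{8\pi G}{3}\rho - \frac{k c^{2}}{a^{2}}.
\end{align}
```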
- Research Article
- 10.3390/math13142282
- Jul 16, 2025
- Mathematics
- Alexander J Zaslavski
In the present paper, we use the proximal point method with remotest set control to find an approximate common zero of a finite collection of maximal monotone maps in a real Hilbert space in the presence of computational errors. We prove that the inexact proximal point method generates an approximate solution if these errors are summable. We also show that if the computational errors are small enough, then the inexact proximal point method generates approximate solutions.
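To make the abstract concrete, the sketch below specializes the inexact proximal point method with remotest-set control to the simplest instance: each maximal monotone map is the normal-cone operator of a closed convex set, so its resolvent is the metric projection and a common zero is a point of the intersection. The two sets, the geometrically decaying (hence summable) error sequence, and the iteration count are illustrative assumptions, not the paper's general Hilbert-space setting.

```python
# Minimal sketch of the inexact proximal point method with remotest-set
# control, specialized to normal-cone operators of convex sets: resolvents
# become projections, and a common zero is a point of the intersection.
import numpy as np

# Two convex sets in R^2 with nonempty intersection: a unit disc and a half-plane.
def proj_disc(x, center=np.array([0.0, 0.0]), r=1.0):
    d = x - center
    n = np.linalg.norm(d)
    return x if n <= r else center + r * d / n

def proj_halfplane(x):          # {(x1, x2) : x1 >= 0.5}
    return np.array([max(x[0], 0.5), x[1]])

resolvents = [proj_disc, proj_halfplane]

x = np.array([3.0, 2.0])
for k in range(1, 200):
    # Remotest-set control: apply the resolvent of the operator whose set is
    # currently farthest from the iterate x.
    dists = [np.linalg.norm(x - R(x)) for R in resolvents]
    i = int(np.argmax(dists))
    e_k = (0.5 ** k) * np.ones(2)          # summable computational errors
    x = resolvents[i](x) + e_k
print("approximate common zero:", x)
```

Because the error sequence is summable, the perturbed iterates still settle near a point of the intersection, which is the qualitative content of the paper's first result in this toy setting.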
- Research Article
- 10.35629/8193-1007109116
- Jul 1, 2025
- Journal of Architecture and Civil Engineering
- Hakiki Amalia + 2 more
The constant problems encountered by local governments in managing regional assets generally arise where asset management is not implemented in accordance with the planning, physical, and legal procedures of regional asset management, and where the asset database is managed in a disorderly way. This results in less than optimal use of these assets by the local government or, in the worst case, difficulties in utilizing these assets in the future. Achieving optimal asset management in a planned, integrated manner, with the stable ability to provide the required data and information quickly, requires a supporting decision-support information system for assets, commonly referred to as an Asset Management Information System (Sistem Informasi Manajemen Aset/SIMA). An information system that supports regional asset management efficiently and effectively also leads to policy transparency in regional asset management. The road equipment assets of Probolinggo city are in great need of study to identify problems and formulate actions to solve them. The method applied in this research was quantitative, with data collected by questionnaires from a sample of 40 respondents. Data were analyzed with linear regression using the Statistical Package for the Social Sciences (SPSS). The asset management analysis found that (1) supervision of road equipment asset management is not conducted optimally; (2) the problems affecting asset management are weak data collection, a minimal maintenance budget, insufficient labor/human resource capacity, the absence of an integrated asset information system, and poor coordination between sectors; and (3) asset management can have a positive impact on asset security and supervision.
- Research Article
- 10.1142/s021988782550210x
- Jun 18, 2025
- International Journal of Geometric Methods in Modern Physics
- Shahid Iqbal + 4 more
In this paper, we study static plane symmetric (SPS) perfect fluid spacetimes via conformal motion within the framework of coincident f(Q) gravity, where f(Q) is a function of the nonmetricity scalar Q. For this, we first derive the field equations of SPS spacetimes in f(Q) gravity, considering a perfect fluid as the source of the energy-momentum tensor (EMT). We solve the formulated equations for some important f(Q) gravity models. The models include a linear form of f(Q), which acts as a simple modification of gravity; a power-law f(Q) gravity, which accounts for nonlinear curvature effects; an exponential model, which is useful in inflationary cosmology; and a logarithmic model, which is relevant to late-time cosmic evolution and offers a solution to the cosmological constant problem. We also give a graphical analysis of the aforementioned models to illustrate their behavior across different curvature regimes. These models further allow us to obtain a variety of solutions to the field equations. As an application, we investigate the conformal vector fields (CVFs) of the obtained solutions. The CVFs carry a rich amount of geometric as well as physical structure, as they link generators of the conformal symmetry algebra to the conservation laws of physics. Our key findings include some cases where the spacetimes become conformally flat. In such cases, the spacetimes admit fifteen CVFs. In most cases, the spacetimes admit five-dimensional CVFs; in one case, there exist four-dimensional CVFs. The overall dimension of the CVFs for the obtained SPS perfect fluid spacetimes in f(Q) gravity turns out to be 4, 5, or 15.
- Research Article
- 10.2337/db25-1463-p
- Jun 13, 2025
- Diabetes
- Jessica Gjonaj + 7 more
Introduction and Objective: Associations between unhealthy food environments and diabetes or obesity have been studied extensively in urban areas, but not precisely in rural communities. We performed a geographically detailed study of the rural food environment using various measures to identify those strongly associated with a higher rate of diabetes, high BMI, and poor diet. Methods: We surveyed 1,310 residents of rural Sullivan County, collecting demographic, health status, and food frequency data. The food environment was assessed by cataloging restaurants and food stores into fast food and convenience stores versus supermarkets, grocery stores, and wait service restaurants. LASSO regression was used to identify food environment factors strongly associated with diabetes, high BMI, and poor diet. Proximity, density, relative proportions, and modified retail food environment indexes (mRFEI) metrics were compared. We used fixed distance bands and novel nearest-neighbor approaches for proportions. Results: Older age (LASSO coefficient: +0.30), nearest-neighbor proportion of restaurants that were fast food (+0.22), and low household income (+0.16) were associated with higher diabetes rates, whereas White race (-0.11) had lower rates. Nearest-neighbor proportion (+0.29) and density (+0.23) of fast food were the strongest predictors of high BMI, along with Hispanic ethnicity (+0.22). Low household income (-1.24) was associated with poor diet quality, whereas older age (+1.17), female gender (+1.06), and high income (+0.59) were associated with better diet quality. Nearest-neighbor metrics helped with the problem of frequent zeros (p=0.36 for skewness of fast food with 20 nearest-neighbors). Conclusion: We compared different approaches for measuring the food environment in rural settings and found that novel nearest-neighbor approaches were superior to other metrics based on fixed distance bands, proximity, or density. Disclosure J. Gjonaj: None. H. Yi: None. T.A. Flores: None. C. So: None. H.L. Motola: None. B. Elbel: None. L. Thorpe: None. D.C. Lee: None. Funding National Institutes of Diabetes and Digestive and Kidney Diseases (R01DK124400)
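The Methods describe LASSO regression as the variable-selection step: an L1-penalized fit whose surviving nonzero coefficients flag the food-environment and demographic factors most associated with each outcome. A minimal sketch of that step follows on synthetic data; the variable names, effect sizes, and cross-validated penalty are hypothetical stand-ins, not the Sullivan County survey.

```python
# Sketch of the LASSO variable-selection step described in the Methods.
# Synthetic stand-in data; the signal loosely echoes the reported pattern of
# coefficients but is not the study dataset.
import numpy as np
from sklearn.linear_model import LassoCV
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)
n = 1310
cols = ["age", "ff_nearest_neighbor_prop", "low_income",
        "white_race", "ff_density", "mrfei"]
X = rng.normal(size=(n, len(cols)))
# Hypothetical data-generating signal (illustrative only).
y = (0.30 * X[:, 0] + 0.22 * X[:, 1] + 0.16 * X[:, 2] - 0.11 * X[:, 3]
     + rng.normal(scale=1.0, size=n))

lasso = LassoCV(cv=5).fit(StandardScaler().fit_transform(X), y)
for name, coef in zip(cols, lasso.coef_):
    print(f"{name:28s} {coef:+.2f}")   # near-zero coefficients are "dropped"
```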
- Research Article
- 10.4006/0836-1398-38.2.128
- Jun 10, 2025
- Physics Essays
- Anne A Kerslake
It has been observed that major changes in physics occurred through the giving up of unproven assumptions. Here, three unproven assumptions are examined. First, the recently argued denial of particles’ existence is reassessed. Second, Einstein’s former questioning about the objectivity of the 3-space is examined through the lens of neuroscience: it is our acts of perception which create the different types of objects of the 3D-world, these acts have dramatic consequences, and they should not be overlooked. The latter does not mean that there is no objective reality, it does not make physics fall into idealism; the objective reality is merely displaced to a lower level of reality which the 3-space is the perception of. This underlying reality is not foreign to physics; it already features in Bohmian mechanics where the waves reside in the configuration space. Consequently, the abstraction of the configuration space is the third unproven assumption which is challenged. Following these three successive refutations, a multilevel picture of reality emerges, where the subjective and intersubjective 3-space is constituted of the (mostly) deceptive appearances created by our individual and collective perceptions of a nonapparent, fluid, very likely energy-based, objective physical reality which resides in the configuration space. The wave-only version of Bohmian mechanics or “Wave Theory” and the relativistic Quantum Field Theory are the relevant theories for this new picture, in which the conundrums, including the measurement problem, disappear. It is explained how this new picture might also shed some new light on the cosmological constant problem, the elusive “quantum/general relativity” unification, and additionally the unanswered question of what consciousness is.
- Research Article
- 10.3390/sym17060888
- Jun 5, 2025
- Symmetry
- Ahmed Farag Ali
Quantum field theory (QFT) and general relativity (GR) are pillars of modern physics, each supported by extensive experimental evidence. QFT operates within Lorentzian spacetime, while GR ensures local Lorentzian geometry. Despite their successes, these frameworks diverge significantly in their estimations of vacuum energy density, leading to the cosmological constant problem—a discrepancy where QFT estimates exceed observed values by 123 orders of magnitude. This paper addresses this inconsistency by tracing the cooling evolution of the universe’s gauge symmetries—from SU(3)×SU(2)×U(1) at high temperatures to SU(3) alone near absolute zero—motivated by the experimental Meissner effect. This symmetry reduction posits that SU(3) forms the fundamental “atoms” of vacuum energy. Our analysis demonstrates that the calculated number of SU(3) vacuum atoms reconciles QFT’s predictions with empirical observations, effectively resolving the cosmological constant problem. The third law of thermodynamics, by preventing the attainment of absolute zero, ensures the stability of SU(3) vacuum atoms, providing a thermodynamic foundation for quark confinement. This stability guarantees a strictly positive mass gap defined by the vacuum energy density and implies a Lorentzian quantum structure of spacetime. Moreover, it offers insights into the origins of both gravity/gauge duality and gravity/superconductor duality.
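The "123 orders of magnitude" figure quoted above can be reproduced with a textbook back-of-envelope estimate: a naive QFT vacuum energy density obtained by placing one Planck energy in each Planck volume, compared with the observed dark-energy density of roughly 0.7 times the critical density. The numbers and the Planck-scale cutoff below are the standard assumptions of that estimate, not calculations from this paper.

```python
# Back-of-envelope check of the "123 orders of magnitude" discrepancy:
# naive Planck-cutoff vacuum energy density versus the observed value.
import math

hbar = 1.054571817e-34      # J s
c = 2.99792458e8            # m/s
G = 6.67430e-11             # m^3 kg^-1 s^-2

# Naive QFT estimate: ~ one Planck energy per Planck volume.
E_planck = math.sqrt(hbar * c**5 / G)            # J
l_planck = math.sqrt(hbar * G / c**3)            # m
rho_qft = E_planck / l_planck**3                 # J / m^3

# Observed dark-energy density: ~ 0.7 * critical energy density.
H0 = 2.2e-18                                     # s^-1 (~67 km/s/Mpc)
rho_crit = 3 * H0**2 / (8 * math.pi * G) * c**2  # J / m^3
rho_obs = 0.7 * rho_crit

print(f"rho_QFT ~ {rho_qft:.1e} J/m^3")
print(f"rho_obs ~ {rho_obs:.1e} J/m^3")
print(f"discrepancy ~ 10^{math.log10(rho_qft / rho_obs):.0f}")
```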
- Research Article
- 10.1140/epjc/s10052-025-14332-5
- Jun 2, 2025
- The European Physical Journal C
- Nobuyoshi Komatsu
The first and second laws of thermodynamics should lead to a consistent scenario for discussing the cosmological constant problem. In the present study, to establish such a thermodynamic scenario, cosmological equations in a flat Friedmann–Lemaître–Robertson–Walker universe were derived from the first law, using an arbitrary entropy S_H on a cosmological horizon. Then, the cosmological equations were formulated based on a general formulation that includes two extra driving terms, f_Λ(t) and h_B(t), which are usually used for, e.g., time-varying Λ(t) cosmology and bulk viscous cosmology, respectively. In addition, thermodynamic constraints on the two terms are examined using the second law of thermodynamics, extending a previous analysis (Komatsu in Phys. Rev. D 99:043523, 2019). It is found that a deviation S_Δ of S_H from the Bekenstein–Hawking entropy plays important roles in the two terms. The second law should constrain the upper limits of f_Λ(t) and h_B(t) in our late Universe. The orders of the two terms are likely consistent with the order of the cosmological constant Λ_obs measured by observations. In particular, when the deviation S_Δ is close to zero, h_B(t) and f_Λ(t) should reduce to zero and a constant value (consistent with the order of Λ_obs), respectively, as if a consistent and viable scenario could be obtained from thermodynamics.
- Research Article
- 10.1007/s10714-025-03428-8
- May 27, 2025
- General Relativity and Gravitation
- Roger Eugene Hill
This paper presents the Horizon Model (HM) of cosmology, designed to resolve the cosmological constant problem by equating the vacuum energy density with that of the observable universe. Grounded in quantum information theory, HM proposes the first element of reality emerging from the Big Bang singularity as a Planck-sized qubit. The model views the Big Bang as the opening of a white hole, with spacetime and matter/energy emerging from the event horizon. Using the Schwarzschild solution and the Holographic Principle, HM calculates the number of vacuum qubits needed to equalize densities, and compares this to published estimates of the observable universe's Shannon entropy (S). With this information, HM can calculate the state of the vacuum as a function of S. Results at S = 1 (t = 0) and S = 1.46×10¹⁰⁴ bits (t = now) are presented. At t = 0, the radius of the event horizon is predicted to be ∼10⁻²⁶ m, in good agreement with the ad hoc requirement of the current cosmic inflation paradigm. At t = now, HM predicts Hubble flow within 0.8σ of the Planck collaboration measurement and can resolve the Hubble tension with a small adjustment of the vacuum energy density. HM predictions of the vacuum pressure (∼10⁻¹⁰ Pa) are in good agreement with pressure measurements made on the lunar surface by NASA and the Chinese space program. Aligned with current research on spacetime emerging from surfaces, HM suggests new theoretical directions, potentially leading to a quantum theory of gravity.