Articles published on Soft computing
2075 Search results
- Research Article
- 10.47672/ejt.2859
- Feb 7, 2026
- European Journal of Technology
- Pankaj Verma + 1 more
Purpose: Drilling operations have become more complex due to deeper wells, heterogeneous formations, and the need for cost-effective, time-efficient hydrocarbon production. One of the most important parameters of drilling performance is the rate of penetration (ROP), which directly affects operational efficiency, non-productive time (NPT), and costs. The traditional mechanistic and empirical ROP models, although historically important, cope poorly with nonlinear interactions, dynamic drilling conditions, and heterogeneous lithologies. Moreover, existing reviews lack a structured problem statement that clearly identifies the limitations of standalone ML and classical ROP models under dynamic drilling conditions and the need for hybrid frameworks that improve accuracy, robustness, and real-time applicability. This review addresses that gap by systematically analyzing hybrid ML approaches and their role in drilling optimization. Materials and Methods: Hybrid machine learning (ML) models, which unite the advantages of multiple predictive models to improve accuracy, robustness, and generalization, have redefined drilling optimization. This review synthesizes the literature on hybrid ML applications in ROP prediction across four categories: optimization-integrated, ensemble, soft computing, and physics-informed models. Their methodologies, data requirements, real-time integration, operational problems, and performance relative to standalone ML models are addressed. Findings: Key limitations, including data quality, computational cost, and interpretability, are identified, and future research directions are outlined.
The synthesis offers an organized scheme for understanding the development of hybrid ML models in drilling optimization and outlines opportunities for future progress within current technological limits. Unique contribution to theory, practice and policy: The review shows how hybrid ML models, by combining the strengths of diverse predictive models, improve the accuracy, robustness, and generalization of drilling optimization.
- New
- Research Article
- 10.59256/ijire.20260701006
- Feb 6, 2026
- International Journal of Innovative Research in Engineering
- Charu Jhagrawat
The accuracy and diversity of information flows have become defining characteristics of the modern Big Data age, yet the quality of information about people, organizations, and events remains a critical bottleneck in the deployment of trustworthy machine learning solutions. Despite the exponential growth of data, its quality is often low, most notably in the extreme cases of class imbalance and noise. The conventional hard computing paradigm, based on Aristotelian logic and sharp decision boundaries, often exhibits catastrophic failure modes when faced with such data pathologies. Traditional classifiers tend to make predictions biased towards the majority class while simultaneously overfitting to noisy cases that contaminate the feature manifold. This paper offers a comprehensive, expert-level examination of Soft Computing (SC) methodologies, including Fuzzy Logic, Artificial Neural Networks, and Evolutionary Algorithms, as a solid substitute for dealing with the uncertainty of real-world data. We critically analyze the theoretical foundations of SC, explaining how its tolerance for imprecision and incomplete truth makes it possible to design decision boundaries robust to overlapping class distributions and label noise. Moreover, we introduce a novel Hybrid Soft Computing Framework, REF-DB (Robust Evolutionary-Fuzzy Data Balancing), which jointly exploits the label-noise handling of Fuzzy Logic Filtering and the global optimization potential of Evolutionary Undersampling. The paper reviews state-of-the-art techniques operating at multiple tiers, from FSVM and Genetic Fuzzy Systems to the latest trends involving LLMs for semantic data augmentation and Quantum Soft Computing for high-dimensional feature mapping.
Our work presents thorough comparative experimental evidence drawn from several benchmark datasets, such as KEEL and UCI. We show through comparative analysis that hybrid SC approaches achieve higher values of robust metrics, such as the G-Mean and AUC, outperforming hard computing baselines formed by singular approaches. The study concludes with a forward-looking discussion on the integration of Neuro-Symbolic AI and Quantum Machine Learning, positing that the future of robust data analytics lies in the fusion of evolutionary adaptability and fuzzy reasoning.
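As an illustration of why this line of work reports G-Mean rather than plain accuracy on imbalanced data, here is a minimal sketch of the binary G-Mean (geometric mean of sensitivity and specificity); the labels and data are hypothetical, not taken from the paper:

```python
import math

def g_mean(y_true, y_pred):
    """Geometric mean of sensitivity and specificity (binary labels 0/1).
    Unlike accuracy, it collapses to 0 if either class is ignored."""
    tp = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 1)
    fn = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 0)
    tn = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 0)
    fp = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 1)
    sensitivity = tp / (tp + fn) if (tp + fn) else 0.0
    specificity = tn / (tn + fp) if (tn + fp) else 0.0
    return math.sqrt(sensitivity * specificity)

# A majority-class-only classifier scores 90% accuracy but G-Mean 0:
y_true = [0] * 90 + [1] * 10
y_pred = [0] * 100
print(g_mean(y_true, y_pred))  # 0.0
```

This is why hard-computing baselines that chase accuracy can look deceptively good on imbalanced benchmarks while failing entirely on the minority class.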
- Research Article
- 10.1016/j.asoc.2025.114348
- Feb 1, 2026
- Applied Soft Computing
- Kirti + 2 more
Commentary on “Type-2 intuitionistic fuzzy matrix games based on a new distance measure: Application to Biogas-plant implementation problem” [Applied Soft Computing 106 (2021) 107357]
- Research Article
- 10.1007/s40515-026-00818-6
- Feb 1, 2026
- Transportation Infrastructure Geotechnology
- Jajati Keshari Naik + 2 more
Assessment of Soil Liquefaction Potential Prediction Using Synthetic Data and Soft Computing Techniques
- Research Article
- 10.3390/plants15030428
- Jan 30, 2026
- Plants
- Rubi Quiñones + 3 more
Advancements in phenotyping technologies, including object imaging, high-throughput monitoring, and soft computing, are pivotal for understanding plant responses to environmental stresses. These technologies enable detailed analyses of morphological, physiological, and structural adaptations under abiotic and biotic stresses, such as drought. Current work using multimodal and multi-perspective image processing methods can capture the essential processes that enhance plant resilience and counteract stress by identifying morphological and biochemical indicators. However, the dynamic and complex nature of plant responses poses multiple challenges for generating precise analytics and descriptors of evolving phenotypes. This work introduces analytics for concurrent imaging, adopting the underlying principle of cosegmentation to create taxonomies for new phenotypes. Here, unidimensional refers to the concurrent analysis of multiple images within a single phenotyping dimension: temporal, modal, or perspective, rather than combining information across dimensions. The proposed unidimensional phenotypes integrate concurrent images within individual temporal, modal, or perspective dimensions to capture dynamic morphological and physiological responses that are not observable with conventional single-image or cumulative metrics. Within a high-throughput imagery production system, these phenotypes enable more nuanced quantification of phenotypic changes, leveraging the strengths of simultaneous image analysis to enhance insight into plant adaptations. This workflow aligns with the investigation of plants’ adaptive strategies under abiotic stress and provides quantitative indicators of plant health under adverse environmental conditions.
- Research Article
- 10.59256/ijire.20260701005
- Jan 29, 2026
- International Journal of Innovative Research in Engineering
- Arpit Kumar Prajapati
Real-life data at its core isn't always clean data; in fact, most of the time it is not. We often get data that is noisy, incomplete, or missing important values. I have observed this issue in practice, again and again, when working with practical datasets. Because of this, handling uncertainty becomes a major challenge in data analytics, and we cannot ignore it if we want reliable results. Most traditional machine learning techniques rely on precise input values and do not perform well when the data is uncertain. Soft computing techniques, particularly fuzzy logic, help cope with this issue by basing reasoning on approximate rather than strict rules. This paper proposes a hybrid soft computing approach that combines machine learning and fuzzy logic to better deal with uncertainty in data analysis. In this technique, uncertain information is represented using simple linguistic terms via fuzzy logic, and a Random Forest classifier is used to obtain more accurate predictions. Experiments conducted on a student performance dataset indicate that the proposed hybrid model achieves an accuracy of 85.6%, which is better than the standard machine learning methods. The results show that hybrid soft computing models can handle uncertain data accurately while remaining simple to apply.
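As a rough illustration of the fuzzification step described in this abstract (not the authors' implementation), triangular membership functions can map a numeric score to linguistic terms whose membership degrees are then fed to a downstream classifier such as a Random Forest; the term boundaries below are hypothetical:

```python
def triangular(x, a, b, c):
    """Triangular membership: 0 at a, peak 1 at b, back to 0 at c."""
    if x <= a or x >= c:
        return 0.0
    if x == b:
        return 1.0
    if x < b:
        return (x - a) / (b - a)
    return (c - x) / (c - b)

# Hypothetical linguistic terms for a 0-100 student score (simplified
# triangles; real systems often use shoulder functions at the extremes).
TERMS = {
    "low":    (0, 0, 50),
    "medium": (25, 50, 75),
    "high":   (50, 100, 100),
}

def fuzzify(score):
    """Return the membership degree of `score` in each linguistic term."""
    return {term: triangular(score, *abc) for term, abc in TERMS.items()}

print(fuzzify(60))  # e.g. medium=0.6, high=0.2, low=0.0
```

The resulting membership vector can be appended to (or replace) the raw feature, giving the classifier a smoothed, noise-tolerant representation of the uncertain value.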
- Research Article
- 10.65136/jati.v2i1.300
- Jan 26, 2026
- Journal of Applied Technology and Innovation
- Bryan Mungai Njoroge + 2 more
Recent technological developments and advancements in soft computing (decision support systems) and information technology have paved the way for precision-based agriculture. These trends have enabled the exploitation of modern techniques and tools such as wireless sensor technology, soft computing techniques, and the IoT to improve the economic and environmental sustainability of agricultural production. Precision farming distinguishes itself from traditional farming through the efficient, planned, systematic, and justified use of resources for improved and increased yield. To achieve this, precision farming exploits soft computing tools such as Support Vector Machines (SVM), Fuzzy Logic (FL), Artificial Neural Networks (ANN), and Decision Trees (DT), geographic information systems (e.g., weather patterns), and remote sensing technologies such as Wireless Sensor Networks (WSN) to monitor and predict real-time and future requirements of farm produce for improved food security. This paper reviews the various techniques and technologies employed in precision farming.
- Research Article
- 10.53941/bci.2026.100005
- Jan 20, 2026
- Bulletin of Computational Intelligence
- Ahmed Salih Mohammed + 3 more
This study develops soft-computing models to predict the compressive strength of Fly Ash Composite Foam Concrete (FFC), a lightweight, sustainable cementitious material. A database of 302 experimental records was compiled from previous studies, including wet density, cement content, fly ash content, sand content, water–binder ratio, foam content, and curing age. Five predictive models were evaluated, with the Artificial Neural Network (ANN) achieving the best performance, yielding an accuracy of 98% and the lowest prediction error. Sensitivity analysis identified wet density, cement content, and foam content as the most influential variables. The results demonstrate that soft computing approaches can significantly reduce experimental effort, lower costs, and support the sustainable design of FFC mix ratios for diverse applications.
- Research Article
- 10.1007/s44196-025-01096-9
- Jan 19, 2026
- International Journal of Computational Intelligence Systems
- Wakeel Ahmed + 5 more
A Soft Computing Strategy Combining Fuzzy Analytic Hierarchy and Fuzzy Artificial Neural Networks for Predictive Modeling and Therapeutic Ranking in Bowel Cancer Drug Development
- Research Article
- 10.1016/j.measurement.2025.119306
- Jan 1, 2026
- Measurement
- Ali Shirgir + 3 more
Integrating soft computing and remote sensing for accurate bathymetric mapping in shallow saline lakes: a case study of Lake Urmia
- Research Article
- 10.1109/access.2026.3663648
- Jan 1, 2026
- IEEE Access
- Simona-Vasilica Oprea + 1 more
Explaining Customer Churn Through Unified Perturbation-Based XAI Soft Computing
- Research Article
- 10.70102/afts.2025.1834.698
- Dec 30, 2025
- Archives for Technical Sciences
- S Anbazhagan + 5 more
Keywords: image segmentation, Kapur's entropy, multilevel thresholding, soft computing, optimal thresholds, Jaya algorithm, optimization techniques.
- Research Article
- 10.31449/inf.v49i35.9973
- Dec 16, 2025
- Informatica
- Yang Zhang
With the rapid development of smart cities, efficient and real-time urban landscape management has become an urgent research topic. This paper proposes a Hybrid Soft Computing Framework (HSCF) that combines Fuzzy Logic, an Improved Genetic Algorithm (IGA), and Adaptive Particle Swarm Optimization (APSO) to dynamically optimize urban systems such as lighting and irrigation. By integrating heterogeneous sensor data (e.g., weather, pedestrian flow, and traffic conditions), the framework senses environmental changes and makes optimization decisions in real time. The Fuzzy Logic module handles low-latency adjustments, such as dynamically tuning lighting brightness based on crowd density, achieving response times of less than 100 ms. The IGA performs mid-term optimization of multi-objective landscape layouts (e.g., energy efficiency, aesthetics, and functionality) every 5 minutes, evolving Pareto-optimal solutions through non-dominated sorting and crowding distance analysis with a population size of 50, a crossover rate of 0.8–0.95, and a mutation rate of 0.05–0.15. The APSO continuously refines these solutions using real-time spatio-temporal data, adaptively balancing exploration and exploitation through inertia weight adjustments (ranging from 0.4 to 0.9) and acceleration constants (c₁ = 1.2–1.8, c₂ = 1.2–2.0). Experimental results demonstrate that HSCF outperforms traditional methods (e.g., FLC and PSO), achieving a 16.2%–22.7% reduction in energy consumption and 36.6% water savings in irrigation systems, while maintaining stability under extreme weather and ±20% data noise. Key innovations include dynamic spatio-temporal data fusion, real-time decision-making, and joint fitness evaluation across layers. Future work will focus on scalability and the integration of additional data sources (e.g., UAV-derived 3D maps) to address more complex urban management tasks. This framework provides a replicable, data-driven solution for adaptive smart city landscape management.
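For readers unfamiliar with the APSO component, a minimal particle swarm optimizer with a linearly decreasing inertia weight (0.9 to 0.4, matching the range this abstract reports) might look like the sketch below; the objective, particle count, and acceleration constants are illustrative choices, not the paper's configuration:

```python
import random

def pso(f, dim, n_particles=20, iters=100, w_max=0.9, w_min=0.4,
        c1=1.5, c2=1.7, bounds=(-5.0, 5.0), seed=0):
    """Toy PSO minimizer with a linearly decreasing inertia weight."""
    rng = random.Random(seed)
    lo, hi = bounds
    pos = [[rng.uniform(lo, hi) for _ in range(dim)] for _ in range(n_particles)]
    vel = [[0.0] * dim for _ in range(n_particles)]
    pbest = [p[:] for p in pos]
    pbest_val = [f(p) for p in pos]
    g = min(range(n_particles), key=lambda i: pbest_val[i])
    gbest, gbest_val = pbest[g][:], pbest_val[g]
    for t in range(iters):
        # Inertia decays from w_max to w_min: explore early, exploit late.
        w = w_max - (w_max - w_min) * t / max(iters - 1, 1)
        for i in range(n_particles):
            for d in range(dim):
                r1, r2 = rng.random(), rng.random()
                vel[i][d] = (w * vel[i][d]
                             + c1 * r1 * (pbest[i][d] - pos[i][d])
                             + c2 * r2 * (gbest[d] - pos[i][d]))
                pos[i][d] = min(max(pos[i][d] + vel[i][d], lo), hi)
            val = f(pos[i])
            if val < pbest_val[i]:
                pbest[i], pbest_val[i] = pos[i][:], val
                if val < gbest_val:
                    gbest, gbest_val = pos[i][:], val
    return gbest, gbest_val

best, val = pso(lambda x: sum(v * v for v in x), dim=3)
print(val)  # close to 0 for the sphere function
```

The "adaptive" part of APSO goes further than this linear schedule (adjusting the weight from the swarm's state), but the explore-then-exploit effect of shrinking inertia is the same.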
- Research Article
- 10.1007/s13369-025-10936-x
- Dec 15, 2025
- Arabian Journal for Science and Engineering
- Hadi Fattahi + 2 more
Reliability-Based Assessment of Rock Brittleness Using Hybrid Soft Computing and Probabilistic Approaches
- Research Article
- 10.1007/s11831-025-10462-x
- Dec 6, 2025
- Archives of Computational Methods in Engineering
- Essam H Houssein + 4 more
Human activity recognition (HAR) is a significant and extensively explored area of research within computer vision, yet it faces major challenges including real-world variability, fine-grained discrimination, computational efficiency, and robust multi-modal data fusion. Traditional “hard computing” techniques frequently struggle to cope with the intrinsic imprecision, uncertainty, and ever-changing aspects of human behavior. This study begins with a broad overview of the HAR framework, detailing the distribution of machine learning (ML) and deep learning (DL) in HAR and outlining recent HAR datasets. It then offers a comprehensive overview of the synergistic combination of Soft Computing (SC) paradigms and Multi-Agent Systems (MAS) as a robust strategy to overcome these challenges. The study also presents a new problem-oriented taxonomy that categorizes HAR challenges into three distinct groups: sensing challenges, recognition challenges, and scalability & robustness challenges, and investigates how integrating these two domains yields innovative solutions. The final section outlines the existing challenges within this integrated research domain and highlights potential future directions, encompassing sophisticated neuro-fuzzy fusion techniques, self-organizing multi-agent learning for HAR, and the creation of explainable and resilient HAR systems.
- Research Article
- 10.21821/2309-5180-2025-17-5-653-671
- Dec 6, 2025
- Vestnik Gosudarstvennogo universiteta morskogo i rechnogo flota imeni admirala S. O. Makarova
- I V Yuyukin
This paper explores the possibility of integrating fuzzy set theory with modified piecewise approximations into a unified framework for developing advanced navigation models. The optimal combination of fuzzy logic and spline functions makes it possible to account for uncertainty and inaccuracy in navigation measurements through the application of point interpolation principles. The theoretical basis of the study relies on the fuzzy approximation theorem, which states that any system can be synthesized using fuzzy logic. A practical example is provided, demonstrating the use of fuzzy sets in spline-based trajectory modeling to ensure timely avoidance of restricted navigation areas and to determine optimal trajectory parameters under routing uncertainty. An experiment was conducted to synthesize a complex spline trajectory of a vessel using linguistic variables within fuzzy logic theory. The feasibility of combining spline function methods and fuzzy set compositions was empirically confirmed through the approximation of a smooth trajectory, which increased the speed of soft computing by 15%. The proposed hybrid approach can serve as a mathematical foundation for adaptive fuzzy models designed to predict the trajectories of mobile objects, contributing to the development of unmanned navigation concepts. A paradigm shift is anticipated: from traditional requirements for measurement accuracy based on probabilistic and statistical methods to the fuzzy domain of information granulation. The paper also examines the alternative applicability of fuzzy logic versus probability theory when using membership functions to address non-standard navigation problems. Furthermore, the study investigates the modeling of a watch officer’s decision-making process based on fuzzy logic principles, emphasizing the influence of the human factor on navigational safety in intelligent hybrid systems.
Managing uncertainty in cognitive navigation tasks is viewed as a key aspect of preventing emergencies through the application of fuzzy logic algebra.
- Research Article
- 10.1002/ese3.70391
- Dec 5, 2025
- Energy Science & Engineering
- Lethukuthula Nokwazi Vilakazi + 1 more
ABSTRACT Online or real-time strategies for estimating the gross calorific value (GCV) of coal are still not fully explored in the academic literature, even though both conventional and sophisticated offline methods for estimating the GCV are well described. Soft computing and machine learning models concentrate on offline data, relying on lab-derived inputs rather than continuous sensor data. None of the existing methods of estimating the GCV of coal go into detail about deployment within real-time monitoring systems at coal-fired power plants (CFPP). This study applied a novel approach of using real-time plant data to estimate the GCV of coal by employing computational fluid dynamics (CFD) and mass and energy balance (MEB) modelling to simulate a full-scale coal-fired boiler, since the plant does not currently have enough data to establish a correlation between the GCV of coal and real-time plant data. To estimate the GCV of coal under operating conditions, empirical correlations were established using the CFD and MEB model outputs for the main flue gas constituents: carbon dioxide (CO₂), carbon monoxide (CO), oxygen (O₂), and sulfur dioxide (SO₂). These flue gas constituents were selected as regressors because they are currently measured at the plant, constituting the “real-time plant data” used in this study. The study applied the multilinear regression (MLR) method to establish a correlation between the GCV of coal and the flue gas constituents. MLR might be viewed as a traditional method of establishing correlations, but studies referenced here have shown that MLR provides results comparable to the recent artificial intelligence (AI) tools explored by other researchers to estimate the GCV of coal. The correlations established in this study showed dependable prediction capacity, with a coefficient of determination (R²) of 0.84 and relative errors ≤ 6.2%.
The results provided the groundwork for implementing real‐time GCV estimation techniques in coal‐fired power plants that are currently in operation, which could enhance combustion efficiency monitoring and control.
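The MLR step in this study amounts to ordinary least squares on the flue-gas regressors. A self-contained sketch via the normal equations, using synthetic data rather than the plant's, could look like:

```python
def fit_mlr(X, y):
    """Ordinary least squares via the normal equations (X'X)b = X'y,
    solved by Gaussian elimination with partial pivoting.
    An intercept column of ones is prepended automatically."""
    rows = [[1.0] + list(x) for x in X]
    n = len(rows[0])
    # Build the normal-equation system: A = X'X, b = X'y.
    A = [[sum(r[i] * r[j] for r in rows) for j in range(n)] for i in range(n)]
    b = [sum(r[i] * yi for r, yi in zip(rows, y)) for i in range(n)]
    # Forward elimination.
    for col in range(n):
        piv = max(range(col, n), key=lambda r: abs(A[r][col]))
        A[col], A[piv] = A[piv], A[col]
        b[col], b[piv] = b[piv], b[col]
        for r in range(col + 1, n):
            f = A[r][col] / A[col][col]
            for c in range(col, n):
                A[r][c] -= f * A[col][c]
            b[r] -= f * b[col]
    # Back substitution.
    coef = [0.0] * n
    for i in reversed(range(n)):
        coef[i] = (b[i] - sum(A[i][j] * coef[j] for j in range(i + 1, n))) / A[i][i]
    return coef  # [intercept, b1, b2, ...]

# Synthetic example: y = 2 + 3*x1 - 1*x2 (illustrative, not plant data).
X = [(1, 2), (2, 1), (3, 5), (4, 2), (5, 7), (6, 1)]
y = [2 + 3 * x1 - x2 for x1, x2 in X]
print([round(c, 6) for c in fit_mlr(X, y)])  # [2.0, 3.0, -1.0]
```

In the study's setting, X would hold the measured CO₂, CO, O₂, and SO₂ values and y the GCV targets from the CFD/MEB simulations; a production implementation would typically use a numerically stabler QR or SVD solver instead of the normal equations.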
- Research Article
- 10.1080/19392699.2025.2593959
- Dec 4, 2025
- International Journal of Coal Preparation and Utilization
- Abiodun Ismail Lawal + 3 more
ABSTRACT The reliability of several empirical models for predicting the higher heating value (HHV) from proximate analysis is evaluated using a total of 147 coal samples collected from the Witbank coalfields in South Africa. The most statistically reliable model is selected among the empirical models using the nonparametric Mann–Whitney p-value, after subjecting the models to normality tests using the Shapiro–Wilk, Lilliefors, and Anderson–Darling methods. The selected model is then enhanced using an artificial neural network (ANN) to improve its predictive performance and extrapolation capability. The most statistically reliable model is compared with the measured HHV and the ANN-predicted HHV. The coefficient of determination (R²) between the selected model and the measured HHV is 0.9482, while that between the selected model and the ANN output is 0.9625. A direct comparison between the ANN predictions and the measured HHV yields a higher R² value of 0.9868. A new hybrid equation is developed by integrating the selected empirical model with the ANN and further optimized using the least squares method via Newton's iterative approach. To validate the models, an additional 50 datasets from the existing literature are used. The ANN model gives a very low R² value of 0.2921 for the literature data, indicating low extrapolation potential, while the new hybrid model gives an R² value of 0.9372.
- Addendum
- 10.1016/j.sasc.2025.200363
- Dec 1, 2025
- Systems and Soft Computing
- Pritpal Singh + 1 more
Erratum to “Multi-criteria group decision-making using ambiguous sets, Weibull distribution, and aggregation operators: A case study in optimal vendor selection for office supplies” [Systems and Soft Computing, Volume 7, December 2025, 200283]
- Research Article
- 10.33889/ijmems.2025.10.6.098
- Dec 1, 2025
- International Journal of Mathematical, Engineering and Management Sciences
- Abhinav Saxena + 7 more
Electric vehicles (EVs) are among the best replacements for conventional vehicles due to their environmentally friendly nature. Poor state-of-charge (SOC, in %) control and the large settling time and peak overshoot of speed are research gaps that need to be addressed in light of continuous battery degradation. The paper demonstrates the conceptual design and implementation of a solar-powered electric vehicle that uses soft computing methods for smart control. Implementing dynamic solar-powered EV charging stations that combine smart control and soft computing methods requires careful optimal design, charging economics, and regular upkeep. Nonetheless, such stations can offer an environmentally friendly, long-term EV charging option and may reduce ongoing operational costs. Because electric vehicles emit no pollutants, this work also supports global efforts to reduce greenhouse gas emissions and combat climate change. In this paper, electric vehicle charging is assessed at various voltage levels, and the vehicle is driven by a DC motor. In addition to the state of charge, the speed of the electric vehicle is analyzed. To attain smooth operation of both speed and SOC, an objective function is developed and then optimized using an artificial neural network (ANN) and a genetic algorithm (GA). It is observed that SOC (%) shows better and smoother performance with the GA than with the ANN and existing methods. The settling time and peak overshoot of the speed are also substantially improved with the GA (2.5 s, 2%) and the ANN (3.1 s, 2.7%).
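A genetic algorithm of the kind used in such studies to optimize a control objective can be sketched as a toy real-coded GA; the selection, crossover, and mutation choices below are illustrative assumptions, not the paper's implementation, and the quadratic objective stands in for a real settling-time/overshoot cost:

```python
import random

def genetic_minimize(f, dim, pop_size=40, gens=80, p_cross=0.9,
                     p_mut=0.2, bounds=(-5.0, 5.0), seed=1):
    """Toy real-coded GA: tournament selection, blend crossover,
    Gaussian mutation, with elitism preserving the best individual."""
    rng = random.Random(seed)
    lo, hi = bounds
    pop = [[rng.uniform(lo, hi) for _ in range(dim)] for _ in range(pop_size)]
    def clamp(v):
        return min(max(v, lo), hi)
    def tournament():
        a, b = rng.sample(pop, 2)
        return a if f(a) < f(b) else b
    best = min(pop, key=f)
    for _ in range(gens):
        new = [best[:]]  # elitism: carry the best individual forward
        while len(new) < pop_size:
            p1, p2 = tournament(), tournament()
            if rng.random() < p_cross:
                alpha = rng.random()  # blend crossover between parents
                child = [alpha * a + (1 - alpha) * b for a, b in zip(p1, p2)]
            else:
                child = p1[:]
            if rng.random() < p_mut:
                i = rng.randrange(dim)
                child[i] = clamp(child[i] + rng.gauss(0, 0.3))
            new.append(child)
        pop = new
        best = min(pop, key=f)
    return best, f(best)

# Illustrative objective: penalty around hypothetical target gains (1.0, 2.0).
best, val = genetic_minimize(lambda g: (g[0] - 1) ** 2 + (g[1] - 2) ** 2, dim=2)
print(best, val)
```

In a controller-tuning setting, the individual would encode controller gains and the fitness would come from simulating the closed loop and scoring settling time and overshoot, which is far more expensive per evaluation than this toy objective.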