Articles published on Automation Techniques
- Research Article
- 10.1016/j.bone.2025.117577
- Nov 1, 2025
- Bone
- J Shepherd + 6 more
Statistical shape modelling of the first carpometacarpal joint: A cross-sectional analysis of an osteoarthritis initiative cohort.
- Research Article
- 10.1186/s12938-025-01448-8
- Oct 9, 2025
- BioMedical Engineering OnLine
- Isiah Mejia + 6 more
Ultrasound (US) imaging is the primary choice for diagnosing and triaging patients on the battlefield as well as in emergency medicine due to its portability and low power requirements. Acquisition and interpretation of ultrasound images can be challenging and require personnel with specialized training. Incorporating artificial intelligence (AI) can enhance the imaging process while improving diagnostic accuracy. To accomplish this goal, we have developed a full-torso tissue-mimicking phantom that simulates US image capture at each site of the extended focused assessment with sonography for trauma (eFAST) exam and is suitable for developing AI guidance and classification models. The US images taken from the phantom were used to train AI models for detection of specific anatomical features and injury-state diagnosis. The tissue-mimicking phantom successfully simulated full thoracic motion as well as modular injuries at each scan site. AI models trained on the phantom images achieved IoUs greater than 0.80 and an accuracy of 71.5% on blind inferences. In summary, the tissue-mimicking phantom is a reliable tool for acquiring eFAST images for training AI models. Furthermore, it could be used for training personnel in ultrasound examination techniques as well as for developing image-acquisition automation techniques.
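For readers unfamiliar with the IoU metric cited above, intersection-over-union for axis-aligned bounding boxes can be computed as follows. This is an illustrative sketch (corner-format boxes are an assumption), not the authors' evaluation code:

```python
def iou(box_a, box_b):
    # Boxes as (x1, y1, x2, y2) corner coordinates.
    ix1 = max(box_a[0], box_b[0])
    iy1 = max(box_a[1], box_b[1])
    ix2 = min(box_a[2], box_b[2])
    iy2 = min(box_a[3], box_b[3])
    # Intersection area is zero when the boxes do not overlap.
    inter = max(0, ix2 - ix1) * max(0, iy2 - iy1)
    area_a = (box_a[2] - box_a[0]) * (box_a[3] - box_a[1])
    area_b = (box_b[2] - box_b[0]) * (box_b[3] - box_b[1])
    union = area_a + area_b - inter
    return inter / union if union else 0.0
```

A predicted box meeting the reported 0.80 threshold would return at least 0.80 against its ground-truth box here.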
- Research Article
- 10.52711/2231-5713.2025.00060
- Oct 8, 2025
- Asian Journal of Pharmacy and Technology
- Shalini Shalini + 5 more
For over-the-counter (OTC) formulations to be safe, effective, and compliant with regulations, the active pharmaceutical ingredients (APIs) must be precisely quantified. UV spectrophotometry stands out among analytical methods because it is easy to use, reasonably priced, and appropriate for routine quality control. This paper examines the principles and use of the calibration curve approach in UV spectrophotometry for the quantitative assay of vitamin C, aspirin, and paracetamol. It explores the theoretical underpinnings, sample preparation techniques, validation procedures, and comparisons of commercial formulations. The method's benefits and drawbacks are thoroughly examined, as are upcoming developments in pharmaceutical analysis, including automation, chemometrics, and green chemistry techniques. This review emphasizes the reliability and usefulness of UV spectrophotometric analysis in pharmaceutical quality control, particularly for popular over-the-counter drugs.
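The calibration curve approach reviewed above reduces to a linear least-squares fit of absorbance against concentration (the Beer-Lambert law predicts linearity), then inverting the fit for an unknown sample. A minimal sketch with made-up absorbance values, not data from the paper:

```python
# Hypothetical calibration standards: concentration (µg/mL) vs absorbance.
conc = [2.0, 4.0, 6.0, 8.0, 10.0]
absorb = [0.11, 0.21, 0.30, 0.41, 0.50]

# Least-squares fit of A = m*C + b.
n = len(conc)
mean_c = sum(conc) / n
mean_a = sum(absorb) / n
m = sum((c - mean_c) * (a - mean_a) for c, a in zip(conc, absorb)) \
    / sum((c - mean_c) ** 2 for c in conc)
b = mean_a - m * mean_c

# Quantify an unknown sample from its measured absorbance.
unknown_conc = (0.35 - b) / m
```

In practice the fit's linearity (e.g. its correlation coefficient) would be checked during method validation before the curve is used for assay.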
- Research Article
- 10.1177/09596518251350353
- Sep 3, 2025
- Proceedings of the Institution of Mechanical Engineers, Part I: Journal of Systems and Control Engineering
- Anselmo Parnada + 4 more
Reinforcement Learning (RL) has been considered a promising method to enable the automation of contact-rich manipulation tasks, which can increase capabilities for industrial automation. RL facilitates autonomous agents' learning to solve environments with complex dynamics with little human intervention, making it easier to implement control strategies for contact-rich tasks compared to traditional control approaches. Further, RL-based robotic control has the potential to transfer policies between task variations, significantly improving scalability compared to existing methods. However, RL is currently not viable for wider adoption due to its relatively high implementation costs and safety issues, so current research has focused on addressing these barriers. This paper comprehensively reviewed recently developed techniques to improve cost and safety for RL in contact-rich robotic manipulation. Techniques were organized by their approach, and their impact was analysed. It was found that current research efforts have significantly improved the cost and safety of RL-based control for contact-rich tasks, but further improvements can be made by progressing research towards improving knowledge transfer between tasks, improving inter-robot policy transfer, and facilitating real-world and continual RL. The identified directions for further research set the stage for more versatile and cost-effective RL-based control for contact-rich robotic manipulation in future industrial automation applications.
- Research Article
- 10.1016/j.brainres.2025.149704
- Sep 1, 2025
- Brain research
- A Sumithra + 3 more
Improving brain tumor diagnosis: A self-calibrated 1D residual network with random forest integration.
- Research Article
- 10.1177/14727978251361831
- Jul 27, 2025
- Journal of Computational Methods in Sciences and Engineering
- Qiucheng Ban + 1 more
Reliable grid management is critical for the efficiency and stability of electrical transmission and distribution systems. As grid systems become more sophisticated and complex, there is an increasing need for digital asset management and automation to make operations more efficient. Many standard grid management techniques rely on manual physical inspection and repair of assets; these methods are time-consuming and raise operational costs. The proposed project improves grid management by combining digital asset feature identification with automation, promoting greater operational efficiency through real-time monitoring of decision-making processes. The proposed process identifies digital maintenance features on grid assets, including transformers, electricity lines, and meters, using IoT sensors. To standardize the input values, the data is pre-processed with procedures such as z-score normalization. Features including voltage, temperature, and current are extracted using the Discrete Wavelet Transform (DWT). An Attention-based Bidirectional Gated Recurrent Unit with Grid Search Optimization (AttenBi-GRU-GSO) method is used to analyze the processed data to predict faults and optimize performance. The AttenBi-GRU-GSO combines an attention mechanism to focus on critical features, a Bi-GRU to capture sequential dependencies from both past and future, and GSO to fine-tune the model's hyperparameters for optimal performance, ensuring efficient and accurate fault detection and prediction in grid management. Experimental results demonstrate that the AttenBi-GRU-GSO method achieves superior performance, with an accuracy of 95.31%, an MAE loss of 0.11, a total loss of 0.07, and an RMSE loss of 0.03. Automated systems respond to detected problems with remedial actions, including rerouting power and scheduling maintenance.
The automation increases the accuracy and speed of fault detection and reduces downtime and maintenance costs. Digital asset feature identification and automation enhance grid management, lower operating costs, and increase total electrical network reliability. The proposed technology provides a scalable platform for modern grid operations.
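The preprocessing steps named in the abstract (z-score normalization followed by a Discrete Wavelet Transform) can be sketched as follows. The Haar wavelet, the single decomposition level, and the synthetic voltage readings are illustrative assumptions, since the abstract does not specify them:

```python
import math

def zscore(xs):
    # Standardize readings to zero mean, unit variance.
    mean = sum(xs) / len(xs)
    std = math.sqrt(sum((x - mean) ** 2 for x in xs) / len(xs))
    return [(x - mean) / std for x in xs]

def haar_dwt(xs):
    # One-level Haar DWT: pairwise sums (approximation) and differences
    # (detail), scaled by 1/sqrt(2) so signal energy is preserved.
    approx = [(xs[i] + xs[i + 1]) / math.sqrt(2) for i in range(0, len(xs), 2)]
    detail = [(xs[i] - xs[i + 1]) / math.sqrt(2) for i in range(0, len(xs), 2)]
    return approx, detail

# Synthetic voltage readings (illustrative only, not the paper's data).
volts = [230.1, 229.8, 230.4, 231.0, 229.5, 230.2, 230.0, 229.9]
approx, detail = haar_dwt(zscore(volts))
```

The approximation coefficients capture the slow trend and the detail coefficients capture rapid fluctuations, which is what makes DWT output useful as input features for a sequence model such as a Bi-GRU.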
- Research Article
- 10.22399/ijcesen.3564
- Jul 23, 2025
- International Journal of Computational and Experimental Science and Engineering
- Sharmin Sultana + 7 more
Big data analytics, as used in defense, is the capacity to gather vast amounts of digital data for analysis, visualization, and decision-making that might aid in anticipating and preventing cyberattacks. When combined with security technologies, it improves an organization's position in terms of cyber defense, enabling companies to identify behavioral patterns that point to network dangers. With its potent capabilities to tackle the increasing scope, variety, and complexity of cyberthreats, big data analytics has become a disruptive force in contemporary cybersecurity. Traditional data processing methods fall short in managing the massive volumes, varieties, and velocities (3Vs) characteristic of big data. This paper explores the foundational principles of big data analytics, including its core dimensions and key application areas such as healthcare, transportation, finance, education, and social media. The study further investigates the classification of cyberattacks, including malware, phishing, ransomware, and advanced persistent threats (APTs), and their evolving complexity due to AI-powered automation, IoT proliferation, and multi-vector intrusion techniques. It highlights how crucial big data is to supporting real-time threat detection, predictive modelling, and automated incident response. Techniques such as behavioral analysis, threat intelligence integration, and anomaly detection are examined for their effectiveness in identifying sophisticated attacks like polymorphic malware and zero-day exploits. Ultimately, this paper highlights how big data analytics enhances cybersecurity capabilities by delivering predictive, prescriptive, diagnostic, and cyber-specific insights that empower proactive threat mitigation and ensure digital resilience.
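One of the anomaly-detection techniques mentioned above can be illustrated with a rolling z-score detector over a metric stream (e.g. requests per second). This is a generic sketch, not the paper's method; the window size and threshold are arbitrary choices:

```python
from collections import deque

def detect_anomalies(stream, window=20, z_cut=3.0):
    # Flag readings that deviate from the rolling mean by > z_cut sigmas.
    buf = deque(maxlen=window)
    flags = []
    for v in stream:
        if len(buf) >= 5:  # need a few samples before judging
            mean = sum(buf) / len(buf)
            var = sum((x - mean) ** 2 for x in buf) / len(buf)
            std = var ** 0.5
            flags.append(std > 0 and abs(v - mean) > z_cut * std)
        else:
            flags.append(False)
        buf.append(v)
    return flags
```

Real deployments layer detectors like this with threat-intelligence context so that a statistical spike is only escalated when it correlates with known attack indicators.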
- Research Article
- 10.1080/00223131.2025.2532066
- Jul 19, 2025
- Journal of Nuclear Science and Technology
- Merouane Najar + 1 more
The integration of artificial intelligence (AI) into nuclear safety systems holds significant promise for enhancing the safety and reliability of nuclear power plants (NPPs). However, the complex structures of certain algorithms pose challenges for operators in trusting model predictions. Therefore, model interpretability becomes essential to advancing the application of AI in the field of nuclear engineering. In this sense, the present study classifies a wide range of nuclear reactor accidents based not only on algorithmic performance but also on model transparency. The data were collected using an automation technique, demonstrating its effectiveness in reducing time consumption, optimizing data quality, and improving hardware resource utilization. Furthermore, various models were developed and optimized. The SHAP (Shapley Additive Explanation) technique was employed to explain the decision-making processes of these models. The results indicate that, among the investigated models, the Support Vector Machine (SVM) exhibited the best performance. Moreover, the application of Explainable AI (XAI) techniques proved effective in providing deep interpretability for models such as Random Forest (RF) and Artificial Neural Networks (ANN), offering insights into the reasons behind incorrect predictions.
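As a rough, model-agnostic stand-in for the explainability analysis described above (the study itself uses SHAP), permutation importance measures how much accuracy drops when one feature's values are shuffled. The toy classifier and data here are illustrative, not from the paper:

```python
import random

def permutation_importance(predict, rows, labels, feature, seed=0):
    # Accuracy drop after shuffling one feature column: features the
    # model relies on show a large drop; ignored features show none.
    base = sum(predict(r) == y for r, y in zip(rows, labels)) / len(rows)
    rng = random.Random(seed)
    column = [r[feature] for r in rows]
    rng.shuffle(column)
    shuffled = [r[:feature] + [v] + r[feature + 1:] for r, v in zip(rows, column)]
    perm = sum(predict(r) == y for r, y in zip(shuffled, labels)) / len(rows)
    return base - perm

# Toy classifier that only looks at feature 0 (illustrative data).
predict = lambda row: 1 if row[0] > 0.5 else 0
rows = [[0.0, 5.0], [1.0, 7.0], [0.0, 2.0], [1.0, 9.0]] * 5
labels = [predict(r) for r in rows]
```

SHAP goes further by attributing each individual prediction to feature contributions, but the intuition is the same: quantify how much each input drives the model's output.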
- Research Article
- 10.20965/ijat.2025.p0553
- Jul 5, 2025
- International Journal of Automation Technology
- Atsushi Yamashita + 2 more
The demand for sensing in robotics and automation has increased due to the decrease in the labor force. Recent advances in computational performance have advanced the widespread use of image processing technology in various applications. This special issue aims to provide researchers with an opportunity to access the latest research and case studies on advanced image processing, computer vision, and sensing techniques for robotics and automation. The topics of interest in this special issue are as follows: 1) Theory and algorithms: Image processing, computer vision, pattern recognition, object detection, image understanding, media understanding, machine learning, deep learning, 3D measurement, simultaneous localization and mapping (SLAM), multispectral image processing, visualization, virtual reality (VR) / augmented reality (AR) / mixed reality (MR), and datasets for image processing; 2) Industrial applications: Factory automation, machine vision, visual inspection, monitoring, surveying, logistics; 3) Sensing techniques for robotics and automation: Robot vision, advanced driver-assistance systems (ADAS), autonomous driving, robotic picking, assembly, and palletizing; 4) Image processing hardware and software: Image acquisition devices, image sensors, image processing systems, sensor information processing; 5) Man-machine interface: Visualization, human interface devices. This special issue features 20 research articles that highlight the latest advancements in advanced image processing techniques for robotics and automation (Part 1: 10 articles, Part 2: 10 articles). We extend our heartfelt gratitude to all the contributors, reviewers, and editorial staff for their dedication and support in realizing this special issue.
- Research Article
- 10.36349/easjecs.2025.v08i03.001
- Jul 5, 2025
- East African Scholars Journal of Engineering and Computer Sciences
- Usiade Usiade + 3 more
This research work aims to mitigate the manual, low-yield approach of local groundnut oil extraction, which generally affects both the quantity and the quality of the extracted oil. Studies carried out during this research revealed that the optimum temperature for groundnut oil extraction is 90°C. This temperature at the preparatory heating chamber was controlled by adopting an automation technique in which a PID-based microcontroller was deployed. A model of the preparatory heating chamber and a PID controller were developed and deployed. The models were validated through simulation by varying the temperature parameters of the PID controller in order to achieve a desirable transient response of the system. After several simulations, a set of optimal parameters was obtained that yielded a commendable improvement, thus improving the robustness and stability of the system.
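The PID temperature loop described above can be sketched as a discrete controller driving a first-order thermal model toward the 90°C setpoint. The gains, plant constants, and anti-windup choice are illustrative assumptions, not values from the paper:

```python
def simulate(kp=8.0, ki=0.5, kd=1.0, dt=0.1, steps=3000):
    # Discrete PID loop around a first-order heating-chamber model.
    setpoint, temp, ambient = 90.0, 25.0, 25.0
    integral, prev_err = 0.0, setpoint - temp
    for _ in range(steps):
        err = setpoint - temp
        deriv = (err - prev_err) / dt
        u = kp * err + ki * integral + kd * deriv
        power = max(0.0, min(100.0, u))      # actuator saturates at 0-100%
        if u == power:
            integral += err * dt             # anti-windup: only integrate unsaturated
        # First-order plant: heating from the element vs. loss to ambient.
        temp += dt * (0.05 * power - 0.02 * (temp - ambient))
        prev_err = err
    return temp
```

Sweeping `kp`, `ki`, and `kd` in such a simulation and inspecting rise time, overshoot, and settling time is the kind of tuning-by-simulation workflow the abstract describes.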
- Research Article
- 10.58346/jowua.2025.i2.039
- Jun 30, 2025
- Journal of Wireless Mobile Networks, Ubiquitous Computing, and Dependable Applications
- Zaed Balasm + 5 more
Smart ubiquitous agriculture is a domain of industry that encompasses farming requiring profound monitoring, advanced technology, wireless data communication, and exceptional data analysis. The increasing interaction level with devices and automatic machines within agriculture creates new data challenges and requirements for decision support systems. My aim, within this paper, is to target the agricultural decision process using the Internet of Things (IoT), machine learning, and advanced predictive techniques. For this, I identify and examine the data acquisition methods, analytical tools, and decision-making devices that enable farmers and livestock managers to make the right choices and informed decisions on what actions to take. Using the selected intelligent systems, technologies, and automation techniques, I intend to demonstrate how these technologies can optimize the quantities and quality of output from agricultural undertakings while minimizing resource utilization. In addition, I highlight the unsolved problems of data redundancy, data protection, and manipulation, and dimensions of the issues associated with integrating information technology into smart farming, which need to be specified for decision-making within the farming environment. The studies and conclusions outlined in this paper prove that value-added information frameworks pave the road to agricultural planning and monitoring 2.0, which will enable food system production that is more responsive, exact, and environmentally sound.
- Research Article
- 10.3390/life15071004
- Jun 24, 2025
- Life
- Muhammed Nurullah Arslan + 3 more
This study examined the behavioral responses of Nile Tilapia (Oreochromis niloticus), a key aquaculture species, to ammonia stress using non-invasive image processing techniques. The experiment was conducted under controlled laboratory conditions and involved four groups exposed to ammonium chloride concentrations (0, 100, 200, and 400 mg·L−1). Movement trajectories of individual fish were recorded over 10 h using high-resolution cameras positioned above and beside glass tanks. Images were processed with the Optical Flow Farneback algorithm in Python, implemented in Visual Studio Code with OpenCV and NumPy libraries, achieving a 91.40% accuracy rate in tracking fish positions. The results revealed that increasing ammonia levels restricted movement areas while elevating movement irregularity and activity. The 0 mg·L−1 group utilized the glass tank homogeneously, covering 477 m. In contrast, the 100 mg·L−1 group showed clustering in specific areas (796 m). At 200 mg·L−1, clustering intensified, particularly along the glass tank's left edge (744 m), and at 400 mg·L−1, fish exhibited severe restriction near the water surface with markedly increased activity (928 m). Statistical analyses using Kruskal–Wallis and Dunn tests confirmed significant differences between the 400 mg·L−1 group and the others. No difference was observed between the 0 mg·L−1 and 100 mg·L−1 groups, indicating tolerance to lower concentrations. The study highlights the importance of ammonia levels in water quality management and reveals the potential of image processing techniques for automation and stress monitoring in aquaculture.
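The per-group distances reported above (477 m, 796 m, and so on) come from integrating tracked positions over time. That step can be sketched as a cumulative path length over centroid coordinates; the synthetic track and scale factor below are illustrative, not the study's data:

```python
import math

def path_length(points, metres_per_pixel=1.0):
    # Sum Euclidean distances between consecutive tracked centroids.
    total = 0.0
    for (x0, y0), (x1, y1) in zip(points, points[1:]):
        total += math.hypot(x1 - x0, y1 - y0)
    return total * metres_per_pixel

# Synthetic track (pixel coordinates) standing in for tracker output.
track = [(0, 0), (100, 0), (100, 100), (0, 100), (0, 0)]
```

In the actual pipeline the centroids would come from the Farneback optical-flow field, and `metres_per_pixel` would be obtained by calibrating the camera against the known tank dimensions.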
- Research Article
- 10.1088/1758-5090/addb7e
- Jun 3, 2025
- Biofabrication
- Chiara Formica + 4 more
Chronic kidney disease affects 10% of the global population and often progresses to end-stage renal disease, where dialysis or renal transplant are the only therapies, though neither is a permanent solution. Regenerative medicine, particularly the use of organoids, offers a potential solution. Organoids are valuable for studying organ development, diseases, and regeneration, and are suitable for drug screening. However, their limited ability to replicate adult organs' maturation, complexity, and functions restricts their application. Additionally, manual production of organoids causes variability, affecting scalability and reproducibility. Automation techniques like bioprinting could enhance organoid maturation and complexity by depositing cells and biomaterials in a controlled manner. In this study, we established differentiation protocols to obtain human induced pluripotent stem cell-derived metanephric mesenchyme and ureteric bud progenitors, and the combination of these was used to form organoids. A microfluidic bioprinter capable of producing core-shell filaments was used to bioprint single-cell progenitors in combination with gelatin in the core, wrapped with an alginate shell. These filament constructs were cultured with an optimized mixture of growth factors for two weeks. Within one week, renal vesicles were visible, and two weeks post-bioprinting the kidney organoids were functional and responded to the nephrotoxic drug doxorubicin. In conclusion, a bioprinting method was developed to generate functional renal organoids from progenitors in an automated way, offering a foundation for future kidney disease treatment.
- Research Article
- 10.51408/1963-0133
- Jun 1, 2025
- Mathematical Problems of Computer Science
- Agit Atashyan + 3 more
Drone technology has enabled major advancements in autonomous systems, particularly in swarm robotics. This paper presents a novel automation technique aimed at enhancing the efficiency, adaptability, and robustness of self-organizing drone swarms. The system uses decentralized control algorithms and robust communication protocols to enable real-time adaptive learning and decision-making among drones. Each drone acts as an autonomous agent, adjusting its behavior based on environmental inputs and interactions with other drones. A hybrid communication model blending peer-to-peer and cluster-based protocols ensures effective information sharing and coordination. To build a scalable and resilient architecture, multi-agent systems theory is integrated with advanced self-organizing strategies. Extensive modeling and real-world testing evaluated the system's performance in complex scenarios such as disaster response, environmental monitoring, and surveillance. Results demonstrate significant improvements in swarm efficiency, resilience to failures, and adaptability to dynamic environments. The incorporation of adaptive learning algorithms further optimized task allocation and execution in real time. This work represents a substantial advancement in autonomous aerial robotics, offering a comprehensive framework for deploying intelligent, self-organizing drone swarms and highlighting the transformative potential of automata-based approaches in future autonomous systems.
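The decentralized coordination described above can be illustrated with an averaging consensus rule, in which each drone repeatedly nudges its state toward the mean of its neighbors' states. The ring topology, coupling constant, and altitude values below are illustrative, not the paper's algorithm:

```python
def consensus_step(states, adjacency, alpha=0.2):
    # One round of decentralized averaging: each agent moves a fraction
    # alpha of the way toward the mean state of its neighbors.
    new = []
    for i, s in enumerate(states):
        nbrs = [states[j] for j in adjacency[i]]
        avg = sum(nbrs) / len(nbrs)
        new.append(s + alpha * (avg - s))
    return new

# Ring of four drones agreeing on a common altitude (illustrative values).
adj = {0: [1, 3], 1: [0, 2], 2: [1, 3], 3: [2, 0]}
alts = [10.0, 20.0, 30.0, 40.0]
```

No drone needs global knowledge: each step uses only neighbor states, yet repeated application drives the whole swarm to agreement, which is the core appeal of decentralized control.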
- Research Article
- 10.17485/ijst/v18si1.icamada27
- May 30, 2025
- Indian Journal Of Science And Technology
- K V Mahalakshmi + 4 more
Objectives: To apply computer vision techniques in healthcare automation and analysis and assist medical professionals. Methods: The proposed work encapsulates computer vision techniques such as a traditional contour-detection technique, machine learning-based computer vision techniques, and clustering techniques. The machine learning (ML)-based computer vision techniques include Support Vector Machine (SVM) and Logistic Regression (LR). The methodology uses magnetic resonance imaging scans of brain, ovarian, and hepatic tumors, focusing on parameters like region of interest and classification accuracy to determine the behavior of the algorithms. The traditional method, built on thresholding and contour detection, successfully detected the tumor region and created a bounding box to showcase it. Findings: The ML-based computer vision uses classification algorithms such as the Support Vector Classifier (SVC) and Logistic Regression (LR), which classify the test images into their respective tumor subtypes. After classification, by leveraging thresholding and contour-detection techniques, the region of interest (ROI), namely the tumor region, is detected with a bounding box. SVC attains an accuracy of 86.3% and Logistic Regression attains an accuracy of 82%. Clustering algorithms such as K-Means successfully detected glioma tumor regions in cancer image datasets including hepatic and ovarian cancer. Novelty: The proposed methodology classifies tumor images into their respective tumor subgroups while simultaneously highlighting the tumor-affected area with computer vision techniques like bounding boxes.
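The thresholding and bounding-box step described in the methods can be sketched without an imaging library. The toy array standing in for an MRI slice and the threshold value are illustrative:

```python
def bounding_box(image, threshold):
    # Binarize the image, then take the extent of above-threshold pixels.
    coords = [(x, y) for y, row in enumerate(image)
              for x, v in enumerate(row) if v > threshold]
    if not coords:
        return None
    xs = [x for x, _ in coords]
    ys = [y for _, y in coords]
    return (min(xs), min(ys), max(xs), max(ys))  # (x1, y1, x2, y2)

# Toy grayscale 'scan' with a bright patch standing in for a tumor.
scan = [[0.0] * 64 for _ in range(64)]
for y in range(20, 30):
    for x in range(35, 50):
        scan[y][x] = 1.0
```

A contour-detection library would additionally separate multiple disconnected regions; this sketch returns the single extent of all above-threshold pixels.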
- Research Article
- 10.20428/jst.v30i6.2946
- May 30, 2025
- Journal of Science and Technology
- Hebah Marza + 5 more
Pneumatic systems are extensively utilized in industrial applications due to their cleanliness, reliability, and the ease with which compressed air can be generated from the environment. These characteristics make them particularly advantageous in hygiene-sensitive industries such as food processing and packaging, where they are often favored over hydraulic alternatives. As the demand for automation continues to grow, systems like the Vertical Form Fill Seal (VFFS) machine have become essential for performing forming, filling, and sealing operations with minimal human intervention. This paper presents the experimental design and implementation of a simplified, automated pneumatic sealing system based on the VFFS concept. The system's main controller is a Programmable Logic Controller (PLC), integrated with a Human-Machine Interface (HMI) for real-time monitoring and human interaction. The control logic is built from ladder diagrams, timers, sensor feedback, and a stroke counter to ensure synchronized, repeatable operation. The study demonstrates the feasibility of using accessible automation techniques to simulate industrial packaging processes, highlighting their potential for educational and practical applications in modern manufacturing environments.
- Research Article
- 10.61778/ijmrast.v3i5.131
- May 28, 2025
- International Journal of Multidisciplinary Research in Arts, Science and Technology
- Parankush Koul + 1 more
The study examines how machine learning (ML) methods can be incorporated into production engineering practices. The paper highlights data preprocessing and cleaning as essential steps to maintain data quality and reliability for ML applications. The review identifies production-environment challenges including missing data values, outliers, and data inconsistencies. It explains how advanced automation techniques decrease human involvement while improving feature extraction methods, producing uniform features across different manufacturing systems. The paper emphasizes that effective model deployment relies on rigorous data engineering pipelines that perform comprehensive data ingestion, transformation, and feature engineering. The review explores existing ML applications within production engineering while identifying key practices that enable model readiness and reliability.
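The cleaning steps discussed above (imputing missing values, removing outliers) can be sketched as median imputation plus a robust z-score filter. The MAD-based rule and the 3.5 cutoff are one common choice, not necessarily the review's:

```python
from statistics import median

def clean(values, z_cut=3.5):
    # Impute missing readings (None) with the median of observed values.
    observed = [v for v in values if v is not None]
    med_obs = median(observed)
    filled = [med_obs if v is None else v for v in values]
    # Drop outliers via a robust z-score based on the median absolute
    # deviation (MAD); assumes the data is not constant (MAD > 0).
    med = median(filled)
    mad = median(abs(v - med) for v in filled)
    return [v for v in filled if abs(0.6745 * (v - med) / mad) < z_cut]
```

Median and MAD are preferred here over mean and standard deviation because a single extreme sensor reading would otherwise inflate the spread estimate enough to mask itself.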
- Research Article
- 10.1038/s41598-025-03056-x
- May 23, 2025
- Scientific Reports
- Tanveerul Haq + 2 more
In this study, we introduce a technique for unsupervised design and design automation of resonator-based microstrip sensors for dielectric material characterization. Our approach utilizes fundamental building blocks such as circular and square resonators, stubs, and slots, which can be adjusted in size and combined into intricate geometries using appropriate Boolean transformations. The sensor’s topology, including its constituent components and their dimensions, is governed by artificial intelligence (AI) techniques, specifically evolutionary algorithms, in conjunction with gradient-based optimizers. This enables not only the explicit enhancement of the circuit’s sensitivity but also ensures the attainment of the desired operating frequency. The design process is entirely driven by specifications and does not necessitate any interaction from the designer. We extensively validate our design framework by designing a range of high-performance sensors. Selected devices are experimentally validated, calibrated using inverse modeling techniques, and utilized for characterizing dielectric samples across a wide spectrum of permittivity and thickness. Moreover, comprehensive benchmarking demonstrates the superiority of AI-generated sensors over state-of-the-art designs reported in the literature.
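The evolutionary search driving the sensor topology above can be illustrated with a minimal (1+1) evolution strategy on a stand-in objective. The objective function, step size, and iteration budget are illustrative assumptions, not the paper's algorithm:

```python
import random

def evolve(objective, x0, sigma=0.5, iters=500, seed=1):
    # (1+1) evolution strategy: mutate the current design with Gaussian
    # noise and keep the candidate only if it improves the objective.
    rng = random.Random(seed)
    x, fx = list(x0), objective(x0)
    for _ in range(iters):
        cand = [xi + rng.gauss(0, sigma) for xi in x]
        fc = objective(cand)
        if fc < fx:
            x, fx = cand, fc
    return x, fx

# Stand-in objective: drive a resonance proxy (product of two geometric
# parameters) toward a 2.45 GHz target; purely illustrative.
target = 2.45
obj = lambda p: abs(p[0] * p[1] - target)
```

In the actual framework a global evolutionary stage like this would be followed by a gradient-based local refinement, with the objective evaluated by electromagnetic simulation rather than a closed-form proxy.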
- Research Article
- 10.3390/computers14050194
- May 17, 2025
- Computers
- Yeison Nolberto Cardona-Álvarez + 2 more
This review paper presents a comprehensive analysis of the evolving landscape of data exchange, with a particular focus on the transformative role of emerging technologies such as blockchain, field-programmable gate arrays (FPGAs), and artificial intelligence (AI). We explore how the integration of these technologies into data management systems enhances operational efficiency, precision, and security through intelligent automation and advanced machine learning techniques. The paper also critically examines the key challenges facing data exchange today, including issues of interoperability, the demand for real-time processing, and the stringent requirements of regulatory compliance. Furthermore, it underscores the urgent need for robust ethical frameworks to guide the responsible use of AI and to protect data privacy. In addressing these challenges, the paper calls for innovative research aimed at overcoming current limitations in scalability and security. It advocates for interdisciplinary approaches that harmonize technological innovation with legal and ethical considerations. Ultimately, this review highlights the pivotal role of collaboration among researchers, industry stakeholders, and policymakers in fostering a digitally inclusive future—one that strengthens data exchange practices while upholding global standards of fairness, transparency, and accountability.
- Research Article
- 10.70792/jngr5.0.v1i4.125
- May 14, 2025
- Journal of Next-Generation Research 5.0
- Ifeanyi Kingsley Kwentoa
The research investigates how artificial intelligence technologies operate within enterprise cybersecurity frameworks by studying threat intelligence automation and advanced detection techniques. The research uses extensive literature analysis to show that machine learning algorithms achieve detection accuracies above 95% and deep learning approaches enhance F1-scores by up to 33% above traditional methods. Real-time data integration with behavioral analytics boosts threat identification abilities, allowing systems to detect 150,000 threats per minute and preventing 8 out of 10 attacks from causing system compromise. The current implementations primarily use centralized architectures, but distributed approaches show benefits for particular deployment situations. The research identifies essential challenges, which include privacy concerns, transparency limitations, algorithmic bias, data quality issues, and integration complexity. The research demonstrates that effective countermeasures against advanced threats require security innovations governed by comprehensive frameworks that balance technological capabilities with ethical considerations through continuous evaluation processes.
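The F1-score figures quoted above combine precision and recall. For reference, the metric is computed from true positive, false positive, and false negative counts as follows (the example counts are hypothetical):

```python
def f1_score(tp, fp, fn):
    # F1 is the harmonic mean of precision and recall.
    precision = tp / (tp + fp)   # fraction of alerts that were real threats
    recall = tp / (tp + fn)      # fraction of real threats that were caught
    return 2 * precision * recall / (precision + recall)
```

An improvement "by up to 33%" over a traditional method would mean, for example, a baseline F1 of about 0.60 rising to roughly 0.80.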