Opinion-driven regulation of multi-strain pathogen transmission across species.

Abstract

This study investigates the fundamental mechanisms underlying cross-species, multi-strain transmission in ecosystems from the perspective of group opinion dynamics. A multilayer interaction framework is proposed, incorporating signed-weighted social network dynamics to quantify group-level opinions and dynamically adjust key epidemiological parameters in real time. The analysis reveals that (1) infection pressure alters group opinion thresholds via cognitive-behavioral feedback, while the emerging collective consensus reciprocally regulates transmission intensity, forming a closed-loop feedback mechanism; and (2) the topology of the opinion network governs epidemic phase transitions, inducing a bistable regime characterized by either low-risk (opinion cohesion) or high-risk (opinion polarization) states. By identifying critical nodes within the signed social graph, the study transforms group opinion intensity into dynamic warning thresholds, enabling targeted ecological interventions.
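The closed-loop coupling between infection pressure and collective opinion can be sketched as a toy mean-field model. All functional forms, coefficients, and clamping below are illustrative assumptions, not taken from the paper:

```python
def simulate(steps=200, beta0=0.3, gamma=0.1, k=0.5):
    """Toy closed loop: prevalence i pushes the mean opinion o toward
    caution, and a cautious consensus in turn damps transmission beta."""
    i, o = 0.01, 0.0                            # infected fraction, mean opinion in [-1, 1]
    for _ in range(steps):
        beta = beta0 * (1.0 - k * max(o, 0.0))  # consensus caution lowers transmission
        i += beta * i * (1.0 - i) - gamma * i   # mean-field SIS step
        i = min(max(i, 0.0), 1.0)
        o += 0.1 * (i - 0.05) - 0.02 * o        # opinion reacts to prevalence, with decay
        o = min(max(o, -1.0), 1.0)
    return i, o
```

Running this with the feedback enabled (k > 0) settles at a lower endemic level than the uncoupled SIS model, which is the qualitative behavior the abstract's "closed-loop feedback mechanism" describes.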

Similar Papers
  • Research Article
  • Cited by 11
  • 10.1016/j.jngse.2016.03.075
Smart de-watering and production system through real-time water level surveillance for Coal-Bed Methane wells
  • Mar 28, 2016
  • Journal of Natural Gas Science and Engineering
  • Guoqing Han + 2 more


  • Conference Article
  • Cited by 2
  • 10.2118/173408-ms
Smart De-Watering and Production System through Real-Time Water Level Surveillance for Coal-Bed Methane Wells
  • Jan 1, 2015
  • Guoqing Han + 2 more

In the past few decades, Coal-Bed Methane (CBM) has become an important source of energy, especially in North America. The methane adsorbed within the coal is in a near-liquid state. The open fractures in the cleats are commonly saturated with water. To develop a CBM reservoir, water in the fractures and coal seam must be continuously pumped off to reduce pressure and desorb gas from the matrix. Although operators desire to produce hydrocarbon quickly, an excessively fast dewatering rate can irreversibly damage the matrix desorption process, which can lead to an unfavorable ultimate recovery. Further, an aggressive production rate can release coal fines and drive them into the pump, which increases maintenance effort and cost. Even worse, the de-watering process fluctuates because of rock porosity and permeability changes resulting from the brittle coal seam and pressure reduction. Therefore, it is critical to adjust the pump operating parameters in a timely manner to maintain continuous/intermittent production. The ten CBM wells are located in a remote area, which makes access to the wellsites difficult. Previously, engineers had to evaluate well performance and optimize the pump on-site, which limited evaluation to a monthly basis. We first developed an automatic data processing system using advanced echosounders, which measure the water level in real time. The reservoir pressure can then be monitored dynamically by interpreting the detected water level. With an automatic wireless data transfer system installed on-site and a closed-loop control program to receive, process, and interpret data, the pump operating parameters can be changed in real time through remote control. This system not only identifies downhole problems in real time, but also statistically extends the pump maintenance interval from 40 days to 75 days and reduces the number of trips to the well site. Further, the gas production rate has been improved by an average of 30% across the 10 wells.
The authors first developed an automated data processing and control system based on advanced echosounders. Based on the interpreted reservoir pressure, aggressive production can be avoided by adjusting the pump operating parameters in real time, which eventually results in a better ultimate recovery. The developed workflow (automatic echosounder data acquisition, real field data transferred to the central office, data processing, interpretation, and simulation in a computational system, and adjustment commands to the operating system) is especially valuable for locations that are difficult to access.
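The remote closed-loop adjustment can be illustrated with a minimal proportional controller that nudges pump speed based on the echo-sounded water level. The gain, units, and clamping are hypothetical placeholders, not the operators' actual control law:

```python
def adjust_pump_speed(speed, level, target_level, kp=0.05,
                      min_speed=0.0, max_speed=1.0):
    """One control step: if the measured water level is above target,
    speed the pump up proportionally; below target, slow it down.
    Clamping keeps the command inside the pump's operating envelope."""
    error = level - target_level          # positive -> dewatering too slow
    speed = speed + kp * error
    return min(max(speed, min_speed), max_speed)
```

In the described system this step would run in the central-office control program after each echosounder reading, with the command relayed to the wellsite over the wireless link.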

  • Research Article
  • Cited by 4
  • 10.1002/mrm.29688
Adaptive model-based Magnetic Resonance.
  • May 8, 2023
  • Magnetic Resonance in Medicine
  • Inbal Beracha + 2 more

Conventional sequences are static in nature, fixing measurement parameters in advance in anticipation of a wide range of expected tissue parameter values. We set out to design and benchmark a new, personalized approach, termed adaptive MR, in which incoming subject data are used to update and fine-tune the pulse sequence parameters in real time. We implemented an adaptive, real-time multi-echo (MTE) experiment for estimating T2 values. Our approach combined a Bayesian framework with model-based reconstruction. It maintained and continuously updated a prior distribution of the desired tissue parameters, including T2, which was used to guide the selection of sequence parameters in real time. Computer simulations predicted accelerations between 1.7- and 3.3-fold for adaptive multi-echo sequences relative to static ones. These predictions were corroborated in phantom experiments. In healthy volunteers, our adaptive framework accelerated the measurement of T2 for N-acetyl-aspartate by a factor of 2.5. Adaptive pulse sequences that alter their excitations in real time could provide substantial reductions in acquisition times. Given the generality of our proposed framework, our results motivate further research into other adaptive model-based approaches to MRI and MRS.
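The Bayesian core of such an adaptive sequence can be sketched as a grid-approximation update over T2. The unit-amplitude mono-exponential decay, Gaussian noise, and greedy echo-time heuristic below are simplifications, not the paper's model-based reconstruction:

```python
import math

def update_posterior(prior, t2_grid, te, signal, sigma=0.05):
    """One Bayesian update: multiply the prior over T2 by the likelihood
    of observing `signal` at echo time `te` under S = exp(-te / T2),
    then renormalize (grid approximation, unit M0, Gaussian noise)."""
    post = []
    for p, t2 in zip(prior, t2_grid):
        pred = math.exp(-te / t2)
        like = math.exp(-0.5 * ((signal - pred) / sigma) ** 2)
        post.append(p * like)
    z = sum(post)
    return [p / z for p in post]

def pick_next_te(prior, t2_grid):
    """Greedy heuristic: sample near the current posterior-mean T2,
    where the decay curve is most informative (te ~ T2)."""
    return sum(p * t2 for p, t2 in zip(prior, t2_grid))
```

Each acquired echo sharpens the posterior, and the next echo time is chosen from the updated belief rather than from a fixed schedule, which is the mechanism behind the reported acceleration.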

  • Conference Article
  • 10.2351/1.5062877
Non-disruptive, low loss in-line laser beam monitoring system for industrial laser processing
  • Jan 1, 2013
  • Michael Scaggs + 1 more

Monitoring both near and far field laser beam parameters is extremely helpful in understanding the quality of a laser process. The measurement of such parameters has not been practical to date because of the required disruption of the process beam in order to make the measurement. A significant quality control enhancement could be realized if one could monitor both near and far field patterns of the laser system during the process with minimal loss or alteration of the process beam. A novel optical design is discussed that integrates into a laser process head with minimal power loss and disruption to the laser beam. This in-line monitoring system provides focal spot size, Rayleigh length, focal position, and M-squared values, as well as all the other ISO beam profiling parameters, in real time. An in-line laser beam monitoring system makes possible a higher level of quality control and reduced scrap, thereby providing a higher level of reliability in laser processing.
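The parameters such a monitor reports are linked by standard Gaussian-beam relations; for example, the Rayleigh length follows directly from the focal spot size and the M-squared value. The inputs below are illustrative, not measurements from the paper:

```python
import math

def rayleigh_length_mm(waist_radius_mm, wavelength_nm, m_squared):
    """Standard Gaussian-beam relation z_R = pi * w0^2 / (lambda * M^2),
    linking the reported focal spot size and beam quality to the usable
    depth of focus. Wavelength is converted from nm to mm."""
    wavelength_mm = wavelength_nm * 1e-6
    return math.pi * waist_radius_mm ** 2 / (wavelength_mm * m_squared)
```

For a 50 um waist radius at 1064 nm with M-squared of 1.2, this gives a Rayleigh length of roughly 6 mm, the kind of derived quantity the in-line system can track continuously.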

  • Research Article
  • Cited by 22
  • 10.1016/j.rcim.2021.102132
Adaptive optimal control of stencil printing process using reinforcement learning
  • Mar 4, 2021
  • Robotics and Computer-Integrated Manufacturing
  • Nourma Khader + 1 more


  • Conference Article
  • Cited by 3
  • 10.23919/ccc50068.2020.9188614
Improved PI and Repetitive Controller for Dual-Buck Inverter
  • Jul 1, 2020
  • Weidong Chen + 3 more

In order to improve the dynamic response of conventional repetitive control, a novel repetitive control combined with adaptive PI is proposed in this paper. The adaptive PI is a self-tuning method that performs parameter adjustment based on the steepest descent method. The controller can recalculate the PI parameters in real time as the external working conditions change. The performance of the proposed controller was assessed in a dual-buck full-bridge single-phase inverter system through simulations run in PSIM software. The results show that the control scheme can not only obtain a stable output voltage waveform in steady state, but also adjust the PI parameters in real time when the given reference changes. This demonstrates the effectiveness and practicability of the proposed method and provides a new voltage control scheme for dual-buck inverters.
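The steepest-descent self-tuning step can be sketched as follows, assuming a plant with positive DC gain so the gradient signs simplify. The learning rate and the toy first-order plant are illustrative, not the paper's inverter model:

```python
def adapt_pi(kp, ki, error, integral, mu=0.01):
    """One steepest-descent step on J = e^2 / 2. With a positive-gain
    plant the sensitivity approximations reduce the gradients to
    dJ/dKp ~ -e*e and dJ/dKi ~ -e*int(e), so both gains climb while
    the error persists."""
    return kp + mu * error * error, ki + mu * error * integral

def run(steps=300, dt=0.01, mu=0.5, ref=1.0):
    """Toy first-order plant dy/dt = -y + u under the self-tuning PI."""
    kp, ki, y, integral = 0.5, 0.1, 0.0, 0.0
    for _ in range(steps):
        e = ref - y
        integral += e * dt
        kp, ki = adapt_pi(kp, ki, e, integral, mu * dt)
        u = kp * e + ki * integral
        y += dt * (-y + u)
    return y, kp, ki
```

As the tracking error persists, both gains grow and the output converges toward the reference; once the error vanishes the updates stop, which is the self-tuning behavior the abstract describes.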

  • Conference Article
  • 10.2118/212083-ms
Application of Software Tools to Optimize Drilling Horizontal Wells in Mature Fields
  • Nov 15, 2022
  • B Zhiyenbayev + 3 more

Analysis of previously drilled horizontal wells in mature fields showed a lack of real-time monitoring of drilling parameters. Consequences included complications during the drilling process (slack-off/pick-up), low rate of penetration, poor wellbore cleaning and, as a result, mechanical and differential sticking; increased uncertainty in well location, leading to the risk of collision with other wellbores; and problems with casing running and cementing. The issue of monitoring drilling parameters in real time, considering the actual drilling parameters obtained from the well, is extremely important and in demand. The workflow links preplanned engineering calculations with actual data acquired while drilling, using software applications to optimize the following parameters: mechanical integrity of the drill string and efficiency of the BHA depending on the type of operation (tripping in, tripping out, rotary drilling, slide drilling, etc.); drilling fluid circulation in the wellbore to ensure hole cleaning and downhole pressure control (minimum flow rate and equivalent circulating density); and calculations for running casing strings (trip speed, circulation, lateral forces, etc.). The described process of real-time monitoring of horizontal wells makes it possible to identify and prevent deviations from drilling practice at early stages and during drilling, thereby significantly reducing the likelihood of drilling problems and accidents and, as a result, having a positive effect on well construction time and quality. This article demonstrates a method of monitoring and performing calculations against actual data to optimize the drilling of horizontal wells in these fields. Examples of wells from several fields in the West Kazakhstan region where this technique is used are shown.
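One of the real-time checks named above, equivalent circulating density, follows a standard oilfield-units relation. The inputs below are illustrative, not data from the described wells:

```python
def equivalent_circulating_density(mud_weight_ppg, annular_loss_psi, tvd_ft):
    """Standard field-units relation ECD = MW + P_loss / (0.052 * TVD):
    the annular friction pressure loss while circulating adds to the
    static mud weight, and the workflow checks the result against the
    pore/fracture pressure window in real time."""
    return mud_weight_ppg + annular_loss_psi / (0.052 * tvd_ft)
```

For a 10 ppg mud with 200 psi of annular loss at 10,000 ft TVD, the ECD is about 10.38 ppg; a real-time monitor would flag the well when this approaches the fracture gradient.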

  • Research Article
  • Cited by 1
  • 10.3390/s25165020
Temperature Control Method for Electric Heating Furnaces Based on Auto-Encoder and Fuzzy PI Control
  • Aug 13, 2025
  • Sensors (Basel, Switzerland)
  • Haiyang Huang + 3 more

Highlights: Aiming at the difficult problem of controlling electric heating furnaces, and combining the advantages of the auto-encoder and fuzzy control, a composite control algorithm is proposed that dynamically models the electric heating furnace, predicts the future temperature, and adjusts the control parameters in real time to achieve accurate and stable temperature control, providing a new way of thinking for the control of nonlinear, time-varying, large-time-lag systems.
What are the main findings? A discrete mathematical model of an electric heating furnace was established, and unsupervised dynamic modelling was achieved through an auto-encoder. A control structure combining predictive compensation and fuzzy regulation was designed to achieve stable, low-overshoot control under complex interference.
What is the implication of the main findings? The stability, accuracy, and robustness of the electric heating furnace temperature control system are improved, and a new modelling and control framework is provided for nonlinear, time-varying, large-time-delay industrial heating systems, with good application prospects.
Electric heating furnaces are widely used in industrial production and scientific research, where the quality of temperature control directly affects product performance and operational safety. However, precise control remains challenging due to the system's nonlinear behaviour, time-varying characteristics, and significant time delays. To overcome these issues, this paper proposes a composite control method that integrates an auto-encoder-based prediction model with fuzzy PI control. Specifically, a discrete-time temperature model is constructed, in which the auto-encoder learns the system dynamics and predicts future temperatures, while the fuzzy controller adaptively tunes the PI parameters in real time. This approach improves both modelling accuracy and the adaptability of the control system.
The simulation results on the MATLAB/Simulink platform show that the proposed method maintains the temperature overshoot within 2% under various disturbances, including a maximum delay of 243 s, ±2 °C measurement noise, 10% voltage fluctuation, and abrupt 10% gain variation. These results demonstrate the method’s strong robustness and indicate its suitability for advanced control design in complex industrial environments.
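A minimal fuzzy gain scheduler of the kind described can be sketched as follows. The triangular membership functions and rule weights are placeholders, not the paper's rule base:

```python
def tri(x, a, b, c):
    """Triangular membership function on [a, c] with peak at b."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x < b else (c - x) / (c - b)

def fuzzy_pi_gains(error, base_kp=2.0, base_ki=0.1):
    """Tiny fuzzy scheduler: a larger normalized |error| selects a more
    aggressive Kp and a smaller Ki (to limit overshoot near setpoint).
    Output is the weighted average over the fired rules."""
    e = min(abs(error), 1.0)                 # normalized error magnitude
    small = tri(e, -0.5, 0.0, 0.5)
    medium = tri(e, 0.0, 0.5, 1.0)
    large = tri(e, 0.5, 1.0, 1.5)
    w = small + medium + large
    kp = base_kp * (small * 0.8 + medium * 1.0 + large * 1.3) / w
    ki = base_ki * (small * 1.2 + medium * 1.0 + large * 0.6) / w
    return kp, ki
```

In the paper's scheme these scheduled gains would feed the PI loop while the auto-encoder's temperature prediction provides the feed-forward compensation.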

  • Conference Article
  • Cited by 36
  • 10.1109/icma.2007.4303969
Application of a PID Controller using MRAC Techniques for Control of the DC Electromotor Drive
  • Aug 1, 2007
  • Ai Xiong + 1 more

In DC motor drives, device aging and environmental factors can degrade the performance of the control system and make it difficult to redesign controller parameters in real time. In this paper, an algorithm that combines a PID control scheme with model reference adaptive control (MRAC) to autotune the controller parameters in real time when system performance changes is proposed and applied to the control of a DC electromotor drive. The algorithm modifies the traditional PID control scheme by replacing the error between the reference and the system output with the direct system output in the derivative part. Compared to the traditional MRAC approach, the algorithm is normalized so that a reference model can easily be generated from a given bandwidth requirement. To be practical for implementation, the algorithm also simplifies the transfer function of the speed open loop to an integrator by adding a current inner loop. The results of simulation on a two-loop servo control system show that the adjusted controller parameters meet the system design requirements very well. The algorithm was implemented on a test platform and the performance of the system was evaluated with a dynamic signal analyzer. The effectiveness of the proposed method is shown through the experimental results.
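The MIT-rule style of adaptation underlying MRAC can be sketched on a toy first-order plant. The plant, reference model, and adaptation gain below are illustrative, not the paper's two-loop servo design:

```python
def simulate_mrac(steps=500, dt=0.01, a=2.0, b=2.0, am=2.0, bm=2.0,
                  gamma=1.0, r=1.0):
    """Gradient (MIT-rule) adaptation of a feedforward gain theta on the
    plant dy/dt = -a*y + b*u, against the reference model
    dym/dt = -am*ym + bm*r.  Update: dtheta/dt = -gamma * e * ym with
    e = y - ym (a common sensitivity approximation)."""
    y = ym = theta = 0.0
    for _ in range(steps):
        u = theta * r                   # adaptive feedforward control
        e = y - ym                      # model-following error
        theta -= dt * gamma * e * ym    # MIT-rule gradient step
        y += dt * (-a * y + b * u)      # plant
        ym += dt * (-am * ym + bm * r)  # reference model
    return y, ym, theta
```

With matched plant and model dynamics the ideal gain here is theta = 1; the adaptation drives the model-following error toward zero without knowing the plant gain in advance, which is the property the paper exploits for autotuning.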

  • Research Article
  • Cited by 23
  • 10.1109/tbme.2017.2657121
A Variance Distribution Model of Surface EMG Signals Based on Inverse Gamma Distribution.
  • Nov 1, 2017
  • IEEE Transactions on Biomedical Engineering
  • Hideaki Hayashi + 3 more

Objective: This paper describes the formulation of a surface electromyogram (EMG) model capable of representing the variance distribution of EMG signals. Methods: In the model, EMG signals are handled based on a Gaussian white noise process with a mean of zero for each variance value. EMG signal variance is taken as a random variable that follows inverse gamma distribution, allowing the representation of noise superimposed onto this variance. Variance distribution estimation based on marginal likelihood maximization is also outlined in this paper. The procedure can be approximated using rectified and smoothed EMG signals, thereby allowing the determination of distribution parameters in real time at low computational cost. Results: A simulation experiment was performed to evaluate the accuracy of distribution estimation using artificially generated EMG signals, with results demonstrating that the proposed model's accuracy is higher than that of maximum-likelihood-based estimation. Analysis of variance distribution using real EMG data also suggested a relationship between variance distribution and signal-dependent noise. Conclusion: The study reported here was conducted to examine the performance of a proposed surface EMG model capable of representing variance distribution and a related distribution parameter estimation method. Experiments using artificial and real EMG data demonstrated the validity of the model. Significance: Variance distribution estimated using the proposed model exhibits potential in the estimation of muscle force.
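A cheap stand-in for the marginal-likelihood estimation is moment matching of the inverse gamma parameters to local variance estimates (a simplification for illustration, not the paper's procedure):

```python
def fit_inverse_gamma(variances):
    """Moment-match an inverse gamma distribution to local variance
    estimates (e.g. from smoothed, squared EMG): its mean is b/(a-1)
    and its variance is b^2 / ((a-1)^2 (a-2)), which invert to
    a = m^2/s + 2 and b = m*(a-1) for sample mean m and variance s."""
    n = len(variances)
    m = sum(variances) / n
    s = sum((v - m) ** 2 for v in variances) / n
    a = m * m / s + 2.0   # shape
    b = m * (a - 1.0)     # scale
    return a, b
```

Because it needs only running first and second moments of the smoothed signal, a scheme like this can update the distribution parameters sample by sample, consistent with the real-time, low-cost estimation the abstract claims.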

  • Research Article
  • 10.1002/dac.70369
A Novel EKF‐PSO Approach for Enhanced Object Tracking and Routing in Wireless Sensor Networks
  • Dec 29, 2025
  • International Journal of Communication Systems
  • T Vairam + 2 more

WSNs are essential in industrial and environmental areas, where they can be used to monitor conditions and track objects in real time. Achieving accurate target localization is a primary problem in object tracking in WSNs, particularly in dynamic environments. Additionally, tracking is complicated because object motion is nonlinear in real-life conditions, which can degrade tracking performance. To address this, the current research introduces a new hybrid tracking algorithm that combines Extended Kalman Filters (EKF) and Particle Swarm Optimization (PSO) to estimate the target state and adjust covariance parameters in WSNs. Adaptive parameter tuning is carried out by PSO, while the EKF handles the real-time state estimation, increasing monitoring efficiency under unpredictable circumstances. This combination enhances tracking precision, with filter parameters continuously adjusted to the changing motion of the target. The EKF is applied to predict the target state in real-life situations because of its success in estimating the state of a nonlinear system with Gaussian noise. A static filter configuration can, however, be inadequate in dynamic environments. PSO addresses this limitation by modulating the EKF parameters in real time, guaranteeing reliable tracking regardless of unpredictable situations. To evaluate the performance of the proposed EKF-PSO approach, the study is implemented in MATLAB using two network topologies: Random Topology (RT) and Hybrid Topology (HT). The metrics employed to quantify the approach include power usage, delay, and tracking error. Simulation results over 100 repeated runs indicate that the root mean square error decreases by 28% and energy savings increase by 15% compared with conventional EKF. The algorithm remains robust under nonlinear motion and motion noise. The results show that HT-EKF-PSO significantly outperforms RT-EKF-PSO on all metrics.
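The EKF-PSO pairing can be sketched with a scalar filter whose process and measurement noise PSO tunes against recorded innovations. The random-walk motion model (which drops the EKF's Jacobian step), the fitness choice, and all PSO settings below are illustrative assumptions:

```python
import random

def kf_rmse(q, r, zs, x0=0.0, p0=1.0):
    """Run a scalar predict/update filter over measurements zs and
    return the mean squared innovation -- the fitness PSO minimizes."""
    x, p, cost = x0, p0, 0.0
    for z in zs:
        p_pred = p + q                 # predict (random-walk model)
        k = p_pred / (p_pred + r)      # Kalman gain
        innov = z - x
        x = x + k * innov              # update
        p = (1 - k) * p_pred
        cost += innov * innov
    return cost / len(zs)

def pso_tune(zs, n=10, iters=30, seed=0):
    """Minimal PSO over (q, r) in (1e-3, 1.0]: inertia 0.7, cognitive
    and social weights 1.5, clamped positions."""
    rng = random.Random(seed)
    pts = [[rng.uniform(1e-3, 1.0), rng.uniform(1e-3, 1.0)] for _ in range(n)]
    vel = [[0.0, 0.0] for _ in range(n)]
    pbest = [p[:] for p in pts]
    pcost = [kf_rmse(p[0], p[1], zs) for p in pts]
    g = min(range(n), key=lambda i: pcost[i])
    gbest, gcost = pbest[g][:], pcost[g]
    for _ in range(iters):
        for i in range(n):
            for d in range(2):
                vel[i][d] = (0.7 * vel[i][d]
                             + 1.5 * rng.random() * (pbest[i][d] - pts[i][d])
                             + 1.5 * rng.random() * (gbest[d] - pts[i][d]))
                pts[i][d] = min(max(pts[i][d] + vel[i][d], 1e-3), 1.0)
            c = kf_rmse(pts[i][0], pts[i][1], zs)
            if c < pcost[i]:
                pbest[i], pcost[i] = pts[i][:], c
                if c < gcost:
                    gbest, gcost = pts[i][:], c
    return gbest, gcost
```

Re-running the tuner over a sliding window of recent measurements is one way the filter could adapt its covariances as the target's motion regime changes.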

  • Preprint Article
  • 10.5194/ems2025-698
Quality control of precipitation data at GeoSphere Austria
  • Jul 16, 2025
  • Niko Filipovic

The rain gauge measurement network of the Austrian national weather service is operated by GeoSphere Austria and comprises about 270 weather stations, most of which are equipped with weighing rain gauges and a smaller number with tipping bucket rain gauges. Each gauge is additionally equipped with a precipitation monitor that detects the beginning and the end of precipitation events. Precipitation data are checked for plausibility and completeness in several steps within the framework of an automated quality control tool called AQUAS (short for Austria Quality Service). The software was developed in 2016 at ZAMG (now GeoSphere Austria) in Vienna as part of the quality management of real-time processing of near-surface observation data. The basis for the quality control procedure is formed by standard methods for checking meteorological and climatological data in accordance with the WMO recommendations (e.g. plausibility check; temporal, spatial, and internal consistency checks); in addition, test procedures are developed that take into account the specific errors of the measuring devices. In AQUAS, individual system components are designed to test the incoming observation parameters in real time (in the case of precipitation data, at a time resolution of 1 minute) as well as on the basis of daily data. Based on the quality control of precipitation data, the structure of AQUAS and an example of its operational use are presented. An algorithm for checking precipitation data from 1-minute weighing gauge measurements is demonstrated that detects spurious precipitation events and missing gauge precipitation based on a combination of the precipitation monitor observations and the total weight changes of the rain gauge. The advantage of this method is that software errors of the weighing gauge are largely intercepted by comparison with an independent measurement. This algorithm currently supports the experts in the manual quality control of precipitation data.
After thorough improvements and tests, it is planned to integrate it into a semi-automatic quality control system.

  • Research Article
  • Cited by 6
  • 10.20965/jrm.2003.p0304
Camera Calibration and 3-D Measurement with an Active Stereo Vision System for Handling Moving Objects
  • Jun 20, 2003
  • Journal of Robotics and Mechatronics
  • Atsushi Yamashita + 4 more

In this paper, we propose a fast, easy camera calibration and 3-D measurement method with an active stereo vision system for handling moving objects whose geometric models are known. We use stereo cameras that change direction independently to follow moving objects. To gain extrinsic camera parameters in real time, a baseline stereo camera (parallel stereo camera) model and projective transformation of stereo images are used by considering epipolar constraints. To make use of 3-D measurement results for a moving object, the manipulator hand approaches the object. When the manipulator hand and object are near enough to be situated in a single image, very accurate camera calibration is executed to calculate the manipulator size in the image. Our calibration is simple and practical because it does not need to calibrate all camera parameters. The computation time for real-time calibration is not large because we need only search for one parameter in real time by deciding the relationship between all parameters in advance. Our method does not need complicated image processing or matrix calculation. Experimental results show that the accuracy of 3-D reconstruction of a cubic box whose edge is 60 mm long is within 1.8 mm when the distance between the camera and the box is 500 mm. Total computation time for object tracking, camera calibration, and manipulation control is within 0.5 seconds.
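The parallel (baseline) stereo camera model reduces depth recovery to the classic disparity relation. The focal length and baseline below are hypothetical values, not the paper's setup:

```python
def depth_from_disparity(f_px, baseline_mm, x_left_px, x_right_px):
    """Parallel stereo model: depth Z = f * B / d, where d is the
    horizontal disparity between matched image points. The paper first
    rectifies the converging views into this parallel geometry via
    projective transformation under epipolar constraints."""
    d = x_left_px - x_right_px
    if d <= 0:
        raise ValueError("disparity must be positive for a point in front of the cameras")
    return f_px * baseline_mm / d
```

For example, an 800 px focal length and a 100 mm baseline with a 160 px disparity place the point at 500 mm, matching the working distance quoted in the experiments.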

  • Preprint Article
  • 10.5194/egusphere-egu25-17837
AQUAS - A quality control tool at GeoSphere Austria
  • Mar 15, 2025
  • Niko Filipovic

The rain gauge measurement network of the Austrian national weather service, operated by GeoSphere Austria, comprises about 270 weather stations equipped with weighing rain gauges and, to a smaller extent, with tipping bucket rain gauges. Each gauge is additionally equipped with a precipitation monitor that detects the beginning and the end of precipitation events. Precipitation data are checked for plausibility and completeness in several steps within the framework of an automated quality control tool called AQUAS (short for Austria Quality Service). The software was developed in 2016 at ZAMG (now GeoSphere Austria) in Vienna as part of the quality management of real-time processing of near-surface observation data. The basis for the quality control procedure is formed by standard methods for checking meteorological and climatological data in accordance with the WMO recommendations (e.g. plausibility check; temporal, spatial, and internal consistency checks); in addition, test procedures are developed that take into account the specific errors of the measuring devices. The test methods are continuously improved and further developed within the framework of AQUAS. Individual system components are designed to test the incoming observation parameters in real time - at a time resolution of 10 minutes, for example, for wind or temperature data, and down to a 1-minute resolution for precipitation data. In AQUAS, each parameter can be processed independently of the other measured variables of a weather station. In addition, data from other sources are integrated into AQUAS, such as radar and satellite data. Data from numerical weather prediction models and from other measurement networks, such as the hydrological network or other third-party networks, can also be integrated. Some examples of the operational use of AQUAS and the current state of research on the quality control of precipitation data will be presented.
As an example, a novel method for real-time quality control of 1-minute weighing gauge precipitation data is demonstrated, which detects missing gauge precipitation based on the observation of the precipitation monitor and the total weight changes of the rain gauge.   

  • Conference Article
  • 10.5220/0004423101530160
English
  • Jan 1, 2013
  • Paul Cotofrei + 2 more

Sensor networks are a primary source of massive amounts of data about the real world around us, measuring a wide range of physical parameters in real time. Given the hardware limitations and the physical environment in which the sensors must operate, along with frequent changes of network topology, algorithms and protocols must be designed to provide a robust and energy-efficient communication mechanism. To address these constraints, this paper proposes a routing technique based on the density-based spatial clustering of applications with noise (DBSCAN) algorithm. This technique reveals several network topology semantics, enables the splitting of sensor responsibilities (communication/routing versus sensing/monitoring), reduces the energy wasted on sending messages through the network by aggregating data only at cluster-head nodes and, last but not least, yields very good results in prolonging the network lifetime.
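The clustering step can be sketched with a plain DBSCAN over 2-D sensor coordinates. The eps and min_pts values and the cluster-head remark are illustrative assumptions, not the paper's parameters:

```python
def dbscan(points, eps=1.5, min_pts=3):
    """Plain DBSCAN over 2-D coordinates: core points have at least
    min_pts neighbors within eps; clusters grow outward from cores;
    unreachable points are labeled noise (-1). In the routing scheme,
    a head would then be elected per cluster to aggregate traffic."""
    def neighbors(i):
        px, py = points[i]
        return [j for j, (qx, qy) in enumerate(points)
                if (px - qx) ** 2 + (py - qy) ** 2 <= eps * eps]

    labels = [None] * len(points)
    cid = 0
    for i in range(len(points)):
        if labels[i] is not None:
            continue
        nbrs = neighbors(i)
        if len(nbrs) < min_pts:
            labels[i] = -1                 # noise (may be reclaimed as border)
            continue
        labels[i] = cid
        queue = [j for j in nbrs if j != i]
        while queue:
            j = queue.pop()
            if labels[j] == -1:
                labels[j] = cid            # border point reclaimed from noise
            if labels[j] is not None:
                continue
            labels[j] = cid
            jn = neighbors(j)
            if len(jn) >= min_pts:         # core point: keep expanding
                queue.extend(jn)
        cid += 1
    return labels
```

Dense groups of sensors fall into common clusters whose heads handle routing and aggregation, while isolated nodes (labeled -1) keep only sensing duties, which is the responsibility split the abstract describes.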
