A Photometricity and Extinction Monitor at the Apache Point Observatory
An unsupervised software ``robot'' that automatically and robustly reduces and analyzes CCD observations of photometric standard stars is described. The robot measures extinction coefficients and other photometric parameters in real time and, more carefully, on the next day. It also reduces and analyzes data from an all-sky $10 \mu m$ camera to detect clouds; photometric data taken during cloudy periods are automatically rejected. The robot reports its findings back to observers and data analysts via the World-Wide Web. It can be used to assess photometricity and to accumulate data on site conditions. The robot's automated and uniform site monitoring represents a minimum standard for any observing site with queue scheduling, a public data archive, or likely participation in any future National Virtual Observatory.
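The extinction measurement at the heart of such a monitor reduces to a linear fit: the difference between instrumental and catalog magnitudes of standard stars, regressed against airmass, gives the extinction coefficient as the slope. A minimal sketch, not the robot's actual pipeline; function names and numbers are illustrative:

```python
import numpy as np

def fit_extinction(airmass, delta_mag):
    """Least-squares fit of delta_mag = zero_point + k * airmass.

    delta_mag is instrumental minus catalog magnitude for each
    standard-star observation; the slope k is the extinction
    coefficient in magnitudes per airmass.
    """
    k, zero_point = np.polyfit(airmass, delta_mag, 1)
    return k, zero_point

# Synthetic observations: k = 0.15 mag/airmass, zero point = 25.0.
X = np.array([1.0, 1.2, 1.5, 1.8, 2.0])
dm = 25.0 + 0.15 * X
k, zp = fit_extinction(X, dm)
```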
- Conference Article
1
- 10.1051/ao4elt/201002005
- Jan 1, 2010
Performance of AO systems on ELTs depends not just on seeing; other parameters, such as the strength of high-altitude turbulence, are more relevant for laser tomography. Nights good for some instruments are not so good for others, so there is a clear advantage in moving from simple queue scheduling on seeing to a more complex strategy of selecting in a multi-parameter space. The need to know those atmospheric parameters in real time drives the requirements for ELT site monitors and their data. It is suggested to measure atmospheric parameters and internal seeing along the line of sight with an internal seeing monitor, complementing the external site monitor.

1 Standard and advanced queue scheduling

Extremely large telescopes (ELTs) will work with adaptive optics (AO) most of the time. The performance of AO and, consequently, the science output of ELTs strongly depend on atmospheric conditions. Operating ELTs in queue-scheduled (QS) mode, when observations are done under optimum, rather than random, conditions, is an obvious choice. Modern large telescopes already work in QS mode driven by the seeing measured by site monitors. QS on seeing will not be the best choice for ELTs, however. Their AO systems will use laser guide stars (LGSs) and tomography, and their performance will mostly depend on the turbulent conditions in the high atmosphere. Therefore, a better choice will be to schedule observations on relevant atmospheric parameters related to the quality of AO science. Those parameters will differ depending on the type of AO. Let us call this strategy advanced queue scheduling (AQS), to distinguish it from standard queue scheduling (SQS) on seeing and classical scheduling (CS), in which nights are assigned in advance without regard to atmospheric conditions. Considering that ELTs and their instruments represent a large investment, even a modest gain in productivity achieved by changing the strategy from SQS to AQS is worth the effort.
In this contribution, SQS and AQS are compared on two examples using real atmospheric data. We then detail the requirements for ELT site monitors and their data products as needed for AQS.
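The difference between SQS and AQS can be sketched as a selection rule: SQS gates on a single seeing threshold, while AQS checks each program's own limits on several atmospheric parameters at once. A toy illustration with hypothetical parameter names and thresholds:

```python
def sqs_ok(conditions, seeing_limit=0.8):
    """Standard queue scheduling: gate on total seeing alone (arcsec)."""
    return conditions["seeing"] <= seeing_limit

def aqs_ok(conditions, limits):
    """Advanced queue scheduling: every parameter relevant to the
    program (e.g. free-atmosphere seeing for LGS tomography) must
    satisfy that program's own limit."""
    return all(conditions[name] <= limit for name, limit in limits.items())

# Hypothetical night: mediocre total seeing but a calm free atmosphere.
now = {"seeing": 1.0, "free_atm_seeing": 0.25}
lgs_limits = {"free_atm_seeing": 0.30}  # what LGS tomography actually needs
```

Under SQS this night is rejected outright; under AQS it is still usable for the LGS program whose performance depends on the free atmosphere rather than on total seeing.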
- Research Article
11
- 10.1016/j.jngse.2016.03.075
- Mar 28, 2016
- Journal of Natural Gas Science and Engineering
Smart de-watering and production system through real-time water level surveillance for Coal-Bed Methane wells
- Conference Article
2
- 10.2118/173408-ms
- Jan 1, 2015
In the past few decades, Coal-Bed Methane (CBM) has become an important source of energy, especially in North America. The methane adsorbed within the coal is in a near-liquid state, and the open fractures in the cleats are commonly saturated with water. To develop a CBM reservoir, water in the fractures must be continuously pumped from the coal seam to reduce pressure and desorb gas from the matrix. Although operators desire to produce hydrocarbon quickly, too fast a dewatering rate can irreversibly damage the matrix desorption process, which can lead to an unfavorable ultimate recovery. Further, an aggressive production rate can release coal fines and drive them into the pump, which increases maintenance effort and cost. Even worse, the de-watering process fluctuates because porosity and permeability change as the brittle coal seam responds to pressure reduction. Therefore, it is critical to adjust the pump operating parameters in a timely manner to maintain continuous or intermittent production. The ten CBM wells are located in a remote area, which makes access to the wellsites difficult. Previously, engineers had to evaluate well performance and optimize the pump on-site, limiting optimization to roughly a monthly basis. We first developed an automatic data processing system using advanced echosounders, which can measure the water level in real time. The reservoir pressure can then be monitored dynamically by interpreting the detected water level. With an automatic wireless data transfer system installed on-site and a closed-loop control program to receive, process, and interpret the data, the pump operating parameters can be changed in real time through remote control. This system not only identifies downhole problems in real time, but also statistically extends the pump maintenance interval from 40 days to 75 days and reduces the number of trips to the well site. Further, the gas production rate improved by an average of 30% across the 10 wells.
The authors first developed an automated data processing and control system built around advanced echosounders. Based on the interpreted reservoir pressure, aggressive production can be avoided by adjusting the pump operating parameters in real time, which eventually results in a better ultimate recovery. The developed workflow (automatic echosounder data acquisition, real field data transferred to the central office, data processing, interpretation, and simulation in a computational system, and adjustment commands sent to the operating system) is especially valuable for locations that are difficult to access.
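The interpretation step described above, converting a detected water level into a bottomhole (reservoir) pressure, reduces to a hydrostatic-column calculation. A simplified sketch assuming fresh water and no gas column in the annulus; names and numbers are illustrative:

```python
RHO_WATER = 1000.0  # kg/m^3, assuming fresh water in the annulus
G = 9.81            # m/s^2

def bottomhole_pressure_kpa(well_depth_m, fluid_level_m, surface_kpa=101.3):
    """Hydrostatic pressure at the pump intake depth.

    fluid_level_m is the depth from surface down to the liquid
    surface, as detected by the echosounder; the liquid column is
    the well depth minus that level.
    """
    column_m = well_depth_m - fluid_level_m
    return surface_kpa + RHO_WATER * G * column_m / 1000.0

p = bottomhole_pressure_kpa(well_depth_m=800.0, fluid_level_m=500.0)
```

As dewatering lowers the fluid level, the computed pressure falls, which is exactly the trend a closed-loop controller would watch to pace the pump.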
- Research Article
5
- 10.2112/si94-109.1
- Sep 9, 2019
- Journal of Coastal Research
Zheng, Z.-H., and Zhou, X.-H., 2019. Design and simulation of ship energy efficiency management system based on data analysis. In: Gong, D.; Zhu, H., and Liu, R. (eds.), Selected Topics in Coastal Research: Engineering, Industry, Economy, and Sustainable Development. Journal of Coastal Research, Special Issue No. 94, pp. 552–556. Coconut Creek (Florida), ISSN 0749-0208. Using the systematic methods of modern ship energy efficiency management, strict control and management of ship energy consumption, energy utilization, and carbon dioxide emissions will be of great significance for achieving energy conservation and emission reduction in ships and promoting the green development of shipping. Building on previous research results, this paper designs a ship energy efficiency management model based on data analysis, through extensive collection, collation, and analysis of current industry, manufacturing, and operation data. The model system can monitor and collect various dynamic and static parameters in real time and calculate the ship energy efficiency design index (EEDI) and the ship energy efficiency operational index (EEOI). Combined with the ship's own information and its energy consumption information, such as range, cargo capacity, speed, and main engine power, the system can dynamically analyze the entire production and operation process of the ship. The final simulation experiment shows the feasibility and effectiveness of this model.
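The EEOI such a system computes is, per the IMO's standard definition, fuel-based CO2 emissions divided by transport work. A single-voyage sketch (the fuel quantity and cargo figures below are illustrative; the HFO carbon factor of about 3.114 t CO2 per tonne of fuel is the IMO value):

```python
def eeoi_g_per_tnm(fuel_t, carbon_factor, cargo_t, distance_nm):
    """Energy Efficiency Operational Indicator for one voyage:
    grams of CO2 emitted per tonne of cargo per nautical mile."""
    co2_g = fuel_t * carbon_factor * 1e6  # tonnes of CO2 -> grams
    return co2_g / (cargo_t * distance_nm)

# Example: 50 t of heavy fuel oil, 20,000 t of cargo over 1,000 nm.
value = eeoi_g_per_tnm(50.0, 3.114, 20000.0, 1000.0)
```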
- Research Article
4
- 10.1002/mrm.29688
- May 8, 2023
- Magnetic Resonance in Medicine
Conventional sequences are static in nature, fixing measurement parameters in advance in anticipation of a wide range of expected tissue parameter values. We set out to design and benchmark a new, personalized approach, termed adaptive MR, in which incoming subject data are used to update and fine-tune the pulse sequence parameters in real time. We implemented an adaptive, real-time multi-echo experiment for estimating T2 values. Our approach combined a Bayesian framework with model-based reconstruction. It maintained and continuously updated a prior distribution of the desired tissue parameters, including T2, which was used to guide the selection of sequence parameters in real time. Computer simulations predicted accelerations between 1.7- and 3.3-fold for adaptive multi-echo sequences relative to static ones. These predictions were corroborated in phantom experiments. In healthy volunteers, our adaptive framework accelerated the measurement of T2 for N-acetyl-aspartate by a factor of 2.5. Adaptive pulse sequences that alter their excitations in real time could provide substantial reductions in acquisition times. Given the generality of our proposed framework, our results motivate further research into other adaptive model-based approaches to MRI and MRS.
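The Bayesian core of such a framework, maintaining a discrete prior over T2 and refining it after each echo, can be sketched as follows. A toy, noiseless illustration, not the authors' model-based reconstruction; the grid, noise level, and echo times are assumptions:

```python
import numpy as np

def update_t2_posterior(prior, t2_grid, echo_time, measured, s0=1.0, sigma=0.05):
    """One Bayes update of a discrete T2 distribution from a single
    echo, using the mono-exponential model s(TE) = s0 * exp(-TE/T2)
    with Gaussian noise of standard deviation sigma."""
    predicted = s0 * np.exp(-echo_time / t2_grid)
    likelihood = np.exp(-0.5 * ((measured - predicted) / sigma) ** 2)
    posterior = prior * likelihood
    return posterior / posterior.sum()

t2_grid = np.linspace(20.0, 200.0, 181)              # ms, 1 ms spacing
belief = np.full(t2_grid.size, 1.0 / t2_grid.size)   # flat prior
true_t2 = 80.0
for te in (30.0, 60.0, 120.0):                       # chosen adaptively in practice
    belief = update_t2_posterior(belief, t2_grid, te, np.exp(-te / true_t2))
t2_hat = t2_grid[np.argmax(belief)]
```

In the adaptive setting, the current posterior would also drive the choice of the next echo time, which is where the reported acceleration comes from.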
- Research Article
44
- 10.1016/j.ijrmms.2021.104739
- Apr 10, 2021
- International Journal of Rock Mechanics and Mining Sciences
In-situ digital profiling of soil to rock strength from drilling process monitoring of 200 m deep drillhole in loess ground
- Conference Article
- 10.2351/1.5062877
- Jan 1, 2013
Monitoring both near- and far-field laser beam parameters is extremely helpful in understanding the quality of a laser process. The measurement of such parameters has not been practical to date because of the required disruption of the process beam in order to make the measurement. A significant quality control enhancement could be realized if one could monitor both near- and far-field patterns of the laser system during the process with minimal loss or alteration of the process beam. A novel optical design is discussed that integrates into a laser process head with minimal power loss and disruption to the laser beam. This in-line monitoring system provides focal spot size, Rayleigh length, focal position, and M-squared values, as well as all the other ISO beam profiling parameters, in real time. An in-line laser beam monitoring system makes possible a higher level of quality control and reduced scrap, thereby providing a higher level of reliability in laser processing.
- Research Article
22
- 10.1016/j.rcim.2021.102132
- Mar 4, 2021
- Robotics and Computer-Integrated Manufacturing
Adaptive optimal control of stencil printing process using reinforcement learning
- Conference Article
3
- 10.23919/ccc50068.2020.9188614
- Jul 1, 2020
To improve the dynamic response of conventional repetitive control, a novel repetitive control combined with adaptive PI is proposed in this paper. The adaptive PI is a self-tuning method that adjusts its parameters by the steepest descent method. The controller can adjust and recalculate the PI parameters in real time as the external working conditions change. The performance of the proposed controller was assessed in a dual-buck full-bridge single-phase inverter system through simulations run in PSIM software. The results show that the control scheme not only obtains a stable output voltage waveform in steady state, but also adjusts the PI parameters in real time when the given reference changes. This proves the effectiveness and practicability of the proposed method and provides a new voltage control scheme for dual-buck inverters.
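The steepest-descent self-tuning step described above can be sketched in its simplest form. This is a generic gradient step on a squared-error cost; the gain values, learning rate, and sign approximation are assumptions, not the paper's exact update law:

```python
def steepest_descent_pi_step(kp, ki, error, error_integral, eta=0.01):
    """One steepest-descent update of PI gains on the cost
    J = error^2 / 2. Under the usual approximation that the plant
    output rises with the control signal, dJ/dKp ~ -error^2 and
    dJ/dKi ~ -error * error_integral, so both gains move downhill
    on J."""
    kp_new = kp + eta * error * error
    ki_new = ki + eta * error * error_integral
    return kp_new, ki_new

# Positive error with accumulated positive integral: both gains grow,
# strengthening the corrective action.
kp1, ki1 = steepest_descent_pi_step(1.0, 0.1, error=0.5, error_integral=2.0)
```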
- Conference Article
- 10.2118/212083-ms
- Nov 15, 2022
Analysis of previously drilled horizontal wells in mature fields showed a lack of real-time monitoring of drilling parameters, for example, complications during the drilling process (slack off / pick up), low rate of penetration, poor wellbore cleaning and, as a result, mechanical and differential sticking; increased uncertainty in well location, leading to the risk of collision with other wellbores; and problems with casing running and cementing. The issue of monitoring drilling parameters in real time, considering the actual drilling parameters obtained from the well, is extremely important and in demand. The workflow includes linking preplanned engineering calculations with actual data while drilling, using software applications to optimize the following parameters: mechanical integrity of the drill string and efficiency of the BHA depending on the type of operation (tripping in, tripping out, rotary drilling, slide drilling, etc.); drilling fluid circulation in the wellbore to ensure hole cleaning and downhole pressure control (minimum flow rate and equivalent circulating density); and calculations for running casing strings (trip speed, circulation, lateral forces, etc.). The described process of real-time monitoring of horizontal wells makes it possible to identify and prevent deviations from drilling technology at early stages and during drilling, thereby significantly reducing the likelihood of drilling problems and accidents; as a result, it has had a positive effect on well construction time and quality. This article demonstrates the method of control and of producing calculations in accordance with actual data to optimize the drilling of horizontal wells. Examples are shown from wells in several fields in the West Kazakhstan region where this technique is used.
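One of the hydraulic quantities mentioned, equivalent circulating density, reduces to the static mud density plus the annular friction loss converted into an extra density head over true vertical depth. A minimal sketch in SI units; the values are illustrative, not from the wells discussed:

```python
G = 9.81  # m/s^2

def ecd_sg(mud_density_sg, annular_loss_kpa, tvd_m):
    """Equivalent circulating density: static mud density (SG) plus
    the annular friction pressure loss expressed as an additional
    density head over true vertical depth."""
    # Convert kPa -> Pa, divide by g * TVD to get kg/m^3, then -> SG.
    extra_sg = annular_loss_kpa * 1000.0 / (G * tvd_m) / 1000.0
    return mud_density_sg + extra_sg

# 1.20 SG mud with 500 kPa annular loss over 2,000 m TVD.
ecd = ecd_sg(1.20, 500.0, 2000.0)
```

Keeping this value below the fracture gradient while circulating is the downhole-pressure-control constraint the workflow checks in real time.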
- Research Article
1
- 10.3390/s25165020
- Aug 13, 2025
- Sensors (Basel, Switzerland)
Highlights: Aiming at the difficult problem of controlling an electric heating furnace, and combining the advantages of the auto-encoder and fuzzy control, a composite control algorithm is proposed that dynamically models the electric heating furnace, predicts the future temperature, and adjusts the control parameters in real time in order to achieve accurate and stable temperature control, providing a new way of thinking for solving the control problems of nonlinear, time-varying, large-time-lag systems.

What are the main findings? A discrete mathematical model of an electric heating furnace was established, and unsupervised dynamic modelling was achieved through an auto-encoder. A control structure combining predictive compensation and fuzzy regulation was designed to achieve stable, low-overshoot control under complex interference.

What is the implication of the main findings? The method improves the stability, accuracy, and robustness of the electric heating furnace temperature control system, and provides a new modelling and control framework for solving control problems in nonlinear, time-varying, large-time-delay industrial heating systems, with good application prospects.

Electric heating furnaces are widely used in industrial production and scientific research, where the quality of temperature control directly affects product performance and operational safety. However, precise control remains challenging due to the system's nonlinear behaviour, time-varying characteristics, and significant time delays. To overcome these issues, this paper proposes a composite control method that integrates an auto-encoder-based prediction model with fuzzy PI control. Specifically, a discrete-time temperature model is constructed, in which the auto-encoder learns the system dynamics and predicts future temperatures, while the fuzzy controller adaptively tunes the PI parameters in real time. This approach improves both modelling accuracy and the adaptability of the control system.
The simulation results on the MATLAB/Simulink platform show that the proposed method maintains the temperature overshoot within 2% under various disturbances, including a maximum delay of 243 s, ±2 °C measurement noise, 10% voltage fluctuation, and abrupt 10% gain variation. These results demonstrate the method’s strong robustness and indicate its suitability for advanced control design in complex industrial environments.
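The fuzzy-tuning half of such a scheme, mapping the current error onto PI gain corrections through membership functions and rules, can be sketched at its most minimal. A two-rule toy, not the paper's rule base; all gains and breakpoints are assumptions:

```python
def fuzzy_pi_gains(error, kp0=2.0, ki0=0.5, e_max=1.0):
    """Coarse fuzzy gain scheduling with one pair of rules:
    'large error' raises Kp for a fast response; 'small error'
    lowers Kp and raises Ki to remove steady-state offset without
    overshoot. Triangular memberships on |error| over [0, e_max]."""
    mu_large = min(abs(error) / e_max, 1.0)
    mu_small = 1.0 - mu_large
    kp = kp0 + 1.0 * mu_large - 0.5 * mu_small
    ki = ki0 + 0.3 * mu_small
    return kp, ki

kp_far, ki_far = fuzzy_pi_gains(0.9)    # far from the setpoint
kp_near, ki_near = fuzzy_pi_gains(0.1)  # close to the setpoint
```

In the paper's structure, the predicted (rather than measured) temperature error would feed this scheduling step, which is what lets it compensate for the long furnace time delay.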
- Conference Article
36
- 10.1109/icma.2007.4303969
- Aug 1, 2007
In DC motor drives, device aging and environmental factors can degrade the performance of the control system and make it difficult to redesign controller parameters in real time. In this paper, an algorithm that combines a PID control scheme with model reference adaptive control (MRAC) to autotune the controller parameters in real time when system performance changes is proposed and applied to the control of a DC motor drive. The algorithm modifies the traditional PID control scheme by replacing the error between the reference and the system output with the system output itself in the derivative part. Compared to the traditional MRAC approach, the algorithm is normalized so that a reference model can be easily generated from a given bandwidth requirement. To be practical for implementation, the algorithm also simplifies the open-loop speed transfer function to an integrator by adding a current inner loop. The results of simulation on a two-loop servo control system show that the adjusted controller parameters meet the system design very well. The algorithm was implemented on a test platform and the performance of the system was evaluated with a dynamic signal analyzer. The effectiveness of the proposed method is shown through the experimental results.
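Replacing the error with the system output in the derivative part is the classic "derivative-on-measurement" form, which avoids the derivative spike ("kick") on reference steps. A minimal sketch of the idea with a generic discrete PID, not the authors' MRAC-normalized controller:

```python
class PidDerivOnMeasurement:
    """PID in which the derivative term acts on the measured output
    (with a sign flip) rather than on the error, so a step change in
    the reference produces no derivative spike."""

    def __init__(self, kp, ki, kd, dt):
        self.kp, self.ki, self.kd, self.dt = kp, ki, kd, dt
        self.integral = 0.0
        self.prev_y = None

    def step(self, ref, y):
        error = ref - y
        self.integral += error * self.dt
        dy = 0.0 if self.prev_y is None else (y - self.prev_y) / self.dt
        self.prev_y = y
        return self.kp * error + self.ki * self.integral - self.kd * dy

pid = PidDerivOnMeasurement(kp=1.0, ki=0.0, kd=1.0, dt=0.1)
pid.step(0.0, 0.0)      # settled at ref = 0
u = pid.step(1.0, 0.0)  # reference steps to 1 while the output is unchanged
```

Because the output has not moved, `u` is just the proportional term (1.0); a derivative acting on the error would instead have added kd * 1/dt = 10 on this step.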
- Research Article
23
- 10.1109/tbme.2017.2657121
- Nov 1, 2017
- IEEE Transactions on Biomedical Engineering
Objective: This paper describes the formulation of a surface electromyogram (EMG) model capable of representing the variance distribution of EMG signals. Methods: In the model, EMG signals are handled based on a Gaussian white noise process with a mean of zero for each variance value. EMG signal variance is taken as a random variable that follows an inverse gamma distribution, allowing the representation of noise superimposed onto this variance. Variance distribution estimation based on marginal likelihood maximization is also outlined in this paper. The procedure can be approximated using rectified and smoothed EMG signals, thereby allowing the determination of distribution parameters in real time at low computational cost. Results: A simulation experiment was performed to evaluate the accuracy of distribution estimation using artificially generated EMG signals, with results demonstrating that the proposed model's accuracy is higher than that of maximum-likelihood-based estimation. Analysis of variance distribution using real EMG data also suggested a relationship between variance distribution and signal-dependent noise. Conclusion: The study reported here was conducted to examine the performance of a proposed surface EMG model capable of representing variance distribution and a related distribution parameter estimation method. Experiments using artificial and real EMG data demonstrated the validity of the model. Significance: Variance distribution estimated using the proposed model exhibits potential in the estimation of muscle force.
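The approximation mentioned above, estimating the variance-distribution parameters from rectified and smoothed EMG, can be sketched with a simple moment-matching fit. An illustration only: the paper uses marginal-likelihood maximization, and the window length and surrogate signal here are assumptions:

```python
import numpy as np

def local_variance(emg, window=64):
    """Rectify-and-smooth proxy for the latent EMG variance:
    a moving average of the squared signal."""
    squared = np.asarray(emg, dtype=float) ** 2
    kernel = np.ones(window) / window
    return np.convolve(squared, kernel, mode="valid")

def fit_inverse_gamma(variance_samples):
    """Moment-matching fit of inverse-gamma(alpha, beta) to the
    variance samples, using mean = beta/(alpha-1) and
    var = beta^2 / ((alpha-1)^2 (alpha-2))."""
    m = float(np.mean(variance_samples))
    v = float(np.var(variance_samples))
    alpha = m * m / v + 2.0
    beta = m * (alpha - 1.0)
    return alpha, beta

rng = np.random.default_rng(0)
emg = rng.normal(0.0, 1.0, 5000)  # unit-variance surrogate EMG signal
alpha, beta = fit_inverse_gamma(local_variance(emg))
```

The fitted inverse-gamma mean, beta / (alpha - 1), should recover the true signal variance (1.0 for this surrogate).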
- Conference Article
10
- 10.4043/24275-ms
- Oct 29, 2013
The anticipation and remediation of operational problems while drilling an oil well is the main goal of real-time measurement of drilling parameters such as bottomhole pressure, flow rate, pump pressure, torque, and drag, among others. The petroleum industry has spent a great amount of financial resources to ensure the quality and availability of these data, but the knowledge for correct analysis and interpretation of them is still far from being spread among rigsite teams and drilling engineers. Nowadays, the interpretation of real-time drilling data to identify possible operational problems is done by a drilling analysis specialist. However, this can be a very subjective job, since it depends on the specialist's experience. These analysts also take their decisions based on intuition and on qualitative rather than quantitative criteria. Petrobras has developed a computational tool (called PWDa) to interpret real-time drilling data, predicting and analyzing drilling operational parameters (such as pump pressure, bottomhole pressure, torque, and drag). The software detects abnormal behaviors (such as an unexpected increasing trend in bottomhole pressure) and establishes quantitative criteria in order to identify a possible cause, suggesting corrective and/or preventive actions. The main goal of the software is the establishment of an automated methodology to interpret operational parameters in real time, helping drilling engineers to take fast, correct decisions. The software is currently being implemented in Petrobras Real Time Operations (RTO) rooms and is providing good results. Over 70 wells have already been monitored with PWDa, and several operational problems (such as washout, mud losses, bit wear, downhole motor failure, deficient hole cleaning, pore pressure increments, etc.) were successfully identified, allowing the operators to take fast decisions and avoid riskier situations.
The wells monitored include deep-water exploratory wells (mostly), directional development wells, and extended-reach wells. This work aims to highlight the benefits generated by the implementation of the technology. The interaction with the drilling team, including operator and service company members, will be discussed.

Introduction The analysis of PWD (Pressure While Drilling) data and other operational parameters (such as rate of penetration, standpipe pressure, flow rate, torque, drag, etc.) is an important tool to identify and prevent several operational problems (Aragao et al.). The real-time interpretation of these data may be very useful to reduce non-productive time, risks, and operational costs. According to Teixeira et al., most events and problems have a direct or indirect impact on bottomhole pressure and standpipe pressure. Some of them may also affect torque and drag. Problems like poor hole cleaning, annular obstructions, wellbore collapse, kicks, washouts, and mud losses will affect the amount of solids in the annular space and/or friction losses and, therefore, will directly affect standpipe and bottomhole pressure (Aragao et al., 2005). Thus, the analysis of pressure data is a key element in identifying and preventing operational problems. Additionally, when other parameters are analyzed simultaneously (mud logging measurements, for instance), the interpretation becomes much richer.
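The kind of "unexpected increasing trend" detection described above can be sketched as a rolling linear-regression slope test. A generic illustration, not the PWDa algorithm; the window length and threshold are assumptions:

```python
import numpy as np

def rising_trend(pressure, window=10, slope_threshold=0.5):
    """Flag a sustained rise: fit a line to the last `window` samples
    and compare its slope (pressure units per sample) against the
    threshold."""
    if len(pressure) < window:
        return False
    recent = np.asarray(pressure[-window:], dtype=float)
    slope = np.polyfit(np.arange(window), recent, 1)[0]
    return bool(slope > slope_threshold)

flat = [100.0] * 20
climbing = [100.0 + 1.0 * i for i in range(20)]
```

A regression slope is preferred over a simple difference because it is less sensitive to single-sample noise spikes in the PWD telemetry.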
- Research Article
- 10.59717/j.xinn-life.2025.100156
- Jan 1, 2025
- The Innovation Life
The rapid advancement of single-cell technologies has brought revolutionary progress in biology, medicine, and drug development. However, the sheer volume of data and the complexity of analysis methods often pose a significant challenge for researchers lacking programming skills. To address this problem, we developed SeekSoul Online (https://seeksoul.online/index.html#/login), a comprehensive platform for single-cell multi-omics data analysis and interactive visualisation that requires no programming foundation. Designed with a user-friendly interface, the platform combines modular architecture and powerful computational capabilities to support the complete analysis process for single-cell transcriptome, single-cell immune repertoire, and SeekSpace single-cell spatial transcriptome data. The platform achieves accurate cell type identification through self-constructed high-quality reference sets and artificial intelligence technology. In addition, SeekSoul Online offers interactive data analysis and report generation, allowing users to adjust analysis parameters in real time and generate analysis reports for communication. The platform also provides comprehensive project management and sharing functions to facilitate collaboration and knowledge sharing among research teams. With automated data processing workflows and an intuitive user interface, SeekSoul Online significantly enhances the convenience and efficiency of data analysis, allowing researchers to focus more on scientific discovery and accelerating research progress.