A hyperbox classifier model for identifying secure carbon dioxide reservoirs
- Research Article
52
- 10.1016/j.asoc.2015.04.060
- May 21, 2015
- Applied Soft Computing
A supervised approach to automatically extract a set of rules to support fall detection in an mHealth system
- Research Article
- 10.5014/ajot.2025.050965
- May 1, 2025
- The American Journal of Occupational Therapy
Widely used for assessing levels of disability, the World Health Organization Disability Assessment Schedule 2.0 (WHODAS 2.0) provides informative profiles across six life domains. However, its utility is constrained by its lengthy assessment time, which decreases respondents' willingness to complete it. This study addressed that challenge by developing a computerized adaptive testing system for the WHODAS 2.0 (CAT-WHODAS 2.0) for people with dementia. Fit indices were analyzed for a multidimensional Rasch model, and the consistency of item difficulties was examined to verify score comparability across sexes. The best set of stopping rules was determined using simulations to achieve high reliability and efficiency simultaneously. The study was conducted in a community setting; the responses of 3,124 people were obtained from a nationwide database for disability certification, assessed through interviews. Twenty-seven items exhibited satisfactory model fit (infit and outfit mean squares = .58-1.35), and no items demonstrated differential item functioning by sex (difference values = -0.07 to 0.04). With the best set of rules, the CAT-WHODAS 2.0 required approximately nine items to provide high Rasch person reliabilities in the six domains. These reliabilities were similar to those of the full item bank (.90-.91 versus .93-.96). Concurrent validity was excellent: Pearson's rs = .90-.94 with the raw domain scores and .96-.99 with the item bank. The CAT-WHODAS 2.0 can provide efficient, reliable, valid, and sex-unbiased assessments of disability for people with dementia, and it may serve as an alternative for clinicians and researchers to optimize the efficiency of data collection. Plain-Language Summary: This article presents the computerized adaptive testing version of the World Health Organization Disability Assessment Schedule 2.0 (CAT-WHODAS 2.0) as an efficient solution for providing reliable, valid, and sex-unbiased assessments of disability among people with dementia. The CAT-WHODAS 2.0 is a promising alternative for clinicians because it can efficiently assess a person's level of disability with extremely high reliabilities in the six domains of functioning (cognition, mobility, self-care, getting along, life activities, and participation). The CAT-WHODAS 2.0 is also useful for researchers because its scores are comparable with those of the item bank, which consists of 27 items calibrated by the Rasch model.
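As an illustration of how stopping rules interact with adaptive item selection, the sketch below runs a simplified unidimensional, dichotomous Rasch CAT that stops once an estimated reliability target or a maximum item count is reached. The item bank, thresholds, and the reliability approximation (1 - SE^2) are assumptions for illustration only; the CAT-WHODAS 2.0 itself is multidimensional and uses the stopping rules identified in the paper's simulations.

```python
import math
import random

def rasch_prob(theta, b):
    """Probability of a positive response at ability theta for item difficulty b (dichotomous Rasch)."""
    return 1.0 / (1.0 + math.exp(-(theta - b)))

def estimate_theta(responses, difficulties, iters=25):
    """Maximum-likelihood ability estimate via Newton-Raphson, clamped to a sane range."""
    theta = 0.0
    for _ in range(iters):
        grad = sum(x - rasch_prob(theta, b) for x, b in zip(responses, difficulties))
        info = sum(rasch_prob(theta, b) * (1.0 - rasch_prob(theta, b)) for b in difficulties)
        if info < 1e-9:
            break
        theta = max(-4.0, min(4.0, theta + grad / info))
    return theta

def run_cat(item_bank, answer_fn, min_reliability=0.90, max_items=9):
    """Administer items adaptively until the reliability target or the item-count cap is reached."""
    remaining = dict(item_bank)  # item_id -> difficulty
    administered, responses, theta, reliability = [], [], 0.0, 0.0
    while remaining and len(administered) < max_items:
        # choose the unadministered item with maximum Fisher information at the current estimate
        item_id = max(remaining,
                      key=lambda i: rasch_prob(theta, remaining[i]) * (1.0 - rasch_prob(theta, remaining[i])))
        administered.append(item_id)
        responses.append(answer_fn(item_id, remaining.pop(item_id)))
        difficulties = [item_bank[i] for i in administered]
        theta = estimate_theta(responses, difficulties)
        test_info = sum(rasch_prob(theta, b) * (1.0 - rasch_prob(theta, b)) for b in difficulties)
        se = 1.0 / math.sqrt(test_info) if test_info > 0 else float("inf")
        reliability = max(0.0, 1.0 - se ** 2)  # crude approximation, assuming unit trait variance
        if reliability >= min_reliability:
            break
    return theta, administered, reliability

# Simulated respondent with true ability 0.5 answering a hypothetical 27-item bank
bank = {f"item{k}": random.uniform(-2.0, 2.0) for k in range(27)}
simulee = lambda item_id, b: int(random.random() < rasch_prob(0.5, b))
print(run_cat(bank, simulee))
```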
- Research Article
20
- 10.1016/j.apenergy.2023.122229
- Nov 15, 2023
- Applied Energy
Innovative process integrating high temperature heat pump and direct air capture
- Research Article
- 10.22070/jqepo.2020.5317.1143
- Jun 25, 2020
Because of high delivery costs, manufacturers are usually required to dispatch their products in a batch delivery system. However, using such a system has some negative effects, such as increasing the number of tardy jobs. This paper investigates the two-machine flow-shop scheduling problem in which jobs are processed in series on two stages and then dispatched to customers in batches. The objective is to minimize the batch delivery cost and the tardiness cost, which is related to the number of tardy jobs. First, a mixed-integer linear programming (MILP) model is proposed to formulate the problem. Because the problem is NP-hard, the MILP model cannot solve large-size instances in a reasonable CPU running time. To solve large-size instances, several metaheuristic algorithms are provided, including the Bees Algorithm (BA), Particle Swarm Optimization (PSO), a Genetic Algorithm (GA), and a novel Hybrid Bees Algorithm (HBA). These algorithms are then compared using Friedman and Wilcoxon signed-ranks tests, and the results are analyzed. The results indicate that the HBA provides the best performance for large-size problems.
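Any of the metaheuristics mentioned above needs a way to score a candidate solution. The sketch below evaluates one hypothetical encoding (a job sequence plus an ordered batch partition) under the abstract's objective of batch delivery cost plus a per-tardy-job penalty on a two-machine permutation flow shop; the instance data and cost values are invented for illustration.

```python
def evaluate(sequence, batches, p1, p2, due, delivery_cost, tardy_cost):
    """Total cost of a two-machine flow-shop schedule with batch delivery:
    a fixed cost per dispatched batch plus a penalty per tardy job."""
    c1 = c2 = 0.0
    completion = {}
    for j in sequence:                 # permutation flow shop: machine 1, then machine 2
        c1 += p1[j]
        c2 = max(c1, c2) + p2[j]
        completion[j] = c2
    n_tardy = 0
    for batch in batches:              # a batch leaves when its last job finishes on machine 2
        dispatch = max(completion[j] for j in batch)
        n_tardy += sum(1 for j in batch if dispatch > due[j])
    return delivery_cost * len(batches) + tardy_cost * n_tardy

# Hypothetical 4-job instance: process in order 0..3, deliver in two batches
p1, p2, due = [2, 3, 2, 4], [3, 2, 4, 2], [6, 9, 14, 15]
print(evaluate([0, 1, 2, 3], [[0, 1], [2, 3]], p1, p2, due, delivery_cost=5, tardy_cost=3))  # 13
```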
- Book Chapter
16
- 10.1007/978-3-030-46150-8_3
- Jan 1, 2020
Association rules are among the most important concepts in data mining. Rules of the form \(X \rightarrow Y\) are simple to understand, simple to act upon, yet can model important local dependencies in data. The problem, however, is that there are so many of them. Both traditional and state-of-the-art frameworks typically yield millions of rules, rather than identifying a small set of rules that capture the most important dependencies of the data. In this paper, we define the problem of association rule mining in terms of the Minimum Description Length principle: that is, we identify the best set of rules as the one that most succinctly describes the data. We show that the resulting optimization problem does not lend itself to exact search, and hence propose Grab, a greedy heuristic to efficiently discover good sets of noise-resistant rules directly from data. Through extensive experiments we show that, unlike the state of the art, Grab does reliably recover the ground truth. On real-world data we show it finds reasonable numbers of rules that, upon close inspection, give clear insight into the local distribution of the data.
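Grab's specific encoding and rule-refinement steps go beyond what the abstract states, but the outer greedy loop it describes, repeatedly adding whichever candidate rule most reduces a two-part description length, can be sketched generically. The description_length function below is a placeholder the caller must supply, not Grab's actual MDL score.

```python
def greedy_rule_selection(candidates, data, description_length):
    """Greedy MDL-style selection: start from an empty model and repeatedly add the
    candidate rule that most reduces L(model) + L(data | model)."""
    selected = []
    best = description_length(selected, data)
    while True:
        scored = [(best - description_length(selected + [r], data), r)
                  for r in candidates if r not in selected]
        if not scored:
            break
        gain, rule = max(scored, key=lambda t: t[0])
        if gain <= 0:          # no remaining candidate compresses the data any further
            break
        selected.append(rule)
        best -= gain
    return selected, best
```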
- Research Article
11
- 10.1007/s12665-020-08902-x
- Mar 31, 2020
- Environmental Earth Sciences
The strength and direction of gravity and ground deformation anomaly vectors over normal and tight reservoirs after CO2 injection depend on the reservoir and CO2 properties. Some of these properties, such as porosity, permeability, and size, define the reservoir type and therefore indirectly determine the existence or sign of the anomalies. Other properties, such as reservoir depth, horizontal extension of the CO2 plume, and CO2 mass and density, directly affect the strength of the measurements. Gravimetric and geodetic modelling of synthetic CO2 reservoirs can quantify each of these direct effects and thus the expected signals over different geological settings. We present the gravity signal as the gravity increase due to CO2 mass attraction combined with the gravity decrease caused by ground uplift. Our results indicate that the reservoir depth and horizontal extension, along with the injected mass, have significant influences on both the ground deformation and gravity signals. The results also demonstrate that the horizontal extension of the CO2 distribution decreases the dependency on depth. In addition, the observed gravity signal over CO2 reservoirs is dominated by the free-air effect from large ground deformation. Finally, the gravity effect over both normal and tight reservoirs and the ground surface deformation over tight reservoirs are highly dependent on the density change of the injected CO2 inside the reservoir as a result of depth-dependent changes in pressure and temperature.
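The competing terms described above can be illustrated with a deliberately simplified point-mass approximation (the study itself models extended synthetic reservoirs): the attraction of the injected CO2 mass raises gravity by roughly G*M/d^2 directly above the plume, while uplift of the gravimeter reduces gravity through the free-air gradient of about 308.6 microGal per metre of elevation gain. All numbers below are illustrative assumptions, not values from the paper.

```python
G = 6.674e-11          # gravitational constant, m^3 kg^-1 s^-2
FREE_AIR = 308.6       # free-air gradient magnitude, microGal per metre of uplift
MS2_TO_UGAL = 1e8      # 1 m/s^2 = 1e8 microGal

def gravity_signal(mass_kg, depth_m, uplift_m):
    """Net surface gravity change (microGal) directly above a CO2 plume,
    approximating the plume as a point mass at reservoir depth."""
    attraction = G * mass_kg / depth_m ** 2 * MS2_TO_UGAL  # increase from the injected mass
    free_air = FREE_AIR * uplift_m                         # decrease from ground uplift
    return attraction - free_air

# Example: 1 Mt of CO2 at 1 km depth with 1 cm of surface uplift -> ~6.7 - 3.1 = ~3.6 microGal
print(gravity_signal(1e9, 1000.0, 0.01))
```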
- Research Article
363
- 10.1177/0263276411417430
- Nov 1, 2011
- Theory, Culture & Society
In a quiet London office, a software designer muses on the algorithms that will make possible the risk flags to be visualized on the screens of border guards from Heathrow to St Pancras International. There is, he says, ‘real time decision making’ – to detain, to deport, to secondarily question or search – but there is also the ‘offline team who run the analytics and work out the best set of rules’. Writing the code that will decide the association rules between items of data, prosaic and mundane – flight route, payment type, passport – the analysts derive a novel preemptive security measure. This paper proposes the analytic of the data derivative – a visualized risk flag or score drawn from an amalgam of disaggregated fragments of data, inferred from across the gaps between data and projected onto an array of uncertain futures. In contrast to disciplinary and enclosed techniques of collecting data to govern population, the data derivative functions via ‘differential curves of normality’, imagining a range of potential futures through the association rule, thus ‘opening up to let things happen’ (Foucault 2007). In some senses akin to the risk orientation of the financial derivative, itself indifferent to actual underlying people, places or events by virtue of modulated norms, the contemporary security derivative is not centred on who we are, nor even on what our data say about us, but on what can be imagined and inferred about who we might be – on our very proclivities and potentialities.
- Research Article
16
- 10.1016/j.dche.2022.100018
- Feb 27, 2022
- Digital Chemical Engineering
Design of mosquito repellent molecules via the integration of hyperbox machine learning and computer aided molecular design
- Conference Article
3
- 10.1109/dese.2013.24
- Dec 1, 2013
Automatic fall detection is a major issue in caring for the health of elderly people. In this task, the capability of distinguishing falls from normal daily activities in real time is crucial. To this aim, this paper proposes an approach based on the automatic extraction of knowledge, expressed as a set of IF-THEN rules, from a database of fall recordings. This set of rules, generated offline, can then be exploited in a real-time mobile monitoring system: data gathered by wearable sensors are processed in real time and, if their values activate some of the rules describing falls, an alarm message is automatically produced. The approach has been compared against other classifiers on a real-world fall database, and its discrimination ability is shown to be higher. Moreover, a test phase for the real-time mobile monitoring system is being carried out on real cases.
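The rule-activation step in the mobile monitoring pipeline described above might look like the following sketch; the feature names, operators, and thresholds are hypothetical placeholders, not the rules actually extracted in the paper.

```python
# Hypothetical rule set of the IF-THEN form described above: each rule is a list of
# (feature, operator, threshold) conditions; if any fall rule is satisfied, raise an alarm.
FALL_RULES = [
    [("accel_peak_g", ">", 2.5), ("post_impact_stillness_s", ">", 1.5)],
    [("accel_peak_g", ">", 3.0), ("orientation_change_deg", ">", 60.0)],
]

OPS = {">": lambda a, b: a > b, "<": lambda a, b: a < b, ">=": lambda a, b: a >= b}

def is_fall(features, rules=FALL_RULES):
    """Return True if the feature vector of the current sensor window activates any rule."""
    return any(all(OPS[op](features[name], thr) for name, op, thr in rule) for rule in rules)

# One windowed feature vector computed from wearable-sensor data (values invented)
window = {"accel_peak_g": 3.2, "post_impact_stillness_s": 2.0, "orientation_change_deg": 10.0}
if is_fall(window):
    print("ALARM: possible fall detected")
```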
- Conference Article
3
- 10.1109/ic3i.2014.7019579
- Nov 1, 2014
This paper presents a bottom-up Pittsburgh approach for the discovery of classification rules. Population initialization makes use of entropy as the attribute significance measure and contains variable-sized organizations, where each organization contains a set of IF-THEN rules. Because a bottom-up approach is employed, traditional operators are neither feasible nor efficient to use; therefore, four evolutionary operators are devised to realize the evolutionary operations performed on organizations. The bottom-up Pittsburgh approach yields the best set of rules with good accuracy. In the experiments, the effectiveness of the proposed algorithm is evaluated by comparing the results of the bottom-up Pittsburgh approach, with and without entropy, to the top-down Michigan approach, with and without entropy, on 10 datasets from the UCI and KEEL repositories. All results show that the bottom-up Pittsburgh approach achieves a higher predictive accuracy and is more consistent.
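The abstract names entropy as the attribute significance measure used to seed the population but does not spell out the exact formulation; information gain is one common instantiation, sketched below for categorical attributes with invented data.

```python
import math
from collections import Counter

def entropy(labels):
    """Shannon entropy of a list of class labels."""
    n = len(labels)
    return -sum((c / n) * math.log2(c / n) for c in Counter(labels).values())

def information_gain(rows, labels, attribute):
    """Entropy reduction from splitting on one categorical attribute -- one way to
    rank attribute significance when seeding a population of IF-THEN rules."""
    by_value = {}
    for row, y in zip(rows, labels):
        by_value.setdefault(row[attribute], []).append(y)
    remainder = sum(len(subset) / len(labels) * entropy(subset) for subset in by_value.values())
    return entropy(labels) - remainder

# Toy example: "outlook" separates the classes better than "windy"
rows = [{"outlook": "sunny", "windy": "no"}, {"outlook": "sunny", "windy": "yes"},
        {"outlook": "rain", "windy": "no"}, {"outlook": "rain", "windy": "yes"}]
labels = ["play", "play", "stay", "stay"]
print(information_gain(rows, labels, "outlook"), information_gain(rows, labels, "windy"))
```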
- Conference Article
2
- 10.1109/inbs.1995.404268
- May 29, 1995
An important stage in the development of living systems on Earth was the formation of RNA-like molecules capable of self-transcription and self-replication. In this paper, the authors attempt to develop a simple, flexible, and accurate computer model of nucleotide interactions that lead to the non-enzymatic transcription of an oligonucleotide that acts as a template to catalyze the formation of a suite of oligonucleotides. The authors' computer model is based on cellular automata and allows nucleotides to experience random movement and interact locally to associate with a template and/or oligomerize with other nucleotides according to a set of rules. To test the simulation method, results were compared to specific laboratory experimental results. The hypotheses were that the best set of rules developed would be able to produce results that were: 1. more similar to the laboratory experiment's results than random rules; 2. more similar to the laboratory experiment's results than a set of rules that is chemically realistic but has random probabilities; and 3. statistically similar to the laboratory experiment's results. The test for determining whether the results were statistically similar was done using a regression analysis. At the α = 0.05 level, the first two hypotheses were supported, and the third hypothesis has not yet been statistically supported.
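As a rough illustration of the kind of rule-driven, locally interacting simulation the abstract describes, the sketch below moves free nucleotides randomly along a one-dimensional lattice above a fixed template strand and lets them bind to complementary template positions with a fixed probability. The alphabet, lattice geometry, and probabilities are assumptions for illustration, not the authors' rule set.

```python
import random

COMPLEMENT = {"A": "U", "U": "A", "G": "C", "C": "G"}

def simulate(template, n_free=200, steps=2000, p_bind=0.3, seed=0):
    """Free nucleotides random-walk over the template and associate with
    unoccupied, complementary positions with probability p_bind."""
    rng = random.Random(seed)
    length = len(template)
    bound = [None] * length                                   # nucleotide bound at each position
    free = [(rng.randrange(length), rng.choice("AUGC")) for _ in range(n_free)]
    for _ in range(steps):
        still_free = []
        for pos, base in free:
            pos = (pos + rng.choice((-1, 1))) % length        # random movement on the lattice
            if bound[pos] is None and base == COMPLEMENT[template[pos]] and rng.random() < p_bind:
                bound[pos] = base                             # association event
            else:
                still_free.append((pos, base))
        free = still_free
    return "".join(b or "-" for b in bound)                   # partially transcribed complement

print(simulate("AUGGCUACGAUC"))
```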
- Conference Article
2
- 10.2118/192011-ms
- Oct 23, 2018
High carbon dioxide in reservoirs limits successful exploration in many petroliferous basins, particularly in Southeast Asia. High reservoir CO2 in the offshore Malay Basin represents a significant exploration challenge. Some fields contain >80% CO2, which makes them unattractive targets for development. Various hypotheses on the origin of CO2 have been proposed but remain controversial. This paper shows that geochemistry and advanced petroleum system modeling help to resolve the origins of reservoir CO2 and allow quantitative estimates of CO2 in prospective reservoir targets prior to drilling. A novel workflow estimates the CO2 content in reservoirs based on knowledge of the chemical mechanisms for the origin of the CO2 and numerical simulation of geologic burial history. Heat flow, deposition of overburden rock, and the kinetics of specific reaction mechanisms control the timing of CO2 generation and the relative contributions of CO2 from different sources. In this study, stable carbon isotope ratios of CO2 and methane (δ13CCO2 and δ13CCH4, ‰) were used to identify the source of the CO2 in Malay Basin gas samples. For example, Figure 3 shows δ13CCO2 and δ13CCH4 for samples from various depths in the nearby field. The isotope data indicate that the samples contain mixed CO2 derived by different mechanisms from two sources. Partial least squares (PLS) regression of δ13CCO2, δ13CCH4, and depth for 61 samples from the nearby field, with %CO2 set as the dependent variable, resulted in a systematic correlation between predicted and measured %CO2. Alternating least squares (ALS) confirms that the data can be explained by mixing of gases from two endmembers: (1) shallower samples show lower %CO2 that is isotopically depleted in δ13CCH4 and δ13CCO2, and (2) deeper samples show higher %CO2 that is isotopically enriched in δ13CCH4 and δ13CCO2. The relative proportion of each endmember in the mixture can be calculated for each gas. Examples of near-endmember gases in the nearby field (Figure 3) are: (1) shallow thermogenic CO2 derived by cracking of kerogen, e.g., 1681 m, 5% CO2, δ13CCH4 = -60‰, δ13CCO2 = -13‰ (100:0 mix); and (2) deep CO2 from carbonate decomposition, e.g., 2918 m, 74% CO2, δ13CCH4 = -32‰, δ13CCO2 = -3‰ (15:85 mix). These results are consistent with the general observation that tested Miocene traps in the Malay Basin show a general trend of higher concentrations of CO2 in the deeper traps that are nearer the carbonate basement. Biogenic CO2 may represent a third endmember in other parts of the basin.
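A minimal sketch of the two-endmember mixing idea, using the near-endmember isotope values quoted above and assuming simple linear mixing in delta space; the study itself applies PLS and alternating least squares across 61 samples, so this is only an illustration of how a single gas sample's mixing fraction might be estimated.

```python
import numpy as np

# Near-endmember isotope signatures quoted in the abstract: columns are (d13C_CH4, d13C_CO2) in permil
KEROGEN_CO2   = np.array([-60.0, -13.0])   # shallow thermogenic (kerogen-cracking) endmember
CARBONATE_CO2 = np.array([-32.0,  -3.0])   # deep carbonate-decomposition endmember

def mixing_fraction(sample, e1=KEROGEN_CO2, e2=CARBONATE_CO2):
    """Least-squares fraction f of endmember e1 (and 1-f of e2) that best explains a
    sample's isotope pair, assuming simple linear mixing in delta space."""
    d = e1 - e2
    f = float(np.dot(sample - e2, d) / np.dot(d, d))
    return min(1.0, max(0.0, f))

# A gas with intermediate isotope values plots as a roughly even mixture of the two endmembers
print(mixing_fraction(np.array([-45.0, -8.0])))   # ~0.47
```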
- Supplementary Content
- 10.14457/tu.the.2017.329
- Jan 1, 2017
- NRCT Data Center
For industrial workers who are exposed to hazardous work conditions, daily hazard exposures can be alleviated by appropriately rotating workers among various jobs within a workday. To effectively implement job rotation, workers' daily work schedules must be generated. This research studies the multiobjective ergonomic workforce scheduling problem (MO-EWSP), which is intended to generate safe daily rotating work schedules for workers such that their daily hazard exposures do not exceed a daily permissible exposure limit. Three problem objectives are considered: (1) minimizing manpower cost, (2) maximizing productivity, and (3) minimizing job dissatisfaction. The criteria used in the three objectives are the number of workers, the total worker-job fit score, and the total number of dissatisfied worker-job and worker-partner assignments, respectively. Workers are heterogeneous with respect to work ability, skill level, job preference, and partner preference. The jobs being considered have different operation work schedules as well as different numbers of required operators. The problem solution consists of the number of workers for job rotation, their daily rotating work schedules, the total worker-job fit score, and the total number of dissatisfied worker-job and worker-partner assignments. In this research, the MO-EWSP is solved preemptively and nonpreemptively using both optimization and metaheuristic approaches. For the former approach, two mathematical models are developed, namely, a preemptive mixed integer linear programming model (P-MILP) and a nonpreemptive goal programming model (N-GP). For the latter approach, two genetic algorithms (GAs) are developed, namely, a preemptive multiobjective GA (P-MOGA) and a nonpreemptive multiobjective GA (N-MOGA). A numerical example is generated to illustrate the use of the mathematical programming and GA approaches. Firstly, the problem is solved to optimality using the P-MILP and N-GP models. Next, the same problem is solved using the P-MOGA and N-MOGA approaches; at the end of the last generation, a set of good solutions is obtained. It is shown that the preemptive and nonpreemptive considerations can yield the same targets if the scale weights, relative importance weights, targets, and objective functions are suitably defined. That is, the objective values in all objectives when the problem is solved preemptively are the same as those when it is solved nonpreemptively; nevertheless, different daily work schedules are obtained. Additionally, a computational experiment is conducted with 18 hypothetical test problems. The P-MILP model can solve the test problems to optimality with a success rate of 83%. The N-GP, P-MOGA, and N-MOGA can solve all test problems in less than 1% of the computation time required by the P-MILP model. The N-GP requires the shortest computation time since it solves multiple objectives at once; however, it provides only one optimum solution, which is the same as that of the P-MILP model. The P-MOGA approach can provide a set of good solutions at the end of the evolution process. For the N-MOGA approach, the overall average percent deviation from the P-MILP solution is larger than that of the P-MOGA approach; however, the N-MOGA approach provides more flexibility to the scheduler in adjusting the priority order of objectives, as opposed to having only one priority order with the preemptive consideration.
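To make the preemptive/nonpreemptive distinction discussed above concrete, the sketch below compares candidate schedules either lexicographically (objectives in strict priority order) or through a single weighted score. The objective names, values, and weights are illustrative assumptions, not data from the thesis, and the weights are assumed to be pre-scaled so the objectives are commensurable.

```python
# Hypothetical objective values for candidate schedules, in the stated priority order:
# minimize manpower cost, maximize worker-job fit score, minimize dissatisfaction.
def preemptive_key(sol):
    """Lexicographic key: objectives are compared strictly in priority order."""
    return (sol["manpower_cost"], -sol["fit_score"], sol["dissatisfaction"])

def nonpreemptive_score(sol, weights=(0.5, 0.3, 0.2)):
    """Single weighted score: objectives are traded off via hypothetical, pre-scaled weights."""
    w1, w2, w3 = weights
    return w1 * sol["manpower_cost"] - w2 * sol["fit_score"] + w3 * sol["dissatisfaction"]

candidates = [
    {"manpower_cost": 12, "fit_score": 88, "dissatisfaction": 4},
    {"manpower_cost": 12, "fit_score": 90, "dissatisfaction": 6},
]
print(min(candidates, key=preemptive_key))        # best schedule under the preemptive ordering
print(min(candidates, key=nonpreemptive_score))   # best schedule under the weighted trade-off
```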
- Research Article
5
- 10.5075/epfl-cisbat2015-859-864
- Jan 1, 2015
- Infoscience (Ecole Polytechnique Fédérale de Lausanne)
Bi-level optimisation of distributed energy systems incorporating non-linear power flow constraints
- Research Article
129
- 10.1016/j.ijggc.2018.11.011
- Nov 27, 2018
- International Journal of Greenhouse Gas Control
Life cycle carbon efficiency of Direct Air Capture systems with strong hydroxide sorbents