The capacity compliance problem: refinements of time discounting to Choquet integrals with 2-additive fuzzy measures
Abstract The motivation for this research is a classical problem in financial mathematics, namely the subjective valuation of rewards earned in future periods. Various additive formulas have been proposed and criticized both theoretically and experimentally. We start from the premise that interactions among the periods are allowed to affect this subjective assessment. Dispensing with the additive form of time discounting leads to a Choquet evaluation defined from a capacity that encapsulates these interactions and, at the same time, coincides with the standard evaluation at any single time period. This consideration raises the general capacity compliance problem. We pose this question formally and, motivated by computational tractability, investigate the main traits of its 2-additive configuration. We then apply our conclusions to two specific statements suggested by the exponential and hyperbolic discounted utility formulas, respectively.
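As a hedged illustration of the non-additive evaluation described above (a minimal sketch, not the paper's own construction): a 2-additive capacity is conveniently handled through its Möbius representation, where the Choquet integral reduces to a weighted sum of per-period values plus pairwise minimum terms, and the singleton Möbius masses can be chosen to coincide with standard discount weights (here assumed exponential weights δ^t with δ = 0.8).

```python
def choquet_2additive(x, m_single, m_pairs):
    """Choquet integral of a reward profile x for a 2-additive capacity in
    Mobius form: C(x) = sum_i m({i})*x_i + sum_{i<j} m({i,j})*min(x_i, x_j).
    On any single time period the capacity agrees with the singleton mass."""
    value = sum(m_single[i] * x[i] for i in range(len(x)))
    value += sum(m * min(x[i], x[j]) for (i, j), m in m_pairs.items())
    return value

# Illustrative exponential discount weights delta**t for periods t = 0, 1, 2.
weights = [0.8 ** t for t in range(3)]   # [1.0, 0.8, 0.64]
rewards = [1.0, 1.0, 1.0]

additive = choquet_2additive(rewards, weights, {})
with_interaction = choquet_2additive(rewards, weights, {(0, 1): -0.1})
```

With no pairwise masses the evaluation is the ordinary discounted sum; a negative mass on the pair {0, 1} models a substitutive interaction between the first two periods and lowers the overall value.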
- Research Article
753
- 10.1080/01621459.1960.10482064
- Jun 1, 1960
- Journal of the American Statistical Association
The exponentially weighted average can be interpreted as the expected value of a time series made up of two kinds of random components: one lasting a single time period (transitory) and the other lasting through all subsequent periods (permanent). Such a time series may, therefore, be regarded as a random walk with “noise” superimposed. It is also shown that, for this series, the best forecast for the time period immediately ahead is the best forecast for any future time period, because both give estimates of the permanent component. The estimate of the permanent component is imperfect, and so the estimate of a regression coefficient is inconsistent in a relation involving the permanent (e.g. consumption as a function of permanent income). Its bias is small, however.
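The recursion behind the exponentially weighted average described above can be sketched as follows (a minimal illustration; the smoothing constant `alpha` is an assumed parameter, not taken from the paper):

```python
def ewma_forecast(series, alpha):
    """One-step exponentially weighted average: f_t = alpha*x_t + (1-alpha)*f_{t-1}.
    Under the permanent-plus-transitory model, this same value is the best
    forecast for every future period, since it estimates the permanent
    component of the series."""
    forecast = series[0]
    for x in series[1:]:
        forecast = alpha * x + (1 - alpha) * forecast
    return forecast
```

A constant series is forecast exactly, while noisy observations are down-weighted geometrically with age.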
- Research Article
42
- 10.1093/oxfordjournals.aje.a112333
- Nov 1, 1976
- American journal of epidemiology
The principles of the Ederer-Myers-Mantel procedure for seeking evidence of disease clustering are reviewed. The procedure is based on cumulating empirical clusters, i.e., the largest frequency in a single time period or in two successive time periods, and comparing that cumulation with the expected cumulation of largest frequencies under random occurrence. Original tabulations covered totals of up to 15 cases distributed among three, four or five time periods. Present tabulations of expectations and variances cover up to 500 cases distributed among two or three time periods and 200 cases distributed among four or five time periods. Asymptotic formulas are provided for the expectation and variance of the largest frequency in a single period when arbitrarily many cases are distributed at random among two, three, four or five time periods.
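The cluster statistics described above can be sketched as follows (a minimal reading of the procedure, not the original tabulation code):

```python
def emm_cluster_statistic(counts):
    """Largest frequency in any single time period and in any two successive
    time periods -- the empirical clusters that the Ederer-Myers-Mantel
    procedure compares against their expectations under random occurrence."""
    max_single = max(counts)
    max_double = (max(counts[i] + counts[i + 1] for i in range(len(counts) - 1))
                  if len(counts) > 1 else max_single)
    return max_single, max_double
```

For example, case counts of [3, 7, 2, 5] across four periods give a largest single-period frequency of 7 and a largest two-successive-period frequency of 10.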
- Research Article
3
- 10.1111/oik.09617
- Dec 22, 2022
- Oikos
Facilitation is an interaction where one species (the benefactor) positively impacts another (the beneficiary). However, the reciprocal effects of beneficiaries on their benefactors are typically only documented using short‐term datasets. We use Azorella selago, a cushion plant species and benefactor, and a co‐occurring grass species, Agrostis magellanica, on sub‐Antarctic Marion Island, comparing cushion plants and the grasses growing on them over a 13‐year period using a correlative approach. We additionally compare the feedback effect of A. magellanica on A. selago identified using our long‐term dataset with data collected from a single time period. We hypothesized that A. selago size and vitality would be negatively affected by A. magellanica cover and that the effect of A. magellanica on A. selago would become more negative with increasing beneficiary cover and abiotic severity, due to, e.g., more intense competition for resources. We additionally hypothesized that A. magellanica cover would increase more on cushion plants with greater dead stem cover, since dead stems do not inhibit grass colonization or growth. The relationship between A. magellanica cover and A. selago size and vitality was not significant in the long‐term dataset, and the feedback effect of A. magellanica on A. selago did not vary significantly with altitude or aspect; however, data from a single time period did not consistently identify this same lack of correlation. Moreover, A. selago dead stem cover was not significantly related to an increase in A. magellanica cover over the long term; however, we observed contrasting results from short‐term datasets. Long‐term datasets may, therefore, be more robust (and practical) for assessing beneficiary feedback effects than conventional approaches, particularly when benefactors are slow‐growing.
For the first time using a long‐term dataset, we show a lack of physical cost to a benefactor species in a facilitative interaction, in contrast to the majority of short‐term studies.
- Research Article
11
- 10.1007/s00267-011-9648-x
- Mar 4, 2011
- Environmental Management
Wildlife managers have little or no control over climate change. However, they may be able to alleviate potential adverse impacts of future climate change by adaptively managing wildlife for climate change. In particular, wildlife managers can evaluate the efficacy of compensatory management actions (CMAs) in alleviating potential adverse impacts of future climate change on wildlife species using probability-based or fuzzy decision rules. Application of probability-based decision rules requires managers to specify certain probabilities, which is not possible when they are uncertain about the relationships between observed and true ecological conditions for a species. Under such uncertainty, the efficacy of CMAs can be evaluated and the best CMA selected using fuzzy decision rules. The latter are described and demonstrated using three constructed cases that assume: (1) a single ecological indicator (e.g., population size for a species) in a single time period; (2) multiple ecological indicators for a species in a single time period; and (3) multiple ecological conditions for a species in multiple time periods.
- Conference Article
- 10.1109/codit.2017.8102558
- Apr 1, 2017
In this paper, we consider the bipartite complete matching vertex interdiction problem, taking into account incompatibilities existing among the resources to assign. Solving this problem yields a robust assignment, where robustness is defined by the number of resources that may go missing while still allowing a valid assignment. We introduce graph formulations considering a single time period or several. The problem is shown to be NP-hard, even when considering only a single time period. For several time periods, we adapt the graph formulation, allowing us to solve the problem using polynomial heuristics. Two greedy algorithms and a genetic algorithm are proposed and compared on a randomly generated testbed.
- Conference Article
- 10.2118/4714-ms
- Nov 7, 1973
This paper was prepared for the Eastern Regional Meeting of the Society of Petroleum Engineers of AIME, to be held in Pittsburgh, Pa., Nov. 7–9, 1973. Permission to copy is restricted to an abstract of not more than 300 words. Illustrations may not be copied. The abstract should contain conspicuous acknowledgment of where and by whom the paper is presented. Publication elsewhere after publication in the JOURNAL OF PETROLEUM TECHNOLOGY or the SOCIETY OF PETROLEUM ENGINEERS JOURNAL is granted upon request to the Editor of the appropriate journal provided agreement to give proper credit is made. Discussion of this paper is invited. Three copies of any discussion should be sent to the Society of Petroleum Engineers office. Such discussion may be presented at the above meeting and, with the paper, may be considered for publication in one of the SPE magazines. Abstract This paper is an extension of an earlier paper, "The Application of Linear Flow Models to Natural Gas Distribution Systems", which was concerned with the development of a static linear flow model and its application to the Pennsylvania Gas Co.'s distribution network. The same network is modeled in this paper by means of a dynamic linear flow model. Unlike the static model, which is applicable only over a single period of time such as a day, the dynamic linear flow model is capable of depicting the day-to-day operations of a complex distribution system over longer periods of time such as a month or year. Like the static model, the dynamic model is a valuable tool in studying both the direct and indirect effects of some alteration in the system. Examples of alterations within a gas distribution system include a new underground gas storage field, a severed pipeline or a reduced gas contract.
The development of a dynamic linear flow model is largely a matter of collecting a sequential set of static linear flow models, analyzing the behavior of the technical coefficients over time, and then developing suitable methods for predicting these coefficients as a function of time. However, the model is incomplete and of little use to management and engineering until it is applied to the future. This paper presents a stage-wise forecast of the future based upon the technical coefficients over a previous time period. In addition to the model-building process and the actual field application, an impact study is performed to exhibit the type of information available from a dynamic linear flow model. Introduction A deterministic operations research model, the linear flow model, was applied to a gas distribution network in a previous paper. In general, the model can be applied to any complex flow system where homogeneous flow units enter, travel within the flow system according to some process, and then eventually leave. A "static" linear flow model is concerned with the characteristics of this flow over a single period of time. The model portrays the flow system as a group of interrelated sectors, with each sector having a physical counterpart in the real-world situation. In a gas distribution system, the flow units (scf) enter the network from various sources (contract gas, gas from underground storage, gas produced by the distribution company itself, etc.), travel through pipelines and compressor stations, and finally are consumed by those served by the system, lost into the atmosphere, or stored for later use.
- Research Article
10
- 10.1016/j.ejor.2020.10.023
- Oct 27, 2020
- European Journal of Operational Research
Omega ratio optimization with actuarial and financial applications
- Conference Article
- 10.1063/5.0108923
- Jan 1, 2022
Efficiency evaluation, together with sustained efforts at improvement, is essential for success for every entity across industries, and Data Envelopment Analysis (DEA) is a widely accepted technique for efficiency evaluation and benchmarking of Decision-Making Units (DMUs). Although DEA has been applied in almost all domains, both for efficiency evaluation in a single time period and in time series analysis alongside other techniques, DEA in its conventional form has certain drawbacks, such as considering only a single time period and assigning a different benchmark for every time period, which dilutes the purpose of the analysis. To address these problems, this paper suggests a Time Series Clustered Benchmarking (TSCB) DEA, which combines conventional DEA models with cluster analysis to analyse multi-period time series data and assign benchmarks to DMUs with a similar structural genealogy, resulting in achievable and feasible goals for efficiency enhancement. To demonstrate the proposed approach, data on twenty-four Indian public sector banks (PSBs), taken as DMUs, has been used. The proposed TSCB DEA approach categorizes the DMUs into nine clusters and identifies a benchmark for each cluster separately, so that a benchmark is found for each inefficient bank in a more effective way.
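As background for the conventional DEA scores mentioned above (a minimal special case, not the TSCB procedure itself): with a single input and a single output, the CCR efficiency of a DMU in one time period reduces to its output-to-input ratio divided by the best ratio in the sample.

```python
def ccr_efficiency_1d(inputs, outputs):
    """CCR efficiency scores in the single-input, single-output case:
    each DMU's output/input ratio relative to the best observed ratio.
    The general multi-input/multi-output case instead requires solving
    one linear program per DMU."""
    ratios = [y / x for x, y in zip(inputs, outputs)]
    best = max(ratios)
    return [r / best for r in ratios]
```

DMUs achieving the best ratio score 1.0 and form the efficient frontier; the rest are benchmarked against them.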
- Research Article
33
- 10.3390/rs10071136
- Jul 18, 2018
- Remote Sensing
Continuous-based predictors of habitat characteristics derived from satellite imagery are increasingly used in species distribution models (SDM). This is especially the case of the Normalized Difference Vegetation Index (NDVI), which provides estimates of vegetation productivity and heterogeneity. However, when NDVI predictors are incorporated into SDM, synchrony between biological observations and image acquisition must be questioned. Due to seasonal variations of NDVI during the year, landscape patterns of habitats are revealed differently from one date to another, leading to variations in models’ performance. In this paper, we investigated the influence of the acquisition time period of NDVI on explaining and predicting bird community patterns over France. We examined whether the NDVI acquisition period that best fits the bird data depends on the dominant land cover context. We also compared models based on a single time period of NDVI with one model built from the Dynamic Habitat Index (DHI) components, which summarize variations in vegetation phenology throughout the year from the fraction of radiation absorbed by the canopy (fPAR). Bird species richness was calculated as the response variable for 759 plots of 4 km2 from the French Breeding Bird Survey. Bird specialists and generalists to habitat were considered. NDVI and DHI predictors were both derived from MODIS products. For NDVI, five time periods in 2010 were compared, from late winter to the beginning of autumn. A climate predictor was also used, and Generalized Additive Models were fitted to explain and predict bird species richness. Results showed that NDVI-based proxies of dominant habitat identity and spatial heterogeneity explain more bird community patterns than DHI-based proxies of annual productivity and seasonality. We also found that models’ performance was both time- and context-dependent, varying according to the bird groups.
In general, the best time period of NDVI did not match the acquisition period of the bird data because, in the case of synchrony, differences between habitats are less pronounced. These findings suggest that the most powerful approach to estimating bird community patterns is the simplest one. It only requires NDVI predictors from a single appropriate time period, in addition to climate, which makes the approach very operational.
- Research Article
83
- 10.1287/trsc.36.1.40.572
- Feb 1, 2002
- Transportation Science
In a companion paper (Godfrey and Powell 2002) we introduced an adaptive dynamic programming algorithm for stochastic dynamic resource allocation problems, which arise in the context of logistics and distribution, fleet management, and other allocation problems. The method depends on estimating separable nonlinear approximations of value functions, using a dynamic programming framework. That paper considered only the case in which the time to complete an action was always a single time period. Experiments with this technique quickly showed that when the basic algorithm was applied to problems with multiperiod travel times, the results were very poor. In this paper, we illustrate why this behavior arose, and propose a modified algorithm that addresses the issue. Experimental work demonstrates that the modified algorithm works on problems with multiperiod travel times, with results that are almost as good as the original algorithm applied to single period travel times.
- Conference Article
3
- 10.1109/hicss.1998.656007
- Jan 6, 1998
The objective of the paper is to present experimental results for testing the performance of different auction mechanisms related to the introduction of competitive markets for the generation of electricity. The research is based on the concept of smart markets introduced by Vernon Smith (K.A. McCabe et al., 1991) and a simulation model (PowerWeb) of a realistic bulk power system. There are unique physical aspects associated with the supply of electricity (e.g. required instantaneous matching of supply and demand, unintended congestion of parallel transmission routes and maintenance of system stability in response to disturbances). As a result, traditional theories of efficient markets and auction structures developed for other commodities may not be efficient if applied without alteration to markets for electricity. Conversely, current utility rules of operation developed for a centrally planned regime may not be appropriate in a competitive environment. The research does not address the issues of multiperiod operations (unit commitment) and multidimensional markets (ancillary services), and considers only real power in a single time period. The main objective is to test three alternative auction mechanisms when market power is a potential problem. This situation occurs when limits on transmission lines are binding to form a load pocket in which demand is met by a few (in this case two) generators.
- Research Article
47
- 10.1103/physrevlett.108.120503
- Mar 22, 2012
- Physical Review Letters
The behavior of any physical system is governed by its underlying dynamical equations. Much of physics is concerned with discovering these dynamical equations and understanding their consequences. In this Letter, we show that, remarkably, identifying the underlying dynamical equation from any amount of experimental data, however precise, is a provably computationally hard problem (it is NP hard), both for classical and quantum mechanical systems. As a by-product of this work, we give complexity-theoretic answers to both the quantum and classical embedding problems, two long-standing open problems in mathematics (the classical problem, in particular, dating back over 70 years).
- Book Chapter
48
- 10.1016/s0927-0507(07)15006-4
- Jan 1, 2007
- Handbooks in Operations Research and Management Science
Chapter 6 Spectral Methods in Derivatives Pricing
- Research Article
6
- 10.1515/peps-2012-0010
- Dec 13, 2012
- peps
The purpose of this paper is to apply economic analysis to the opportunities and choices of single individual ‘lone wolf’ terrorists whose attacks are characterised by ‘sprees’ of violence, usually shooting sprees in public places, that last only for a relatively short period of time. The spree lone wolf also emerges suddenly. Having previously allocated no resources to terrorism, he suddenly and all at once allocates all of his resources, including time, to terrorism. The first step to providing guidance to governments and their law enforcement agencies is to encompass some important elements of the spree lone wolf’s opportunities and choices within an economic analytical framework. The first steps towards this are undertaken in this paper by exploring the opportunities and choices of the spree lone wolf from a risk-reward perspective and a treatment of the spree lone wolf as an individual who, while attempting to maximise his expected utility, shuns the risk-reduction benefits of ‘time diversification’ and suddenly plunges all of his resources into terrorism within a single time period. The analysis shows that such behaviour can be explained within an economic model of choice and clears the way for further theoretical analysis and empirical analysis.
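The "time diversification" that the spree lone wolf forgoes can be illustrated numerically (a hypothetical simulation with assumed Gaussian outcomes, not the paper's model): splitting a fixed resource endowment evenly over n independent periods shrinks the variance of the average outcome by roughly a factor of n relative to committing everything in a single period.

```python
import random
import statistics

def outcome_variances(n_periods, n_draws, sigma=1.0, seed=0):
    """Compare the variance of committing all resources in one period with
    splitting them evenly over n_periods independent periods (illustrative
    Monte Carlo with zero-mean Gaussian per-period outcomes)."""
    rng = random.Random(seed)
    all_at_once = [rng.gauss(0.0, sigma) for _ in range(n_draws)]
    spread = [sum(rng.gauss(0.0, sigma) for _ in range(n_periods)) / n_periods
              for _ in range(n_draws)]
    return statistics.variance(all_at_once), statistics.variance(spread)
```

With four periods, the spread strategy's variance comes out near one quarter of the single-period variance, which is the risk-reduction benefit the single-period plunge gives up.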
- Research Article
- 10.2307/251169
- Dec 1, 1969
- The Journal of Risk and Insurance
Professor Hofflander and Dr. Drandell have written a valuable paper in applying the linear programming technique to a model for an insurance company. One of the best features of the linear programming model is that it forces management to make explicit decisions concerning its objectives and its limitations in order to use the model. It is doubtful that the complete quantification of these objectives and limitations for the entire insurance firm is a feature of many of the more traditional models used by insurance management. The purpose of this communication is to indicate the problems involved in extending the Hofflander-Drandell model to a stock life insurer and to indicate possible approaches to resolving these problems. The Hofflander-Drandell model is restricted to a property-liability insurer over a single time period. The three problems facing property and liability insurers, namely, profit-