Preserving hyperproperties of programs using primitives with consensus number 2
Abstract: When a concrete concurrent object refines another, more abstract object, the correctness of a program employing the concrete object can be verified by considering its behaviors when using the more abstract object. This approach is sound for trace properties of the program, but not for hyperproperties, including many security properties and probability distributions of events. We define strong observational refinement, a strengthening of refinement that preserves hypersafety properties, and prove that it is equivalent to the existence of forward simulations. We show that strong observational refinement generalizes strong linearizability, a restriction of linearizability, the prevalent consistency condition for implementing concurrent objects. Our results imply that strong linearizability is also equivalent to the existence of forward simulations, and they show that strongly linearizable implementations can be composed both horizontally and vertically. This paper also investigates whether there are wait-free strongly linearizable implementations from realistic primitives such as test&set or fetch&add, whose consensus number is 2. We show that many objects with consensus number 1 have wait-free strongly linearizable implementations from fetch&add. We also show that several objects with consensus number 2 have wait-free or lock-free implementations from other objects with consensus number 2. In contrast, we prove that even when fetch&add, swap, and test&set primitives are used, some objects with consensus number 2 do not have lock-free strongly linearizable implementations. This includes queues and stacks, and relaxed variants thereof.
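To make the notion of "consensus number 2 primitives" concrete, the following is a minimal sketch (not taken from the paper) of the classic two-process consensus protocol built from a single test&set object; it illustrates why test&set, like fetch&add and swap, can solve consensus for two processes but, by Herlihy's hierarchy, for no more than two. The class and method names are illustrative only.

```java
import java.util.concurrent.atomic.AtomicBoolean;
import java.util.concurrent.atomic.AtomicReferenceArray;

// Illustrative sketch: wait-free consensus for two processes (ids 0 and 1)
// using one test&set object and two single-writer registers.
class TestAndSetConsensus<T> {
    private final AtomicBoolean won = new AtomicBoolean(false);      // the test&set object
    private final AtomicReferenceArray<T> proposal =
            new AtomicReferenceArray<>(2);                           // announced inputs

    // Called by process i (0 or 1) with its input value; returns the agreed value.
    T decide(int i, T value) {
        proposal.set(i, value);                  // announce my proposal
        boolean lost = won.getAndSet(true);      // atomic test&set: first caller gets false
        if (!lost) {
            return value;                        // winner decides its own value
        } else {
            return proposal.get(1 - i);          // loser adopts the winner's announced value
        }
    }
}
```

Agreement and validity hold because the loser reads the winner's proposal, which was written before the winner performed test&set; each process finishes in a constant number of steps, so the protocol is wait-free. The paper's negative results concern what such primitives cannot do when strong linearizability is additionally required.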