Application of the Bayesian scientific approach to constructing statistical estimates for solving metrological and measurement problems
Nowadays, constructing effective statistical estimates from a limited amount of statistical information is a significant practical problem. The article is devoted to applying the Bayesian scientific approach to the construction of statistical estimates of the parameters of the distribution laws of random variables. Five distribution laws are considered: the Poisson law, the exponential law, the uniform law, the Pareto law, and the normal law. The concept of distribution laws conjugate to the observed population is introduced and used. It is shown that for the considered distribution laws, the parameters of the laws are themselves random variables and obey the normal law, the gamma law, the gamma-normal law, and the Pareto law. Recalculation formulas are obtained for refining the parameters of these laws in light of posterior information. Applying the recalculation formulas several times in succession yields a convergent process, on the basis of which a self-learning or self-tuning system can be designed. The developed scientific approach was applied to solving measurement problems arising in the testing of measuring devices and technical systems. The results of constructing point and interval estimates for the parameters of these laws are given, together with a comparison against the corresponding statistical estimates constructed by the classical maximum likelihood method.
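The conjugate-update idea can be sketched for the Poisson case, the simplest of the five laws. The code below is an illustrative sketch with made-up counts, not the paper's own formulas: it assumes a Gamma(a, b) prior for the Poisson rate, for which the posterior after observing n counts is Gamma(a + Σx, b + n), and shows that batch-by-batch recalculation coincides with a single update on the pooled data, which is why the repeated process converges stably.

```python
def update_gamma_poisson(a, b, counts):
    """Conjugate update for a Poisson rate with a Gamma(a, b) prior
    (shape a, rate b): the posterior is Gamma(a + sum(counts), b + n)."""
    return a + sum(counts), b + len(counts)

def posterior_mean(a, b):
    """Mean of a Gamma(a, b) distribution, i.e. the current rate estimate."""
    return a / b

# Applying the recalculation several times in a row, batch by batch:
a, b = 1.0, 1.0                              # weakly informative prior
for batch in ([3, 4, 2], [5, 3], [4, 4, 3, 5]):
    a, b = update_gamma_poisson(a, b, batch)
# Nine observations with sum 33 give the posterior Gamma(34, 10),
# whose mean 3.4 is already close to the empirical rate.
```

The same structure carries over to the other conjugate pairs mentioned in the abstract (gamma, gamma-normal, Pareto), with different update formulas.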
- Research Article
- 10.32446/0368-1025it.2020-11-14-21
- Jan 1, 2020
- Izmeritel`naya Tekhnika
The proposed article is devoted to the application of the Bayesian approach to the construction of statistical estimates of the parameters of the distribution laws of random variables. Four distribution laws are considered: the Poisson law, the exponential law, the uniform law, and the Pareto law. The results of constructing point and interval estimates for the parameters of these laws are presented, along with a comparison against the corresponding statistical estimates constructed by the classical maximum likelihood method. The proposed algorithm can be effectively applied in the development of measurement methods, in solving measurement problems, and in the development of practical methods for identifying systematic measurement errors.
- Research Article
3
- 10.1051/e3sconf/201913501070
- Jan 1, 2019
- E3S Web of Conferences
Before putting new unique samples of technical systems into commercial operation, as well as before introducing new technologies into production, all kinds of tests are, as a rule, carried out. A small or very small volume of statistical data during testing is a characteristic feature of unique and small-scale products and technical systems. Therefore, constructing effective statistical estimates from a limited amount of statistical information is an important practical problem. The article proposes a development of the Bayesian approach to the construction of point and interval estimates of the parameters of known distribution laws. The joint use of prior and posterior information in processing statistical data of limited volume can significantly increase the reliability of the result. As an example, we consider the two most typical distribution laws that arise when testing new unique samples of measuring devices and equipment: the normal distribution with an unknown mean and a known variance, and the normal distribution with an unknown mean and an unknown variance. It is shown that for these cases the parameters of the distribution laws are themselves random variables and obey the normal law and the gamma-normal law. Recalculation formulas are obtained for refining the parameters of these laws in light of posterior information. If these formulas are applied several times in succession, a process of self-learning or self-tuning of the system occurs. Thus, the proposed scientific approach can find application in the development of intelligent self-learning and self-tuning systems.
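The first case, a normal mean with known variance, admits a compact sketch. The numbers below are made up for illustration, not taken from the paper; the code assumes a normal prior N(mu0, tau2) and applies the standard conjugate-update formulas, showing the monotone shrinkage of the posterior variance that underlies the self-tuning behaviour described above.

```python
def update_normal_known_var(mu0, tau2, data, sigma2):
    """Conjugate update for a normal mean with known variance sigma2,
    given a N(mu0, tau2) prior: returns the posterior (mean, variance)."""
    n = len(data)
    xbar = sum(data) / n
    prec = 1.0 / tau2 + n / sigma2           # posterior precision
    return (mu0 / tau2 + n * xbar / sigma2) / prec, 1.0 / prec

# Successive application: each batch of measurements shrinks the
# posterior variance and pulls the mean toward the data.
mu, tau2 = 0.0, 100.0                        # vague prior for the mean
for batch in ([9.8, 10.1], [10.0, 10.2, 9.9]):
    mu, tau2 = update_normal_known_var(mu, tau2, batch, sigma2=0.04)
# The batch-by-batch result equals a single update on all five points.
```

The unknown-variance case works the same way with a gamma-normal (normal-gamma) prior in place of the normal one.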
- Research Article
- 10.33744/0365-8171-2024-115.1-025-034
- Jan 1, 2024
- AUTOMOBILE ROADS AND ROAD CONSTRUCTION
Abstract. In calculations of structural reliability it is usually assumed that the distributions of the random values of resistance and load effect obey the normal law (Gauss's law). This law is convenient to use and the most widespread, so it has found wide application in reliability theory for solving most problems. The distribution is symmetric: random variables are distributed symmetrically about its center (the mathematical expectation). But, as experimental studies show, both the material resistance and the load effect on the structure are in most cases subject to asymmetric laws. The asymmetry of the material resistance can be neglected in most practical cases, but failure to take into account the asymmetry of the load effect can lead to significant errors in determining structural reliability. The authors chose two laws with different degrees of positive asymmetry to approximate the load distribution, namely the gamma and lognormal laws. The normal (symmetric) law was used for the resistance distribution. The results of reliability calculations that take into account different load distribution laws are presented in the form of a table and a graph. The graph shows the dependence of structural reliability on the reliability index for the symmetric (PN) and asymmetric (PNG, PNL) laws. All calculations were performed using the Mathcad package, which allows values to be calculated with sufficient accuracy. How to choose the distribution law for the load effect obviously depends on the operation mode of a particular bridge and should be based on appropriate statistical studies. The purpose of this paper is to show the need to take the asymmetry of the law into account in order to determine structural reliability. Eurocode norms also require taking the asymmetry of distribution laws into account. Keywords: structural reliability, normal distribution law, safety factor, asymmetric distribution laws, reliability index.
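The effect of load-law asymmetry can be illustrated with a small numeric sketch. All numbers here are made up for illustration and are not the paper's values; the code integrates the classical reliability integral P(R > S) with the trapezoid rule, comparing a normal load against a lognormal load matched in mean and standard deviation, so that the only difference is the asymmetry of the law.

```python
import math

def norm_pdf(x, mu, sd):
    return math.exp(-0.5 * ((x - mu) / sd) ** 2) / (sd * math.sqrt(2.0 * math.pi))

def norm_cdf(x, mu, sd):
    return 0.5 * (1.0 + math.erf((x - mu) / (sd * math.sqrt(2.0))))

def lognorm_cdf(x, mu_log, sd_log):
    # CDF of a lognormal load: the log of the load is N(mu_log, sd_log)
    return 0.0 if x <= 0 else norm_cdf(math.log(x), mu_log, sd_log)

def reliability(mu_r, sd_r, load_cdf, lo, hi, steps=4000):
    """P(R > S) = integral of f_R(r) * P(S < r) dr (trapezoid rule)."""
    h = (hi - lo) / steps
    total = 0.0
    for i in range(steps + 1):
        r = lo + i * h
        w = 0.5 if i in (0, steps) else 1.0
        total += w * norm_pdf(r, mu_r, sd_r) * load_cdf(r)
    return total * h

# Lognormal load matched in mean (60) and standard deviation (8) to a
# normal load, so only the asymmetry differs; resistance is N(100, 10).
sd_log = math.sqrt(math.log(1.0 + (8.0 / 60.0) ** 2))
mu_log = math.log(60.0) - 0.5 * sd_log ** 2
p_lognormal = reliability(100.0, 10.0, lambda r: lognorm_cdf(r, mu_log, sd_log), 40.0, 160.0)
p_normal = reliability(100.0, 10.0, lambda r: norm_cdf(r, 60.0, 8.0), 40.0, 160.0)
```

With these illustrative numbers the heavier upper tail of the lognormal load lowers the computed reliability relative to the symmetric assumption, which is the qualitative point the abstract makes.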
- Research Article
2
- 10.26896/1028-6861-2017-83-11-73-77
- Apr 14, 2017
- Zavodskaya Laboratoriya. Diagnostika Materialov
A method is proposed for constructing basic sets (confidence intervals for percentiles) using bootstrap simulation as an alternative to currently used approaches. Bootstrap simulation is a method for the numerical modeling of distributions based on multiple reproduction of the data, without using any information about the distribution laws. Since the strength characteristics are random variables, statistical estimation with construction of interval characteristics is required; this is the goal of the study. An illustrative example of constructing confidence intervals for the mean strength value using bootstrap modeling is considered. To construct confidence intervals for percentiles of the distributions of the strength characteristics, we recommend assigning the distribution to one of the currently known laws (normal, lognormal, or Weibull), unlike the existing non-parametric approach, which generally gives conservative (too low) and thus undesirable results; this is the reason for developing the new approach. A comparison of B-bases determined by the newly proposed and the traditional method is carried out on real samplings of the strength characteristics of composite materials. Specific examples are presented for the shear and tensile strength of specimens made of HexPly prepreg (composite material semi-finished products) using an autoclave molding method.
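The bootstrap step for the mean can be sketched in a few lines. The strength values below are made up for illustration; the sketch is the plain percentile bootstrap (resample with replacement, collect the resampled means, take empirical quantiles), which is simpler than the paper's B-basis procedure for percentiles but shows the same mechanism.

```python
import random
import statistics

def bootstrap_ci_mean(sample, n_boot=2000, alpha=0.10, seed=1):
    """Percentile-bootstrap confidence interval for the mean: resample the
    data with replacement many times, collect the resampled means, and
    take the alpha/2 and 1 - alpha/2 empirical quantiles."""
    rng = random.Random(seed)                 # fixed seed for reproducibility
    means = sorted(
        statistics.fmean(rng.choices(sample, k=len(sample)))
        for _ in range(n_boot)
    )
    lo = means[int(alpha / 2 * n_boot)]
    hi = means[int((1 - alpha / 2) * n_boot) - 1]
    return lo, hi

# Made-up shear-strength values (MPa), standing in for a real sampling:
strengths = [52.1, 49.8, 53.4, 50.6, 51.9, 48.7, 52.8, 50.2]
lo, hi = bootstrap_ci_mean(strengths)         # 90% percentile interval
```

For percentile (B-basis) estimation, the same loop would collect a low quantile of each resample instead of its mean.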
- Research Article
- 10.1007/bf00822926
- Jan 1, 1976
- Measurement Techniques
One special aspect of the summation of errors involving a small number of constituents (possibly nonuniform in magnitude) is the indeterminacy of the resultant probability-density distribution law. The assumption of a normal law for the final distribution leads to serious errors as regards the confidence interval. In this case the latter may best be determined by setting up a composition of the probability-density distributions of the constituents. The composition of one dominant error (distributed in accordance with a normal or Student-type law) with the sum of the others (for which a normal distribution law was assumed) was considered earlier [1]. Tables were also given in [2], together with the corresponding curves, for determining the confidence interval (with a 0.997 probability) of the sum of a normally distributed error and an arbitrary number of constituents of uniform magnitude distributed in accordance with an arcsine law. In this paper we consider a means of determining the integrated probability distribution for the sum of an arbitrary number n ≥ 2 of independent components, based on a composition of the distribution laws. We consider the symmetrical distributions most frequently encountered in measuring practice: normal, uniform, triangular (Simpson), trapezoidal, and arcsine. The results may be used for the summation of errors and also for estimating how close the resultant distribution is to a normal law.
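The composition of distribution laws described here is a convolution of densities, which can be checked numerically. A sketch with two uniform error components on [-1, 1] (their composition is the triangular, i.e. Simpson, law on [-2, 2]); the grid size is an arbitrary choice for the illustration:

```python
# Grid over [-2, 2] for the sum of two errors uniform on [-1, 1]:
N = 401
h = 4.0 / (N - 1)
xs = [-2.0 + i * h for i in range(N)]

def uniform_pdf(t, half_width=1.0):
    return 0.5 / half_width if abs(t) <= half_width else 0.0

# Composition of the two laws: f_sum(x) = integral of f1(t) * f2(x - t) dt,
# approximated by a Riemann sum on the grid.
pdf_sum = [h * sum(uniform_pdf(t) * uniform_pdf(x - t) for t in xs) for x in xs]

total = h * sum(pdf_sum)     # should integrate to about 1
peak = max(pdf_sum)          # triangular law (2 - |x|)/4 peaks at 0.5
```

The same loop composes any of the listed symmetric densities (normal, uniform, triangular, trapezoidal, arcsine), and the confidence interval is then read off the cumulative sum of `pdf_sum`.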
- Research Article
- 10.33764/2618-981x-2022-3-198-204
- May 18, 2022
- Interexpo GEO-Siberia
Limit measurement errors in geodesy are determined for the normal distribution law, but errors exceeding the limits are discarded, so the remaining random errors obey a truncated law. Truncated laws have not been studied sufficiently. The article presents the results of calculating the probabilities that random measurement errors fall within the intervals σ, 2σ, 2.5σ, 3σ for the truncated normal and truncated logistic distribution laws. The analysis shows that the probabilities for the truncated normal and normal laws differ insignificantly, with a maximum difference of 0.043; replacing the normal law with the truncated normal law is therefore not recommended. The probabilities of the truncated law may be used when this makes it possible to avoid repeating field measurements.
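The truncated-versus-untruncated comparison can be sketched directly from the normal CDF. The truncation point below (±3σ) is an assumption for the illustration, and the resulting differences depend on it, so the figures produced are illustrative rather than the paper's values:

```python
import math

def phi(x):
    """Standard normal CDF via the error function."""
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def p_within(k, trunc=None):
    """P(|X| <= k*sigma) for a standard normal, optionally truncated
    at +/- trunc*sigma (the kept probability mass is renormalised)."""
    p = phi(k) - phi(-k)
    if trunc is not None:
        p /= phi(trunc) - phi(-trunc)
    return p

# Differences between the truncated (at 3 sigma) and untruncated law
# over the intervals mentioned in the abstract:
diffs = {k: p_within(k, trunc=3.0) - p_within(k) for k in (1.0, 2.0, 2.5, 3.0)}
```

Truncation always raises the interval probabilities slightly, since the discarded tail mass is redistributed over the kept range.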
- Research Article
7
- 10.31653/2306-5761.30.2020.58-66
- Dec 1, 2020
- Shipping & Navigation
The paper indicates that navigation in narrow waters requires navigators to use means of passage safety assessment prior to choosing a route. It is pointed out that a relevant factor when assessing the safe passage probability is the cross-track error distribution law, whose impact is the subject of the research. The article analyses recent developments and publications that have begun investigating this subject, and highlights previously unsolved parts of the general problem. The results revealed two equivalent approaches, as well as a navigational safety parameter, which are used to determine the probability of safe navigation in narrow waters on the chosen route. The need to develop advanced predictive vessel motion models is noted, while many researchers study the design of an information system for vessel motion simulation with complex dynamic models and an intelligence system for vessel motion prediction that imitates the learning process of an autonomous control unit created with the use of the artificial neural network. Methods for identification of vessel manoeuvring models are shown. Based on the analysis of vessel hydrodynamics, a nonlinear model frame of vessel manoeuvring is established. The available publications suggest using compound laws of the first and second types for describing random errors in navigation measurements as an alternative to the normal distribution law. The article examines the dependence of the safe narrow waters passage probability on the cross-track error distribution law. The normal law and compound laws of the first and second types are considered as the cross-track error distribution laws. A formula for estimating the safe passage probability in the manoeuvring area is given, and expressions for the distribution function of the normal law and compound laws of both types are obtained. 
To assess the impact of the cross-track error distribution law for the same route, the safe passage probability was calculated for the normal distribution law as well as for compound laws of the first and second types. For the same route, the probability of safe passage was also calculated using one-dimensional and two-dimensional density models. It is shown that the average relative difference between the estimated safe passage probabilities for the two models is 0.3%, which confirms the validity of using a one-dimensional cross-track error distribution density.
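For the normal cross-track error law, the safe passage probability has a simple closed form. The strip half-widths and σ below are illustrative, not from the paper, and the paper's compound laws of the first and second types would replace the normal CDF used here:

```python
import math

def norm_cdf(x):
    """Standard normal CDF."""
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def safe_passage_probability(d_left, d_right, sigma):
    """Probability that a zero-mean normally distributed cross-track
    error stays inside the navigable strip [-d_left, d_right]."""
    return norm_cdf(d_right / sigma) - norm_cdf(-d_left / sigma)

# Illustrative strip half-widths and error spread, in metres:
p = safe_passage_probability(d_left=120.0, d_right=120.0, sigma=40.0)
```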
- Conference Article
2
- 10.1109/dessert.2018.8409163
- May 1, 2018
The article describes research into the probabilistic assumption of software reliability growth models that the time between two successive moments of defect detection is a random variable with a known distribution law. Experimental research into the distribution law of the time series of defect detection in 500 software systems was performed. To increase the authenticity of the results, software systems written in 24 modern programming languages and covering different subject areas were selected. The hypotheses that the series conform to the normal, exponential, and uniform distribution laws, as well as the Poisson distribution, were tested. The results of the experiment showed that only 21% of the tested software systems have one of the above-listed distributions of the time between two consecutive moments of defect detection. Thus, the assumptions of the reliability models about a distribution law do not hold for 79% of the researched software systems. This fact can explain the absence of a single universal reliability model which could describe, with reasonable accuracy, the dynamics of defect detection in all software systems. The results of the research also testify to the necessity of revising the conceptual basis of the modern theory of software reliability.
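One simple way to test such a hypothesis is the Kolmogorov–Smirnov distance between the empirical CDF of the inter-detection times and a fitted exponential law. The abstract does not specify which test the authors used, and the times below are made up, so this is only an illustration of the mechanics:

```python
import math

def ks_statistic_exponential(times):
    """Kolmogorov-Smirnov distance between the empirical CDF of the
    observed times and an exponential law fitted by its mean."""
    lam = len(times) / sum(times)            # maximum-likelihood rate
    xs = sorted(times)
    n = len(xs)
    gap = 0.0
    for i, x in enumerate(xs):
        f = 1.0 - math.exp(-lam * x)         # fitted exponential CDF
        gap = max(gap, abs((i + 1) / n - f), abs(i / n - f))
    return gap

# Made-up series of times (days) between successive defect detections:
times = [1.2, 0.4, 2.5, 0.9, 3.1, 0.2, 1.7, 0.6, 2.0, 1.1]
d = ks_statistic_exponential(times)
```

A small `d` is consistent with the exponential hypothesis; a large one rejects it at a level read from KS tables (with an adjustment when the rate is estimated from the same data).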
- Research Article
1
- 10.1080/014311698214460
- Jan 1, 1998
- International Journal of Remote Sensing
Rural demographic studies have made little use of remote sensing. In Africa, as well as in some developed countries, the existing demographic data are not suitable for planners' needs. To create accurate data, our paper explores a practical modelling technique as well as less expensive solution techniques. We used data from an official demographic survey carried out in 1988 and an HRV1 satellite image from 1990. With a pixel-by-pixel textural analysis, we selected meaningful pixels for land-cover classification (mixed pixels were automatically removed). We then computed linear regression models. In the first model, the endogenous and exogenous variables are, respectively, the dwelling unit and the cultivated unit. Furthermore, we calculated a regression from census data (endogenous variable) and the cultivated unit (exogenous data extracted from the HRV image). Finally, we adjusted the demographic data with two distribution laws: the exponential law and the Pareto law. This article opens up the possibility of using remote sensing to estimate population sizes more accurately through environmental indicators. We obtained consistent adjusted correlation coefficients.
In Africa, and in some developed countries, the practice of censuses, widely called into question by funding bodies, is threatened with disappearance. We propose new avenues for producing demographic information adapted to planners' needs and oriented toward social accounting. To this end, we used an HRV image and the data of the official 1988 census. After preliminary texture-based processing of the image, we extracted the villages for which official statistics are available. We established linear regression models in which, first, the inhabited area is the endogenous variable and the cultivated area the exogenous variable; second, the population (1988 census) is the endogenous variable and the cultivated area (extracted from the image) the exogenous variable. Finally, we conclude with an adjustment of the population data by two distribution laws: the exponential law and the Pareto law.
- Research Article
6
- 10.1115/1.1285841
- Jan 3, 2000
- Journal of Engineering for Gas Turbines and Power
This paper presents the main theses of a stochastic approach to optimizing the multi-measure parameters and control laws of aircraft gas-turbine engines. The methodology allows the engines to be optimized taking into account the technological deviations which inevitably arise in the manufacturing of the engine's components, as well as deviations in the engine's control. Stochastic optimization is able to find highly robust solutions, stable against inaccuracies in technological processes. The effectiveness of the methodology is shown by the example of an optimization problem: finding the control laws for the controllable flow-passage elements of a fourth-generation mixed-flow turbofan engine. Using information about existing and advanced production technology levels during the optimization process, including the manufacturing accuracy of some components, considerably increases the probability that the optimum solution can be implemented in practice. In a real engine there are deviations both in component manufacturing and in control accuracy, which results in a certain deviation of the engine's performance. The classic deterministic approach to engine optimization cannot take this circumstance into account, so the probability of implementing an optimum design is too low.
- Conference Article
1
- 10.2118/203200-ms
- Nov 9, 2020
Our studies undertaken at many oil and gas fields in different basins show that fractures separate reservoir rocks into differently sized blocks that form complex self-similar fractal structures whose behavior is described by Pareto's universal law. Based on this law, a fractal model of a fractured reservoir was developed. It includes several hierarchical levels of matrix blocks and fractures, sometimes ten or more. In the proposed model, not only are the sizes of the blocks in the ratio of 1.618, but the permeability of the fractures also changes in the ratio of 1.618, which makes it possible to reproduce the daily and cumulative oil and gas well production according to a power-law distribution and Pareto's law. On the basis of these laws, the article deals with one of the development paths, which we have proposed to call "intensive" [15]. Currently, this path is almost ignored by oil and gas companies, which, in order to increase the capitalization of their assets, aim to use modern digital technologies: artificial intelligence, big data, neural networks, machine learning, etc. However, we believe that this path can make a significant economic and environmental contribution to the development of hard-to-recover resources in tight fractured carbonate reservoirs. The proposed development path is based on an understanding of the phenomenology of "smart" nature and on the training of modern creative professionals in the leading oil and gas universities. It allows expenses to be substantially reduced while obtaining higher daily and cumulative production of hydrocarbons and preserving the natural potential of fractured reservoirs created by nature itself. Today's specialists in information technologies call such development paths "nature-like technologies". However, considering natural fractured oil and gas reservoirs, we can speak of a purely natural phenomenon.
- Research Article
- 10.46972/2076-1546.2019.17.07
- Dec 30, 2019
- Проблеми створення, випробування, застосування та експлуатації складних інформаційних систем
A methodological approach to determining the statistical characteristics of a phase-shift-keyed signal observed against a white-noise background is considered. Parametric and non-parametric goodness-of-fit criteria are widely used to test hypotheses about the form of the probability distribution law of random variables. The parametric criteria include Pearson's χ² and its modification, Nikulin's χ². The non-parametric criteria include Kolmogorov–Smirnov, the Mises ω² criterion, Anderson–Darling, Rényi, and others. In the foreign scientific literature the term Mises ω² is also used for the Anderson–Darling criterion. When testing simple hypotheses, the following order of criteria (by their power) is given preference: Pearson's χ²; Anderson–Darling; Kolmogorov–Smirnov; Mises ω². When testing complex hypotheses, the order changes: Mises ω²; Kolmogorov–Smirnov; Anderson–Darling; Nikulin's χ²; Pearson's χ². With a known sample volume, the number of histogram intervals is calculated according to the selected rule, and the histogram is constructed from the set of realizations of the received signal. After that, a comparison is made with the reference distribution law. The steps of the comparison are well known and need no separate explanation. Mathematical modeling and the processing of its results were carried out with the Mathcad 14 software package. We test the hypothesis of a normal distribution law for the input mixture of signal and noise using Pearson's χ² criterion. The results of simulation modeling and a computational experiment with the above approach show that the statistical characteristics of the additive mixture of a phase-shift-keyed signal and white noise in the energy-hidden mode of operation of electronic means obey laws that are qualitatively close to, and generally well approximated by, normal laws.
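Pearson's χ² criterion itself is a one-line computation over histogram bins. A toy sketch with made-up counts, unrelated to the paper's signal data:

```python
def pearson_chi2(observed, expected):
    """Pearson's chi-squared statistic: the sum over histogram bins
    of (O - E)^2 / E."""
    return sum((o - e) ** 2 / e for o, e in zip(observed, expected))

# Toy histogram: 100 realizations over 4 bins that a hypothesised
# reference law makes equiprobable (expected 25 per bin).
observed = [28, 22, 27, 23]
expected = [25.0, 25.0, 25.0, 25.0]
stat = pearson_chi2(observed, expected)
# With 3 degrees of freedom the 5% critical value is about 7.815,
# so these counts would not reject the hypothesised law.
```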
- Research Article
- 10.15593/2499-9873/2020.4.01
- Dec 15, 2020
- Applied Mathematics and Control Sciences
The problem of estimating the number of summands of random variables required for their sum, or the sample average, to have a normal distribution law is investigated. The central limit theorem allows many complex applied problems to be solved using the well-developed mathematical apparatus of the normal probability distribution; otherwise, we would have to operate with convolutions of distributions, which can be calculated explicitly only in rare cases. The purpose of this paper is to estimate theoretically the number of terms in the central limit theorem necessary for the sum or the sample average to have a normal probability distribution law with a given accuracy. The article proves two theorems and two corollaries of them, using the method of characteristic functions. The first theorem states the conditions under which the sample average of independent terms will have a normal distribution law with a given accuracy. The corollary of the first theorem establishes the normal distribution of the sum of independent random variables under the conditions of Theorem 1. The second theorem defines the normal distribution conditions for the sample average of independent random variables whose mathematical expectations fall in the same interval and whose variances also fall in the same interval. The corollary of the second theorem establishes the normal distribution of the sum of independent random variables under the conditions of Theorem 2. From the relations proved in Theorem 1, a table of the required number of terms in the central limit theorem is calculated, ensuring the specified accuracy of approximation of the distribution of the sample average to the normal law. A graph of this dependence is constructed; the dependence is well approximated by a polynomial of the sixth degree.
The relations and theorems obtained in the article are computationally simple and allow the testing process for evaluating students' knowledge to be controlled. They make it possible to determine the number of experts when making collective decisions in the economy and in organizational management systems, to conduct optimal sampling-based quality control of products, and to carry out the necessary number of observations for sound diagnostics in medicine.
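How many terms the central limit theorem needs can also be probed numerically. The sketch below is not the paper's method: it compares the exact CDF of a standardised sum of n Uniform(0,1) variables (the Irwin–Hall distribution) with the normal CDF and shows the maximum gap shrinking as n grows.

```python
import math

def irwin_hall_cdf(x, n):
    """Exact CDF of the sum of n independent Uniform(0,1) variables."""
    if x <= 0.0:
        return 0.0
    if x >= n:
        return 1.0
    s = sum((-1) ** k * math.comb(n, k) * (x - k) ** n for k in range(int(x) + 1))
    return s / math.factorial(n)

def phi(z):
    """Standard normal CDF."""
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

def max_gap(n, grid=2001):
    """Largest |F_n - Phi| over a z-grid, with the sum standardised
    to zero mean and unit variance."""
    mu, sd = n / 2.0, math.sqrt(n / 12.0)
    return max(abs(irwin_hall_cdf(mu + sd * z, n) - phi(z))
               for z in (-4.0 + 8.0 * i / (grid - 1) for i in range(grid)))

# The gap shrinks as the number of terms grows, quantifying how many
# summands are needed for a given approximation accuracy:
gaps = {n: max_gap(n) for n in (1, 2, 4, 12)}
```

For one uniform term the maximum CDF gap is about 0.057; it falls monotonically as terms are added, which is the dependence the paper tabulates.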
- Research Article
- 10.1088/1742-6596/1205/1/012028
- Apr 1, 2019
- Journal of Physics: Conference Series
The problems of efficient allocation of resources among the areas of a company's activity under conditions of uncertainty are considered. The effectiveness indicators for the areas of the company's activity are of an uncertain nature and are treated as random variables with predetermined laws of probability distribution. To form an effective allocation across the areas of activity, a scheme for forming effective portfolios based on priority probabilities, developed at the NRNU MEPhI, is used. As estimates of the values of the efficiency indicator for the company's areas of activity, their forecasts were taken as random variables with a uniform or normal probability distribution law.
- Research Article
5
- 10.1134/s1028334x17060162
- Jun 1, 2017
- Doklady Earth Sciences
The purpose of this work is to study empirically the patterns of size distribution of thermokarst lakes within lacustrine thermokarst plains. Investigations were performed at 16 sites with various geomorphological, geocryological, and physical-geographical conditions (Kolyma Lowland, Western Siberia, Lena River valley, Alaska). The agreement of the lake-area distribution with the lognormal and exponential laws, and of the average-diameter distribution with the normal law, was tested; the tested distribution laws had been proposed by previous investigations. The results show that the lognormal law of distribution of thermokarst lake areas is valid in the vast majority of cases, while the other types of distribution are inconsistent with the empirical data. This evidence favors a development pattern for lacustrine thermokarst plains in which thermokarst processes started simultaneously and the rate of lake growth was proportional to the density of heat loss through the side surface.