Abstract

Information analysis often involves heterogeneous sources expressed as logical sentences, numerical models, sensor data, and so on. Each source describes its uncertainty or error in its own way (e.g., frequency analysis, algorithmic truncation, floating-point roundoff, Gaussian distributions). A unifying framework is proposed here that represents this information as logical sentences with associated probabilities, in order to allow inference of the probability of a query sentence. Given such a knowledge base in Conjunctive Normal Form (CNF) for use by an intelligent agent, with probabilities assigned to the conjuncts, the probability of any new query sentence can be determined by solving the Probabilistic Satisfiability (PSAT) problem. This involves finding a consistent probability distribution over the atoms (if they are independent) or over the complete conjunction set of the atoms. For each sentence in the knowledge base, we propose to produce an equation in terms of atom and conditional probabilities. This system of equations is then solved numerically to obtain a solution consistent with the sentence probabilities. Finding such a solution is called the Probabilistic Sentence Satisfiability (PS-SAT) problem. In particular, findings include:

1. For independent logical variables:
   (a) atom probabilities which solve PS-SAT also provide a PSAT solution;
   (b) numerical experiments demonstrate a q-superlinear convergence rate for most test cases;
   (c) problems with up to 1,000 variables and 300 sentences are solved.
2. For general knowledge bases (i.e., variables not independent):
   (a) both atom probabilities and a subset of conditional probabilities must be found;
   (b) a solution to PS-SAT does not guarantee a solution to PSAT, but most empirical results do provide one;
   (c) the convergence rate for equations with non-independent variables also appears q-superlinear.
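To make the independent-variable case concrete, the sketch below sets up the PS-SAT equations for a tiny hypothetical knowledge base and solves them numerically. It is not the authors' solver: the clause encoding, the toy KB, and the plain gradient-descent iteration are illustrative assumptions; the paper's method uses a Newton-type scheme with the reported q-superlinear convergence. Under independence, a clause satisfies P(l1 ∨ … ∨ lk) = 1 − ∏ P(li is false), which yields one equation per KB sentence in the unknown atom probabilities.

```python
# A clause is a list of signed atom indices: +i means atom i, -i means NOT atom i.
# Under independence: P(clause) = 1 - product over literals of P(literal is false).

def clause_prob(clause, p):
    """Probability of a disjunctive clause given atom probabilities p."""
    q = 1.0
    for lit in clause:
        i = abs(lit) - 1
        q *= (1.0 - p[i]) if lit > 0 else p[i]
    return 1.0 - q

def solve_ps_sat(clauses, targets, n, iters=5000, lr=0.2, h=1e-6):
    """Find atom probabilities matching the target sentence probabilities by
    minimising the squared residuals with finite-difference gradient descent
    (an illustrative stand-in for the paper's Newton-type solver)."""
    p = [0.5] * n
    for _ in range(iters):
        grad = [0.0] * n
        for c, t in zip(clauses, targets):
            r = clause_prob(c, p) - t          # residual of this equation
            for i in range(n):
                p[i] += h
                dr = (clause_prob(c, p) - t - r) / h
                p[i] -= h
                grad[i] += 2.0 * r * dr
        for i in range(n):                     # keep probabilities in [0, 1]
            p[i] = min(1.0, max(0.0, p[i] - lr * grad[i]))
    return p

# Hypothetical KB: sentence (a or b) holds with probability 0.9,
# sentence (not a) with probability 0.4.  Analytically, P(a) = 0.6
# and then 1 - (1 - 0.6)(1 - P(b)) = 0.9 gives P(b) = 0.75.
clauses = [[1, 2], [-1]]
targets = [0.9, 0.4]
pa, pb = solve_ps_sat(clauses, targets, 2)
print(pa, pb)  # converges toward pa ~ 0.6, pb ~ 0.75
```

Any atom probabilities returned here also induce a full PSAT-consistent distribution over the complete conjunctions, since under independence the joint probabilities are just the products of the per-atom marginals.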
