This paper attempts to provide a conceptual framework for the quantitative analysis of the logical inference components of knowledge processing systems. The notions under consideration have both an informational and a computational complexity flavor. Links with the realm of traditional logic problems are not touched upon here. Starting with the proof complexity of formulas, we introduce the distance between sets of formulas, the degree of similarity of inference rules, the entropy of an inference system, the reliability of provability, and characteristics of inconsistency. The structure of proofs and of the search space is also discussed, as are concrete ways of representing inference rules and inference search control (e.g., a language of occurrences and substitutions). The main goal of the paper is not to prove theorems about some set of notions but to lay the groundwork for discussion of how to measure and achieve effectiveness in computer knowledge processing.