Articles published on Classical logic
3032 Search results
- Research Article
- 10.1007/s11225-026-10231-2
- Feb 25, 2026
- Studia Logica
- Norihiro Kamide
From First-Order Self-Extensional Paradefinite Four-Valued Logic to First-Order Classical Logic
- Research Article
- 10.1007/s10992-026-09832-y
- Feb 21, 2026
- Journal of Philosophical Logic
- Gabriele Pulcini + 1 more
Abstract We present a hypersequent calculus that is sound and complete with respect to the truth-functionally contingent formulas of classical logic. We investigate its structural properties and provide a Gentzen-style cut-elimination procedure. The most notable feature of the calculus is that it jointly satisfies the subformula property and the property of deductive purity, to the effect that only contingent hypersequents occur in formal proofs. Moreover, since the negation of a contingent formula is also contingent, the calculus turns out to be paraconsistent, and since the conjunction of a formula with its own negation is not contingent, the paraconsistency is of the non-adjunctive kind.
- Research Article
- 10.1007/s44204-025-00368-7
- Feb 17, 2026
- Asian Journal of Philosophy
- Roy T Cook
Abstract In this essay, I prove two general recapture theorems (the GRT and the GRT^Dual). Each of these states that any sub-logic of classical logic that is closed under six rules of inference is equivalent, in the relevant sense, to classical logic. After proving in each case that the six rules in question are independent of one another, and exploring a number of possible modifications or extensions of these results, I compare the results to Jc Beall’s recapture results in Beall (2011), Beall (2013a) and Beall (2013b). The GRT (and GRT^Dual) are shown to be more powerful and general than Beall’s more piecemeal approach.
- Research Article
- 10.1088/2053-1583/ae403f
- Feb 12, 2026
- 2D Materials
- Annu Anns Sunny + 2 more
Abstract Systems exhibiting non-reciprocal or rectifying transport characteristics are the building blocks of semiconductor technology. The dissipative nature of transport in these systems not only causes huge power losses but also limits the operational speed. In pursuit of alternatives, supercurrent rectifying devices have been proposed and explored recently. Superconducting systems, such as thin films and Josephson junctions, void of both inversion and time-reversal symmetries, are explored in this regard. While the Pauli paramagnetic field limit puts an upper bound on observing supercurrent rectification, the demonstrated functionalities are limited to only low magnetic fields. In contrast, many quantum and classical technologies require operation at higher magnetic fields. In this work, we present an NbSe₂–NbSe₂ van der Waals Josephson junction diode, exhibiting supercurrent rectification in a wide range of magnetic fields. For lower magnetic fields, the supercurrent rectification efficiency increases with the magnetic field and saturates to ~40% for fields of ~1 T and beyond. We also conduct a time-domain demonstration of supercurrent rectification using various waveforms. Our system's high magneto-chiral anisotropy (~10⁴ T⁻¹ A⁻¹) obtained with an AC excitation ensures excellent rectification even in high magnetic fields, making it suitable for both quantum and classical logic circuits with minimal dissipation.
- Research Article
- 10.1007/s44163-026-00909-w
- Feb 8, 2026
- Discover Artificial Intelligence
- Andrei Khrennikov
Abstract The gap between natural and artificial intelligence is often discussed in terms of creativity, contextual adaptability, and non-algorithmic decision-making capacities where human cognition appears fundamentally different from current AI systems. This paper argues that developing quantum and quantum-like models of cognition, decision-making, and AI provides a promising pathway for narrowing, and perhaps essentially bridging, this gap. Empirical studies of human cognition and decision-making reveal systematic deviations from classical probability, logic, and information theory—manifesting as contextuality, order effects, interference (such as conjunction and disjunction effects), task incompatibility, and apparent randomness. These phenomena are well captured by quantum probability theory and related quantum-like frameworks, which provide a rigorous mathematical formalism—Hilbert spaces, superposition, entanglement, and decoherence—for modeling cognitive states and their evolution. Such models go beyond metaphor, showing that aspects of human reasoning can be more faithfully represented using quantum-like rather than classical probabilistic structures. Although genuine quantum (based on quantum physics) and quantum-like approaches share the same mathematical foundation, they differ experimentally. Both stimulate the development of novel AI architectures: quantum AI (QAI) and quantum-like AI (QLAI). While QAI depends on advances in quantum computing, QLAI can be realized on classical digital or analog hardware. The advancement of both offers a promising route to reducing—and potentially bridging—the divide between natural and artificial intelligence. This paper sets out a conceptual program to unify natural and artificial intelligence via quantum/quantum-like models of consciousness/cognition and AI.
- Research Article
- 10.1007/s10849-026-09455-1
- Feb 6, 2026
- Journal of Logic, Language and Information
- Franci Mangraviti
Abstract The relationship between relevant logic and mathematical practice has often been twofold: while some relevant logicians have focused on the descriptive project of capturing (in some sense) informal mathematical reasoning better than classical logic does, others have engaged in the normative project of suggesting how mathematics should be done as opposed to how it is done now. Both projects have received heavy criticism. In this paper, I argue that the intuitive idea of variable sharing does point to a real aspect of mainstream mathematical practices, and this fact can be used to make sense of both relevantist projects: on one hand, relevant logics may be used to provide a more accurate model of this particular aspect of current practices, while on the other, they can suggest new practices which deviate from the standard precisely in their approach to variable sharing.
- Research Article
- 10.1145/3793665
- Jan 28, 2026
- ACM Transactions on Computational Logic
- James Carr
A canonical result in model theory is the homomorphism preservation theorem (h.p.t.), which states that a first-order formula is preserved under homomorphisms on all structures if and only if it is equivalent to an existential-positive formula, standardly proved via a compactness argument. Rossman (2008) established that the h.p.t. remains valid when restricted to finite structures. This is a significant result in the field of finite model theory. It stands in contrast to the other preservation theorems proved via compactness, where the failure of the latter also results in the failure of the former [2], [27]. Moreover, almost all results from traditional model theory that do survive in the finite are those whose proofs work just as well when considering finite structures. Rossman’s result is interesting as an example of a result which remains true in the finite but whose proof uses entirely different methods. It is also of importance to the field of constraint satisfaction due to the equivalence of existential-positive formulas and unions of conjunctive queries [7]. Adjacently, Dellunde and Vidal (2019) established that a version of the h.p.t. holds for a collection of first-order many-valued logics, namely those whose structures (finite and infinite) are defined over a fixed finite MTL-chain. In this paper we unite these two strands. We show how one can extend Rossman’s proof of a finite h.p.t. to a very wide collection of many-valued predicate logics. In doing so, we establish a finite variant of Dellunde and Vidal’s result, one which not only applies to structures defined over algebras more general than MTL-chains but also allows those algebras to vary between models. We identify the fairly minimal critical features of classical logic that enable Rossman’s proof from a model-theoretic point of view, and demonstrate how any non-classical logic satisfying them will inherit an appropriate finite h.p.t.
This investigation provides a starting point in a wider development of finite model theory for many-valued logics and, just as the classical finite h.p.t. has implications for constraint satisfaction, the many-valued finite h.p.t. has implications for valued constraint satisfaction problems.
- Research Article
- 10.26686/ajl.v23i1.10273
- Jan 12, 2026
- The Australasian Journal of Logic
- Hartry Field
Contrary to views that diagnose the paradoxes of truth and related notions in terms of sentences not expressing propositions, or expressing propositions different from what they appear to express and which aren’t paradoxical, or expressing multiple propositions none of which are paradoxical, the paper argues that the basic paradoxes are paradoxes of propositions; or alternatively, of sentential quantification. Similarly for the paradoxes of satisfaction: the basic paradoxes arise for properties, or for quantification into predicate position. (In the latter case, it’s argued that adopting the syntactic restrictions of Russellian type theory is not the best way to go.) The paradoxes of propositions and properties can be resolved either in classical or non-classical logic, but the paper focuses mostly on non-classical options, and develops an account of property identity in which properties defined using the notion of property identity are allowed and the naive abstraction principle holds unrestrictedly.
- Research Article
- 10.26686/ajl.v23i1.9874
- Jan 12, 2026
- The Australasian Journal of Logic
- Daniel Misselbeck-Wessel
Partial truth assignments give rise to Boolean-valued semantics for both paracomplete and paraconsistent weak Kleene logic. To accommodate partiality, the semantic consequence relation of classical propositional logic is adjusted in two natural ways, linked by a duality principle.
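As a generic illustration of the semantics this abstract refers to (my own sketch, not the paper's formalism), weak Kleene logic evaluates connectives over three values, with the third value "infectious": any compound containing it takes it as value. The names and encoding below are mine.

```python
# Weak Kleene three-valued connectives (illustrative sketch).
# The third value U ("undefined") is infectious: any compound
# containing U evaluates to U.
T, F, U = "t", "f", "u"

def wk_neg(a):
    return U if a == U else (F if a == T else T)

def wk_and(a, b):
    if U in (a, b):
        return U  # infectiousness: undefinedness propagates
    return T if (a == T and b == T) else F

def wk_or(a, b):
    if U in (a, b):
        return U  # unlike strong Kleene, where wk_or(T, U) would be T
    return T if (a == T or b == T) else F

print(wk_or(T, U))   # -> 'u'
print(wk_and(F, U))  # -> 'u'
```

The paracomplete reading designates only `t`; the dual, paraconsistent reading designates both `t` and `u`, which matches the duality the abstract mentions.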
- Research Article
- 10.26686/ajl.v23i1.9622
- Jan 12, 2026
- The Australasian Journal of Logic
- Ryan Simonelli
Jc Beall argues that if FDE is logic proper, then there is no logical negation. This claim is largely based on the fact that, in standard proof systems for FDE, there are no stand-alone negation rules that suffice to capture the behavior of negation. In this paper, I show that by adopting a bilateral proof system for FDE, one can maintain that there is a logical negation, it is the very same logical negation that belongs to classical logic, and its basic function is to flip-flop between assertion and denial. I conclude by responding to the objection that it can never be coherent to both assert and deny the very same thing.
- Research Article
- 10.5406/21521123.63.1.03
- Jan 1, 2026
- American Philosophical Quarterly
- Bas C Van Fraassen
Abstract G. E. Moore and L. Wittgenstein discussed statements of form “A and I do not believe that A,” and offered quite different diagnoses of the apparent absurdity. I will examine logical aspects of this paradox, including variants for subjective probability as well as simple belief. In the end I will argue that both Moore's and Wittgenstein's diagnoses appear when we treat belief as a modality, provided we give the language both a third-person and first-person reading. In this way a classical modal logic can have a non-classical logic hidden inside.
- Research Article
- 10.31181/sor31202628
- Jan 1, 2026
- Spectrum of Operational Research
- Saraswathi Appasamy
A fuzzy set is a mathematical construct that assigns a membership grade to each element within a universe of discourse, representing the degree to which the element belongs to the set. This approach extends classical binary logic by allowing continuous values between 0 and 1, making it a natural framework for handling uncertainties and vague concepts often expressed in natural language. Fuzzy sets are particularly powerful in modelling real-world scenarios where ambiguity and imprecision are inherent, such as in human decision-making, linguistic expressions, and complex systems. This paper introduces a novel application of fuzzy logic by proposing a fuzzy Decision Making Trial and Evaluation Laboratory (DEMATEL) method. DEMATEL is a well-established technique used to analyse cause-and-effect relationships within complex systems. Still, its traditional form relies on crisp values, which may not adequately capture the inherent uncertainties in real-world data. Our proposed method integrates triangular fuzzy numbers into the DEMATEL framework, enabling the representation and analysis of data with imprecision and vagueness. Specifically, we apply the fuzzy DEMATEL approach to study the cause-and-effect relationships among factors affecting transgender individuals, a population often marginalized and underrepresented in research. By leveraging triangular fuzzy numbers, our method provides a more nuanced and realistic representation of the uncertainties and complexities in the data. This approach not only enhances the accuracy of the analysis but also offers a meaningful way to interpret vague or subjective information, ultimately contributing to more informed decision-making and policy development for transgender communities.
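A minimal sketch of a fuzzy-DEMATEL pipeline of the kind this abstract describes, assuming a simple centroid defuzzification of triangular fuzzy numbers followed by the standard DEMATEL total-relation computation; the linguistic scale, labels, and aggregation choices below are my own illustrative assumptions, not the paper's.

```python
# Illustrative fuzzy DEMATEL sketch (assumed scale and steps).
import numpy as np

# Triangular fuzzy numbers (l, m, u) for linguistic influence ratings.
TFN = {"no": (0.0, 0.0, 0.25), "low": (0.0, 0.25, 0.5),
       "high": (0.25, 0.5, 0.75), "very_high": (0.5, 0.75, 1.0)}

def defuzzify(t):
    l, m, u = t
    return (l + m + u) / 3.0  # simple centroid of a TFN

def dematel(ratings):
    """ratings: n x n matrix of linguistic labels (diagonal ignored)."""
    n = len(ratings)
    A = np.array([[defuzzify(TFN[ratings[i][j]]) if i != j else 0.0
                   for j in range(n)] for i in range(n)])
    # Normalize so the total-relation series converges.
    X = A / max(A.sum(axis=1).max(), A.sum(axis=0).max())
    T = X @ np.linalg.inv(np.eye(n) - X)  # total-relation matrix
    D, R = T.sum(axis=1), T.sum(axis=0)
    return D + R, D - R  # prominence, net cause (>0) / effect (<0)

prominence, relation = dematel([["no", "high", "low"],
                                ["low", "no", "very_high"],
                                ["high", "low", "no"]])
```

`relation[i] > 0` marks factor i as a net cause, `relation[i] < 0` as a net effect; `prominence` ranks overall involvement.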
- Research Article
- 10.34064/khnum1-77.04
- Dec 30, 2025
- Problems of Interaction Between Arts, Pedagogy and the Theory and Practice of Education
- Alina Martianova
Statement of the problem. Piano duets in Felix Mendelssohn’s oeuvre have long remained overshadowed by his symphonic, chamber, and solo piano works. Early Sonatas for two pianos (MWV S 1, S 2) and other four-hand compositions reveal stylistic traits that later defined the composer’s musical thinking. Despite their artistic value and technical demands, these works have received limited scholarly attention and minimal performance exposure, particularly in contemporary Ukrainian concert practice. Recent studies of Mendelssohn’s output focus predominantly on symphonic, chamber, and solo piano compositions, as well as “Lieder ohne Worte” (Todd, 2004; Walshaw, 2017). Research on piano concertos (Veremiova, 2012) situates them within the context of Romantic Gesamtkunstwerk. Studies of piano ensembles highlight their expressive potential: I. Polska (1995) analyzes “Andante with Variations” op. 83a and “Allegro brillante” op. 92, emphasizing “Innigkeit” and “Innerlichkeit” within the German Romantic four-hand tradition, while I. Sediuk (2017) underscores ensemble dialogue, orchestral thinking, and concert-oriented performance practices. Nevertheless, original early Sonatas for two pianos remain largely unexplored, with existing literature addressing mainly biographical or editorial aspects (Prosseda, 2015; Todd, 2004). Objectives, methods, and novelty of the research. The purpose of the article is to outline the stylistic features of Mendelssohn’s Sonatas for two pianos in the context of the genre’s development and to identify the principles of performers’ dialogue as a criterion for their contemporary interpretation. For the first time, the research offers an intonation-dramaturgical analysis of the Sonatas and justifies their contemporary performance interpretations, focusing on the formation of a “performers’ dialogue” as an essential aspect of ensemble interaction.
The study uses the first complete recordings of the works by Roberto Prosseda and Alessandra Ammara (Decca, 2015) as practical material. A multidisciplinary approach integrates the following methods: genre analysis to identify typological features of early two-piano sonatas; stylistic analysis to trace compositional individuality and the interplay of Classical and Romantic elements; structural-functional analysis to examine form, polyphony, and ensemble interaction; performance analysis to evaluate technical and ensemble demands; comparative-interpretative analysis to compare modern performances with historical and theoretical models. Research results. The D Major Sonata MWV S 1 exemplifies classical clarity, transparent texture, and gallant dance-inspired thematicism, while revealing the individual traits of Mendelssohn’s style: polyphonic density, dialogic interplay, and symphonic thinking. The G Minor Sonata MWV S 2 presents a more dramatic single-movement sonata form, combining classical formal logic with romantic imagery contrasts and symphonized textures. Contemporary interpretations of the Sonatas by Prosseda and Ammara demonstrate the concert-level potential of these works, highlighting equality of ensemble parts, structural coherence, and dramaturgical logic without excessive romanticization, while underscoring stylistic differences between the two sonatas. Conclusion. These early sonatas are not mere student exercises but works that attest to Mendelssohn’s early compositional maturity and his contribution to the development of the two-piano sonata genre. Engagement with these works illuminates the historical transition from amateur to professional performance practice within this unique genre.
- Research Article
- 10.69899/limes-plus25223066k
- Dec 30, 2025
- Limes-Plus
- Valentin Kuleto + 1 more
This paper examines the structural impact of artificial intelligence (AI) on price formation, market mechanisms, and the core assumptions of contemporary economic theory. It advances the argument that AI does not represent an incremental technological improvement, but rather a systemic discontinuity that decouples economic value from human labor and challenges the classical market logic grounded in scarcity, labor, and marginal costs. Particular attention is devoted to the decline of marginal costs in knowledge-based sectors, the emergence of structural price deflation, and the transformation of labor markets, in which labor progressively loses its central role in income distribution and social validation. The paper further explores the implications of the AI economy for income and wealth, highlighting the growing disconnect between labor and income and the increasing need for new redistributive and institutional mechanisms under conditions of a post-work economy. A central contribution of the study lies in its analysis of the transformation of the concept of wealth in the AI era, where economic value is increasingly tied to control over structurally scarce resources such as energy, data, algorithmic infrastructure, attention, and institutional trust. By integrating theoretical insights with contemporary empirical findings from economics, labor market studies, and sustainability research, the paper proposes a coherent analytical framework for understanding the minimal-cost economy and the challenges that artificial intelligence poses to the future institutional and social order.
- Research Article
- 10.3390/e28010039
- Dec 28, 2025
- Entropy
- Xiuqi Wu + 4 more
Simultaneous ascending auctions find extensive applications in spectrum licensing and advertising space allocation. However, existing quantum sealed-bid auction protocols suffer from dual limitations: they cannot support multi-item simultaneous bidding scenarios, and their reliance on complex quantum resources along with requiring full quantum operational capabilities from bidders fails to accommodate practical constraints of quantum resource-limited users. To address these challenges, this paper proposes a multi-party semi-quantum simultaneous ascending auction protocol based on single-particle states. The protocol employs a trusted honest third party (HTP) responsible for quantum state generation, distribution, and security verification. Bidders determine their groups through quantum measurements and privately encode their bid vectors. Upon successful HTP authentication, each bidder obtains a unique identity code. During the bidding phase, HTP dynamically updates quantum sequences, allowing bidders to submit bids for multiple items by performing only simple unitary operations. HTP announces the highest bid for each item in real time and iteratively generates auction sequences until no new highest bid emerges, thereby achieving simultaneous ascending auctions for multiple items. It acts as a quantum-secured signaling layer, ensuring unconditional security for bid transmission and identity verification while maintaining classical auction logic. Quantum circuit simulations validate the protocol’s feasibility with current technology while satisfying critical security requirements, including anonymity, verifiability, non-repudiation, and privacy preservation. It provides a scalable semi-quantum auction solution for resource-constrained scenarios.
- Research Article
- 10.31648/ts.12175
- Dec 22, 2025
- Technical Sciences
- Andrzej Jankowski
Decision–action systems deployed in high-stakes domains require auditability and bounded (profiled) verifiability of the reasoning core, which cannot be ensured by empirical safeguards alone. We introduce AGL (Actionable Granular Logic) as a formal framework that combines an auditable knowledge-state layer (MT-FOGL) with workflows described by a regular-program syntax in the style of FO-PDL. Vague and probabilistic assessments are encapsulated as Information Granules and exposed to the core solely via threshold atoms, thereby keeping the rule/procedure interface within classical two-valued logic. To control verifiability, we restrict quantification and program tests to guarded profiles GF/RGF, preserving decidability with known worst-case complexity bounds. Complex estimation mechanisms are deliberately kept outside the verifiable core (the Decidability Split). The approach is illustrated using non-normative decision patterns in a medical context, intentionally independent of any particular clinical guideline version.
- Research Article
- 10.1080/0020174x.2025.2597697
- Dec 13, 2025
- Inquiry
- Paul Horwich
ABSTRACT This essay comprises six sections. Section 1 contrasts deflationary accounts of truth with traditional theories. Section 2 evaluates two deflationary views: (i) my use-theoretic Minimalism, which holds that the meaning of ‘true’ lies in speakers’ tendency to accept the Equivalence Schema: ‘The proposition that p is true ↔ p’; and (ii) Wolfgang Künne's definition of ‘x is true’ as ‘(∃p)(x = <p> & p)’, which quantifies into sentence position, departing from classical logic. Section 3 addresses Sten Lindström's critique of Minimalism, arguing it is inadequate. Section 4 raises a challenge to both accounts: whether they can accommodate the plausible idea that some untranslatable statements may still be true. Section 5 examines a paradox stemming from applying the Equivalence Schema to self-referential denials of truth. Finally, Section 6 responds to questions from the 2024 Vienna workshop on truth, which prompted this special issue.
- Research Article
- 10.34229/2707-451x.25.4.11
- Dec 8, 2025
- Cybernetics and Computer Technologies
- Natalia Kondruk + 1 more
Introduction. Most authors considered the use of only binary logic and technical analysis, which does not allow for effective consideration of market uncertainty and rapid dynamics. Other researchers considered the use of fuzzy logic, but these studies are limited to local markets or do not provide integration with more flexible types of analysis such as ML. It was also found that no comparative analysis of the effectiveness of different logical approaches (fuzzy, classical, probabilistic logic) has been conducted, which creates a gap in the scientific justification of the choice of a particular method. The potential of multi-timeframe analysis is also practically not taken into account, although it can increase the accuracy and stability of decisions made. The above indicates the need for a comprehensive study that would combine the advantages of various logical approaches, machine learning and multi-timeframe analysis within a single hybrid DSS. This would also allow a reasoned approach to the choice of a specific method. Research objective. The aim of this work is to develop a multi-timeframe hybrid DSS for algorithmic trading based on fuzzy and classical binary logic with probabilistic elements. This will make it possible to increase the efficiency of algorithmic trading systems. Results. The study consisted of the development of multi-timeframe hybrid DSS based on binary and fuzzy logic with probabilistic elements, as well as their comparative analysis. As a source of signals for further decision-making, the system uses forecasts made by the Random Forest model. Cross-validation was used to train the model to predict not only the opening, maximum, minimum and closing values of the position (Open, High, Low, Close – OHLC), but also the level of confidence of these predictions. The Mamdani fuzzy logic system [13, 14] was used as the fuzzy logic system for the DSS. Both DSS were implemented in the MQL5 programming language. The backtest was carried out on the MT5 platform.
As a result, the decision support system based on fuzzy logic showed a significant advantage over the decision support system based on classical binary logic, achieving a Win Rate of 60.81%, an annual return of 58% and a Sharpe ratio of 1.33. The decision support system based on binary logic, by contrast, showed a Win Rate of 34.16%, an annual return of –95.46% and a Sharpe ratio of –5. An applied aspect of using the obtained scientific result is the possibility of improving DSS for making trading decisions. Conclusions. The study showed that a multi-timeframe hybrid DSS based on fuzzy logic with probabilistic elements allows making more effective decisions than a DSS based on binary logic. This study allows for a reasoned approach to choosing a specific method. In addition, the proposed methodology and constructed models can be used by other researchers in the field of financial technologies for the further development of decision support systems in financial markets. Future research will be aimed at improving time series forecasting methods in order to improve the quality of input signals for the trading system. Keywords: algorithmic trading, FOREX, Machine Learning, fuzzy logic, Mamdani.
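For context on the headline metrics quoted in this abstract, the sketch below shows how Win Rate and an annualized Sharpe ratio are conventionally computed from per-period returns. This illustrates the standard definitions only; the paper's exact backtest methodology may differ, and the sample returns are made up.

```python
# Conventional backtest metrics (illustrative; sample data is invented).
import math

def win_rate(returns):
    """Percentage of non-zero periods with a positive return."""
    closed = [r for r in returns if r != 0]
    return 100.0 * sum(1 for r in closed if r > 0) / len(closed)

def sharpe(returns, periods_per_year=252, risk_free=0.0):
    """Annualized Sharpe ratio from per-period returns."""
    n = len(returns)
    mean = sum(returns) / n - risk_free / periods_per_year
    var = sum((r - sum(returns) / n) ** 2 for r in returns) / (n - 1)
    return mean / math.sqrt(var) * math.sqrt(periods_per_year)

daily = [0.004, -0.002, 0.006, 0.001, -0.003, 0.005]
print(f"Win Rate: {win_rate(daily):.1f}%")  # share of profitable periods
print(f"Sharpe:   {sharpe(daily):.2f}")     # annualized, risk-free = 0
```

A negative Sharpe ratio, such as the –5 reported for the binary-logic DSS, indicates mean returns below the risk-free rate relative to their volatility.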
- Research Article
- 10.18690/analipazuhd.11.1-2.47-77.2025
- Dec 6, 2025
- Anali PAZU HD
- Petra Cajnko
The article examines the impact of globalisation, technological development, and economic processes on contemporary security and the transformation of armed forces, with particular emphasis on small states and the Slovenian Armed Forces. Globalisation has created a new security environment characterised by predominantly transnational, hybrid, and non-military threats, which reduces the relevance of classical military logic and strengthens the importance of comprehensive security approaches. Technological progress provides advanced capabilities but simultaneously increases cyber, informational, and supply-chain vulnerabilities. Economic interdependence further shapes the security landscape, as fiscal constraints and the volatility of global markets directly influence the development, sustainability, and modernisation of defence systems. The transformation of armed forces is driven by doctrinal, organisational, and technological innovations that encourage a shift from mass conscript armies to small professional forces, modular structures, interoperability, and closer partnerships with industry. For small states, key elements include specialisation, strategic niche positioning, resilience, and clear integration within coalition and alliance frameworks. An analysis of Slovenian strategic and normative documents shows that the Slovenian Armed Forces are positioned as the central defence pillar of national security, yet they face challenges related to modernisation, personnel stability, technological lag, and inconsistencies between ambitions and available resources. The article proposes development guidelines that combine technological modernisation, personnel stabilisation, greater economic sustainability, and enhanced resilience of the entire defence system.
- Research Article
- 10.51359/2357-9986.2025.263874
- Dec 4, 2025
- Perspectiva Filosófica
- Diego Tajer
By appealing to evidence from the philosophical literature and from logical practice, I argue that anti-exceptionalism about logic is, in many aspects, not well justified. In the first part, I claim that the apriority of logic is still very alive in the philosophical literature, and the classic arguments by Quine have been seriously challenged. In the second part, I focus on logical practice, and I argue that logic is not revisable in the usual sense. Contemporary logical research is mostly based on classical logic as a lingua franca, and there are good reasons to think that only classical logic can play that role. Only some parts of logical research involve the possibility of a deeper revision. In the last section, I claim that one aspect in which logic can be shown to be similar to other sciences is the methodological one, for logical practice can be characterized using models and theories from general philosophy of science.